Bonfire of the Vanities

The road to hell is paved with good intentions

As an ideological mantra, ‘Never Again’ is associated primarily with the genocide politics of the 1940s, and in this context its effectiveness has been questionable, at best. As a dominating imperative, it has been vastly more consequential within the economic sphere, as a response to the Great Depression of the 1930s. Whilst ethnically selective mass killing is widely frowned upon, its attractions have been difficult to suppress. Deflationary depression, on the other hand, is simply not allowed to happen. This has been the supreme axiom of practical morality for almost a century, uniquely and distinctively shaping our age. We can call it the Prime Directive.

For the Western world, the 1930s were a near-death experience, an intimate encounter with the abyss, recalled with religious intensity. Because the threat was ‘existential’ – or unsurpassable – the remedy was invested with the absolute passion of a faith. The Prime Directive was adopted as a basic and final law, to which all social institutions and interests were subordinated without reservation. To question or resist it was to invite comprehensive disaster, and only a radically uninformed or criminally reckless heretic – a ‘crank’ – would do that. Anything is better than deflationary depression. That is the New Deal Law.

The consolidation of financial central planning, based on central banking and fiat currencies, provided the priesthood of the Prime Directive with everything it needed to ensure collective obedience: No deflationary depression without deflation, and no deflation with a well-oiled printing press. ‘Counter-cyclical’ inflation was always an option, and the hegemony of Anglophone economic-historical experience within the flourishing American century marginalized the memory of inflationary traumas to global backwaters of limited influence. Beside the moral grandeur of the Prime Directive, monetary integrity counted for nothing (only a crank, or a German, could argue otherwise).

The Prime Directive defines a regime that is both historically concrete and systemically generalizable. As Ashwin Parameswaran explains on his Macroeconomic Resilience blog, this type of regime is expressed with equal clarity in projects to manage a variety of other (non-economic) complex systems, including rivers and forests. Modern forestry, dominated by an imperative of fire suppression, provides an especially illuminating example. He notes:

The impetus for both fire suppression and macroeconomic stabilisation came from a crisis. In economics, this crisis was the Great Depression which highlighted the need for stabilising fiscal and monetary policy during a crisis. Out of all the initiatives, the most crucial from a systems viewpoint was the expansion of lender-of-last-resort operations and bank bailouts which tried to eliminate all disturbances at their source. In [Hyman] Minsky’s words: “The need for lender-of-last-resort operations will often occur before income falls steeply and before the well nigh automatic income and financial stabilizing effects of Big Government come into play.” (Stabilizing an Unstable Economy pg 46)

Similarly, the battle for complete fire suppression was won after the Great Idaho Fires of 1910. “The Great Idaho Fires of August 1910 were a defining event for fire policy and management, indeed for the policy and management of all natural resources in the United States. Often called the Big Blowup, the complex of fires consumed 3 million acres of valuable timber in northern Idaho and western Montana … The battle cry of foresters and philosophers that year was simple and compelling: fires are evil, and they must be banished from the earth. The federal Weeks Act, which had been stalled in Congress for years, passed in February 1911. This law drastically expanded the Forest Service and established cooperative federal-state programs in fire control. It marked the beginning of federal fire-suppression efforts and effectively brought an end to light burning practices across most of the country. The prompt suppression of wildland fires by government agencies became a national paradigm and a national policy” (Sara Jensen and Guy McPherson). In 1935, the Forest Service implemented the ‘10 AM policy’, a goal to extinguish every new fire by 10 AM the day after it was reported.

In both cases, the trauma of a catastrophic disaster triggered a new policy that would try to stamp out all disturbances at the source, no matter how small.

At Zerohedge, The World Complex elaborates on the history of fire suppression in the United States:

The forests of the southwestern United States were subjected to a lengthy dry season, quite unlike the forests of the northeast. The northeastern forests were humid enough that decomposition of dead material would replenish the soils; but in the southwest, the climate was too dry in the summer and too cool in the winter for decomposition to be effective. Fire was needed to ensure healthy forests. Apart from replenishing the soils, fire was needed to reduce flammable litter, and the heat or smoke was required to germinate seeds.

In the late 19th century, light burning — setting small surface fires episodically to clear underbrush and keep the forests open — was a common practice in the western United States. So long as the fires remained small they tended to burn out undergrowth while leaving the older growth of the forests unscathed. The settlers who followed this practice recognized its native heritage, just as its opponents did in calling it “Paiute forestry” as an expression of scorn (Pyne, 1982).

Supporters of burning did so for both philosophical and practical reasons — burning being the “Indian way” as well as expanding pasture and reducing fuels for forest fires. The detractors argued that small fires destroyed young trees, depleted soils, made the forest more susceptible to insects and disease, and were economically damaging. But the critical argument put forth by the opponents of burning was that it was inimical to the Progressive Spirit of Conservation. As a modern people, Americans should use the superior, scientific approaches of forest management that were now available to them, and which had not been available to the natives. Worse than being wrong, accepting native forest management methods would be primitive.

Spelling out the eventual consequences of the ‘progressive’ reformation of forest management practices probably isn’t necessary, since – in striking contrast to its economic analog – its lessons have been quite thoroughly absorbed, widely and frequently referenced. Ecologically-sophisticated environmentalists, in particular, have become attached to it as a deterrent model of arrogant intervention and its perverse consequences. Everybody knows that the attempt to eliminate forest fires, rather than extinguishing risk, merely displaced it, and even accentuated it, as the accumulation of tinder transformed a regime punctuated by comparatively frequent fires of moderate scale into one episodically devastated by massive, all-consuming conflagrations.

Parameswaran explains that the absence of fires leads to fuel build-up, ecological drift towards less fire-resistant species, reduction in diversity, and increased connectivity. The ‘protected’ or ‘stabilized’ forest changes in nature, from a cleared, robust, mixed, and patch-worked system, to a fuel-cluttered, fragile, increasingly mono-cultural and tightly interconnected mass, amounting almost to an explosive device. Stability degrades resilience, and preventing the catastrophe-to-come becomes increasingly expensive and uncertain, even as the importance of prevention rises. By the penultimate stage of this process, crisis management has engineered an impending apocalypse: a disastrous event that simply cannot possibly be allowed to happen (although it surely will).

Parameswaran calls this apocalyptic development sequence The Pathology of Stabilisation in Complex Adaptive Systems. It’s what the Prime Directive inevitably leads to. Unfortunately, diagnosis contains no hint of remedy. Every step up the road makes escape more improbable, as the scale of potential calamity rises. Few will find much comfort in the realization that taking this path was insane.
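The stabilisation pathology can be made concrete with a toy simulation (an illustrative sketch with invented parameters, not drawn from Parameswaran's work): fuel accumulates steadily, sparks arrive at random, and a suppression policy that extinguishes almost every small fire guarantees that the rare fire escaping control consumes vastly more fuel.

```python
import random

def run_forest(years=2000, suppress=False, seed=7):
    """Toy model of the suppression pathology: fuel accumulates one
    unit per year; a random spark may ignite it all; suppression puts
    out most fires, so the few that escape burn far more fuel."""
    rng = random.Random(seed)
    fuel = 0.0
    fires = []  # sizes of the fires that actually burned
    for _ in range(years):
        fuel += 1.0                         # annual tinder build-up
        if rng.random() < 0.2:              # a spark lands this year
            if suppress and rng.random() < 0.9:
                continue                    # fire extinguished; fuel remains
            fires.append(fuel)              # fire consumes all accumulated fuel
            fuel = 0.0
    return fires

free = run_forest(suppress=False)
managed = run_forest(suppress=True)
print("free-burn:  ", len(free), "fires, mean size", sum(free) / len(free))
print("suppressed: ", len(managed), "fires, mean size", sum(managed) / len(managed))
```

The suppressed regime produces far fewer but far larger fires: stability is purchased by concentrating the eventual disturbance, which is the essay's point in miniature.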

‘Black-boxes’ (or flight recorders) retrieved from air disasters are informative in this respect. With surprising regularity, the last words of the pilot, announced to no one in particular, eloquently express an acknowledgment of unattractive but unmistakable reality: “Oh $#it!” Less common – in fact, unheard of – is any honest address to the passengers: “Ladies and gentlemen, this is your captain speaking. We are all about to die.” What would be the point?

Everything to be realistically expected from our ruling political and financial elites can be predicted by rigorous analogy. This flight doesn’t end anywhere good, but it would be foolish to await an announcement.

Unencumbered by official position in the Cathedral of the Prime Directive, ‘Mickeyman’ at World Complex is free to sum things up with brutal honesty:

We have lived through a long period of financial management, in which failing financial institutions have been propped up by emergency intervention (applied somewhat selectively). Defaults have not been permitted. The result has been a tremendous build-up of paper ripe for burning. Had the fires of default been allowed to burn freely in the past we may well have healthier financial institutions. Instead we find our banks loaded up with all kinds of flammable paper products; their basements stuffed with barrels of black powder. Trails of black powder run from bank to bank, and it’s raining matches.


Kinds of Killing

How bad is genocide, really?

Like ‘fascism’ – with which it is closely connected in the popular imagination – ‘genocide’ is a word carrying such exorbitant emotional charge that it tends to blow the fuses of any attempt at dispassionate analysis. We can thank the political black magic of Adolf Hitler and his Nazi accomplices for that.

Prior to the Third Reich and its systematic, industrialized attempts to eradicate entire ethno-racial populations (Jews, Roma, and perhaps Slavs) along with numerous other groups (mental and physical ‘defectives’ or ‘useless eaters’, homosexuals, communists, Jehovah’s Witnesses …), international law restricted its attention to the actions and grievances of states and individuals, with the latter subdivided into combatants and noncombatants. The National Socialist trauma changed that fundamentally.

On December 9, 1948, the United Nations adopted the Convention on the Prevention and Punishment of the Crime of Genocide (as Resolution 260), defining a new category of internationally recognized crimes as “acts committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group.”

Since 1948, defending genocide has been the surest way to ruin a dinner party. That doesn’t mean, however, that the topic deserves to be immunized from controversy. There is one question in particular that merits intense and prolonged scrutiny: Is genocide really worse than killing a lot of people?

Posed slightly more technically: Is there a crime of genocide that stands above and beyond mass murder (of equivalent scale)? Or (a rough equivalent): Can groups be the specific victims of crime? This is to ask whether groups exist – and have value – as anything more than a nominal or strictly formal set, whose reality is exhausted by its constituent individual members. The existence of genocide as a legal category presumes a (positive) answer to this question, and in doing so it closes down a problem of great and very general importance.

The classical liberal presumption is quite different, as summarized (a little bluntly) by the provocative remark made by British Prime Minister Margaret Thatcher in 1987: “… there is no such thing as society. There are individual men and women, and there are families.” Harshly extrapolating from this position, a certain irony might be found in the fact that a horrified response to National Socialist crimes has taken the form of a legal codification of racial collectivism. At the very least, it is puzzling that suspicions directed at legal references to ‘group rights’ and ‘hate crimes’ among those of a libertarian bent have not been extended to the category of genocide.

In the opposite camp, the most fully articulated defense of collectives as real entities is found, as might be expected, in the foundation of sociology as an academic discipline, and more particularly in Émile Durkheim’s argument for ‘social facts’. Larry May looks back further, to Thomas Hobbes’ Leviathan, or social being, in which human individuals are absorbed as organic parts.

Whilst the distinction of ‘society’ and ‘individual’ has colloquial (and political) meaning, those inclined to the analysis of complex systems are more likely to ask which groups or societies are real individuals, exhibiting functional or behavioral integrity, as self-reproducing wholes. In pursuing this line of investigation, it is far more relevant to discriminate between types of groups than between groups and individuals, or even wholes and parts. It is especially helpful to distinguish feature groups from unit groups.

A feature group is determined by logical classification. This might be expressed as a self-identification or sense of ‘belonging’, an external political or academic categorization, or some combination of these, but the essentials remain the same in each case. Certain features of the individual are isolated and emphasized (such as genitalia, sexual orientation, skin-color, income, or religious belief), and then employed as the leading clue in a process of formal grouping, which conforms theoretically to the mathematics of sets.

A unit group, in contrast, is defined as an assemblage, or functional whole. Its members belong to the group insofar as they work together, even if they are entirely devoid of common identity features. Membership is decided by role, rather than traits, since one becomes part of such a group through functional involvement, rather than classification of characteristics. Social instances of such groups include primitive tribes (determined by functional unities rather than the categories of modern ‘identity politics’), cities, states, and companies. The most obvious instance in socialist theory is the ‘soviet’ or ‘danwei’ work unit (whilst social classes are feature groups).

To take a non-anthropomorphic example, consider a skin cell. Its feature group is that of skin cells in general, as distinguished from nerve cells, liver cells, muscle cells, or others. Any two skin cells share the same feature group, even if they belong to different organisms, or even species, exist on different continents, and never functionally interact. The natural unit group of the same skin cell, in contrast, would be the organism it belongs to. It shares this unit group with all the other cells involved in the reproduction of that organism through time, including those (such as intestinal bacteria) of quite separate genetic lineages. Considered as a unit group member, a skin cell has greater integral connection with the non-biological tools and other ‘environmental’ elements involved in the life of the organism than it does with other skin cells – even perfect clones – with which it is not functionally entangled.
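The trait/role contrast lends itself to a small illustration in code (a toy sketch with invented names, not from the source): a feature group is a filter over shared attributes, conforming to the mathematics of sets, while a unit group collects whatever functions within one assemblage, regardless of attributes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cell:
    kind: str      # trait: 'skin', 'nerve', ...
    organism: str  # the functional whole it works within

cells = [
    Cell("skin", "alice"), Cell("nerve", "alice"),
    Cell("skin", "bob"), Cell("skin", "carol"),
]

# Feature group: classification by shared traits (a formal set).
skin_cells = {c for c in cells if c.kind == "skin"}

# Unit group: membership by functional involvement (an assemblage).
alice_organism = {c for c in cells if c.organism == "alice"}

# Alice's skin cell shares a feature group with Bob's skin cell,
# but shares a unit group only with the cells of Alice's organism.
print(Cell("skin", "alice") in skin_cells)    # True
print(Cell("skin", "bob") in alice_organism)  # False
```

The two set comprehensions are structurally identical, which is the point of the essay's caveat: the distinction is theoretically precise but empirically hazy, since any actual grouping mixes traits and roles.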

Clearly, both feature groups and unit groups are ‘fuzzy sets’, and the distinction itself – whilst theoretically precise – is empirically hazy. An urban American street gang, for instance, will in most cases be vague in its features and unity, perhaps ‘ethnic’ to some degree of definition, with a determinable age-range, and with ambiguous functional connections to groupings on a larger scale, or to peripheral members whose status of ‘belonging’ is not strictly decidable. Tattoos and other membership markings are likely to involve both identity and integrity aspects – traits and roles. Rituals of belonging (ordeals, oaths, rites of passage) are designed to disambiguate membership.

Despite such haziness, the distinction between these two types of groups strikes directly at the core problematic of genocide (as a legal category). When a unit group is destroyed, a real individual is ‘killed’ above and beyond whatever human losses are incurred. The destruction of a feature group, in contrast, whatever the cultural loss, is not any kind of killing beyond the mass murder of human individuals. If this is worse than murder, we should know why.

This conclusion seems relevant when weighing, for instance, the 1937 Massacre of Nanjing on the scale of historical atrocity. It suggests, at least, that an act of violence directed against a city – or integrated population unit – is no less worthy of specific legal attention than a quantitatively equivalent offense against an ethnicity, or determined population type. It seems to be no more than an accident of history that, in order to appropriate the category of genocide, massive crimes of the former variety need to be recoded as if they more properly belonged to the latter.

Complex systems ontology aside, these matters resolve ultimately into obscure social values. Orthodox conceptions of ‘genocide’ assume that ethnic identity simply and unquestionably means more than active citizenship, or participation in the life of a city. Perhaps this assumption is even arguable. But has it been argued?