An Anxious Peace: ‘God’ After Auschwitz (2)

This is the last of two posts under the same general title.

*     *     *     *     *     *

1.

Difference in hue and hair is old. But the belief in the preeminence of hue and hair, the notion that these factors can correctly organize a society and that they signify deeper attributes, which are indelible—this is the new idea at the heart of these new people who have been brought up hopelessly, tragically, deceitfully, to believe that they are white.

These new people are, like us, a modern invention. But unlike us, their new name has no real meaning divorced from the machinery of criminal power. The new people were something else before they were white—Catholic, Corsican, Welsh, Mennonite, Jewish—and if all our national hopes have any fulfillment, then they will have to be something else again.

 

Ta-Nehisi Coates is the African-American father of a teen-aged son, as well as a national correspondent for The Atlantic. The lines above come from his recent book Between the World and Me (Melbourne, Australia: The Text Publishing Company, 2015, page 7), which is cast as a letter to his son. They address what Coates presents as the second “ideal” that has defined the United States of America in its historical reality. That “ideal,” Coates writes a few lines earlier (pages 6-7), is “one that Americans implicitly accept but to which they make no conscious claim.” As his next lines make clear, what is at issue in that second “ideal” is American racism. He writes:

Americans believe in the reality of “race” as a defined, indubitable feature of the natural world. Racism—the need to ascribe bone-deep features to people and then humiliate, reduce, and destroy them—inevitably follows from this inalterable condition. In this way, racism is rendered as the innocent daughter of Mother Nature, and one is left to deplore the Middle Passage or the Trail of Tears the way one deplores an earthquake, a tornado, or any other phenomenon that can be cast as beyond the handiwork of men.

As I have already said, this rarely acknowledged ideal of race and racism is actually the second definitive American ideal Coates discusses at the very opening of his book, his long letter to his son. The first is the ideal of democracy itself, as paradigmatically articulated by Lincoln at Gettysburg, concerning which Coates writes (page 6):

Americans deify democracy in a way that allows for a dim awareness that they have, from time to time, stood in defiance of their God. But democracy is a forgiving God and America’s heresies—torture, theft, enslavement—are so common among individuals and nations that none can declare themselves immune. In fact, Americans, in a real sense, have never betrayed their God. When Abraham Lincoln declared, in 1863, that the battle of Gettysburg must ensure “that government of the people, by the people, for the people, shall not perish from the earth,” he was not merely being aspirational; at the onset of the Civil War, the United States of America had one of the highest rates of suffrage in the world. The question is not whether Lincoln truly meant “government of the people” but what our country has, throughout its history, taken the political term “people” to actually mean. In 1863 it did not mean your mother or your grandmother, and it did not mean you and me. Thus America’s problem is not its betrayal of “government of the people,” but the means by which “the people” acquired their names.

 

“This,” he writes in his very next line, “leads us to another equally important ideal,” the second one—the racist one that, as I’ve already quoted, “Americans implicitly accept but to which they make no conscious claim.” It is in pursuing that second ideal, however unacknowledged the pursuit, that so many people in America have been historically excluded from “the people” of, by, and for whom the United States was founded and preserved, according to the first, most deeply foundational ideal of America. In what I’ve also already quoted, Coates suggests that the presumed “reality of ‘race’ as a defined, indubitable feature of the natural world” is itself used by the powers that be in America who take themselves to be white—precisely to justify racism as “the innocent daughter of Mother Nature.”

Against any such racist self-justification, however, Coates reminds his readers that “race is the child of racism, not the father,” and points out that “the process of naming ‘the people’ has never been a matter of genealogy and physiognomy so much as one of hierarchy.” He then adds the lines I quoted to begin this post, in which he speaks of what needs to happen among the “new people” of America, if that same America is ever to fulfill its “national hopes.”

Exactly one hundred pages later, Coates writes to his son concerning these “new people” of America: “And I would not have you like them.” He writes that (on page 107) to his son, despite everything that would seem to speak against any such wish for someone he loves—that is, despite all the death, destruction, and misery inflicted on the black American community by the racism, rarely even acknowledged, of “those Americans who believe that they are white,” as Coates first puts it back at the very beginning of his book. “You have been cast into a race in which the wind is always at your face and the hounds are always at your heel,” he continues in his letter to his son. “And to varying degrees this is true of all life. The difference is that you do not have the privilege of living in ignorance of this essential fact.” Coates’s son’s own true privilege, the one that gives Coates reason not to wish his son could have been born white, does not lie in the ignorance that grounds white privilege. Rather, Coates’s son’s genuine privilege lies in knowledge. That is a knowledge taught only by centuries of cruelty, but which nevertheless brings with it a genuine potential of liberation, as does all true knowledge, however painful to acquire.

The knowledge to which Coates’s son is heir by right of being born black is precisely the knowledge we all need, in fact, if we Americans are ever to attain genuine peace among us all, and fulfill the true potential of America’s first ideal by creating a real democracy as the rule of, by, and for all the people of America.

The peace we really need in America today to fulfill that first ideal is not the peace of mutual respect between “blacks” and “whites.” What we really need is mutual respect between all the various peoples who make up the one American people, once we finally succeed in defining “the American people” in a way that excludes no one who lives here. For any such anxious peace to break out, however, “white” unity must first be destroyed. For America’s national hopes ever to find true fulfillment, the “white” community must first be fragmented into pieces. Or, rather, since “the ‘white’ community” does not actually succeed in naming any unified community at all in the first place, the illusion that there is any such thing must first be shattered.

There will never be any final peace between “whites” and “blacks.” There can only be an anxious peace between peoples, and for that, there must first be some peoples—more than just one—“divorced from the machinery of criminal power.”

That is itself one worthy ideal we can glean from reading Coates’s book, in fact. In turn, it leads us to another equally important ideal: of an America freed from the belief in magic.

 

2.

Not everyone who says to me, ‘Lord, Lord,’ will enter the kingdom of heaven, but only the one who does the will of my Father in heaven. On that day many will say to me, ‘Lord, Lord, did we not prophesy in your name, and cast out demons in your name, and do many deeds of power in your name?’ Then I will declare to them, ‘I never knew you; go away from me, you evildoers.’

—Matthew 7:21-23 (New Revised Standard Version)

 

We live in a “goal-oriented” era. Our media vocabulary is full of hot takes, big ideas, and grand theories of everything. But some time ago I rejected magic in all its forms. The rejection was a gift from your grandparents, who never tried to console me with ideas of an afterlife and were skeptical of preordained American glory. In accepting both the chaos of history and the fact of my total end, I was freed to truly consider how I wished to live—specifically, how do I live in this black body? It is a profound question because America understands itself as God’s handiwork, but the black body is the clearest evidence that America is the work of men. [. . .] The question is unanswerable, which is not to say futile. The greatest reward of this constant interrogation, of confrontation with the brutality of my country, is that it has freed me from ghosts and girded me against the sheer terror of disembodiment.

—Ta-Nehisi Coates, Between the World and Me (page 12)

 

The magic from belief in which his parents’ gift freed Coates is magic in the sense of the endeavor to control events through the manipulation of supernatural or occult forces by way of charms, spells, incantations, rituals, or the like. To believe in magic in the sense at issue is to believe, for example, that one can assure oneself victory in contests such as are involved in war, love, or football by more properly invoking the name of God before the contest than do one’s opponents.

Such belief is a form of superstition, and as such is based in ignorance and fear—just such fear as love drives out, at least according to Christian scriptural tradition. According to 1 John 4:18 (NRSV): “There is no fear in love, but perfect love casts out fear; for fear has to do with punishment, and whoever fears has not reached perfection in love.”

A few centuries later, John Cassian relayed, from the early Christian desert solitaries, the teaching that there are three kinds of “obedience.” First, there is the obedience of the slave, who obeys out of fear of punishment for disobedience. Second, there is the obedience of the servant, who obeys in hopes of reward—of getting paid for obeying. Third, there is the obedience of the child, who obeys out of love.

It is only in that third and final form, where obedience and love become indistinguishable, that each at last comes into its own. Either to love or to obey—and most especially to obey the command to love, that most, if not only, divine command—either out of fear of punishment if one doesn’t, or out of hope for reward if one does, is neither to love nor to obey at all, really. It is to confuse love and obedience with acts of magic, or at least magic-acts: attempts to manipulate forces beyond one’s own, or at least to engender the illusion that one can do so. Both acts of magic and magic-acts engage superstition, not faith. They are among the childish things that St. Paul (to remain within the Christian tradition) advises us to leave behind with our childhoods when we grow up.

In the Jewish tradition from which the Christian one grows—that tradition to which Hannah Arendt and Emmanuel Levinas, for example, belong—belief in magic is just a form of idolatry. It confuses God with what is not God, but just the work of human hands (Psalm 135:15).

By their gift, Ta-Nehisi Coates’s parents (the grandparents of the son to whom he writes) helped him grow up, and leave idolatry behind.

 

3.

What confuses the most is that everyone everywhere more and more agrees on one way of thinking, which counts as giving the only standard.

—Martin Heidegger, “Confusion” (“Die Wirrnis”), in Gesamtausgabe 76 (Frankfurt: Vittorio Klostermann, 2009, page 269)

 

Contradiction tends to be negatively viewed in an intellectual milieu dominated by positivistic empiricism. Thus the demonstration that there are contradictions in a body of theory is likely to be understood as a refutation of the validity of the theory. The reader should not conclude that this is the author’s view.

—A. Belden Fields, Trotskyism and Maoism: Theory and Practice in France and the United States (Brooklyn: Autonomedia, Inc., 1988, page 250)

 

Become who you are!

—Nietzsche, Thus Spoke Zarathustra

Only if we grow up can America ever become what The Pledge of Allegiance—which was not formally adopted by Congress until 1942 and not formally given that name until three years later, in 1945—says that we are (or at least has so said since the last formal change was made to The Pledge in 1954, adding the invocation of God’s name). That is, only by finally growing up can America truly be “one nation, under God, indivisible, with liberty and justice for all.” Until then, America will remain many nations and no nation, superstitious, conflicted, with privilege for some and justice for none. Properly understood, truly to pledge allegiance to what The Pledge says we are is really to pledge to work to change who we really still all too much are. Truly pledged, The Pledge has to be taken as a promise still demanding to be kept, and not as a boast of past achievements that establish how exceptional we already are.

I don’t know if it is still fashionable in some right-wing circles, as it was a few years ago, to insist that America is a “republic,” and not a “democracy.” At any rate, at least the last part of that does hold: America is indeed not a democracy, if by “being” one means current actuality, stripped of any not yet actual but still definitive potential. America is not a democracy in the same sense that an acorn is not an oak. That means that America still is a democracy, in terms of the promise that in effect defines that name, and challenges what bears that name to become worthy of it, just as an acorn is, so to speak, an oak, only an oak to be.

The manipulative proliferation of ever more cheaply maintained and ever less deeply grounded “opinions” and “views” on everything under (and over) the sun, a proliferation so characteristic of our current American system of governing the diverse peoples of America, has made what America is as a pure actuality void of all promise into what might best be called, not demo-cracy, Lincoln’s “rule of, by, and for the people” (from Greek demos, people), but doxo-cracy, that is, rule of, by, and for opinion (from Greek doxa, opinion). And if America remains satisfied to be no more than such a doxocracy, it will only betray itself and what it most truly has always been—been as that “last, best hope” for humankind that Lincoln also famously said it was.

What is needed if America is to evade such self-betrayal is no politics of consensus. The solution does not lie in the promotion of dialogue in search of areas of agreement between the proponents of fundamentally different positions on divisive issues—for example, the search for some supposed “common ground” between so-called “pro-life” and “pro-choice” advocates. The ever-growing confusion of “opinions” and “views” among the diverse population in the United States (and increasingly globe-wide) cannot be dispelled through any push to “come to an agreement” about the matters about which those opinions and views are maintained with ever-growing rancor and contentiousness. That just makes the confusion by which the doxocracy governs grow worse, and the governing all the easier.

The politics of consensus is part of the problem, not part of the solution, to borrow a useful popular way of putting the point. As opposed to the politics of consensus, what is really required is something such as the politics of dis-sensus for which contemporary French political philosopher Jacques Rancière, for one, has called. The politics of consensus has always ended up, regardless of the intentions of those who pursue it, keeping everything going along smoothly on an even keel. Politically considered, that means it always works to serve the already powerful, and to protect their special privileges, rather than to serve and protect everyone, equally and justly. In contrast, a politics of dissensus would strive to rock the boat. It would seek to disrupt “business as usual.” To use a different metaphor, it would seek to point out that the king has no clothes, and thereby to make of the king a laughingstock.

What is sorely needed for America to fulfill its own defining promise as America is not the formation of any consensus of opinion out of the swirling proliferation of them. What is needed is to replace the proliferation and protection of opinions with the proliferation and protection of all the peoples of America, whether they be Sioux or Navaho, Aleut or Afghan or African, Coptic, Muslim, Hindu, Shinto, or Zen, and including even all us uprooted, thoroughly assimilated, people-less folks who have been taught to think that we are nothing but white, and may not have one single clue among us all about how to find our way home again.

What we need in America, as all around the planet, is no peace in the sense of “the repose of a self within itself,” as Levinas puts it in In the Time of the Nations, that pseudo peace that lets us feel secure and safe in our supposed “autonomous self-sufficiency.” We need no such easy, self-satisfied simulation of real peace. We need, instead, “an anxious peace,” the sort of true peace that Levinas characterizes purely and simply as the peace of “love for one’s fellow man.” We need the peace of a love anxious for the wellbeing of others—and anxious not to betray itself.

If America is what The Pledge of Allegiance says it is, then America can only be the place of such peace. If America is what The Pledge says it is, then America is not yet. It remains an open question whether America ever will be.

Shattering Wholes: Creatively Subverting the University and Other Mobs–Final Fragment

After a long interruption, I am resuming work on this blog. The post below is the last of three in a series under the same general title—the last of three “Fragments” of “Shattering Wholes.”

*     *     *     *     *     *

            Every critique of the present has its right only as a mediated illumination of the knowledge of future necessities. All fixation on grievances clouds vision into the essential; it lacks what alone supports critiques: the capacity to differentiate that arises from dedication to something not yet real—that is, present at hand—but therefore all the more originally having the rank of what already is.

— Heidegger, Überlegungen VI, §113 (GA 94)

Only one who has once overcome contempt for others has no further need to feel superior in order to be great—which is to say to be, and let others fall where and how they may.

— Heidegger, Überlegungen VI, §140 (GA 94)

Last fall, on Saturday, November 29, 2014, memorial services in Colorado commemorated the 150th anniversary of the Sand Creek Massacre. On that date in 1864 a large body of Colorado Territory militia under the command of Col. John Chivington, who was also a Methodist preacher, slaughtered around 160 peaceful Cheyenne and Arapahoe Indians, mostly women and children, and then mutilated their corpses for fleshly souvenirs–including vulvas, breasts, and penises to be flown atop flags and pennants as the butchers rode away celebrating their glorious victory.

In addition, on the same day as the 150th anniversary of the Sand Creek Massacre, another event also took place. That day, November 29, 2014, was the day on which an Egyptian court formally dismissed all charges against former Egyptian President Hosni Mubarak, who was overthrown in 2011 during the so-called Arab Spring.

The two events of the Sand Creek Massacre, on the one hand, and the official exoneration of Mubarak, on the other, are separated in time by a century and one-half. Nevertheless, those two events are connected in telling ways, ways much more important than the trivial fact that they both took place on the same day of the same month, though 150 years apart. Above all, the two events, the Sand Creek Massacre in 1864 and the exoneration of Mubarak 150 years later, both embody efforts by powers that be to secure their power.   Both are examples of power “circling the wagons,” as it were, to protect itself.

That image of “circling the wagons” derives, of course, more from the time of the Sand Creek Massacre than from the much more recent times of Mubarak. It comes from what is in effect dominant US culture’s sanctioned narrative of the westward expansion of the United States in fulfillment of its supposed “Manifest Destiny.” That is the narrative in accordance with which the United States was divinely destined to spread itself from the Atlantic to the Pacific, across the whole expanse of North America between Mexico and Canada–or at least what the United States left of them, especially Mexico, after that expansion.

The story of the Sand Creek Massacre is granted a place within that larger narrative. It is usually a small place, as befits what is presented in the meta-narrative as an unfortunately regrettable exception to the generally glorious story of US exceptionalism.

In that broader story, waves of fabled wagon trains carried intrepid settler-families west during the 19th century, across the Great Plains and the Rocky Mountains, to the western edge of California and the Pacific Northwest, fulfilling the United States’ self-proclaimed destiny. As those wagons rolled west, they were subject to attacks by Indians presumptuous enough to resist the fulfillment of that very destiny, no matter how manifest it might have been to those who proclaimed and enacted it. To repel such attacks and overcome such resistance, the westward tending settler-trekkers would “circle the wagons,” as the story goes. They would thereby create a wall of protection for themselves, a wall behind which they could stand to use their massively superior killing technology to mow down the unfriendly “savages” who dared to attack them as invaders.

The 150th anniversary of the Sand Creek Massacre was marked not only by various memorial services—especially but not exclusively in Colorado, where the massacre occurred—but also by various official apologies pertaining to the atrocities performed at Sand Creek on November 29, 1864. To start with the most publicized example, on Wednesday, December 3, four days after the anniversary of the massacre itself, during a memorial ceremony at the State Capitol, Colorado Governor John Hickenlooper became, according to his own office, the first Colorado governor to issue an official public apology for the butchery that had occurred at Sand Creek a century and a half before.

Just the other day as I am writing this, a court in South Carolina voided the conviction of the “Friendship 9,” who publicly broke South Carolina’s Jim Crow laws back in 1961 by daring to sit at a lunch-counter designated “Whites only,” and the prosecutor officially apologized for what had been officially done to them back then. South Carolina thereby apologized for a wrong it had committed only fifty-four years before, which compares favorably with the one hundred and fifty years it took Colorado to apologize for the butchery it inflicted on 160 or so innocent American Indians at Sand Creek in 1864, a massacre that took place only a little less than one hundred years before the butchery of justice in the case of the Friendship 9. If those figures are any indication of general human progress, and if the rate of such improvement can be presumed to remain steady across time and countries, then perhaps we can hope that it will take Egypt only a couple of decades or so to apologize for its whitewashing last November 29 of Mubarak’s various butcheries.

At any rate, no official Egyptian apology for the wrong whereby Egypt officially dismissed all charges against Mubarak can be expected until the officiating power in Egypt feels safe and secure enough to issue it. That, in turn, will only come once the conditions that triggered the commission of the original wrong in the first place have ceased to exist. That is, only once everything in play during the Arab Spring of 2011 that threatened to subvert Egyptian officialdom has withered away in one fashion or another will it then be safe for official Egypt to admit to its official wrong, and officially apologize for it.

To put the point generally and simply, it is only when such apologies no longer cost anything to the entities that, through their representative mouthpieces, make them, that they will be made at all. Such official apologies are made, as a rule, only when they no longer really accomplish anything. Or rather, all they really accomplish is further to solidify the coercive power that is apologizing for its own past abuses—to help circle the wagons ever more tightly, as it were.

At issue is not the integrity of the individual mouthpieces through which the apology gets issued. For example, I have no reason to doubt the personal integrity of Colorado Governor John Hickenlooper (at least no reason aside from the fact that he is an elected official of an official state apparatus, which should always make one somewhat skeptical). I have even less reason to doubt the personal integrity of the prosecutor in South Carolina who officially apologized to the Friendship 9 the other day, and least reason of all to doubt that of the judge there who officially voided their convictions and expunged their records. I’m not quite as free of suspicion toward the members of the Egyptian court that dismissed the charges against Mubarak, but even in that case I am not interested in raising any issues of personal integrity. That is simply not my point.

My point, rather, is that we should institutionalize in ourselves suspicion against institutional apologies, and the institutions that sooner or later (most often later) issue such apologies for their own past institutional misbehavior.  We should never just trust an institution when it issues such an apology. Rather, such official apologies should give us even more reason to distrust the institutions issuing them.

Years ago, I used to warn students in my classes never to trust anyone who made a point of telling you how honest he was, since he was probably picking your pocket even while he spoke. That applies even more to institutions than to individuals, and most especially to institutions wielding coercive power of any sort.

Even if I trust Governor Hickenlooper personally, I do not trust the State of Colorado, that “authority” for which, as Governor, he spoke his recent official apology for the Sand Creek Massacre of 1864. The State of Colorado has too much to gain, and nothing to lose, by issuing such an apology—too much to gain and too little to lose for me to take it at its word.

Nor was it only the State of Colorado that apologized recently for the role it played with regard to the Sand Creek Massacre. So did two universities. One of them (the University of Denver) is itself in Colorado. However, the other (Northwestern) is in Illinois. The University of Denver and Northwestern University both issued apologies pertaining to the Sand Creek Massacre because the two schools share a common founder: John Evans. Besides going around and founding institutions of higher education, John Evans also preceded John Hickenlooper in the Colorado Governor’s chair—though when Evans was Governor, Colorado was still a Territory, not yet a State. Evans, in fact, was Colorado Territorial Governor back when the Sand Creek Massacre occurred, and the Colorado troops that did all the massacring did so under his final authority. That particular buck stopped with him.

I personally know almost all the faculty members on the University of Denver (DU) committee that researched and wrote the report detailing Evans’s culpability in the massacre, his involvement in which led to the recent DU apology. Over the many years that I taught at DU, I worked with them. I respected and liked them. I still do. I have no doubt whatsoever about their personal integrity, their scholarship, or their ethical commitment. I have read their report, and find it to be a thorough, thoroughly admirable analysis.

Thus, toward the DU committee and their report, I feel no suspicion at all. I trust the committee. I do not, however, trust the University that commissioned their work, nor its pronouncement of regrets with regard to the massacre in which its founder had an important hand. The University has too much to gain, and nothing to lose, by issuing the committee’s report with its official imprimatur, and adding an expression of institutional chagrin at the University’s founder’s complicity in the Sand Creek Massacre.

To an extent, at least, universities are themselves coercive institutions. Even insofar as they are not, however, it was nevertheless to serve such institutions that the University first arose; and ever since it arose the University has continued to provide such service. The University exists for the sake of “authority,” that is, coercive power. We should therefore always be suspicious of universities and their proclamations, most especially when those proclamations tend to cast the University in a good light, as uttering apologies for old wrongs can easily do.

That the University has much to apologize for is a given. The University has committed wrongs aplenty to go around to all the diverse universities that are its individual class-members. There are, for instance, many examples of collusion between the University and such more directly and obviously coercive institutions as the army and the police. Many instances have occurred during my own lifetime, and I will mention only a few of the most egregious.

In 1968 at the University of Nanterre, in the France of De Gaulle’s “Fifth Republic,” students went to the streets protesting the American war in Vietnam, French collusion with that war, especially through the University system itself, and in general the whole market-capitalist fabric that underlay such acts of official violence. What began with those protests at Nanterre soon enough culminated in the largest general strike anywhere ever, one that shut the whole of France down—but which has been glossed over since, in the officially sanctioned memory, as no more than a “student revolt,” one seeking to increase such individual liberties as what used to be called “free love,” in Paris in May ‘68.

Back at the beginning of that whole process, when the protesting students first took to the streets of Nanterre, authorities at the University there called out the cops. As Kristin Ross, an American professor of comparative literature, writes in her excellent study, May ’68 and Its Afterlives (University of Chicago Press, 2002, page 28): “The very presence of large numbers of police, called to Nanterre by a rector, Pierre Grappin, who had himself been active in the Resistance [to the Nazis during the German occupation of France in World War II], made the collusion between the university and the police visible to a new degree.”

Not to be outdone by their French counterparts, administrators at American universities soon followed Grappin’s suit, calling in police or army to quell student protests on their own campuses. That included, most famously, the protests at Kent State University in Ohio in May of 1970, after Nixon and Kissinger unleashed the American invasion of Cambodia. Then-Ohio Governor Jim Rhodes called in the Ohio Army National Guard, who soon killed four unarmed Kent State students and wounded nine others, permanently paralyzing one.

That in turn set off waves of student protests at other universities across the country. Among them was what came to be known as “Woodstock West.” That took place at the same University of Denver that recently apologized for its founder’s culpability for the Sand Creek Massacre. In the spring of 1970, the spring of “Woodstock West,” then Chancellor Maurice Mitchell appealed to then Colorado Governor John Love, who called out the Colorado National Guard to rout the protesting DU students who, eschewing violence, had set up a shanty-town of protest on the DU campus–where I joined the faculty myself a little over two years later, returning to my native Colorado after three years being occupied elsewhere.

I began this current series of three posts—three “Fragments” under the same general title of “Shattering Wholes: Creatively Subverting the University and Other Mobs”—with a quote from an essay by Jean-Claude Milner about the University as an institution in service to coercive power, that power that lays claim to being the “authority” in charge of things at any given time. In his essay Milner does a nice job of pointing out how, as the identity of “authority” changes over time, the University undergoes a change in masters, as well as in how exactly it renders those masters service.

The University as we have come to know it first developed during the Middle Ages. At that time the University arose, as Milner points out, in order to produce more priests for the Christian Church, the authority of the day. Especially with its insistence on celibacy for the priesthood, the Church was constantly in need of more priests, and the job of the University was to provide them.

Then in the modern era, Milner explains, as the authority of the Church waned and came to be replaced by the modern nation-state, so did the needs of authority change. What it needed “more” of was no longer priests. Instead, modern power needed more members of the bourgeoisie. So that became what the University turned out: good bourgeois citizens.

Today, however, things have changed once again. What contemporary authority needs more of today is no longer good bourgeois citizens. What authority needs more of today is broader—and emptier—than that. What the powers that be today need is ever more of what Milner aptly calls “agents of the market,” which above all means good consumers for the products that market markets.

So that is just what the University produces today: all sorts of obedient agents of the global consumer market. As Milner writes (L’Universel en éclats: Court traité politique 3, Verdier, 2014, page 104): “Sellers, buyers, producers, consumers form [what Freud called] a ‘natural mob [or “mass,” “crowd,” “group”: all being possible as translations of the French foule, which Milner uses for Freud’s German term Masse, which is itself most often rendered by “group” in the standard English translation of Freud’s works].’ From now on, that is coextensive with the entirety of humanity. It dedicates itself to a constant growth. To that growth of a mob taken for natural, the artificial mob that is the University wishes to offer its assistance.”

Whichever presumably “natural” mob it may serve at a given time, the obviously “artificial” mob of the University turns all into one, both as assembly of persons and as system of knowledges—of all the “arts and sciences,” to use a term that began to become dated about three decades ago, at least at DU, where I spent almost all of my professorial career, and where the old “College of Arts and Sciences” was rendered defunct by the then-resident University authorities in the mid-1980s. Such turning into one of all persons and knowledges only befits the name of the institution charged with that task: University, from Latin unus, “one,” and versus, the past participle of the verb vertere, “to turn.”

Today, in service to the rulers of the global marketplace, the University turns everyone into a good consumer, and everything into a product to be consumed. That includes, especially, turning all who attend its classes into good, never sated consumers of “information” and—first, last, and above all—faithful, lifelong “consumers of education,” to use the corporate-market jargon favored by up-to-date University administrators today.

At the very end of his classic, Masse und Macht, first published in German in 1960 and translated into English by Carol Stewart as Crowds and Power (London: Victor Gollancz, 1962), Elias Canetti, who received the Nobel Prize for literature in 1981, writes this:

The system of commands is acknowledged everywhere. It is perhaps most articulate in armies, but there is scarcely any sphere of civilized life where commands do not reach and none of us they do not mark. Their threat of death is the coin of power, and here it is all too easy to add coin to coin and amass wealth. If we would master power [by which Canetti, as I read him, means “break its hold on us”] we must face command openly and boldly, and search for means to deprive it of its sting.

For those who are under the command of the University, as I was for my entire adult life until my recent retirement and elevation to emeritus professor status, Milner points the way to heed Canetti’s admonition—an admonition that, if anything, calls for heeding even more loudly today than it did 55 years ago, when Canetti first issued it (or even just 21 years ago, when he died). It is the way of cheerful, apparently compliant subversion indicated in the quotation with which I began this three-fragment series, and by repeating which I will now end it. The lines come from page 114 of his L’Universel en éclats, which most appropriately means “The universal in pieces” (or “in fragments”), in his essay called “De l’Université comme foule,” “On the University as mob”:

The University is not an alma mater, but a milk-cow.   Not just scoundrels can milk it. Neither to believe it, nor to believe in it, nor to serve it, but to serve oneself to it, should be the order of the day. To place in doubt, though it be only by detour, one, several, or all, facile universals—that program is not easy, and not without risk. But being wise doesn’t preclude being sly. It is possible for the wise to shatter the mass.

Shattering Wholes: Creatively Subverting the University and Other Mobs–Another Fragment

“You see, it’s easy for the musicians to feel as if they were serving the conductor. They even call their rehearsals and performances ‘services.’ The very physical structure of the organization—with the orchestra radiating out from a central raised platform and the conductor standing over them—promotes that dynamic. In this kind of an environment, many orchestral musicians feel disconnected.”

“Yes,” I said, nodding. “It’s a perfect setup for ‘Shut up, and do what you’re told.’”

“Exactly. The very context of an orchestra fosters a culture in which the players don’t own the work; the conductor does.”

–Roger Nierenberg

 

There is a difference between trusting someone as a leader, and being dependent on someone. Leadership depends upon trust. What depends upon dependency is something else, however. It is tyranny. Leaders build trust in those they lead. Tyrants build insecurity.

The approach to conducting that Roger Nierenberg models in his Music Paradigm program—as embodied in his novel Maestro: A Surprising Story About Leading by Listening (Portfolio, 2009), from early in which (page 20) the citation above is taken—provides a fine example of genuine leadership. As the citation suggests, the exercise of such leadership may well require working against the grain of the very organizational or institutional setting within which it takes place. That is especially the case whenever that setting is both built upon and designed to foster dependency rather than trust.

Nierenberg makes the connection between leadership—at least the sort he models—and trust explicit in an even earlier passage, near the very start of the novel (page 5). The fictional narrator, a business executive facing a downturn in company business, comes home from work one day and overhears a conversation between his daughter and Robert, her music teacher, about the new conductor in the orchestra to which he belongs. His interest piqued by what he hears, the narrator asks Robert what is so special about the new conductor. Robert replies: “When he’s on the podium it’s as if the differences between us [various musicians in the orchestra] somehow magically disappear, which in turn promotes trust and confidence.” “Trust in him?” the narrator asks. After hesitation, Robert replies: “I guess so. But I think we get the feeling that he trusts us. Somehow that makes us work together so much better. It never seems as if he’s dictating. You always feel like you’re contributing toward something bigger than yourself.”

As Nierenberg depicts them, such conductors, too, are guided by a vision of something bigger than themselves. In the later parts of the brief novel, the maestro of the title repeatedly points to how the good conductor must always be guided by such a vision. In the case of conductors, it is an auditory vision, as it were. That is: a vision of how the score being played here and now by this given orchestra, with all of its diverse parts played by musicians of diverse talents and degrees of accomplishment, can sound, if all the diverse musicians that make up the orchestra can indeed be brought fully to trust themselves and one another, and give themselves over to the piece.

The “eyes” that can see such visions—regardless of whether they be eyes or ears or whatever other organs—are the eyes of love. Leadership guided by such visions, and in turn guiding others to share them, is a loving leadership.   It is creative: it brings into being.

Such leadership is magical.

*     *     *     *     *     *

Mentioning magic, at one point in his book-length analysis of the Harry Potter films, published just this last spring (Harry Potter: À l’école des sciences morales et politiques, PUF, 2014, page 51), Jean-Claude Milner remarks that “one might define magic as an integrally anti-capitalist enterprise. Because it can transform objects without labor and without machines, it makes the material base of capitalism, which is to say surplus value and the power of labor, disappear.”

So conceived, magic—as celebrated not only in the Harry Potter novels and films, which might, because of their lack of significant Christian references, be accused of blasphemy by those defensive about their Christianity,* but also in Tolkien’s Lord of the Rings and other “hobbit” narratives, and even in C. S. Lewis’s blatantly Christian Chronicles of Narnia—is inherently subversive of the ruling power of our endless day. Yet magic, of course, has a power of its own, one that can all too easily be made to undergo a completely non-magical transformation into the snakiest imaginable servant of what the better angels of its nature would have it subvert.

There is a scene towards the beginning of Harry Potter and the Deathly Hallows: Part I—which came out in 2010, the first of the two-part finale to the Harry Potter films—that serves well as a counter-model to the leadership exemplified by Nierenberg’s “maestro.” Voldemort, the Dark Lord of the films, has returned, literally from the other side of the grave, to grasp a second time for unchallenged power over wizards, witches, and “Muggles” (i.e., ordinary mortals) alike. He has called together, at one of their castles, all the heads of the old sorcerer families that supported his return, and at one point during the proceedings he subjects the entire assembly to a demonstration of his power, and of what awaits any of them who may for whatever reason run afoul of it. Voldemort floats the paralyzed but very much still living and conscious body of Charity Burbage, Professor of Muggle Studies at the Hogwarts school of sorcery, who has made the mistake of teaching the equality of Muggles and sorcerers and the legitimacy of marriage between them, above the table where they are all seated. “Dinner!” says Voldemort after speaking a few apt words, therewith unleashing Nagini, the magical snake who is his irreplaceable supporting companion, to devour her as they watch.

The lesson is clear, as Milner notes in his book on the Harry Potter films when he discusses the scene. By his act, writes Milner (pages 107-108), Voldemort lets those who have thought to serve themselves by serving him “see a close-up of what they had chosen to ignore: the power they have worked to put in place accepts no limits to its own exercise.” Such a power will exercise itself, regardless of consequences. By its very nature, it is cruel, such that “even if a cruelty shows itself to have no utility [on its own], that will be no reason not to pursue it to the extreme.” Indeed, “to the contrary,” since the whole point of such egregious acts of cruelty is precisely to display the unlimited nature of the claim to power so exercised. What those who are made to witness such displays have thrust upon their attention is their own impotence in the face of such power. “In a general way,” what Voldemort’s act of wanton cruelty makes clear is that, under such a sovereign power as his, “rational politics will never have the last word, because the last word comes back to Voldemort’s pleasure.”

Milner calls attention to the parallels between the fictional character of Voldemort and the historical one of Hitler. In the case at hand, the parallel is between the “old families” of wizards and witches who help Voldemort rise to power in the story of Harry Potter, on the one hand, and the rich industrialists and other “conservative” elements of German society who did the same for Hitler in the 1930s, on the other. The “old families” in the Potter narratives are enamored of themselves because of what they perceive as the “superiority” their magic powers give them over the Muggles, and protective of the privileges that accrue to them through those magic powers. Just like the rich under the Weimar Republic, merely replacing “magic” with “money” and “Muggles” with “hoi polloi.”

Unfortunately, a sense of superiority easily follows upon the recognition that one has been given special powers, whether those powers be magical, mental, or musical. In turn, that sense of superiority brings in its own train defensiveness against anything perceived as challenging it. Thus, as Milner is quick to point out, the sense of superiority that goes with the recognition that one has unusual talents or gifts is nearly always accompanied by the fear of inferiority—of somehow not being worthy of having the very powers one finds oneself to have.

That is especially so when the special powers at issue are dispensed randomly, without their recipients having done or been anything special to deserve them.   However, that is exactly how it is with most talents, gifts, and powers, of course. They come to those to whom they come by accident, not as a reward for merit.

For instance, in the Harry Potter story Harry’s basic magical capacities—what makes him different from the Muggles who raise him after his parents have been killed during his infancy—are nothing he sought and acquired through his own efforts. He is born with them, inheriting them from his parents. Similarly, physical beauty, musical or other artistic talent, physical prowess, and the intelligence measured by IQ tests, are all based on natural gifts dispensed without regard to antecedent individual merit.

For that matter, so are most of the conditions that account for some individuals becoming aware of their special talents and capacities, whereas others never even come to know they have such talents.   Furthermore, even if circumstances conspire to let one become aware that one has some special gift, they must also conspire to grant one the opportunity to develop that gift. By accident, for instance, a child may learn she has a talent and taste for playing the cello, as our own daughter learned when she was 11. But then it is no less by accident that the same child may be provided with the resources needed to develop that talent and taste—as was, once again, our own daughter, who, when she found she had both a desire and a gift for playing the cello, also found herself living in a reasonably well-funded school system and with a set of reasonably well-paid parents, so that she could be provided the material and educational means to pursue that desire and develop that gift.

Having special powers does not make one somebody special. Such powers do not make those who have them superior to those who don’t. Nevertheless, those so endowed are subject to the temptation to become, as Milner puts it (page 112), “bearers of an ideology of superiority.” The specially gifted “can be seduced, not despite their exceptional talents, but by reason of those talents. Especially if they are ignored or mistreated by their entourage,” as those with special talents often are—again, not despite, but because of, those same talents, we might add, since any gift that makes someone “different” can easily evoke such defensive reactions from those around them, those not so gifted.

Once seduced by such an ideology of superiority, those with special powers can, like Voldemort, also easily succumb to the temptation to exercise those powers over others. They can, like him, come to take pleasure in imposing their will upon others, in the process convincing themselves of their right so to enslave those to whom they have come to consider themselves superior.

However, the underlying, ever-present doubt of their own superiority and their defensiveness about it, grounded in their awareness of having been and done nothing special to deserve their special gifts, continues to carry “a germ of vulnerability” even in the midst of wanton displays of “brutality and terror.” That sense of continuing, inescapable vulnerability sets up such self-styled masters, who delight in subjecting others to their will, to subject themselves in turn to yet others claiming mastery, and indeed to find relief and solace in such submission. For example, Milner writes (p. 113): “Let us suppose that an admired thinker, taken as the greatest of his generation, rallies to an ignorant, belching, hysterical tribune. [Think Heidegger and Hitler, of course!**] Simple folks are astonished; but on the contrary nothing is more normal: this thinker is doubtful of the admiration he knows surrounds him, until it confirms itself in the admiration of which he discovers himself capable.” Thus, imagined superiority doesn’t just lead one to enslave those one takes to be inferior to oneself, it also leads one to let oneself be enslaved in turn.

Against such temptations and perversions of gifts, talents, and powers, Milner suggests, only humility offers any real, final defense. Humility alone would accept gifts as just that—gifts: things for which thanks are to be offered.

Humility is not that easy a thing to come by, however.  It is itself a gift, in fact.

What is more, if that gift of humility itself is given, it is also no easy thing truly to give thanks for such a gift. There is a strong, constant tendency to turn thanks for the gift of humility into its very opposite, making of it no more than an exercise in even greater arrogance—the arrogance of thinking oneself humble, like the righteous man praying at the front of the temple, thanking God for making him so superior to the disgusting tax collector standing far off at the back, beating his breast and weeping in the profession of his guilt.

Above all, the way that one properly gives thanks for a gift is by accepting and using it. However, just what are the uses of humility? Perhaps Harry Potter can show us something of that, as well. At least it may be worth briefly reflecting upon what Milner calls “the Potterian narrative” with that in mind.

Although that is a direction of reflection that Milner himself does not explicitly pursue, what he says provides good clues. That is especially true of a line in the Potter films to which Milner calls his reader’s attention, one that occurs in more than one of the films and is spoken by more than one of the characters, about Harry and to him: “You have your mother’s eyes.” In explanation of that remark, Milner cites (on page 33) what one of the characters in the narrative says about Harry’s mother Lily Potter’s eyes, which is that they had the power to see the beauty in others, most especially when those others were not able to see any in themselves.

The use of humility is to open eyes like Harry’s mother’s, eyes that in turn open others, calling forth—which is to say creating—the beauty that is in them. The gift of humility is given not for the good of the humble themselves, at least not directly. It is given for the good of others. To give proper thanks for such a gift is to use it by practicing seeing through eyes like Lily Potter’s.***

Such eyes are simply the eyes of love—which brings me back to where I started this fragment, and which is also a good place to end it.

* On page 28 of his Harry Potter book, Milner says that so far he is unaware of any such charges being leveled against the Harry Potter stories, but then adds sarcastically that he “does not despair of learning one day that the Potterian narrative has been banned in part for blasphemy.” In these benighted United States, of course, at least a few such charges and such efforts have indeed been made.

** And appropriately so, at least by one reading of Heidegger’s relationship to Hitler and the Nazis—though not the only reading possible, nor necessarily the one finally to be preferred.

*** Lest one think that is an easy thing to do, one might want to go back and watch the Harry Potter films again. Or read Roger Nierenberg’s Maestro.

 

Shattering Wholes: Creatively Subverting the University and Other Mobs—A Fragment

The University is not an alma mater, but a milk-cow.   Not just scoundrels can milk it. Neither to believe it, nor to believe in it, nor to serve it, but to serve oneself to it, should be the order of the day. To place in doubt, though it be only by detour, one, several, or all, facile universals—that program is not easy, and not without risk. But being wise doesn’t preclude being sly. It is possible for the wise to shatter the mass.

— Jean-Claude Milner, “De l’Université comme foule”

 

When I finally sobered up a bit over a quarter of a century ago, one of the things that first hooked me on sobriety was the sheer freedom of it. No one but a happily abstinent alcoholic can experience the joy of the freedom sobriety brings with it.

One way my newfound sobriety freed me was in my driving.

I am not proud of having done so, but during the years of my drinking I often drove “under the influence.” Once I embraced sobriety I no longer had to contend with at least one constant anxiety that accompanies any dedicated drinker who drives after drinking, even if that drinker and driver feels no real anxiety about a possible accident. That is the anxiety that, however attentively one minds the road, one might not detect every lurking unmarked (or even marked) police car, and might get pulled over and risk arrest for drunk driving.

In fact, I got so hooked on the wonderful freedom of not having to care about being pulled over by the police, that I even went through a period of challenging them to pull me over.   Most of the time most of us (drinkers or not) will automatically slow down if we are driving along and suddenly notice a police car sitting somewhere up ahead. We have long grown accustomed to doing that even if we are not exceeding the speed limit at the time. So anxious have we become before the representatives of that which claims authority over us that we often relate to ourselves as criminals even when we are being the best-behaved, most law-abiding citizens. If we are indeed breaking the law by driving “under the influence,” that anxiety is exponentially heightened.

Well, for a while not long after I embraced the life of sobriety, when I would come over a hill on, say, the 50-mile drive along the interstate between my home and my office at the university where I taught, and spy a police car waiting down the road a bit, instead of slowing down I would actually speed up. What did it matter if I got pulled over for speeding? At most, I’d have to pay a few (maybe even quite a few) bucks for it, but so what? What did such trivia matter? It mattered nothing to speak of, so far as I was concerned in my newfound exuberance of abstinence. Because I was at last free of the guilt of being me, I was also free of any concern—or at least any crippling concern—for what “the authorities” might do to me.

Thus, sobriety not only set me free not to drink any more. It also set me free to break the law—with, in effect, a good conscience.

I’m glad to report that soon, so soon that I never even got a single speeding ticket from such doings, it dawned on me that sobriety also set me free not to break the law—and to do that, too, with a good conscience. Indeed, I saw how much more important the freedom not to break the law was than the freedom to break it. That was because the freedom not to break it gave me the chance creatively to subvert it.

One way of putting it is that I saw how obeying the letter of the law could be a skillful means for subverting the law’s whole spirit. That is the spirit of subservience. It is the spirit, that is, of spiritlessness.

The point is not subservience. It is subversion—or, rather, the freedom that makes skillful subversion possible.

*     *     *     *     *     *

Only in the freedom recovery brought me was I able clearly to apprehend something of my preceding bondage, and of just what role my addiction itself had played in it. For the powers that be, and that would have us serve them, addiction is a very socially useful tool. It puts us addicts in service to power despite ourselves, however hard we may try to make ourselves unserviceable. It puts us at the mercy of power. Especially in our consumer society today, addicts make perfect subjects: obedient to the laws even in their very efforts to disobey.

*     *     *     *     *     *

At one point in Gandhi’s Truth (New York: W. W. Norton and Co., 1969) Erik Erikson describes how challenging it was for Gandhi to maintain the vow of vegetarianism he made to his Jain mother when he left India for England, that land of ubiquitous beef and mutton, to study law in London. Erikson writes that, to preserve his vow, Gandhi had to learn to do something more than—and, indeed, completely different from—just resisting the temptation to eat meat. He had to learn, instead, to make not-eating meat itself into a definitive positive goal all on its own. As Erikson puts it (on page 145, emphasis in original), Gandhi “had to learn to choose actively and affirmatively what not to do—an ethical capacity not to be confused with the moralistic inability to break a prohibition.”

As I have pointed out before (in my Addiction and Responsibility, page 143), using that same reference: “The only proof against addiction in general is the sort of active and affirmative choice of ‘what not to do’ that Erikson mentions, the sort of choice involved in Gandhi’s vegetarianism or genuine calls to celibacy.” After noting (on the next page) that abstinence is “the general term for refraining from some common practice or pursuit,” I go on to observe:

What allows us to transform abstinence (whether from meat, from genital sex, from heroin, from child molestation, or whatever) from negative avoidance into positive embrace is this element of self-restraint at the heart of all abstinence. If we abstain from doing something merely because we fear the consequences of doing it, either on practical or moral grounds (Erikson’s “moral inability to break a prohibition” . . .), then we remain at the level of negative avoidance. However, once we begin to abstain from something for the sake of exercising our own self-restraint, we pass over from a negative abstinence to a positive one. From that point on, abstaining becomes its own, ever-growing reward.

 

Then it’s just for fun.

*     *     *     *     *     *

The citation from Milner with which I began this post is from the third of his “short political treatises,” L’universel en éclats: Court traité politique 3 (Verdier, 2014, page 114). The quoted lines are the closing ones of the fourth of six essays in that book. We might translate the title of the essay as “The University as Mob”—in the sense, for example, that organized crime is called “the Mob.” Foule, the French term Milner uses, is the same one used in the standard French translation of Freud’s Massenpsychologie und Ich-Analyse. Freud’s work provides Milner with a basis for his thinking about the University.

The standard English translation of the same work is called Group Psychology and the Analysis of the Ego. Etymologically, the German Masse and the English mass are the same word. Die Massen would be translated by “the masses.” The translation of Freud’s title by “group” can weaken his meaning. The French foule, which can be translated by “crowd,” “mob,” or “mass,” depending on context, comes closer.

What Freud is talking about in the essay at issue, as he tells us there, is not just any grouping of diverse individuals. Rather, what concerns him are assemblages that arise when diverse individuals come to identify themselves with some group, and with others insofar as they also so identify themselves. Above all, in his essay Freud is concerned with such assemblages insofar as they arise from diverse individuals coming to identify with one another because each in turn identifies with one and the same leader, who comes through such identification to take over the role of what Freud calls the “ego ideal” for each individual.

Freud’s own discussion focuses on two “mobs” or “masses” as paradigms: the Army and the Church. Both are examples of what he calls “artificial masses.” An artificial mass, as the name implies, is one that has to be brought about and then maintained by some external force—with all the hierarchical organization and directorial leadership that typically entails. The Nazi Party (NSDAP, from the German for “National Socialist German Workers Party”), the rise to power of which was eventually to drive Freud out of Vienna in 1938, seventeen years after his book about mass psychology and ego-analysis first appeared, would be another example, to go along with the Army and the Church.

Freud distinguishes such artificial masses from “natural masses,” which form spontaneously of themselves and, left to themselves, eventually dissolve. Often natural masses do not last for very long. We could use the mob that stormed the Bastille in 1789 to inaugurate the French Revolution as an example of such a natural mass of relatively brief duration. Another example would be the crowd that congealed in Cairo’s Tahrir Square and overthrew Mubarak in the Arab Spring of 2011.

*     *     *     *     *     *

Just while composing this post, I came across an interesting case of what strikes me as a creative subversion of one “artificial mass” (though we don’t normally think of it that way): an orchestra. On the third page of the arts section of the New York Times for September 18, 2014, is a piece by critic James R. Oestreich about conductor Roger Nierenberg bringing his “Music Paradigm” program to the Lincoln Center for the Arts, before “an audience of nursing directors from New York-Presbyterian Hospital.”

Mr. Nierenberg began (“without apparent irony,” writes Mr. Oestreich) by remarking: “An orchestra is a great place to model organizational dysfunction.” According to Mr. Oestreich, the conductor, 67, had only rehearsed the 26 string players he brought with him for an hour before the performance—of Samuel Barber’s Adagio for Strings—but had otherwise left them unprepared for what was going to happen next, which was that “he continued to rehearse them in public, running through snippets and discussing those with players and audience alike, drawing lessons in leadership from the work of the conductor and the interactions of the players.” In the process, says Mr. Oestreich, Mr. Nierenberg did indeed “model dysfunction,” by showing “how a performance might be adversely affected if the conductor micromanaged with his baton, eyes and gestures, or if the conductor were simply disengaged or fidgety.”

But then he went on to model something else—or at least so it seems to me, though Mr. Oestreich does not himself say this: He modeled a fine, creative alternative to the organizational dysfunction by way of bad leadership that he had already displayed. Instead of having all the players focus their attention on his augustly conducting—albeit potentially micromanaging and/or disengaged and/or fidgety—self, he turned their attention toward one another. Mr. Oestreich writes:

He had the players shift their focus to a particular colleague and attune their playing to complement one another’s. He had them perform with a conductor, then without a conductor and with eyes closed, to show how adept they were at intuitively adjusting to others on their own.

He had them start the piece at different tempos of their choice and alter tempos spontaneously, slowing down, perhaps, in midstream. The musicians were called on to speak as well as play, and audience members were occasionally drafted into action.

The watchword throughout was listening: players listening to one another and to the conductor, but just as much, the conductor listening to the players, how they sound, what they said.

This went on for some 75 minutes. Then the orchestra, with Mr. Nierenberg in place, performed the Adagio complete, beautifully, and departed to huge applause.

Later, toward the very end of his review, Mr. Oestreich quotes these lines from Mr. Nierenberg’s Maestro: A Surprising Story About Leading by Listening (Portfolio, 2009), an attempt to present his Music Paradigm idea in the form of a novel. Mr. Oestreich quotes the maestro of the novel as saying: “Every word I speak, every inflection in my tone of voice, every gesture is directed toward the goal of creating a feeling of community. A community simply acts faster, more intelligently, more creatively and with more joy than a group that is primarily focused on its leader.”

Since even before I ever started my own career as a teacher, I’ve always thought that the job of teachers was to make themselves unnecessary as soon as possible. To me, that’s always been a corollary of Nietzsche’s great line that students who always remain only students are repaying their teachers poorly. Taken at his own word (as well as Mr. Oestreich’s), in his Music Paradigm program Roger Nierenberg is in effect modeling how conductors in turn can—and should—model themselves on what I would call Nietzschean teachers.

What a wonderfully creative way to subvert the orchestra as mob! What a way to lead out of dependence on leaders!

What a way, too, to turn a mob into a community—but more on that in my next fragment.

Pulling Out of the Traffic: The Après-Coups After The Coup (3)

This is the third and final post of a series.

*     *     *     *     *     *

Third After-Shock: Flashes of Imagination

I do not, in the conventional sense, know many of these things. I am not making them up, however. I am imagining them. Memory, intuition, interrogation and reflection have given me a vision, and it is this vision that I am telling here. . . . There are kinds of information, sometimes bare scraps and bits, that instantly arrange themselves into coherent, easily perceived patterns, and one either acknowledges those patterns, or one does not. For most of my adult life, I chose not to recognize those patterns, although they were patterns of my own life as much as Wade’s. Once I chose to acknowledge them, however, they came rushing toward me, one after the other, until at last the story I am telling here presented itself to me in its entirety.

For a time, it lived inside me, displacing all other stories until finally I could stand the displacement no longer and determined to open my mouth and speak, to let the secrets emerge, regardless of the cost to me or anyone else. I have done this for no particular social good but simply to be free.

— Russell Banks, Affliction

 

What a great distinction! Making up vs. imagining! To “make up” is to confabulate, to cover, to lie. So, for example, do those who claim power over others make up all sorts of ways in which the usurpation of such power is necessary “for the common good” or the like. In contrast, to imagine is to make without making up. It is to create, which is to say to open out and draw forth sense and meaning. Making up is telling stories in the sense of fibs and prevarications. Imagining is telling stories in the sense of writing fiction. The former is a matter of machinations and manipulations; the latter is a matter of truth and art.

The passage above comes early in Affliction (on pages 47-48). The words are spoken in the voice of the fictional—which means the imagined—narrator of the novel, Rolfe Whitehouse. Rolfe is telling the story of his brother Wade’s life, and therewith of his own life, too, as he remarks in the passage itself.

*     *     *     *     *     *

A mere symmetry, a small observed order, placed like a black box in a corner of one’s turbulent or afflicted life, can make one’s accustomed high tolerance of chaos no longer possible.

— Russell Banks, Affliction (page 246)

 

Imagine, for example, a big black cube, surrounded by a neon glow, appearing in the sky over Oakland, setting off car horns and causing dogs to bark throughout the city in what soon ceases to sound like sheer cacophony, and becomes a new, hitherto unheard of harmony, in the sounding of which everyone is invited to join, each in each’s own way. Such a thing might all of a sudden make those who witnessed it no longer able to tolerate the chaos in which, they now suddenly see, they had been living till then, without even knowing it.

*     *     *     *     *     *

. . . facts do not make history; facts do not even make events. Without meaning attached, and without understanding causes and connections, a fact is an isolate particle of experience, is reflected light without a source, planet with no sun, star without constellation, constellation beyond galaxy, galaxy outside the universe—fact is nothing. Nonetheless, the facts of a life, even one as lonely and alienated as Wade’s, surely have meaning. But only if that life is portrayed, only if it can be viewed, in terms of its connections to other lives: only if one regard it as having a soul, as the body has a soul—remembering that without a soul, the human body, too, is a mere fact, a pile of minerals, a bag of waters: body is nothing.

— Russell Banks, Affliction (page 339)

 

Ever since my mid-teens I have kept a sort of philosophical journal. That is, I’ve kept notebooks in which I’ve jotted down passages from what I was reading at the time that made me think, along with some of the thoughts they brought to me, or brought me to. For various periods of varied lengths I’ve let that practice lapse since then, but I always pick it up again eventually. For the last few years, there have been no lapses of any duration; and, in fact, my blog posts almost always arise from things I’ve already written more briefly about in my philosophical journals.

On our recent trip to San Francisco to watch our daughter work with The Coup, I carried my current philosophical journal along. Here’s what I wrote one morning while we were still out in the Bay area.

“The Essence of Accident, the Accident of Essence.”

That came to me this morning as the title for a possible blog post in which I’d explore the idea that the essential—or, more strictly speaking, the necessary—is itself essentially accident. That “accident,” the “accidental,” is precisely “essence,” the “essential.”

That goes with the idea of truth as event (and not, as Milner would say, as possible predicate of an event, a pro-position—to give an accidental connection, via my current reading and other experiences, its essential due). It was itself suggested to me by the accidental conjunction of a variety of factors, coming together with/in our trip out here to see [our daughter] perform with “Classical Revolution” (the name of the “group” from which the quartet with her on cello came) at/in conjunction with/as part of The Coup’s performance on Saturday, two days ago. Among those diverse but accidentally/essentially (i.e., as insight-bringing) connected factors are: (1) my reading in Heidegger’s Überlegungen [Reflections: from Heidegger’s so called “Black Notebooks,” which only began to be published this past spring in the Gesamtausgabe, or Complete Edition, of his works] this morning; (2) my ongoing reflection and talk (with [my daughter] and/or [my wife]) about Saturday’s “Coup” event; (3) my noticing yesterday one of the stickers on [my daughter’s] carbon-cello case, which sticker has a quote from Neal Cassady: “Art is good when it springs from necessity. This kind of origin is the guarantee of its value; there is no other.” That third factor was the catalytic one: the “necessity” Cassady is talking about has nothing to do with formal rules or mechanisms, but is precisely a matter of the “accidental,” which is to say be-falling (like a robber on the road), coalescence into a single work/flash/insight of all the diversity of factors that otherwise are chaotically just thrown together as a simultaneous series, as it were. . . . There’s another major factor so far not recorded as such: (4) attending The Coup’s performance at the Yerba Buena Center for the Arts in San Francisco on Saturday. That is the real arch-piece/factor here.

Which brings me to another possible blog post, which [my wife and daughter] yesterday suggested I should do, before the one on accidental essence and essential accidentality suggested itself to me this morning. That is a post about the impact of Saturday night’s event [that is, The Coup’s Shadowbox].

 

As readers of this current series of three posts to my blog already know, of course, I took my wife’s and daughter’s suggestion. But I expanded upon it, doing three posts about my experience of The Coup, rather than just one. And I was also able to incorporate it with my idea for a post on accident and essence, which became my preceding post, the second of the three of this series.

Whether there is any necessity to all that will have to speak for itself. (I can confidently say, at any rate, that it is not art.) All I know for sure is that my journal entry, and this subsequent series of three posts, came about from the accidental conjunction of the four facts I mention in the passage above, taken from my philosophical journal. That entry tells the tale of that conjunction, from which tale alone derives whatever significance or meaning those otherwise isolated particles of my experience may have.

*     *     *     *     *     *

I’ve just recently begun reading Wendy Doniger’s The Hindus: An Alternative History (New York: Penguin Press, 2009), a book that has been on my list to read ever since it first appeared, and that I’m finally getting around to. So far, I’m still in the first chapter, which is an introductory discussion. One of the lines that already especially struck me is this (on page 8): “This is a history, not the history of the Hindus.”

One reason that struck me when I read it was that earlier the same day I’d noted a remark Heidegger makes in his Überlegungen (on page 420 of Gesamtausgabe 94) about the “idols” we worship today (which is still the same day, really, as when Heidegger wrote his remark, back in the Nazi period). Today, by his partial listing, the idols we are most tempted to fall prey to worshipping include: Science (with a capital ‘S’: “‘die’ Wissenschaft”), Technology (with a capital ‘T’: “‘die’ Technik”), “the” common good (“‘die’ Gemeinnutzen”), “the” people (“‘das’ Volk”), and Culture (with a capital ‘C’: “‘die’ Kultur”). In all those cases, idolatry happens when we turn what are themselves really ways or paths of our life in the world with one another—including knowledges (“sciences”), know-hows (“technologies”), shared benefits (“common goods”), and cultivations (“cultures”)—into “‘purposes’ and ‘causes’ and ‘agents,’ all the forms and ‘goals’ of wheeling and dealing.”

When we restrict the term knowledge only to what can be con-formed to the one form we have come to call “science”—the paradigm of which is taken to be physics and the other so-called “natural sciences”—and confine all other forms of knowledge to mere “opinion” (to which, of course, everyone has a right, this being America and all), then we become idolaters. In the same way we fall into idolatry when we try to make the rich multiplicity of varied ways of doing things conform to our idea of some unitary, all-embracing thing we call technology—especially insofar as the idea of technology is connected for us with that of science, to create one great, Janus-faced über-idol. No less do we fall into idolatry when we buy into thinking that there is any such thing as “the” one and only one universal “common good,” which itself goes with the idea that there is some one universal “people” to which we all belong, as opposed to a rich diversity of distinct peoples, in the plural, with no “universal” to rule over them all. In turn, the idea of “culture” as itself some sort of goal or purpose that one might strive to attain—such that some folks might come to have “more” of it than others, for example—turns culture itself, which includes all those made things (made, but not made up: so we might even name them “fictions”) we call science, and technology, and common goods, and the like, into idols. No longer cherished as what builds up and opens out, what unfolds worlds, opening them out and holding them open, such matters get perverted into service to the opposite sort of building, which closes everything down and shuts it away safe.

A few pages later in the same volume of his Überlegungen (on page 423), Heidegger mentions, in passing, “the working of an actual work.” That sounds better in the German: “die Wirkung eines wirklichen Werkes.” To preserve something of the resonance of the line in translation, we might paraphrase: “the effectiveness of an effective work”—keeping in mind that “to work” in English sometimes means “to bring about an effect” (as in the saying, “That works wonders!”). Or, to push the paraphrase even a bit further, we might even say: “the acting of an actual act.”

At any rate, in the remark at issue Heidegger says that “the working of an actual work” is that “the work be-works [or “effects”: the German is “das Werk erwirkt”]—when it works—the transposition [namely, of those upon whom it works] into the wholly other space that first grounds itself through it [namely, grounds itself through the very work itself, an artwork, for instance].”

What I have translated as “transposition” is the German term Versetzung, which comes from the verb setzen, “to place, put, or set.” Heidegger says that the work of the working work—the work of the work insofar as the work works, and doesn’t go bust—is to grab those upon whom it works and to set them down suddenly elsewhere. That is the shock of the work, as he calls it in “The Origin of the Work of Art,” from the same general period. It is the blow or strike, that is, the coup, that the work delivers to us, and in the delivery of which the work delivers us somewhere else. In the face of the work, at least when the working of that work strikes us in the face, then, as Dorothy said to Toto, we are not in Kansas anymore.

Such transposition is indeed shocking. It can be terrifying, in fact; and it is worth remarking that in German one word that can be translated as “to terrify” is Entsetzen, from the same root as Versetzen, “to transpose.” It is challenging to keep ourselves open to such terrifying transposition, such suddenly indisposing re-disposition of ourselves. We tend to close down toward it, trying to bar ourselves against it, withdrawing into safe places. Idolatry is no less than the endeavor so to enclose ourselves within safe places, rather than keeping ourselves open to such transpositions.*

*     *     *     *     *     *

From the beginning of my interest in them, I have known that the politics of The Coup is communist, at least in one good definition of that term (the definition Boots Riley, cofounder of the group, uses). As I have said before in this blog series, I am not certain about the complexion either of The Coup’s erotics or of their scientificity. However, I have now come to have it on good authority that The Coup are culinary anarchists.

The conjunction of the communist slant of their politics with the anarchist bent of their culinary persuasions gives me nothing but esteem for The Coup. On the other hand, that esteem would have been lessened not one bit if I had learned that they were, in reverse, culinary communists and political anarchists. The point is that neither in their politics nor in their food choices are The Coup into following the dictates of who or what lays claim to authority and power.

Adolf Hitler, who was no slouch when it came to claiming authority and power (all in the name of the common good of “das Volk,” of course), is just one of many claimers to authority from Aristotle on down to today who have cited for their own purposes this line from Homer’s Iliad: “The rule of many is not good, one ruler let there be.” Hitler was into that sort of thing. The Coup are into something different.

So is the Yerba Buena Center for the Arts in San Francisco, where my wife and I attended the world premiere of The Coup’s Shadowbox. Making good on the promise I delivered toward the start of my second post of this three-post series on the after-shocks of that attendance, I want to come back to the “Note from the Curators” that opens the brochure I also mentioned there, the one about the Shadowbox premiere. In it, the curators at issue write that YBCA “is in process of coalescing more consistently” with what they call “the energetic and aesthetic trajectories” of “local [artistic] ecologies,” especially the “local dance and music ecologies” of the Bay Area. By engaging in such a process, they write, YBCA, while “identifying itself as a physical place,” is also “aspiring to define itself as something more than brick and mortar.” YBCA is, of course, a physical place, and an imposing one at that, right in the heart of downtown San Francisco. More importantly, however, it “aspires,” as I read the curators’ note, to be a place that gives place to the taking place of works of art. As the two YBCA curators go on to write on behalf of the Center: “We aspire to hold firmly onto our institutional status while softening our institutional walls, locating the joy of less formal performance structure within our particularly austere architecture.” Pursuing that worthy—and, I would say, wonderfully anarchical, chaos-empowering—goal, they go on to write at the end of their note: “We plan to have hella fun** in this enterprise, to reposition participatory sweat as currency, to build momentum through the mechanism of witness, to celebrate the too often unseen, to make serious work of taking ourselves not too seriously while fixing our gaze on the exemplary unsung.”

Given that curators’ note, it strikes me that The Coup is right at home in such a venue as YBCA. So, for that matter, is Classical Revolution, which is the outfit (to use a word that seems to me to be appropriate to the case) from which came the quartet in which our daughter played one of her cellos as part of the world premiere of The Coup’s Shadowbox at YBCA recently—and whose website (http://classicalrevolution.org/about/) I encourage my readers to consult, to check my just expressed judgment.

Nor is YBCA the only place-opening place where the performances of place-makers such as The Coup—and Classical Revolution and the other groups with whom The Coup shared their Shadowbox spotlight at the recent premiere performance—are given a place to take place. Another such place in the Bay Area, one my wife and I also discovered thanks to our daughter during our recent trip to the West Coast, is The Revolution Café in San Francisco’s Mission District (http://www.revolutioncafesf.com/). That, it turns out, is the place where Classical Revolution was founded back in November 2006 by violist Charith Premawardhana, and where performances by Classical Revolution musicians take place every Monday night. There are many more such places, too, not only throughout the rest of the Bay Area, but also throughout the rest of the United States—and, I dare say, the whole, wide world.

To which I can only say: Amen! Which is to say: So be it!

 

 

*In reading Doniger’s words shortly after reading Heidegger’s, one thought that struck me was the question of whether Heidegger himself might not have succumbed to a sort of idolatry regarding “history,” Geschichte in German. Just as it is idolatry to think that there is any such thing as “the” common good or “the” people, isn’t it idolatrous to think that there is any such thing as “the” human story—“History,” with the capital ‘H’—as opposed to multiple, indeed innumerable, human stories, in the plural—“histories,” we might say, following Doniger’s lead? Yet Heidegger throughout his works talks about “‘die’ Geschichte” (which, by the way, also means “story” in German, in addition to “history,” the latter in the sense of “what happened,” was geschieht), not just multiple Geschichten (“histories” or “stories,” in the plural). Perhaps that was at play in his involvement with the Nazis, despite the fact that, as the passage I’ve cited shows, he knew full well that it was mere idolatry to think in terms of “the” people, “das” Volk, as the Nazis so notoriously and definitively did. That, at least, was the question that came to my mind when I read Doniger’s line so soon after reading Heidegger’s. Even to begin to address that question adequately would take a great deal of careful thought, at least one upshot of which would surely be, in fact, that it is necessary to keep the matter open as a true question—rather than seeking the safety of some neatly enclosed, dismissive answer.

** As out of touch with such things as I am, I don’t know if that is a mistake, or a way currently fashionable in some circles (or “ecologies,” if one prefers) of saying “have a hell of a lot of fun.” Whatever!

 

Pulling Out of the Traffic: The Après-Coups After The Coup (2)

Second After-Shock*: Accidental Strokes of Necessity

Art is good when it springs from necessity. This kind of origin is the guarantee of its value; there is no other.

— Neal Cassady

Our daughter has two cellos. To go with them, she has two cello-cases. Both cases are pretty well covered with various stickers and post-ups that have struck her fancy from time to time. When we went to San Francisco recently to watch her play the cello in a quartet representing Classical Revolution, as part of The Coup’s Shadowbox premiere, I noticed a new sticker on one of her cello cases. It had the lines above, from Neal Cassady.

That’s the same Neal Cassady who inhabited the heart of the Beat movement. Later he was not only “on the bus,” but even drove it. He drove the bus—namely, the psychedelic bus filled with Ken Kesey and his Merry Pranksters, the same bus Tom Wolfe eventually rode to fame in 1968 with the publication of The Electric Kool-Aid Acid Test, that foundational text of the “New Journalism” that already long ago became old hat.

I didn’t notice our daughter’s new (to me at least) Neal Cassady sticker till a day or two after we’d attended Shadowbox, and when I read Cassady’s remark it resonated for me with my experience of the concert. That resonance was deepened when, even later, I noticed a brochure our daughter had lying on a bookshelf—an advertisement for the concert we had just attended. Put out by the Yerba Buena Center for the Arts and by Bay Area Now, the brochure started with “A Note from the Curators”—Marc Bamuthi Joseph, YBCA Director of Performing Arts, and Isabel Yrigoyen, Associate Director of Performing Arts—to which I’ll eventually return. That was followed by “A Note from the Artist,” in which an explanation, of a certain sort, was given for titling the concert Shadowbox. It read:

Late one night in the skies over Oakland, a strange object appeared. A cube. Perfectly still, 200 feet in the air. A reflective black box, with a neon glow surrounding it. Thousands of people hurriedly got out of bed, or filed out of bars and house parties, or left the cash register unattended—to stand on the street and gaze at the sight. Dogs barked and howled, louder and louder, in various pitches and timbres until it was clear that there was a consistent melody and harmony to their vocalizations. The cube started trembling, sending out a low vibration that made the asphalt shake, windows rattle, and car alarms across the city go off. Thousands of car alarms went off in a tidal wave of honks, beeps, and bleeps until they formed a percussive rhythm that accompanied the dogs’ beautiful howling. From the cube, a kick drum was heard that tied it together. A spiral staircase descended from the box. Only a few dared enter. What those few experienced has been the subject of several poorly made documentaries, an article in US Weekly, and three half-assed anthropology dissertations. What you will see tonight is a re-enactment of that experience.

I suggest that the “re-enactment” at issue be taken in the sense of an enacting again, as legislators are said to re-enact a law that will otherwise expire, rather than in the more ordinary sense of a miming, an acting out, as a community theatre group might re-enact Tennessee Williams’ A Streetcar Named Desire or Walt Disney’s Dumbo, or as a bunch of court stooges might re-enact a crime in a courtroom at the behest of a prosecuting attorney, let’s say. The Coup’s Shadowbox doesn’t just represent or mime the enactment of community that seems to have proven necessary following the sudden, unaccountable appearance—“fictitiously,” of course (and I’ll eventually return to that, too)—of a strange, black cube hovering in the sky over Oakland one night.

After all, The Coup—although it may be erotically capitalist and even, for all I know, scientifically fascist—is “politically communist,” as Wikipedia has it; and what The Coup is trying to do in Shadowbox, at least if we are to believe (as I do) Coup front-man and co-founder Boots Riley, is to get everybody moving. And although the movement at issue may be a dance, it is a dance that even such dance-dysfunctional still-standers as myself can join into, as I also wrote about last time. It is a political dance.

Which brings me to Jean-Claude Milner.

*     *     *     *     *     *

According to Jean-Claude Milner, ever since the ancient Greeks, politics—which term is itself derived from a Greek word, of course: polis, “city”—has been a hostage of mimesis, which is to say of just the sort of acting-out, of play-acting, that “represents” the action it mimes without re-presenting it, that is, without committing that action again. The mimetic re-enactment of a murder as part of a courtroom trial does not culminate in a second murder. In the same way, politics as the mimetic re-enactment of whatever acts mimetic politics re-enacts does not result in any new enactments of those original acts.

The acts that mimetic politics re-enacts are acts whereby the polis or “city” itself–which for the Greeks meant, in effect, the place where all real, truly human be-ing took place, to use again a way of speaking I favor—is first opened and set up, then kept open and going after that. From the days of the ancient Greeks until relatively recently, in one way or another such decisive political acts were taken not by everyone together, but only by a few.

Of course, those few invariably found it useful to represent themselves as making their decisions for the good of “all.” As Milner points out, however (3rd treatise, page 58**): “It is always in the name of all that each is mistreated.”

For the few who did make the decisions, and then impose them on everybody else, to keep their claim to be acting for the good of all even remotely plausible it always also helped to get “the people”—as we’ve grown long used to calling those the rulers rule over, though the term is supposedly inclusive of both—to believe that they were somehow actually participants in the decision-making itself. Those who were being decided over needed to be kept down on the farm, as it were, regardless of whether they ever got a chance to see Paree or not. The decided-over needed to be given the impression that somehow they were themselves deciders—as President George W. Bush once in/famously called himself.

Milner argues that classically, among the ancient Athenians, the theatre, specifically as staged in the great public performances of tragedies, was the crucial device that permitted the governors to govern those they governed—that is, permitted those who exercised power over others to keep those others in line. It did so by regularly bringing together all those who counted as “the people”*** to witness re-enactments, by actors behind masks, of the heroic deeds that were taken originally to have defined the people as the very people they were (with running commentaries provided by choruses that took over the job of being mouth-pieces for “the people,” who were thereby relieved of any need to speak for themselves). By so convening to witness such re-enactments, the citizenry—the public, the people—actually constituted itself as such.

Furthermore, in being brought openly together as an audience to witness the re-enactments of the original, originating tragic acts of the great heroes of Greek tradition, religion, and mythology, the people were also brought, through empathy, to vicarious identification with those people-defining heroes themselves, and their suffering for the people’s sake. Through such identification the people as audience were allowed to process the terror and pity with which the mimetic re-enactments of tragedy filled them, achieving catharsis, as Aristotle observed. That also helped keep them down on the farm.

Precisely because they were assembled as such an otherwise passive audience for the spectacle of decisive acts re-enacted or mimed in front of them, the people were effectively distanced from the underlying definitive decisions and actions being so mimed. They were allowed to feel a part of what was being re-enacted before them, in the sense of being mimed or “acted out,” while they were simultaneously being distanced from all the underlying genuine action itself. They could marvel and weep as “destiny” unfolded itself in the actions being mimed before them, while being dispensed from the need to undergo that destiny themselves.

As Milner puts it (2nd treatise, page 59): “That distanced object, which in the crucial tradition of tragedy was called destiny, carries in politics, of course, the names: power, state, liberty, justice, or quite simply government.” What is more, he says, in our times the role that used to be played by tragic theatre is now played by—political discussion: the endless expression of opinions compulsively formed about political matters. Such discussion permits the discussants to think that they are really part of the political action, when in fact they are distanced effectively from it by the endless palaver about it. They are merely playing at politics, the way children play at being adults. They are “actors” only in that mimetic sense, not in the sense of decisive agents.

The difference, however, is that today, unlike in ancient Athens, everybody is reduced to the status of such a mere play-actor. That even includes the few who presumably, in the days of the ancient Greeks and for a long while thereafter, used actually to govern—to be genuine agents or “deciders.”

The reality today is simply this: No one decides, decisions just get made. Things of themselves get decided, as though things themselves are dictating the decisions—hence the name of Milner’s first short political treatise, which translates as The Politics of Things—but without anyone doing the actual deciding.

Accordingly, as I already indicated in my previous series of posts on “The Future of Culture,” no possibility of clearly assigning responsibility for decisions remains. Even more importantly, there are therefore no identifiable political pressure points, points where political pressure might be exerted in order to effect significant change. Everything just keeps on chugging along, with no one directing anything, despite how deluded some may still be into thinking they have some impact (for example, the President of the United States, whoever that may happen to be at any given time). The whole thing is no more than a dumb-show. Nobody is in charge of anything.

*     *     *     *     *     *

Sometimes, though, lightning strikes. Or suddenly a huge black cube with a neon glow appears in the sky. The Coup comes, and folks get moving.

*     *     *     *     *     *

Necessity is not causality. For necessity to emerge, in fact, the causal chain must actually be broken. Causality brings inevitability, Nietzsche’s “eternal recurrence of the same”—always the same old same old, never anything truly new under the sun (or the moon and stars at night). The necessity that Neal Cassady says is the only guarantee of real worth in art is not causal inevitability. It is the necessity, the need, of creativity—the need of a pregnancy brought full term finally to burst and bring forth new life.

Any child born of such necessity always comes unexpected. The child always comes as an unexpected, un-expectable surprise, even for parents long filled with the knowledge that they are “expecting.” What can be expected is at most a child, one or another of the innumerably substitutable instances of the class of children, but never this child, the very one who so suddenly, so urgently, so imperiously, insistently comes into the world, and who, once come into it, simply demands, by its very being there, to be named.

Giving a name in the sense of what we call a “proper” name—which is to say “insofar as it is not just another name” (as, for example, dog, Hund, or chien are just three names for the same thing), that is, a name “insofar as it [names] not just anyone,” as Milner writes at one point (3rd treatise, page 75)—always “appears as an obstacle” to whatever or whomever claims to act in the name of “all.” What Milner means in that context is “all” taken in the sense of a closed totality, such as what is ordinarily called a “nation,” for example, the “borders” of which must be secured and protected. The singular, the radically unique, what escapes number, substitutability, and, therewith, any capacity to be “represented” by another, always constitutes a threat to all claims to special authority in the name of any such totalizing “all.”

However, universal quantification, as logicians call it, over “us” or over “human being”—as in “all of us,” or “all human beings”—need not be the move to any such totality as a “nation.” The “all” need not be taken in any such collective sense. Instead, the “all” can be taken in the distributive sense of “each and every single one,” so that “all of us” means each and every one of us as someone who has been given, or at least cries out to be given, a proper name, a name by which that singular one, and that one alone, no other, can be called.

The name by which the singular individual is called, however, calls that one as just that very one, and not as no more than an instance of what that one has in common with a bunch of other ones—for example, being black, white, brown, or yellow, young or old, educated or uneducated, employed or unemployed, American, Mexican, Honduran, Syrian, Iranian, or Indian. The bearer of a proper name—by which I would like above all to mean a name that is truly just that, a genuine name, and not a mere place-holder for a description—is no mere instance of a type, replaceable with any other. The bearer of a proper name is, rather, irreplaceable. (Regular readers of my blog might think of Fluffy, my daughter’s childhood pet guinea pig, for instance.)

*     *     *     *     *     *

As cacophonous as it may initially sound—like the sound of multiple dogs howling and multiple horns blowing in the night—to say so, it is only such an irreplaceable singularity that can be “necessary” in the way Neal Cassady says the authentic work of art is necessary. The necessity of artistic work is the same as the necessity of seizing one’s one and only opportunity to become who one is, when that opportunity suddenly presents itself. It is the same as the necessity of joining the fight against injustice into the reality of which one is suddenly given clear insight, or the necessity of giving oneself over completely to a suddenly awakened love. In short, it is the necessity of selling everything one owns for the sake of pursuing what one is given to see is priceless.

Necessity is order, to be sure. However, it is the order that comes from the unexpected emergence of connection between what theretofore seemed to be no more than a randomly thrown together bunch of discrete, isolated facts. Necessity gives birth to the cosmos. That word comes from the Greek for “ordered whole,” a word that originally meant “ornament,” which is why we also get cosmetic from the same root. Cosmos is the “all” of everything insofar as everything has been brought together into one coherent whole, like an ornament. Cosmos is the ornamental whole of everything emerging out of chaos itself, which is also a Greek word, one that originally meant something like “yawning gap.” Necessity is the origin of that genuine cosmos which is the coming into an ordered whole of chaos itself. Necessity is the origin of that order that is not imposed upon chaos from without, as though by some ruler, but that arises, instead, of necessity, from chaos itself.

Among the same ancient Greeks to whom we owe tragic drama, the emergence of cosmos from chaos was attributed to Zeus. However, Zeus, the god of thunder and the thunder-bolt, was not himself without genesis. King of the gods he might have been, but Zeus himself came from the chaos; and if he came to order the latter, he still came at its bidding, and from within. He came of necessity, which origin demonstrates the authenticity of his glory.

*     *     *     *     *     *

Coming from out of the Greek chaos, Zeus also came from out of the Greek imagination, that same imagination from which sprang all the gods of Greek mythology. The order that the Greek imagination attributed to Zeus was itself anything but an imaginary order. Nevertheless, its origin—and its guarantee of worth, which is also to say its real necessity—lay in the Greek imagination.

Imagine that!

*     *     *     *     *     *

I will try to imagine something of it, in my next post, which will continue—and, I think, end—this present series on the after-coups of The Coup.

* Only while writing this post did it occur to me to call the separate posts of this series not “Parts,” as I had it when I put up the series’ first post a few days ago, but “After-Shocks,” which is much more appropriate. So I went back and edited my first post a couple of days ago. First, I slightly changed the title. Originally, I had used après-coup, French for “after-shock,” in the singular. I turned that into the plural, après-coups. Then I changed the title of the first series’ post itself from “Part One” to “First After-Shock.” Thus, it was only by one of the smaller après-coups of the coup delivered to me by attending The Coup concert that I was coincidentally struck by the need to change my titles a bit. Appropriate indeed!

** Milner has published three “short political treatises,” all brought out in France by Verdier: La Politique des Choses is his Court traité politique 1 (2011), followed by Pour une politique des êtres parlants as treatise 2 (2011) and L’universel en éclats as treatise 3 (2014). I will give references in the text of this post, when needed, by the number of Milner’s treatise, followed by the page number at issue.

*** That is, the “citizens,” which means literally the inhabitants of the “city” as such, the polis, the place where human being took place. So, of course, that left out slaves, women, and all the other others who simply didn’t count—including counting as fully human, since they were not “citizens,” not full-fledged inhabitants of the place human beings as such inhabit. As non-citizens, those other others didn’t need to be brought on board the city boat because they were simply subject to force, with no need to rely on subterfuge—conscious and deliberate or not, who cares?—to make them think they were free even while they were being coerced.

Pulling Out of the Traffic: The Future of Culture (4)

This is the fourth in a series of posts under the same general title.

*     *     *     *     *     *

All sorts of things transpire—but nothing any longer happens—that is, no more decisions fall . . .

— Martin Heidegger, Überlegungen IV (in GA 94), ¶219

 

. . . it’s neither here, nor elsewhere . . .

— Alain Badiou, Images du temps présent (January 14, 2014)

 

I had one opportunity. I had to cut out all ties with the flattening, thoroughly corrupt world of culture where everyone, every single little upstart, was for sale, cut all my ties with the vacuous TV and newspaper world, sit down in a room and read in earnest, not contemporary literature but literature of the highest quality, and then write as if my life depended on it. For twenty years if need be.

But I couldn’t grasp the opportunity. I had a family . . . And I had a weakness in my character . . . that was so afraid of hurting others, which was so afraid of conflict and which was so afraid of not being liked that it could forgo all principles, all dreams, all opportunities, everything that smacked of truth, to prevent this happening.

I was a whore. This was the only suitable term.

— Karl Ove Knausgaard, My Struggle. Book Two: A Man in Love

 

Points of decision are crisis points. “Critical condition” in the medical sense is the condition of a patient who is at the decision point between survival and demise, where the body—with, it is to be hoped, the assistance of the medical staff—must marshal all its resources to sustain life, in the minimal, zoological sense. In the passage cited above, Knausgaard describes how he came to stand at a critical point of decision for or against life in the full, no longer merely biological sense of the term—the truly live-ly sense, we might say, in contrast to the rather deadening sense of bare survival.

Actually, that way of putting it, “a critical point of decision for or against life,” won’t quite work. Rather, Knausgaard describes coming to a point where he was faced with the need and opportunity at last actually and fully to make a decision in the first place and, by and in making it, to become truly alive at last. At that point he was faced with either “choosing to choose,” as Heidegger puts it in Being and Time, or else just going on going on, literally just surviving (“living-through” or “-over”) his own life, having already outlived himself, as it were, by letting his moment of opportunity slip by, in failing or refusing to decide at all.

The way that Alain Badiou puts it in his seminar on “images of the present times” (in the session of November 27, 2003) is that what he calls simply a “point” is “the moment where you make the world [as such and as a whole] manifest in the yes or the no of a decision. . . . It is the manifestation of the world in the figure of the decision.” He adds right away that “[o]ne is not always in the process of dealing with points, thank God!” Badiou, a self-proclaimed atheist proud of his atheistic family heritage, adds that ejaculation of thanks because, as he goes on to say: “It is terribly astringent, this imperative necessity that suddenly the totality of your life, your world, comes to be the eye of a needle of yes or no. Do I accept or do I refuse? That is a point.”

*     *     *     *     *     *

Early in the second of the six volumes of the long story of his “struggle”—Kampf in German, it is worth remembering, as in Hitler’s Mein Kampf—Knausgaard himself has already noted how challenging it is actually to have to decide to live one’s life, rather than just to keep on living through it. Toward the very beginning of that second volume—toward the very end of which comes the passage already cited—he writes: “Everyday life, with its duties and routines, was something I endured, not a thing I enjoyed, nor something that was meaningful or that made me happy.” The everyday life at issue for him during the time he is addressing was one of an at-home husband of an employed wife, and a father taking care of his young children while his wife was at work. Thus, it was a life filled with such things as washing floors and changing diapers. However, Knausgaard immediately tells us that his mere endurance rather than enjoyment of such a life “had nothing to do with a lack of desire to wash floors or change diapers.” It was not that he disdained such activities, or regarded them as beneath him, or anything else along such lines. It had nothing to do with all that, “but rather,” he continues, “with something more fundamental: the life around me was not meaningful. I always longed to be away from it. So the life I led was not my own.”

Knausgaard immediately goes on to tell us that his failure to make his everyday life his own was not for lack of effort on his part to do just that. In the process of telling us of his efforts, he also offers at least one good explanation for giving his massive, six-volume, autobiographical novel the title it bears. “I tried to make it mine,” he writes, “this was my struggle, because of course I wanted it . . .”

He loved his wife and his children, and he wanted to share his life with them all—a sharing, it is to be noted, that requires that one first have one’s life as one’s own to share. Thus, “I tried to make it mine,” he writes, “ . . . but I failed.” That failure was not for lack of effort but because: “The longing for something else undermined all my efforts.”

Conjoining the two passages, one from near the start of the book and one from near its very end, suggests that Knausgaard’s long struggle has been of the same sort as that of St. Augustine, as the latter depicted it in his Confessions. That is, the “struggle” at issue derives from the ongoing condition of not yet having made a real decision, one way or another. In such struggles, the struggle itself comes to an end only in and with one’s finally making up one’s mind, finally coming to a resolution, finally deciding oneself.

In the passage at the start of today’s post, coming more than 400 pages of “struggle” after the one just cited, Knausgaard gives the fact that he “had a family” as the first reason he “couldn’t grasp” the “one opportunity” that he says he had. Nevertheless, what is really at issue cannot be grasped in terms of choosing between two equally possible but conflicting options, either living the life of a family man or living the life of an artist. Rather, what is at issue is something only Knausgaard’s subsequent remarks really bring to focus: what kept him from seizing his sole opportunity was nothing but himself. It was not the love of his family that hindered him. It was the love of his own comfort—or at least the desire not to disturb his own comfort by disturbing the comfort of others nearby.

I can identify! It was really not my love of my daughter that tripped me up when her childhood pet, Fluffy the guinea pig, died one day, causing me to tempt my own daughter to betray her love for her pet by rushing out to buy a replacement, as I recounted in my preceding post. I did love my daughter, to be sure, as I still do. But, as I already revealed when first discussing the episode, what tripped me up was really not my love for her. Rather, it was my discomfort with my own discomfort over her discomfort over Fluffy’s death. I betrayed myself out of love of my own comfort, not out of love for her. So my betrayal as such was not done out of any genuine love at all; it was done just out of fear—the fear of dis-comfort. That is how clinging to one’s precious comfort always manifests itself, in fact: in Knausgaard’s case no less than my own.

Now, there may truly be cases in which points of decision manifest as what we might call “Gauguin moments.” That is, there may really be cases in which, in order to make one’s life one’s own, one must indeed leave behind one’s family and one’s home and go off into some other, far country, as Gauguin did in the 19th century for the sake of his art (or as Abraham does in the Bible, though not, of course, for the sake of art).

What truly marks points as points of decision, however, is not a matter of the difference in content between two equally possible life-options (let alone the romantic grandiosity of the choices suggested by Gauguin’s, or Abraham’s, model). What defines them (including in such dramatic examples) is just that they are points at which one is confronted with the necessity at last truly to decide, that is, to resolve oneself—to say yes or no to one’s world, and one’s life in it, as a whole, as Badiou puts it.

*     *     *     *     *     *

German for “moment” is Augenblick—literally, “the blink of an eye.” Heidegger likes to note that etymologically Blick, an ordinary German word for look, glance, view, or sight, is the same as Blitz, the German for lightning-flash, lightning-bolt. Points of decision, in the sense that I am using that expression, are moments that proffer what Heidegger calls an “Einblick in das, was ist,” an in-sight or illuminating in-flash into that which is. Points of decision are moments of illumination of what is there and has been there all along, though we are only now, in a flash, given the opportunity to see it. They are those points in our lives that offer us the chance to make our lives our own: to come fully alive ourselves—at last and for firsts.

In common with Blitzen in the everyday sense of lightning-bolts, moments or points of decisive in-sight/in-flash sometimes come accompanied by loud thunderclaps, or the equivalent. God may come down and talk to us as God did to Moses from the burning bush, or come in a whirlwind, or with bells and whistles. At least as often, however, moments or points of decision come whispering to us in a still, small voice, one easily and almost always drowned out by all the noise of the everyday traffic with which we everywhere surround ourselves (even if only in the space between our ears), for very fear of hearing that voice . . . and being discomfited by it.

Points of decision may break the surface of our everyday lives—those lives that, like Knausgaard, we endure without enjoying—as suddenly and dramatically as the white whale breaks the surface at the end of Melville’s Moby Dick. Or they may come upon us slowly, and catch us all unawares, such that we waken one morning and realize that for a long while now we have not been in, say, Kansas any longer, but have no idea of just where and when we might have crossed the border into whatever very different place we now find ourselves.

All such differences make no difference, however. What counts is only that we come to a moment, a point of clarity, where we are struck, as though by a bolt of lightning, with the realization that we do indeed have a choice, but only one choice. We have a choice, not in the sense that we can pick between two different options, as we might pick between brands of cereal to buy for our breakfast. Rather, we have a choice in the sense that, like Knausgaard, we realize that we do indeed have one and only one opportunity, which we can either take, or fail to take. We are faced with the choice, as the Heidegger of Being and Time put it, of choosing to choose, choosing to have a choice to exercise, rather than continuing just to let ourselves live through our own lives, without ever having to live them. The choice is either to live, or just to go on living.

An acquaintance of mine once came to such a point of decision in his own life, and he did indeed decide to make his life his own at that point. When asked about it, he says that up until that point it had always been as though his life was running on alongside him, while he was just sort of standing there observing it. What his moment of decision offered him, he says, was precisely the opportunity to “take part in” his own life, rather than just continue to let it run itself next to him. In a certain sense, he may have “had” a life up to that point, but only at that point did he come to live it himself.

*     *     *     *     *     *

In The Politics of Things (La politique des choses, first published in France in 2005 by Navarin, then in a slightly revised, updated edition in 2011 by Verdier), contemporary French philosopher Jean-Claude Milner traces the global processes driving inexorably, in what passes for a world in what passes for today, toward eliminating the very possibility of there being any genuine politics at all. That goal is being achieved above all through the development of ever more new techniques of “evaluation,” and the ubiquitous spread of processes of such evaluation into ever more new dimensions of individual and collective life. (In the United States, we might add, the deafening demand for incessant development and promulgation of ever more new ways and means of evaluating everything and everyone is typically coupled with equally incessant palaver about the supposed need for “accountability.”)

What Milner calls “the politics of things” aims at what he calls “government by things.” At issue is the longstanding global drive to substitute what is presented as the very voice of “things” themselves—that is, what is passed off for “reality,” and its supposed demands—for any such messy, uncertain politics or government as that which requires actual decisions by human beings.

Thus, for example, “market mechanisms” are supposed to dictate austerity according to one set of “experts,” or deficit spending according to another set. Which set of experts prevails, and which way their winds may blow, doesn’t really make any difference, however. What counts, as Milner says, is just that it be one set or another, and one direction or another.

That’s because, he observes in his fourth and final chapter, “Obedience or Liberties” (in French, “Obéissance ou libertés”), the real aim of the whole business is simply the former: sheer obedience—what is indeed captured in the English word “obeisance,” derived from the French term. He writes (page 59) that, “contrary to appearances, the government of things does not place prosperity at the summit of its preoccupations; that is only a means to its fundamental goal: the inert tranquility of bodies and souls.”

To achieve that goal, the government of things plays upon human fears—two above all: the fear of crime, and the fear of illness. Under the guise of “preventing” crime and/or illness, the government of things reduces us all to un-protesting subservience. We prove always willing to do just as we’re told, as unpleasant as we may find it, because we have let ourselves be convinced that it is all for the sake of preventing crime or illness.

I will offer two examples of my own.  The first is how we line up docilely in long queues in airports, take our shoes (and, if requested, even our clothes) off, subject ourselves to pat-downs and scan-ups, delays and even strip-searches—all because we are assured that otherwise we run the risk, however slight, of opening ourselves to dreaded terrorist attacks. My second example is how we readily subject ourselves to blood-tests, digital rectal examinations, breast X-rays, hormone treatments, and what not, all the tests, checks, and re-checks that our medical experts tell us are necessary to prevent such horrors as prostate or breast or colon or skin cancer, or whatever. We readily subject ourselves to all these intrusive procedures, only to be told sooner or later by the very same experts that new evidence has changed their collective expert thinking, and that we must now stop subjecting ourselves to the same evaluation procedures, in order to prevent equally undesirable outcomes. In either case, we do just as we’re told, without complaint.

We do as we’re told, whatever that may be at the moment, to prevent crime and/or illness because, as Milner writes (page 61): “Under the two figures of crime and illness, in effect one and the same fear achieves itself, that one which, according to Lucretius, gives birth to all superstition: the fear of death.” In fact, we are all so afraid of death and so subject to manipulation through that fear that we fall easy prey to the “charlatans,” as Milner appropriately calls them (on page 62), through whom the government of things seeks to universalize what amounts (page 64) to the belief in Santa Claus (Père Noël in France, and in Milner’s text)—a belief, finally, that “consists of supposing that in the last instance, whether in this world or in the next, the good are rewarded and the evil are punished.”

The government of things strives to make everyone believe in such a Santa Claus “with the same effect” that it fosters the development and universalization of techniques and procedures of evaluation: the effect of “planetary infantilization.” Furthermore:

One knows that no Santa Claus is complete without his whip. Indefectible solidarity of gentle evaluation and severe control [our American Santa making up his lists of who’s naughty and nice, then rewarding the latter with goodies and punishing the former with lumps of coal, for instance]! The child who does not act like a child [by being all innocent and obedient, sleeping all nice and snug in her bed, with visions of sugar-plums dancing away in her head] is punished; that is the rule [and we must all abide by the rules, mustn’t we?]. All discourse not conducive to infantilization will be punished by the evaluators, that is the constant. Among its effects, control also carries this one: the promise of infantilization and the initiation of transformation into a thing.

After all, the desideratum is a government not only of things, but also by things and for things (pace Lincoln—at least if we grant him the charity of thinking that’s not what he really meant all along).

In the closing paragraphs of his little book (pages 66-67), Milner issues a call for resistance and rebellion against all such pseudo-politics and pseudo-government of things, and in affirmation of a genuine politics. It is a call, quite simply, for there to be again decision.

“If the name of politics has any meaning,” Milner writes, “it resolutely opposes itself to the government of things.” In rejecting the pretense of a politics of things, real politics “supposes that the regime of generalized subordination can be put in suspense.” A politics worthy of the name can emerge only if at last an end is put to all the endless chatter about how we all need to show “respect for the law,” “respect for authority,” and the like, all of which is just code for doing what we’re told.

Such suspension of generalized subordination and end of infantilizing chatter may not last long: “Maybe only for an instant . . .” But that instant, that moment, that blink of an eye, “that’s already enough, if that instant is one of decision. What’s needed is that there again be decision.”

That’s all that’s needed, but that’s everything. As Milner writes, “politics doesn’t merit the name unless it combats the spirit of subordination. One doesn’t demand that everyone be generous, or fight for the liberties of everyone; it is quite enough if each fights for her own freedom.” The return of a genuine politics requires that we stop relinquishing our own choices to “the order of things.” It requires, instead, “[t]hat at times we decide for ourselves . . .”

There is no future of politics otherwise. Nor, without decision, is there any future of culture in any form, be it political, artistic, philosophical, or whatever. But that just means that, without decision, there really is no future at all.

*     *     *     *     *     *

I intend my next post to be the last in this current series on “Pulling Out of the Traffic: The Future of Culture.”

Pulling Out of the Traffic: The Future of Culture (2)

This is the second in a series of posts under the same general title.

*     *     *     *     *     *

In the New York Times for Thursday, June 26 of this year—which was also the day I put up the post to which this one is the sequel—there was a news-piece by Mark Mazzetti under the headline “Use of Drones for Killings Risks a War Without End, Panel Concludes in Report.” The report at issue was one set to be released later that same morning by the Stimson Center, “a nonpartisan Washington think tank.” According to Mr. Mazzetti’s opening line, the gist of the report was that “[t]he Obama administration’s embrace of targeted killings using armed drones risks putting the United States on a ‘slippery slope’ into perpetual war and sets a dangerous precedent for lethal operations that other countries might adopt in the future.” Later in the article, Mr. Mazzetti writes that the bipartisan panel producing the report “reserves the bulk of its criticism for how two successive American presidents have conducted a ‘long-term killing program based on secret rationales,’ and on how too little thought has been given to what consequences might be spawned by this new way of waging war.” For example, the panel asks, suppose that Russia were to unleash armed drones in the Ukraine to kill those they claimed to have identified as “anti-Russian terrorists” on the basis of intelligence they refused to disclose for what they asserted to be issues of national security. “In such circumstances,” the panel asks in the citation with which Mr. Mazzetti ends his piece, “how could the United States credibly condemn Russian targeted killings?”

Neither Mr. Mazzetti nor—by his account at least—the panel responsible for the Stimson Center report bothers to ask why, “in such circumstances,” the United States would want to “condemn” Russia for such “targeted killings” on such “secret rationales.” It is just taken for granted that the United States would indeed want to condemn any such action on the Russians’ part.

That is because, after all, the Russians are among the enemies the United States must defend itself against today to maintain what, under the first President Bush, used to be called “the New World Order”—the order that descended by American grace over the whole globe after the “Cold War,” which itself characterized the post-war period following the end of World War II. Today is still just another day in the current “post post-war” period that set in after the end of the Cold War—as Alain Badiou nicely put it in 2002-2003, during the second year of his three-year monthly seminar on Images of the Present Times, just recently published in France as Le Séminaire: Images du temps présent: 2001-2004 (Librairie Arthème Fayard, 2014).

It is really far too late on such a post post-war day as today to begin worrying, as the Stimson panel penning the report at issue appears to have begun worrying, about entering upon the “slippery slope” that panel espies, the one that slides so easily into “perpetual war.” For one thing, what’s called the Cold War was itself, after all, still war, as the name says. It was still war, just “in another form,” to twist a bit a famous line from Clausewitz. Cold as that war may have been, it was still but a slice of the same slope down which the whole world had been sliding in the heat of World War II, which was itself just a continuation of the slide into which the world had first swiftly slipped at the beginning of World War I.

Let us even go so far as to assume that the great, long, European “peace” that ran from the end of the Franco-Prussian War in 1870 all the way down to 1914, one hundred years ago this summer, when it was suddenly interrupted by a shot from a Serbian “terrorist” in Sarajevo, was peace of a genuine sort, and not just the calm of the proverbial gathering storm. Even under that assumption, peace has never really been restored to the world again since the guns began firing in August of that same year, 1914, if the truth is to be told. Instead, the most that has happened is that, since then, from time to time and in one place or another there has occurred a temporary, local absence of “hot” war, in the sense of a clash of armed forces or the like. The guns have just stopped firing for a while sometimes in some places—in some times and places for a longer while than in others.

So, for example, even today, a quarter of a century after the end of the post-war period and the beginning of the post post-war one, the western and west-central European nations have remained places where “peace,” in the minimal, minimizing sense of the mere absence of “active hostilities,” has prevailed. Of course, elsewhere, even elsewhere in Europe—for example, in that part of Europe that during part of the time-span at issue was Yugoslavia—plenty of active hostilities have broken out. In many such cases (including the case of what once had been Yugoslavia) those episodes have often and popularly been called “wars,” of course.

Then, too, there have been, as there still are, such varied, apparently interminable enterprises as what Lyndon Johnson labeled America’s “war on poverty,” or what Richard Nixon labeled the American “war on drugs.” In cases of that sort, it would seem to be clear that we must take talk of “war” to be no more than metaphorical, in contrast to cases such as that of, say, America’s still ongoing “war in Afghanistan,” where the word would still seem to carry its supposedly literal meaning.

Another of the wars of the latter, “literal” sort is the one that began with the American invasion of Iraq on March 20, 2003. As it turned out, that particular war broke out right in the middle of the second year of Badiou’s seminar on “images of the present times.”  In fact, the hostilities in Iraq started right in the middle of some sessions of his seminar in which Badiou happened to be addressing the whole issue of “war” today, during our “post post-war” period—as though tailor-made for his purposes.

In his session of February 26, 2003, less than a month before the start of hostilities in Iraq, Badiou had begun discussing what war has become today, in these present times. He resumed his discussion at the session of March 26—following a special session on March 12, 2003, that consisted of a public conversation between Badiou and the French theatre director, Lacanian psychoanalyst, and philosopher François Regnault. President George W. Bush had meanwhile unleashed the American invasion of Iraq.

In his session of February 26, 2003, Badiou had maintained that in the times before these present times—that is, in the post-war period, the period of the Cold War—the very distinction between war and peace had become completely blurred. Up until the end of World War II, he writes, the term war was used to mark an “exceptional” experience. War was an “exception” in three interconnected dimensions at once: “a spatial exception, a temporal exception and also a new form of community, a singular sharing, which is the sharing of the present,” that present defined as that of “the war” itself.

We might capture what Badiou is pointing to by saying that, up till the end of World War II and the start of the Cold War, war was truly a punctuating experience. That is, it was indeed an experience in relation to which it did make clear and immediate sense to all those who had in any way shared in that experience to talk of “before” and “after.” It also made sense to distinguish between “the front” and “back home.” Some things happened “at the front,” and some “back home”; some things happened “before the war,” and some only “after the war.” And war itself, whether at the front or back home, and despite the vast difference between the two, was a shared experience that brought those who shared it together in a new way.

During the Cold War, however, all that changed, and the very boundaries of war—where it was, when it was, and who shared in it—became blurred. Badiou himself uses the example of the “war on terror” (as George W. Bush, who declared that war, was wont to call it, soon accustoming us all to doing so) that is still ongoing, with no end in sight. The war on terror is no one, single war at all, Badiou points out. Instead, the term is used as a cover-all for a variety of military “interventions” of one sort or another on the part of America and—when it can muster some support from others—its allies of the occasion. Indeed, the term can be and often is easily stretched to cover not only the invasions of Afghanistan and Iraq under the second President Bush but also the Gulf War unleashed against the same Iraq under the first President Bush, even before the war on terror was officially declared—and so on, up to and including the ever-growing use of armed drones to kill America’s enemies wherever they may be lurking (even if they are Americans themselves, though so far—at least so far as we, “the people,” know—only if those targeted Americans could be caught outside the homeland).

So in our post post-war times there is an erasure of the boundary between war and peace, a sort of becoming temporally, spatially, and communally all-encompassing—we might well say a “going global”—of the general condition of war. Coupled with that globalization of the state of war there also occurs, as it were, the multiplication of wars, in the plural: a sort of dissemination of war into ever new locations involving ever new aspects of communal life. Wars just keep on popping up in more and more places, both geographically and socially: the war in Afghanistan, the war in Iraq (just recently brought back again—assuming it went away for a while—by popular demand, thanks to ISIS), the war in Syria, the wars in Sudan, Nigeria, Myanmar, Kosovo, the Ukraine, or wherever, as well as the wars against poverty, drugs, cancer, “undocumented”/“illegal” immigration, illiteracy, intolerance, or whatever.

At the same time, this globalization of war and proliferation of wars is also inseparable from what we might call war’s confinement, or even its quarantine. By that I mean the drive to ensure that wars, wherever and against whatever or whomever they may be waged, not be allowed to disrupt, damage, or affect in any significant negative way the ongoing pursuit of business as usual among those who do the war-waging. (The most egregious example is probably President George W. Bush in effect declaring it unpatriotic for American consumers not to keep on consuming liberally—including taking their vacations and driving all over lickety-split—in order to keep the American economy humming along properly while American military might was shocking and awing the world in Baghdad and the rest of Iraq.)

Thus—as Badiou puts it in his session of March 26, 2003—in league with the expansion of war into global presence and the random proliferation of wars goes a movement whereby simultaneously, among the wagers of war, “[e]verything is subordinated to a sort of essential introversion.” That is a reference above all, of course, to America, the only superpower that remained once one could no longer go back to the USSR. On the one hand, as both Badiou and the Stimson report with which I began this post indicate, the American government does not hesitate to claim the right to “intervene” anywhere in the world that it perceives its “national interests” to be at stake, no matter where that may be. It claims for itself the right to make such interventions whenever, against whomever, and by whatever means it judges to be best, and irrespective of other nations’ claims to sovereignty—even, if need be, against the wishes of the entire “international community” as a whole (assuming there really is any such thing). Yet at the same time such interventionism is coupled essentially with a growing American tendency toward “isolationism.”

This counter-intuitive but very real American conjunction of interventionism and isolationism is closely connected, as Badiou also points out, to the ongoing American attempt to come as close as possible to the ultimate goal of “zero mortality” on the American side, whenever, wherever, against whomever, and however it does conduct military interventions under the umbrella of the claimed defense of its national interests, as it perceives them, on whatever evidence it judges adequate. That is best represented, no doubt, by the aforementioned increasing American reliance on using unmanned, armed drones to strike at its enemies, a reliance that began under the Bush administration and has grown exponentially under the Obama administration.

Furthermore, the drive toward zero war-wager mortality is coupled, in turn, with another phenomenon Badiou addresses—namely, what we might call the steady escalation of sensitivity to offense. The more American power approaches what Badiou nicely calls “incommensurability,” and the nearer it comes to achieving the zero American mortality that goes with it, the less it is able to tolerate even the slightest slight, as it were. Rather, in such an affair—as he says in the session of March 26, shortly after the American attack on Iraq under the second President Bush—“where what is at stake is the representation of an unlimited power, the slightest obstacle creates a problem.” Any American deaths at all, or any remaining resistance, even “the most feeble, the worst armed, . . . the most disorganized,” is “in position to inflict damage to the imperious power that it faces.” As there is to be zero American mortality, so is there to be zero resistance (of whatever origin, including on the part of Americans themselves).

*     *     *     *     *     *

All these interlocked features belong to what we have come to call “war” today. Or rather, the situation today is really one in which the very notion of war has come to be entirely flattened out, as I would put it. War itself has ceased to be any distinctive event—anything “momentous,” properly speaking: marking out a clear division between a “before” and an “after,” such that we might even speak of the “pre-war” world and the “post-war” one. That is what Badiou means by saying that we live today in the “post post-war” period. It is a strange “period” indeed, since there is, in truth, no “point” at all to it—either in the sense of any clearly defined limit, or in the sense of any clearly defined goal, I might add—which is what I had in mind in my earlier remark that war today has ceased to be any truly “punctuating” experience.

In one of my posts quite a while ago, I wrote that, in line with contemporary Italian philosopher Giorgio Agamben’s thought about sovereignty and subjectivity, an insightful hyperbole might be to say that it had been necessary to defeat the Nazis in World War II in order that the camp-system the Nazis perfected not be confined to Nazi-occupied territory, but could go global—so the whole world could become a camp, in effect, and everyone everywhere made a camp inmate subject to being blown away by the winds of sovereignty gusting wherever they list.

Well, in the same way it might be suggested that the whole of the long period of preparation for, and then eventual outbreak and fighting of, the (“two”) World War(s), as well as the whole post-war period of Cold War that followed, was just the long ramp-up necessary for the true going global of war in our post post-war period.  That is, the whole of the unbelievably bloody 20th century, ushered in by the whole of the 19th, back at least to the French Revolution of the end of the 18th, can be seen as nothing but the dawning of the new, ever-recurring day of our present post post-war, unpunctuated period.

Indeed, war today has become so enveloping spatially, temporally, and communally, all three, that it is no longer even perceivable as such, except and unless it breaks out in some ripple of resistance somewhere, by some inexplicable means. Whenever and wherever and from whomever, if anywhere any-when by anyone, the power into whose hands the waging of war has been delivered suffers such an offense against it, no matter how slight the slight, then the only conceivably appropriate response is, as the old, post-war saying had it, to “nuke ‘em.”

Furthermore, since offenses are in the feelings of the offended, none of us, “the people,” has any assurance at any time that we will not, even altogether without our knowingly having had any such intent, be found to have done something, God knows what, to offend. If we do, then we may also come to be among those getting nuked (or at least deserving to be)—probably by an armed drone (maybe one pretending to be delivering us our latest Amazon.com order).

*     *     *     *     *     *

By now, even the most patient among my readers may be wondering what this whole post, devoted as it is to discussion of the meaning of “war” today, has to do with “the future of culture,” which is supposed to be the unifying topic in the entire current series of posts of which this one is supposed to be the second. That will only become evident as I proceed with the series—though perhaps it will not become fully evident until the whole series draws together at its close. At any rate, I will be continuing the series in my next post.

The Traffic in Trauma: Learning Whom to Hate

Jean-Paul Sartre once wrote in recommendation of a book* that it had the great merit of teaching the young whom to hate.  That is a lesson still well worth learning, not only for the young but for all ages.

Just the other day I read a passage in a newly published book by a well-known author that, under the guise of teaching that same lesson, actually teaches anything but.

Out in Colorado where I live, we were just recently treated to the news of the retirement of Grayson Robinson, the Sheriff of Arapahoe County, who not long before retiring presided over the various press conferences concerning the shootings just this last December at Arapahoe High School.  Sheriff Robinson refused throughout all such proceedings to use the name of the shooter, whose final shot took his own life, lest by using his name he grant the shooter a celebrity that, even posthumously, Sheriff Robinson wanted no part in granting.  (Although he restrained himself from using the young man’s name, the Sheriff did not refrain from labeling the shooter “evil”—a point I will not pursue further, though it certainly deserves careful reflection, above all about who is served by such talk, and who is not.)  I will take at least one page from Sheriff Robinson’s own book.  I will not name the work in which I read the passage I want to discuss, the one I just read recently, the one that fails to teach the lesson that Sartre praised Nizan’s novel for teaching.  Nor will I name the author.  I see no good reason, either humanitarian or selfish, for doing so.

At any rate, the passage at issue comes at the end of a discussion—itself to the point and worthwhile, in my judgment—of how offensive, indeed how truly obscene, is the normalization of torture in the relatively recent and, for the most part, positively received film Zero Dark Thirty, which tells the back-story of the long trail of sleuthing that eventually culminated in the American killing of Osama bin Laden.  The author then goes on to mention the linguistic sleight-of-hand wherein the Bush administration, long before that actual killing, replaced the term ‘torture’ with the expression ‘enhanced interrogation techniques,’ to classify and talk about such then (at least) standard American practices as water-boarding those from whom the American government hoped to extract information thought to be of possible use in pursuit of what that government defined to be America’s own self-interest.

So far, so good:  To that point I have no objections.  However, I do object to what the author at issue goes on to do, which is to posit an analogy—in fact, not just an analogy, but also an identity.  He compares the verbal substitution of ‘enhanced interrogation techniques’ for ‘torture,’ on the one hand, with the substitution of ‘physically challenged’ for ‘disabled,’ on the other.  Then he asserts that both substitutions are, in fact, just two different instances of one and the same underlying malady, which he characterizes, following what has become an almost universally dominant current linguistic fashion, as the malady of “Political Correctness,” to adopt the author’s own device of capitalizing the two words of that expression.

Nothing could be more politically correct today than such usage of the buzz-word “Political Correctness.”

Communication is not coercion.  Communication is co-mund-ication, as I wrote in my preceding post—from the Latin mundus, “world.”  That is, it builds, in sharing, a shared world.  In contrast, coercion calls a halt to sharing.  It imposes limits, barriers, and blockages to communication, stopping it, or at least trying to.  It breaks apart the world.  Words, phrases, or in general expressions have what is deserving of being called “meaning” or “sense” only in the stream of communication, to paraphrase a line from Wittgenstein.  Taken out of that stream and pressed into forced service as implements of coercion, they lose all meaning and cease to make any sense, properly speaking (and by “proper” here, I mean “appropriate to ongoing communication,” since what expressions as such are for is just that).

Long ago now, the term ‘politically correct’ was simply gutted of all meaning.  It was hollowed out completely.  All that was left was the mere verbal shell, which could then be filled with something other than sense or meaning—filled, namely, with coercive force, used to accomplish a no longer communicative but now anti-communicative, purely coercive purpose.  In short, ‘political correctness’ was replaced by ‘Political Correctness,’ to adopt my passage’s author’s convention.

Before it underwent evisceration of sense, of saying power, and was stamped into ‘Political Correctness,’ a mere tool of coercive power, the term ‘politically correct’ would have meant that which was required to maintain political viability in the concrete circumstances under discussion. Accordingly, just what sort of talk or action might have been politically correct at any given time and setting would have been a function of the political conditions and circumstances of that time and setting.  The term would not have named any one, single style of speech and action, whether of the left, of the right, or of the middle.  In one case—for example, America during the McCarthy era—espousing left-wing political causes might be tantamount to committing political suicide, whereas the same speech and action in another case—perhaps in the Soviet Union during the same era—would have been required to exert any political effectiveness.  What would have been “politically correct” would have varied according to the specifics of the given situations to which the term was applied.

The moment came, however, when the term ‘politically correct’ ceased to have any meaning within the stream of conversation, and instead was shanghaied by the American right-wing for use as a quick and handy label by which to dismiss and ridicule one specific sort of communication.  The sort of communication at issue is any that tries to address instances in which our everyday ways of talking themselves embody extra-communicative—indeed, anti-communicative, which is to say world-destroying, rather than world-building through world-sharing—elements that function coercively, and do so at the greatest price to those who can least afford to pay for it.  That is, the term was co-opted by the American right wing and made to apply exclusively to what my dictionary, as its sole entry for the expression ‘political correctness,’ characterizes this way:  “the avoidance, often considered as taken to extremes, of forms of expression or action that are perceived to exclude, marginalize, or insult groups of people who are socially disadvantaged or discriminated against.”

Thus does even The New Oxford American Dictionary itself succumb to the reigning linguistic coercion, not even bothering to mention the meaning that the same expression would once have had, prior to its capture and torture precisely by those who “consider” the “avoidance” at issue “often” to be “taken to extremes”!  Just how “often,” a thoughtful reader might ask?  Well, for those who abducted the expression and pressed it into slavery to serve their own interests in the first place, the answer is:  Always!  Whenever such avoidance—any such avoidance—manifests itself at all!

As for me, I must admit that in my own judgment it is “often” (please read:  “always and in every instance”) the case that those who use the terms ‘political correctness’ or ‘politically correct’ in the way my dictionary defines them are abusing those terms.  They are, as some readers may already have caught me remarking, torturing those terms.

Of course, as “often” fits the interests of torturers, they would prefer not to call it that.  They would prefer to call it, perhaps, the employment of “enhanced meaning-clarification techniques.”  So it goes.

Such torture of language would perhaps not matter much—unless, perhaps, to someone who is “going to extremes” in order to be Politically Correct—if all it concerned was language itself (pace language lovers, wimps that they may be).  But such language abuse abuses more than language, unfortunately.  It abuses those who, through such linguistic sleights-of-hand, are effectively robbed of the very possibility of voicing objection to being abused, or finding such voice through those who speak on their behalf.  The abuse against them is thereby, as others have often pointed out before me, compounded—indeed, exponentially so, especially when coupled with the further abuse, as it often is, of being blamed for their own being abused.

As is common to abusers, those who abuse language like to blame their abuse on those they abuse and whom they are using their language-abuse to abuse even further.  The substitution of ‘enhanced interrogation techniques’ for ‘torture’ is anything but “exactly the same” as the substitution of ‘physically challenged’ for ‘disabled,’ despite the author of the passage with which I began this post saying so.  In truth, the two operations operate in exactly opposite ways.  The first substitution is one in the service of the torturers, whereas the second is—or at least intends to be—in the service of the tortured.  The conflation of those two opposed operations of verbal substitution, the washing out of the crucial, defining difference between them, can itself only serve the interests of the torturers, and not of the tortured.

What the author of the passage at issue goes on to say right after first equating those two utterly divergent operations of verbal substitution is that they both also operate the same way yet another imagined substitution would operate.  The two substitutions already considered, according to that author, both operate as would the substitution—patently absurd and offensive, as the author intends readers to hear—of ‘enhanced seduction technique’ for ‘rape.’

But if one asks oneself just who would ever suggest such a substitution as that third one, of ‘enhanced seduction technique’ for ‘rape’—that is, if one asks just whose interests would possibly be served by it—the answer would, I think, be obvious:  Only rapists themselves and their accomplices would be served by such a substitution, hardly the raped.

There are three sets of terms involved in the passage at issue.  The first set is ‘enhanced interrogation technique’ and ‘torture.’  The second is ‘physically challenged’ and ‘disabled.’  The third is ‘enhanced seduction technique’ and ‘rape.’  The author of the passage at issue is apparently so intent on verbally abusing those who would seek to avoid “forms of expression or action that are perceived to exclude, marginalize, or insult groups of people who are socially disadvantaged or discriminated against,” as my dictionary puts it, that he ends up (whether deliberately or not I will leave up to readers to decide) using the obvious analogy between substituting ‘enhanced interrogation technique’ for ‘torture,’ on the one hand, and substituting ‘enhanced seduction technique’ for ‘rape,’ on the other—to hide the dis-analogy between either of those two, on the one hand, and substituting ‘physically challenged’ for ‘disabled,’ on the other.  As I have already argued, that substitution is not at all analogous to the other two.  The attempt to equate all three simply does not hold, since the substitution of ‘physically challenged’ for ‘disabled’ is, at least in its intention, done in the service of the abused, whereas the substitution of ‘enhanced interrogation technique’ for ‘torture,’ like that of ‘enhanced seduction technique’ for ‘rape,’ cannot, regardless of anyone’s intention, serve anyone but the abusers.

In fact, if one is looking for a genuine analogy to the substitution of ‘enhanced interrogation technique’ for ‘torture’ (or of ‘enhanced seduction technique’ for ‘rape’) then here is one:  As the substitution of ‘enhanced interrogation technique’ for ‘torture’ (or ‘enhanced seduction technique’ for ‘rape’) is in the service of the torturers (or the rapists), so is the use of the term ‘political correctness’ to stigmatize, ridicule, and silence anyone who dares to advocate avoidance of “forms of expression or action that are perceived to exclude, marginalize, or insult groups of people who are socially disadvantaged or discriminated against,” as my dictionary puts it, in the service of those who practice exclusion, marginalization, and insulting of the socially disadvantaged or discriminated against.   Both the substitution of ‘enhanced interrogation techniques’ for ‘torture,’ and the dominant contemporary usage of the term ‘political correctness,’ are designed to obfuscate, confuse, and hinder, if not altogether halt, serious, ethically and morally informed, genuine discussion.  They are designed to do the opposite of keeping the conversation going, to borrow a favorite phrase from Richard Rorty.

The replacement of the expression ‘torture’ by the expression ‘enhanced interrogation’ operates in exactly the same way as would the replacement of the expression ‘rape’ by the expression ‘enhanced seduction technique.’  Both in turn operate in exactly the same way as does the regnant usage of the expression ‘political correctness.’  All three cut off communication rather than fostering it.  They block off the stream of life in which alone expressions have meaning, as Wittgenstein said, and deal death instead.

All three are in the service of the traffic in trauma.


*His friend Paul Nizan’s novel Aden Arabie, if I remember correctly.

The Terror, Terror, and Terrorism #2 of 2

7/15/09

This is the second of two consecutive posts pertaining to French historian Sophie Wahnich’s La liberté ou la mort:  essai sur la Terreur et le terrorisme (Liberty or Death:  Essay on the Terror and Terrorism).  I originally wrote the entry below in my philosophical journal on the date indicated.

Friday, December 12, 2008

Wahnich does an excellent job not only of contrasting The Terror [of the French Revolution] with “terrorism,” but also of [contrasting] The Terror as response to dread (the anger and demand for justice that, under and as The Terror, is what became of the ressentiment of the oppressed against their oppressors) with the Bush response to 9/11 itself (the anger and demand for justice that Bush issued after 9/11).  In the latter case (p. 98), “the image [the “fascinating” image of horror and “cruelty”] precedes the account.”  And she contrasts that with the joy with which the news and images of “9/11” were greeted in Nigeria, Palestine, even parts of France and elsewhere, where, despite all, the attacks at last gave a voice to the oppressed who theretofore had been denied all voice.

Bush (pp. 99-103, the book’s end) in effect shanghaied the “sacred body” of America, which he identified not with any sovereign power of the people at last finding its place and its voice, but with the “victims” of 9/11 and the “heroes” made of the rescue workers.  Thus, it was an altogether de-politicized sacred public body.  Pp. 101-102:  “These bodies [of the dead of 9/11], divested of their responsibility in terms of common political existence, are the effective incarnation of the American political project.  Such a project assumes that the true mode of liberty consists of knowing nothing any longer of such responsibility.”

P. 103:  “The political project of the Year II [the year of The Terror during the French Revolution] envisioned a universal justice which still remains a hope:  that of equality between human beings as reciprocity of liberty, that of equality between peoples as reciprocity of sovereignty.”  Then, two paragraphs later, she ends her book this way:  “The violence exercised on September 11, 2001, did not envision either equality or liberty.  No more does the preventive war announced by the President of the United States.”