Faith in Trauma: Breaking the Spell

To enchant is to cast a spell. In turn, to disenchant is to break the spell of an earlier enchantment. In the first decades of the 20th century, Max Weber made popular the idea that modernization—with its ever more exclusive prioritization of science, technology, and instrumental rationality over faith, tradition, and belief—centrally involved a process of the “disenchantment” (Entzauberung) of nature. Ever since Weber, however, it can be, and has been, debated whether modernization really broke a spell, or whether it cast one.

So, for example, in one of his writings on the rise of modern technology in volume 76 of the Gesamtausgabe (“Complete Edition”) of his works, Martin Heidegger makes explicit reference to the Weberian idea of disenchantment, only to argue against that thesis. Rather than a dis-enchantment (Entzauberung), says Heidegger (pages 296-297), what is truly involved in the rise of modern technology itself is instead an en-chantment (Verzauberung), a bewitching, hexing, or casting of a spell. That enchantment, according to him, is one whereby the very power at play in modern technology can make good on its own exclusive claim to power, as it were—just as, in the fairy story, the wicked witch, to secure her own claim to the power of beauty, casts a spell over Sleeping Beauty, the legitimate claimant.

According to Heidegger, that enchantment—the casting of the spell whereby what is at work in modern technology (as well as at work in all of the modern science and instrumental rationality that goes with that technology) seizes and secures its own power—goes hand in hand with the de-worlding (Entweltung) of the world, the de-earthing (Enterdung) of the earth, the de-humanizing (Entmenschung) of humanity, and the de-divinizing (Entgötterung) of divinity. “Technology,” writes Heidegger, “as the unleashing and empowering of energies [. . .] first creates ‘new needs’,” and then produces the resources to satisfy them: technology “first discloses the world to which it then fits its products.”

Alain Badiou said essentially the same thing just last year in À la recherche du réel perdu (“In Search of the Lost Real”), his critique of our contemporary “entertainment world,” as he calls it at one point, using the English expression—a world-less pseudo-world, actually, one ever more frenziedly devoted to the pursuit of Pascalian diversion from reality. In such a desolate pseudo-world, what falsely but inescapably presents itself as “reality” is in truth so utterly crushing that it permits no genuine, full living at all any longer, but only survival. Nor does such a divertingly fake world any longer have any room for any true faith. It only makes room for superstitions—precisely the sort of dangerously superstitious nonsense, for example, that United States Supreme Court Justice Antonin Scalia spouted in a high school commencement speech shortly before his recent demise, when he attributed the global success of the United States to the frequent invocation of God’s name by our Presidents and other public officials (see my citation of his remarks to that effect at the beginning of my earlier post, “An Anxious Peace: ‘God’ After Auschwitz”).

In a world already deeply asleep, under the bewitching spell cast by what Badiou lucidly calls “triumphant capitalism,” what we need is precisely dis-enchantment, the breaking of the spell. The spell that holds the world in thrall today is broken whenever, anywhere in the world, reality suddenly and unexpectedly breaks through to dispel (good word for it: “de-spell”) any illusion that happiness consists of endlessly buying what the global market endlessly offers for sale.

In Métaphysique du bonheur réel (“Metaphysics of real happiness”)—a short book he also published earlier last year and in which he was already “in search of the lost real”—Badiou describes the illusion that the shock of reality shatters. It is the illusion wherein one takes the height of happiness to consist of the conjunction of the following factors, as he puts it in his introduction (p. 6): “a tranquil life, abundance of everyday satisfactions, an interesting job, a good salary, sound health, a flourishing relationship, vacations one doesn’t soon forget, a bunch of sympathetic friends, a well-equipped home, a roomy car, a loyal and cuddly domestic pet, [and] charming children with no problems who succeed in school.” In short, it is the illusion that one could be happy while living a life of crushing consumerist boredom, where nothing disruptive ever happens—life as no more than survival: outliving oneself from birth, in effect.

As opposed to any such pseudo-happiness of mere security and consumerist comfort in sheer survival, real happiness comes only as a by-product of true living. In turn, real life in itself begins only in the deliberate choice, the decision, to engage fully with reality, whenever it does break through our numbing consumerist spell to strike us. When it does, it reawakens us to the realization that, as Badiou puts it later (p. 38), “existence is capable of more than self-perpetuation.” When the consumerist spell that has held us in thrall is finally broken, we reawaken to the awareness that real happiness is nothing but the feeling that accompanies true life—a life fully lived “even unto death,” as the Christian biblical formula has it, rather than just survived.

Thinking Time, Drinking Time: A Beginner’s Thought (1)

1.

Somewhere in the world it’s 3 o’clock

Time to get out of school and think

Somewhere in the world it’s 5pm

And quittin time means it’s time to drink

—Boots Riley, lyrics from “Somewhere in the World It’s Midnight,” in Boots Riley: Tell Homeland Security—We Are the Bomb (Chicago: Haymarket Books, 2015)

 

Schools have failed our individual needs, supporting false and misleading notions of ‘progress’ and development fostered by the belief that ever-increasing production, consumption and profit are proper yardsticks for measuring the quality of human life. Our universities have become recruiting centers for the personnel of the consumer society, certifying citizens for service, while at the same time disposing of those judged unfit for the competitive rat race.

—Back-cover blurb on a paperbound re-edition of Ivan Illich’s Deschooling Society (London: Marion Boyars, 2000)

By my own experience, Boots Riley’s lines about school and thinking are all too true. As I used to tell my own university students, I always learned far more despite school than because of it. Schools present themselves—and the people who work in them (most of whom are good enough, decent enough, caring enough people) typically have to buy the presentation—as places of learning. Schools also like to present themselves, above all to themselves, as devoted to teaching students to think, giving them “critical thinking” skills they can then use to go out and live rich lives of genuine self-awareness. Most schools and teachers would endorse Socrates’ dictum, “The unexamined life is not worth living,” and would claim that the purpose of good schooling is to give students what they need to live an examined, worthwhile life.

However, if we were to judge institutionalized schools and schooling in terms of what they do, rather than what they say, we would be driven to a very different conclusion, by my experience. We would be driven to the conclusion that the real purpose of institutionalized schools and schooling is to teach students not to think—not to dare to do so. In terms of the actual effect on students of being subjected to schooling for year after year from their early years to adulthood, we would have to say that schools do indeed “teach you to think,” but only in the same sense as that in which one might say to an unruly child, “I’ll teach you to sass me!” just before applying the rod the sparing of which the Bible tells us spoils that child. In saying such a thing, one is not promising to help the child acquire effective sassing skills. Rather, one is beginning to inflict punishment on the child for having just done some sassing, punishment designed to teach the child to refrain from doing any more sassing in the future. Judged by their deeds rather than their words, that’s precisely the sense in which schools “teach students to think.” All too many students learn the lesson all too well, through no fault of their own.

Insofar as that is the reality of schools and schooling, then the reality is also that in order to do any thinking, if one doesn’t want to get punished for it one is wise to wait till after the school-bell rings 3 o’clock, announcing the end of the school day. Only once the daily torture of school is finally over is it safe for students to think. Unfortunately, by then it’s unlikely many of them will have enough energy left to try very hard to think, even if despite school they have already somehow managed at least to begin to learn how to do so. About all they will be suited to do is drink—as many will learn soon enough just how to do.

 

I know that’s how school always was for me, at any rate. It was that way ever since my very first day in Kindergarten, which I hated with a passion. Kindergarten made me sick. At least that’s what I’d tell my Kindergarten teacher regularly—especially when she made the class play with some messy, oily clay, from which I recoiled as from excrement. For a while, when I’d leave the clay or other reeking pile of whatever we were made to foul ourselves with and go up to her desk to tell her I was sick to my stomach and needed to go home, she’d have the office call my mother, who’d come the mile from our suburban home to get me and deliver me from my bondage, at least for the rest of that day. Eventually, however, my Kindergarten teacher wised up to my ways. She scheduled a meeting with my mother, with me also to be present. At that meeting she told my mother, having made sure that I would be there to hear it too, that if I did not change my attitude toward school, I’d never even make it through first grade.

She was wrong. I knew that even when she said it, when I was only five. In fact, I never did change my attitude toward school. If anything, it just grew harder every year. Nevertheless, I not only made it through first grade, but through a whole bunch of other grades after that, long enough to get my Ph.D. degree. Then I even stayed in school forty-five years longer, having had to take on her role—a teacher in a school—myself, since it was the only way I could find to make a living doing what I found myself called to do. Thus, I ended up spending pretty much my whole life “making it” in school. I guess I showed her!

Anyway, reading Boots Riley’s lines above brought back for me all the memories of the years and years of dead and deadening boredom I used to experience in school when I was a child, and how I’d keep looking up at the clock on the wall, which never seemed to move. Each day, I had to endure such eternities till 3 o’clock finally set me free for a little while, so I could think. Yet even then I couldn’t really completely relax and think with full freedom, because the constant threat of having to go back the next day for further “schooling” (i.e., more torture) robbed even my after-school hours of truly free time—real time to think.

As a kid, I loved Saturdays. Saturdays were the closest I ever got to any of the carefree days that are so commonly and so falsely attributed to childhood. Saturday was the first of two whole days without school! Time I could use to read and think and do other things I wanted just because I wanted to!

But after Saturday came Sunday. No matter what thoughts I tried to milk on Sundays, the milk was always curdled by my underlying anxiety. I was never that fond of Sunday. Sunday was always poisoned by my knowing that the next day was Monday, and that then I would have to face five more endless days back in hell. Sundays were days ruined by that anxiety. It wasn’t till I learned to drink that I finally found a way to begin to appreciate Sundays.

To make all that abuse even more abusive, throughout all my schooldays nobody, of course, would ever even acknowledge the abuse that was being inflicted on me and all the other kids. That was not really the teachers’ and other immediate abusers’ fault, since it just went with their having been subjected to the same abuse themselves for so long, so unrelentingly, and so effectively over so many years. That long abuse had made them, regardless of their own desires and intentions, into the abuse-system’s unwitting accomplices. Their own histories of being abused had deadened them to all the abuse going on all around them, now being inflicted—often by their own unwitting hands—on all the new kids on the school-block (a block such as farmers use to cut the heads off chickens).

Thus, that schooling conditions students not to think has nothing essential to do with the conscious intentions of teachers as such. The intentions of those who get shanghaied for service as teachers in the schools of our consumer-production system are often tripped up and trapped by their own good qualities, such as a genuine desire to help children learn (teachers’ pay, after all, is hardly that great, so they’re not in it for the money). Rather, it has to do with the institution itself, which is anything but an institution where thought—or life, I will add—is sanctified.

My own long life in schools confirms what the Deschooling Society blurb above says: that “schooling” is really just pure conditioning, designed to turn out good little consumers—fodder for the market system. Boots Riley’s follow-up about quitting time meaning time to drink says the same thing, in high hip-hop style.

Deschooling Society first came out in the early 1970s, when I was already embarked on my own long career of university teaching, after having spent eighteen years of being schooled myself. I found my own way to it at a bookstore. It was a real gift to me. It showed me I was not alone—and not just some ungrateful whiner. It gave me just the sort of general diagnosis of my condition and its causes that reconfirmed for me just who—or, more precisely, what—I should hate.

 

2.

            This means, above all, that our job is to think. As consumers of culture, we are lulled into passivity or, at best, prodded toward a state of pseudo-semi-self-awareness, encouraged toward either the defensive group identity of fandom or a shallow, half-ironic eclecticism. We graze, we binge, we pick up and discard aesthetic experiences as if they were cheap toys. Which they frequently are—mass-produced widgets from the corporate assembly line.

—A. O. Scott, “Everybody’s a Critic,” NY Times op-ed section for 1/31/16, adapted from his book Better Living through Criticism: How to Think About Art, Pleasure, Beauty, and Truth

A few years ago, the thought occurred to me that the next time someone asked me if I would mind a little helpful criticism, I would reply: “How would I know? I’ve never received any.” Unfortunately, nobody’s asked me that pseudo-question since then, so I have not yet had a chance to use that line—at least until now, in this post.

I can only hope that confessing to thinking such critical thoughts about criticism doesn’t expose me as having taken on the defensive group identity of fandom—or, even worse, put on full display my shallow, half-ironic eclecticism. I take pride in not being some mere consumer of culture, prodded toward a state of pseudo-semi-self-awareness, if not altogether lulled into pure passivity. I doubt that I could stand the humiliation if my criticism of criticism revealed me to be no better than one of those who graze among aesthetic experiences, bingeing on them, picking them up and then discarding them as though they were cheap toys. (If I have to binge, I’d rather just go back to bingeing on booze instead. That has its own honesty, and would at least allow me to preserve a modicum of self-respect.)

Be that as it may, I shudder to have to admit that from time to time I do indeed catch myself taking seriously some mass-produced widget from the corporate assembly line passing itself off as a work of art—just as I sometimes, to my shame, take seriously what is really no more than just such a widget passing itself off as a work of thought. Maybe not today, though.

But enough about me! (Or maybe not.)

 

3.

To philosophize means to begin to think. We must always become beginners again. Those who hold themselves to be advanced easily fall prey to the danger of taking what they assume they already know to be no longer worthy of thought, and thus of holding themselves exempt from needing to begin. To begin means: every time to think every thought as though it were being thought for the first time.

—Martin Heidegger, Leitgedanken zur Entstehung der Metaphysik, der neuzeitlichen Wissenschaft und der modernen Technik (Gesamtausgabe 76, 2009), p. 54 (freely translated)

When I was fifteen, I taught myself enough German to be my family’s interpreter when we spent most of a month in Germany. My parents had managed to pay off the mortgage on our family home in Colorado a few years before. But they had taken out a new one just to finance a trip to Europe for themselves and me, the youngest of their three children and the only one still at home—a trip to Europe to visit my brother, who is three years older than I am. My brother had enlisted in the U. S. Army the year before. After boot camp and some extra army training, he had ended up in the Army Security Agency, for God knows what reason, and was stationed at Frankfurt am Main. He managed to get himself a one-month leave, and we drove all over the place in Western Europe crammed into the VW Bug he’d bought for himself while stationed there.

My family on both sides is mainly of German origin, and I’d long had a fascination with Germany and things German. So I had talked my parents into buying me a set of German language vinyl records, and I’d used them to get a beginner’s sense of the language. When the four of us went driving around Germany that summer, I was the one who asked for rooms at inns, ordered meals from waiters in Bierstuben and the other places we ate (angering my brother at one stop when I ordered him a Pilsner, rather than a Lager), asked for directions, and the like.

But I never really got a good reading knowledge of German, or mastery of German vocabulary much past guide-book level, until I finally got to graduate school, where I was enrolled in a Ph.D. program in philosophy that required a reading ability in two non-English languages, as demonstrated by results on the standardized Princeton ETS language exams of that age. Given my own interests in philosophy, the two languages that made the most sense were the two standard ones of German and French. I did the German first, then the French.

I went about both in the same way, starting with a vinyl record set for learning the language at issue, to get a sense of the grammar, basic vocabulary, and sound of the language. (For the German, I just had to brush up some, since I’d already done the record thing when I was fifteen.) Then I’d get myself a dictionary for translating the language into English, and start reading through some work of philosophy in that language, a work of my own selection. When I started my reading, going was very slow indeed. But by the time I’d finished my selected book, I was ready to take—and ace—the standardized language exam.

For French, I used a book containing Descartes’ Discourse on Method and his Meditations. After scoring well on the standardized language test, I promptly forgot my French, since my philosophical interests didn’t really give me any special reason back then for continuing to read philosophy written in French. What was available in English translations was sufficient for my purposes, and I never went back to French until our daughter did me the favor of marrying a Frenchman. That gave me incentive to teach myself French again, going about it my same old, proven way—only this time I used CDs, rather than vinyl records. Since then, I regularly read philosophy works in French, both to keep my language skills up, and so that I’m not at the mercy of the translation-industry and its market-driven widget-making decisions.

For German, I used Heidegger’s Holzwege, a collection of essays he’d published at the beginning of the 1950s. It contains his famous lecture “Der Ursprung des Kunstwerkes” (“The Origin of the Work of Art”), which he originally delivered back in 1935. When I first read it in Holzwege, it was still a few years before that lecture was first published in an English translation by Albert Hofstadter. By the time I and my Langenscheidt’s German-English dictionary were done with Holzwege, I could read German. At least I could read philosophical German, stuff like Heidegger’s Zur Sache des Denkens, which came out in the original German version just a couple of years later (it was eventually translated by Joan Stambaugh and published by Harper under the title On Time and Being), or Kant’s Grundlegung zur Metaphysik der Sitten (the Foundations, or Groundwork, of the Metaphysics of Morals). But when it came to reading the daily newspaper, I still had to use my Langenscheidt’s. I had far less trouble reading Kant in German on the transcendental unity of apperception in his Critique of Pure Reason than I did reading the appeal in the local paper for help finding the lost dog who’d jumped over the fence in some Frankfurt suburb, for example.

As I’ve already mentioned, because I had no ongoing use of my own for it at the time, I soon forgot the French that I taught myself in the first place only in order to jump over one of the hurdles the school I attended made me jump over to get my Ph.D. degree, so I could start to sell myself for money. But when my daughter married a Frenchman and I went back through the very same process to teach myself to read French again, I retained what I’d learned, and have been able to keep my reading knowledge of French current ever since. That is precisely because I want to retain it. I want to, in turn, because I have a French son-in-law, of course, but even more (he’s fluent in English) because I have interests that send me regularly to read stuff in French philosophy before it gets translated into English—if it ever does, since quite a bit of what I find most interesting doesn’t seem to fit the market needs of the current crop of translation publishers.

The German I taught myself by reading Heidegger has stayed with me ever since I first learned it. The timing of my thereby teaching myself to read German was in part dictated by the same need to jump a hurdle my school made me jump before it would finally let me get out of school after all the years I had to be in it. But that really only played a secondary role, even in terms of timing. That’s because by then I was already hooked on Heidegger, including knowing I wanted to do my Ph.D. dissertation on his thought. At the time, not all that much of Heidegger had been translated into English. So I knew I’d need to learn to read him in German.

In fact, had I not had to jump over a number of other hurdles my school also put in the way of people like me getting out of school as quickly as possible, without being penalized for the rest of our lives because we left—the way those who just can’t stand any more abuse and so drop out of high school are made to pay the penalty for sassing the school system by dropping out that way: the penalty of dead-end jobs for unconscionably low pay—I’d probably have taught myself to read Heidegger in German even earlier than I did. I’ve read (and reread) Heidegger in German ever since I taught myself to read German by reading him. I continue regularly to read him in German to this day, and plan to do so till death or senility stops me.

I read Heidegger in German even when there are English translations of what I’m reading available. He’s better in German. That’s just another thing I learned despite all my schooling.

*     *     *     *     *     *

To be continued.

The Traumatic Word (5)

Though for the word’s own sake I could still say much more, this is the final post of my series on “The Traumatic Word.”

*     *     *     *     *     *

It is human to see the world made up of three kinds of things: food, proscribed edibles, and non-food. For a Hindu pork is taboo, not so begonias. These he has never thought of eating. By eating pork, he loses caste. If, however, he joins an Indio from central Mexico eating begonia flowers not he, but the world around him has changed. Begonias have moved from non-food to food.

Issues as well can be thus divided. Some are considered legitimate. Others not to be raised in polite society. A third kind seems to make no sense at all. If you raise these, you risk being thought impossibly vain.

___________________________________________

So far, every single attempt to substitute a universal commodity for a vernacular value has led, not to equality, but to a hierarchical modernization of poverty.

— Ivan Illich

 

Both of the citations above are taken from Ivan Illich’s 1981 book Shadow Work—from which I already cited two lines in a note appended to my previous post, the fourth of this series of five on “The Traumatic Word.” With regard to what he says in the first of the two citations above, about there being some issues the very raising of which runs risks of being thought to be as impossibly vain as a begonia-eater, Illich offers as an example the issue he risks raising in Shadow Work itself. That is the issue of the distinction between what he therein calls “the vernacular domain,” on the one hand, and “the shadow economy,” on the other (the emphasis is Illich’s own in both cases).

Being far less of a risk-taker than Illich himself, I will not risk discussing both sides of that risky conceptual disjunction. I will leave it up to interested readers to read Illich’s book itself for enlightenment (or befuddlement, if Illich loses his wager with those readers) about what he means by “the shadow economy.” For my own risk-averse purposes in this post, I will simply focus on the first disjunct, the notion of “the vernacular domain.” In fact, to minimize my risk even further, I’ll confine my attention to what is named in just the first two words of that three-word phrase—“the vernacular.”

With regard to the vernacular, I will risk saying this: the vernacular is the parochial.

In saying that, just as it stands, I am not risking much. That’s because, just as it stands, it will sound bland and innocuous to most modern ears. Of course the vernacular is the parochial, those who hear with such ears might well remark. After all, both refer to what’s local, informal, and more or less uneducated or “backwoods”-ish—as when we speak of “parochial concerns” and of putting something “in the vernacular,” for example. Such ways of speaking and putting things contain within themselves what amounts (to use the vernacular) to “putting them down,” reducing them to the sorts of concerns and ways of speaking characteristic of “hicks,” more or less (of the mindless masses of “the great unwashed,” to use the educated way of saying it that, as I mentioned in my preceding post, one of my old DU colleagues used to like to use). That is, having concerns that count as “parochial,” or a tendency toward putting things “in the vernacular,” is just not the sort of thing one wants if one is concerned to preserve one’s status as an educated, well-schooled person who would resort to the vernacular only by putting what one says within quotation marks, as I’ve been trying to be careful to do so far. To the well-trained, well-schooled understanding, both the vernacular and the parochial always carry a whiff of vulgarity with them—vulgar being a word derived eventually from Latin vulgus, meaning “the common people,” where that phrase in turn is already pressed into service to put down such people, reducing them to the status of “the multitude,” that is, “the crowd” or “the throng,” the mere and sheer human “swarm” of “the great unwashed.”

At least part of what Illich is trying to call to our attention in his own usage of vernacular is how uppity we are in our dismissal, as always being somewhat vulgar, of everything local, home-grown, and genuinely “convivial,” to use another word he likes to risk using in unusual ways, at least by today’s hoity-toity, “grammatically correct” standards. As I already noted in my immediately preceding post, the word vernacular derives from the Latin vernaculus, which means “domestic, native, indigenous.” What I left out in my preceding post was that vernaculus itself derives from verna, a Latin word of Etruscan origin that meant a “home-born slave.” By my reading of him, Illich is in effect running the risk of trying to liberate the vernacular itself from its slavery, thereby restoring to it the full, fully ambiguous freedom that is the birth-right of all words as words, whose worth as such is taken violently away whenever they are pressed into service as mere signs or symbols (in the sense of those two words that Walter J. Ong, for one, helps us hear).

Since Illich has already run all the big risks of such a liberation of words with vernacular, I am free to run the much smaller risk of trying to do some of the same for parochial, a word the origins of which are not already tainted by such hierarchies of master and slave as are the origins of the word vernacular.

Parochial derives eventually from Late Latin parochialis, “of or pertaining to a parish,” from parochia, “parish.” In turn parochia derives from the Greek paroikos. According to the Online Etymology Dictionary (www.etymonline.com), that last word was used by early Christian writers to mean “a sojourner”—after its classical Greek usage to mean “neighbor,” from para, “near, beside,” and oikos, “house.” Insofar as those origins can be heard back into what parochial says, the parochial is that which belongs to home, the place where we dwell, where we are “at home”—the same “home-grown” stuff, in short, as makes up the vernacular, at least in Illich’s liberation of that word from its bondage. The parochial, the vernacular, is what is of or pertains to where we do indeed sojourn, from Latin sub-, “under, until,” plus diurnare, “to last long,” from diurnum, “day.” Where we sojourn is literally where we “spend our day,” day after day throughout our human life from birth to death—“we” being all of us common people, in all the glorious, irremediably vernacular vulgarity of our utter parochialism, our great unwashed-ness.

*     *     *     *     *     *

I claim no special expertise on Gerard Manley Hopkins, and most especially none on the proper scholarly interpretation of his poetry. However, one of his poems once delivered an especially resonant word to me—a word pertaining to trauma. That was during my own traumatic summer vacation of 1987, about which I have written on this blog before, though without discussing the contribution my first reading of that poem made to my experience at the time. When I recently read Ong’s book on Hopkins, including Hopkins’ own letter to his friend Bridges about the word sake, I was reminded of that contribution.

Hopkins’ remarks in the letter Ong cites came as no surprise when I read them for the first time in my recent reading of Ong’s book, because they struck me as already familiar from my much earlier reading of the poem at issue. In the light of Hopkins’ letter I was able to see—or, more accurately put, perhaps, in the resonance of that letter I was able to hear—how that poem, as I first received it years ago, during my summer of 1987, already said the same thing, at least to me, in a poetic rather than a prosaic way. Here is Hopkins’ poem, #34 in the standard numbering:

As kingfishers catch fire, dragonflies draw flame;

As tumbled over rim in roundy wells

Stones ring; like each tucked string tells, each hung bell’s

Bow swung finds tongue to fling out broad its name;

Each mortal thing does one thing and the same:

Deals out that being indoors each one dwells;

Selves – goes itself; myself it speaks and spells,

Crying Whát I dó is me: for that I came.

 

I say móre: the just man justices;

Keeps gráce: thát keeps all his goings graces;

Acts in God’s eye what in God’s eye he is –

Chríst – for Christ plays in ten thousand places,

Lovely in limbs, and lovely in eyes not his

To the Father through the features of men’s faces.

 

When I first encountered that poem, during my summer vacation of 1987—when I underwent, in full public display (at least in a rather parochial sense of “public”), a traumatic reliving of a much earlier trauma from my childhood—I heard Hopkins’ two stanzas as constituting what Ivan Illich in Gender calls a duality, characterized by the asymmetric complementarity of that duality’s own constitutive duo. That duality emerged in, and was marked by, my hearing, at the start of Hopkins’ second stanza, something that remained unsaid, but nevertheless determinative for my entire understanding of everything said in the poem as a whole, in both its stanzas taken together.

The unsaid I heard then, during that traumatic summer vacation of 1987—a vacation that was also, I will add, most especially traumatically healing with regard to a much earlier trauma from my childhood—when I first heard Hopkins’ poem, was but a single word. In fact, it was but that very word: “But.” Though it is not there in what Hopkins actually says, not written there in letters beside all the ones he did write in that poem, I heard (and still do) the second stanza sound a silent “but” at its very beginning, to set the tone not only of what was to follow as that second stanza itself, but also of what lay there already to be found in the first.

According to the first stanza of the poem, “each mortal thing” keeps on redundantly saying over and over again the same old thing. That same old thing is nothing but itself. Each thing says the same thing all the time: “Myself it speaks and spells/Crying Whát I dó is me: for that I came.”

However, the “I” who speaks in the second stanza does not just say that same, does not just “fling out broad its name,” crying out always only “Myself.” Rather, the “I” who speaks in the second stanza remains utterly anonymous, which is to say name-less. That nameless speaker does not cry out itself, and beyond that shut up, saying nothing else. Rather, that anonymous “I” says “more”—which Hopkins himself already doubly emphasizes by placing the diacritical mark over that word in the already wholly italicized stanza: “I say móre . . .”

The “I” who speaks the second stanza says “more” than what “each mortal thing” says, according to the first stanza. That is what I heard in hearing a silent “But” sounding to open, and thereby thoroughly to tune, the whole second stanza—and, with it, to attune the reader’s ears for properly hearing what the whole poem gave voice to.

What it gave voice to, when I first heard it during my own doubly traumatic summer vacation of 1987—“doubly traumatic,” because it was an itself-traumatic, asymmetrically complementary reliving of an earlier trauma—was itself dual, precisely in Illich’s sense of that. What I heard was the duality, in short, on the one hand of entrapment in hell—the pure hell of total self-absorption, in which the self has become so wrapped up and entangled in asserting itself that it has lost itself entirely—and on the other hand of liberation from that entrapment—the very “harrowing of Hell” by Christ himself between his death on the cross and his resurrection on the first Easter Sunday, according to Christian tradition, which was of course the tradition to which Hopkins himself so crucially belonged.

According to another tradition, that of Mahayana Buddhism, samsara and nirvana are said to be “the same.” Well, in the same sense of “the same,” hell and the liberation from hell—which is to say hell and heaven—as Hopkins’ poem 34 long ago now gave me at least to hear, are “the same.” That is, coming to be liberated from hell is not like being taken from one location and transported, by magic or airplane or any other means, to some other, new, different location. It is, rather, being freed from the bondage of self, wherein the self loses itself entirely in the entanglements of claiming its own, into genuinely being oneself, which one can only be in what Ong—glossing Hopkins’ remarks about the sake of such expressions as “for one’s own sake,” in Hopkins’ letter to his friend Bridges—well names one’s “outreach to others.” Only when liberated from the bondage of having always only to be myself alone, am I given to know that I have all along been no one other than myself—but always already and only myself among others.

That’s what I heard when I first heard Hopkins’ poem 34, during my summer vacation of 1987. It’s what I hear still, when I listen through all the noise, rather than to it.

*     *     *     *     *     *

It is far from accidental that, as Walter J. Ong reminds us in the lines from The Presence of the Word with which I began this whole series of posts on “The Traumatic Word,” the word as word is not only not a “sign,” but also not a “symbol” either. To take each in turn:

The word is not a “sign,” properly speaking, since the word sign itself ultimately bespeaks something visible, something to be seen, whereas the word word bespeaks something audible, something itself spoken, to be heard rather than seen.

What is more, to repeat, the word is not only no such sign, says Ong, but also no “symbol.” That is because, as he tells us, originally “symbolon was a visible sign, a ticket, sometimes a broken coin or other object the matching parts of which were held separately by each of two contracting parties.”

In the concentration upon the visible imposed upon him, regardless of his own will in the matter, by the already long-standing tradition of treating language as nothing more than an elaborate system of “signs,” and the word itself as no more than a “symbol” of what it names, in the just re-cited passage Ong may himself have misheard some of what sounds in the word word itself. It is not simply because the word belongs among what sounds, and so gives itself to be heard, rather than belonging among the visible, which gives itself to be seen, that the word as word is no “symbol.” It is also—and in my own judgment above all—because the word as word is no token of coercive power that drives to make everything fit. The word as word is no symbol, such as a torn ticket or a broken coin, the two sides of which fit perfectly together, thus signifying the official authorization of the messenger, who carries one half of the symbol with him, to carry some official message to the officially designated recipient of that message, who proves his own authorization to receive it by providing the matching other half of the symbol, to fit the messenger’s half perfectly. A word as word, as a breaker of the silence to which it gives voice, is no such torn ticket or broken coin or modern digitized equivalent that testifies to such polarized and polarizing authorization. The symbol as such is always a sign of claimed power, claimed “authority.” The word, as word, claims no authority. It just speaks.

That is why the word is no symbol. As Ong so rightly observes in the next-to-last line of the epigraph with which I began this entire blog series: “The word cannot be seen, cannot be handed about, cannot be ‘broken’ and reassembled.” However, he misses, I’d say, the deepest, truest reason that the word cannot be broken, as is every “symbol.”

That the word cannot be broken derives not from some timeless or indestructible durability of the stuff of which the word consists, certainly. After all, as Ong himself repeatedly emphasizes, there is nothing more passing, less enduring, more easily destroyed than sound, which is finally all the word consists of. The reason the word cannot be broken—and why it is therefore so unsuitable for being made to do service to coercive power, the sort of power that imposes itself on those it over-powers, as do all institutions that have passed beyond conviviality—has nothing to do with that.

The word cannot be broken because it is always already broken to begin with, and only so does it speak. The name and what it names—the same as glory and the glorious, or luster and the lustrous, or shine and the shining of that which shines of itself—are never two halves of some once presumably unitary totality that somehow got subsequently broken apart, such that the pieces could ever, even in the wildest fantasy of security and authority (beyond even “all the king’s horses and all the king’s men” that couldn’t re-fit Humpty-Dumpty back together after he tumbled from his wall), be fitted seamlessly back together again.

The name—the very being of what is named “outside” itself, with and among others—on the one hand, and the being of the named “in” itself, “indoors” which it dwells, on the other, constitute a duality, not a polarity. The two are strictly incommensurable: there is no common standard by which they can both be neatly operationalized, measured, ranked, and set to order within a hierarchy.

The name and what it names are really the same, but that is so only in the way that men and women are really the same, which does not in the least mean that the two are “one and the same thing.” If the name and the named were just one and the same thing, then the name could not be the named itself outside itself, given to others. Then neither God nor anything else could ever be honored for its own sake, and nothing would ever have any glory.

The word can never be broken, because it is, as word, the break itself. The word is the very breaking open of the cosmic egg, without which the egg can never attain its own glory, for its own sake. The word itself is traumatic. That’s why I have entitled this whole series “The Traumatic Word.”

*     *     *     *     *     *

There is no one, all-encompassing, all-comprehending uni-vision, uni-perspective, uni-conception that can reduce everything to one single all-inclusive, all-“other”-exclusive totality of beings. As Heidegger already taught in “What Is Metaphysics?”—his inaugural address in 1929 when he took over his mentor Edmund Husserl’s chair of philosophy at the University of Freiburg—we are never given “the whole of beings” (das Ganze des Seienden). We are only—but also always—given “beings in the whole” (das Seiende im Ganzen). Whatever gives itself at all has, as so self-giving and self-given, its own being “outside” itself, as Hopkins so well puts it, that is to say, its being open and given to all the other beings with and among which alone it can be.

To my ears, ever since they were first attuned to hear it during my traumatically healing summer vacation of 1987, Hopkins teaches the very same lesson in poem 34: “As kingfishers catch fire . . .” As I hear them, Heidegger and Hopkins say the same. It’s just that they say it, appropriately, in two radically different, asymmetrically complementary ways.

Such differences can only help us hear if we let them. And only a hearing attuned to such difference can hear at all. So we should let them.

What they help us to hear, among other things, is that, as for the universe, in opposition to the cosmos, at least in the original sense of that latter word—well, there simply is no such thing. There is no “uni-verse,” no one thing that is the whole of everything, and turns everything into just one thing. There is no such all-encompassing, all-other-excluding single thing. There is only and always what might well be called “the di-verse,” if I may risk putting it that way.

The universe, were it to be, would be nothing but a total, monotone horror, and a colossally monotonous bore, on top of that. The diverse, however, is richly chromatic—we might call it extra-chromatic—and ever entertaining.

Therefore let us thank God that there is no such thing as the universe, but that there is only the diverse. That is, let us give thanks that there is only the being together of each with all—in which all things act for the sake of each other, to the glory of each other’s name: the word by which each is called, the very being of each outside itself, with and among all us others.

Amen! Which is to say: So be it!

The Traumatic Word (4)

As plans have a way of doing, my plan to complete this series on “The Traumatic Word” with today’s post has fallen through. However, this series of posts of my words on the word will end with my next post, most of which is already composed.

*     *     *     *     *     *

Sake is a word I find it convenient to use: I did not know when I did so first that it is common in German in the form sach. It is the sake of ‘for the sake of,’ forsake, namesake, keepsake. I mean by it the being a thing has outside itself, as a voice by its echo, a face by its reflection, a body by its shadow, a man by his name, fame, or memory, and also that in the thing by virtue of which it has this being abroad, and that is something distinctive, marked, specifically or individually speaking, as for a voice and echo clearness; for a reflected image light, brightness; for a shadow-casting body, bulk; for a man, genius, great achievements, amiability, and so on.

— Gerard Manley Hopkins

 

Exchange drives partners toward ever clearer fit (homogeneity and not ambiguity), whose asymmetry therefore tends toward hierarchy and dependence. Where exchange structures relationships, a common denominator defines the fit. Where ambiguity constitutes the two entities that it also relates, ambiguity engenders new partial incongruities between men and women, constantly upsetting any tendency toward hierarchy and dependence.

— Ivan Illich

 

The passage immediately above—that is, the second epigraph for today’s post—comes from Ivan Illich’s 1983 book Gender (Berkeley: Heyday Books, page 76, end of footnote 57). That book was no less controversial when it first appeared than were such earlier Illich publications as Deschooling Society, first printed in 1971, and Medical Nemesis, the first edition of which appeared in 1975 and which probably gained the most widespread attention, and engendered the most controversy, of all his works.

Born in Vienna in 1926, as a young man Illich became a Roman Catholic priest. He remained in the priesthood from then until his death in 2002, despite falling into conflict with the Vatican and—by mutual but unofficial agreement, in effect, between him and the institutionalized hierarchy of the Catholic Church—ceasing to function publicly as a priest toward the end of the 1960s, though he continued on occasion to say the Catholic Mass in private throughout the rest of his life.

Recurrently in his work, Illich argued and documented that the formal institutionalization of practices and processes pursued beyond a certain point becomes counterproductive. That is, pursued beyond that point institutionalization no longer facilitates the realization of that for the sake of which the institution was purportedly established. Instead, it begins to become an obstacle rather than an avenue for such realization, even beginning to generate specifically opposite results.

For example, in Medical Nemesis Illich argues that the institutionalization of medical care, carried beyond a certain point, starts making the society in which such institutionalization occurs less healthy overall, rather than more healthy. Put in different terms, pursued beyond that critical point, the institutionalization of medical care not only passes what economists call “the point of diminishing returns,” but actually sets off an inflationary spiral of ever-rising overall social costs for healthcare. As is true of all such inflation, although it massively benefits an ever more select few, it works to the growing disadvantage of the vast majority of society. In the case of medical care, that means medicine institutionalized past that tipping point starts making the society as a whole sicker, even and especially generating iatrogenic (“doctor-caused”) illnesses—a clear example of which is the disturbingly high rate of hospital-caused infections in the United States today.

In case after case, book after book, Illich advanced the same general argument about institutionalization becoming specifically counterproductive whenever it is pursued beyond such a certain, surprisingly minimal point—“surprising” at least for those of us today who long ago became used to living in a globally over-institutionalized society. Whereas in Medical Nemesis he addressed the counterproductivity of contemporary institutionalized medicine, a few years before that book appeared Illich addressed, in Deschooling Society, the institutionalization whereby education becomes “schooling,” which takes place only in specially designated places called “schools” at specially designated times (“school-time,” as we say) and ages of life (as reflected in talk about someone being “school-aged,” for example—though with the rampant commercialization of education and the emergence of the total horror of “life-long” schooling well under way today, that expression may be well on the way to losing its currency). Illich does a good job of showing how such over-institutionalization of education by enclosing it ever more tightly within schools and schooling ends up making the society as a whole less, rather than more, educated.*

In general, institutionalization becomes counterproductive once it passes the point of what Illich calls “conviviality.” He uses that term in the title of his 1973 book Tools for Conviviality, and means by the “convivial”—which he will also connect with what in various works, including Gender, he calls the “vernacular”—what can be pursued within ongoing local community life as such, and is “expressive” of that community itself. “Convivial” tools as well as institutions would be those that are established and maintained truly for the sake of those who establish and maintain them, as expressions of themselves.

*     *     *     *     *     *

The first epigraph for today’s post, from Gerard Manley Hopkins, about the sake of such expressions as “for the sake of,” comes from a letter Hopkins wrote his friend Robert Bridges dated 26 May 1879. Walter J. Ong cites it in his book Hopkins, the Self, and God (University of Toronto Press, 1986, page 38), and then glosses it by writing: “Doing something ‘for my sake’ is doing something for me in so far as I have an outreach to you. What is distinctive about ‘my sake’ is not that I am totally self-contained in a solipsistic, self-sufficient world but that the outreach to you is in this case the outreach that comes from me and only from me, that is distinctive of me, not found in any other.”

All the way back at least to Being and Time, Heidegger distinguished between, on the one hand, what we find or fabricate for use “in order to” (um zu) pursue some extrinsic end (a redundant expression, actually, since any end as such is necessarily extrinsic to the thing we find or fabricate for use to achieve that end) and, on the other hand, what we use all such means for pursuing all such ends “for the sake of” (um willen). His discussion helps make clear that what we do “for its own sake” is precisely what we no longer do “in order to” accomplish something else.

So, for example, what we do “for God’s sake” (in German: um Gottes willen) is nothing that we do for any “ulterior motive,” as we put it—some such motive as currying favor with “the Czar of the universe” (to borrow an apt phrase from AA co-founder Bill Wilson’s telling of his own tale in the first chapter of Alcoholics Anonymous), in order to keep the Big Bully from zapping us for not obeying his orders, or to get him to give us something we want, or the like. What we do “for God’s sake” is just what we do for no other end or reason at all, save adding to God’s own “glory.”

Hopkins is right in what he says in his letter to his friend Bridges about the English word sake, including his remark about the German cognate of that word, which by the conventions governing written modern German would be Sache, meaning “thing” or “matter.” So, for example, a work Heidegger published late in his life was a collection of essays all of which dealt with the same matter—what he called, in the title he gave the whole thing, Zur Sache des Denkens. That title might be translated as “On the Thing of Thinking” (or “of Thought”), if we use the word thing the way Baby Boomers such as I still do when we speak on occasion of “doing our own thing.” Or it might be translated as “On the Matter of Thinking.” At any rate, what Heidegger means by his title could perhaps best be captured by noting that all of the essays in the book address that for the sake of which thinking occurs, that for the sake of which thought takes place. That is, to ask after die Sache des Denkens is to inquire into what thinking or thought adds to the glory of—what it adds to the luster of, as gold adds to the luster of those suited to wear it.

Hopkins himself is deeply thoughtful to note, for Bridges’ sake and for his own, that he, Hopkins, means by the word sake “the being a thing has outside itself.” That is why I have been speaking in my own turn of what is done “for the sake of” someone or something as done “for the glory” of that one or thing. I will continue to use the example of doing something solely “for God’s sake,” that is, doing it solely to add to God’s own luster, God’s own glory.

The “glory” of God is not something extrinsic to God. It is, rather, to use Hopkins’ own way of putting it, the very being of God as such, God Him-self/Her-self/God-self, “outside” Him-/Her-/God-self. How gloriously Hopkins puts it! The “sake” of a thing is the thing itself as outside itself—as itself there in its shine, its splendor, in short, its glory.

The glory of God—God’s very “sake” as such, in Hopkins’ glorious sense of that word—is not there for its own sake, however. The (Hopkinsian) “sake” of God is there to the glory of God, not to its own glory. It is God’s own luster—God’s “name, fame, or memory,” to borrow what Hopkins applies to what he names “man,” but which in his spirit we can happily apply just as aptly (if not even more so) to what we name “God.”

To do something solely “for God’s sake” is thus the same as doing it solely “in the name of God,” or, as we also say, “for His [sic] name’s sake.” In turn, to act solely “for God’s name’s sake” is not to act to the glory of something apart from God—since God’s “sake” is God’s “name” itself, and both the same are not different from God, but are God’s very being “outside” God Her-/Him-/God-self, that is, what we could aptly and happily call, borrowing from Ong, God’s “outreach” or “presence” to others. To act “in God’s name” or “for God’s name’s sake” is to act to the glory of God God-self. (I hope I have sufficiently indicated by now that I am using that expression God Godself to avoid talking of God Himself or Herself, while still avoiding turning God, that “who” of all “who’s” rather than “what’s,” into any “It”—Id in Latin, as in Latin-anglicized Freud. In the name of God let us, to be sure, avoid drafting God’s name into service to sexism, but not at the price of letting that name degenerate to no more than the sign of an “it.”) To act solely for God’s name’s sake is to act in such a way as just to add glory to God’s own glory, shine to God’s own shine, luster to God’s own luster. It is to polish the gold in which God always already comes decorously bejeweled. In short, it is to adore the divinely adorned.

*     *     *     *     *     *

Division by “gender,” as Illich analyses it in his 1983 book of that name, is a convivial duality, as opposed to the non-convivial, specifically counterproductive polarity of division by “sex.” He thereby reverses—or rather “transfigures,” to use a more convivial term, since he does not just turn it around—what, at least at that time (the early 1980s), still passed as conventional feminist wisdom. The latter took sex to be less “socially constructed” than gender, and objected above all to distinguishing between two supposedly natural genders, rather than the two sexes, of “masculine” and “feminine,” “male” and “female,” “man” and “woman.” Thus, “gender” was commonly taken by feminists to mean something “social” or “cultural,” whereas “sex” was taken to mean something “biological.” In sharp difference, Illich writes (pages 3-4):

I use gender, then, in a new way to designate a duality that in the past was too obvious even to be named, and is so far removed from us today that it is often confused with sex. By ‘sex’ I mean the result of a polarization in those common characteristics that, starting in the late eighteenth century, are attributed to all human beings. Unlike vernacular [from Latin vernaculus, “native, domestic”—so what is “convivial,” in the sense Illich gives that term, which I explained above] gender, which always reflects an association between a dual, local, material culture and the men and women who live under its rule, social sex is ‘catholic’ [that is, claims “universality”—from the literal, etymological meaning of catholic]; it polarizes the human labor force, libido, character of intelligence, and is the result of a diagnosis (in Greek, ‘discrimination’) of deviations from [what, under such a “diagnosis,” becomes] the abstract, genderless norm of ‘the human.’ Sex can be discussed in the unambiguous [a mark of its “catholicity,” since the “vernacular” is always and inescapably ambiguous] language of science [that most universal, or catholic, language of that purely, purified catholic “knowledge” that is science]. Gender [in sharp contrast to the exclusively uniform and uni-forming totality of “sex”] bespeaks a complementarity [What a glorious word for it!] that is enigmatic and asymmetrical.

As he sums that up nicely, much later in Gender (in footnote 101, bottom of page 138): “Gendered speech constantly breathes, whispers, and utters gendered duality, while sexed language imposes discrimination. Grammatical gender (genus), therefore, becomes in sexed language what it could not be in gendered speech: a constant device for a put-down.”

For my purposes in this post, what I will take from such fine passages, and from Illich’s Gender as a whole, will not be the issues of sex, gender, totalization, discrimination, globalism, and feminism, the disconnections and interconnections of which he deftly traces in that book. That discussion is most certainly worthy of careful reading and reflection for its own sake, to be sure. But for my purposes here all I want to extract from it is the distinction therein between what he calls “duality,” characterized by the “asymmetrical, ambiguous complementarity” of its two sides or halves, and what he calls “polarity,” characterized by how it “imposes discrimination.”

In a brief footnote discussion entitled “Complementarity and Social Science,” within a chapter called “Vernacular Gender” (footnote 52, to pages 68-69), Illich observes that light, in the sense of the Latin lumen, or “way of looking,” was once thought to “stream” from the eye out to the visible thing—in effect, “palpating” it, as Merleau-Ponty liked to put it in various texts, though Illich doesn’t mention him here. Applying that to the vernacular duality of gender, Illich writes that in the analysis he is attempting to present through using that duality, “each culture appears as a metaphor, a metaphoric complementarity relating two distinct sets of tools, two types of space-time, two domains,” which “find expression in different but related styles in which the world is understood or grasped”—two incommensurably different but related beams of light, streaming out from two incommensurably different but related sets of eyes to palpate the visible.

In contrast, he goes on, science “is a filter that screens from the observer’s eye the ambiguity of gendered [that is, dual, asymmetrically complementary] light.” As a result of such filtering out of all such irreducible ambiguity within what is called “social science,” the “asymmetry that constitutes the social reality of each vernacular is effected by the central perspective of cultural anthropology,” which institutionalizes a “monochromatic, genderless [that is, utterly univocal and uni-sighted] lumen”—the single, glaring, contour-blanching light “of such concepts as rule, exchange, and structure.” Such concepts—which word comes from Latin con-, “with,” and capere, “take, grasp, seize”—cease to conceptualize (to grasp in and for thought) anything of what Illich calls “the Eigen-value [from the German eigen, “own,” in the sense of belonging or being “proper to” that which has, manifests, or in short shines forth with and in, it] of each and every vernacular reality,” that is, every local, native, domestic, home-grown and home-growing, concrete, really real reality.

Accordingly: “What the scientific observer sees through his diagnostic spectacles are not men and women who really act in a gendered subsistence society but sexual deviants from an abstract, genderless cultural norm who have to be operationalized, measured, ranked, and structured into hierarchies.” Thus, as Illich then concludes his discussion in this footnote by writing: “Cultural anthropology that operates with genderless concepts is inevitably sexist,” with a sexism that is “much more blinding than old-style ethnocentric arrogance.”

Later in the same chapter, in a footnote discussion entitled “Ambiguous Complementarity” (footnote 57, bottom of pages 75-76), Illich himself nicely grasps in his own thought just what such pseudo-concepts as exchange actually accomplish, which has nothing to do with vision, but everything to do with imposition. I have already given that passage above, as the second epigraph for this post, but it bears repeating here, to end today’s post:

Exchange drives partners toward ever clearer fit (homogeneity and not ambiguity), whose asymmetry therefore tends toward hierarchy and dependence. Where exchange structures relationships, a common denominator defines the fit. Where ambiguity constitutes the two entities that it also relates, ambiguity engenders new partial incongruities between men and women, constantly upsetting any tendency toward hierarchy and dependence.

*     *     *     *     *     *

My next post will finish the current series on “The Traumatic Word.” (I promise!)

* Of course, a select few are singled out by the schooling system to become hyper-educated (Ph.D.’s like me, for example), but just as the income gap between the monetarily rich and the monetarily poor keeps on widening, so does the education gap between us members of the hyper-educated elite and the common folk whom one of my colleagues at the University of Denver used to dismiss as “the great unwashed.” As to how schooling pursued beyond the tipping point at issue can create its own teacher-caused equivalent to doctor-caused illnesses, I am reminded of something I used to tell the students in my own classes, before I learned more skillful means of subverting the university: “Any idiot can get a Ph.D.—in fact, being an idiot helps.” In Shadow Work, published in 1981 (Boston and London: Marion Boyars), two years before Gender, Illich himself writes (page 31): “Students ask if they are in school to learn or to collaborate in their own stupefaction. Increasingly, the toil of consumption overshadows the relief consumption promised.”

 

The Traumatic Word (3)

This is the third post in a series on “The Traumatic Word.”

*     *     *     *     *     *

All that glitters is not gold.

— Old commonplace

Even for us, gold still glitters. However, we don’t any longer attend especially either to gold or to glittering . . . We have no sense for that “sense” any longer. Insofar as gold “is” gold for us, it is only as a metal that carries value.

— Martin Heidegger

 

The word gives voice to the silence it breaks.

Sometimes during the second half of my long university teaching career, I would bring a small Tibetan meditation gong to class, to give the students an opportunity to experience two different modalities of listening, as I myself had first experienced them once by fortuitous accident. I would ask the students to find a comfortable position in their chairs, close their eyes gently, and hold themselves relaxed but attentive. Then, before ringing the bell, I would tell them to focus their attention on the sound of the ringing itself, and to hold onto the sound for as long as they could continue to hear it, however dimly, then just to stay quiet and attentive, eyes closed. After giving the ringing sound ample time to die away, I would ring the bell again. This time, however, I would first direct the students not to focus on the ringing of the bell as such, trying to hear it as long as they could, but rather to listen for the silence to return to the bell.

Afterwards, the class and I would talk about the difference between the two experiences of listening. Some of the students reported that they really hadn’t been able to tell any difference. However, others—usually a smaller number, which is to be expected, for reasons I need not discuss here—would report surprise at just how different in quality the two experiences were.

I would then end by encouraging all of the students, whichever of those two reporting groups they belonged to, to practice the two different ways of listening on their own. I know from subsequent feedback that some did, but I also have good grounds for suspecting that most did not—for reasons similar to those I think account for the disparity in size between the two reporting groups, but that, once again, I do not need to discuss here.

As I already remarked above, when I first experienced the difference at issue myself it was not under any special guidance or direction, but just by serendipity. It happened twenty or so years ago. I was quietly meditating one fall morning, with my eyes gently closed, outside the chapel of the secluded Benedictine monastery where I’ve retreated for a few days from time to time for the last quarter-century. As I was calmly and quietly sitting there, thinking nothing, the bell in the chapel tower began to ring, calling the monks to come together for one of their daily sessions of common prayer. Calm and comfortable yet attentive as I found myself at that moment to be, I just continued to sit there, eyes closed, thinking nothing, and just let the ringing of the bell continue to sound. I was so calm and comfortable that I didn’t even find myself listening to the ringing itself. Rather, as I said, I just let it go on, giving it no special attention, but still fully aware of it in my open, attentive frame of mind. To my surprise, as the sound of the rung bell died away, I heard the silence return to the bell, and with it to the world of the monastery as a whole.

Through the slow dying away of the bell’s ringing, I heard the silence itself begin to ring.

*     *     *     *     *     *

Decorations, ornaments and adornments are there to call attention to what they decorate, ornament, or adorn. So they glitter, like gold.

Der Spruch des Anaximanders is a manuscript that Heidegger wrote, apparently in the 1940s, for a never-delivered lecture course, but that was not published until 2010, when it came out as volume 78 of his Gesamtausgabe (GA: the “Complete Edition” of Heidegger’s works published by Vittorio Klostermann in Frankfurt). The title means “the saying (or ‘dictum,’ to use a common Latin-derived term) of Anaximander.” Anaximander was the second of the three “Milesians” (the first being Thales, and the third Anaximenes), so called because all three lived in Miletus, a Greek colony in Asia Minor. The three have gone down in tradition as the first three philosophers. Only one saying or dictum of Anaximander’s has survived, and that saying is what is at issue for Heidegger in his manuscript.

At one point in the text, Heidegger has a lengthy discussion about gold, and what gold was for the ancient Greeks. I have taken my second epigraph for this post, above, from that discussion (from a passage to be found on page 70 of GA 78). In addition, a bit earlier in the same discussion (on page 67) Heidegger himself cites the German version of the old commonplace I used for my first epigraph for this post, “All that glitters is not gold,” which in the German Heidegger uses is, “Es ist nicht alles Gold, was glänzt.”

That commonplace, Heidegger goes on to add, contains implicitly the recognition that “gold is what authentically glitters, such that on occasion what also glitters can appear to be gold, even though that appearance is a sheer semblance.” The German glänzen means to glitter, that is, to sparkle, glisten, or shine. That last word, shine, can be used as a verb, as I just used it in the preceding sentence, but also as a noun, as when we speak about the shine of a pair of polished shoes, or of gold itself. The noun shine is indistinguishable in sound from the German equivalent, Schein. To form the infinitive of the corresponding verb, “to shine,” however, German adds the suffix –en to form scheinen, which in turn can become again a noun when given a capital first letter, Scheinen. The German phrase “das Scheinen” would need to be translated in some contexts as “the shining” (as in the title of the famous Stephen King novel or Stanley Kubrick’s movie version thereof). In other contexts, however, it would need to be translated differently, as I have done in quoting Heidegger in saying that what isn’t gold can sometimes appear to be gold although that appearance is “a sheer semblance,” which I could also have rendered as “a mere seeming”: “ein blosses Scheinen.”

To be sure, not everything that glitters is gold. However, whatever is gold does glitter. Glittering, sparkling, glistening, shining, belongs essentially to gold, constituting its very being-gold, its very golden-ness. So says Heidegger at any rate. Glittering or shining as such (page 68) “belongs to being-gold itself, so truly that it is in the glittering [or shining: das Glänzen] of gold that its very being(-gold) resides.” Glittering resides essentially in gold regardless, Heidegger says, of whether the gold has been polished up already, or is still dull from being newly mined, or has had its shine go flat through neglect.

Gold glitters. It shines. That is the very purpose of gold, what it is for: to shine. In other words, gold as such, the golden, has no “purpose,” is not “for” anything. It just shines. Gold is simply lustrous, that is, “filled with luster,” from Latin lustrare, “spread light over, brighten, illumine,” related to lucere, “shine.” As essentially shining in itself, gold adds shine to that on which it shines, as it were: as lustrous, filled with luster, it is suited in turn to add luster to what is suited to wear or bear it.

Hence the role that gold has always had as decoration, ornament, and adornment. Decorate derives from Latin decoris, as does decorous. Latin decoris is the genitive form of decus, from the presumed Indo-European root *dek-, “be suitable.” What is decent, from the same root, is what is becoming, comely, befitting, proper; what is decent is what is suitable.

Ornament comes from Latin ornare, which means to equip, to fix up or deck out, to adorn—which last ends up saying the same thing twice, since adorn also comes from ornare, plus the prefix ad-, “to.”

Worn decorously, gold adorns those it ornaments: When it fits, it adds luster to what it decks out.

*     *     *     *     *     *

W. G. Sebald devotes one of his essays in A Place in the Country (New York: Random House, 2013) to Gottfried Keller, the great nineteenth-century Swiss poet, novelist, and story-teller. “One might say,” writes Sebald in the essay, “that even as high capitalism was spreading like wildfire in the second half of the nineteenth century, Keller in his work presents a counter-image of an earlier age in which the relationships between human beings were not yet regulated by money.”

A bit later in the same essay Sebald writes: “It is, too, a particularly attractive trait in Keller’s work that he should afford the Jews—whom Christianity has for centuries reproached with the invention of moneylending—pride of place in a story intending to evoke the memory of a precapitalist era.” Sebald then recounts how, in that story, Jews are welcomed into a shop built not on capital but on barter—a shop, thus, that serves as an example of just such a pre-capitalist era. The non-Jewish proprietress welcomes itinerant Jewish traders among those who regularly frequent her shop, inviting them to come inside to sit and talk.

When the talk in the shop turns to tales of how the Jews abduct children, poison wells, and the like, those Jewish traders, writes Sebald:

merely listen to these scaremongering tales, smile good-humoredly and politely, and refuse to be provoked. This good-natured smile on the part of the Jewish traders at the credulity and foolishness of the unenlightened Christian folk, which Keller captures here, is the epitome of true tolerance: the tolerance of the oppressed, barely endured minority toward those who control the vagaries of their fate. The idea of tolerance, much vaunted in the wake of the Enlightenment but in practice always diluted, pales into insignificance beside the forbearance of the Jewish people. Nor do the Jews in Keller’s works have any dealings with the evils of capitalism. What money they earn in their arduous passage from village to village is not immediately returned to circulation but is for the time being set to one side, thus becoming like the treasure hoarded by Frau Margaret [the non-Jewish proprietress of the shop herself], as insubstantial as gold in a fairy tale.

Sebald then concludes the passage: “True gold, for Keller, is always that which is spun with great effort from next to nothing, or which glistens as a reflection above the shimmering landscape. False gold, meanwhile, is the rampant proliferation of capital constantly reinvested, the perverter of all good instincts.”

In their remarks on gold, Sebald and Heidegger are two fingers pointing to the same thing.

*     *     *     *     *     *

The English word order derives from the same roots as do the English words ornament and adorn. All three come from the Latin ornare, which, as I’ve already noted, means to equip, to fix up or deck out. That is fitting, which is to say decorous, since proper order—well-ordered order, we well might say, as opposed to disordered order (or “dysfunctional” order, to use some currently commonplace jargon, even though it has already lost much of its shine, having been in circulation for quite a while by now)—is there for the sake of what it sets to order, rather than the other way around.

Proper order is an ornament to be worn by what it orders, in order to let the latter come fully into its own radiance, its own shine. Such proper order is rare, so rare as to be genuinely golden.

What is genuinely golden—what shines of itself, and needs no trafficking in the market to give it monetary value—does not really call attention to itself, properly speaking. Rather, like the sun in Plato’s Divided Line at the end of Book VI and Myth of the Cave at the start of Book VII in the Republic, which calls attention to that on which it shines, but, as shining itself, vanishes in its own blinding brilliance, the genuinely golden calls attention to that which it adorns.

Soon after the lines I have used as this post’s second epigraph, in which Heidegger says that we of today have lost all sense for the genuine sense of gold and the golden, he observes that ornaments, decorations, and adornments do not as such call attention to themselves for their own sake, but rather to that which they ornament, decorate, or adorn, for its sake. As he writes (on page 73), “decoration and ornament [der Schmuck und die Zier] are in their proper essence nothing that shines for itself and draws the glance away from others to itself. Decoration and ornament are far rather such wherein [that is, in the “shine” of which, we might say] the decorated is first made ‘decorous’ [“schmuck”: “bejeweled,” that is, “decked out, as with jewels”—so “neat,” “natty,” “smart,” in effect], that is, stately [stattlich, “imposing,” from a root meaning “place”—so: having “status”], something that, upright in itself, has a look [hat ein Aussehen, a word that also suggests “splendor”: “good looks,” in effect, to go with its imposing status] and stands out [hervorragt], that is, itself comes to appearance [zum Scheinen].”

Thus, for instance, jewelry does not distract attention from the one who decorously wears it, the one to whom it is fitting or suited. Rather, decorously worn, jewelry calls attention to the splendor already there in the wearer, adding luster to that luster. It lets the wearer shine forth in all her own glory, shining brilliantly with all her own splendor, radiant.

So adorned, the radiant one is there to be adored.

*     *     *     *     *     *

The two words, adorn and adore, have distinct etymologies. The former, as I’ve already noted, comes from ad-, “to,” plus ornare, “to deck out, add luster to.” On the other hand, adore comes from ad- plus orare—with no ‘n,’ just as English adore is bare of the sound of ‘n’ that gets added to adorn. Orare means “to speak,” most especially in the decorous, stately sense of “praying” or “pleading,” as in delivering an “oration,” a formal speech before a court or other august assembly, a speaking that honors and thereby “praises” the high standing of the assembly being addressed.

Despite the disparate etymologies of the two terms, my own hearing discerns a deeper, semantic resonance between adorning and adoring. To add luster to what is already lustrous, as adornments add shine to those who already shine of themselves, polishing that shine to its own full radiance, and to speak to and of what already speaks for itself, addressing it in such a way as to honor its stature, attesting to its renown, fit together. Each, adorning and adoring, adds luster to the other in my eyes. Each praises the other—as creation, in Christian tradition, is said to praise its Creator.

Adornments speak well of those they decorously adorn. When decorous, adornments fit the adorned, fitting them in such a way as to defer to them, letting the adorned come forth in their own glory, bespeaking the radiance of the adorned, rather than boasting of their own adorning sparkle.

So do I like to think, at any rate. It fits for me. Most especially it fits my experience, years ago, of sitting outside the monastery as the bell rang, calling the community together to pray, and calling my own attention not to itself but to the silence it decorously broke, giving it voice—calling: “Oh come, let us adore!”

*     *     *     *     *     *

I plan to complete this series on “The Traumatic Word” with my next post.

The Traumatic Word (2)

This is the second post in a series on “The Traumatic Word.”

*     *     *     *     *     *

The word in its purest form, in its most human and divine form, in its holiest form, the word which passes orally between man and man to establish and deepen human relations, the word in a world of sound, has its limitations. It can overcome some of these—impermanence, inaccuracy—only by taking on others—objectivity, concern with things as things, quantification, impersonality.

The question is: Once the word has acquired these new limitations, can it retain its old purity? It can, but for it to do so we must reflectively recover that purity. This means that we must now seek further to understand the nature of the word as word, which involves understanding the word as sound.

— Walter J. Ong, S. J., The Presence of the Word (page 92)

The spoken word is a gesture, and its meaning, a world.

— Maurice Merleau-Ponty, Phenomenology of Perception (page 184)

We listen not so much to words as through them.

Many years ago, when I first had to start wearing glasses, which was not until well into adulthood, it took me a while to adjust, as is common. Until that adjustment had taken place, I often found myself seeing my glasses themselves, rather than (or at least in addition to) what I saw through them. My eyes were unsure, as it were, about just where to focus: on my glasses, or on what lay beyond them. During that adjustment period, the glasses were more of a distraction to my vision than an enhancement of it. I found myself wanting to look at my glasses, rather than through them.

Similarly, when, some years later, I had to start wearing hearing aids, at first they were also more distractions to my hearing than aids to it. I found myself wanting to listen to the hearing aids, rather than through them.

As is true for any good, useful tool, the job of glasses and hearing aids is to vanish into their usage—in the case of glasses and hearing aids, into the vision and audition they are respectively designed to make possible. That’s just what both my glasses and my hearing aids did, at least as soon as I’d adjusted to wearing them.

Insofar as words are no more for us than means of conveying information or “messages” back and forth between “senders” and “receivers,” they too, at least when they are good little words, vanish into their usage. Otherwise, they become “noise” in the sense at issue in information theory: “interference” that distorts the message, just as static does on a radio. Words that call attention to themselves are just so much noise, when it comes to the transfer of information.

It is worth noting that, taken as the Word of God, Jesus is very noisy. He constantly calls attention to himself in one way or another.

*     *     *     *     *     *

At one point in The Presence of the Word Walter J. Ong discusses how the word, as spoken sound, is “noninterfering” (page 147), whereas in contrast the gesture is “interfering” (page 148). By that he does not mean that the word is low on the noise-making scale, and the gesture high on it. Obviously, the contrary is the case. As sound, the word is nothing but noise, whereas a gesture makes no noise at all. The word is to be heard, and therefore must sound off; it must make noise. The gesture, however, is given to be seen.

Of course, in saying such things I am clearly just playing with the word noise, since the noisiness of the word is not a matter of its interference with the delivery of a message, but is instead actually essential to the usefulness of the word for carrying messages. A word that made no noise, in the sense that it did not sound at all, would be a word that remained unspoken and therefore incapable of sending any message, of conveying any information whatever. In turn, however, the same thing applies to the gesture: a gesture that called no attention to itself—which made no noise in that sense—would be no less incapacitated as an information-transfer system than would a never-sounded word. It would be tantamount to a gesture that did not “give itself to be seen” in the first place, and therefore utterly failed to deliver any message at all.

By making such noise about the word noise, by playing noisily with that word, what I want to call to readers’ attention is, at least in part, that when Ong says the sounded word is “noninterfering,” whereas the gesture is “interfering,” he is not using that latter term the same way it is used in information theory. Rather, what he means when he says the sounded word, the voice, is “noninterfering” is, he explains, that “one can use the voice while doing other things with the muscles of the hands, legs, and other parts of the body.” In contrast, the gesture is “interfering”: “It demands the cessation of a great many physical activities which can be carried on easily while one is talking.”

Despite differentiating between gesture and word in that way, Ong nevertheless writes (on page 148) that “[i]t may be that human communication began with gesture and proceeded from there to sound (voice). Gesture would be a beautiful beginning, for gesture is a beautiful and supple thing.” If we take that suggestion seriously, then it may even turn out that the word itself remains a gesture—only a vocal, audible gesture, rather than a nonvocal, visible one. That would still fit with Ong’s point about the voiced word being “noninterfering,” since it would simply require confining “interfering” to non-vocal gestures. And that, in turn, would still leave room for what Ong says next, right after remarking on the beauty of a possible gestural beginning for the word: “But, if this was a development which really took place, the shift from gesture [that is, now: non-vocal gesture] to sound [vocal gesture] was, on the whole, unmistakably an advance in communications and in human relations.”

Yet even if that be granted, it still remains the case that, in the sense of “interference” at issue in information theory, as opposed to Ong’s own usage of that term, it is not just what he calls gesture, that is, what I just suggested might better be called “non-vocal gesture,” that “interferes.” Rather, both his “gesture” (my “non-vocal gesture”) and his “word” (my “vocal gesture”) are essentially “interfering.” That is, both by their very nature throw up obstacles to optimum transparency of any “message” they might be used to carry, any transmission of information they might be used to accomplish. That is because both call attention to themselves, not just to what comes packaged in them.

The beauty of gesture to which Ong himself calls attention is inseparable from gesture’s thus calling attention to itself. Beauty does that. It stops us in our tracks, brings us up short, dazzles us, stuns us, shocks us into silence and admiration (from Latin mirare, “to look,” and ad, “to” or “at”), though we also extend our usage of admire with ease to cover our attitude toward perceived auditory beauty, beauty that is heard rather than seen. Both gestures and words (or non-verbal gestures and verbal ones, if that is what the distinction at issue finally turns out really to be) have that arresting quality. Both a raised middle finger and its verbal equivalent, for example, have it.

*     *     *     *     *     *

Whether I silently “give the finger” to people or yell “Fuck you!” at them, in either case I am telling them the same thing. What is more, however, in telling those to whom they are directed whatever they do tell them, both gestures, the nonverbal and the verbal, tell it in a way designed to call attention to the telling itself. For both, just delivering information is far from all they are doing, or even the most important thing they are doing.

What they are doing, when taken in their fullness as gestures, is actually sharing a world. To be sure, the specific nonverbal and verbal gestures I have chosen as my examples (flipping someone off, or telling someone the same thing verbally) share the world with the person to whom they are directed in a very polemical, which is to say war-like, way (from Greek polemos, “war” or “strife”—which, according to Heraclitus, is “the father of all things”). Such gestures, verbal or not, convey enmity, even hatred. Indeed, it is for that very reason that I have chosen them as my examples.

As Sartre was good at pointing out, hate no less than love is a way of taking the other person seriously. It is a way of remaining genuinely in communication with that other person, rather than breaking the communication off. What breaks off communication—or never lets it get started in the first place—is not hate, but rather the indifference of passing one another by, unheeded.

In communicating with one another, we certainly process information back and forth. By yelling, “Fuck you!” at someone, I convey considerable information to that person, should said person wish to treat my behavior as no more than a message to be processed—ignoring me and focusing instead on decoding whatever information my behavior encodes. Such a decoder could decode lots and lots of bits of information from that single bit of my behavior: information about me (such as information about the current condition of my vocal apparatus, or where I was born, from details of my pronunciation); information about the culture from which I come; information about the decoder himself or herself (including that he or she apparently just did something that somehow triggered my outburst, and may even be under immediate threat of danger from me as a result, should I stop yelling and start acting). My behavior is chock full of all sorts of information, enough to satisfy any would-be decoder. However, in ignoring me to focus instead on decoding the information contained in my outburst, the person to whom I directed that outburst would run the very real risk of just enraging me further through such a display of personal indifference.

Sartre’s point that hating someone is a way of remaining in relationship with that person can be put in a more Heideggerian way by saying that hating is continuing to care about the other person. Ong also makes essentially the same point in The Presence of the Word, when he says that no matter how polemical or even verbally abusive talk between people may become, at its core (page 192) “[t]he word moves toward peace because it mediates between person and person.” As he proceeds to point out (page 193):

When hostility becomes total, the most vicious name-calling is inadequate: speech is simply broken off entirely. One assaults another physically or at least ‘cuts’ him by passing him in total silence. Or one goes to court, where, significantly, the parties do not speak directly to each other but only to the judge, whose decision, if accepted as just by both parties, at least in theory and intent brings them to resume normal conversation with each other once more.**

To pass from speech, no matter how vicious or even abusive, to a fist striking a jaw or a bullet tearing flesh is to cease gesturing at all any longer, whether verbally or nonverbally. To send a fist into the face of another or a bullet into that other’s chest is not to gesture at anyone. It is to break off all gesturing, and therewith to break off all genuine further communication.

To continue with Ong’s ways of formulating things, what is truly distinctive about communication, properly so called, is that it is the sharing with one another of what is “interior” with regard to each of the communicants—sharing it precisely as “interior,” so that it continues, in its very being shared, still to be closed off, unseen, not laid out in the open, in short, continues to be invisible. That is why Ong repeatedly insists that the word as such is sound. Sound alone can plumb the interior depths that vision—or taste or smell or touch, for that matter, in the final analysis—can never attain, depths that vision can never “sound,” as we by no accident say. Sound sounds from, and “resounds” or “resonates” from, the interior of that which is sounding, whether sounding of itself (as does the animal in its cry or the human being in speaking) or sounding through the action of another (as does a melon when thumped or a wall when knocked).

In that telling sense, communication is the sharing of what can never be processed as information, in short, the sharing of the un-sharable. Ultimately, to communicate is to give voice to the incommunicable.

*     *     *     *     *     *

“The spoken word is a genuine gesture, and it contains its meaning in the same way as the gesture contains it. This is what makes communication possible.” So writes Maurice Merleau-Ponty in his 1945 Phenomenology of Perception (translated by Colin Smith, London: Routledge & Kegan Paul, 1962, page 183). Those two sentences occur a bit earlier in the same passage that ends with the line I used for my second epigraph at the beginning of this post. Right after those two sentences, the passage at issue continues as follows (pages 183-184):

In order that I may understand the words of another person it is clear that his vocabulary and syntax must be ‘already known’ to me. But that does not mean that words do their work by arousing in me ‘representations’ associated with them, and which in aggregate eventually reproduce in me the original ‘representation’ of the speaker. What I communicate with primarily is not ‘representations’ or a thought, but a speaking subject, with a certain style of being and with the ‘world’ at which he directs his aim. Just as the sense-giving intention which has set in motion the other person’s speech is not an explicit thought, but a certain lack which is asking to be made good, so my taking up of this intention is not a process of thinking on my part, but a synchronizing change of my own existence, a transformation of my being.

Nevertheless, because to live in the world together is also to live in, with, and by building, “institutions” together, there is a tendency of the spoken word to lose its sonority, as it were—to lose what, favoring the visual over the auditory as our culture has done since the Greeks (that, too, has become institutionalized), we might well call the word’s “shine” or even its “glitter.” The word comes no longer to call attention to itself, but instead sinks down to the level of the commonplace utterance, and language becomes no more than a system of signs. The word no longer calls out to be heard, and to be given thought. Accordingly, the passage from Merleau-Ponty continues:

We live in a world where speech is an institution. For all these many commonplace utterances, we possess within ourselves ready-made meanings. They arouse in us only second order thoughts; these in turn are translated into other words which demand from us no real effort of expression and will demand from our hearers no effort of comprehension. Thus language and the understanding of language apparently raise no problems. The linguistic and intersubjective world no longer surprises us, we no longer distinguish it from the world itself, and it is within a world already spoken and speaking that we think. We become unaware of the contingent element in expression and communication, whether it be in the child learning to speak, or in the writer saying and thinking something for the first time, in short, in all who transform a certain kind of silence into speech. It is, however, quite clear that constituted speech, as it operates in daily life, assumes that the decisive step of expression has been taken. Our view of man will remain superficial so long as we fail to go back to that origin, so long as we fail to find, beneath the chatter of words, the primordial silence, and as long as we do not describe the action which breaks this silence.

Silence is broken by the action of speaking, of sounding the word. Hence, Merleau-Ponty ends his long passage with the line I already used as my second epigraph for this post:

The spoken word is a gesture, and its meaning, a world.

*     *     *     *     *     *

My next post will continue this series on “The Traumatic Word.”

** In future, I may devote one or more posts to how it stands between the word, sound, and peace—especially today, our endless day of global market capitalism. If so, I may call the post/s something such as “Shattering Silence of Peace.”

The Traumatic Word (1)

In the strict sense, the word is not a sign at all. For to say it is a sign is to liken it to something in the field of vision. Signum was used for the standard which Roman soldiers carried to identify their military units. It means primarily something seen. The word is not visible. The word is not in the strict sense even a symbol either, for symbolon was a visible sign, a ticket, sometimes a broken coin or other object the matching parts of which were held separately by each of two contracting parties. The word cannot be seen, cannot be handed about, cannot be “broken” and reassembled.

Neither can it be completely defined.

— Walter J. Ong, S. J.

We would like language to be no more than a system of signs, a means for conveying information. At least since Aristotle, and down past C. S. Peirce to the present day, that view of language has been all but universally taken for granted, just assumed as true. It isn’t, as Walter J. Ong realized.

Ong was a United States professor of English who focused upon linguistic and cultural history—especially the cleft between oral and literary cultures, which was the topic of his most influential work, Orality and Literacy: The Technologizing of the Word, originally published in 1982. The lines above are taken from an earlier work, however. They are from the next-to-last page of The Presence of the Word: Some Prolegomena for Cultural and Religious History, first published in 1967 but consisting of lectures Ong gave by invitation at Yale in 1964, as the Dwight Harrington Terry Foundation Lectures On Religion in the Light of Science and Philosophy for that year.

Besides being a professor of English, with a Ph.D. in that field from Harvard, Ong had done graduate work in both philosophy and theology, and was also a priest of the Society of Jesus, that is, the Jesuit order, as the “S. J.” after his name indicates. That religious provenance is manifest in his work. In The Presence of the Word, it is especially evident in Ong’s focus not just on any old word, so to speak, but on “the” word in a particular sense. His concern in his Terry Lectures is not just with “words in general,” as the ordinary way of taking his title would suggest. So understood, “the word” in Ong’s title would function the same way “the whale” functions in the sentence, “The whale is a mammal,” which is equivalent to “All whales are mammals,” thus picking out a feature that is common to whales in general, applying indifferently to each and every whale whatever. Ong’s underlying focus in his Terry Lectures, however, is not upon words in general but rather upon the word in the distinctive sense that one might say, for example, that Mount Everest is not just a mountain but rather the mountain, the very embodiment of mountain as such.

Befitting the intent of the grant establishing the Terry Lectures, Ong’s underlying focus in The Presence of the Word, furthermore, is not upon some word that might come out of just anyone’s mouth. It is, rather, upon one uniquely singular word that comes out of one uniquely singular mouth—namely, “the Word of God.” At issue is the Word of which John says in the very opening verse of his version of the Christian Gospel (John 1:1): “In the beginning was the Word, and the Word was with God, and the Word was God.”

Thus, to put it in terms that became traditional within Christianity only long after John but based upon his Gospel, Ong’s underlying focus in The Presence of the Word is on Christ, the Second Person of the Trinity.

*     *     *     *     *     *

Alain Badiou’s seven-session seminar in 1986 was devoted to Malebranche (published in French by Fayard in 2013 as Malebranche: L’être 2—Figure théologique). In his session of April 29, 1986, Badiou argued that Malebranche, being the committed Christian thinker that he was, found it necessary to think of God’s being (être) in terms of the cleavage (clivage) of God into Father and Son—which, we should note, though Badiou himself calls no special attention to it at this point, is a self-cleavage definitive of the Christian God’s very being, such that God is God only in so self-cleaving.

However, to think of God’s being by thinking it back into his self-cleavage into Father and Son is to empty the thought of God of any substantial content beyond self-cleaving action itself: “In the retroaction of his cleavage,” as Badiou puts it (page 149), “God is empty: he is nothing but his process, his action.” God, so thought, is nothing but the very action set in action by the act of God’s self-cleaving. God voids God-self of any substantively separate self in such self-cleavage, and is only in such vanishing.

*     *     *     *     *     *

It is no accident—and it is deeply resonant with the opening of John’s Gospel, it bears noting—that Walter Ong, long after Malebranche but more than twenty years before Badiou’s seminar on the latter, says the very same thing of the word. According to Ong (page 9 of The Presence of the Word), the emergence of electronic media in the 20th century “gives us a unique opportunity to become aware at a new depth of the significance of the word.” Not many pages later (on page 18) he expands on that point, writing: “Our new sensitivity to the media has brought with it a growing sense of the word as word, which is to say of the word as sound.” That growing sense of the word as word calls upon us to pay “particular attention to the fact that the word is originally, and in the last analysis irretrievably, a sound phenomenon,” that is, the fact that originally and always the word sounds. The word as word—which is to say the word as saying something—is the word as sound. The word only speaks by sounding.

Not every sound is a word, of course. However, every word is a sound. Or, to put that more resoundingly—that is, to make the sound louder (using the re- of resound not in its sense of “again,” but rather in its intensifying sense, as when we speak of a “resounding success”)—the word as word is nothing but sound, or rather sound-ing. As Malebranche’s God is nothing but his own process or action, so is the word nothing but “how it sounds,” if you will.

The word as sound, Ong insists repeatedly, is pure event. “A word [as spoken sound] is a real happening, indeed a happening par excellence” (page 111). In that sense, we might say that the word never is, but rather forever vanishes. The word as word is a “vocalization, a happening,” as Ong puts it at one point (page 33), adding a bit later (on pages 41-42):

Speech itself as sound is irrevocably committed to time. It leaves no discernable direct effect in space[. . .]. Words come into being through time and exist only so long as they are going out of existence. It is impossible [. . .] to have all of an utterance present to us at once, or even all of a word. When I pronounce “reflect,” by the time I get to the “-flect” the “re-” is gone.* A moving object in a visual field can be arrested. It is, however, impossible to arrest sound and have it still present. If I halt a sound it no longer makes any noise [that is, no longer “sounds” at all].

The word’s sounding is its event-ing, its coming forth in its very vanishing: as sounding, it “does not result in any fixity, in a ‘product,’” but instead “vanishes immediately” (page 95). The word as such is a vanishing that, in so vanishing, speaks, or says something. It speaks or says, as Ong observes (page 73), in the sense “caught in one of the accounts of creation in Genesis (1:3): ‘God said, Let there be light. And there was light.’ ” Such saying is creation itself, as the very letting be of what is bespoken.

In thus vanishing before what it calls forth, just what does the word—not just any old word, but the word as word—say?

It says the world.

*     *     *     *     *     *

More than once in his lecturing and writing, Heidegger addressed a poem by Stefan George entitled “Das Wort” (“The Word”), the closing line of which is: “Kein ding sei wo das wort gebricht.” In German, gebrechen means “to be missing or lacking”; and sei is the subjunctive form of the verb sein, “to be”—as, for example, in the line “If this be love, then . . .”   If we take sei that way in George’s poem, then his closing line says something such as: “no thing may be, where the word is lacking.” It would then express the relatively commonplace idea that, if we don’t have a name for something, as a sort of label to attach to it, then that thing doesn’t really take on full, separate status for us, such that we can retain it clearly in our thought, memory, and discourse with one another. That’s the idea that a thing really and fully “is” for us, separate and distinct from other things, only when we come up with such a name by which to label it—as, for example, an old bit of what passes for popular wisdom has it that we, who do not have a whole bunch of different names for different qualities of snow, such as the Eskimos are said to have, are not really able to see those differences, at least not with the clarity and ease with which the Eskimos are purported to be able to see them.

At the same time, however, sei is also the imperative form of the same verb, sein, “to be”—the form, for instance, a teacher might use to admonish a classroom full of unruly children, “Sei ruhig!” (“Be still!”). Taken that way, George’s closing line would have to be rendered as the imperative, “Let no thing be, where the word is lacking.”

What’s more, gebrechen, “to be missing or lacking,” derives from brechen, “to break,” which is not heard any longer at all in “missing” or “lacking.” At the same time, used as a noun, ein Gebrechen means a more or less lasting debilitation of some sort, such as a chronic limp from an old broken leg, or a mangled hand from an industrial accident (and it is interesting, as a side-note, that “to lack” in German is mangeln). If we were to try to carry over some part of what thus sounds in the German gebrechen, then we might translate the word no longer as “to be missing or lacking,” but instead by something such as “to break” (as the waves break against the shore), or “to break off” (as a softly sounded tone might suddenly be broken off in a piece of music, perhaps to be suddenly replaced or overridden by another, more loudly sounded one—or by a demanding call coming in on a cell-phone with a ringer set on high volume), or “to break up” (as the voices of those stricken by grief might break up when speaking of their losses).

Hearing gebricht along such lines, the closing verse of George’s poem “The Word” would say something to the effect that where the word breaks, or breaks off, or breaks up, there is no thing.

The way I just worded the end of the preceding sentence—“there is no thing”—is intentionally ambiguous, designed to retain some of the rich ambiguity of George’s own line, most especially a part of its ambiguity which is important to what Heidegger would have us hear in that line. To say that where the word breaks, or breaks off, or breaks up, “there is no thing” can be taken two different ways. First, it can be taken to say that no thing “exists.” That way of taking it would fit with the presumably common way of taking George’s line articulated above, whereby that line says that things fully “are” or “exist” for us as distinct and separate things only when we have names for them in their distinctness. However, the same phrase, “there is no thing,” can also be taken in a second way, one in which the first word is emphasized: “there”—that is, at such and such a specific place. At what place, exactly, would no thing be? By George’s line, no thing would be exactly there, where the word breaks up, breaks off, just breaks: There, where the word breaks, don’t look for any thing. There, where the word breaks, you will have to look for something else altogether, something that really is no “thing” at all.

Yet if we are not to look for any thing there, where the word breaks, just what are we to look for? What are we to expect to take place there, where the word breaks? Heidegger’s response to that question is that there, where the word breaks, no thing, but rather the “is” itself takes place—the very letting be of whatever may be, as it were, takes place there.

“Thar she blows!” old whalers would call, at least by our stereotypes of them, when a whale broke the water’s surface again after diving when harpooned. “There she be!” they could as well have said, though less colorfully. Well, where the word breaks, there be world.

Just how would the word break—in the sense that the waves break against the beach or Moby Dick breaks the ocean’s surface—if it were not as sound, breaking against silence? Sounding in the silence, the very silence that it breaks, the word is word: It speaks.

As I said before, what the word says—what it says there, where it breaks out, and up, and off as sound—is world.

*     *     *     *     *     *

At this point, I will break off my reflections on “The Traumatic Word,” to resume them, given the breaks to do so, in my next post.

* That is worth repeating. So Ong repeats it almost twenty years later, in Orality and Literacy, just varying his example: instead of using “reflect,” he uses “existence,” and says that by the time I get to the “-tence,” the “exist-” no longer exists. That example especially well suits the word itself, which as word—that is to say, as sound sounding—“exists only at the instant when it is going out of existence,” to use Ong’s way of putting it at one point in The Presence of the Word (page 101).

Pulling Out of the Traffic: The Après-Coups After The Coup (3)

This is the third and final post of a series.

*     *     *     *     *     *

Third After-Shock: Flashes of Imagination

I do not, in the conventional sense, know many of these things. I am not making them up, however. I am imagining them. Memory, intuition, interrogation and reflection have given me a vision, and it is this vision that I am telling here. . . . There are kinds of information, sometimes bare scraps and bits, that instantly arrange themselves into coherent, easily perceived patterns, and one either acknowledges those patterns, or one does not. For most of my adult life, I chose not to recognize those patterns, although they were patterns of my own life as much as Wade’s. Once I chose to acknowledge them, however, they came rushing toward me, one after the other, until at last the story I am telling here presented itself to me in its entirety.

For a time, it lived inside me, displacing all other stories until finally I could stand the displacement no longer and determined to open my mouth and speak, to let the secrets emerge, regardless of the cost to me or anyone else. I have done this for no particular social good but simply to be free.

— Russell Banks, Affliction

 

What a great distinction! Making up vs. imagining! To “make up” is to confabulate, to cover, to lie. So, for example, do those who claim power over others make up all sorts of ways in which the usurpation of such power is necessary “for the common good” or the like. In contrast, to imagine is to make without making up. It is to create, which is to say to open out and draw forth sense and meaning. Making up is telling stories in the sense of fibs and prevarications. Imagining is telling stories in the sense of writing fiction. The former is a matter of machinations and manipulations; the latter is a matter of truth and art.

The passage above comes early in Affliction (on pages 47-48). The words are spoken in the voice of the fictional—which means the imagined—narrator of the novel, Rolfe Whitehouse. Rolfe is telling the story of his brother Wade’s life, and therewith of his own life, too, as he remarks in the passage itself.

*     *     *     *     *     *

A mere symmetry, a small observed order, placed like a black box in a corner of one’s turbulent or afflicted life, can make one’s accustomed high tolerance of chaos no longer possible.

— Russell Banks, Affliction (page 246)

 

Imagine, for example, a big black cube, surrounded by a neon glow, appearing in the sky over Oakland, setting off car horns and causing dogs to bark throughout the city in what soon ceases to sound like sheer cacophony, and becomes a new, hitherto unheard of harmony, in the sounding of which everyone is invited to join, each in each’s own way. Such a thing might all of a sudden make those who witnessed it no longer suited to tolerate the chaos in which, they now suddenly see, they had been living till then, without even knowing it.

*     *     *     *     *     *

. . . facts do not make history; facts do not even make events. Without meaning attached, and without understanding causes and connections, a fact is an isolate particle of experience, is reflected light without a source, planet with no sun, star without constellation, constellation beyond galaxy, galaxy outside the universe—fact is nothing. Nonetheless, the facts of a life, even one as lonely and alienated as Wade’s, surely have meaning. But only if that life is portrayed, only if it can be viewed, in terms of its connections to other lives: only if one regard it as having a soul, as the body has a soul—remembering that without a soul, the human body, too, is a mere fact, a pile of minerals, a bag of waters: body is nothing.

— Russell Banks, Affliction (page 339)

 

Ever since my mid-teens I have kept a sort of philosophical journal. That is, I’ve kept notebooks in which I’ve jotted down passages from what I was reading at the time that made me think, along with some of the thoughts they brought to me, or brought me to. For various periods of varied lengths I’ve let that practice lapse since then, but I always pick it up again eventually. For the last few years, there have been no lapses of any duration; and, in fact, my blog posts almost always arise from things I’ve already written more briefly about in my philosophical journals.

On our recent trip to San Francisco to watch our daughter work with The Coup, I carried my current philosophical journal along. Here’s what I wrote one morning while we were still out in the Bay area.

“The Essence of Accident, the Accident of Essence.”

That came to me this morning as the title for a possible blog post in which I’d explore the idea that the essential—or, more strictly speaking, the necessary—is itself essentially accident. That “accident,” the “accidental,” is precisely “essence,” the “essential.”

That goes with the idea of truth as event (and not, as Milner would say, as possible predicate of an event, a pro-position—to give an accidental connection, via my current reading and other experiences, its essential due). It was itself suggested to me by the accidental conjunction of a variety of factors, coming together with/in our trip out here to see [our daughter] perform with “Classical Revolution” (the name of the “group” from which the quartet with her on cello came) at/in conjunction with/as part of The Coup’s performance on Saturday, two days ago. Among those diverse but accidentally/essentially (i.e., as insight-bringing) connected factors are: (1) my reading in Heidegger’s Überlegungen [Reflections: from Heidegger’s so called “Black Notebooks,” which only began to be published this past spring in the Gesamtausgabe, or Complete Edition, of his works] this morning; (2) my ongoing reflection and talk (with [my daughter] and/or [my wife]) about Saturday’s “Coup” event; (3) my noticing yesterday one of the stickers on [my daughter’s] carbon-cello case, which sticker has a quote from Neal Cassady: “Art is good when it springs from necessity. This kind of origin is the guarantee of its value; there is no other.” That third factor was the catalytic one: the “necessity” Cassady is talking about has nothing to do with formal rules or mechanisms, but is precisely a matter of the “accidental,” which is to say be-falling (like a robber on the road), coalescence into a single work/flash/insight of all the diversity of factors that otherwise are chaotically just thrown together as a simultaneous series, as it were. . . . There’s another major factor so far not recorded as such: (4) attending The Coup’s performance at the Yerba Buena Center for the Arts in San Francisco on Saturday. That is the real arch-piece/factor here.

Which brings me to another possible blog post, which [my wife and daughter] yesterday suggested I should do, before the one on accidental essence and essential accidentality suggested itself to me this morning. That is a post about the impact of Saturday night’s event [that is, The Coup’s Shadowbox].

 

As readers of this current series of three posts to my blog already know, of course, I took my wife’s and daughter’s suggestion. But I expanded upon it, doing three posts about my experience of The Coup, rather than just one. And I was also able to incorporate it with my idea for a post on accident and essence, which became my preceding post, the second of the three of this series.

Whether there is any necessity to all that will have to speak for itself. (I can confidently say, at any rate, that it is not art.) All I know for sure is that my journal entry, and this subsequent series of three posts, came about from the accidental conjunction of the four facts I mention in the passage above, taken from my philosophical journal. That entry tells the tale of that conjunction, from which tale alone derives whatever significance or meaning those otherwise isolated particles of my experience may have.

*     *    *     *     *     *

I’ve just recently begun reading Wendy Doniger’s The Hindus: An Alternative History (New York: Penguin Press, 2009), a book that has been on my list to read ever since it first appeared, and that I’m finally getting around to. So far, I’m still in the first chapter, which is an introductory discussion. One of the lines that already especially struck me is this (on page 8): “This is a history, not the history of the Hindus.”

One reason that struck me when I read it was that earlier the same day I’d noted a remark Heidegger makes in his Überlegungen (on page 420 of Gesamtausgabe 94) about the “idols” we worship today (which is still the same day, really, as when Heidegger wrote his remark, back in the Nazi period). Today, among the idols we are most tempted to fall prey to worshipping are, by his partial listing: Science (with a capital ‘S’: “ ‘die’ Wissenschaft”), Technology (with a capital ‘T’: “ ‘die’ Technik”), “the” common good (“ ‘die’ Gemeinnutzen”), “the” people (“ ‘das’ Volk”), Culture (with a capital ‘C’: “ ‘die’ Kultur”). In all those cases, idolatry happens when we turn what are themselves really ways or paths of our life in the world with one another—including knowledges (“sciences”), know-hows (“technologies”), shared benefits (“common goods”), and cultivations (“cultures”)—into “ ‘purposes’ and ‘causes’ and ‘agents,’ all the forms and ‘goals’ of wheeling and dealing.”

When we restrict the term knowledge only to what can be con-formed to the one form we have come to call “science”—the paradigm of which is taken to be physics and the other so-called “natural sciences”—and confine all other forms of knowledge to mere “opinion” (to which, of course, everyone has a right, this being America and all), then we become idolaters. In the same way we fall into idolatry when we try to make the rich multiplicity of varied ways of doing things conform to our idea of some unitary, all-embracing thing we call technology—especially insofar as the idea of technology is connected for us with that of science, to create one great, Janus-faced über-idol. No less do we fall into idolatry when we buy into thinking that there is any such thing as “the” one and only one universal “common good,” which itself goes with the idea that there is some one universal “people” to which we all belong, as opposed to a rich diversity of distinct peoples, in the plural, with no “universal” to rule over them all. In turn, the idea of “culture” as itself some sort of goal or purpose that one might strive to attain—such that some folks might come to have “more” of it than others, for example—turns culture itself, which includes all those made things (made, but not made up: so we might even name them “fictions”) we call science, and technology, and common goods, and the like, into idols. No longer cherished as what builds up and opens out, what unfolds worlds, opening them out and holding them open, such matters get perverted into service to the opposite sort of building, which closes everything down and shuts it away safe.

A few pages later in the same volume of his Überlegungen (on page 423), Heidegger mentions, in passing, “the working of an actual work.” That sounds better in the German: “die Wirkung eines wirklichen Werkes.” To preserve something of the resonance of the line in translation, we might paraphrase: “the effectiveness of an effective work”—keeping in mind that “to work” in English sometimes means “to bring about an effect” (as in the saying, “That works wonders!”). Or, to push the paraphrase even a bit further, we might even say: “the acting of an actual act.”

At any rate, in the remark at issue Heidegger says that “the working of an actual work” is that “the work be-works [or “effects”: the German is “das Werk erwirkt”]—when it works—the transposition [namely, of those upon whom it works] into the wholly other space that first grounds itself through it [namely, grounds itself through the very work itself, an artwork, for instance].”

What I have translated as “transposition” is the German term Versetzung, which comes from the verb setzen, “to place, put, or set.” Heidegger says that the work of the working work—the work of the work insofar as the work works, and doesn’t go bust—is to grab those upon whom it works and to set them down suddenly elsewhere. That is the shock of the work, as he calls it in “The Origin of the Work of Art,” from the same general period. It is the blow or strike, that is, the coup, that the work delivers to us, and in the delivery of which the work delivers us somewhere else. In the face of the work, at least when the working of that work strikes us in the face, then, as Dorothy said to Toto, we are not in Kansas anymore.

Such transposition is indeed shocking. It can be terrifying, in fact; and it is worth remarking that in German one word that can be translated as “to terrify” is Entsetzen, from the same root as Versetzen, “to transpose.” It is challenging to keep ourselves open to such terrifying transposition, such suddenly indisposing re-disposition of ourselves. We tend to close down toward it, trying to bar ourselves against it, withdrawing into safe places. Idolatry is no less than the endeavor so to enclose ourselves within safe places, rather than keeping ourselves open to such transpositions.*

*   *     *     *     *     *

From the beginning of my interest in them, I have known that the politics of The Coup is communist, at least in one good definition of that term (the definition Boots Riley, cofounder of the group, uses). As I have said before in this blog series, I am not certain about the complexion either of The Coup’s erotics or of their scientificity. However, I have now come to have it on good authority that The Coup are culinary anarchists.

The conjunction of the communist slant of their politics with the anarchist bent of their culinary persuasions gives me nothing but esteem for The Coup. On the other hand, that esteem would have been lessened not one bit if I had learned that they were, in reverse, culinary communists and political anarchists. The point is that neither in their politics nor in their food choices are The Coup into following the dictates of who or what lays claim to authority and power.

Adolf Hitler, who was no slouch when it came to claiming authority and power (all in the name of the common good of “das Volk,” of course), is just one of many claimers to authority from Aristotle on down to today who have cited for their own purposes this line from Homer’s Iliad: “The rule of many is not good, one ruler let there be.” Hitler was into that sort of thing. The Coup are into something different.

So is the Yerba Buena Center for the Arts in San Francisco, where my wife and I attended the world premiere of The Coup’s Shadowbox. Making good on the promise I delivered toward the start of my second post of this three-post series on the after-shocks of that attendance, I want to come back to the “Note from the Curators” that opens the brochure I also mentioned there, the one about the Shadowbox premiere. In it, the curators at issue write that YBCA “is in process of coalescing more consistently” with what they call “the energetic and aesthetic trajectories” of “local [artistic] ecologies,” especially the “local dance and music ecologies” of the Bay Area. By engaging in such a process, they write, YBCA, while “identifying itself as a physical place,” is also “aspiring to define itself as something more than brick and mortar.” YBCA is, of course, a physical place, and an imposing one at that, right in the heart of downtown San Francisco. More importantly, however, it “aspires,” as I read the curators’ note, to be a place that gives place to the taking place of works of art. As the two YBCA curators go on to write on behalf of the Center: “We aspire to hold firmly onto our institutional status while softening our institutional walls, locating the joy of less formal performance structure within our particularly austere architecture.” Pursuing that worthy—and, I would say, wonderfully anarchical, chaos-empowering—goal, they go on to write at the end of their note: “We plan to have hella fun** in this enterprise, to reposition participatory sweat as currency, to build momentum through the mechanism of witness, to celebrate the too often unseen, to make serious work of taking ourselves not too seriously while fixing our gaze on the exemplary unsung.”

Given that curators’ note, it strikes me that The Coup is right at home in such a venue as YBCA. So, for that matter, is Classical Revolution, which is the outfit (to use a word that seems to me to be appropriate to the case) from which came the quartet in which our daughter played one of her cellos as part of the world premiere of The Coup’s Shadowbox at YBCA recently—and whose website (http://classicalrevolution.org/about/) I encourage my readers to consult, to check my just expressed judgment.

Nor is YBCA the only place-opening place where the performances of place-makers such as The Coup—and Classical Revolution and the other groups with whom The Coup shared their Shadowbox spotlight at the recent premiere performance—are given a place to take place. Another such place in the Bay Area, one my wife and I also discovered thanks to our daughter during our recent trip to the West Coast, is The Revolution Café in San Francisco’s Mission District (http://www.revolutioncafesf.com/). That, it turns out, is the place where Classical Revolution was founded back in November 2006 by violist Charith Premawardhana, and where performances by Classical Revolution musicians take place every Monday night. There are many more such places, too, not only throughout the rest of the Bay Area, but also throughout the rest of the United States—and, I dare say, the whole, wide world.

To which I can only say: Amen! Which is to say: So be it!

 

 

*In reading Doniger’s words shortly after reading Heidegger’s, one thought that struck me was the question of whether Heidegger himself might not have succumbed to a sort of idolatry regarding “history,” Geschichte in German. Just as it is idolatry to think that there is any such thing as “the” common good or “the” people, isn’t it idolatrous to think that there is any such thing as “the” human story—“History,” with the capital ‘H’—as opposed to multiple, indeed innumerable, human stories, in the plural—“histories,” we might say, following Doniger’s lead? Yet Heidegger throughout his works talks about “ ‘die’ Geschichte” (which, by the way, also means “story” in German, in addition to “history,” the latter in the sense of “what happened,” was geschieht), not just multiple Geschichten (“histories” or “stories,” in the plural). Perhaps that was at play in his involvement with the Nazis, despite the fact that, as the passage I’ve cited shows, he knew full well that it was mere idolatry to think in terms of “the” people, “das” Volk, as the Nazis so notoriously and definitively did. That, at least, was the question that came to my mind when I read Doniger’s line so soon after reading Heidegger’s. Even to begin to address that question adequately would take a great deal of careful thought, at least one upshot of which would surely be, in fact, that it is necessary to keep the matter open as a true question—rather than seeking the safety of some neatly enclosed, dismissive answer.

** As out of touch with such things as I am, I don’t know if that is a mistake, or a way currently fashionable in some circles (or “ecologies,” if one prefers) of saying “have a hell of a lot of fun.” Whatever!

 

Pulling Out of the Traffic: The Future of Culture (5)

This is the final post in a series of five under the same title.

*     *     *     *     *     *

In my lifetime up to that point and for many years before, despite our earnest desires, especially Father’s, all that we had shared as a family—birth, death, poverty, religion, and work—had proved incapable of making our blood ties mystical and transcendent. It took the sudden, unexpected sharing of a vision of the fate of our Negro brethren to do it. And though many times prior to that winter night we had obtained glimpses of their fate, through pamphlets and publications of the various anti-slavery societies and from the personal testimonies given at abolitionist meetings by Negro men and women who had themselves been slaves or by white people who had travelled into the stronghold of slavery and had witnessed firsthand the nature of the beast, we had never before seen it with such long clarity ourselves, stared at it as if the beast itself were here in our kitchen, writhing before us.

We saw it at once, and we saw it together, and we saw it for a long time. The vision was like a flame that melted us, and afterwards, when it finally cooled, we had been hardened into a new and unexpected shape. We had been re-cast as a single entity, and each of us had been forged and hammered into an inseparable part of the whole.

. . . .

Father’s repeated declarations of war against slavery, and his asking us to witness them, were his ongoing pronouncement of his lifelong intention and desire. It was how he renewed and created his future.

— Russell Banks, Cloudsplitter: A Novel

 

There is a way of building that closes down, and there is a way of building that opens up. Correspondingly, there is a way of preserving that checks locks and enforces security, and there is a way of preserving that springs locks and sets free.

Cloudsplitter is Russell Banks’s fine 1998 novel of the life of the great American abolitionist John Brown, as told through the narrative voice of Brown’s third son, Owen. What Banks/Owen describes in the passage above is a building and then a preservation of the second sort, the sort of building that opens up, then the sort of preservation that keeps open.

The passage comes from relatively early on in the long novel, in the second chapter. What is at issue is at one level a very minor, everyday thing (everyday, at least, in 19th century American families such as John Brown’s): a shared family reading, begun by John himself, then continued by other family members in turn, each reading aloud from the same book, passed on from one to the other.

What the Browns are reading at that point in the narrative is a book recounting the horrors of American slavery. The book does that very simply and straightforwardly. It just presents page after page of the contents of ads of a type often placed, at the time, in newspapers—throughout the slave-holding states, at least. They are ads in which property owners who have suffered thefts of a certain kind solicit help, mainly for monetary reward, to track down and retrieve their stolen property. The property at issue consists of human beings owned as slaves, and the thefts at issue have been committed by that property itself—that is, by slaves who have tried to steal themselves away from their lawful owners, by running off. In ad after ad, slaveholders detail the scars that they have inflicted on the faces, backs, limbs, and torsos of their slaves. The slave-owners catalogue such traces of whippings, cuttings, burnings, and other abuses they have inflicted on their slaves, in order that those traces might now serve, in effect, as brand-marks by which their (self-)stolen goods can be identified and returned, it is to be hoped, to their rightful owners.

The experience of listening together to such genuinely reportorial reading during the evening at issue galvanizes the Brown family into a “new body,” to borrow an exactly apt term from Alain Badiou’s seminar on “images of the present times” (in which at one point he cites Cloudsplitter, and praises Banks).   Until that uneventful event of such simple family reading of an evening, the Browns had been, despite all family relations, affection, and sharing, no more than a collection of individuals—just instances of a family named “Brown,” as it were. “It took,” as Banks has Owen tell us in the passage above, “the sudden, unexpected sharing of a vision,” a vision “like a flame that melted us,” truly to meld them together and “re-cast” them “as a single entity,” in which each one of them “had been forged and hammered into an inseparable part of the whole.”

In the quiet of their family kitchen, their shared reading that evening brings the Brown family—brings that family as a whole and in each of its family members—to a point of decision. In the fire of that experience the family, each and all, is brought to decision; it gets decided as it were. That night, the family gets resolved. And so it will remain, one way or another.

Lapses will continue to remain possible, of course. In fact, they will all too often actually occur. One or another family member—now Owen, now one of his brothers or sisters, now even “the Old Man,” John himself—will lose his or her resolve, becoming irresolute again. But that will no more rescind the resolution than the breaking of a marriage vow rescinds that vow.

Broken vows and lapses in resolve are betrayals and acts of infidelity. As such, they do not cancel out the original vows or resolutions. Rather, they call for acts of contrition, repentance, and expiation, and, above all, a return to fidelity—that is, they call to renewed faithfulness to the vow or resolve that was betrayed.

*     *     *     *     *     *

In Toward a Politics of Speaking Beings: Short Political Treatise II—Pour une politique des êtres parlants: Court traité politique II (Verdier, 2011), page 56—Jean-Claude Milner cites the 1804 remark, often attributed to Talleyrand, “It’s worse than a crime, it’s a mistake.” As Milner points out, a “mistake” is, at most, a significant “error in calculation.” It is therefore the sort of thing that may indeed sorely need to be corrected. However, unlike a crime, “it does not need to be expiated.”

*     *     *     *     *     *

“We blew it!”

That’s said by Peter Fonda’s character in Easy Rider, the classic 1960s buddy-movie about two hippies’ cross-country motorcycle journey together—costarring Dennis Hopper, who also directed the film. Fonda delivers the line to Hopper’s character after the two have done their thing in New Orleans for a while. It comes just before both characters get blown away with a shotgun by a Southern cracker in a pick-up.

The moral of the story? Don’t blow it—or you’ll be blown away!

*     *     *     *     *     *

Exactly how the two hippie bikers in Easy Rider “blew it” is open to diverse interpretations. However, by any interpretation worth considering, “blowing it”—whether done by the characters played by Hopper and Fonda in that movie, or by the members of the Brown family in Banks’ Cloudsplitter, or by whomever in whatever circumstances—is not a matter of an error in calculation. It is no omission or oversight in cost-benefit analysis, no limitation in one’s capacities for “rational decision-making.” In short it is not a mistake.

It is a crime.

“Blowing it” is not necessarily—or even in any important case—a crime in the sense of a violation of any law of any such state as Louisiana. It is a crime, rather, in the sense of a breach of faith, a failure to keep faith—above all, a failure to keep faith with oneself. As such, it cries out not for correction, but for expiation.

*     *     *     *     *     *

The institution of American slavery was a crime, not a mistake. It was a human betrayal of humanity, not an error in calculations or a failure in “rational decision-making.” By the passage I have cited from Banks’ novel, John Brown’s third son Owen and the rest of John Brown’s family were brought together—which should itself be read in a double sense here, to mean both that the whole bunch of them were brought, and that the bunch of them were brought no longer to be just a bunch, but to be a true whole—by an insight into the reality of that institution, American slavery.   Given such insight by nothing more than the everyday event of an evening’s family reading, they were thereby brought together to a point where they no longer had any choice but to join the family patriarch in his declared war against that criminal institution. They either had to join John Brown, the family patriarch, or betray him—and, along with him, themselves.

To find oneself at such a point of decision—but what am I saying? To be brought to such a point of decision is precisely to find oneself! So I should have said that to find oneself at last, by being brought to a point of decision, is precisely in such a way to be given no choice. At such a point, one “can do no other” than one is given as one’s own to do, as Luther said at the Diet of Worms in affirming his continuing defiance of the Church hierarchy and its self-claimed “authority.” One can do no other at such a point than what one finds oneself, at and in that point, called to do.

If one does not heed that call, then one lapses back into loss of oneself, lost-ness from oneself, again. Thus, as I have written in this series of posts before, at a point of decision, one is not given two equally passable options, between which one must choose. Rather, one is given one’s one and only opportunity, the opportunity at last to become oneself, to make one’s life one’s own.*

When one is faced with such an opportunity, such a moment of clarity, such a point of decision, if one even bothers to “count the costs” before declaring oneself, then one has already declared oneself—already declared oneself, namely, to be a coward and a criminal. By counting the costs before one makes up one’s mind in such a situation, at such a point, one has already lost one’s opportunity, and, with it, any mind worth keeping, no matter how “rational” that mind may be. One has blown it.

*     *    *     *     *     *

In 1939 Random House published a new novel by William Faulkner. Faulkner had given his work the title If I Forget Thee, Jerusalem. In the novel Faulkner interwove two stories, each of which could perfectly well stand on its own, as each—one of the two, especially—has often been made to do, in anthologies and other later re-publications of Faulkner’s works. One such potentially autonomous story is called “Wild Palms,” and the other one, which is the one most often published just by itself alone, is called “Old Man.”

Faulkner took the title he gave the combined whole of the two tales from Psalm 137 (136 in the Septuagint numbering), which sings out Israel’s own vow not to forget Jerusalem during Israel’s long captivity in Babylon. It is an intemperate psalm, declaring an intemperate vow, which is intemperately sealed by a prayer that the singer’s right hand might wither, and the singer’s tongue cleave to the roof of the singer’s mouth, if that vow is not kept. The psalm then intemperately ends by calling down wrathful vengeance on the Babylonians, blessing those of that city’s enemies who might one day, as the psalmist fervently hopes they will, seize the Babylonians’ children and bash their brains out on the rocks.

Especially today, decent, rational folks are shocked by such sentiments.

They didn’t seem to shock Faulkner, however. Or, if they did, it would seem to have been with the shock of insight and recognition, since he not only chose a crucial line from the psalm as the title to his double-storied 1939 novel, but was also chagrined—and protested, to no avail—when Random House, on the basis of its own cost-benefit analyses no doubt, made the quite rational decision to refuse to bring the book out under the title Faulkner had given it. Instead, they took the title of one story (with ironic justice, it turned out to be the title of the story that has subsequently “sold” far less well than the other, in the long run, judging from subsequent re-printings/anthologizings) and published the whole as The Wild Palms. Not until 1990, twenty-eight years after Faulkner’s death, did an edition come out under the title Faulkner originally chose.

The Wikipedia entry for If I Forget Thee, Jerusalem (http://en.wikipedia.org/wiki/If_I_Forget_Thee,_Jerusalem) characterizes the novel as “a blend of two stories, a love story and a river story,” identifying “Wild Palms” as the former and “Old Man” as the latter. However, the entry goes on to point out that “[b]oth stories tell us of a distinct relationship between a man and a woman.” Indeed they do, and I would say that, in fact, both are love stories—only that one is the story of a love kept, and the other the story of a love thrown away. Or perhaps it would be more accurate to say that one, “Wild Palms,” is the story of a decision to love, a decision boldly taken and faithfully maintained, regardless of the cost, whereas the other, “Old Man,” is the story of a refusal to decide to love, and of a cowardly clinging to security instead. The first is a story of love enacted; the second, a story of love betrayed.

I would say that, read with ears tuned for hearing, the Wikipedia entry brings this out very nicely, actually, in the following good, short synopsis:

Each story is five chapters long and they offer a significant interplay between narrative plots. The Wild Palms tells the story of Harry and Charlotte, who meet, fall in forbidden love, travel the country together for work, and, ultimately, experience tragedy when the abortion Harry performs on Charlotte kills her. Old Man is the story of a convict who, while being forced to help victims of a flood, rescues a pregnant woman. They are swept away downstream by the flooding Mississippi, and she gives birth to a baby. He eventually gets both himself and the woman to safety and then turns himself in, returning to prison.

To be sure! Whoever refuses the opportunity to love does indeed return to prison!

That’s just how it is with decisions, whether they be decisions to love, or to take to the streets in protest of injustice, or to hole oneself up in a room and read, read, read, in order to write, write, write—or, perhaps, the decision never to forget.

Faulkner’s story of Harry and Charlotte’s decision to love one another whatever the cost, especially when that story is read in counterpoint to his story of the old man who prefers the security of prison to the risks of love (and who is made “old,” regardless of his chronological age, precisely by so preferring), shows that such decisions can have serious, even fatal, consequences. Yet it also shows, even more strongly, that only an old coward would count such costs before deciding to love, when the opportunity to do so presents itself.

Most of us most of the time are old cowards. Far too often, all of us are. None of us never is. That, however, is no excuse.

*     *     *     *     *     *

Making a genuine decision is something very different from choosing between brands of beer, political parties, or walks of life–all of which are subject to the sorts of cost-benefit analysis that pertains to what is, in our longstanding “Newspeak,” called “rational decision-making.” In sharp contrast, making a genuine decision is nothing “rational.” Rather, it is taking one’s one and only chance to live, and to do it abundantly—rather than just going on surviving, hanging on and waiting around until one can finally “pass away.”

It is just because that is the nature of genuine decision that there is always an ongoing need, past the point of decision, after one has decided oneself, from then on to continue regularly admonishing oneself to stay faithful to one’s decision, to keep one’s resolution.   For the same reason, it is essential, having made a decision, to continue regularly to ask for, and accept, whatever help one can get from others to keep to one’s decision—and, in turn, willingly to help others who have joined one in one’s decision to do the same: to “keep the faith,” as the old saying goes. **

It was in just such a way, “in repeated declarations of war against slavery,” and in repeatedly “asking [his family] to witness them,” and thereby making “ongoing pronouncement of his lifelong intention and desire,” his life-defining intention and desire, that John Brown “renewed and created his future,” as Banks has Brown’s son Owen say at the end of the passage cited above. So must it be not only for John Brown, but also for us all. Only with such help and such repetitions of our own declarations of whatever may demand such declaration from each and all of us, can we have any hope of “renewing and creating” our own future.

*     *     *     *     *     *

Since the ancient Greeks, the work of art has been taken as a paradigmatic cultural product, in the sense that I have been giving that latter expression. In 1935, when he first delivered his lectures on “The Origin of the Work of Art,” Heidegger argued that the work of the work of art, as it were—what the artwork does, we could put it—is to bring those on whom it works to a point of decision, to use my way of articulating it. The work of art, says Heidegger, this time still using his own terms, opens up a world, and sets that world up by anchoring or grounding it in the earth. The artwork is the very place where that takes place. As such, it is not interchangeable with any other place. Rather, it is absolutely singular, utterly unique: something truly new under the sun, something the like of which has not been seen before, nor will ever be seen again. It is one of a kind—namely, one of that very kind of kind that is really no “kind” at all, since it has only one “instance,” to use one of my ways of speaking from earlier in this series of blog posts.

The shock of such a work as such a place, the shock that such a work, such a place, is there at all, calls upon those whom it shocks to make a decision. That’s the work of works of culture, the produce of cultural production. So shocked, one can enter into the work of the work itself—as John Brown’s family in Banks’ novel entered into the work of John Brown (though he was no work of art, to be sure), when that family was suddenly shocked into seeing reality. Or one can decline so to enter into such work—and, in so declining, enter, despite oneself, into one’s own decline.

If one does not decline, but joins the work in its work—as John Brown’s family joined him in his—then one preserves the work. That does not mean, as Heidegger insists it does not, that one takes the artwork and locks it away safe somewhere. Rather, one preserves the work by doing what one must to keep open the world that the work first opened up. That is, one preserves the work of art by persevering in the work of that work, regardless of whether that work of art itself even continues to be around. Only in that way does one truly keep or preserve the work.

That includes keeping or preserving it “in mind,” that is, remembering it. To remember a work of art properly—that is, as the very work one seeks to remember—is not recurrently to call up any “memory-images” of it that one keeps locked away in one’s memory banks somewhere, whether those banks are in one’s brain or in one’s smart-phone or wherever they may be. Rather, properly to remember a work of art is to keep open the world that the work first opened, or at least open to it.

In just the same way, to stick with the analogy I’ve been using, those who preserved John Brown’s memory, once he was arrested by Federal forces and then hanged by the state of Virginia, did so not by erecting memorials to him at Harper’s Ferry or anywhere else. Nor did they preserve his memory by recurrently spending time looking at old pictures or other images of the man himself. Rather, those who preserved John Brown’s memory—those who did not forget John Brown’s body as it lay “moldering in the grave,” as the song says—did so by continuing to carry on the very war he had declared against American slavery. Well, just as John Brown continued to call people to decision even after his death, so can works of art call those who encounter them even after they themselves have ceased to be at work.

What is more, John Brown can continue to call us to decision even today. Even now—long after John Brown’s body has moldered completely away, and nearly as long since the war he waged morphed into the Civil War that eventually brought the institution of American slavery as he knew it to an end—we can still be moved by being reminded of him. It no longer makes sense to speak today of joining John Brown in his war against the institution of American slavery, of course. The world in which that did make sense is no longer our world today. Nevertheless, we can still continue to be moved (even moved to join new wars declared in our own day) by the memory of John Brown—moved that way by reading Russell Banks’s retelling of Brown’s story today in Cloudsplitter, for example, or perhaps by visiting memorials to the sacrifice he and the others who carried out the raid at Harper’s Ferry made.

In just the same way, the world that was opened up by and in the works of art of the ancient Greeks has been dead for a long time now, far longer than John Brown. Yet we can still be moved by visiting the remains of such works in the museums of our own day. The world those works themselves opened up is no longer there for us to keep open, any more than the war John Brown declared against the institution of American slavery is any longer one in which we can enlist. But being reminded that there once was such a world, just as being reminded that there was once such a war as John Brown’s to fight, can still bring us to a point of decision of our own, a point where we are at last given our “one opportunity,” as Knausgaard was once given his. Even reminders of long dead worlds brought to us by mere fragments of what were once genuine works of art, genuinely still at work as works in opening up such worlds, can deliver to us the message that an “archaic torso of Apollo,” according to Rilke in a poem of that name, delivers to those with eyes to see who visit it in the museum—the message, “You must change your life!”

The future of culture is dependent upon no more, and no less, than keeping alive the memory of such works. It does not even depend on the possibility that new works of such a kind-less kind will continue to be created. Even if they are not, the future still has culture—and, far more importantly, there still continues to be the future “of” culture, the future culture itself opens and holds open, which is to say the future as such—just so long as we keep on doing the work of preservation. There will be a future of culture so long as we truly do, but only so long as we truly do, “never forget.”

If we don’t remember, and do forget, then our right hands will wither, and our tongues will cleave to the roofs of our mouths, regardless of whether we pray it may be so or not.

*     *     *     *     *     *

In my next post, which will have the same main title as this series (“Pulling Out of the Traffic”) but a different subtitle, I plan to discuss an example of how we can “keep our memories green,” as it were.

 

* As Knausgaard found himself given his one opportunity, as he describes in the passage I cited at the beginning of my preceding post in this series.

 

** That, in turn, is something very different from demonstrating one’s “fidelity” to some “brand,” such as Coors or Budweiser when it comes to drinking beer, or Republicans or Democrats when it comes to electing politicians.

 

Pulling Out of the Traffic: The Future of Culture (4)

This is the fourth in a series of posts under the same general title.

*     *     *     *     *     *

All sorts of things transpire—but nothing any longer happens—that is, no more decisions fall . . .

— Martin Heidegger, Überlegungen IV (in GA 94), ¶219

 

. . . it’s neither here, nor elsewhere . . .

— Alain Badiou, Images du temps présent (January 14, 2014)

 

I had one opportunity. I had to cut out all ties with the flattening, thoroughly corrupt world of culture where everyone, every single little upstart, was for sale, cut all my ties with the vacuous TV and newspaper world, sit down in a room and read in earnest, not contemporary literature but literature of the highest quality, and then write as if my life depended on it. For twenty years if need be.

But I couldn’t grasp the opportunity. I had a family . . . And I had a weakness in my character . . . that was so afraid of hurting others, which was so afraid of conflict and which was so afraid of not being liked that it could forgo all principles, all dreams, all opportunities, everything that smacked of truth, to prevent this happening.

I was a whore. This was the only suitable term.

— Karl Ove Knausgaard, My Struggle. Book Two: A Man in Love

 

Points of decision are crisis points. “Critical condition” in the medical sense is the condition of a patient who is at the decision point between survival and demise, where the body—with, it is to be hoped, the assistance of the medical staff—must marshal all its resources to sustain life, in the minimal, zoological sense. In the passage cited above, Knausgaard describes how he came to stand at a critical point of decision for or against life in the full, no longer merely biological sense of the term—the truly live-ly sense, we might say, in contrast to the rather deadening sense of bare survival.

Actually, that way of putting it, “a critical point of decision for or against life,” won’t quite work. Rather, Knausgaard describes coming to a point where he was faced with the need and opportunity at last actually and fully to make a decision in the first place and, by and in making it, to become truly alive at last. At that point he was faced with either “choosing to choose,” as Heidegger puts it in Being and Time, or else just going on going on, literally just surviving (“living-through” or “-over”) his own life, having already outlived himself, as it were, by letting his moment of opportunity slip by, in failing or refusing to decide at all.

The way that Alain Badiou puts it in his seminar on “images of the present times” (in the session of November 27, 2003) is that what he calls simply a “point” is “the moment where you make the world [as such and as a whole] manifest in the yes or the no of a decision. . . . It is the manifestation of the world in the figure of the decision.” He adds right away that “[o]ne is not always in the process of dealing with points, thank God!” Badiou, a self-proclaimed atheist proud of his atheistic family heritage, adds that ejaculation of thanks because, as he goes on to say: “It is terribly astringent, this imperative necessity that suddenly the totality of your life, your world, comes to be the eye of a needle of yes or no. Do I accept or do I refuse? That is a point.”

*    *     *     *     *     *

Early in the second of the six volumes of the long story of his “struggle”—Kampf in German, it is worth remembering, as in Hitler’s Mein Kampf—Knausgaard himself has already noted how challenging it is actually to have to decide to live one’s life, rather than just to keep on living through it. Toward the very beginning of that second volume—toward the very end of which comes the passage already cited—he writes: “Everyday life, with its duties and routines, was something I endured, not a thing I enjoyed, nor something that was meaningful or that made me happy.” The everyday life at issue for him during the time he is addressing was one of an at-home husband of an employed wife, and a father taking care of his young children while his wife was at work. Thus, it was a life filled with such things as washing floors and changing diapers. However, Knausgaard immediately tells us that his mere endurance rather than enjoyment of such a life “had nothing to do with a lack of desire to wash floors or change diapers.” It was not that he disdained such activities, or regarded them as beneath him, or anything else along such lines. It had nothing to do with all that, “but rather,” he continues, “with something more fundamental: the life around me was not meaningful. I always longed to be away from it. So the life I led was not my own.”

Knausgaard immediately goes on to tell us that his failure to make his everyday life his own was not for lack of effort on his part to do just that. In the process of telling us of his efforts, he also offers at least one good explanation for giving his massive, six-volume, autobiographical novel the title it bears. “I tried to make it mine,” he writes, “this was my struggle, because of course I wanted it . . .”

He loved his wife and his children, and he wanted to share his life with them all—a sharing, it is to be noted, that requires that one first have one’s life as one’s own to share. Thus, “I tried to make it mine,” he writes, “ . . . but I failed.” That failure was not for lack of effort but because: “The longing for something else undermined all my efforts.”

Conjoining the two passages, one from near the start of the book and one from near its very end, suggests that Knausgaard’s long struggle has been of the same sort as that of St. Augustine, as the latter depicted it in his Confessions. That is, the “struggle” at issue derives from the ongoing condition of not yet having made a real decision, one way or another. In such struggles, the struggle itself comes to an end only in and with one’s finally making up one’s mind, finally coming to a resolution, finally deciding oneself.

In the passage at the start of today’s post, coming more than 400 pages of “struggle” after the one just cited, Knausgaard gives the fact that he “had a family” as the first reason he “couldn’t grasp” the “one opportunity” that he says he had. Nevertheless, what is really at issue cannot be grasped in terms of choosing between two equally possible but conflicting options, either living the life of a family man or living the life of an artist. Rather, what is at issue is something only Knausgaard’s subsequent remarks really bring to focus: what kept him from seizing his sole opportunity was nothing but himself. It was not the love of his family that hindered him. It was the love of his own comfort—or at least the desire not to disturb his own comfort by disturbing the comfort of others nearby.

I can identify! It was really not my love of my daughter that tripped me up when her childhood pet, Fluffy the guinea pig, died one day, causing me to tempt my own daughter to betray her love for her pet by rushing out to buy a replacement, as I recounted in my preceding post. I did love my daughter, to be sure, as I still do. But, as I already revealed when first discussing the episode, what tripped me up was really not my love for her. Rather, it was my discomfort with my own discomfort over her discomfort over Fluffy’s death. I betrayed myself out of love of my own comfort, not out of love for her. So my betrayal as such was not done out of any genuine love at all; it was done just out of fear—the fear of dis-comfort. That is how clinging to one’s precious comfort always manifests itself, in fact: in Knausgaard’s case no less than my own.

Now, there may truly be cases in which points of decision manifest as what we might call “Gauguin moments.” That is, there may really be cases in which, in order to make one’s life one’s own, one must indeed leave behind one’s family and one’s home and go off into some other, far country, as Gauguin did in the 19th century for the sake of his art (or as Abraham does in the Bible, though not, of course, for the sake of art).

What truly marks points as points of decision, however, is not a matter of the difference in content between two equally possible life-options (let alone the romantic grandiosity of the choices suggested by Gauguin’s, or Abraham’s, model). What defines them (including in such dramatic examples) is just that they are points at which one is confronted with the necessity at last truly to decide, that is, to resolve oneself—to say yes or no to one’s world, and one’s life in it, as a whole, as Badiou puts it.

*     *     *     *     *     *

The German for “moment” is Augenblick—literally, “the blink of an eye.” Heidegger likes to note that etymologically Blick, an ordinary German word for look, glance, view, or sight, is the same as Blitz, the German for lightning-flash, lightning-bolt. Points of decision, in the sense that I am using that expression, are moments that proffer what Heidegger calls an “Einblick in das, was ist,” an in-sight or illuminating in-flash into that which is. Points of decision are moments of illumination of what is there and has been there all along, though we are only now, in a flash, given the opportunity to see it. They are those points in our lives that offer us the chance to make our lives our own: to come fully alive ourselves—at last and for the first time.

In common with Blitzen in the everyday sense of lightning-bolts, moments or points of decisive in-sight/in-flash sometimes come accompanied by loud thunderclaps, or the equivalent. God may come down and talk to us as God did to Moses from the burning bush, or come in a whirlwind, or with bells and whistles. At least as often, however, moments or points of decision come whispering to us in a still, small voice, one easily and almost always drowned out by all the noise of the everyday traffic with which we everywhere surround ourselves (even if only in the space between our ears), for very fear of hearing that voice . . . and being discomfited by it.

Points of decision may break the surface of our everyday lives—those lives that, like Knausgaard, we endure without enjoying—as suddenly and dramatically as the white whale breaks the surface at the end of Melville’s Moby Dick. Or they may come upon us slowly, and catch us all unawares, such that we waken one morning and realize that for a long while now we have not been in, say, Kansas any longer, but have no idea of just where and when we might have crossed the border into whatever very different place we are now.

All such differences make no difference, however. What counts is only that we come to a moment, a point of clarity, where we are struck, as though by a bolt of lightning, with the realization that we do indeed have a choice, but only one choice. We have a choice, not in the sense that we can pick between two different options, as we might pick between brands of cereal to buy for our breakfast. Rather, we have a choice in the sense that, like Knausgaard, we realize that we do indeed have one and only one opportunity, which we can either take, or fail to take. We are faced with the choice, as the Heidegger of Being and Time put it, of choosing to choose, choosing to have a choice to exercise, rather than continuing just to let ourselves live through our own lives, without ever having to live them. The choice is either to live, or just to go on living.

An acquaintance of mine once came to such a point of decision in his own life, and he did indeed decide to make his life his own at that point. When asked about it, he says that up until that point it had always been as though his life was running on alongside him, while he was just sort of standing there observing it. What his moment of decision offered him, he says, was precisely the opportunity to “take part in” his own life, rather than just continue to let it run itself next to him. In a certain sense, he may have “had” a life up to that point, but only at that point did he come to live it himself.

*     *     *     *     *     *

In The Politics of Things (La politique des choses, first published in France in 2005 by Navarin, then in a slightly revised, updated edition in 2011 by Verdier) contemporary French philosopher Jean-Claude Milner traces the global processes driving inexorably, in what passes for a world in what passes for today, toward eliminating the very possibility of there being any genuine politics at all. That goal is being achieved above all through the development of ever more new techniques of “evaluation,” and the ubiquitous spread of processes of such evaluation into ever more new dimensions of individual and collective life. (In the United States, we might add, the deafening demand for incessant development and promulgation of ever more new ways and means of evaluating everything and everyone is typically coupled with equally incessant palaver about the supposed need for “accountability.”)

What Milner calls “the politics of things” aims at what he calls “government by things.” At issue is the longstanding global drive to substitute what is presented as the very voice of “things” themselves—that is, what is passed off for “reality,” and its supposed demands—for any such messy, uncertain politics or government as that which requires actual decisions by human beings.

Thus, for example, “market mechanisms” are supposed to dictate austerity according to one set of “experts,” or deficit spending according to another set. Whichever set of experts and whichever direction their winds may blow doesn’t really make any difference, however. What counts, as Milner says, is just that it be one set or another, and one direction or another.

That’s because, he observes in his fourth and final chapter, “Obedience or Liberties” (in French, “Obéissance ou libertés”), the real aim of the whole business is simply the former: sheer obedience—what is indeed captured in the English word “obeisance,” derived from the French term. He writes (page 59) that, “contrary to appearances, the government of things does not place prosperity at the summit of its preoccupations; that is only a means to its fundamental goal: the inert tranquility of bodies and souls.”

To achieve that goal, the government of things plays upon human fears—two above all: the fear of crime, and the fear of illness. Under the guise of “preventing” crime and/or illness, the government of things reduces us all to un-protesting subservience. We prove always willing to do just as we’re told, as unpleasant as we may find it, because we have let ourselves be convinced that it is all for the sake of preventing crime or illness.

I will offer two examples of my own. The first is how we line up docilely in long queues in airports, take our shoes (and, if requested, even our clothes) off, subject ourselves to pat-downs and scan-ups, delays and even strip-searches—all because we are assured that otherwise we run the risk, however slight, of opening ourselves to dreaded terrorist attacks. My second example is how we readily subject ourselves to blood-tests, digital rectal examinations, breast X-rays, hormone treatments, and whatnot, all the tests, checks, and re-checks that our medical experts tell us are necessary to prevent such horrors as prostate or breast or colon or skin cancer, or whatever. We readily subject ourselves to all these intrusive procedures, only to be told sooner or later by the very same experts that new evidence has changed their collective expert thinking, and that we must now stop subjecting ourselves to the same evaluation procedures, in order to prevent equally undesirable outcomes. In either case, we do just as we’re told, without complaint.

We do as we’re told, whatever that may be at the moment, to prevent crime and/or illness because, as Milner writes (page 61): “Under the two figures of crime and illness, in effect one and the same fear achieves itself, that one which, according to Lucretius, gives birth to all superstition: the fear of death.” In fact, we are all so afraid of death and so subject to manipulation through that fear that we fall easy prey to the “charlatans,” as Milner appropriately calls them (on page 62), through whom the government of things seeks to universalize what amounts (page 64) to the belief in Santa Claus (Père Noël in France, and in Milner’s text)—a belief, finally, that “consists of supposing that in the last instance, whether in this world or in the next, the good are rewarded and the evil are punished.”

The government of things strives to make everyone believe in such a Santa Claus “with the same effect” that it fosters the development and universalization of techniques and procedures of evaluation: the effect of “planetary infantilization.” Furthermore:

One knows that no Santa Claus is complete without his whip. Indefectible solidarity of gentle evaluation and severe control [our American Santa making up his lists of who’s naughty and nice, then rewarding the latter with goodies and punishing the former with lumps of coal, for instance]! The child who does not act like a child [by being all innocent and obedient, sleeping all nice and snug in her bed, with visions of sugar-plums dancing away in her head] is punished; that is the rule [and we must all abide by the rules, mustn’t we?]. All discourse not conducive to infantilization will be punished by the evaluators, that is the constant. Among its effects, control also carries this one: the promise of infantilization and the initiation of transformation into a thing.

After all, the desideratum is a government not only of things, but also by things and for things (pace Lincoln—at least if we grant him the charity of thinking that’s not what he really meant all along).

In the closing paragraphs of his little book (pages 66-67), Milner issues a call for resistance and rebellion against all such pseudo-politics and pseudo-government of things, and in affirmation of a genuine politics. It is a call, quite simply, for there to be again decision.

“If the name of politics has any meaning,” Milner writes, “it resolutely opposes itself to the government of things.” In rejecting the pretense of a politics of things, real politics “supposes that the regime of generalized subordination can be put in suspense.” A politics worthy of the name can emerge only if at last an end is put to all the endless chatter about how we all need to show “respect for the law,” “respect for authority,” and the like, all of which is just code for doing what we’re told.

Such suspension of generalized subordination and end of infantilizing chatter may not last long: “Maybe only for an instant . . .” But that instant, that moment, that blink of an eye, “that’s already enough, if that instant is one of decision. What’s needed is that there again be decision.”

That’s all that’s needed, but that’s everything. As Milner writes, “politics doesn’t merit the name unless it combats the spirit of subordination. One doesn’t demand that everyone be generous, or fight for the liberties of everyone; it is quite enough if each fights for her own freedom.” The return of a genuine politics requires that we stop relinquishing our own choices to “the order of things.” It requires, instead, “[t]hat at times we decide for ourselves . . .”

There is no future of politics otherwise. Nor, without decision, is there any future of culture in any form, be it political, artistic, philosophical, or whatever. But that just means that, without decision, there really is no future at all.

*     *     *     *     *     *

I intend my next post to be the last in this current series on “Pulling Out of the Traffic: The Future of Culture.”