Remembering the Third Reich American Style

This is the third in a series of posts.

*     *     *     *     *     *

Part Two: Pissing on Language (1)

His kind of bourgeois knows only how to talk about business. I’m not saying he’d lost his soul; he no longer had the language to express it.

—Léon Werth, 33 Days (p. 62)


But from the point of view of the philologist I also believe that Hitler’s shamelessly blatant rhetoric was able to make such an enormous impact because it penetrated a language which had hitherto been protected from it with the virulence which accompanies the outbreak of a new epidemic, because this rhetoric was in essence as un-German as the salute and uniform copied from the [Italian] Fascists—replacing a black-shirt with a brown-shirt is not a particularly original idea—and as un-German as the whole decorative embellishments of the public occasions.

But regardless of how much National Socialism learned from the preceding ten years of Fascism, and how much of the infection was caused by foreign bodies, it was, or rather became, in the end, a specifically German disease, a rampant degeneration of German flesh which, through a process of reinfection from Germany, destroyed not only Nazism, but also Italian Fascism, which was undoubtedly criminal, but not quite so bestial.

—Victor Klemperer, The Language of the Third Reich: LTI—Lingua Tertii Imperii: A Philologist’s Notebook, translated by Martin Brady (Bloomsbury, 2013, p. 57)


The path of poetic faith in our century suggests that repressive regimes do not tolerate, are in fact afraid of, the subversive powers of language, most especially poetry in the hand of those whom the political order aims to keep powerless.

—Terrence Des Pres, Praises and Dispraises: Poetry and Politics in the 20th Century (Penguin, 1988, p. 203)


One major gift the poet can give to what Des Pres calls the poet’s “tribe”—by which he means the “audience” for the poet’s poetry, that audience which, as Heidegger taught, the poetic work itself calls forth, to hear and heed it—is that of a genuine memorial, a place in the communally shared language where real remembrance can occur. Once so marked out, that place is one to which everyone can readily return, to remember—and a place that regularly and recurrently calls one back to itself, to do just that. Striking lines of poetry that, once heard, stick in the mind work that way. They become a storehouse memory, memory that rises back up into awareness spontaneously whenever it is reactivated, we know not (and need not know) how, by some chance encounter or event, like the taste of the madeleine for Proust. When they do, they call us back to ourselves, and to the memories that define us, both as individuals and as members of our community, our “tribe.”

In sharp contrast, the role of the pure slogan, insofar as it remains no more than that—the paradigm being the sheer advertising slogan, like this one from my childhood: “You’ll wonder where the yellow went, when you brush your teeth with Pepsodent!”—is not to call us back to ourselves. It is, rather, to divert us from ourselves, and most especially from openly sharing our life together in community. Its role is to isolate us from one another, rather than to bring us together in shared recall of who we really are, and to drive us into compulsive action. Instead of fostering community, remembrance, and thought, the sheer slogan divides us from one another and blocks memory, hindering thought, and any genuine thoughtfulness for one another. The pure slogan is trying to sell us something, in one fashion or another, for someone’s profit, not tell us something about ourselves, for our own good.

To use a current example, “Make America great again!” is not a call to each of us to come together with all other “Americans” in remembering the gaps and failures in our community and in our relationships one to another. It is not a call to us to atone for those failures, and address them honestly and humbly. It is, rather, a slogan designed to get us to “buy” a certain Presidential candidate, and vote for him. Or, to use a much older example, but one still from Presidential politics, the slogan “He kept us out of war!” was used to sell the nation on reelecting the man who would then lead us into it—“to make the world safe for democracy,” of course.

Slogans, like striking lines of poetry, stick in the mind. However, whereas genuinely poetic lines keep the channels of thought’s and recollection’s flow open and clear, the lines of slogans block that flow and close those channels, diverting thinking and memory into fixation on whatever the slogan is selling, whether that be toothpaste, Presidential candidates, or nationalistic fervor (“Remember the Maine!”, “Support our troops!”, “Never forget!”, and the like). Poetry cultivates the field of language, opening new possibilities for rich yields of diverse flowers and grains. Slogans fence language in and flatten out the field of linguistic possibilities, turning it into a smooth, monotonous, dead surface where nothing but the most noxious weeds can grow.

At least that is so unless something manages to breathe life and soul (to be redundant, since those two words really say the same thing) back into the slogan, opening it up into poetry.

That happened to me with regard to the old Pepsodent slogan I mentioned above. That was way back in my childhood, when that insipid, mindless, mind-numbing jingle was still in circulation, not yet having worn itself down completely smooth, so that it needed to be replaced by some new coin of the same (dis)value, destined to the same ultimate flattening into worthlessness. That’s because the version of that slogan that stuck itself into my mind was first, last, and recurrently the version of it that my father used to delight in singing whenever the mood struck him. His version was this: “You’ll wonder where the yellow went, when you wash your drawers in Pepsodent.” By heightening the definitive vulgarity of the original slogan, he thereby brought to the fore—without even trying, and certainly with no special intention on his part, given the simple, though intelligent, working man he always remained—the tastelessness and vulgarity of the original, and gave to me, his youngest son, a lasting memory of just what such slogans are ultimately all about, and worth. All of that—and, above all, the memory of my father himself saying his even more vulgarized version of the already vulgar toothpaste slogan—often comes back to me without effort on my own part, recalling me to myself, when some new advertising vulgarity worms its way into my mind and sticks itself fast there, like some foul, contagion-carrying tick.

Thanks, Dad!

*     *     *     *     *     *

To be continued.

Remembering the Third Reich American Style

This is the second in a series of posts.

*     *     *     *     *     *

Part One: The American Way of Remembering (2)

[. . .] we whites seem curiously unwilling to shoulder any responsibility for our own part in racial inequity. If we’re so concerned about “personal responsibility,” shouldn’t we show more?

—Nicholas Kristof, “When Whites Just Don’t Get It, Part 7” (NY Times 10/2/16)


We both detested the war [. . . but] both knew that if Hitler was responsible, he wasn’t as important as he was made out to be and he hadn’t invented himself without help.

—Léon Werth, 33 Days, translated by Austin Denis Johnston (Brooklyn & London: Melville House Publishing, 2015, p. 13)


Léon Werth, a French Jew who fled from Paris with his wife and daughter when the conquering Germans approached the city early in World War II, wrote what became the book 33 Days during that time of flight, when he and his family were refugees in their own country. In the line above Werth is speaking for himself and a friend he made along the way—a farmer who earned Werth’s gratitude by granting him and his family genuine hospitality, and whose open, discerning, and honest thoughtfulness also earned Werth’s respect.

Werth and his new friend saw that it was no more than a subterfuge to blame all the destruction of that war on the single figure of Hitler, who “hadn’t invented himself without help,” and was therefore not the only one responsible. They understood that at least a good part of the reason Hitler was being “made out to be” so important was so that those who helped make Hitler possible could thereby hide—even, and perhaps most of all, from themselves—their own responsibility.

The same basic thing is also at issue in the citation above from Nicholas Kristof, about racism in the United States: The avoidance of responsibility. Such avoidance can take the form of shirking one’s own responsibility by hiding behind some figure who has been “made out to be” much more important. But it can also take the form of projecting one’s own failure to assume one’s own responsibility onto those less fortunate than oneself. Either way, the effect is the same. Both are handy devices for denying one’s own responsibility—for refusing to remember what one has done, and hold oneself accountable for it.

In addition, all too often such slogans as “Always remember!” or “Never forget!” are employed in the same way: to create a sheer pretence of remembering that actually does dis-service to all genuine remembrance. Similarly, all too often, “official” memorials supposedly erected to honor memory actually dishonor it in just such a way. All too often, they just piss on all genuine, spontaneous memorialization, the same way that, as Jeff Chang recounts in the citation with which I began this series of posts, one cop let his dog piss on the memorials that sprang up spontaneously on the street in Ferguson, Missouri, where another cop had killed unarmed Michael Brown. To borrow Chang’s way of putting it, they practice that same “American way of remembering.”

*     *     *     *     *     *

I will continue this series on “Remembering the Third Reich American Style” in my next post.

Remembering the Third Reich American Style—Part One: The American Way of Remembering (1)

This is the first in a series of posts.

*     *     *     *     *     *

One cop walked his dog over to the memorial that [Lesley] McSpadden had made for her son [Michael Brown, killed by another cop who’d pulled the teen-ager over for walking improperly on the street in Ferguson, Missouri, on August 9, 2014] and let it pee on the flowers and candles. After the rest of the policemen got into their vehicles to leave, car by car they rolled over what was left of the memorial. In the days to come, these memorials to Michael Brown Jr. would be destroyed over and over, as if to say, This is the American way of remembering.

—Jeff Chang, We Gon’ Be Alright: Notes on Race and Resegregation (Picador, 2016)


Continuing compulsively to repeat something over and over again is one way never to forget it. If by “always remembering” we mean nothing more or less than just “never forgetting,” then such compulsive repetition is a fail-proof way of assuring ourselves that we will always remember.

By steadfastly refusing ever to become aware of what we are doing in the first place, we guarantee that we will never forget it. We can only forget what we have once allowed to come into our awareness. So if we simply refuse ever to get clear about just what it is we are really doing, we never have to worry about forgetting it either. Such memory manifests as compulsive repetition.

All that’s needed for the practice of that form of remembering is the cultivation of stupidity. By the definition that has long been my favorite, “stupidity” is “willful ignorance.” So defined, stupidity is not just not knowing something, which is the literal, etymological meaning of the term ignorance—from Latin ignorantem, “not knowing,” the present participle of ignorare, “not to know, not to be acquainted with, to take no notice of, pay no attention to,” itself deriving from in-, in the sense of “not,” plus Old Latin gnarus, “aware, acquainted with.”

There are various possible reasons for ignorance, for not knowing, as such. That includes something being “hidden in plain sight,” so that we need someone or something else to call our attention to it before we notice it, like the glasses on our face we keep searching for until someone finally points out to us that we’re wearing them.

Many years ago, when I was only 15, I went with my parents to Germany one summer, to visit my older brother, who at that time was stationed in Frankfurt as a volunteer in the U.S. Army. Because I had taught myself some German in preparation for the trip, I became the designated family translator. On one occasion, my father needed something from the drug store, and I went with him to do any translation that might prove necessary. Standing right in front of the pharmacy counter, my father asked me to ask the pharmacist for whatever it was he wanted. I did so, in my limited German. But then the pharmacist answered in fluent English. My father looked at me inquiringly, waiting for my translation. As my exasperation began to mount, I repeated exactly what the pharmacist had just said, using the same English words. My father then gave me some more to say to the waiting pharmacist. My exasperation burst out as I responded, “You tell him! He’s speaking English!” My father just smiled at his own ignorance, and took things from there.

That remains for me to this day a fond memory of my father, and of the gracefulness with which one can respond when one finds that one has gotten bent down just a bit, even despite all one’s own perfectly innocent intentions.

Such innocent ignorance is not the only kind, however.

There is also the sort of ignorance that is rooted in the desire not to know, because what is all too clearly there to be known does not happen to accord with what one would like to be true. It is the sort of cherished ignorance that insists, against all opposition and despite all evidence to the contrary, that what actually is the case is precisely what one wants to be the case, because that would serve one’s own selfish wants, desires, or needs (including especially the need always to think highly of oneself, despite all one’s misdeeds, failures, or vices). There is no innocence to ignorance of such a sort, as there was innocence in my father’s sort of ignorance. It is willful ignorance, the product of wanting not to know.

That is what I mean by “stupidity”: just such willful ignorance.

Precisely because it is so willful, such stupidity also has nothing to do with lacking intelligence. In fact, my own experience throughout my life is that intelligence actually makes such stupidity easier. That’s because intelligence is useful for discovering more and more ways to avoid coming to know what one does not want to know. In general, the more intelligent one is, the craftier one can become, including crafty in the ways of hiding oneself from oneself.

Stupidity, willful ignorance, is back of the American way of remembering, as Chang describes it in the passage with which I began.

*     *     *     *     *     *

To be continued.

Can We Mourn Yet?


The more time passes, the more difficult it becomes to acknowledge these mistakes.

—Dan Jianzhong, Beijing sociologist, concerning the Chinese Cultural Revolution, which began 50 years ago, in 1966 (quoted by journalist Chris Buckley in “High-Level Commentary Breaks Silence in China,” The New York Times, 5/17/16)


Mourning involves living in a world totally not of one’s choosing. It’s a world of paradoxes: a world that one doesn’t want to live in, but doesn’t want to die in either.

—Charles W. Brice, poet and retired psychoanalyst (personal communication)


One thing has been made very clear to me. Many people resent being confronted with information about how racism still shapes—and sometimes, ruins—life in this country.

—Jenée Desmond-Harris, “The Upside to Overt Racism” (The New York Times, 5/1/16)


Whoever, so as to simplify problems, denies the existence of certain obligations has, in his heart, made a compact with crime.

—Simone Weil, The Need for Roots (Routledge Classics, 2002; Fr. orig. 1949)

In general, it is no doubt right to say that the difficulty of acknowledging past mistakes increases with time. However, when those mistakes carry traumatic consequences, the more time passes the greater grows the urgency to do just that, to acknowledge them—and, even more, to set them right. Trauma, after all, has its own time, growing ever more insistent the longer it goes unaddressed, repeating its demands more and more loudly until they are finally heard, and elicit a proper response. Sooner or later, trauma’s time will come. Sooner or later, we will be able to mourn. We can only hope that the day for our mourning will come this side of Judgment Day, the eschatological end of days as such. However, there are reasons for pessimism on that matter.

Perhaps the greatest obstacle that stands between us as a nation and the dawning of our day of national mourning is precisely that, as I put it in my preceding post, we really are not “one nation, indivisible, with liberty and justice for all” except in our national Pledge of Allegiance. What keeps us from uniting in acknowledging and mourning the crimes that some of us have perpetrated on others of us (not to mention other nations or peoples) is that we who are perpetrators or their descendants continue to derive so much benefit from those same crimes. Those of us who have the privilege of thinking ourselves “white,” for example, continue to derive great benefits from that very privilege, including the benefit of being able to drive our cars around our cities without being stopped and harassed by the police, a harassment that others endure for no better reason than their not being among those so privileged.

Some time ago I wrote here, in a two-post series on “The Unforgiveable,” about Auschwitz survivor Jean Améry’s stipulation of the conditions under which he would be willing to let go of what he called his “resentments” against the Germans as a people or nation. In brief, Améry lays out a two-fold condition for such a settlement to occur at the level of what he calls “historical practice.” First, a true settlement would require “permitting resentment to remain alive in the one camp,” the camp of the victims of the crimes. Second, and simultaneously, “self-distrust” would need to be first enkindled and then kept carefully alive “in the other camp,” the camp of the perpetrators—the very self-distrust engendered by the perpetrators’ awareness and acceptance of their victims’ resentment. Genuine reconciliation could occur only by allowing the wounds of the victims to remain open and acknowledged, while simultaneously opening and keeping open an answering wound of deep self-mistrust in the perpetrators. Only if that were to happen would “the overpowered and those who overpowered them [. . .] be joined in the desire that time be turned back and, with it, that history become moral.”

In the case of Germany and what it did during World War II, for that nation to awaken such self-distrust would require it to become, as Améry says, “a national community that would reject everything, but absolutely everything, that it accomplished in the days of its own deepest degradation [that is, during the Nazi years of 1933-1945], and what here and there may appear to be as harmless as the Autobahns.” Nor was Améry blind to the fact that the entire postwar German “economic miracle” that allowed West Germany to become the economic powerhouse of Europe was itself only possible on the basis of the devastation of Germany at the end of the war, which allowed for the sort of radical retooling that fed the postwar German economic machine. Truly to “reject everything, but absolutely everything, that it accomplished” through the criminal acts of its Nazi period, Germany would have had to reject not only Cold War financial support through the Marshall Plan but also everything else that Germany’s own utter defeat made possible for subsequent German economic development. Of course, “nothing of the sort will ever happen,” as Améry already knew and insisted long ago.

Nor will the United States as a nation ever truly mourn its own crimes. For one thing, it will never truly mourn the genocide of American Indians on which America is founded. For various reasons, it is even less likely ever truly to mourn the centuries of enslavement of African Americans on which the United States as a whole—not just the South, but the entire country—built its unparalleled global economic might.

It recently made the news that Georgetown University in Washington, D.C., was largely built on funds it acquired from direct engagement in the slave trade. In one sense, at least, there’s really nothing new in such news. As has long been recognized, many foundational United States universities—Brown, Cornell, Harvard, the University of Virginia, and others—were themselves founded, either directly or indirectly, on the bodies of slaves. So were many other institutions, both North and South. Then, too, of course, the institution of slavery was built right into the Constitution of the United States itself.

If the United States were ever really to mourn slavery and its millions of victims, then at least at a bare minimum those of us who still continue to benefit from the consequences of slavery would need to let go of our resentment toward African Americans for their own ongoing resentment for those very consequences. We who are privileged to think ourselves “white” would have to grant those not so privileged the right to hold on to their resentment of us, and we would need simultaneously to match their resentment with deep, abiding self-distrust of ourselves, to borrow Améry’s way of putting the point.

Of course, nothing of the sort will ever happen, I know.

*     *     *     *     *     *

So where do we go from here?

Well, that question really calls for thinking.


Can We Mourn Yet?


[H]ow can the memory of the colonists be reconciled with the memory of the colonized?

—Sadri Khiari, “The People and the Third People,” in What Is a People? (trans. Jody Gladding, Columbia University Press, 2016, p. 99)

There is something questionable about lumping all veterans together as all equally deserving of honor. Many pointed just that out, to give one relatively recent example, when President Ronald Reagan accompanied West German Chancellor Helmut Kohl to a commemoration service at the military cemetery in Bitburg, Germany, thirty-one years ago this month, in May of 1985, to honor the German war-dead buried there. Unfortunately, the German veterans buried at Bitburg included even members of the Nazi Waffen-SS—despite the fact that the entire SS had been deservedly judged a “criminal organization” by the Nuremberg Tribunal at the end of World War II. Reagan’s remarks on that occasion in 1985 suggested it was time by then to let bygones be bygones. Even though some of those buried at Bitburg had served in a criminal organization of a regime that murdered millions in gas chambers, Reagan apparently thought it fitting, after so many years, to let all the dead be honored equally. Unfortunately, however, to honor the memory of murderers equally with the memory of those they murdered is to dishonor the latter—and to join the former, if only symbolically.

In the same way, to honor Confederate soldiers equally with Union soldiers and all other American veterans who died while serving in the United States military, as we have long done on Memorial Day, is to paper over the differences between the Confederacy and the Union. And in the process it is to dishonor the millions of African Americans who were sold into the very slavery over which the Civil War was fought in the first place. It is to forget their bondage and its toll of misery, and to forget who was responsible for it—which was by no means the Confederacy alone, be it added (a point to which I will return in my next post).

Forgetting also occurs under the appearance of remembrance when all U. S. veterans whatever are lumped together by robbing Armistice Day of its original significance and turning it into Veterans Day. That change involves the expropriation of the very day originally set aside in memory of what Woodrow Wilson was benighted or vicious enough (he certainly betrayed both character-traits) to call “the war to end all wars,” and appropriating it instead for the purpose of glorifying all U.S. military service of all times, even if that service consisted, for example, of dropping nuclear bombs on Hiroshima and Nagasaki, or torturing Iraqi captives, and not just such uncontroversially good deeds as liberating Paris from the Nazis or inmates from Dachau. What happens in such appropriation is the erasure and expropriation of the suffering of all our wars’ greatest victims, just as honoring all German war dead equally, including even the Waffen-SS, dishonors those millions of Germans who died while serving honorably in their country’s armed forces.

Remembering the dead is certainly a debt we owe them, one that should be honored and paid in full. Official memorializing of their deaths, however, is all too often a way of reneging on that debt, and failing to honor it. That is what almost always happens when memorializing is mandated by official state decree, as opposed to springing up spontaneously by popular action. The former is most often in the service of coercive power, helping to strengthen that power, or at least to maintain it. The latter expresses the desire to honor those who call out to be honored in remembrance.

In a riven society, memory is also riven. To be genuine, mourning must honor the rift. The discord must be heard and remembered, not drowned out and covered over, for real healing to occur. The wounds of division must be kept open. They must be acknowledged and mourned, if a truly single and united community—inclusive of all as true equals, rather than preserving privileges for some who remain always “more equal” than others—is ever to be formed among those who remain.

Whether “under God” (as Congress mandated only in 1954) or not, the society of the United States today is “one nation, indivisible, with liberty and justice for all” only in its own official Pledge of Allegiance. In reality, the society of the United States today continues to be riven by a variety of deep, longstanding social injustices it has never yet properly mourned.

Just this morning (May 16, 2016), The New York Times carried an article concerning one relatively recent instance of such a still unhealed wound of national division. The article addresses the controversy that is currently surfacing again as President Obama prepares to visit Vietnam soon, the third U.S. President to do so since the fall of Saigon to Communist forces in 1975. The rekindled controversy repeats the old divisions generated by the United States war in Vietnam in the 1960s and 1970s. We as a nation have yet to face and to mourn what we did in that war, and what it did to us.

Slavery, with all its consequences of ongoing discrimination and persecution against African Americans down to the present, is perhaps the most obvious other example, and the one I have discussed most in this current series of posts. Unfortunately, there are many other examples as well. The oldest one goes all the way back to the genocidal warfare on which this country was founded, and the consequences of which continue to afflict American Indians to this day.

The list could be continued.

At any rate, what are the odds that we as a nation will ever be able to mourn such old yet still destructive divisions, and truly begin finally to heal them? The odds are not at all good, for reasons I have touched upon in my earlier posts in this series, and will address more directly in my next one, which I intend to be the last of this series on our own national capacity—or lack of it—to mourn.

Can We Mourn Yet?


One of the official editorials by the editorial board of The New York Times on Sunday, April 3, 2016—the very same edition in which appeared the two articles I wrote about in my preceding post (one by Nicholas Kristof and the other by Margaret Sullivan)—was entitled “Race and the Death Penalty in Texas.” The editorial pointed to the overwhelming evidence demonstrating that the imposition of capital punishment in Texas discriminates against African Americans, and suggested that the death penalty both there and elsewhere in the United States should be abolished.

It certainly should be. However, that does not seem likely at present. One reason it is not likely is the current composition of the U. S. Supreme Court, which forty years ago reversed its own earlier judgment against capital punishment, permitting it again so long as it is not imposed in an “arbitrary or capricious manner,” as The Times quotes the Court saying in reversing itself. A deeper, even more intransigent factor is indicated by something else The Times itself says in its editorial: “Racism, of course, has been central to the American death penalty from the very start.” What The Times does not go on to say—but should have—is that for that very reason our national focus should not be on the death penalty at all, but rather on the racism that underlies and sustains it.

That, our national racism, is what we really need to eliminate. Otherwise, even if we as a nation were to side-step the Supreme Court and legislate the elimination of capital punishment, our real problem would still persist. In fact, it would in all probability just grow worse. We as a nation would all but certainly interpret the elimination of the death penalty as no more than the elimination of a lingering vestige of the racism we want to think we have already consigned to the past, rather than an ongoing crime we continue to perpetrate in the present. (In just the same way, according to numerous opinion polls, the majority of citizens of the United States were happy to convince themselves that the election of President Obama eight years ago proved that we as a nation had overcome racism.)

Capital punishment should be eliminated in the United States, and the election of our first African American President in 2008 deserves to be universally celebrated (regardless of what one thinks of him personally, or of the accomplishments of his Administration). However, neither eliminating the death penalty nor celebrating our first election of an African American President would prove the United States had overcome the racism that is such an ongoing national shame. Furthermore, by allowing us to pretend that we had already faced and overcome our racism, both would all too easily just harden our inability to mourn that racism, and its millions of victims past and present.

*     *     *     *     *     *

That brings me to yet a fourth article that caught my attention in the Sunday, April 3, edition of The New York Times. That fourth piece was also in the op-ed section. It was a column entitled “Why Slave Graves Matter,” by Sandra A. Arnold, whom The Times identifies as the “founder of the Periwinkle Initiative and the National Burial Database of Enslaved Americans,” preliminary submissions to which, she tells us in her article, she is currently processing. In her piece Arnold argues that “community preservation initiatives can contribute to healing, understanding and potentially even reconciliation,” then ends her piece with the following paragraph:

Our country should explore ways to preserve the public memory of enslaved Americans. Their overlooked lives are an inextricable part of the historical narrative of our country—and not just because they were “beneficiaries” of the 13th Amendment. We should remember enslaved Americans for the same reason we remember anyone: because they were fathers, mothers, siblings and grandparents who made great contributions to our nation. Regardless of our country’s history or our ambivalence about the memory of slavery, we can choose to remember the enslaved—the forgotten. They offer our contemporary society examples of resilience and humanity. Preserving their memory contributes to our own humanity.

Unfortunately, in this case, too, I remain a skeptic. Here, my skepticism is above all because everything depends on just how we go about doing our “remembering.” There’s remembering, and then there’s remembering.

One kind of remembering is that officially done on such occasions as Veterans Day or Memorial Day, when we all get together to put flowers on graves or watch parades, maybe even with banners admonishing us never to forget those who have sacrificed for our national good, sometimes even with their lives. Those two cases—Veterans Day and Memorial Day—are deserving of more attention, since they can be used as good examples of the hidden complexities involved in the whole mixing of remembering with the setting up of official memorials or days of remembrance.

Armistice Day was officially created to memorialize the day when the armistice that ended active hostilities between the Allies and Germany on the Western Front in World War I went into effect. The armistice officially went into effect at 11:00 a.m. on November 11, 1918—the symbolically significant “eleventh hour of the eleventh day of the eleventh month,” just a clock-tick of history away from the World’s midnight. However, in many countries it eventually became an occasion to remember not just veterans of World War I but also all military veterans whatever. Following that trend, in 1954 the United States officially changed “Armistice” Day into “Veterans” Day.

Similarly, the “Grand Army of the Republic,” a Union veterans’ organization founded in Decatur, Illinois, held the first “Decoration Day” to memorialize Union soldiers who died in the American Civil War. In former Confederate states after the war, there were also various celebrations, held on various days, to commemorate Confederate veterans who had died in the same conflict. In the 20th century, all such celebrations, North and South, were merged into the current “Memorial Day,” which was also extended to honor all Americans who died while in the military service, in or out of combat, not just those who died during the Civil War. Thus, unlike Veterans Day, which was set aside as the official U. S. holiday to honor all veterans, regardless of whether they died while in military service, Memorial Day was set aside specifically to honor only those who did die while so serving.

All too often, however, officially setting aside such days of remembrance—or officially setting up such memorials as the Tomb of the Unknown Soldier in Arlington National Cemetery (or even setting up that cemetery itself as the official cemetery to honor U. S. veterans killed in combat)—does not, regardless of anyone’s intention, really serve genuine remembrance at all. All too often in such cases, what looks like an endeavor to encourage or even mandate remembrance in reality ends up just helping whatever powers that be keep the public order that perpetuates their power, an order that actually has good reason to fear being disturbed by genuine, spontaneous, uncontrolled remembrance.

In my next post, I will address that issue.

Can We Mourn Yet?


That segment of the population [the privileged segment] wants to be surrounded by people with similar characteristics.

—Kevin Sheehan, former chief executive of Norwegian Cruise Line, as quoted by Nelson D. Schwartz in “In New Age of Privilege, Not All Are in Same Boat,” the lead article on the front page of The New York Times for Sunday, April 24, 2016 (the 100th anniversary of the start of the Irish Easter Rising of 1916, be it noted)

In my preceding post, the first in this series on whether we can mourn yet, I wrote about two articles that appeared in The New York Times for Sunday, March 20, this year. This new post will also concern pieces from The Times, but from an even more recent issue.

The first piece is itself a sort of recent reissue of an older one. Two years ago, regular Times contributor Nicholas Kristof did a series of columns he called “When Whites Just Don’t Get It.” Then just a few weeks ago, in The Times for Sunday, April 3, he wrote a reprise called “When Whites Just Don’t Get It, Revisited”—a revisiting he wrote was necessary because “public attention to racial disparities seems to be flagging even as the issues are as grave as ever.”

“Why do whites discriminate?” Kristof asks in his recent reprise. “The big factor,” he writes in answer to his own question, “isn’t overt racism. Rather, it seems to be unconscious bias among whites who believe in equality [that is, “whites” who, when asked, say they believe in equality, even and especially, I will add, if they are just asking and answering themselves] but act in ways that perpetuate inequality.” Kristof then cites Eduardo Bonilla-Silva, whom he identifies as “an eminent sociologist,” and who “calls this unconscious bias ‘racism without racists.’” About such presumably covert racism, Kristof says, “we whites should be less defensive.” One reason, he adds, that “we whites” don’t need to be so defensive about our own lingering, unacknowledged racism, is that, in his judgment at least, such bias “affects blacks as well as whites, and we [all of “us,” presumably: “blacks as well as whites”] also have unconscious biases about gender, disability, body size and age.” Then a few paragraphs later he ends his column by writing: “The challenge is to recognize that unconscious bias afflicts us all—but that we just may be able to overcome it if we face it.”

How likely Kristof thinks it is that “we” will ever actually face the fact of such bias, he doesn’t say. Speaking solely for myself, I do not think it is very likely at all. Hence, I am equally skeptical that “we” have any real ability to overcome such bias.

*     *     *     *     *     *

Kristof’s remarks about how we all have such bias make me assume that what he means by that term bias is very broad. It would seem to cover such things as the simple uneasiness that we all have toward that which is different from us or unfamiliar to us. For example, if we grow up in a place where no one has red hair, and suddenly find ourselves visited by some red-haired stranger, then we will naturally tend toward being suspicious of, or at least not completely at ease with, our visitor, at least till we get to know him or her better: We will have an “unconscious bias” against any such red-heads, as Kristof seems to be using that phrase.

It is precisely with regard to unconscious “biases” of that perfectly natural and universal sort that our chances of coming to face them, and then perhaps even to overcome them, are best. However, if we turn to a different subset of unconscious biases, the odds against such change rise sharply. That applies above all to that subset of unconscious biases with regard to which our not knowing we have them is all too often because we do not want to know—those biases we have of which we do not just happen to be unaware, but which we actually have a vested interest, as it were, in keeping secret—secret even, and perhaps especially, from ourselves. At issue are those biases that we actually have a vested interest in maintaining, precisely because of all the benefits maintaining such biases brings us, at the cost of the very people against whom we do maintain them. That very self-interest then also strongly motivates us unconsciously to hide those unconscious biases from ourselves.

To give an example that is still of great ongoing importance, when it comes to racial bias in this country, it seems to me that, in general, the benefits from such bias are overwhelmingly weighted in favor of those of us who think ourselves “white,” as Ta-Nehisi Coates puts it in Between the World and Me (a book I have discussed in some earlier posts), rather than those of us who are not encouraged—if even permitted—so to think of ourselves. It directly benefits all of us who think we are “whites” to think that the rest of us, all the “non-whites,” are inferior to us “whites,” since that lets us “whites” keep on denying such supposed inferiors their fair share of the communal pie, so that we can keep on getting bigger slices for ourselves.

*     *     *     *     *     *

To give another example: It happens that in the same op-ed section of the Sunday issue of The New York Times in which Mr. Kristof has his column revisiting how, still, “whites just don’t get it,” there also appears another column, by Margaret Sullivan, who was then serving as the “Public Editor” for The Times (she’s since stepped down), called “Are Some Terrorism Deaths More Equal Than Others?” The answer editor Sullivan gives to that question is clearly in the affirmative, at least insofar as it comes to coverage of such deaths in dominant United States news media, including The Times itself. After devoting the first half of her column to various readers’ letters to her about the matter, Sullivan asks the most pertinent question, that of “why [there is] the persistent inequality that readers rightly observe?”

Her own answer to that question is four-fold. “Part of the answer,” she writes, “is access. It’s far easier to get a number of staff members to Paris or Brussels than, for example, to Pakistan [. . .] .” Next she addresses “another factor,” that of “deployment of resources,” of which she writes: “The Times has more than a dozen correspondents working in bureaus in Western Europe; far fewer, for example, are based in Africa.” As a third factor, according to her, “there is a legitimate question of newsworthiness. News is, by definition, something out of the ordinary. In some places, like Iraq, the tragic has become commonplace.” She then gives Egypt as another example (besides Iraq), citing a former Times correspondent stationed there who says that, while it used to be that “a bombing in Cairo would have been ‘a shock’,” that is no longer the case. Today, as the former correspondent says, “We can’t cover every attack there.” Finally, Sullivan cites as a fourth factor “the relationship between the United States and the country where an attack takes place.” In effect, she is saying that since France, for example, is important for our own interests (and, we might add, we even feel fondness for the French at the moment, a moment when it is no longer de rigueur for all good United States patriots who want to be politically correct to substitute “freedom fries” for “French fries,” and to call attention to themselves for doing so), we pay more attention to what happens there than in some place that has far less strategic importance for us (such as, say, Somalia or Haiti) or that we don’t like so much (such as, say, Finland or Indonesia).

Sullivan then draws her piece toward its end by patting her own employer on the back, writing that she is “glad that Times journalists recognize the need to reflect the importance of all human life lost to terrorism—whether it happens in a place where we Americans [by which she means United States citizens in good standing, of course] may have gone sightseeing [if we’re fortunate enough to be part of the minority of the United States population that can afford global tourist-travel] or one we will probably never set foot in [probably because the amenities there are not up to our standards for “exploring the world in comfort,” to borrow a slogan from Viking River Cruises]. And regardless of whether the victims seem ‘like us.’” In her following, final paragraph Sullivan concludes by writing: “Because, in fact, they surely are”—by which I assume she means that, even if some other people don’t “seem” so, all people really do turn out, upon thorough enough investigation, to be “like us.” That assumption is confirmed by the rest of her final paragraph, where she writes: “And it’s part of The Times’s journalistic mission to help its readers not only know that intellectually, but feel it in their hearts.”

*     *     *     *     *     *

I find that I am even more skeptical in the face of Margaret Sullivan’s apparent optimism that her employer is fulfilling a high “journalistic mission” than I am in the face of Nicholas Kristof’s apparent optimism that those in the United States who most need to face and change their “unconscious biases” will ever do so. I have already given one reason for my skepticism in contrast to Kristof’s optimism, a reason that has to do with how, for some of us, such biases are too deeply grounded in preserving our own privileges.

Among my reasons for skepticism about Sullivan’s optimism, I have one that is similar, which is this: Among the factors Sullivan lists to account for the “persistent inequality,” in dominant news sources such as the New York Times, of coverage of “terrorism deaths” in diverse places, she nowhere even mentions the factor of profit. But after all, what really accounts for the four factors she does address—the factors of “journalistic access, deployment of resources, and the admittedly subjective idea of what’s newsworthy,” as she summarizes her account (leaving out, for some reason, the fourth factor she mentions, that of being more concerned about deaths in nations that are of more strategic importance to our own national self-interest than deaths in nations with less such importance)—being factors in the first place? One need not even be as cynical about such things as I tend to be to suspect that the reason for those reasons themselves is above all that it is far more profitable to The Times to keep things that way, rather than to face its own biases, let alone change them.

Nor is that all. I have other grounds for skepticism. Indeed, even in the very same Sunday edition of The New York Times that contains Kristoff’s and Sullivan’s two columns, there are two more articles that remind me of those grounds. In my next post, I will turn to those two remaining pieces from that morning’s Times.

Can We Mourn Yet?

We know through painful experience that freedom is never voluntarily given by the oppressor; it must be demanded by the oppressed.

—Martin Luther King, Jr., “Letter from a Birmingham Jail,” April 16, 1963


Recently a couple of articles in The New York Times for Sunday, March 20, of this year caught my eye. My attention was drawn to them both at least in large part because, around that same time, I was writing posts for my preceding series on “Faith in Trauma,” which included some discussion of the classic 1967 book by Alexander and Margarete Mitscherlich, Die Unfähigkeit zu trauern: Grundlagen kollektiven Verhaltens (Munich: Piper Verlag)—eventually translated into English as The Inability to Mourn: Principles of Collective Behavior (New York: Grove Press, 1975).

The first of the pieces in that Sunday’s Times that drew my attention was on the front page of the op-ed section. It was a column by Eric Fair, who was a civilian contractor helping United States forces conduct interrogations in Iraq after the U.S. invasion of that country in 2003, under President George W. Bush. Fair assisted in the torture of Iraqi prisoners—what the Bush administration, of course, preferred to call the use of “enhanced interrogation techniques” since, after all, “the United States doesn’t torture,” as Bush blithely insisted.

Fair’s piece was given the title “Owning Up To Torture,” and served, among other things, to advertise his since-released memoir Consequence. What first made me notice the piece was the line inserted by the editors in large, boldfaced print near the end of the first of the article’s two columns. It read:

Men like Donald Trump and Ted Cruz don’t have to bear the cost

In one paragraph late in his article, Fair writes about how during this election season both Trump and Cruz have repeatedly “suggested that waterboarding and other abhorrent interrogation tactics should not be considered illegal.” A bit later, Fair adds that, given “the opportunity to speak to other interrogators and intelligence professionals, I would warn them about men like Donald Trump and Ted Cruz.” Fair says he “would warn them that they’ll be told to cross lines by men who would never be asked to do it themselves”—just as neither Trump nor Cruz ever would be—but that “once they cross the line, those [same] men will not be there to help them find their way back.” He concludes his article by writing: “As an interrogator, torture forced me to set aside my humanity when I went to work. It’s something I’ve never been able to fully pick back up again. And it’s something we must never ask another American to do.”

In that final sentence, by the pronoun ‘we’ Fair obviously does not mean fellow “interrogators and intelligence professionals,” since it is precisely they whom “we” must never again ask to do the sorts of things “we” have in the past asked Fair and others to do in Iraq—and all too many other places. Presumably, by “we” Fair means the United States as a nation.

In their 1967 book Alexander and Margarete Mitscherlich address the German nation’s inability to mourn its Nazi past from 1933-1945, and especially to mourn the victims of Germany’s many acts of aggression and genocide during that period, the many millions of people the Germans murdered in those years. That inability to mourn was still all too definitive of Germany at least in 1967, twenty-two years after the end of World War II, as it may well still be today, almost half a century further on.

Fair’s article—and the book it advertises, which has since appeared and which I’ve also now read—raises the same issue for the United States as a nation today with regard to its actions in Iraq after we invaded that country in 2003, just thirteen years ago. If we as a nation are not able to mourn those we have asked such men as Eric Fair to torture and murder in our name, then neither will we be able to heed Fair’s admonition that we never again ask such a thing of anyone.

It is all too easy to say “never again.” The hard part is to keep our word, once we do say that. Part of what makes that so hard, in turn, is that, in order to keep our word, we must first truly acknowledge and mourn all we have lost by having once done what we now say we will never do again. Can we so mourn?

Our national history gives us scant reason for optimism that we can.

*     *     *     *     *     *

The second piece in the same recent Sunday New York Times that drew my attention was a book review of The Black Calhouns: From Civil War to Civil Rights With One African American Family, by Gail Lumet Buckley, daughter of Lena Horne and a descendant of the same Calhouns. The review was by Patricia J. Williams, a law professor at Columbia and columnist for The Nation. In the second half of her review, Williams touches briefly on the blatant racism of such all-American icons as Woodrow Wilson and John C. Calhoun. Then she remarks that The Black Calhouns “makes for particularly interesting reading against the backdrop of today’s culture wars, from Donald Trump’s disingenuous claim not to know anything about white supremacy to efforts in Texas [Ted Cruz’s home state, be it noted] to cut all mention of Jim Crow and the Klan from social studies textbooks.” She ends her review by complimenting Buckley for how well the “meticulously detailed recollections” of her book call out insistently to the reader, on behalf of black slaves and their descendants: “We were here! We were there! Do not forget!”

However, as Williams goes on to remark, that’s just what we have done. We have “forgotten, over and over.” Williams compliments Buckley for giving us in her book “a comprehensive reminder of how, even when not immediately visible, the burden of racial trauma is carried deep within the body politic.” Then Williams concludes her review with this line: “With so much of our collective national experience consigned to oblivion, we tread unknowingly on the graves of those whose lack of accorded dignity echoes with us yet.”

How can we possibly mourn what we refuse even to remember?

*     *     *     *     *     *

We can let the Germans concern themselves with the question of whether they have even yet proven themselves capable of doing their own mourning for their own dark past. We need to focus on the question of our own national ability—or lack of it—to mourn our own such past, whether that be so recent a past as our war in Iraq, or a more distant past, such as that of the centuries during which some of us built the power of the United States as a nation on the backs of others of us, the backs, that is, of African American slaves.

The two pieces that especially drew my attention in the Times for that recent Sunday of March 20—one devoted to each of those two pasts: the relatively recent American invasion and occupation of Iraq, and the long American history of the enslavement of African Americans—suggest that we, as a nation, lack that ability.

In my next post, I will introduce more disheartening recent evidence of our own continuing, shameful national incapacity to mourn.

Faith in Trauma: Breaking the Spell

Trauma-Faith: Breaking the Spell (continued and concluded)

The decision whereby one comes truly alive is itself never without risk. If it were, there would be nothing decisive about it. To take that risk is to risk oneself, not just such stuff as one’s money, one’s comfort, or one’s security; and to run such a risk—where the stakes are one’s very being as a “self” in the first place—requires real faith, not just comforting self-bewitchment.

Yet, as Kathleen Norris notes, that faith is nothing out of the ordinary, reserved for only the few. That is the “fascinating trait” of every real choice for life over death—every choice, as Alain Badiou puts it at one point in his recent book on the “metaphysics” of happiness (p. 37), to surmount “the tissue of mediocre satisfactions” held out to us all by our rampantly consumerist society as its vision of what constitutes a happy life. It is a choice to risk real life, and the real happiness that goes with such life, and only with it.

Norris and Badiou are at one in insisting that the opportunity, the opening, to make such a choice is nothing that comes only in rare or unusual moments, and only to a select few. It is, rather, an opportunity, an opening, that can suddenly present itself, as Badiou writes, “in every episode of life, no matter how trivial or minor it may be.” Even the most everyday of occurrences can suddenly break the spell that binds us, calling upon us to display real faith by choosing to begin really living our lives, rather than just passively undergoing them, just going on outliving ourselves day after day to the grave.

Once we are truly given a real choice, everything depends on us, and whether we have the faith to go ahead and choose.

What is more, such simple faith, the faith that permits choosing actually to live one’s own life rather than just trying to survive it, can never be claimed as some sort of permanent acquisition. It is not some piece of privately owned property that, once acquired, can be disposed of as one sees fit. The decision to live, however everyday it may be, is a decision whereby one accepts martyrdom for one’s faith—from the Greek martys, “witness”—which need have nothing flashy or Hollywood-heroic about it. As Norris helps us see, such genuine martyrdom can be as quiet and unpretentious as the small daily sacrifices, fully embraced, that parents continually make for their children.

Nor, short of death itself, is such witnessing ever over and done with. It is always there in front of us, needing to be demonstrated ever again anew. It demands constant, ongoing reaffirmation—exactly what Kierkegaard called “repetition.” Exchanging truly understood and meant wedding vows in some formal setting, to use one of Kierkegaard’s own best examples, does not let spouses off the hook of then having to honor those vows, to keep them and the love they sacramentally express alive in their daily life together—forever repeating their vows and the love the bestowing of those vows effectively signifies, “till death do us part.”

Nor is that anything peculiar to getting married. It is the same with every decision, once really taken.

The faith witnessed by any real decision to run the risk of coming truly alive is just such a faith that must be kept. The specific “content,” as it were, of the decision and faith at issue, may vary greatly, of course, from person to person and even from one day to the next.

In the same way, each day for each person, temptation to “break faith” (a tellingly accurate expression) with one’s own decision can take a new form. Whatever form the temptation to break the faith with one’s own life may take, however, each and every day one is faced again with the decision either to keep on truly living, or just to fall back into letting one’s days dribble on endlessly, one after another, till one can finally check out of the whole game altogether and just expire—like Nietzsche’s ever-contented “last man.”

Only a faith that accepts the risk of living is one that finally turns and faces trauma, rather than running from it, and then tests and proves itself by faithfully facing trauma again anew, each and every day, day after day thereafter.

That is true faith in trauma, a faith that always keeps the wound open.

Faith in Trauma: Breaking the Spell

Trauma-Faith: Breaking the Spell

To enchant is to cast a spell. In turn, to disenchant is to break the spell of an earlier enchantment. In the first decades of the 20th century, Max Weber made popular the idea that modernization—with its ever more exclusive prioritization of science, technology, and instrumental rationality over faith, tradition, and belief—centrally involved a process of the “disenchantment” (Entzauberung) of nature. Ever since Weber, however, it can be, and has been, debated whether modernization really broke a spell, or whether it cast one.

So, for example, in one of his writings on the rise of modern technology in volume 76 of the Gesamtausgabe (“Complete Edition”) of his works, Martin Heidegger makes explicit reference to the Weberian idea of disenchantment, only to argue against that thesis. Rather than a dis-enchantment (Entzauberung), says Heidegger (pages 296-297), what is truly involved in the rise of modern technology itself is instead an en-chantment (Verzauberung), a bewitching, hexing, or casting of a spell. That enchantment, according to him, is one whereby the very power at play in modern technology can make good on its own exclusive claim to power, as it were—just as, in the fairy story, the wicked witch, to secure her own claim to the power of beauty, casts a spell over Sleeping Beauty, the legitimate claimant.

According to Heidegger, that enchantment—the casting of the spell whereby what is at work in modern technology (as well as at work in all of the modern science and instrumental rationality that goes with that technology) seizes and secures its own power—goes hand in hand with the de-worlding (Entweltung) of the world, the de-earthing (Enterdung) of the earth, the de-humanizing (Entmenschung) of humanity, and the de-divinizing (Entgötterung) of divinity. “Technology,” writes Heidegger, “as the unleashing and empowering of energies [. . .] first creates ‘new needs’,” and then produces the resources to satisfy them: technology “first discloses the world to which it then fits its products.”

Badiou said essentially the same thing just last year in À la recherche du réel perdu (“In Search of the Lost Real”), his critique of our contemporary “entertainment world,” as he calls it at one point, using the English expression—a world-less pseudo-world actually, one ever more frenziedly devoted to the pursuit of Pascalian diversion from reality. In such a desolate pseudo-world, what falsely but inescapably presents itself as “reality” is in truth so utterly crushing that it permits no genuine, full living at all any longer, but only survival. Nor does such a divertingly fake world any longer have any room for any true faith. It only makes room for superstitions—precisely the sort of dangerously superstitious nonsense, for example, that United States Supreme Court Justice Antonin Scalia spouted at a high school commencement speech shortly before his recent demise, when he attributed the global success of the United States to the frequent invocation of God’s name by our Presidents and other public officials (see my citation of his remarks to that effect at the beginning of my earlier post, “An Anxious Peace: ‘God’ After Auschwitz”).

In a world already deeply asleep, under the bewitching spell cast by what Badiou lucidly calls “triumphant capitalism,” what we need is precisely dis-enchantment, the breaking of the spell. The spell that holds the world in thrall today is broken whenever, anywhere in the world, reality suddenly and unexpectedly breaks through to dispel (good word for it: “de-spell”) any illusion that happiness consists of endlessly buying what the global market endlessly offers for sale.

In Métaphysique du bonheur réel (“Metaphysics of real happiness”)—a short book he also published earlier last year and in which he was already “in search of the lost real”—Badiou describes the illusion that the shock of reality shatters. It is the illusion wherein one takes the height of happiness to consist of the conjunction of the following factors, as he puts it in his introduction (p. 6): “a tranquil life, abundance of everyday satisfactions, an interesting job, a good salary, sound health, a flourishing relationship, vacations one doesn’t soon forget, a bunch of sympathetic friends, a well-equipped home, a roomy car, a loyal and cuddly domestic pet, [and] charming children with no problems who succeed in school.” In short, it is the illusion that one could be happy while living a life of crushing consumerist boredom, where nothing disruptive ever happens—life as no more than survival: outliving oneself from birth, in effect.

As opposed to any such pseudo-happiness of mere security and consumerist comfort in sheer survival, real happiness comes only as a by-product of true living. In turn, real life in itself begins only in the deliberate choice, the decision, to engage fully with reality, whenever it does break through our numbing consumerist spell to strike us. When it does, it reawakens us to the realization that, as Badiou puts it later (p. 38), “existence is capable of more than self-perpetuation.” When the consumerist spell that has held us in thrall is finally broken, we reawaken to the awareness that real happiness is nothing but the feeling that accompanies true life—a life fully lived “even unto death,” as the Christian biblical formula has it, rather than just survived.