Wounding Warriors: Their Own Wounds That Time Can’t Heal (4)

This is the fourth and last of a series of posts under the same title.

 

*     *   *

For many years now—many years before ever reading Jane E. Brody’s article, and encountering there the use of the term “moral injury” at issue in what she reports—I have found myself on various occasions wondering about the very issues that arise in her article: issues pertaining to how those who committed wartime atrocities might best be helped to confront themselves and their own crimes, and how they might still be embraced in community. However, the context in which I had always thought about those issues before reading Brody’s article was not the one that concerns her, which is that of U.S. military veterans who fought in our wars in Vietnam, Afghanistan, and Iraq. Instead, the context in which I had always thought of those issues was that of the Holocaust itself. What I would find myself thinking about was not those U.S. perpetrators of atrocities, but rather those who helped perpetrate the murder of millions during the Holocaust, in death camps or killing fields or any of the other places where such murders were carried out. I would find myself wondering what one could and should say concerning them, and the suffering that, surely, many of them must have experienced from the memories of what they had done—suffering they must have continued to undergo even decades after they had committed their crimes. What would one say to the former SS camp-guard at Auschwitz, for example, who had bashed babies’ heads in, and led countless men, women, and children into the gas chambers, and who suffered from the overwhelming guilt with which the recollection of such deeds should haunt anyone who did them, and with which it must have haunted at least some of the surviving perpetrators of the Holocaust itself?

The anger I felt when I read Jane E. Brody’s article about contemporary clinical approaches to the counseling of U.S. veterans of our wars in Vietnam, Afghanistan, and Iraq who suffer from the same sort of “moral injury” had its roots deep in the soil of all those earlier thoughts of my own pertaining to German veterans and the veterans of other nations that assisted Germany in perpetrating the Holocaust. My anger rose especially when I read the remarks attributed to Dr. Brett Litz, about the need to assure U.S. vets who suffered from such “moral injury” that “they will not be judged and are deserving of forgiveness,” the need to tell them that “disclosing, sharing, [and] confessing” their crimes “is fundamental to repair” of their own injuries, and the need to encourage them “to ‘engage in the world in a way that is repairing—for example, by helping children or writing letters’,” so that they can “find forgiveness within themselves or from others.” I could not help but wonder how receptive readers of those remarks in the Times article would be to them if they were addressed, not to U.S. veterans who perpetrated atrocities in Vietnam and after, but to German and other European nations’ veterans who perpetrated the Holocaust.

I wondered, and I found my anger rising.

As I have said before here, the issue is not to engage in some comparison of atrocities, trying to decide which atrocity was worse, which nation guilty of the most or worst crimes. We should have no patience for the disgusting business of drawing such comparisons, trying to establish the victor in the race of nations into moral depravity. That is not in the least the issue.

The issue is, rather, to abandon all such self-serving attempts to justify “our” own “exceptionalism,” whether that be the exceptionalism of Germany, home of the “master race,” or of the United States of America, “home of the brave and land of the free.” The issue is to forget ourselves and our own obsessive self-concern, that we might at last remember who we really are, and act accordingly.

Published on November 9, 2016 at 5:00 am

Remembering the Third Reich American Style

 

This is the fifth in a series of posts.

*     *     *     *     *     *

Part Two: Pissing on Language (3)

It is revealing to read Jeff Chang’s We Gon’ Be Alright: Notes on Race and Resegregation, published in the United States just this year, 2016, alongside Victor Klemperer’s The Language of the Third Reich: LTI—Lingua Tertii Imperii: A Philologist’s Notebook, originally published in Germany in 1947, sixty-nine years earlier. Both detail the distortion and flattening of language that paves the way for the degeneration of populism into demagogy and, concomitant with that, of democracy into authoritarianism.

In one passage of his book (pages 52-53 of the English translation), Klemperer does a good job of articulating the difference at issue. After reminding the reader that “politics is after all the art of leading a polis, a city,” he goes on to note that with the emergence of modern democracy the “city” that is to be led is no longer something of the size of ancient Athens, the birthplace of the idea of “democracy”—which is to say rule by “the people” themselves, consisting of “everyone” equally, and not just by some privileged segment of society for their own benefit. The polis today, Klemperer acknowledges, is vastly greater both in physical extent and in population than the ancient Greek city-state of Athens. Yet, he writes, even when the massive modern polis eventually emerged, political leaders were still challenged, as they were in Athens centuries ago, to “turn to ‘everyone’ in person,” if they were truly to lead.

That remains so for modern political leaders, with a polis as large as a nation-state and with populations in the millions or even billions. It remains true, as Klemperer goes on to say, “even if ‘everyone’ amounts to millions, and even if thousands of kilometers separate their individual groups.” Though “the people” can no longer be gathered together and addressed at a single place of assembly, as they were in ancient Athens, they must still be gathered together and addressed, even if only “virtually,” through digital means.

Thus, as Klemperer writes: “In this way the speech, as one of the tools and duties of the statesman, was reinvested [with the emergence of modern mass democracies] with the status that it had enjoyed in Athens, indeed an even greater status given that instead of Athens the orator now addressed an entire country, and indeed more than just one country.” Furthermore,

[. . .] a speech was not only more important than it had been previously [that is, during the long period separating original Athenian democracy from the emergence of modern democracy], it was also, of necessity, different in nature. In addressing itself to everyone rather than just select representatives of the people it had to make itself comprehensible to everyone and thus become more populist. Populist means more concrete; the more emotional a speech is, the less it addresses itself to the intellect, the more populist it will be. And it will cross the boundary separating populism from demagogy and mass seduction as soon as it moves from ceasing to challenge the intellect to deliberately shutting it off and stupefying it.

I have added the emphasis to the concluding sentence of that passage, in order to highlight the crucial distinction Klemperer is drawing between populism on the one hand and demagogy on the other.

In saying that populism addresses its audience by working with everyone’s emotions rather than intellect, Klemperer does not mean to belittle populism, or to deny it a crucial positive role in helping to build democracy. Quite the contrary is the case. Our emotions—as Heidegger for one knew and consistently taught—are the primary access we have to how we experience ourselves and our place in the world moment by moment. What is needed is not learning how to disregard our emotions in favor of some disembodied pure intellect. What is needed, rather, is to learn to listen to our emotions properly, to let them give us the disclosure about our world, and our insertion in it, that only they can give.

We need to learn how to listen to our emotions—and also how to address them, if we are to move ourselves together to act wisely and well. Unless we do, populism degenerates into demagogy, and democracy is lost.

Language itself is always fundamentally at work in our emotional natures, forming them and us through them. In listening to our emotions, we are also listening inevitably to our language—our “native tongue,” the language of the community into which we are born. As Klemperer writes earlier in his book (page 15), “language does not simply write and think for me, it also increasingly dictates my feelings and governs my entire spiritual being the more unquestioningly and unconsciously I abandon myself to it.” We can certainly add that insofar as I am born, rather than somehow bootstrapping myself into my own existence, I always have already “unquestioningly and unconsciously abandon[ed] myself” to the language of my birth, the language that others have already cultivated for me from long before I was born.

But “what happens,” Klemperer pointedly goes on to ask, “if the cultivated language [at issue] is made up of poisonous elements or has been made the bearer of poisons?” After all, he goes on to observe (pages 15-16): “Words can be like tiny doses of arsenic: they are swallowed unnoticed, appear to have no effect, and then after a little time the toxic reaction sets in after all.”

The same thing applies to the conflation of words as well, the erasure of significant differences of meaning between them. A prime example, one no less relevant today in the United States than it was in Germany in the 1930s, is the blurring of the very distinction Klemperer points to in the first lines I cited above, the distinction between populism and demagogy.

Coupled with the washing out of the difference between populism and demagogy goes the process of equating appeals to the emotions with the rejection of thought, as though thought were the same thing as sheer intellection, mere calculative rationality. Blurring the boundary between thinking and calculating—between free and open reflection, on the one hand, and the purpose- and profit-driven computation of possible outcomes, on the other—joins readily with blurring the boundary between populism and demagogy, and the one reinforces the other.

Thus, that joint process of erasure of apparently “merely verbal” boundaries has far from “merely verbal” consequences. Rather, as Klemperer knew and witnessed to his horror in Nazi Germany, it completely undercuts both genuine thought and genuine democracy.

Neither democracy nor thought can thrive without the other, and the flattening of language flattens them both.

*     *     *     *     *     *

To be continued.

Remembering the Third Reich American Style

This is the fourth in a series of posts.

*     *     *     *     *     *

Part Two: Pissing on Language (2)

Before the 1980s, it was mostly Marxists who used the term “politically correct” to mock other Marxists. Since then, charging someone else with political correctness has become the first line of defense for racists, one of the best ways to shut down any discussion about inequality.

— Jeff Chang, We Gon’ Be Alright: Notes on Race and Resegregation

 

It is by just such semantic means that we are all robbed even of any voice and vocabulary to report—even, and most especially, to ourselves—the robbery of our very freedom, let alone to protest against it.

I have not researched the matter myself to confirm what Chang says about Marxists’ abuse of the term “politically correct” prior to the 1980s, but it sounds right to me, and I certainly have no reason to contest it. However, throughout my own childhood and youth, which had already passed by before the 1980s ever got underway, the sense that that same term had anytime I encountered or used it was free of both left-wing, Marxist and right-wing, racist appropriation and distortion. Instead, in my experience back then it had a neutral, descriptive, situation-specific usage. To say that some statement or idea or action was the “politically correct” thing to say or do meant that it was what the given politics of the time and place at issue required be done or said. That is, to be “politically correct” meant not to go beyond the confines of what was regarded as acceptable given the concrete political atmosphere of that time. Thus, one and the same remark or behavior that was “politically correct” in one historically concrete situation might very well be anything but that in a different situation.

For example, in the United States during the 1950s, when I was a child—the era of Joe McCarthy, “loyalty oaths,” and the House Un-American Activities Committee witch-hunts—it was not “politically correct” to express, or even to be discovered to hold, left-leaning views. Doing so could and often did subject those who expressed such views, or who could be found out to hold them, to censure and worse (e.g., the Hollywood “black-listing” of writers such as Dalton Trumbo). Yet during the same period, not holding or expressing those very same views, the very ones that could get one censured and banished in the United States, was not “politically correct” in various other places in the “Cold-War” world of that day.

In the United States today, sixty years later, things are as Chang characterizes them in the lines I cited above. In this country now, it is no longer possible to use the term “politically correct” in its original, neutral, descriptive, and situation-specific meaning. Instead, the term has been taken over by right-wingers and perverted into being used solely as a label for certain egalitarian or anti-racist positions and views. Today, here, the phrase is no longer used as a label applied with any interest in clarifying the shared political situation or furthering serious political discussion. Rather, it is applied in order to obscure the underlying reality, and to preclude genuine discussion of issues.

To sum up: Today in the United States, after years of misappropriation and distortion of the phrase by right-wing ideologues, it is no longer politically correct to use the expression “political correctness” correctly.

By such theft of language, we are all robbed of some of the rich potential of our linguistic heritage. In the process, our communal linguistic topography is leveled down to a single monotonous surface.

Nor is what has happened to such phrases as “political correctness” the only example of such impoverishing of our common language, such flattening of our shared linguistic terrain.

*     *     *     *     *     *

To be continued.

Remembering the Third Reich American Style

This is the third in a series of posts.

*     *     *     *     *     *

Part Two: Pissing on Language (1)

His kind of bourgeois knows only how to talk about business. I’m not saying he’d lost his soul; he no longer had the language to express it.

—Léon Werth, 33 Days (p. 62)

 

But from the point of view of the philologist I also believe that Hitler’s shamelessly blatant rhetoric was able to make such an enormous impact because it penetrated a language which had hitherto been protected from it with the virulence which accompanies the outbreak of a new epidemic, because this rhetoric was in essence as un-German as the salute and uniform copied from the [Italian] Fascists—replacing a black-shirt with a brown-shirt is not a particularly original idea—and as un-German as the whole decorative embellishments of the public occasions.

But regardless of how much National Socialism learned from the preceding ten years of Fascism, and how much of the infection was caused by foreign bodies, it was, or rather became, in the end, a specifically German disease, a rampant degeneration of German flesh which, through a process of reinfection from Germany, destroyed not only Nazism, but also Italian Fascism, which was undoubtedly criminal, but not quite so bestial.

—Victor Klemperer, The Language of the Third Reich: LTI—Lingua Tertii Imperii: A Philologist’s Notebook, translated by Martin Brady (Bloomsbury, 2013, p. 57)

 

The path of poetic faith in our century suggests that repressive regimes do not tolerate, are in fact afraid of, the subversive powers of language, most especially poetry in the hand of those whom the political order aims to keep powerless.

—Terrence Des Pres, Praises and Dispraises: Poetry and Politics in the 20th Century (Penguin, 1988, p. 203)

 

One major gift the poet can give to what Des Pres calls the poet’s “tribe”—by which he means the “audience” for the poet’s poetry, that audience which, as Heidegger taught, the poetic work itself calls forth, to hear and heed it—is that of a genuine memorial, a place in the communally shared language where real remembrance can occur. Once so marked out, that place is one to which everyone can readily return, to remember—and a place that regularly and recurrently calls one back to itself, to do just that. Striking lines of poetry that, once heard, stick in the mind work that way. They become a storehouse memory, memory that rises back up into awareness spontaneously whenever it is reactivated, we know not (and need not know) how, by some chance encounter or event, like the taste of the madeleine for Proust. When they do, they call us back to ourselves, and to the memories that define us, both as individuals and as members of our community, our “tribe.”

In sharp contrast, the role of the pure slogan, insofar as it remains no more than that—the paradigm being the sheer advertising slogan, like this one from my childhood: “You’ll wonder where the yellow went, when you brush your teeth with Pepsodent!”—is not to call us back to ourselves. It is, rather, to divert us from ourselves, and most especially from openly sharing our life together in community. Its role is to isolate us from one another, rather than to bring us together in shared recall of who we really are, and to drive us into compulsive action. Instead of fostering community, remembrance, and thought, the sheer slogan divides us from one another, blocks memory, and hinders thought and any genuine thoughtfulness for one another. The pure slogan is trying to sell us something, in one fashion or another, for someone’s profit, not tell us something about ourselves, for our own good.

To use a current example, “Make America great again!” is not a call to each of us to come together with all other “Americans” in remembering the gaps and failures in our community and in our relationships one to another. It is not a call to us to atone for those failures, and address them honestly and humbly. It is, rather, a slogan designed to get us to “buy” a certain Presidential candidate, and vote for him. Or, to use a much older example, but one still from Presidential politics, the slogan “He kept us out of war!” was used to sell the nation on reelecting the man who would then lead us into it—“to make the world safe for democracy,” of course.

Slogans, like striking lines of poetry, stick in the mind. However, whereas genuinely poetic lines keep the channels of thought’s and recollection’s flow open and clear, the lines of slogans block that flow and close those channels, diverting thinking and memory into fixation on whatever the slogan is selling, whether that be toothpaste, Presidential candidates, or nationalistic fervor (“Remember the Maine!”, “Support our troops!”, “Never forget!”, and the like). Poetry cultivates the field of language, opening new possibilities for rich yields of diverse flowers and grains. Slogans fence language in and flatten out the field of linguistic possibilities, turning it into a smooth, monotonous, dead surface where nothing but the most noxious weeds can grow.

At least that is so unless something manages to breathe life and soul (to be redundant, since those two words really say the same thing) back into the slogan, opening it up into poetry.

That happened to me with regard to the old Pepsodent slogan I mentioned above. That was way back in my childhood, when that insipid, mindless, mind-numbing jingle was still in circulation, not yet having worn itself down completely smooth, so that it needed to be replaced by some new coin of the same (dis)value, destined to the same ultimate flattening into worthlessness. That’s because the version of that slogan that stuck itself into my mind was first, last, and recurrently the version of it that my father used to delight in singing whenever the mood struck him. His version was this: “You’ll wonder where the yellow went, when you wash your drawers in Pepsodent.” By heightening the definitive vulgarity of the original slogan, he thereby brought to the fore—without even trying, and certainly with no special intention on his part, given the simple, though intelligent, working man he always remained—the tastelessness and vulgarity of the original, and gave to me, his youngest son, a lasting memory of just what such slogans are ultimately all about, and worth. All of that—and, above all, the memory of my father himself saying his even more vulgarized version of the already vulgar toothpaste slogan—often comes back to me without effort on my own part, recalling me to myself, when some new advertising vulgarity worms its way into my mind and sticks itself fast there, like some foul, contagion-carrying tick.

Thanks, Dad!

*     *     *     *     *     *

To be continued.

Remembering the Third Reich American Style

This is the second in a series of posts.

*     *     *     *     *     *

Part One: The American Way of Remembering (2)

[. . .] we whites seem curiously unwilling to shoulder any responsibility for our own part in racial inequity. If we’re so concerned about “personal responsibility,” shouldn’t we show more?

—Nicholas Kristof, “When Whites Just Don’t Get It, Part 7” (NY Times 10/2/16)

 

We both detested the war [. . . but] both knew that if Hitler was responsible, he wasn’t as important as he was made out to be and he hadn’t invented himself without help.

—Léon Werth, 33 Days, translated by Austin Denis Johnston (Brooklyn & London: Melville House Publishing, 2015, p. 13)

 

Léon Werth, a French Jew who fled from Paris with his wife and daughter when the conquering Germans approached the city early in World War II, wrote what became the book 33 Days during that time of flight, when he and his family were refugees in their own country. In the line above Werth is speaking for himself and a friend he made along the way—a farmer who earned Werth’s gratitude by granting him and his family genuine hospitality, and whose open, discerning, and honest thoughtfulness also earned Werth’s respect.

Werth and his new friend saw that it was no more than a subterfuge to blame all the destruction of that war on the single figure of Hitler, who “hadn’t invented himself without help,” and was therefore not the only one responsible. They understood that at least a good part of the reason Hitler was being “made out to be” so important was so that those who helped make Hitler possible could thereby hide—even, and perhaps most of all, from themselves—their own responsibility.

The same basic thing is also at issue in the citation above from Nicholas Kristof, about racism in the United States: The avoidance of responsibility. Such avoidance can take the form of shirking one’s own responsibility by hiding behind some figure who has been “made out to be” much more important. But it can also take the form of projecting one’s own failure to assume one’s own responsibility onto those less fortunate than oneself. Either way, the effect is the same. Both are handy devices for denying one’s own responsibility—for refusing to remember what one has done, and hold oneself accountable for it.

In addition, all too often such slogans as “Always remember!” or “Never forget!” are employed in the same way: to create a sheer pretence of remembering that actually does dis-service to all genuine remembrance. Similarly, all too often, “official” memorials supposedly erected to honor memory actually dishonor it in just such a way. All too often, they just piss on all genuine, spontaneous memorialization, the same way that, as Jeff Chang recounts in the citation with which I began this series of posts, one cop let his dog piss on the memorials that sprang up spontaneously on the street in Ferguson, Missouri, where another cop had killed the unarmed Michael Brown. To borrow Chang’s way of putting it, they practice that same “American way of remembering.”

*     *     *     *     *     *

I will continue this series on “Remembering the Third Reich American Style” in my next post.

Remembering the Third Reich American Style—Part One: The American Way of Remembering (1)

This is the first in a series of posts.

*     *     *     *     *

One cop walked his dog over to the memorial that [Lesley] McSpadden had made for her son [Michael Brown, killed by another cop who’d pulled the teen-ager over for walking improperly on the street in Ferguson, Missouri, on August 9, 2014] and let it pee on the flowers and candles. After the rest of the policemen got into their vehicles to leave, car by car they rolled over what was left of the memorial. In the days to come, these memorials to Michael Brown Jr. would be destroyed over and over, as if to say, This is the American way of remembering.

—Jeff Chang, We Gon’ Be Alright: Notes on Race and Resegregation (Picador, 2016)

 

Continuing compulsively to repeat something over and over again is one way never to forget it. If by “always remembering” we mean nothing more or less than just “never forgetting,” then such compulsive repetition is a fail-proof way of assuring ourselves that we will always remember.

By steadfastly refusing ever to become aware of what we are doing in the first place, we guarantee that we will never forget it. We can only forget what we have once allowed to come into our awareness. So if we simply refuse ever to get clear about just what it is we are really doing, we never have to worry about forgetting it either. Such memory manifests as compulsive repetition.

All that’s needed for the practice of that form of remembering is the cultivation of stupidity. By the definition that has long been my favorite, “stupidity” is “willful ignorance.” So defined, stupidity is not just not knowing something, which is the literal, etymological meaning of the term ignorance—from Latin ignorantem, “not knowing,” the present participle of ignorare, “not to know, not to be acquainted with, to take no notice of, pay no attention to,” itself deriving from in-, in the sense of “not,” plus Old Latin gnarus, “aware, acquainted with.”

There are various possible reasons for ignorance, for not knowing, as such. That includes something being “hidden in plain sight,” so that we need someone or something else to call our attention to it before we notice it, like the glasses on our face we keep searching for until someone finally points out to us that we’re wearing them.

Many years ago, when I was only 15, I went with my parents to Germany one summer, to visit my older brother, who at that time was stationed in Frankfurt as a volunteer in the U.S. Army. Because I had taught myself some German in preparation for the trip, I became the designated family translator. On one occasion, my father needed something from the drug store, and I went with him to do any translation that might prove necessary. Standing right in front of the pharmacy counter, my father asked me to ask the pharmacist for whatever it was he wanted. I did so, in my limited German. But then the pharmacist answered in fluent English. My father looked at me inquiringly, waiting for my translation. As my exasperation began to mount, I repeated exactly what the pharmacist had just said, using the same English words. My father then gave me some more to say to the waiting pharmacist. My exasperation burst out as I responded, “You tell him! He’s speaking English!” My father just smiled at his own ignorance, and took things from there.

That remains for me to this day a fond memory of my father, and of the gracefulness with which one can respond when one finds that one has gotten bent down just a bit, even despite all one’s own perfectly innocent intentions.

Such innocent ignorance is not the only kind, however.

There is also the sort of ignorance that is rooted in the desire not to know, because what is all too clearly there to be known does not happen to accord with what one would like to be true. It is the sort of cherished ignorance that insists, against all opposition and despite all evidence to the contrary, that what actually is the case is precisely what one wants to be the case, because that would serve one’s own selfish wants, desires, or needs (including especially the need always to think highly of oneself, despite all one’s misdeeds, failures, or vices). There is no innocence to ignorance of such a sort, as there was innocence in my father’s sort of ignorance. It is willful ignorance, the product of wanting not to know.

That is what I mean by “stupidity”: just such willful ignorance.

Precisely because it is so willful, such stupidity also has nothing to do with lacking intelligence. In fact, my own experience throughout my life is that intelligence actually makes such stupidity easier. That’s because intelligence is useful for discovering more and more ways to avoid coming to know what one does not want to know. In general, the more intelligent one is, the craftier one can become, including crafty in the ways of hiding oneself from oneself.

Stupidity, willful ignorance, is back of the American way of remembering, as Chang describes it in the passage with which I began.

*     *     *     *     *

To be continued.

Can We Mourn Yet?

5.

The more time passes, the more difficult it becomes to acknowledge these mistakes.

—Dan Jianzhong, Beijing sociologist, concerning the Chinese Cultural Revolution, which began 50 years ago, in 1966 (quoted by journalist Chris Buckley in “High-Level Commentary Breaks Silence in China,” The New York Times, 5/17/16)

 

Mourning involves living in a world totally not of one’s choosing. It’s a world of paradoxes: a world that one doesn’t want to live in, but doesn’t want to die in either.

—Charles W. Brice, poet and retired psychoanalyst (personal communication)

 

One thing has been made very clear to me. Many people resent being confronted with information about how racism still shapes—and sometimes, ruins—life in this country.

—Jenée Desmond-Harris, “The Upside to Overt Racism” (The New York Times, 5/1/16)

 

Whoever, so as to simplify problems, denies the existence of certain obligations has, in his heart, made a compact with crime.

—Simone Weil, The Need for Roots (Routledge Classics, 2002; Fr. orig. 1949)

In general, it is no doubt right to say that the difficulty of acknowledging past mistakes increases with time. However, when those mistakes carry traumatic consequences, the more time passes the greater grows the urgency to do just that, to acknowledge them—and, even more, to set them right. Trauma, after all, has its own time, growing ever more insistent the longer it goes unaddressed, repeating its demands more and more loudly until they are finally heard, and elicit a proper response. Sooner or later, trauma’s time will come. Sooner or later, we will be able to mourn. We can only hope that the day for our mourning will come this side of Judgment Day, the eschatological end of days as such. However, there are reasons for pessimism on that matter.

Perhaps the greatest obstacle that stands between us as a nation and the dawning of our day of national mourning is precisely, as I put it in my preceding post, that we really are not “one nation, indivisible, with liberty and justice for all” except in our national Pledge of Allegiance. What keeps us from uniting in acknowledging and mourning the crimes that some of us have perpetrated on others of us (not to mention on other nations or peoples) is that we who are perpetrators or their descendants continue to derive so much benefit from those same crimes. Those of us who have the privilege of thinking ourselves “white,” for example, continue to derive great benefits from that very privilege, including the benefit of being able to drive our cars around our cities without being stopped and harassed by the police, as those not so privileged routinely are, for no better reason than that they lack that privilege.

Some time ago I wrote here, in a two-post series on “The Unforgiveable,” about Auschwitz survivor Jean Améry’s stipulation of the conditions under which he would be willing to let go of what he called his “resentments” against the Germans as a people or nation. In brief, Améry lays out a two-fold condition for such a settlement to occur at the level of what he calls “historical practice.” First, a true settlement would require “permitting resentment to remain alive in the one camp,” the camp of the victims of the crimes. Second, and simultaneously, “self-distrust” would need to be first enkindled and then kept carefully alive “in the other camp,” the camp of the perpetrators—the very self-distrust engendered by the perpetrators’ awareness and acceptance of their victims’ resentment. Genuine reconciliation could occur only by allowing the wounds of the victims to remain open and acknowledged, while simultaneously opening and keeping open an answering wound of deep self-mistrust in the perpetrators. Only if that were to happen would “the overpowered and those who overpowered them [. . .] be joined in the desire that time be turned back and, with it, that history become moral.”

In the case of Germany and what it did during World War II, for that nation to awaken such self-distrust would require it to become, as Améry says, “a national community that would reject everything, but absolutely everything, that it accomplished in the days of its own deepest degradation [that is, during the Nazi years of 1933-1945], and what here and there may appear to be as harmless as the Autobahns.” Nor was Améry blind to the fact that the entire postwar German “economic miracle” that allowed West Germany to become the economic powerhouse of Europe was itself only possible on the basis of the devastation of Germany at the end of the war, which allowed for the sort of radical retooling that fed the postwar German economic machine. Truly to “reject everything, but absolutely everything, that it accomplished” through its own criminal acts of its Nazi period, Germany would have had to reject not only Cold War financial support through the Marshall Plan but also everything else that Germany’s own utter defeat made possible for subsequent German economic development. Of course, “nothing of the sort will ever happen,” as Améry already knew and insisted long ago.

Nor will the United States as a nation ever truly mourn its own crimes. For one thing, it will never truly mourn the genocide of American Indians on which America is founded. For various reasons, it is even less likely ever truly to mourn the centuries of enslavement of African Americans on which the United States as a whole—not just the South, but the entire country—built its unparalleled global economic might.

It recently made the news that Georgetown University in Washington, D.C., was largely built on funds it acquired from direct engagement in the slave trade. In one sense, at least, there’s really nothing new in such news. As has long been recognized, many foundational United States universities—Brown, Cornell, Harvard, the University of Virginia, and others—were themselves founded, either directly or indirectly, on the bodies of slaves. So were many other institutions, both North and South. Then, too, of course, the institution of slavery was built right into the Constitution of the United States itself.

If the United States were ever really to mourn slavery and its hundreds of millions of victims, then at least at a bare minimum those of us who still continue to benefit from the consequences of slavery would need to let go of our resentment toward African Americans for their own ongoing resentment for those very consequences. We who are privileged to think ourselves “white” would have to grant those not so privileged the right to hold on to their resentment of us, and we would need simultaneously to match their resentment with deep, abiding self-distrust of ourselves, to borrow Améry’s way of putting the point.

Of course, nothing of the sort will ever happen, I know.

*     *     *     *     *     *

So where do we go from here?

Well, that question really calls for thinking.

Published on May 23, 2016 at 5:48 pm

Can We Mourn Yet?

4.

[H]ow can the memory of the colonists be reconciled with the memory of the colonized?

—Sadri Khiari, “The People and the Third People,” in What Is a People? (trans. Jody Gladding, Columbia University Press, 2016, p. 99)

There is something questionable about lumping all veterans together as equally deserving of honor. Many pointed just that out, to give one relatively recent example, when President Ronald Reagan accompanied West German Chancellor Helmut Kohl to a commemoration service at the military cemetery in Bitburg, Germany, thirty-one years ago this month, in May of 1985, to honor the German war-dead buried there. Unfortunately, the German veterans buried at Bitburg included even members of the Nazi Waffen-SS—despite the fact that the entire SS had been deservedly judged a “criminal organization” by the Nuremberg Tribunal at the end of World War II. Reagan’s remarks on that occasion in 1985 suggested it was time by then to let bygones be bygones. Even though some of those buried at Bitburg had served in a criminal organization of a regime that murdered millions in gas chambers, Reagan apparently thought it fitting, after so many years, to let all the dead be honored equally. Unfortunately, however, to honor the memory of murderers equally with the memory of those they murdered is to dishonor the latter—and to join the former, if only symbolically.

In the same way, to honor Confederate soldiers equally with Union soldiers and all other American veterans who died while serving in the United States military, as we have long done on Memorial Day, is to paper over the differences between the Confederacy and the Union. And in the process it is to dishonor the millions of African Americans who were sold into the very slavery over which the Civil War was fought in the first place. It is to forget their bondage and its toll of misery, and to forget who was responsible for it—which was by no means the Confederacy alone, be it added (a point to which I will return in my next post).

Forgetting also occurs under the appearance of remembrance when all U.S. veterans whatsoever are lumped together by robbing Armistice Day of its original significance and turning it into Veterans Day. That change involves expropriating the very day originally set aside in memoriam of what Woodrow Wilson was benighted or vicious enough (he certainly betrayed both character-traits) to call “the war to end all wars,” and appropriating it instead for the purpose of glorifying all U.S. military service of all times, even if that service consisted, for example, of dropping nuclear bombs on Hiroshima and Nagasaki, or torturing Iraqi captives, and not just such uncontroversially good deeds as liberating Paris from the Nazis or inmates from Dachau. What happens in such appropriation is the erasure and expropriation of the suffering of all our wars’ greatest victims, just as honoring all German war dead equally, including even members of the Waffen-SS, dishonors those millions of Germans who died while serving honorably in their country’s armed forces.

Remembering the dead is certainly a debt we owe them, one that should be honored and paid in full. Official memorializing of their deaths, however, is all too often a way of reneging on that debt, and failing to honor it. That is what almost always happens when memorializing is mandated by official state decree, as opposed to springing up spontaneously by popular action. The former is most often in the service of coercive power, helping to strengthen that power, or at least to maintain it. The latter expresses the desire to honor those who call out to be honored in remembrance.

In a riven society, memory is also riven. To be genuine, mourning must honor the rift. The discord must be heard and remembered, not drowned out and covered over, for real healing to occur. The wounds of division must be kept open. They must be acknowledged and mourned, if a truly single and united community—inclusive of all as true equals, rather than preserving privileges for some who remain always “more equal” than others—is ever to be formed among those who remain.

Whether “under God” (as Congress mandated only in 1954) or not, the society of the United States today is “one nation, indivisible, with liberty and justice for all” only in its own official Pledge of Allegiance. In reality, the society of the United States today continues to be riven by a variety of deep, longstanding social injustices it has never yet properly mourned.

Just this morning (May 16, 2016), The New York Times carried an article concerning one relatively recent instance of such a still unhealed wound of national division. The article addresses the controversy that is currently surfacing again as President Obama prepares to visit Vietnam soon, the third U.S. President to do so since the fall of Saigon to Communist forces in 1975. The rekindled controversy repeats the old divisions generated by the United States war in Vietnam in the 1960s and 1970s. We as a nation have yet to face and to mourn what we did in that war, and what it did to us.

Slavery, with all its consequences of ongoing discrimination and persecution against African Americans down to the present, is perhaps the most obvious other example, and the one I have discussed most in this current series of posts. Unfortunately, there are many other examples as well. The oldest one goes all the way back to the genocidal warfare on which this country was founded, the consequences of which continue to afflict American Indians to this day.

The list could be continued.

At any rate, what are the odds that we as a nation will ever be able to mourn such old yet still destructive divisions, and truly begin finally to heal them? The odds are not at all good, for reasons I have touched upon in my earlier posts in this series, and will address more directly in my next one, which I intend to be the last of this series on our own national capacity—or lack of it—to mourn.

Can We Mourn Yet?

3.

One of the official editorials by the editorial board of The New York Times on Sunday, April 3, 2016—the very same edition in which appeared the two articles I wrote about in my preceding post (one by Nicholas Kristof and the other by Margaret Sullivan)—was entitled “Race and the Death Penalty in Texas.” The editorial pointed to the overwhelming evidence demonstrating that the imposition of capital punishment in Texas discriminates against African Americans, and suggested that the death penalty both there and elsewhere in the United States should be abolished.

It certainly should be. However, that does not seem likely at present. One reason it is not likely is the current composition of the U. S. Supreme Court, which forty years ago reversed its own earlier judgment against capital punishment, permitting it again so long as it is not imposed in an “arbitrary or capricious manner,” as The Times quotes the Court saying in reversing itself. A deeper, even more intransigent factor is indicated by something else The Times itself says in its editorial: “Racism, of course, has been central to the American death penalty from the very start.” What The Times does not go on to say—but should have—is that for that very reason our national focus should not be on the death penalty at all, but rather on the racism that underlies and sustains it.

That, our national racism, is what we really need to eliminate. Otherwise, even if we as a nation were to side-step the Supreme Court and legislate the elimination of capital punishment, our real problem would still persist. In fact, it would in all probability just grow worse. We as a nation would all but certainly interpret the elimination of the death penalty as no more than the elimination of a lingering vestige of the racism we want to think we have already consigned to the past, rather than an ongoing crime we continue to perpetrate in the present. (In just the same way, according to numerous opinion polls, the majority of citizens of the United States were happy to convince themselves that the election of President Obama eight years ago proved that we as a nation had overcome racism.)

Capital punishment should be eliminated in the United States, and the election of our first African American President in 2008 deserves to be universally celebrated (regardless of what one thinks of him personally, or of the accomplishments of his Administration). However, neither eliminating the death penalty nor celebrating our first election of an African American President would prove the United States had overcome the racism that is such an ongoing national shame. Furthermore, by allowing us to pretend that we had already faced and overcome our racism, both would all too easily just harden our inability to mourn that racism, and its millions of victims past and present.

*     *     *     *     *     *

That brings me to yet a fourth article that caught my attention in the Sunday, April 3, edition of The New York Times. That fourth piece was also in the op-ed section. It was a column entitled “Why Slave Graves Matter,” by Sara A. Arnold, whom The Times identifies as the “founder of the Periwinkle Initiative and the National Burial Database of Enslaved Americans,” whose preliminary submissions, she tells us in her article, she is currently processing. In her piece Arnold argues that “community preservation initiatives can contribute to healing, understanding and potentially even reconciliation,” then ends her piece with the following paragraph:

Our country should explore ways to preserve the public memory of enslaved Americans. Their overlooked lives are an inextricable part of the historical narrative of our country—and not just because they were “beneficiaries” of the 13th Amendment. We should remember enslaved Americans for the same reason we remember anyone; because they were fathers, mothers, siblings and grandparents who made great contributions to our nation. Regardless of our country’s history or our ambivalence about the memory of slavery, we can choose to remember the enslaved—the forgotten. They offer our contemporary society examples of resilience and humanity. Preserving their memory contributes to our own humanity.

Unfortunately, in this case, too, I remain a skeptic. Here, my skepticism is above all because everything depends on just how we go about doing our “remembering.” There’s remembering, and then there’s remembering.

One kind of remembering is that officially done on such occasions as Veterans Day or Memorial Day, when we all get together to put flowers on graves or watch parades, maybe even with banners admonishing us never to forget those who have sacrificed for our national good, sometimes even with their lives. Those two cases—Veterans Day and Memorial Day—are deserving of more attention, since they can be used as good examples of the hidden complexities involved in the whole mixing of remembering with the setting up of official memorials or days of remembrance.

Armistice Day was officially created to memorialize the day when the armistice that ended active hostilities between the Allies and Germany on the Western Front in World War I went into effect. The armistice officially went into effect at 11:00 a.m. on November 11, 1918—the symbolically significant “eleventh hour of the eleventh day of the eleventh month,” just a clock-tick of history away from the World’s midnight. However, in many countries it eventually became an occasion to remember not just veterans of World War I but also all military veterans whatever. Following that trend, in 1954 the United States officially changed “Armistice” Day into “Veterans” Day.

Similarly, the “Grand Army of the Republic,” a Union veterans’ organization founded in Decatur, Illinois, held the first “Decoration Day” to memorialize Union soldiers who died in the American Civil War. In former Confederate states after the war, there were also various celebrations, held on various days, to commemorate Confederate veterans who had died in the same conflict. In the 20th century, all such celebrations, North and South, were merged into the current “Memorial Day,” which was also extended to honor all Americans who died while in the military service, in or out of combat, not just those who died during the Civil War. Thus, unlike Veterans Day, which was set aside as the official U. S. holiday to honor all veterans, regardless of whether they died while in military service, Memorial Day was set aside specifically to honor only those who did die while so serving.

All too often, however, officially setting aside such days of remembrance—or officially setting up such memorials as the Tomb of the Unknown Soldier in Arlington National Cemetery (or even setting up that cemetery itself as the official cemetery to honor U.S. veterans killed in combat)—does not, regardless of anyone’s intention, really serve genuine remembrance at all. All too often in such cases, what looks like an endeavor to encourage or even mandate remembrance in reality ends up just helping whatever powers that be keep the public order that perpetuates their power, an order that actually has good reason to fear being disturbed by genuine, spontaneous, uncontrolled remembrance.

In my next post, I will address that issue.

Can We Mourn Yet?

2.

That segment of the population [the privileged segment] wants to be surrounded by people with similar characteristics.

—Kevin Sheehan, former chief executive of Norwegian Cruise Line, as quoted by Nelson D. Schwartz in “In New Age of Privilege, Not All Are in Same Boat,” the lead article on the front page of The New York Times for Sunday, April 24, 2016 (the 100th anniversary of the start of the Irish Easter Rising of 1916, be it noted)

In my preceding post, the first in this series on whether we can mourn yet, I wrote about two articles that appeared in The New York Times for Sunday, March 20, this year. This new post will also concern pieces from The Times, but from an even more recent issue.

The first piece is itself a sort of recent reissue of an older one. Two years ago, regular Times contributor Nicholas Kristof did a series of columns he called “When Whites Just Don’t Get It.” Then just a few weeks ago, in The Times for Sunday, April 3, he wrote a reprise called “When Whites Just Don’t Get It, Revisited”—a revisiting he wrote was necessary because “public attention to racial disparities seems to be flagging even as the issues are as grave as ever.”

“Why do whites discriminate?” Kristof asks in his recent reprise. “The big factor,” he writes in answer to his own question, “isn’t overt racism. Rather, it seems to be unconscious bias among whites who believe in equality [that is, “whites” who, when asked, say they believe in equality, even and especially, I will add, if they are just asking and answering themselves] but act in ways that perpetuate inequality.” Kristof then cites Eduardo Bonilla-Silva, whom he identifies as “an eminent sociologist,” and who “calls this unconscious bias ‘racism without racists.’” About such presumably covert racism, Kristof says, “we whites should be less defensive.” One reason, he adds, that “we whites” don’t need to be so defensive about our own lingering, unacknowledged racism is that, in his judgment at least, such bias “affects blacks as well as whites, and we [all of “us,” presumably: “blacks as well as whites”] also have unconscious biases about gender, disability, body size and age.” Then a few paragraphs later he ends his column by writing: “The challenge is to recognize that unconscious bias afflicts us all—but that we just may be able to overcome it if we face it.”

How likely Kristof thinks it is that “we” will ever actually face the fact of such bias, he doesn’t say. Speaking solely for myself, I do not think it is very likely at all. Hence, I am equally skeptical that “we” have any real ability to overcome such bias.

*     *     *     *     *     *

Kristof’s remarks about how we all have such bias make me assume that what he means by the term “bias” is very broad. It would seem to cover such things as the simple uneasiness that we all have toward that which is different from us or unfamiliar to us. For example, if we grow up in a place where no one has red hair, and suddenly find ourselves visited by some red-haired stranger, then we will naturally tend toward being suspicious of, or at least not completely at ease with, our visitor, at least till we get to know him or her better: We will have an “unconscious bias” against any such red-heads, as Kristof seems to be using that phrase.

It is precisely with regard to unconscious “biases” of that perfectly natural and universal sort that our chances of coming to face them, and then perhaps even to overcome them, are best. However, if we turn to a different subset of unconscious biases, the odds against such change rise sharply. That applies above all to that subset of unconscious biases with regard to which our not knowing we have them is all too often because we do not want to know—those biases we have of which we do not just happen to be unaware, but which we actually have a vested interest, as it were, in keeping secret—secret even, and perhaps especially, from ourselves. At issue are those biases that we actually have a vested interest in maintaining, precisely because of all the benefits maintaining such biases brings us, at the cost of the very people against whom we do maintain them. That very self-interest then also strongly motivates us unconsciously to hide those unconscious biases from ourselves.

To give an example that is still of great ongoing importance, when it comes to racial bias in this country, it seems to me that, in general, the benefits from such bias are overwhelmingly weighted in favor of those of us who think ourselves “white,” as Ta-Nehisi Coates puts it in Between the World and Me (a book I have discussed in some earlier posts), rather than those of us who are not encouraged—if even permitted—so to think of ourselves. It directly benefits all of us who think we are “whites” to think that the rest of us, all the “non-whites,” are inferior to us “whites,” since that lets us “whites” keep on denying such supposed inferiors their fair share of the communal pie, so that we can keep on getting bigger slices for ourselves.

*     *     *     *     *     *

To give another example: It happens that in the same op-ed section of the Sunday issue of The New York Times in which Mr. Kristof has his column revisiting how, still, “whites just don’t get it,” there also appears another column, by Margaret Sullivan, who was then serving as the “Public Editor” for The Times (she’s since stepped down), called “Are Some Terrorism Deaths More Equal Than Others?” The answer editor Sullivan gives to that question is clearly in the affirmative, at least insofar as it comes to coverage of such deaths in dominant United States news media, including The Times itself. After devoting the first half of her column to various readers’ letters to her about the matter, Sullivan asks the most pertinent question, that of “why [there is] the persistent inequality that readers rightly observe?”

Her own answer to that question is four-fold. “Part of the answer,” she writes, “is access. It’s far easier to get a number of staff members to Paris or Brussels than, for example, to Pakistan [. . .] .” Next she addresses “another factor,” that of “deployment of resources,” of which she writes: “The Times has more than a dozen correspondents working in bureaus in Western Europe; far fewer, for example, are based in Africa.” As a third factor, according to her, “there is a legitimate question of newsworthiness. News is, by definition, something out of the ordinary. In some places, like Iraq, the tragic has become commonplace.” She then gives Egypt as another example (besides Iraq), citing a former Times correspondent stationed there who says that, while it used to be that “a bombing in Cairo would have been ‘a shock’,” that is no longer the case. Today, as the former correspondent says, “We can’t cover every attack there.” Finally, Sullivan cites as a fourth factor “the relationship between the United States and the country where an attack takes place.” In effect, she is saying that since France, for example, is important for our own interests (and, we might add, we even feel fondness for the French at the moment, a moment when it is no longer de rigueur for all good United States patriots who want to be politically correct to substitute “freedom fries” for “French fries,” and to call attention to themselves for doing so), we pay more attention to what happens there than in some place that has far less strategic importance for us (such as, say, Somalia or Haiti) or that we don’t like so much (such as, say, Finland or Indonesia).

Sullivan then draws her piece toward its end by patting her own employer on the back, writing that she is “glad that Times journalists recognize the need to reflect the importance of all human life lost to terrorism—whether it happens in a place where we Americans [by which she means United States citizens in good standing, of course] may have gone sightseeing [if we’re fortunate enough to be part of the minority of the United States population that can afford global tourist-travel] or one we will probably never set foot in [probably because the amenities there are not up to our standards for “exploring the world in comfort,” to borrow a slogan from Viking River Cruises]. And regardless of whether the victims seem ‘like us.’” In her following, final paragraph Sullivan concludes by writing: “Because, in fact, they surely are”—by which I assume she means that, even if some other people don’t “seem” so, all people really do turn out, upon thorough enough investigation, to be “like us.” That assumption is confirmed by the rest of her final paragraph, where she writes: “And it’s part of The Times’s journalistic mission to help its readers not only know that intellectually, but feel it in their hearts.”

*     *     *     *     *     *

I find that I am even more skeptical in the face of Margaret Sullivan’s apparent optimism that her employer is fulfilling a high “journalistic mission” than I am in the face of Nicholas Kristof’s apparent optimism that those in the United States who most need to face and change their “unconscious biases” will ever do so. I have already given one reason for my skepticism in contrast to Kristof’s optimism, a reason that has to do with how, for some of us, such biases are too deeply grounded in preserving our own privileges.

Among my reasons for skepticism about Sullivan’s optimism, I have one that is similar, which is this: Among the factors Sullivan lists to account for the “persistent inequality,” in dominant news sources such as The New York Times, of coverage of “terrorism deaths” in diverse places, she nowhere even mentions the factor of profit. But after all, what really accounts for the four factors she does address—the factors of “journalistic access, deployment of resources, and the admittedly subjective idea of what’s newsworthy,” as she summarizes her account (leaving out, for some reason, the fourth factor she mentions, that of being more concerned about deaths in nations that are of more strategic importance to our own national self-interest than deaths in nations with less such importance)—being factors in the first place? One need not even be as cynical about such things as I tend to be to suspect that the reason for those reasons themselves is above all that it is far more profitable to The Times to keep things that way, rather than to face its own biases, let alone change them.

Nor is that all. I have other grounds for skepticism. Indeed, even in the very same Sunday edition of The New York Times that contains Kristoff’s and Sullivan’s two columns, there are two more articles that remind me of those grounds. In my next post, I will turn to those two remaining pieces from that morning’s Times.