Can We Mourn Yet?

5.

The more time passes, the more difficult it becomes to acknowledge these mistakes.

—Dan Jianzhong, Beijing sociologist, concerning the Chinese Cultural Revolution, which began 50 years ago, in 1966 (quoted by journalist Chris Buckley in “High-Level Commentary Breaks Silence in China,” The New York Times, 5/17/16)

 

Mourning involves living in a world totally not of one’s choosing. It’s a world of paradoxes: a world that one doesn’t want to live in, but doesn’t want to die in either.

—Charles W. Brice, poet and retired psychoanalyst (personal communication)

 

One thing has been made very clear to me. Many people resent being confronted with information about how racism still shapes—and sometimes, ruins—life in this country.

—Jenée Desmond-Harris, “The Upside to Overt Racism” (The New York Times, 5/1/16)

 

Whoever, so as to simplify problems, denies the existence of certain obligations has, in his heart, made a compact with crime.

—Simone Weil, The Need for Roots (Routledge Classics, 2002; Fr. orig. 1949)

In general, it is no doubt right to say that the difficulty of acknowledging past mistakes increases with time. However, when those mistakes carry traumatic consequences, the more time passes the greater grows the urgency to do just that, to acknowledge them—and, even more, to set them right. Trauma, after all, has its own time, growing ever more insistent the longer it goes unaddressed, repeating its demands more and more loudly until they are finally heard, and elicit a proper response. Sooner or later, trauma’s time will come. Sooner or later, we will be able to mourn. We can only hope that the day for our mourning will come this side of Judgment Day, the eschatological end of days as such. However, there are reasons for pessimism on that matter.

Perhaps the greatest obstacle that stands between us as a nation and the dawning of our day of national mourning is precisely that, as I put it in my preceding post, we really are not “one nation, indivisible, with liberty and justice for all” except in our national Pledge of Allegiance. What keeps us from uniting in acknowledging and mourning the crimes that some of us have perpetrated on others of us (not to mention other nations or peoples) is that we who are perpetrators or their descendants continue to derive so much benefit from those same crimes. Those of us who have the privilege of thinking ourselves “white,” for example, continue to derive great benefits from that very privilege, including the benefit of being able to drive our cars around our cities without being stopped and harassed by the police for no better reason than our not being among those so privileged.

Some time ago I wrote here, in a two-post series on “The Unforgiveable,” about Auschwitz survivor Jean Améry’s stipulation of the conditions under which he would be willing to let go of what he called his “resentments” against the Germans as a people or nation. In brief, Améry lays out a two-fold condition for such a settlement to occur at the level of what he calls “historical practice.” First, a true settlement would require “permitting resentment to remain alive in the one camp,” the camp of the victims of the crimes. Second, and simultaneously, “self-distrust” would need to be first enkindled and then kept carefully alive “in the other camp,” the camp of the perpetrators—the very self-distrust engendered by the perpetrators’ awareness and acceptance of their victims’ resentment. Genuine reconciliation could occur only by allowing the wounds of the victims to remain open and acknowledged, while simultaneously opening and keeping open an answering wound of deep self-mistrust in the perpetrators. Only if that were to happen would “the overpowered and those who overpowered them [. . .] be joined in the desire that time be turned back and, with it, that history become moral.”

In the case of Germany and what it did during World War II, for that nation to awaken such self-distrust would require it to become, as Améry says, “a national community that would reject everything, but absolutely everything, that it accomplished in the days of its own deepest degradation [that is, during the Nazi years of 1933-1945], and what here and there may appear to be as harmless as the Autobahns.” Nor was Améry blind to the fact that the entire postwar German “economic miracle” that allowed West Germany to become the economic powerhouse of Europe was itself only possible on the basis of the devastation of Germany at the end of the war, which allowed for the sort of radical retooling that fed the postwar German economic machine. Truly to “reject everything, but absolutely everything, that it accomplished” through its own criminal acts of its Nazi period, Germany would have had to reject not only Cold War financial support through the Marshall Plan but also everything else that Germany’s own utter defeat made possible for subsequent German economic development. Of course, “nothing of the sort will ever happen,” as Améry already knew and insisted long ago.

Nor will the United States as a nation ever truly mourn its own crimes. For one thing, it will never truly mourn the genocide of American Indians on which America is founded. For various reasons, it is even less likely ever truly to mourn the centuries of enslavement of African Americans on which the United States as a whole—not just the South, but the entire country—built its unparalleled global economic might.

It recently made the news that Georgetown University in Washington, D.C., was largely built on funds it acquired from direct engagement in the slave trade. In one sense, at least, there’s really nothing new in such news. As has long been recognized, many foundational United States universities—Brown, Cornell, Harvard, the University of Virginia, and others—were themselves founded, either directly or indirectly, on the bodies of slaves. So were many other institutions, both North and South. Then, too, of course, the institution of slavery was built right into the Constitution of the United States itself.

If the United States were ever really to mourn slavery and its hundreds of millions of victims, then at least at a bare minimum those of us who still continue to benefit from the consequences of slavery would need to let go of our resentment toward African Americans for their own ongoing resentment for those very consequences. We who are privileged to think ourselves “white” would have to grant those not so privileged the right to hold on to their resentment of us, and we would need simultaneously to match their resentment with deep, abiding self-distrust of ourselves, to borrow Améry’s way of putting the point.

Of course, nothing of the sort will ever happen, I know.

*     *     *     *     *     *

So where do we go from here?

Well, that question really calls for thinking.


Can We Mourn Yet?

4.

[H]ow can the memory of the colonists be reconciled with the memory of the colonized?

—Sadri Khiari, “The People and the Third People,” in What Is a People? (trans. Jody Gladding, Columbia University Press, 2016, p. 99)

There is something questionable about lumping all veterans together as all equally deserving of honor. Many pointed just that out, to give one relatively recent example, when President Ronald Reagan accompanied West German Chancellor Helmut Kohl to a commemoration service at the military cemetery in Bitburg, Germany, thirty-one years ago this month, in May of 1985, to honor the German war-dead buried there. Unfortunately, the German veterans buried at Bitburg included even members of the Nazi Waffen-SS—despite the fact that the entire SS had been deservedly judged a “criminal organization” by the Nuremberg Tribunal at the end of World War II. Reagan’s remarks on that occasion in 1985 suggested it was time by then to let bygones be bygones. Even though some of those buried at Bitburg had served in a criminal organization of a regime that murdered millions in gas chambers, Reagan apparently thought it fitting, after so many years, to let all the dead be honored equally. Unfortunately, however, to honor the memory of murderers equally with the memory of those they murdered is to dishonor the latter—and to join the former, if only symbolically.

In the same way, to honor Confederate soldiers equally with Union soldiers and all other American veterans who died while serving in the United States military, as we have long done on Memorial Day, is to paper over the differences between the Confederacy and the Union. And in the process it is to dishonor the millions of African Americans who were sold into the very slavery over which the Civil War was fought in the first place. It is to forget their bondage and its toll of misery, and to forget who was responsible for it—which was by no means the Confederacy alone, be it added (a point to which I will return in my next post).

Forgetting also occurs under the appearance of remembrance when all U. S. veterans whatever are lumped together by robbing Armistice Day of its original significance and turning it into Veterans Day. That change involves the expropriation of the very day originally set aside in memoriam of what Woodrow Wilson was benighted or vicious enough (he certainly betrayed both character-traits) to call “the war to end all wars,” and appropriating it instead for the purpose of glorifying all U.S. military service of all times, even if that service consisted, for example, of dropping nuclear bombs on Hiroshima and Nagasaki, or torturing Iraqi captives, and not just such uncontroversially good deeds as liberating Paris from the Nazis or inmates from Dachau. What happens in such appropriation is the erasure and expropriation of the suffering of all our wars’ greatest victims, just as honoring all German war dead equally, including even the Waffen-SS, dishonors those millions of Germans who died while serving honorably in their country’s armed forces.

Remembering the dead is certainly a debt we owe them, one that should be honored and paid in full. Official memorializing of their deaths, however, is all too often a way of reneging on that debt, and failing to honor it. That is what almost always happens when memorializing is mandated by official state decree, as opposed to springing up spontaneously by popular action. The former is most often in the service of coercive power, helping to strengthen that power, or at least to maintain it. The latter expresses the desire to honor those who call out to be honored in remembrance.

In a riven society, memory is also riven. To be genuine, mourning must honor the rift. The discord must be heard and remembered, not drowned out and covered over, for real healing to occur. The wounds of division must be kept open. They must be acknowledged and mourned, if a truly single and united community—inclusive of all as true equals, rather than preserving privileges for some who remain always “more equal” than others—is ever to be formed among those who remain.

Whether “under God” (as Congress mandated only in 1954) or not, the society of the United States today is “one nation, indivisible, with liberty and justice for all” only in its own official Pledge of Allegiance. In reality, the society of the United States today continues to be riven by a variety of deep, longstanding social injustices it has never yet properly mourned.

Just this morning (May 16, 2016), The New York Times carried an article concerning one relatively recent instance of such a still unhealed wound of national division. The article addresses the controversy that is currently surfacing again as President Obama prepares to visit Vietnam soon, the third U.S. President to do so since the fall of Saigon to Communist forces in 1975. The rekindled controversy echoes the old divisions generated by the United States war in Vietnam in the 1960s and 1970s. We as a nation have yet to face and to mourn what we did in that war, and what it did to us.

Slavery, with all its consequences of ongoing discrimination and persecution against African Americans down to the present, is perhaps the most obvious other example, and the one I have discussed most in this current series of posts. Unfortunately, there are many other examples as well. The oldest one goes all the way back to the genocidal warfare on which this country was founded, the consequences of which continue to afflict American Indians to this day.

The list could be continued.

At any rate, what are the odds that we as a nation will ever be able to mourn such old yet still destructive divisions, and truly begin finally to heal them? The odds are not at all good, for reasons I have touched upon in my earlier posts in this series, and will address more directly in my next one, which I intend to be the last of this series on our own national capacity—or lack of it—to mourn.

Can We Mourn Yet?

3.

One of the official editorials by the editorial board of The New York Times on Sunday, April 3, 2016—the very same edition in which appeared the two articles I wrote about in my preceding post (one by Nicholas Kristof and the other by Margaret Sullivan)—was entitled “Race and the Death Penalty in Texas.” The editorial pointed to the overwhelming evidence demonstrating that the imposition of capital punishment in Texas discriminates against African Americans, and suggested that the death penalty both there and elsewhere in the United States should be abolished.

It certainly should be. However, that does not seem likely at present. One reason it is not likely is the current composition of the U. S. Supreme Court, which forty years ago reversed its own earlier judgment against capital punishment, permitting it again so long as it is not imposed in an “arbitrary or capricious manner,” as The Times quotes the Court saying in reversing itself. A deeper, even more intransigent factor is indicated by something else The Times itself says in its editorial: “Racism, of course, has been central to the American death penalty from the very start.” What The Times does not go on to say—but should have—is that for that very reason our national focus should not be on the death penalty at all, but rather on the racism that underlies and sustains it.

That, our national racism, is what we really need to eliminate. Otherwise, even if we as a nation were to side-step the Supreme Court and legislate the elimination of capital punishment, our real problem would still persist. In fact, it would in all probability just grow worse. We as a nation would all but certainly interpret the elimination of the death penalty as no more than the elimination of a lingering vestige of the racism we want to think we have already consigned to the past, rather than an ongoing crime we continue to perpetrate in the present. (In just the same way, according to numerous opinion polls, the majority of citizens of the United States were happy to convince themselves that the election of President Obama eight years ago proved that we as a nation had overcome racism.)

Capital punishment should be eliminated in the United States, and the election of our first African American President in 2008 deserves to be universally celebrated (regardless of what one thinks of him personally, or of the accomplishments of his Administration). However, neither eliminating the death penalty nor celebrating our first election of an African American President would prove the United States had overcome the racism that is such an ongoing national shame. Furthermore, by allowing us to pretend that we had already faced and overcome our racism, both would all too easily just harden our inability to mourn that racism, and its millions of victims past and present.

*     *     *     *     *     *

That brings me to yet a fourth article that caught my attention in the Sunday, April 3, edition of The New York Times. That fourth piece was also in the op-ed section. It was a column entitled “Why Slave Graves Matter,” by Sandra A. Arnold, whom The Times identifies as the “founder of the Periwinkle Initiative and the National Burial Database of Enslaved Americans,” preliminary submissions to which, she tells us in her article, she is currently processing. In her piece Arnold argues that “community preservation initiatives can contribute to healing, understanding and potentially even reconciliation,” then ends her piece with the following paragraph:

Our country should explore ways to preserve the public memory of enslaved Americans. Their overlooked lives are an inextricable part of the historical narrative of our country—and not just because they were “beneficiaries” of the 13th Amendment. We should remember enslaved Americans for the same reason we remember anyone: because they were fathers, mothers, siblings and grandparents who made great contributions to our nation. Regardless of our country’s history or our ambivalence about the memory of slavery, we can choose to remember the enslaved—the forgotten. They offer our contemporary society examples of resilience and humanity. Preserving their memory contributes to our own humanity.

Unfortunately, in this case, too, I remain a skeptic. Here, my skepticism is above all because everything depends on just how we go about doing our “remembering.” There’s remembering, and then there’s remembering.

One kind of remembering is that officially done on such occasions as Veterans Day or Memorial Day, when we all get together to put flowers on graves or watch parades, maybe even with banners admonishing us never to forget those who have sacrificed for our national good, sometimes even with their lives. Those two cases—Veterans Day and Memorial Day—are deserving of more attention, since they can be used as good examples of the hidden complexities involved in the whole mixing of remembering with the setting up of official memorials or days of remembrance.

Armistice Day was officially created to memorialize the day when the armistice that ended active hostilities between the Allies and Germany on the Western Front in World War I went into effect. The armistice officially went into effect at 11:00 a.m. on November 11, 1918—the symbolically significant “eleventh hour of the eleventh day of the eleventh month,” just a clock-tick of history away from the World’s midnight. However, in many countries it eventually became an occasion to remember not just veterans of World War I but also all military veterans whatever. Following that trend, in 1954 the United States officially changed “Armistice” Day into “Veterans” Day.

Similarly, the “Grand Army of the Republic,” a Union veterans’ organization founded in Decatur, Illinois, held the first “Decoration Day” to memorialize Union soldiers who died in the American Civil War. In former Confederate states after the war, there were also various celebrations, held on various days, to commemorate Confederate veterans who had died in the same conflict. In the 20th century, all such celebrations, North and South, were merged into the current “Memorial Day,” which was also extended to honor all Americans who died while in the military service, in or out of combat, not just those who died during the Civil War. Thus, unlike Veterans Day, which was set aside as the official U. S. holiday to honor all veterans, regardless of whether they died while in military service, Memorial Day was set aside specifically to honor only those who did die while so serving.

All too often, however, officially setting aside such days of remembrance—or officially setting up such memorials as the Tomb of the Unknown Soldier in Arlington National Cemetery (or even setting up that cemetery itself as the official cemetery to honor U. S. veterans killed in combat)—does not, regardless of anyone’s intention, really serve genuine remembrance at all. All too often in such cases, what looks like an endeavor to encourage or even mandate remembrance in reality ends up just helping whatever powers that be keep the public order that perpetuates their power, an order that actually has good reason to fear being disturbed by genuine, spontaneous, uncontrolled remembrance.

In my next post, I will address that issue.

Can We Mourn Yet?

2.

That segment of the population [the privileged segment] wants to be surrounded by people with similar characteristics.

—Kevin Sheehan, former chief executive of Norwegian Cruise Line, as quoted by Nelson D. Schwartz in “In New Age of Privilege, Not All Are in Same Boat,” the lead article on the front page of The New York Times for Sunday, April 24, 2016 (the 100th anniversary of the start of the Irish Easter Rising of 1916, be it noted)

In my preceding post, the first in this series on whether we can mourn yet, I wrote about two articles that appeared in The New York Times for Sunday, March 20, this year. This new post will also concern pieces from The Times, but from an even more recent issue.

The first piece is itself a sort of recent reissue of an older one. Two years ago, regular Times contributor Nicholas Kristof did a series of columns he called “When Whites Just Don’t Get It.” Then just a few weeks ago, in The Times for Sunday, April 3, he wrote a reprise called “When Whites Just Don’t Get It, Revisited”—a revisiting he wrote was necessary because “public attention to racial disparities seems to be flagging even as the issues are as grave as ever.”

“Why do whites discriminate?” Kristof asks in his recent reprise. “The big factor,” he writes in answer to his own question, “isn’t overt racism. Rather, it seems to be unconscious bias among whites who believe in equality [that is, “whites” who, when asked, say they believe in equality, even and especially, I will add, if they are just asking and answering themselves] but act in ways that perpetuate inequality.” Kristof then cites Eduardo Bonilla-Silva, whom he identifies as “an eminent sociologist,” and who “calls this unconscious bias ‘racism without racists.’” About such presumably covert racism, Kristof says, “we whites should be less defensive.” One reason, he adds, that “we whites” don’t need to be so defensive about our own lingering, unacknowledged racism, is that, in his judgment at least, such bias “affects blacks as well as whites, and we [all of “us,” presumably: “blacks as well as whites”] also have unconscious biases about gender, disability, body size and age.” Then a few paragraphs later he ends his column by writing: “The challenge is to recognize that unconscious bias afflicts us all—but that we just may be able to overcome it if we face it.”

How likely Kristof thinks it is that “we” will ever actually face the fact of such bias, he doesn’t say. Speaking solely for myself, I do not think it is very likely at all. Hence, I am equally skeptical that “we” have any real ability to overcome such bias.

*     *     *     *     *     *

Kristof’s remarks about how we all have such bias make me assume that what he means by that term bias is very broad. It would seem to cover such things as the simple uneasiness that we all have toward that which is different from us or unfamiliar to us. For example, if we grow up in a place where no one has red hair, and suddenly find ourselves visited by some red-haired stranger, then we will naturally tend toward being suspicious of, or at least not completely at ease with, our visitor, at least till we get to know him or her better: We will have an “unconscious bias” against any such redheads, as Kristof seems to be using that phrase.

It is precisely with regard to unconscious “biases” of that perfectly natural and universal sort that our chances of coming to face them, and then perhaps even to overcome them, are best. However, if we turn to a different subset of unconscious biases, the odds against such change rise sharply. That applies above all to those unconscious biases we do not know we have because, all too often, we do not want to know—biases of which we are not just accidentally unaware, but which we have a vested interest, as it were, in keeping secret, even and perhaps especially from ourselves. At issue are those biases we have a vested interest in maintaining, precisely because of all the benefits maintaining them brings us, at the cost of the very people against whom we maintain them. That very self-interest then also strongly motivates us unconsciously to hide those unconscious biases from ourselves.

To give an example that is still of great ongoing importance, when it comes to racial bias in this country, it seems to me that, in general, the benefits from such bias are overwhelmingly weighted in favor of those of us who think ourselves “white,” as Ta-Nehisi Coates puts it in Between the World and Me (and which I have discussed in some earlier posts), rather than those of us who are not encouraged—if even permitted—so to think of ourselves. It directly benefits all of us who think we are “whites” to think that the rest of us, all the “non-whites,” are inferior to us “whites,” since that lets us “whites” keep on denying such supposed inferiors their fair share of the communal pie, so that we can keep on getting bigger slices for ourselves.

*     *     *     *     *     *

To give another example: It happens that in the same op-ed section of the Sunday issue of The New York Times in which Mr. Kristof has his column revisiting how, still, “whites just don’t get it,” there also appears another column, by Margaret Sullivan, who was then serving as the “Public Editor” for The Times (she’s since stepped down), called “Are Some Terrorism Deaths More Equal Than Others?” The answer editor Sullivan gives to that question is clearly in the affirmative, at least insofar as it comes to coverage of such deaths in dominant United States news media, including The Times itself. After devoting the first half of her column to various readers’ letters to her about the matter, Sullivan asks the most pertinent question, that of “why [there is] the persistent inequality that readers rightly observe?”

Her own answer to that question is four-fold. “Part of the answer,” she writes, “is access. It’s far easier to get a number of staff members to Paris or Brussels than, for example, to Pakistan [. . .] .” Next she addresses “another factor,” that of “deployment of resources,” of which she writes: “The Times has more than a dozen correspondents working in bureaus in Western Europe; far fewer, for example, are based in Africa.” As a third factor, according to her, “there is a legitimate question of newsworthiness. News is, by definition, something out of the ordinary. In some places, like Iraq, the tragic has become commonplace.” She then gives Egypt as another example (besides Iraq), citing a former Times correspondent stationed there who says that, while it used to be that “a bombing in Cairo would have been ‘a shock’,” that is no longer the case. Today, as the former correspondent says, “We can’t cover every attack there.” Finally, Sullivan cites as a fourth factor “the relationship between the United States and the country where an attack takes place.” In effect, she is saying that since France, for example, is important for our own interests (and, we might add, we even feel fondness for the French at the moment, a moment when it is no longer de rigueur for all good United States patriots who want to be politically correct to substitute “freedom fries” for “French fries,” and to call attention to themselves for doing so), we pay more attention to what happens there than in some place that has far less strategic importance for us (such as, say, Somalia or Haiti) or that we don’t like so much (such as, say, Finland or Indonesia).

Sullivan then draws her piece toward its end by patting her own employer on the back, writing that she is “glad that Times journalists recognize the need to reflect the importance of all human life lost to terrorism—whether it happens in a place where we Americans [by which she means United States citizens in good standing, of course] may have gone sightseeing [if we’re fortunate enough to be part of the minority of the United States population that can afford global tourist-travel] or one we will probably never set foot in [probably because the amenities there are not up to our standards for “exploring the world in comfort,” to borrow a slogan from Viking River Cruises]. And regardless of whether the victims seem ‘like us.’” In her following, final paragraph Sullivan concludes by writing: “Because, in fact, they surely are”—by which I assume she means that, even if some other people don’t “seem” so, all people really do turn out, upon thorough enough investigation, to be “like us.” That assumption is confirmed by the rest of her final paragraph, where she writes: “And it’s part of The Times’s journalistic mission to help its readers not only know that intellectually, but feel it in their hearts.”

*     *     *     *     *     *

I find that I am even more skeptical in the face of Margaret Sullivan’s apparent optimism that her employer is fulfilling a high “journalistic mission” than I am in the face of Nicholas Kristof’s apparent optimism that those in the United States who most need to face and change their “unconscious biases” will ever do so. I have already given one reason for my skepticism in contrast to Kristof’s optimism, a reason that has to do with how, for some of us, such biases are too deeply grounded in preserving our own privileges.

Among my reasons for skepticism about Sullivan’s optimism, I have one that is similar, which is this: Among the factors Sullivan lists to account for the “persistent inequality,” in dominant news sources such as The New York Times, of coverage of “terrorism deaths” in diverse places, she nowhere even mentions the factor of profit. But after all, what really accounts for the four factors she does address—the factors of “journalistic access, deployment of resources, and the admittedly subjective idea of what’s newsworthy,” as she summarizes her account (leaving out, for some reason, the fourth factor she mentions, that of being more concerned about deaths in nations that are of more strategic importance to our own national self-interest than deaths in nations with less such importance)—being factors in the first place? One need not even be as cynical about such things as I tend to be to suspect that the reason for those reasons is above all that it is far more profitable to The Times to keep things that way, rather than to face its own biases, let alone change them.

Nor is that all. I have other grounds for skepticism. Indeed, even in the very same Sunday edition of The New York Times that contains Kristof’s and Sullivan’s two columns, there are two more articles that remind me of those grounds. In my next post, I will turn to those two remaining pieces from that morning’s Times.

Can We Mourn Yet?

We know through painful experience that freedom is never voluntarily given by the oppressor; it must be demanded by the oppressed.

—Martin Luther King, Jr., “Letter from a Birmingham Jail,” April 16, 1963

1.

Recently a couple of articles in The New York Times for Sunday, March 20, of this year caught my eye. My attention was drawn to them both at least in large part because, around that same time, I was writing posts for my preceding series on “Faith in Trauma,” which included some discussion of the classic 1967 book by Alexander and Margarete Mitscherlich, Die Unfähigkeit zu trauern: Grundlagen kollektiven Verhaltens (Munich: Piper Verlag)—eventually translated into English as The Inability to Mourn: Principles of Collective Behavior (New York: Grove Press, 1975).

The first of the pieces in that Sunday’s Times that drew my attention was on the front page of the op-ed section. It was a column by Eric Fair, who was a civilian contractor helping United States forces conduct interrogations in Iraq after the U.S. invasion of that country in 2003, under President George W. Bush. Fair assisted in the torture of Iraqi prisoners—what the Bush administration, of course, preferred to call the use of “enhanced interrogation techniques” since, after all, “the United States doesn’t torture,” as Bush blithely insisted.

Fair’s piece was given the title “Owning Up To Torture,” and served, among other things, to advertise his since-released memoir Consequence. What first made me notice the piece was the line inserted by the editors in large, boldfaced print near the end of the first of the article’s two columns. It read:

Men like Donald Trump and Ted Cruz don’t have to bear the cost

In one paragraph late in his article, Fair writes about how during this election season both Trump and Cruz have repeatedly “suggested that waterboarding and other abhorrent interrogation tactics should not be considered illegal.” A bit later, Fair adds that, given “the opportunity to speak to other interrogators and intelligence professionals, I would warn them about men like Donald Trump and Ted Cruz.” Fair says he “would warn them that they’ll be told to cross lines by men who would never be asked to do it themselves”—just as neither Trump nor Cruz ever would be—but that “once they cross the line, those [same] men will not be there to help them find their way back.” He concludes his article by writing: “As an interrogator, torture forced me to set aside my humanity when I went to work. It’s something I’ve never been able to fully pick back up again. And it’s something we must never ask another American to do.”

In that final sentence, by the pronoun ‘we’ Fair obviously does not mean fellow “interrogators and intelligence professionals,” since it is precisely they whom “we” must never again ask to do the sorts of things “we” have in the past asked Fair and others to do in Iraq—and all too many other places. Presumably, by “we” Fair means the United States as a nation.

In their 1967 book Alexander and Margarete Mitscherlich address the German nation’s inability to mourn its Nazi past from 1933 to 1945, and especially to mourn the victims of Germany’s many acts of aggression and genocide during that period, the many millions of people the Germans murdered in those years. That inability to mourn was still all too definitive of Germany at least in 1967, twenty-two years after the end of World War II, as it may well still be today, almost half a century further on.

Fair’s article—and the book it advertises, which has since appeared and which I’ve also now read—raises the same issue for the United States as a nation today with regard to its actions in Iraq after we invaded that country in 2003, just thirteen years ago. If we as a nation are not able to mourn those we have asked such men as Eric Fair to torture and murder in our name, then neither will we be able to heed Fair’s admonition that we never again ask such a thing of anyone.

It is all too easy to say “never again.” The hard part is to keep our word, once we do say that. Part of what makes that so hard, in turn, is that, in order to keep our word, we must first truly acknowledge and mourn all we have lost by having once done what we now say we will never do again. Can we so mourn?

Our national history gives us scant reason for optimism that we can.

*     *     *     *     *     *

The second piece in the same recent Sunday New York Times that drew my attention was a book review of The Black Calhouns: From Civil War to Civil Rights With One African American Family, by Gail Lumet Buckley, daughter of Lena Horne and a descendant of the same Calhouns. The review was by Patricia J. Williams, a law professor at Columbia and columnist for The Nation. In the second half of her review, Williams touches briefly on the blatant racism of such all-American icons as Woodrow Wilson and John C. Calhoun. Then she remarks that The Black Calhouns “makes for particularly interesting reading against the backdrop of today’s culture wars, from Donald Trump’s disingenuous claim not to know anything about white supremacy to efforts in Texas [Ted Cruz’s home state, be it noted] to cut all mention of Jim Crow and the Klan from social studies textbooks.” She ends her review by complimenting Buckley for how well the “meticulously detailed recollections” of her book call out insistently to the reader, on behalf of black slaves and their descendants: “We were here! We were there! Do not forget!”

However, as Williams goes on to remark, that’s just what we have done. We have “forgotten, over and over.” Williams compliments Buckley for giving us in her book “a comprehensive reminder of how, even when not immediately visible, the burden of racial trauma is carried deep within the body politic.” Then Williams concludes her review with this line: “With so much of our collective national experience consigned to oblivion, we tread unknowingly on the graves of those whose lack of accorded dignity echoes with us yet.”

How can we possibly mourn what we refuse even to remember?

*     *     *     *     *     *

We can let the Germans concern themselves with the question of whether they have even yet proven themselves capable of doing their own mourning for their own dark past. We need to focus on the question of our own national ability—or lack of it—to mourn our own such past, whether that be so recent a past as our war in Iraq, or a more distant past, such as that of the centuries during which some of us built the power of the United States as a nation on the backs of others of us, the backs, that is, of African American slaves.

The two pieces that especially drew my attention in the Times for that recent Sunday of March 20—one devoted to each of those two pasts: the relatively recent American invasion and occupation of Iraq, and the long American history of the enslavement of African Americans—suggest that we, as a nation, lack that ability.

In my next post, I will introduce more disheartening recent evidence of our own continuing, shameful national incapacity to mourn.

Faith in Trauma: Breaking the Spell

Trauma-Faith: Breaking the Spell (continued and concluded)

The decision whereby one comes truly alive is itself never without risk. If it were, there would be nothing decisive about it. To take that risk is to risk oneself, not just such stuff as one’s money, one’s comfort, or one’s security; and to run such a risk—where the stakes are one’s very being as a “self” in the first place—requires real faith, not just comforting self-bewitchment.

Yet, as Kathleen Norris notes, that faith is nothing out of the ordinary, reserved for only the few. That is the “fascinating trait” of every real choice for life over death—every choice, as Alain Badiou puts it at one point in his recent book on the “metaphysics” of happiness (p. 37), to surmount “the tissue of mediocre satisfactions” held out to us all by our rampantly consumerist society as its vision of what constitutes a happy life. It is a choice to risk real life, and the real happiness that goes with such life, and only with it.

Norris and Badiou are at one in insisting that the opportunity, the opening, to make such a choice is nothing that comes only in rare or unusual moments, and only to a select few. It is, rather, an opportunity, an opening, that can suddenly present itself, as Badiou writes, “in every episode of life, no matter how trivial or minor it may be.” Even the most everyday of occurrences can suddenly break the spell that binds us, calling upon us to display real faith by choosing to begin really living our lives, rather than just passively undergoing them, just going on outliving ourselves day after day to the grave.

Once we are truly given a real choice, everything depends on us, and whether we have the faith to go ahead and choose.

What is more, such simple faith, the faith that permits choosing actually to live one’s own life rather than just trying to survive it, can never be claimed as some sort of permanent acquisition. It is not some piece of privately owned property that, once acquired, can be disposed of as one sees fit. The decision to live, however everyday it may be, is a decision whereby one accepts martyrdom for one’s faith—from the Greek martys, “witness”—which need have nothing flashy or Hollywood-heroic about it. As Norris helps us see, such genuine martyrdom can be as quiet and unpretentious as the small daily sacrifices, fully embraced, that parents continually make for their children.

Nor, short of death itself, is such witnessing ever over and done with. It is always there in front of us, needing to be demonstrated ever again anew. It demands constant, ongoing reaffirmation—exactly what Kierkegaard called “repetition.” Exchanging truly understood and meant wedding vows in some formal setting, to use one of Kierkegaard’s own best examples, does not let spouses off the hook of then having to honor those vows, to keep them and the love they sacramentally express alive in their daily life together—forever repeating their vows and the love the bestowing of those vows effectively signifies, “till death do us part.”

Nor is that anything peculiar to getting married. It is the same with every decision, once really taken.

The faith witnessed by any real decision to run the risk of coming truly alive is just such a faith that must be kept. The specific “content,” as it were, of the decision and faith at issue may vary greatly, of course, from person to person and even from one day to the next.

In the same way, each day for each person, temptation to “break faith” (a tellingly accurate expression) with one’s own decision can take a new form. Whatever form the temptation to break the faith with one’s own life may take, however, each and every day one is faced again with the decision either to keep on truly living, or just to fall back into letting one’s days dribble on endlessly, one after another, till one can finally check out of the whole game altogether and just expire—like Nietzsche’s ever-contented “last man.”

Only a faith that accepts the risk of living is one that finally turns and faces trauma, rather than running from it, and then tests and proves itself by faithfully facing trauma again anew, each and every day, day after day thereafter.

That is true faith in trauma, a faith that always keeps the wound open.

Faith in Trauma: Breaking the Spell

Trauma-Faith: Breaking the Spell

To enchant is to cast a spell. In turn, to disenchant is to break the spell of an earlier enchantment. In the first decades of the 20th century, Max Weber made popular the idea that modernization—with its ever more exclusive prioritization of science, technology, and instrumental rationality over faith, tradition, and belief—centrally involved a process of the “disenchantment” (Entzauberung) of nature. Ever since Weber, however, it can be, and has been, debated whether modernization really broke a spell, or whether it cast one.

So, for example, in one of his writings on the rise of modern technology in volume 76 of the Gesamtausgabe (“Complete Edition”) of his works, Martin Heidegger makes explicit reference to the Weberian idea of disenchantment, only to argue against that thesis. Rather than a dis-enchantment (Entzauberung), says Heidegger (pages 296-297), what is truly involved in the rise of modern technology itself is instead an en-chantment (Verzauberung), a bewitching, hexing, or casting of a spell. That enchantment, according to him, is one whereby the very power at play in modern technology can make good on its own exclusive claim to power, as it were—just as, in the fairy story, the wicked witch, to secure her own claim to the power of beauty, casts a spell over Sleeping Beauty, the legitimate claimant.

According to Heidegger, that enchantment—the casting of the spell whereby what is at work in modern technology (as well as at work in all of the modern science and instrumental rationality that goes with that technology) seizes and secures its own power—goes hand in hand with the de-worlding (Entweltung) of the world, the de-earthing (Enterdung) of the earth, the de-humanizing (Entmenschung) of humanity, and the de-divinizing (Entgötterung) of divinity. “Technology,” writes Heidegger, “as the unleashing and empowering of energies [. . .] first creates ‘new needs’,” and then produces the resources to satisfy them: technology “first discloses the world to which it then fits its products.”

Badiou said essentially the same thing just last year in À la recherche du réel perdu (“In Search of the Lost Real”), his critique of our contemporary “entertainment world,” as he calls it at one point, using the English expression—a world-less pseudo-world actually, one ever more frenziedly devoted to the pursuit of Pascalian diversion from reality. In such a desolate pseudo-world, what falsely but inescapably presents itself as “reality” is in truth so utterly crushing that it permits no genuine, full living at all any longer, but only survival. Nor does such a divertingly fake world any longer have any room for any true faith. It only makes room for superstitions—precisely the sort of dangerously superstitious nonsense, for example, that United States Supreme Court Justice Antonin Scalia spouted at a high school commencement speech shortly before his recent demise, when he attributed the global success of the United States to the frequent invocation of God’s name by our Presidents and other public officials (see my citation of his remarks to that effect at the beginning of my earlier post, “An Anxious Peace: ‘God’ After Auschwitz”).

In a world already deeply asleep, under the bewitching spell cast by what Badiou lucidly calls “triumphant capitalism,” what we need is precisely dis-enchantment, the breaking of the spell. The spell that holds the world in thrall today is broken whenever, anywhere in the world, reality suddenly and unexpectedly breaks through to dispel (good word for it: “de-spell”) any illusion that happiness consists of endlessly buying what the global market endlessly offers for sale.

In Métaphysique du bonheur réel (“Metaphysics of real happiness”)—a short book he also published earlier last year and in which he was already “in search of the lost real”—Badiou describes the illusion that the shock of reality shatters. It is the illusion wherein one takes the height of happiness to consist of the conjunction of the following factors, as he puts it in his introduction (p. 6): “a tranquil life, abundance of everyday satisfactions, an interesting job, a good salary, sound health, a flourishing relationship, vacations one doesn’t soon forget, a bunch of sympathetic friends, a well-equipped home, a roomy car, a loyal and cuddly domestic pet, [and] charming children with no problems who succeed in school.” In short, it is the illusion that one could be happy while living a life of crushing consumerist boredom, where nothing disruptive ever happens—life as no more than survival: outliving oneself from birth, in effect.

As opposed to any such pseudo-happiness of mere security and consumerist comfort in sheer survival, real happiness comes only as a by-product of true living. In turn, real life in itself begins only in the deliberate choice, the decision, to engage fully with reality, whenever it does break through our numbing consumerist spell to strike us. When it does, it reawakens us to the realization that, as Badiou puts it later (p. 38), “existence is capable of more than self-perpetuation.” When the consumerist spell that has held us in thrall is finally broken, we reawaken to the awareness that real happiness is nothing but the feeling that accompanies true life—a life fully lived “even unto death,” as the Christian biblical formula has it, rather than just survived.

Faith in Trauma: Breaking the Spell

Faith Purified by Trauma (concluded)

In my second post of this current series on “Faith in Trauma,” I cited Jean Améry’s observation that the very denial of reality that is present in what he calls “Finalistic” religious or political faith gave believers imprisoned in the horror of Auschwitz a certain distance from the horrifying reality around them—a distance that actually increased such believers’ odds of survival. In contrast, non-believers, lacking such denial-based protection, were more nearly certain to be overcome and crushed by the horrific reality they so clearly saw surrounding them.

Améry’s observation can fruitfully be juxtaposed to a remark that at first glance appears to oppose it, a remark Alexander and Margarete Mitscherlich make in the 1970 afterword to the classic analysis of “the inability to mourn” they give in their 1967 book of that title (Die Unfähigkeit zu trauern, Munich: Piper). In that book the Mitscherlichs are addressing specifically the inability of the Germans as a nation to mourn the misdeeds of their own Nazi past, a past wherein they created such death-camps as Auschwitz—and above all their concomitant incapacity to mourn the millions of innocent victims they murdered there and throughout Europe. What the Mitscherlichs observe at one point in their 1970 afterword applies not just to Germans, however. It applies to everyone. “To endure reality as it is,” they write, “is the presupposition that first makes it possible to alter it into something more bearable; denial unwillingly preserves the status quo.”

Despite the appearance of opposition between the two remarks—Améry’s on the one hand and the Mitscherlichs’ on the other—they can and should be combined as follows:

When reality permits no hope for a better outcome beyond sheer survival, the denial of reality is necessary simply to preserve bare life itself—and with it the possibility of some day returning to true, full, and abundant living, rather than just surviving. As I already argued when first discussing Améry’s observation, that is really just an instance of the numbing against traumatic shock that allows those it strikes to live through it at all (the literal meaning of survive). However, if the survivor is not to be locked forever after into a pattern of compulsive repetition of the traumatic situation itself, at some point that survivor must grow strong enough at last to endure the very reality that has been thus denied. It is only then that any genuine recovery of real life, that is, life in its full sense and not just some endless survival, becomes possible, precisely as the Mitscherlichs observe.

Such faith to live a recovered life face to face with trauma can only be the sort of pure and purified faith Walter J. Ong attributes to Gerard Manley Hopkins, as discussed in my post before last. It is, as well, the simple but difficult faith of a woman freely consenting to bear a child, with no illusions about what that child itself may have to bear once born.

Such faith purified by trauma is true faith, not merely some defense mechanism.

Faith in Trauma: Breaking the Spell

Faith Purified by Trauma (continued)

In The Quotidian Mysteries: Laundry, Liturgy, and “Women’s Work” (New York and Mahwah, NJ: Paulist Press, 1998), Kathleen Norris, an American poet, best-selling spiritual writer, and Benedictine oblate, recognizes that simple, non-dramatic faith of the purest sort is actually as quotidian (“everyday”) as pregnancy—or, rather, as the free, un-coerced consent to that condition of a woman who, finding herself pregnant, decides to go ahead and bear her pregnancy to term.

In my earlier series of posts on “An Anxious Peace: ‘God’ After Auschwitz,” I quoted Emmanuel Levinas’s suggestion that the final meaning of Auschwitz may well just be “that God requires a love that entails no promise on his part,” one that undergoes “a suffering devoid of any promise, totally gratuitous.” Norris gives an unexpected undertone to that suggestion by writing: “At its deepest level the pregnant woman must find the courage to give birth to a creature who will one day die, as she herself must die. And there are no promises, other than the love of God, to tell us that this human round is anything but futile.”

Norris is not proclaiming any “pro-life” dogma that would deny women the right to terminate a pregnancy, if that is their choice. Nor does what she says entail that, once a woman discerns she is pregnant, the only truly courageous choice, “at the deepest level,” would be to continue the pregnancy. In fact, those who freely choose to abort a pregnancy often show no less courage in so choosing than do those who freely choose to embrace one. In some circumstances—circumstances, in fact, that were all too common in the United States not so very long ago—it often takes far more courage to abort a pregnancy than to continue it. When general social conventions, the expectations of significant others (husbands, parents, friends, etc.), the availability of proper medical resources, financial circumstances, and even laws, all militate against abortion, to choose to do whatever is necessary to obtain one can take great courage. Under such circumstances, it is going ahead and having the baby that is typically “the easier, softer way,” to borrow a phrase.

The damage that women in the United States before the Supreme Court decision in Roe v. Wade often inflicted on themselves because they were denied access to inexpensive legal abortions is well known, no matter how often certain segments of the American public might like to forget it. However, even beyond such grisly realities, the damage inflicted by coerced continuations of unwanted pregnancies—damage done not only to mothers but also, and above all, to their children—is incalculable. To put it mildly, a mother who does not really want a child, but who is pressured by laws or social norms into having one anyway, is not likely to be a very loving mother. She is unlikely to show her child the sort of love all children are entitled to receive from their mothers (and fathers, be it added). The fault for such sad states of affairs lies neither with those made unwilling mothers by being forced against their own desires and interests into carrying a pregnancy to term, nor with the children unfortunate enough to be born to such mothers-by-coercion. It lies with those who continue to derive selfish benefit from denying women the opportunity to live their own lives fully, by making their own choices for themselves, and then living with the natural consequences of their own decisions—rather than having arbitrary penalties imposed upon them by others, should they choose differently than those others want them to.

For some women under certain circumstances, choosing to carry a pregnancy to term does indeed take great courage “at the deepest level,” as Norris says. For other women in other circumstances, however, the greatest courage at the deepest level may be shown in choosing to terminate the pregnancy. In general, whether any choice or decision bears witness to courage and the faith that goes with it, or betrays cowardice and the lack of any real faith, is not a matter of what we might call the “content” of the choice. Put in terms from Aristotle’s Nicomachean Ethics, “choosing and acting rightly and well” is not a matter of choosing this over that. Rather, it is a matter of how one goes about doing the choosing, as it were. If one chooses rightly—that is, goes about the business of choosing in the right way—then in those particular circumstances one has made “the right choice”—regardless of what one chooses.

When it comes to choice, there’s always a choice. No one choice fits all.

Faith in Trauma: Breaking the Spell

Faith Purified by Trauma

At the end of his 1986 book Hopkins, the Self, and God (University of Toronto Press), Jesuit priest and scholar Walter J. Ong addresses the sort of Christian faith to which the life and work of 19th century poet Gerard Manley Hopkins bears witness. Like Ong, Hopkins was a Jesuit priest. He was also an exact contemporary of Nietzsche. Both were born in 1844 and both entered into darkness in 1889—the darkness of the grave for Hopkins, that of the madness in which he spent the last eleven years of his life for Nietzsche.

Ong finds in the poems, prose, and letters of Hopkins a “forthright” view of Jesus’ crucifixion, one in which there is no weakening of the suffering and failure involved. That includes any weakening of that suffering and that failure through any consoling idea that what Jesus was working for would somehow still finally be accomplished even after his own death on the cross. That is, there was nothing like what, a century after Hopkins was born, allowed orthodox Marxists to find consolation, even in the face of imprisonment and death in Auschwitz, in the sustained conviction that the eventual victory of communism remained inevitable. In the view of Jesus on the cross that Ong finds in Hopkins, there is no such reality-weakening faith in play. Rather, by the “forthright” view Ong attributes to Hopkins, “[t]he truth was that what Jesus was working for, what he had planned, turned out a total and spectacular failure.” In confirmation of that interpretation, Ong quotes from a letter Hopkins once wrote to his friend Dixon:

His [Jesus’] career was cut short and, whereas he would have wished to succeed by success—for it is insane to lay yourself out for failure, prudence is the first of the cardinal virtues, and he was the most prudent of men—nevertheless he was doomed to succeed by failure; his plans were baffled, his hopes dashed, and his work was done by being broken off undone. However much he understood all this he found it an intolerable grief to submit to it. He left the example: it is very strengthening, but except in that sense it is not consoling.

Ong expands upon that passage by remarking that, in Christian teaching as Hopkins understood it, “God the Father had let Jesus’ ‘career’ work out as a failure not to cancel out the failure later but because he had plans about the consequences of the failure. The failure was never cancelled out and never will be,” regardless of whatever subsequent history—or the supposed end of it—might bring.

A faith purified by trauma, which is to say a faith that no longer avoids or numbs itself in the face of trauma but instead opens to it, can only be the sort of clear-eyed faith that Ong sees in Hopkins. It is not anything like a faith in “pie in the sky by and by,” as one popular put-down of reality-weakening religious faith puts it—no sort of defensive certainty that everything will prove to have been for the best in the end, when the whole story finally gets told, and the mysterious ways of God are at last made clear. Central to Hopkins’ sort of “forthright” Christian faith, a faith that faces trauma, rather than denying it, is the insistence that the wounds will always remain open, even in Christ’s resurrected body.

A faith that has been purified by trauma need not prove itself in dramatic acts that command attention. Instead, such faith is one that carries itself out in the fidelity (which is what faith is all about, after all) demonstrated by the daily living out of a life fully open to traumatic reality. In an important sense, there is nothing complex about such faith. It is very simple and straightforward. Despite that, it remains demanding and difficult.

The real difficulty lies precisely in the fidelity—what St. Paul in his letters calls the “perseverance”—required for keeping such faith. The hard part is remaining faithful day after day in a life fully lived, and therefore lived in full exposure to the suffering that all true life entails. Yet however difficult the ongoing keeping of it may be, manifestations of such faith are really not all that rare. One does not have to have any special gifts, such as Hopkins’ for poetry, to keep such faith. It can be, and often is, kept faithfully in the daily life of the most ordinary people—a point I will continue to explore in my next post.