Can We Mourn Yet?


The more time passes, the more difficult it becomes to acknowledge these mistakes.

—Dan Jianzhong, Beijing sociologist, concerning the Chinese Cultural Revolution, which began 50 years ago, in 1966 (quoted by journalist Chris Buckley in “High-Level Commentary Breaks Silence in China,” The New York Times, 5/17/16)


Mourning involves living in a world totally not of one’s choosing. It’s a world of paradoxes: a world that one doesn’t want to live in, but doesn’t want to die in either.

—Charles W. Brice, poet and retired psychoanalyst (personal communication)


One thing has been made very clear to me. Many people resent being confronted with information about how racism still shapes—and sometimes, ruins—life in this country.

—Jenée Desmond-Harris, “The Upside to Overt Racism” (The New York Times, 5/1/16)


Whoever, so as to simplify problems, denies the existence of certain obligations has, in his heart, made a compact with crime.

—Simone Weil, The Need for Roots (Routledge Classics, 2002; Fr. orig. 1949)

In general, it is no doubt right to say that the difficulty of acknowledging past mistakes increases with time. However, when those mistakes carry traumatic consequences, the more time passes the greater grows the urgency to do just that, to acknowledge them—and, even more, to set them right. Trauma, after all, has its own time, growing ever more insistent the longer it goes unaddressed, repeating its demands more and more loudly until they are finally heard, and elicit a proper response. Sooner or later, trauma’s time will come. Sooner or later, we will be able to mourn. We can only hope that the day for our mourning will come this side of Judgment Day, the eschatological end of days as such. However, there are reasons for pessimism on that matter.

Perhaps the greatest obstacle that stands between us as a nation and the dawning of our day of national mourning is precisely the fact that, as I put it in my preceding post, we really are not “one nation, indivisible, with liberty and justice for all” except in our national Pledge of Allegiance. What keeps us from uniting in acknowledging and mourning the crimes that some of us have perpetrated on others of us (not to mention other nations or peoples), is that we who are perpetrators or their descendants continue to derive so much benefit from those same crimes. Those of us who have the privilege of thinking ourselves “white,” for example, continue to derive great benefits from that very privilege, including the benefit of being able to drive our cars around our cities without being stopped and harassed by the police for no better reason than our not being among those so privileged.

Some time ago I wrote here, in a two-post series on “The Unforgiveable,” about Auschwitz survivor Jean Améry’s stipulation of the conditions under which he would be willing to let go of what he called his “resentments” against the Germans as a people or nation. In brief, Améry lays out a two-fold condition for such a settlement to occur at the level of what he calls “historical practice.” First, a true settlement would require “permitting resentment to remain alive in the one camp,” the camp of the victims of the crimes. Second, and simultaneously, “self-distrust” would need to be first enkindled and then kept carefully alive “in the other camp,” the camp of the perpetrators—the very self-distrust engendered by the perpetrators’ awareness and acceptance of their victims’ resentment. Genuine reconciliation could occur only by allowing the wounds of the victims to remain open and acknowledged, while simultaneously opening and keeping open an answering wound of deep self-mistrust in the perpetrators. Only if that were to happen would “the overpowered and those who overpowered them [. . .] be joined in the desire that time be turned back and, with it, that history become moral.”

In the case of Germany and what it did during World War II, for that nation to awaken such self-distrust would require it to become, as Améry says, “a national community that would reject everything, but absolutely everything, that it accomplished in the days of its own deepest degradation [that is, during the Nazi years of 1933-1945], and what here and there may appear to be as harmless as the Autobahns.” Nor was Améry blind to the fact that the entire postwar German “economic miracle” that allowed West Germany to become the economic powerhouse of Europe was itself only possible on the basis of the devastation of Germany at the end of the war, which allowed for the sort of radical retooling that fed the postwar German economic machine. Truly to “reject everything, but absolutely everything, that it accomplished” through its own criminal acts of its Nazi period, Germany would have had to reject not only Cold War financial support through the Marshall Plan but also everything else that Germany’s own utter defeat made possible for subsequent German economic development. Of course, “nothing of the sort will ever happen,” as Améry already knew and insisted long ago.

Nor will the United States as a nation ever truly mourn its own crimes. For one thing, it will never truly mourn the genocide of American Indians on which America is founded. For various reasons, it is even less likely ever truly to mourn the centuries of enslavement of African Americans on which the United States as a whole—not just the South, but the entire country—built its unparalleled global economic might.

It recently made the news that Georgetown University in Washington, D.C., was largely built on funds it acquired from direct engagement in the slave trade. In one sense, at least, there’s really nothing new in such news. As has long been recognized, many foundational United States universities—Brown, Cornell, Harvard, the University of Virginia, and others—were themselves founded, either directly or indirectly, on the bodies of slaves. So were many other institutions, both North and South. Then, too, of course, the institution of slavery was built right into the Constitution of the United States itself.

If the United States were ever really to mourn slavery and its millions of victims, then at least at a bare minimum those of us who still continue to benefit from the consequences of slavery would need to let go of our resentment toward African Americans for their own ongoing resentment of those very consequences. We who are privileged to think ourselves “white” would have to grant those not so privileged the right to hold on to their resentment of us, and we would need simultaneously to match their resentment with deep, abiding self-distrust of ourselves, to borrow Améry’s way of putting the point.

Of course, nothing of the sort will ever happen, I know.

*     *     *     *     *     *

So where do we go from here?

Well, that question really calls for thinking.

Published on May 23, 2016 at 5:48 pm

Can We Mourn Yet?


[H]ow can the memory of the colonists be reconciled with the memory of the colonized?

—Sadri Khiari, “The People and the Third People,” in What Is a People? (trans. Jody Gladding, Columbia University Press, 2016, p. 99)

There is something questionable about lumping all veterans together as all equally deserving of honor. Many pointed just that out, to give one relatively recent example, when President Ronald Reagan accompanied West German Chancellor Helmut Kohl to a commemoration service at the military cemetery in Bitburg, Germany, thirty-one years ago this month, in May of 1985, to honor the German war-dead buried there. Unfortunately, the German veterans buried at Bitburg included even members of the Nazi Waffen-SS—despite the fact that the entire SS had been deservedly judged a “criminal organization” by the Nuremberg Tribunal at the end of World War II. Reagan’s remarks on that occasion in 1985 suggested it was time by then to let bygones be bygones. Even though some of those buried at Bitburg had served in a criminal organization of a regime that murdered millions in gas chambers, Reagan apparently thought it fitting, after so many years, to let all the dead be honored equally. Unfortunately, however, to honor the memory of murderers equally with the memory of those they murdered is to dishonor the latter—and to join the former, if only symbolically.

In the same way, to honor Confederate soldiers equally with Union soldiers and all other American veterans who died while serving in the United States military, as we have long done on Memorial Day, is to paper over the differences between the Confederacy and the Union. And in the process it is to dishonor the millions of African Americans who were sold into the very slavery over which the Civil War was fought in the first place. It is to forget their bondage and its toll of misery, and to forget who was responsible for it—which was by no means the Confederacy alone, be it added (a point to which I will return in my next post).

Forgetting also occurs under the appearance of remembrance when all U. S. veterans whatever are lumped together by robbing Armistice Day of its original significance and turning it into Veterans Day. That change involves the expropriation of the very day originally set aside in memoriam of what Woodrow Wilson was benighted or vicious enough (he certainly betrayed both character-traits) to call “the war to end all wars,” and appropriating it instead for the purpose of glorifying all U.S. military service of all times, even if that service consisted, for example, of dropping nuclear bombs on Hiroshima and Nagasaki, or torturing Iraqi captives, and not just such non-controversially good deeds as liberating Paris from the Nazis or inmates from Dachau. What happens in such appropriation is the erasure and expropriation of the suffering of all our wars’ greatest victims, just as honoring all German war dead equally, including even Waffen-SS, dishonors those millions of Germans who died while serving honorably in their country’s armed forces.

Remembering the dead is certainly a debt we owe them, one that should be honored and paid in full. Official memorializing of their deaths, however, is all too often a way of reneging on that debt, and failing to honor it. That is what almost always happens when memorializing is mandated by official state decree, as opposed to springing up spontaneously by popular action. The former is most often in the service of coercive power, helping to strengthen that power, or at least to maintain it. The latter expresses the desire to honor those who call out to be honored in remembrance.

In a riven society, memory is also riven. To be genuine, mourning must honor the rift. The discord must be heard and remembered, not drowned out and covered over, for real healing to occur. The wounds of division must be kept open. They must be acknowledged and mourned, if a truly single and united community—inclusive of all as true equals, rather than preserving privileges for some who remain always “more equal” than others—is ever to be formed among those who remain.

Whether “under God” (as Congress mandated only in 1954) or not, the society of the United States today is “one nation, indivisible, with liberty and justice for all” only in its own official Pledge of Allegiance. In reality, the society of the United States today continues to be riven by a variety of deep, longstanding social injustices it has never yet properly mourned.

Just this morning (May 16, 2016), The New York Times carried an article concerning one relatively recent instance of such a still unhealed wound of national division. The article addresses the controversy that is currently surfacing again as President Obama prepares to visit Vietnam soon, the third U.S. President to do so since the fall of Saigon to Communist forces in 1975. The rekindled controversy repeats the old divisions generated by the United States war in Vietnam in the 1960s and 1970s. We as a nation have yet to face and to mourn what we did in that war, and what it did to us.

Slavery, with all its consequences of ongoing discrimination and persecution against African Americans down to the present, is perhaps the most obvious other example, and the one I have discussed most in this current series of posts. Unfortunately, there are many other examples as well. The oldest one goes all the way back to the genocidal warfare on which this country was founded, the consequences of which continue to afflict American Indians to this day.

The list could be continued.

At any rate, what are the odds that we as a nation will ever be able to mourn such old yet still destructive divisions, and truly begin finally to heal them? The odds are not at all good, for reasons I have touched upon in my earlier posts in this series, and will address more directly in my next one, which I intend to be the last of this series on our own national capacity—or lack of it—to mourn.

Can We Mourn Yet?


One of the editorials by the editorial board of The New York Times on Sunday, April 3, 2016—the very same edition in which appeared the two articles I wrote about in my preceding post (one by Nicholas Kristof and the other by Margaret Sullivan)—was entitled “Race and the Death Penalty in Texas.” The editorial pointed to the overwhelming evidence demonstrating that the imposition of capital punishment in Texas discriminates against African Americans, and suggested that the death penalty both there and elsewhere in the United States should be abolished.

It certainly should be. However, that does not seem likely at present. One reason it is not likely is the current composition of the U. S. Supreme Court, which forty years ago reversed its own earlier judgment against capital punishment, permitting it again so long as it is not imposed in an “arbitrary or capricious manner,” as The Times quotes the Court saying in reversing itself. A deeper, even more intransigent factor is indicated by something else The Times itself says in its editorial: “Racism, of course, has been central to the American death penalty from the very start.” What The Times does not go on to say—but should have—is that for that very reason our national focus should not be on the death penalty at all, but rather on the racism that underlies and sustains it.

That, our national racism, is what we really need to eliminate. Otherwise, even if we as a nation were to side-step the Supreme Court and legislate the elimination of capital punishment, our real problem would still persist. In fact, it would in all probability just grow worse. We as a nation would all but certainly interpret the elimination of the death penalty as no more than the elimination of a lingering vestige of the racism we want to think we have already consigned to the past, rather than an ongoing crime we continue to perpetrate in the present. (In just the same way, according to numerous opinion polls, the majority of citizens of the United States were happy to convince themselves that the election of President Obama eight years ago proved that we as a nation had overcome racism.)

Capital punishment should be eliminated in the United States, and the election of our first African American President in 2008 deserves to be universally celebrated (regardless of what one thinks of him personally, or of the accomplishments of his Administration). However, neither eliminating the death penalty nor celebrating our first election of an African American President would prove the United States had overcome the racism that is such an ongoing national shame. Furthermore, by allowing us to pretend that we had already faced and overcome our racism, both would all too easily just harden our inability to mourn that racism, and its millions of victims past and present.

*     *     *     *     *     *

That brings me to yet a fourth article that caught my attention in the Sunday, April 3, edition of The New York Times. That fourth piece was also in the op-ed section. It was a column entitled “Why Slave Graves Matter,” by Sandra A. Arnold, whom The Times identifies as the “founder of the Periwinkle Initiative and the National Burial Database of Enslaved Americans,” preliminary submissions to which, she tells us in her article, she is currently processing. In her piece Arnold argues that “community preservation initiatives can contribute to healing, understanding and potentially even reconciliation,” then ends her piece with the following paragraph:

Our country should explore ways to preserve the public memory of enslaved Americans. Their overlooked lives are an inextricable part of the historical narrative of our country—and not just because they were “beneficiaries” of the 13th Amendment. We should remember enslaved Americans for the same reason we remember anyone: because they were fathers, mothers, siblings and grandparents who made great contributions to our nation. Regardless of our country’s history or our ambivalence about the memory of slavery, we can choose to remember the enslaved—the forgotten. They offer our contemporary society examples of resilience and humanity. Preserving their memory contributes to our own humanity.

Unfortunately, in this case, too, I remain a skeptic. Here, my skepticism is above all because everything depends on just how we go about doing our “remembering.” There’s remembering, and then there’s remembering.

One kind of remembering is that officially done on such occasions as Veterans Day or Memorial Day, when we all get together to put flowers on graves or watch parades, maybe even with banners admonishing us never to forget those who have sacrificed for our national good, sometimes even with their lives. Those two cases—Veterans Day and Memorial Day—are deserving of more attention, since they can be used as good examples of the hidden complexities involved in the whole mixing of remembering with the setting up of official memorials or days of remembrance.

Armistice Day was officially created to memorialize the day when the armistice that ended active hostilities between the Allies and Germany on the Western Front in World War I went into effect. The armistice officially went into effect at 11:00 a.m. on November 11, 1918—the symbolically significant “eleventh hour of the eleventh day of the eleventh month,” just a clock-tick of history away from the World’s midnight. However, in many countries it eventually became an occasion to remember not just veterans of World War I but also all military veterans whatever. Following that trend, in 1954 the United States officially changed “Armistice” Day into “Veterans” Day.

Similarly, the “Grand Army of the Republic,” a Union veterans’ organization founded in Decatur, Illinois, held the first “Decoration Day” to memorialize Union soldiers who died in the American Civil War. In former Confederate states after the war, there were also various celebrations, held on various days, to commemorate Confederate veterans who had died in the same conflict. In the 20th century, all such celebrations, North and South, were merged into the current “Memorial Day,” which was also extended to honor all Americans who died while in the military service, in or out of combat, not just those who died during the Civil War. Thus, unlike Veterans Day, which was set aside as the official U. S. holiday to honor all veterans, regardless of whether they died while in military service, Memorial Day was set aside specifically to honor only those who did die while so serving.

All too often, however, officially setting aside such days of remembrance—or officially setting up such memorials as The Tomb of the Unknown Soldier in Arlington National Cemetery (or even setting up that cemetery itself as the official cemetery to honor U. S. veterans killed in combat)—does not, regardless of anyone’s intention, really serve genuine remembrance at all. All too often in such cases, what looks like an endeavor to encourage or even mandate remembrance in reality ends up just helping whatever powers that be keep the public order that perpetuates their power, an order that actually has good reason to fear being disturbed by genuine, spontaneous, uncontrolled remembrance.

In my next post, I will address that issue.

Can We Mourn Yet?


That segment of the population [the privileged segment] wants to be surrounded by people with similar characteristics.

—Kevin Sheehan, former chief executive of Norwegian Cruise Lines, as quoted by Nelson D. Schwartz in “In New Age of Privilege, Not All Are in Same Boat,” the lead article on the front page of The New York Times for Sunday, April 24, 2016 (the 100th anniversary of the start of the Irish Easter Rising of 1916, be it noted)

In my preceding post, the first in this series on whether we can mourn yet, I wrote about two articles that appeared in The New York Times for Sunday, March 20, this year. This new post will also concern pieces from The Times, but from an even more recent issue.

The first piece is itself a sort of recent reissue of an older one. Two years ago, regular Times contributor Nicholas Kristof did a series of columns he called “When Whites Just Don’t Get It.” Then just a few weeks ago, in The Times for Sunday, April 3, he wrote a reprise called “When Whites Just Don’t Get It, Revisited”—a revisiting he said was necessary because “public attention to racial disparities seems to be flagging even as the issues are as grave as ever.”

“Why do whites discriminate?” Kristof asks in his recent reprise. “The big factor,” he writes in answer to his own question, “isn’t overt racism. Rather, it seems to be unconscious bias among whites who believe in equality [that is, “whites” who, when asked, say they believe in equality, even and especially, I will add, if they are just asking and answering themselves] but act in ways that perpetuate inequality.” Kristof then cites Eduardo Bonilla-Silva, whom he identifies as “an eminent sociologist,” and who “calls this unconscious bias ‘racism without racists.’” About such presumably covert racism, Kristof says, “we whites should be less defensive.” One reason, he adds, that “we whites” don’t need to be so defensive about our own lingering, unacknowledged racism, is that, in his judgment at least, such bias “affects blacks as well as whites, and we [all of “us,” presumably: “blacks as well as whites”] also have unconscious biases about gender, disability, body size and age.” Then a few paragraphs later he ends his column by writing: “The challenge is to recognize that unconscious bias afflicts us all—but that we just may be able to overcome it if we face it.”

How likely Kristof thinks it is that “we” will ever actually face the fact of such bias, he doesn’t say. Speaking solely for myself, I do not think it is very likely at all. Hence, I am equally skeptical that “we” have any real ability to overcome such bias.

*     *     *     *     *     *

Kristof’s remarks about how we all have such bias make me assume that what he means by that term bias is very broad. It would seem to cover such things as the simple uneasiness that we all have toward that which is different from us or unfamiliar to us. For example, if we grow up in a place where no one has red hair, and suddenly find ourselves visited by some red-haired stranger, then we will naturally tend toward being suspicious of, or at least not completely at ease with, our visitor, at least till we get to know him or her better: We will have an “unconscious bias” against any such red-heads, as Kristof seems to be using that phrase.

It is precisely with regard to unconscious “biases” of that perfectly natural and universal sort that our chances of coming to face them, and then perhaps even to overcome them, are best. However, if we turn to a different subset of unconscious biases, the odds against such change rise sharply. That applies above all to that subset of unconscious biases with regard to which our not knowing we have them is all too often because we do not want to know—those biases we have of which we do not just happen to be unaware, but which we actually have a vested interest, as it were, in keeping secret—secret even, and perhaps especially, from ourselves. At issue are those biases that we actually have a vested interest in maintaining, precisely because of all the benefits maintaining such biases brings us, at the cost of the very people against whom we do maintain them. That very self-interest then also strongly motivates us unconsciously to hide those unconscious biases from ourselves.

To give an example that is still of great ongoing importance, when it comes to racial bias in this country, it seems to me that, in general, the benefits from such bias are overwhelmingly weighted in favor of those of us who think ourselves “white,” as Ta-Nehisi Coates puts it in Between the World and Me (and which I have discussed in some earlier posts), rather than those of us who are not encouraged—if even permitted—so to think of ourselves. It directly benefits all of us who think we are “whites” to think that the rest of us, all the “non-whites,” are inferior to us “whites,” since that lets us “whites” keep on denying such supposed inferiors their fair share of the communal pie, so that we can keep on getting bigger slices for ourselves.

*     *     *     *     *     *

To give another example: It happens that in the same op-ed section of the Sunday issue of The New York Times in which Mr. Kristof has his column revisiting how, still, “whites just don’t get it,” there also appears another column, by Margaret Sullivan, who was then serving as the “Public Editor” for The Times (she’s since stepped down), called “Are Some Terrorism Deaths More Equal Than Others?” The answer editor Sullivan gives to that question is clearly in the affirmative, at least insofar as it comes to coverage of such deaths in dominant United States news media, including The Times itself. After devoting the first half of her column to various readers’ letters to her about the matter, Sullivan asks the most pertinent question, that of “why [there is] the persistent inequality that readers rightly observe?”

Her own answer to that question is four-fold. “Part of the answer,” she writes, “is access. It’s far easier to get a number of staff members to Paris or Brussels than, for example, to Pakistan [. . .] .” Next she addresses “another factor,” that of “deployment of resources,” of which she writes: “The Times has more than a dozen correspondents working in bureaus in Western Europe; far fewer, for example, are based in Africa.” As a third factor, according to her, “there is a legitimate question of newsworthiness. News is, by definition, something out of the ordinary. In some places, like Iraq, the tragic has become commonplace.” She then gives Egypt as another example (besides Iraq), citing a former Times correspondent stationed there who says that, while it used to be that “a bombing in Cairo would have been ‘a shock’,” that is no longer the case. Today, as the former correspondent says, “We can’t cover every attack there.” Finally, Sullivan cites as a fourth factor “the relationship between the United States and the country where an attack takes place.” In effect, she is saying that since France, for example, is important for our own interests (and, we might add, we even feel fondness for the French at the moment, a moment when it is no longer de rigueur for all good United States patriots who want to be politically correct to substitute “freedom fries” for “French fries,” and to call attention to themselves for doing so), we pay more attention to what happens there than in some place that has far less strategic importance for us (such as, say, Somalia or Haiti) or that we don’t like so much (such as, say, Finland or Indonesia).

Sullivan then draws her piece toward its end by patting her own employer on the back, writing that she is “glad that Times journalists recognize the need to reflect the importance of all human life lost to terrorism—whether it happens in a place where we Americans [by which she means United States citizens in good standing, of course] may have gone sightseeing [if we’re fortunate enough to be part of the minority of the United States population that can afford global tourist-travel] or one we will probably never set foot in [probably because the amenities there are not up to our standards for “exploring the world in comfort,” to borrow a slogan from Viking River Cruises]. And regardless of whether the victims seem ‘like us.’” In her following, final paragraph Sullivan concludes by writing: “Because, in fact, they surely are”—by which I assume she means that, even if some other people don’t “seem” so, all people really do turn out, upon thorough enough investigation, to be “like us.” That assumption is confirmed by the rest of her final paragraph, where she writes: “And it’s part of The Times’s journalistic mission to help its readers not only know that intellectually, but feel it in their hearts.”

*     *     *     *     *     *

I find that I am even more skeptical in the face of Margaret Sullivan’s apparent optimism that her employer is fulfilling a high “journalistic mission” than I am in the face of Nicholas Kristof’s apparent optimism that those in the United States who most need to face and change their “unconscious biases” will ever do so. I have already given one reason for my skepticism in contrast to Kristof’s optimism, a reason that has to do with how, for some of us, such biases are too deeply grounded in preserving our own privileges.

Among my reasons for skepticism about Sullivan’s optimism, I have one that is similar, which is this: Among the factors Sullivan lists to account for the “persistent inequality,” in dominant news sources such as The New York Times, of coverage of “terrorism deaths” in diverse places, she nowhere even mentions the factor of profit. But after all, what really accounts for the four factors she does address—the factors of “journalistic access, deployment of resources, and the admittedly subjective idea of what’s newsworthy,” as she summarizes her account (leaving out, for some reason, the fourth factor she mentions, that of being more concerned about deaths in nations that are of more strategic importance to our own national self-interest than deaths in nations with less such importance)—being factors in the first place? One need not even be as cynical about such things as I tend to be to suspect that the reason for those reasons themselves is above all because it is far more profitable to The Times to keep things that way, rather than to face its own biases, let alone change them.

Nor is that all. I have other grounds for skepticism. Indeed, even in the very same Sunday edition of The New York Times that contains Kristof’s and Sullivan’s two columns, there are two more articles that remind me of those grounds. In my next post, I will turn to those two remaining pieces from that morning’s Times.

Mourning and Celebration: Embracing Our Dear Departed

This is the second in a series of three posts occasioned by the death of Osama Bin Laden.  I dedicated the first post to the students in the undergraduate Existentialism class I am currently teaching.  Toward the same end of rendering credit (or blame, as the case may be) where it is due, I dedicate today’s post, the second of the three, to the students in my current seminar on the later writings of Heidegger.

*  *  *  *  *

Mourning and Celebration:  Embracing Our Dear Departed

Which dead are mine, among all the dead?  Must I not first identify my dead, before I can properly mourn them?  And, once identified, do not my dead, in the proper mourning they require of me, also require that even in my very mourning itself I never forget to celebrate the lives they lived, and sacrificed for me, that I might live in turn?  Do I not owe my dead such celebration in my mourning, owe it even to those among my dead who died too young, before having lived much at all—such as my cousin, youngest of my mother’s nephews and nieces, who died of leukemia when she was only 11, and I was near the same age?  Don’t even those of my dead who died before they’d been properly born at all—such as my father’s first son, who died in being born of his mother, my father’s first wife, who also died at the same time, at that same child’s childbirth—ordain such celebratory mourning and mournful celebration?

Which dead are truly mine?  And how am I to mourn them?

*  *  *

Those were the sorts of questions already on my mind on the recent morning of Monday, May 2, 2011, even before I opened that morning’s newspaper and found out that Osama Bin Laden was dead. They were on my mind because just the day before I had reread Jean-Paul Sartre’s play The Flies—written and first performed in Paris under the German occupation during World War II, and a ringing call for resistance against oppression. I had been rereading the play in preparation for teaching my first class of the coming week (as I explained in my preceding post), for which The Flies was the reading assignment. One of the questions Sartre raises in the play is precisely that of whom we should mourn and how, and I was already planning to discuss those aspects of the play with my class.

Rereading Sartre’s play had thus already reawakened my concern with the question of proper mourning, so that it was on my mind the morning I learned of Bin Laden’s death. But the day before, I had also done something else that affected how the news of his death struck me when I opened the paper that morning. That evening, my wife and I had watched a DVR recording of the 60 Minutes broadcast from a bit earlier the same evening. In fact, it was precisely because we were watching that recording rather than live TV, which we might otherwise have been watching, that we did not learn of Bin Laden’s death until the next day. While news of that death was first being released by the White House and then quickly finding its way into circulation through the mass media, my wife and I were watching our recording of 60 Minutes; then we went to bed for the night, not to learn that Bin Laden was dead until the next morning.

My wife and I had both been especially affected by one particular segment of 60 Minutes, a lengthy interview with television journalist Lara Logan about her horrendous ordeal earlier that same year, on the evening when she was subjected to vicious and brutal sexual violence and violation at the hands—literally—of some in the crowd that filled the streets of downtown Cairo in wild celebration of the success of the popular uprising that had that very day succeeded in overthrowing Mubarak. Lara Logan was with her camera crew in the midst of that wildly celebrating crowd when she was suddenly and repeatedly sexually assaulted and brutally raped by multiple perpetrators. She survived the ordeal only thanks to the eventual intervention of a group of Egyptian women, almost completely veiled in accordance with the tradition to which they belonged. Risking themselves regardless of any tradition, however, these women actively and directly intervened on her behalf. They literally reached out and took her into their own hands, taking her out of and away from the hands of her captors, rapists, and would-be murderers.

Though undergoing the interview was obviously very difficult for her, given how recent the trauma still was, Lara Logan courageously revealed her deep personal woundedness as she told, in clear, linear, narrative fashion and in all its deeply disturbing details, the story of what had happened to her. In the process she explained why she had decided to do the interview despite that difficulty. She had agreed to it, she clearly and emphatically insisted, for the sake of all female journalists everywhere, who are constantly put at risk of the same sort of abuse she suffered, for the simple reason that they are women, and who, compounding the abuse, are almost never granted a forum for articulating and protesting their situation. Precisely because she had herself publicly endured such devastating, degrading abuse, however, Lara Logan had been placed in a unique position. That position allowed her—and her own responsibility demanded of her—to give her voice at last to all those otherwise still voiceless women. All those women spoke with one voice in her deeply wounded, often breaking, singular voice—a voice that carried the unquestionable power and authority given it by all the bitterness, profundity, pain, and horror of the agony she was made to undergo at the hands of her assailants in the crowd at Cairo that night of the celebration of the toppling of Mubarak.

With regard to that last, Logan also went out of her way in the interview to insist that the Egyptians who crowded into the streets of Cairo to celebrate the newfound freedom they had won in their triumph over Mubarak and all his vestiges of coercive power were by all means right to celebrate.  What they had done for and by themselves deserved to be celebrated, she insisted, and nothing in or about her own horrendous ordeal said anything to the contrary.  Indeed, as she also indicated herself, all around the world men and women of any decency celebrated with and for Egypt and the Egyptian people that night.  She did so herself, and still felt the same way, as she strongly affirmed in the interview.

Immanuel Kant said much the same thing about the French Revolution. Kant remarked that, despite all its excesses, which he and other decent persons had to reject and bemoan, what lay at the very heart of the French Revolution nevertheless deserved universal approbation: the unqualified, unreserved assertion and affirmation of universal human freedom, universal human equality, and universal human solidarity. All human hearts still capable of beating at all had to beat a bit more quickly when the French Revolution happened to them—that is, when news of it reached their ears—as Kant says his own heart did, in joyful celebration.

Both Immanuel Kant and Lara Logan are indisputably right in what they say. There is a corollary, however, that neither expressly states—perhaps simply because it is so obvious to both that neither ever thought to state it explicitly. The corollary is that the same human heart that beats a little faster in celebration when it first hears of the French Revolution also necessarily beats a little faster each time it hears of that same Revolution again, however many times it may have heard of it before: each time the still-human heart hears of that Revolution, that heart leaps again in celebration. That is true, at least, so long as the heart has not grown so jaded that it can no longer hear what it is being told when it is told yet again of the French Revolution. Indeed, that leap of joy is rekindled yet again, however faintly but still truly, even—perhaps especially (there are certain reasons for thinking so, at any rate)—when it is the heart itself that reminds itself of that Revolution, calling it back to mind, remembering it.

In that sense, which I would say is the single most important sense, whenever anyone anywhere recalls the French Revolution—reminds herself or himself of it, remembers it—the French Revolution happens again, in and as the very leap of the heart in celebration at the merest memory of that glorious event. Then once again, yet literally re-newed, made new again—so that no matter how many times it happens, every time it happens again it happens for the very first time—the French Revolution happens. In that same sense, “1789” is not a year that, though it may once have been, is no longer, and that with each “new” year retreats one year more distantly into the distant past.

Not only may the future be “now”; in a certain crucial sense it must be. So must the past. And in the past that is still now—that past that, as William Faulkner famously said, is never dead, is not even past—“1789” is not a year that was. Rather, this year, the very “calendar year” 2011, is still “1789.” The time of such events, the real events of a real history—all that finally counts once all the counting and recounting is over—does not fly by with the ticks of the clock, like the dead time in which nothing ever really happens and there is never anything new under the sun. Rather, in the real time of real history, all years are simultaneous, and every year is every other. That is the time “it is” eternally—eternal time, when, regardless of what year the chronically still-born clock of the calendar may say it is, it is always really still “1789,” but also no less “1776,” and “1812,” and “1848,” and “1914,” and “December 7, 1941” (that “date which will live in infamy”), and “May 1968” (in Paris, in the spring), and, for that matter, “September 11, 2001.”

To “mourn” means to keep the dead alive in memory. That does not in the least mean keeping little pictures of our “dear departed” in lockets on chains worn around our necks, or in family photo albums, or in supposed memory-images in our supposed minds or our demonstrably convoluted brains. Not that there is anything wrong with such things—with lockets, albums, images, engrams, or the like. But to cling to such images, as though to lose them would be to lose our memory of the dead themselves, is one sure way to bury our dead beyond recall, substituting an idol for the holy, an illusion for reality. If we so treasure our images of the dead that we lapse into such a substitution, then what we are doing is not mourning at all; we are avoiding mourning, like the father in the sort of story Freud liked to tell about his patients: the father who shows no signs of grief when his wife dies, but who later breaks down sobbing helplessly when the pet hamster to which he has devoted himself, to avoid having to face his real loss, gives up the ghost.

The verb mourn derives, according to my dictionary, from the Middle English mournen, which itself derives from the Old English murnan. The latter, my dictionary further informs me, is akin to the Gothic maúrnan, to be anxious, and derives from the hypothetical Indo-European base (s)mer-, to remember, think of, whence come, for example, the Sanskrit smárati, (he) remembers, and the Latin memor, mindful of. That etymology just reinforces the reality of what the verb mourn still says today, if we just let ourselves have the ears we have been given to hear it: to mourn is no more and no less than—because it is the same as—to stay mindful of whatever and whomever we have lost, that is, to stay mindful of our dead.

Properly to mourn our dead, then, is ever to remember them, to be ever mindful of them, never to forget them. That is the mourning we owe the dead, and that they demand of us: that we never forget them. Thus are we told never to forget those who died in “Auschwitz”—that is, the Jewish victims of the Nazi “final solution.” So we are told, too, each September, when bumper stickers and window decals remind us we should “never forget” those who died on “September 11, 2001.” Thus does even the psalmist sing in Psalm 137 (136), verses 4-6 (Grail translation), adding notes and tones that underscore both the seriousness of our responsibility to remember and what we will deserve if we don’t:

O how could we sing
the song of the Lord
on alien soil?

If I forget you, Jerusalem,
let my right hand wither!
O let my tongue
cleave to my mouth
if I remember you not,
if I prize not Jerusalem
above all my joys!

To mourn truly—that is, fully and deeply and with full propriety, as only accords with the heavy debt of mourning we owe to our dead—is to keep ourselves ever mindful of them.  That, however, means that we owe it to our dead to keep the wound of our pain at the loss of them to death open, keeping ourselves vulnerable to that pain and that wounding.  That is the mourning the dead require of us, and not the cherishing of any images or other idols we are tempted to make of them.

To mourn our dead is to hold to our ongoing pain at our loss of them to death, to hold to that pain as what itself holds us to them, by a bond unbreakable even by death itself. To mourn our dead is to refuse to be “reconciled” to their absence, to refuse to “get over it,” as all the well-intentioned folk keep telling us we inevitably will. They tell us that we will, with time, be “consoled,” and will then find ourselves again able to get on with our lives “despite” our presently shattering pain at our loss and our present bereavements. “Time,” we are reminded, “heals all wounds.”

Our mourning itself knows better. What it dreads and eschews as the worst thing that could possibly happen is to be deprived of the sharpness of our pain at the death of those to whom we are bound, so tightly that even the grave cannot unbind us. Like love, of which it is really a modality, our mourning is stronger than death, and reaches beyond the grave. It recognizes that even to want to “get over” our pain at our loss is already to betray the dead, not to honor them. It is to let them fall out of memory and be forgotten, rather than cherishing the memory of them and never forgetting them, not even if we forget their names—and even our own. Mourning refuses such betrayal of the dead, and insists upon remaining un-consoled and un-consolable. Mourning recognizes as its greatest enemy that time which would tempt us into “healing” our wounds rather than holding to them, as the dead indebt us to do—that dead, chronic time of the clock, of Chronos, who devours his own children—and it scornfully rejects any suggestion of eventual “reconciliation.”

In such defiance of all messages of consolation, and in revolt against any movement toward reconciliation, mourning knows with clear, unshakeable certainty that our very pain itself is our memory of our dead, our holding on to those whom, as we have vowed, we owe it never to forget. Mourning insists upon keeping the wound of the death of our dead open, the pain intense, because anything less or other than that would be blasphemy against the dead, the uttering of which would enact our own final, irrevocable self-condemnation.

Our bereft pain, with its irrevocable rejection of all reconciliation with the brutality of the death of our dead, is itself what keeps our connection to our dead, binding us forever to them, keeping the memory of them always alive in our hearts. It is our very pain, our always still-open wound, that unites us with all our dead. For that very reason, mourning as such—and not just sometimes or in some cases—is itself, without ever ceasing to be anything but pure mourning, celebratory: all mourning is, as mourning, already celebration.

My dictionary tells me about that word, too. It tells me that celebrate comes from the Middle English celebraten, from the Latin celebratus, the past participle of the verb celebrare, which means to frequent, go in great numbers, honor, and which itself derives from celeber, meaning frequented or populous. Keeping that derivation in mind, we may say that “celebrating” is joining with others in honoring what is honorable—what deserves to be honored. In short, celebrating is joining the crowd at the celebration. Furthermore, the greater the honor due what is honorable, the larger grows the community of those who should render it honor—the greater, that is, grows the crowd.

All mourning, whatever form it may take, from dancing a jig at a drunken Irish wake, to gnashing one’s teeth and tearing one’s hair in the agony of one’s loss, is celebration.  That is because in mourning—all mourning—we join the crowd.  We join, in fact, the crowd of crowds, that crowd than which no greater crowd can be, nor even be conceived, because it is a crowd literally without end, a crowd the number of which is beyond all counting.  In mourning we join, by our pain itself, that pain which is our very bond, our very ligature or junction to those for whom we mourn, the throng that travels ceaselessly along the most frequented, most densely populated road of all—the only road along which absolutely all human beings without exception must travel together, even and especially because each must travel altogether alone.  That is the road of death, of our mortality.

In mourning, all mourning of whatever kind, we join the largest crowd of all, and therefore join into the celebration of all celebrations:  the crowd of all the living, and the dead—living and dead both named and unnamed, known and unknown, but always all alike in the final, inexorable, holy anonymity of the grave itself.  In mourning, the soul swoons in celebration, in adoration.

At the end of “The Dead,” the long story with which he, in turn, ends Dubliners, James Joyce describes the experience of his character Gabriel Conroy, the narrative center of the story, as Gabriel sits looking out the window of the Dublin hotel room where his wife is already asleep in the bed beside his chair. It is the Christmas season, the end of a day and a night during which Gabriel has had his own solitude and mortality unexpectedly revealed to him in what would pass by all regular accounts as thoroughly commonplace, trivial events, but whose accumulation finally shatters Gabriel and his complacency completely. As he sits in the darkness and the silence beside his sleeping wife, Gabriel gazes out the window at the snow gently falling outside. Joyce brings “The Dead,” and with it the entirety of Dubliners, the collection of stories wherein he tells “the moral history of his community,” to an end by writing: “His soul swooned slowly as he heard the snow falling faintly through the universe and faintly falling, like the descent of their last end, upon all the living and the dead.”

In mourning our own dead properly, we, too, let our own souls swoon, as we join that same community—the one and only universal human community of “all the living and the dead.”

As for just what all this has to do with the death of Osama Bin Laden, I will try to answer that question at the start of my next and final post in this series of three occasioned by his death.