Tuesday, July 22, 2014

Choosing life in the face of death

A transcript of my remarks at Congregation Torat Emet in Bexley, Ohio, on July 17, 2014.

I never wanted to be known for having a fatal disease. But you don’t get to choose your reputation any more than you get to choose your fate. Several years ago terminal cancer called to me and I answered Hineni, “Here I am.”

The religious language may seem blasphemous, as if I were claiming to be a prophet, but that’s not what I mean at all. What I mean is Hashem places you in your circumstances, and even the most ordinary of persons can discover his unique role in life, his calling—he can help to complete creation—if he recognizes and accepts where he has been placed.

Etty Hillesum, a 28-year-old Dutch Jew who voluntarily reported to the Westerbork transit camp in 1942 to work in the social-welfare department there, explained her reasons like this:

This much I know: you have to forget your own worries for the sake of others, for the sake of those whom you love. All the strength and faith in God which one possesses must be there for everyone who chances to cross one’s path and who needs it. . . . You must learn to forgo all personal desires and to surrender completely. And surrender does not mean giving up the ghost, fading away with grief, but offering what little assistance I can wherever it has pleased God to place me.[1]

I was diagnosed just before Sukkot in late September of 2007. My doctor phoned to say that an “opacity” had shown up on my chest X-ray after a routine physical exam. A biopsy at Methodist Hospital in Houston about ten days later revealed Stage IV metastatic prostate cancer with a Gleason score of nine, meaning the cancer would be extremely difficult to treat. I was given one to three years to live.

I hungrily compared myself to other men with the same cancer—the literary critic Anatole Broyard got 14 months, the rock musician Dan Fogelberg three-and-a-half years, my friend and mentor Denis Dutton, editor of Philosophy and Literature, two years—and so I was vigilant for death, although I never knew when it would arrive. Naomi and I planned our lives as if the cloud of uncertainty were not hovering above us. We moved to Columbus and joined Torat Emet in August 2010, hoping that my cancer would remain dormant. By spring, however, it had awakened from its slumber and begun to spread again. By last fall the cancer stopped responding to drugs and invaded my bone marrow. I began palliative chemotherapy, to improve my quality of life, and I was taken under the wings of hospice care. It is now just a matter of time.

The facts are vulgar, and perhaps even a little tedious. This year some 233,000 men will be diagnosed with prostate cancer, and about 29,500 will die of the disease. Between diagnosis and death, however, many cancer patients linger for a number of years. [My wife] Naomi points out that the term life-threatening disease is not always appropriate. A lot of patients like me have what should be called a life-limiting disease.

That is, for many people cancer has become a chronic condition. The biblical span of their lives—seventy years, and if with strength, eighty years—has been limited, but so too has the scope of their lives, what they can do and can’t do any more because of their cancer.

Because of my hip, which has been destroyed by cancer, I can’t play catch or one-on-one basketball with my three boys. I can’t pick up Mimi; I can’t dance with Naomi. Perhaps most unhappily for me, I can no longer travel. I have never been to the state of Israel, and now I will never go.

But here, here on the downslope of life-limiting disease—here exactly is where I can offer a little assistance, since here is where God has placed me.

I can remember exactly when everything changed for me. It was more than six years ago now. We were still living in Houston. I was sitting in the back bedroom, rocking in a rocking chair between cycles of aggressive chemotherapy, and I was struggling to read some hefty book that would have caused me no trouble in my pre-cancerous days—The Adventures of Augie March, I think it was.

Chemotherapy had left me with “chemo brain,” a state of mind in which everything was fuzzy and no idea ever wandered. I could not make any sense of Bellow’s book. I felt profoundly sorry for myself. “Oy, I can’t think any more,” I moaned; “I can’t think any more.” Suddenly I stopped rocking. “Hey, wait a minute,” I said; “that’s a thought.”

From then on I decided that, if I could no longer think as sharply as I once did, I could still think. If I could no longer play with my boys as I once did, I could still play with them. If I could no longer be married to Naomi “forever,” as I once promised, I could still be married to her for as long or short a time as remained to me.

Since then I have become something of a public advocate for the view that even a person with terminal cancer, for whom it seems as if only death is real, can nevertheless choose life. As I wrote in a recent essay called “The Mercy of Sickness before Death”:

Hope is not . . . what the terminal cancer patient needs. What cancer patients need more than anything is to take responsibility for their disease. From their doctors, from their family and friends, and especially from themselves, they need simple honesty about their condition, their treatment options, their chances. They require exactly what [anyone] requires if he is to grow as a human creature: the “square recognition of his being as he is, without minimizing or exaggerating.”

Responsibility, honesty, facing reality—if my oncologist is to be believed, most cancer patients find these very difficult to achieve. Denial and despair are the more usual long-term reactions to a diagnosis of terminal cancer.

But denial and despair are merely refusals to accept the responsibility of finding, under the sign of death, a new purpose and meaning to life. Denial and despair are rejections of what the great American Catholic writer Flannery O’Connor calls “one of God’s mercies.”

In what way, though, can a diagnosis of terminal illness and a long sickness before death possibly be merciful?

Some of you know that Naomi’s and my brother-in-law Scott—her younger sister’s husband—died last year just six months after being diagnosed with multiple myeloma. By comparison, living for six-and-a-half years with a slowly wasting disease is a lenient sentence, even if it is a death sentence.

But there is more to God’s mercy than that.

On the same day I was diagnosed with cancer, the same day, Naomi learned that she was pregnant with our fourth child—our only daughter, Mimi. The coincidence was a miracle. Both Naomi and I saw God’s hand in it. It was as if God were saying, “I have set before you life and death, blessing and curse, the birth of your daughter and your death from cancer. Now, therefore, choose your daughter—she is my blessing—choose life.”

Last Shabbes, my eye was snagged by David’s lines from Psalm 30, the shir for the dedication of the Beit HaMikdash, which we recite before Pesukei d’Zimra. The Psalm has long baffled our commentators, because nothing within it has anything whatever to do with the Mikdash. Instead, it is David’s reflection upon life-threatening illness. He cries out to God:

Mah betsa b’dami,
b’ridti el-shahat.
Hayodkha afar,
Hayagid hamitekha.


What profit is there from my death,
from my descent into the pit?
Can the dust praise you?
Can it declare your truth? (v. 10)

What do these lines have to do with the Mikdash? The answer is this. For David, the Temple represents—and for me, who finds such comforting warmth within it, shul represents—an ascent from the pit, a respite from death, the opportunity to praise Hashem and luxuriate in declarations of his truth.

What a mercy it is to have that opportunity!

It is also the opportunity to prepare oneself for death, to lie back in the love of your friends and family, to live in absolute spiritual freedom.

This, by the way, is why I hate being advised to “fight” my cancer. I am angered by obituaries which say that so-and-so “lost his battle” against cancer. It’s bad enough the military metaphors imply that those who die of cancer have put up a weak and pathetic fight, as if they were sad sacks like the Polish Army overrun by the German Wehrmacht during World War II.

But what is worse, to seek to “fight” my cancer is to struggle fruitlessly against physical necessity. There is nothing I can do to fight my cancer. It is going to kill me, and within the next few months. To rage against the verdict is a waste of my inner resources. It is another form of denial.

But if the language of “fight” and “battle” is not the right language, what is? What should people say to terminal cancer patients?

The frum [Orthodox Jewish] impulse is to say Refuah shalemah, “may you have a complete recovery.” But this is hardly fitting for someone, like me, for whom there is no refuah, no recovery.

Not knowing what to say, then, many people say nothing at all. Oh, they will tell themselves that they wish to spare me, because they are afraid of saying the wrong thing, but the truth is they are only sparing themselves. It is a real problem, what to say to the dying, but the problem is not solved by not solving it.

What I have found most consoling is the knowledge that my wife and children will be looked after—they will not be left alone, even after I leave them. The best thing anyone has ever said is what Joni Schottenstein said to me: “Don’t worry, David. I already have someone picked out for Naomi.”

The quiet and firm assurance Joni’s husband David gives me, that he will be my sons’ surrogate father whenever they need him, silences my deepest fears. Kenny Steinman and Rafe Wenger decline the obligation of pulling long faces and being solemn—they treat me as if I still have a sense of humor and might still enjoy the human comedy. A friend who is a music critic [Terry Teachout], hearing that I was too beaten by chemotherapy to do more than listen to music, recommended the blues singer Jimmy Rushing, who lifted my spirits like Mimi’s butterfly kisses.

The thing to remember is what Naomi and I have learned from this six-and-a-half-year journey: life is not a matter of peak experiences, of amazing sights and even more amazing thrills, but of small pleasures—a good meal, a good book, good company, good conversation. Right there is where life needs to take hold of the gravely ill again.

We who are dying need from you what we should be demanding from ourselves—responsibility, honesty, the courage to face reality squarely. It matters less what you say to us than how you talk to us—face-to-face, as Moses spoke with God. And after all, who knows but that you might be the one, by your kindness and faith, to give us the strength to choose life in the face of death?
____________________

[1] Etty: The Letters and Diaries of Etty Hillesum, 1941–1943, ed. Klaas A. D. Smelik, trans. Arnold J. Pomerans (Grand Rapids, Mich.: William B. Eerdmans, 2002), pp. 477–78.

Friday, July 18, 2014

Again, he is very old

Note: Last night my synagogue, Congregation Torat Emet in Bexley, Ohio, held a tribute in my honor. Several friends spoke, and several more friends, who could not attend the evening, wrote small things about me. The most original—far and away the most amusing—was written by my former student Michael Schaub, one of the best young literary critics in the country. A regular contributor to NPR whose work has also appeared in the Washington Post, the Los Angeles Times, the San Francisco Chronicle, and other journals, Michael graduated from Texas A&M University in 1999. He now lives in Austin.

by Michael Schaub

I first met David in 1996, when I enrolled in a class he was teaching about the American novel at Texas A&M University. David, who is a master of self-deprecation, would argue that this was the first of many bad decisions I made as a new adult. He would be wrong, however. I had already made several bad decisions that year. But taking a class from David was the only one that didn’t end with the involvement of police officers or doctors. And it was the only one that worked out in the end. Despite David’s best efforts—he mocked my politics; rolled his eyes at my brilliant teenage analysis of Henry James, who I considered quite the square; and not-so-gently criticized my tendency toward rambling sentences (this one’s for you, David!)—I loved him instantly. I still consider him to be like a father to me—not just because he taught me how to be a human being, but also because he is very old.

Those of you who have never taken one of David’s classes might not fully appreciate the experience of being his student. Imagine a fever dream, with a slight, well-dressed, bespectacled man constantly screaming at you, sometimes in Yiddish. He would smile slightly when he agreed with your analysis of a book, and smile hugely when he didn’t. I think he truly preferred the latter. If you were wrong, he would launch into a perfectly-reasoned, erudite discussion of everything you had just said, and then close with making great fun of your Clinton-Gore ’96 button and urging you to read more Commentary magazine. (I had never read Commentary before I met David; I had only heard of it because of my reactionary Jewish grandfather. By which I mean David. Again, he is very old.)

I can’t say that I was David’s best student, but I like to think I was one of his favorites, although he’d deny it. I am, I believe, the only student who inspired him to throw a book across the room (and directly at my head) twice. I am a large and unathletic man, not capable of sudden movements even when faced with imminent head injury, so the only reason I escaped unscathed is because David has pretty terrible aim. (A little-known fact: Sandy Koufax actually once asked David either to stop throwing things or convert to another religion.)

When I learned that David had cancer, I reacted the same way I’ve always reacted to bad news: with denial. That’s why I’m writing this the same way I wrote all of my term papers for David’s classes—at the last minute, trying not to cry, while my roommate smokes marijuana and listens to Sublime. (OK, not the last part.) If I get too sincere, too sentimental, David will literally board a plane to Texas right now and start throwing books at my head, so I’ll keep this short: He didn’t just teach me how to be a good person, and he didn’t just teach me to love literature, he also, quite literally, saved my life. I really do love him like a father, and that’s not just because he is very old (which he is), but because he taught me how to be brave. I’m not there yet, obviously. But when I do get there, it will be because of him.

He taught us all so many lessons, and it’s impossible for me to thank him adequately. I wish I could be with him tonight, not just to tell him this in person, but also because I just bought a “Ready for Hillary” t-shirt, and I would love to see his reaction when I walked in wearing it. He would, of course, start throwing books at me. And I would just stand there, because he couldn’t hit the Great Wall of China with a ping-pong ball, even if he was standing one foot away. I mean, I can only imagine his aim has gotten worse. Because, as you know, he is very old.

And he’s also one of the best people I’ve ever met. I think I can speak for all of us (excluding a few dozen university deans and administrators) when I say that my life is better because of him. David, I know you can’t abide this sentimental stuff, but you’re just going to have to deal with it tonight. Thank you for everything. I promise I won’t press charges for the flying books, and not just because the statute of limitations has expired. You are a great man, and I love you.

Monday, July 14, 2014

The 10 best novels of the 1940s

New York Post critic Kyle Smith’s series over at PJ Media on the best films of the decades has been entertaining to follow, especially when you disagree with the choices. Smith’s latest, an inventory of the best films of the ’forties from Double Indemnity (#10) to Citizen Kane (#1), got me to thinking. What are the best novels of the ’forties?—a decade that lies just outside my critical expertise. What follows is a preliminary listing: not a ranked order, but a chronological one.


Richard Wright, Native Son (1940). Not only the classic fictional treatment of race relations in America, but a novel that is more compelling for its very contradictions. The ’forties were a great decade for discursive fiction—novels that discuss ideas which are embodied in men’s obligations and commitments—and Wright’s was one of the decade’s great examples.


Christina Stead, The Man Who Loved Children (1940). More recently praised by the novelists Jane Smiley and Jonathan Franzen, Stead’s novel owes its fame to the 1965 reprint edition with an introduction by Randall Jarrell, who called it “one of those books that their own age neither reads nor praises, but that the next age thinks is a masterpiece.” It remained neglected in Stead’s native Australia until 2010, when it was finally reprinted with an introduction by the more fashionable Franzen instead of the more distinguished Jarrell.


Arthur Koestler, Darkness at Noon (1941). The classic fictional treatment of Moscow show trials and one of the great works of anti-Communism. A member of the German Communist Party for seven years, Koestler wrote the novel in Paris while his lover Daphne Hardy translated it into English. She smuggled her translation out of Paris just ahead of the Nazis and published it in England in 1941. It was not released in Germany until 1948.


Janet Lewis, The Wife of Martin Guerre (1941). If I were going to rank these books in order of greatness, Lewis’s novel, only slightly longer than a novella, would be my first choice. I have written about it elsewhere. The Wife of Martin Guerre is the perfect historical novel. Lewis understands Bertrande de Rols, her heroine, wholly in the customs and conventions of 16th-century provincial France. Unlike later retellers of Martin Guerre’s story, she does not permit modern values to stain her closely woven fabric.


Evelyn Waugh, Brideshead Revisited (1945). In my recent Books & Culture essay on the young Catholic novelists William Giraldi and Christopher Beha, I said that the “greatest religious novels are written out of a religious discernment much the same way that surrealistic poetry is written out of a particular vision of reality: it soaks the work from top to bottom.” Brideshead Revisited is the model for this approach. The account of Charles Ryder’s conversion to Catholicism is so subtle that many readers fail to notice that it is happening.


Ivo Andrić, The Bridge on the Drina (Serbo-Croatian, 1945; English, 1959). Anyone still interested in the former Yugoslavia must read two books—Rebecca West’s magisterial two-volume travel book Black Lamb and Grey Falcon (1941) and the masterpiece of Serbian literature, published four years later. Compared to One Hundred Years of Solitude for its multi-generational sweep, Andrić’s novel is a hundred pages shorter, scrupulously avoids the magic in magical realism, and might be more accurately described as The Painted Bird with a conscience.


Robert Penn Warren, All the King’s Men (1946). The ’forties were a decade for political fiction, but none of the decade’s political novels is like any of the others. One of America’s greatest poets, Warren wrote a lyrical account of an American populist demagogue modeled upon Huey Long (and played by Broderick Crawford in Robert Rossen’s 1949 film version). The novel is narrated by an onlooker whose own corruption is restrained by his prose style, which is Warren’s promise of something better in the American polis.


Hans Fallada, Every Man Dies Alone (German, 1947; English, 2009). One of the decade’s best novels had to wait sixty years for an English translation. In the New York Times, Liesl Schillinger called its long-deferred publication in English the “signal literary event of 2009.” A harrowing account of two German anti-Nazi resistance fighters, based on the actual experiences of Otto and Elise Hampel (pictured above), Fallada’s 500-page novel is as exciting as anything by Eric Ambler or Alan Furst, with the added dimension of a powerful vision of human freedom.


Albert Camus, The Plague (French, 1947; English, 1948). While The Stranger seems a product of its time (and confined to it), The Plague remains as fresh as any of Amazon’s recommendations for this month. Marina Warner has testified to how much more she saw in the novel when she read it again years later than when she first read it as a young ’sixties woman, basking in “existential disaffection.” If you think you already know this novel, reread it and think again. What never molds is the purity of Camus’s style.


George Orwell, Nineteen Eighty-Four (1949). Perhaps the most obvious book to include on this list. With the rise of a new style of totalitarianism in our time to rival Nazism and Communism, Nineteen Eighty-Four remains timely—even thirty years after the dystopic future in which it was set. Readers who are more animalistically political than I will never tire of Emmanuel Goldstein’s “Theory and Practice of Oligarchical Collectivism.” But I, I will need to free my mind of Winston Smith’s greatest horror, which finally breaks him down into helpless love for Big Brother.

Honorable mention: William Faulkner, The Hamlet (1940); Vladimir Nabokov, The Real Life of Sebastian Knight (1941); Joyce Cary, The Horse’s Mouth (1944); Saul Bellow, Dangling Man (1944); Thomas Mann, Doctor Faustus (German, 1947; English, 1948); Malcolm Lowry, Under the Volcano (1947); James Gould Cozzens, Guard of Honor (1948); Graham Greene, The Heart of the Matter (1948).

Reader recommendations: Eric Ambler, Journey into Fear (1940); Carson McCullers, The Heart Is a Lonely Hunter (1940); Rex Warner, The Aerodrome (1941); Wright Morris, My Uncle Dudley (1942); Hermann Hesse, The Glass Bead Game (German, 1943; English, 1949); Thomas Mann, Joseph and His Brothers (German, 1943; English, 1948); Jean Stafford, Boston Adventure (1944); L. P. Hartley, Eustace and Hilda (1944–1947); Henry Green, Loving (1945); William Maxwell, The Folded Leaf (1945); R. K. Narayan, The English Teacher (1945); Eudora Welty, Delta Wedding (1946); Mervyn Peake, Titus Groan (1946); J. F. Powers, Prince of Darkness (stories, 1947); Yasunari Kawabata, Snow Country (Japanese, 1948; English, 1956); Nelson Algren, The Man With the Golden Arm (1949).

Sunday, July 06, 2014

An open condemnation of the murder of Mohammed Abu Khdeir

Cross-posted from Elder of Ziyon

We unequivocally condemn the horrific murder of Mohammed Abu Khdeir. It was unjustifiable under any circumstances. The killing was reprehensible and we hope that the criminals who did this sickening act are found and prosecuted to the fullest extent of the law.

Israel is a country run by the rule of law. There are reports that Jews have been arrested for this crime. If a trial finds that Jews are indeed guilty of this unconscionable killing, our condemnation is redoubled. The idea that Jews could do such an act fills us with shame and horror.

The people who murdered Mohammed do not represent us in any way. It is not enough to dissociate ourselves from the dreadful act; we must also ensure that crimes like this are never repeated.

Just as the appalling murders of Naftali Fraenkel, Eyal Yifrach and Gilad Shaar do not in any way justify the hideous murder of Mohammed Abu Khdeir, neither does Khdeir's murder justify the violence, terrorism, destruction and incitement we have seen over the past few days against Israelis and Jews.

We hope and pray that everyone, Arab and Jew, lives in peace and security in the region.

Signed,

Elder of Ziyon
Daphne Anson
CiFWatch - Adam Levick
Internet Haganah - A. Aaron Weisburd
Liberty's Spirit - Elise Ronan
Mike Cohen
Zach Novetsky
Beer Sheva
Edgar Davidson
Ray Cook
5 Minutes for Israel - David Guy
GabrielQuotes
This Ongoing War - Frimet and Arnold Roth
Israelkompetenzkollektion Shelly
Dr. Sharon Chard-Yaron
Always Write Again - Natalie Wood
Avi Eisenberg
MS Wallack
British-Israel Coalition - Harvey
Israel Matzav - Carl in Jerusalem
Joe Settler
Philosémitisme
Yid With Lid - Jeff Dunetz
A Commonplace Blog - D. G. Myers
Mystical Paths - Reb Akiva
Erika Dreifus
Meir Solomon
Is The BBC Biased - Sue and Craig

Note: Go to Elder of Ziyon to sign the open letter.

Wednesday, June 18, 2014

Cancer: the last obscenity

My third Good Letters post for the Image Journal is up this morning. It is a reflection on living with terminal cancer, as I have for the past six-and-a-half years. The theme of the post is also the theme of my book in progress, Life on Planet Cancer. My oncologist tells me that most patients collapse upon being diagnosed, sinking into resignation and despair, but while I have discovered small goodness in cancer, I have learned to live with it—have learned that you can still have a life under its shadow.

Cancer redefines you forever; to live in denial, to pretend that you do not have the disease, is self-denial. You reject what you have become in favor of some fantasy image of yourself. If you were to do this with any of your other limitations, your height, your intelligence, your capacity for self-exertion, you would recognize it for the neurotic lie that it is. About cancer, though, our culture is forgiving: you are permitted the escape from responsibility you would not otherwise be permitted. And why? Because our culture no more wants to acknowledge the reality of cancer than those with the illness do. Cancer is, perhaps, the last obscenity.

Anything, please, but the reality of cancer! The culture celebrates “survivors” who have triumphed over it, praises the dead for having “fought” it, but those who are living with the illness are invisible, hidden away. The culture will stoop to speaking the name of cancer, that is, only when it is past—not when it is a present reality for anyone.

When the baseball Hall-of-Famer Tony Gwynn died on Monday from salivary gland cancer, he was celebrated for his achievements on the field and off—rightly so—but his five-year experience of living with cancer was reduced to a “battle” and nothing more. How the disease affected his thinking, his coaching at San Diego State University, his self-image, his personal relationships, his faith—none of this was mentioned, let alone explored. He became a hero only in death, because his life while diseased is no one’s business, just as a person’s sex life used to be.

Monday, May 05, 2014

More on transcendence

My first post for the Image Journal’s Good Letters blog, which I have joined as a regular contributor, is up this morning. It is an attack on the vulgar understanding of religion as the experience of transcendence. I prefer what William James dismisses as religion’s “dull habit.”

The occasion of my Good Letters post was a review by Alan Lightman, the novelist who doubles as a physicist, in which he advances transcendence as the religious emotion. Transcendence is religion’s “high,” but like a drug-induced ecstasy, it grows blunt and less keen over time. Larger and more frequent doses of the original stimulus are necessary to revive the experience, which will never be as good as the first time. Or, to say the same thing in a different metaphor, the “acute fever” of religion, which James defends against its dull habit, is a condition in which no one can live for very long without suffering hallucinations.

There is a more adequate account of transcendence. The source is Jean Améry, the great Holocaust essayist. (He also wrote unforgettably about aging and suicide.) Transcendence, he wrote in At the Mind’s Limits, first published in 1976, is the “basic quality” of the human mind. Améry means something quite ordinary by this: the mind reveals itself in transcending the brute and unpleasant facts of physical reality. “The mind is its own place,” as Milton’s Satan famously says, “and in it self Can make a Heav’n of Hell, a Hell of Heav’n.” Terminal illness can become the enjoyment of the small pleasures that give life its subtle tam; performing the remunerative but difficult work upon which others depend can become the source of bitterness and complaint.

In the death camps, the German Nazis perfected a system for destroying the mind’s basic quality of transcendence. Améry recalled a winter’s night upon which the prisoners were being marched back from the I. G. Farben factory. The waving of a flag in front of a half-finished building caught his attention, and immediately it reminded him of a favorite poem by Friedrich Hölderlin. He quoted it aloud; nothing happened; he quoted it again, louder:

The poem no longer transcended reality. There it was and all that remained was objective statement: such and such, and the Kapo roars “left,” and the soup was watery, and the flags are clanking in the wind.

The reality of the camp subsumed the poem-fed mind. But the outcome was not inevitable. Prisoners who were committed to a reality beyond the camp—“militant Marxists, sectarian Jehovah’s Witnesses, practicing Catholics,” and of course Orthodox Jews—were more likely to survive or at least “died with more dignity than their irreligious and unpolitical intellectual comrades, who often were infinitely better educated and more practiced in exact thinking.” Believing in another reality (God’s love, the brit olam with the Jewish people, the final victory of Communism) they were able to detach themselves from conditions in Auschwitz that defy the imagination. “The grip of the horror reality,” Améry concludes, “was weaker where from the start reality had been placed in the framework of an unalterable idea.”

The function of transcendence in the religious life is to lay the foundation of an unalterable idea. The framework, though—the daily commitment to the idea—becomes what Lightman calls the “most persuasive evidence of God.” Transcendence is a glimpse of the reality created and sustained by dull habit.

No one in the literary culture understands this better than Christopher Beha. Without giving too much away—I plan to review it at length elsewhere—Arts and Entertainments, Beha’s second novel, turns on an experience of transcendence, just like What Happened to Sophie Wilder, his brilliant first novel. For Sophie, however, the experience of transcendence mandated a reorganization of life. Eddie Hartley, the hero of Beha’s followup novel, goes through something similar:

As a ten-year-old altar boy at his family’s parish in Queens, Eddie had experienced a single unforgettable moment of what adults might call transcendence, when his whole body buzzed with the presence of something other than himself, a moment he had never talked about to anyone and didn’t like to think about now, because it still seemed unmistakably real to Eddie and didn’t make any sense to him.

Instead, Eddie tries to find substitutes for the experience in acting (“Something like that feeling had sometimes visited him while he was onstage”), and it remains without religious significance for him: “If asked, he would have said he was Catholic, just as he would have said he was Irish—it was a matter of birth, not of action or belief.”

Everything that happens to Eddie in the sequel is a consequence of his failure to make “that feeling” the basis of action or belief. Like so many of his contemporaries, he prefers the fever to the habit.

Wednesday, April 09, 2014

Dying is a 12-step program

My seven-year-old son Isaac was listening to Gil Roth’s interview with me on the Virtual Memories Show. “He will be dead from prostate cancer within the next two years,” Roth said in introducing me. “You’re dying?” Isaac cried. Isaac is named after Isaac Rosenfeld, of whom the critic Ted Solotaroff said that “his very name itself still seems to possess an incantatory power: some of his friends speak it as though ‘Isaac’ were a magic word for joy and wit. . . .” My son too is a merry prankster, the family’s stand-up comic. He was not prepared to think of his father as dying, and not only because he is just seven years old.

Dying is the problem, not death. As an Orthodox Jew, I believe with perfect faith in the resurrection of the dead, but until that happens, death is the termination of consciousness. No peeking back into life. I won’t get to keep a scorecard of who is crying at my funeral, who is dry-eyed, who never bothered to show up. If I want someone to cry at my funeral, I need to patch things up with him before the last weak images flicker out.

In the past few weeks I have been approaching ex-friends whom I have damaged to ask their forgiveness. I’ve been behaving, in short, as if dying were a twelve-step program. Step 8: “Made a list of all persons we had harmed and became willing to make amends to them all.” Step 9: “Made direct amends to such people wherever possible, except when to do so would injure them or others.” Not that I mind having enemies. One person whom I approached recently accused me of “basking in self-importance,” which is one possible way, I suppose, of describing the tireless knowledge that death is near. But there are other persons, including some with whom I have had very public fallings-out, whom I don’t want as enemies when I pass away. To die without accepting responsibility for the damage I have done to relationships that were once meaningful to me would be shameful and undeniably self-important.

The remaining ten steps can be revised somewhat to suit the dying:

• “We admitted we were powerless over our dying and that prolonging our lives had become unmanageable—by us.”

• “Came to believe that a power greater than ourselves could restore us to acceptance of our death.”

• “Made a decision to turn our last remaining days, the peace and torment, over to the care of God as we understood Him.”

• “Made a searching and fearless moral inventory of ourselves.” (This one needs no revision.)

• “Admitted to God, to ourselves, and to another human being the exact nature of our regrets and reasons for happiness.”

• “Were entirely ready to have God receive us exactly as we have become, without the opportunity for additional effort or success.”

• “Humbly asked Him to make light of our failures.”

• “Continued to take personal inventory and when we indulged in magical thinking about death, promptly stopped it.”

• “Sought through prayer and meditation—and, sometimes, through literary exertions—to improve our conscious contact with God as we understood Him, praying (and, sometimes, writing) for knowledge of life under the shadow of death and the power to endure it.”

• “Having had a spiritual awakening as the result of these steps, we tried to carry this message to the dying and to practice these principles in our daily lives, even if we occasionally suffered dark nights of the soul, which we tried our best not to carry over into the next morning.”

The difference, of course, is that dying is an addiction from which there is no recovery. But the similarity is this. Dying is a mental discipline, even if the goal is not to be clean and sober, but simply to be ready.

Wednesday, February 12, 2014

The greatest debuts

In English-language prose fiction, that is. Here without much further ado or explanation is the list of the twenty-five greatest literary debuts, which I posted to Twitter earlier today. John Wilson of Books and Culture asked me to put the list in one place, and so here it is.

As Darin Strauss recognized, the list is something of a jeu, recklessly tossing together great books that happened to be first books along with books that defined (and, in some cases, foreshortened) a literary career. (A couple of changes have been made to the original list, removing Charles Portis’s True Grit—in actuality, his second novel—and Joyce’s Dubliners and including Invisible Man, which I unaccountably overlooked the first time around.) At all events, the titles on this list are characterized as much by splash as by merit.

  1. Samuel Richardson, Pamela (1740)
  2. Charlotte Brontë, Jane Eyre (1847)
  3. Kingsley Amis, Lucky Jim (1954)
  4. Joseph Heller, Catch-22 (1961)
  5. Ralph Ellison, Invisible Man (1952)
  6. William Golding, Lord of the Flies (1954)
  7. Charles Dickens, The Pickwick Papers (1836–37)
  8. J. D. Salinger, Catcher in the Rye (1951)
  9. Margaret Mitchell, Gone With the Wind (1936)
10. Thomas Wolfe, Look Homeward, Angel (1929)
11. Theodore Dreiser, Sister Carrie (1900)
12. Walker Percy, The Moviegoer (1961)
13. Ken Kesey, One Flew over the Cuckoo’s Nest (1962)
14. Thomas Pynchon, V. (1963)
15. Philip Roth, Goodbye, Columbus (1959)
16. John O’Hara, Appointment in Samarra (1934)
17. Marilynne Robinson, Housekeeping (1980)
18. Carson McCullers, The Heart Is a Lonely Hunter (1940)
19. Harper Lee, To Kill a Mockingbird (1960)
20. Flannery O’Connor, Wise Blood (1952)
21. John Kennedy Toole, A Confederacy of Dunces (1980)
22. Raymond Chandler, The Big Sleep (1939)
23. Henry Roth, Call It Sleep (1934)
24. Michael Chabon, The Mysteries of Pittsburgh (1988)
25. (tie) Erica Jong, Fear of Flying (1973)
       (tie) Donna Tartt, The Secret History (1992)

Update: Honorable mention (that is, suggestions from readers)—Emily Brontë, Wuthering Heights (1847); Anita Loos, Gentlemen Prefer Blondes (1925); James Jones, From Here to Eternity (1951); Richard Yates, Revolutionary Road (1961); George V. Higgins, The Friends of Eddie Coyle (1972); Tom Wolfe, The Bonfire of the Vanities (1987); Zadie Smith, White Teeth (2000); Jhumpa Lahiri, Interpreter of Maladies (1999); ZZ Packer, Drinking Coffee Elsewhere (2003); Chad Harbach, The Art of Fielding (2011).

Update, II: Patrick Kurp’s additions (in his order): Stevie Smith, Novel on Yellow Paper; Laurence Sterne, Tristram Shandy; Philip Larkin, Jill; Herman Melville, Typee; Anthony Powell, Afternoon Men; Evelyn Waugh, Decline and Fall; Tobias Smollett, The Adventures of Roderick Random; Ivy Compton-Burnett, Pastors and Masters; and Henry Green, Blindness.

Tuesday, February 04, 2014

Bibliographing the ’sixties

You might say that my bibliography of ’sixties fiction has been a lifetime in the making. The first hardback book that I ever bought with my own money was Allen Drury’s 1968 novel Preserve and Protect, the fourth and last volume of the tetralogy about American politics that Drury had begun with his Pulitzer Prize-winning Advise and Consent. (Spoiler: Drury never duplicated the mastery of that first volume.)

By the next year I was bolting down Portnoy’s Complaint and writing a celebration of it for Ramona High School’s literary magazine. (The faculty adviser rejected it on the basis of its sensational subject matter and even more sensational language.) I came of age on the fiction of the ’sixties—Roth, Bellow, Malamud, Stanley Elkin, Wright Morris, Walker Percy, Peter De Vries, J. F. Powers, Mark Harris, Evan Connell, Thomas Berger, E. L. Doctorow, Maureen Howard, Wilfrid Sheed, R. V. Cassill, John Barth, Joan Didion, even Madison Jones. These were the writers who lined the bookshelves of my early self-education. I filled my head with useless details about publication order and publishing houses and copyright dates. Sitting down to compile my bibliography four decades later, I found myself doing much of the work from memory.

One reason I wanted to compile it was to leave a record, even a testament, to my useless literary learning. I am struggling against self-pity when I say that learning is no longer considered the sine qua non of the scholar, especially not in English departments. At one time, as J. V. Cunningham wrote in a 1964 Carleton Miscellany symposium on graduate education in English, bibliography was numbered among the specialized disciplines of literary study—that is, every literary scholar was assumed to be a capable hand at it, if not an adept. Now, however, what is prized in English departments is theoretical sophistication, interpretive cunning; being up to the minute, but not necessarily knowing “the impervious facts/ So well you can dispense with them” (to quote again from Cunningham), is what is sought in the bright young hires.

No one will ever again accuse me of being bright or young. As a dinosaur, though, perhaps I am in a good position to watch the meteor of an unsustainable economic model wipe out the last of my species. As Clay Shirky wrote in a brilliant essay last Wednesday, “The [university] faculty has stopped being a guild, divided into junior and senior members, and become a caste system, divided into haves and have-nots.” And nothing distinguishes the haves from the have-nots except for tenure—certainly not learning and not even theoretical sophistication. The idea that American society will go on indefinitely subsidizing an elite caste of low-responsibility intellectuals, who demand the leisure to teach advanced subjects while underpaid assistants perform the hard work of educating most of the undergraduate students in a university, is absurd.

Dedicated only to preserving its leisure and elite status, the university faculty has betrayed the ideal of learning. The “higher education bubble” (as Glenn Harlan Reynolds calls it) will burst. The university caste system will be swept away, along with the last subsidies for the last remaining scholars. At that point, scholarship will operate on the model of the blog—it will be a gift offered to an indifferent world in the hope that someone else might value it as highly as I myself, for example, value the fiction of the ’sixties.

Tuesday, January 28, 2014

The romance of certain old books

Among the distempers of learning in our day is the habit of reading canonical fiction as if it were the only fiction in existence. In the recent n+1 pamphlet No Regrets, for example, the blogger and novelist Emily Gould complains about the “midcentury misogynists”—Bellow, Kerouac, Mailer, Roth, Updike—who populate what Amanda Hess describes in Slate as the “hypermasculine literary canon.”

The unexamined assumption is that misogyny was the stock-in-trade of these “midcentury” writers. No one feels obligated to defend the proposition, or even to examine the misogyny in any detail. What becomes clear, in leafing through women writers’ grievances against the books that “rejected” them, is that male novelists from an earlier generation are being judged by an anachronism, a criterion they could not possibly have known—feminism’s current dictates about respect for women. The moral complacency and self-congratulation implicit in the judgments worry exactly no one.

But “presentism” or “present-mindedness” is merely one fallacy behind such exercises in reading current moral fashion back into literary history. Just as bad is the radical abbreviation of an entire age’s literature by studying only those figures who now appear to be dominant. For an account of “midcentury” literary misogyny to have any validity whatever, more than a handful of writers will have to be read. (One is struck by how often the name of Philip Roth comes up, as if the postwar era should be known as the Age of Roth.)

You will familiarize yourself with all manner of generalizations about postwar American fiction in 21st-century literary journalism without ever encountering the names of Paul Horgan, Allan Seager, Willard Motley, Wright Morris, William Bradford Huie, Hortense Calisher, William Eastlake, J. F. Powers, John Leggett, George P. Elliott, Mary Lee Settle, Isaac Rosenfeld, James B. Hall, Thomas Gallagher, R. V. Cassill, Mario Puzo, Oakley Hall, Warren Miller, John Williams, Vance Bourjaily, Mark Harris, Chandler Brossard, Harry Mark Petrakis, Herbert Gold, Evan S. Connell Jr., Thomas Berger, Leo Litwak, Jack Matthews, Alison Lurie, Wallace Markfield, Edward Lewis Wallant, or Richard Yates.

I can’t be alone (can I?) in finding something romantic about the “forgotten” or “neglected” books of the past. While I love Bellow and Roth as much as the next critic—more, probably, since I named a son after Bellow—it is precisely their importance to me, their centrality in my thinking, that makes me want to know (in Trilling’s phrase) the “hum and buzz of implication” from which they emerged. I don’t read their books to feel accepted or rejected, to have my lifestyle choices affirmed, but to appreciate their distance from me, their difference. And no method of literary study is more effective at making them strange again—those natives of the foreign country that is the past—than in understanding them as conventional (or not) for their times.

Books could be time machines, but rarely are. They are sadly familiar to us, because they are canonical; that is, because we read them in the present, with the standards and expectations of the present, as towering figures of the present. To be borne into the past, boats beating against the current, the best books are those which are least familiar: the books no one is assigned on any syllabus, the books discussed in no classroom. If nothing else, you have to read these “forgotten” or “neglected” books in editions from the period in which they were originally published, since many of them have never been reprinted. The cover art, the dust-jacket copy, the yellowing pages, the formal typography, the out-of-fashion author photos—even as physical objects, the books are visitors from another time and place.

Besides, there is the intellectual challenge in deciding for yourself whether a book is any good. The celebrated titles of this publishing season are surrounded by publicity; even an independent judgment sounds like an echo of the blurbs. And no one is ever surprised if you like Roth (or don’t). But what about Allan Seager or James B. Hall? Will Amos Berry or Racers to the Sun repay your time, or only waste it? Are you willing to accept the risk of recommending either of them to a friend? If you take seriously the adventure of reading you must involve yourself, sooner or later, in the romance of certain old books.

Monday, January 27, 2014

Five Books of cancer

A season into my sixth year of living with Stage IV metastatic prostate cancer, I am finally writing a book on the experience. Or, rather, what began as a wider-ranging memoir has refocused itself as a cancer book. My working title is Life on Planet Cancer (© the author). A literary critic by profession, I will be including some glances at the very best cancer writing. Where, then, if I were advising readers, would I begin on the literature?

• Mark Harris, Bang the Drum Slowly (1956). Harris tried his best to convince people that his second Henry Wiggen novel after The Southpaw (1953) was not a baseball novel. He was unsuccessful, largely because his descriptions of baseball prefer the plain speech of inside dope to syrupy lyricism (“The damn trouble [with hitting] is that knowing what is coming is only half the trick”). Harris’s story is about a third-string catcher on a major league team who is diagnosed with Hodgkin’s lymphoma when the disease was still incurable (the five-year survival rate is now above 80%). A Southerner who is prone to ignorance and racism, Bruce Pearson is a butt of cruel fun on the team until the news of his cancer slowly spreads through the roster, bringing the New York Mammoths together. Bruce’s attitude toward his own illness, lacking in self-pity, is pitch perfect. And its effect on hardened professional athletes, who do not permit any softness or sentimentality in their lives, is utterly convincing. The result may be the best single account of a death from cancer ever written.

• Peter De Vries, The Blood of the Lamb (1961). If Harris’s is not the best account of a death from cancer ever written then De Vries’s is. Many readers will prefer De Vries’s, because it is the more profound. (I will not shy from the word if you won’t.) Based on the death of De Vries’s own ten-year-old daughter Emily from leukemia, The Blood of the Lamb is the work of a deeply religious man, a Calvinist, who believed that God need not exist to save us. This wintry faith, as Martin Marty calls it in A Cry of Absence (another fine cancer book), a faith intimate with God’s absence, is strange and unfamiliar to most Americans, who are more used to the flush-faced, hallelujah, pass-the-collection-plate religious conviction of evangelicalism. Don Wanderhope, De Vries’s narrator, the father of the dying girl, concludes that “man’s search for meaning” is doomed to disappointment. But if “Human life ‘means’ nothing” it doesn’t follow that it is without truth. “Blessed are they that comfort, for they too have mourned, may be more likely the human truth”—this is Wanderhope’s conclusion in his desperate grief. One of the most eviscerating books you will ever read.

• Aleksandr Solzhenitsyn, Cancer Ward (1968). The first thing everyone says about Solzhenitsyn’s great 500-page novel is that it treats cancer as a metaphor for the totalitarian state. Perhaps it is time to turn the commonplace inside out: totalitarianism is, for Solzhenitsyn, a metaphor for cancer. He himself suffered from an undiagnosed cancer in the early ’fifties while incarcerated in a camp for political prisoners in Kazakhstan. Cancer is, he writes in the novel, a “perpetual exile.” There is no returning from it to a happy life of uncomplicated freedom. A peculiarly Russian vision, reeking of Dostoyevskian tragedy and pessimism? (Also the emotional byproduct of a third-rate medical system, which saved few and palliated the suffering of even fewer?) Yes, and all the more worth being soaked in as a consequence. The popular American attitude toward cancer is a dance to a sentimental tune about hope.

• Siddhartha Mukherjee, The Emperor of All Maladies: A Biography of Cancer (2010). An oncologist and cancer researcher at Columbia University Medical Center, Mukherjee (no relation to the novelist Bharati Mukherjee) gave his 470-page book a misleading subtitle. The Emperor of All Maladies is less cancer’s life-story than an informal and anecdotal survey of cancer research and treatment since the Second World War. Although it would have been improved by a tighter structure and perhaps a more exhaustive aim, its engaging tone and focus on the personalities involved in the “war on cancer” guaranteed the book a Pulitzer Prize. There is, however, no reason to read it from cover to cover. Like an oral history, it can be read a chapter here and then a chapter fifty pages on without loss or confusion. Mukherjee is good at cramming information into small spaces and clarifying the sometimes daunting language of medicine for general readers. He succeeds in his ambition to make cancer research into a modern adventure, and if this is not the same as writing the biography of cancer, it is as close as we are likely to get for a while; and not without value and pleasure.

• Christopher Hitchens, Mortality (2012). First diagnosed with esophageal cancer in June 2010, Hitchens died a year and a half later. In his last months he wrote a series of seven articles for Vanity Fair on his experience. These were collected and published in a short 93-page book along with some pages of notes toward a last unfinished article, which should probably have been discarded. The essays are characterized by Hitchens’s distinctive brand of honesty (“the novelty of a diagnosis of malignant cancer has the tendency to wear off”) and a unique ability to notice things that other writers on cancer have overlooked (for a cancer sufferer, for example—Hitchens’s preferred term—undergoing blood tests goes from being an easy routine to a painful ordeal). No other cancer book has quite the tone of immediacy that Hitchens’s has.

There are several memoirs that might also be mentioned, especially Lucy Grealy’s Autobiography of a Face (1994), Reynolds Price’s A Whole New Life (1994), Gillian Rose’s Love’s Work (1995), and Wilfrid Sheed’s In Love with Daylight (1995), and they are perhaps the next books that should be read. Philip Roth’s Patrimony (1991) is about the suffering from cancer as watched helplessly from outside—Roth’s father Herman died of a brain tumor. The American poets L. E. Sissman and Jane Kenyon, both of whom died from cancer, wrote sharply and movingly of the disease. And I have discussed Anatole Broyard’s Intoxicated by My Illness (1993) at some length elsewhere, because Broyard died of the same cancer I am living with. If I could bring only five books with me to the hospital, though, these are the five I would bring.

Entrepreneurs of the spirit

Will Wilkinson laments the decline of “old school blogging,” the original style of blogging—before the media outlets launched their group blogs and bought up the first-generation “personal bloggers”—in which the blogger composed a self, day by day, “put[ting] things out there, broadcast[ing] bits of [his] mind,” and in return finding a place for himself “as a node in the social world.”

What Wilkinson has to say about the self is provocative and largely true, I think. The self is a convergence of loyalties and enthusiasms and beliefs and habits. That there is a “stable” self, which persists through the flux of illness and health and better and worse, is an “illusion.” Wilkinson’s best line is that the “self is more like a URL,” an “address in a web of obligation and social expectation.”

But I am even more interested in what Wilkinson has to say—or suggest, really—about the economics of blogging. “Old school blogging,” as he calls it, belongs to a “personal gift economy.” The blogger gives away his reflections, and “in return maybe you get some attention, which is nice, and some gratitude, which is even nicer.”

The minute a blogger joins the staff of a magazine, though, everything changes. Everyone likes to get paid for what he does—I am no exception—but blogging for pay changes forever the blogger’s relation to his audience. The “web of obligation and social expectation,” into which the blogger-for-free inserts himself, is narrowed and focused. In reality, his audience shrinks to one (or, at most, a handful): his boss or bosses.

When the blogger becomes a “channel” for a media organization (to use Wilkinson’s term for it), he must adhere to more than the house style. He must also trim his judgment to suit the editorial fashions of his employer. Even where the blogger thinks of himself as a member of the magazine’s family, as I thought of myself at Commentary, conflict is inevitable.

As a literary practice, blogging is fundamentally an exercise of intellectual independence. A blogger no more thinks in a house style than Thoreau did in writing his journal. Writing as a staff member of a magazine, though (even when, as I did at Commentary, you are writing a one-person blog), you must second-guess yourself with regularity, asking whether you are setting yourself, even if accidentally, at odds with editorial policy.

One of the incidents that soured my working relationship with John Podhoretz, Commentary’s editor, was when I reviewed Hillel Halkin’s novel Melisande! What Are Dreams? Halkin’s novel was released in England by Granta, but was not being published in America. It never occurred to me that this would be an issue—Halkin was a longtime contributor to the magazine, the novel was brilliant and memorable—but Podhoretz was justifiably annoyed with me, because the magazine’s policy was not to review books that are not published in this country.

In taking on Halkin’s novel, I acted like a blogger, not a staff writer. I failed to recognize that, when you write for pay, you no longer write for yourself. To the reading public, you do not even write under your own name. When I praised Stone Arabia in a review on the Literary Commentary blog, Dana Spiotta’s publisher whipped my praise into a blurb and attributed it to Commentary. My name went poof!

The other day a trio of journalism students at Ohio State University came by to interview me for a class project. “What would you say to my generation about the future of journalism?” one of them asked to wind up the interview. “I’d say the future is both exciting and frightening,” I replied—“or maybe that’s the same thing.” The internet has made it possible for anyone to set up as a journalist—that is, to write regularly, on any subject that catches his fancy, as if keeping a journal. No one can tell anyone else what to write or not to write, or in what style. The marvel is freedom. The problem, as always, is how to monetize the work.

I had no practical solutions, beyond repeating the naïve ’sixties slogan “If you do the right thing money will come” and telling about the novelist Roland Merullo, who worked as a carpenter while writing his first novels. Complete editorial freedom is available for perhaps the first time in the history of journalism, I told the students—but only if they were willing (God help me) to become “entrepreneurs of the spirit” and not employees.

Whether the “old school” and “personal” bloggers can return to their first spiritual entrepreneurship, after having their literary thinking altered forever by writing for pay, is a question that may concern more than themselves alone. The answer may also suggest something about the future of journalistic freedom.

Wednesday, January 15, 2014

Reply to critics of “Academe Quits Me”

A few days ago, the economist Thomas Sowell found himself obligated to write an op-ed column in which he pointed out that “trickle-down economics”—the economic policy of the political right, according to the political left—is non-existent. It is attacked widely on the left, Sowell observed, but “none of those who denounce a ‘trickle-down’ theory can quote anybody who actually advocated it.”

I thought of Sowell’s column yesterday when I studied the readers’ comments to my essay “Academe Quits Me,” reprinted at Inside Higher Ed. No fewer than ten commentators were quick to denounce me for “rehash[ing] the canon wars of a previous generation,” in the words of one, or “likening [my] experience of being let go with the Grand Fall of the English Canon,” as another said.

I’ve reread my essay closely several times now and for the life of me I can’t find the word canon anywhere in it. Is it possible I wrote the word in my sleep? One or two commentators acknowledged (if they couldn’t bring themselves to say so outright) that I never actually wrote what I was being denounced for writing. But the denunciations were valid anyhow, because my essay, in the words of one commentator, “sounds like someone who feels that English departments should only teach courses that discuss White men and eurocentric studies,” and “Myers implied that voices and opinions should be excluded,” as another said.

By the magic of sounds like and implies, a text can be made to say anything the critic wants it to say! I can’t think of a stronger case for improving the teaching of English than the example of such wild-eyed readers, who project their bogies and night sweats into texts that spook them.

Even if it is their habit to express themselves in talking points and received ideas, though, it doesn’t follow that everyone else lives by the same habit. I have been writing publicly for more than a quarter century now, and nowhere in anything I have written do I call for a return to the canon. I mean nowhere. If there has been one consistency in my writing it has been this. For more than two-and-a-half decades I have dissented from both sides in the canon wars.

One of my first published essays—published in the Sewanee Review in 1989 as I was just beginning my academic career—was called “The Bogey of the Canon.” The title summarizes my argument. To spell it out further:

To the revising of canons there is no end. But the canon, the “old canon,” the “patriarchal canon,” the “restricted, canonical list,” the “fixed repertory”—this is a bogey. It has never existed. It has merely changed, from critic to critic and generation to generation; it bears no marks of persistence as well as change. . . . Those who fear canons have seen a pattern where there is only randomness, and have mistaken a selection for a principle. The name they have given to this is “the canon,” but there is not enough of an identity among canons for there to be any one canon. It cannot be said to be a substantial entity.

In light of the comments to my essay yesterday, I’d go one step farther now. The canon is the name by which calls for the restoration of order and coherence to literary study are misunderstood in advance and rejected out of hand without additional examination.

What I actually wrote in “Academe Quits Me” is that academic literary study is no longer a “common pursuit.” It does not represent a “common body of knowledge.” It lacks “common disciplinary conceptions.” Does it say more about me or about my commentators that the only common pursuit they can imagine, the only common disciplinary conception, is a “canon” of “dead white males”?

Most English professors secretly know that I am right, however, even if they would never permit themselves to say so publicly. In my teaching, I have learned that I cannot assume any common background knowledge, not even in English majors.

Last spring I taught one of those boutique courses that could have been offered at the University of Minnesota this semester: an honors seminar on Evil in the Postwar American Novel. Among the books I taught was Cormac McCarthy’s Blood Meridian. I began the discussion by raising the question of Faulkner’s influence upon McCarthy. My students looked at me blankly. “How many of you have read Faulkner?” I asked. No one raised a hand. “How many of you have heard of Faulkner?” Three hands went up. In an upper-division seminar on Philip Roth, pretty much the same thing. Not one student had read Saul Bellow.

In “Academe Quits Me,” I warn that the loss of a common tradition in English study leaves every English professor exposed. No one is indispensable to a university, because no curricular subject, no great author, is indispensable. A younger friend wrote to me privately yesterday that when he was in college not long ago, “you could take Shakespeare’s Treatment of Women, but not Shakespeare.”

I’m not opposed to the inclusion of underrepresented voices—in principle I’m not even opposed to film studies as a part of English—but my critics have failed to grasp my warning. Where nothing is considered essential knowledge, then nothing (not even film or underrepresented voices) is guaranteed a niche in the world of institutionalized scholarship. What my tenured colleagues fail to realize is that their sense of having a secure and permanent place in English is an illusion created by tenure. Nothing else protects them, because nothing else they contribute to scholarship or the academic community is considered necessary, not even by them. They themselves, by acquiescing in the loss of the common pursuit, have made themselves superfluous.

And if they think that a university cannot take away their salaries and their offices while continuing to recognize their tenure—if they think that entire English departments cannot be eliminated—they had better think again. Because such things have already happened at more than one university in this country.

Wednesday, January 08, 2014

Academe quits me

Tomorrow I will step into a classroom to begin the last semester of a 24-year teaching career. Don’t get me wrong. I am not retiring. I am not “burned out.” The truth is rather more banal. Ohio State University will not be renewing my three-year contract when it expires in the spring. The problem is tenure: with another three-year contract, I would become eligible for tenure. In an era of tight budgets, there is neither money nor place for a 61-year-old white male professor who has never really fit in nor tried very hard to. (Leave aside my heterodox politics and hard-to-credit publication record.) My feelings are like glue that will not set. The pieces fall apart in my hands.

This essay is not a contribution to the I-Quit-Academe genre. (A more accurate title in my case would be Academe Quits Me.) Although I have become uncomfortably aware that I am out of step with the purposeful march of the 21st-century university (or maybe I just never adjusted to Ohio State), gladly would I have learned and gladly continued to teach for as long as my students would have had me. The decision, though, was not my students’ to make. And I’m not at all sure that a majority would have voted to keep me around, even if they had been polled. My salary may not be large (a rounding error above the median income for white families in the U.S.), but the university can offer part-time work to three desperate adjuncts for what it pays me. A lifetime of learning has never been cost-effective, and in today’s university—at least on the side of campus where the humanities are badly housed—no other criterion is thinkable.

My experience is a prelude to what will be happening, sooner rather than later, to many of my colleagues. Humanities course enrollments are down to seven percent of full-time student hours, but humanities professors make up forty-five percent of the faculty. The imbalance cannot last. PhD programs go on awarding PhDs to young men and women who will never find an academic job at a living wage. (A nearby university—a university with a solid ranking from U.S. News and World Report—pays adjuncts $1,500 per course. Just to toe the poverty line a young professor with a husband and a child would have to teach thirteen courses a year.) If only as retribution for the decades-long exploitation of part-time adjuncts and graduate assistants, nine of every ten PhD programs in English should be closed down—immediately. Meanwhile, the senior faculty fiddles away its time teaching precious specialties.

Consider some of the undergraduate courses being offered in English this semester at the University of Minnesota:

• Poems about Cities
• Studies in Narrative: The End of the World in Literature & History
• Studies in Film: Seductions: Film/Gender/Desire
• The Original Walking Dead in Victorian England
• Contemporary Literatures and Cultures: North American Imperialisms and Colonialisms
• Gay, Lesbian, Bisexual, and Transgendered Literature: Family as Origin and Invention
• Women Writing: Nags, Hags, and Vixens
• The Image on the Page
• Bodies, Selves, Texts
• Consumer Culture and Globalization
• The Western: Looking Awry
• Dreams and Middle English Dream Visions

To be fair, there are also four sections of Shakespeare being offered there this semester, although these are outnumbered by five sections of Literature of Public Life (whatever that is). Maybe I’m missing something, but this course list does not make me salivate to enroll at Minnesota the way that Addison Schacht salivates to enroll in classics at the University of Chicago in Sam Munson’s 2010 novel The November Criminals:

I could study the major texts of Latin literature, to say nothing of higher-level philological pursuits, all the time. Do you know how much that excites me? Not having to do classes whose subjects are hugely, impossibly vague—like World History, like English [like Literature of Public Life]. You know, to anchor them? So they don’t dissolve because of their meaninglessness? I’ve looked through the sample [U of C] catalog. Holy fuck! Satire and the Silver Age. The Roman Novel. Love and Death: Eros and Transformation in Ovid. The Founding of Epic Meter. I salivated when I saw these names, because they indicate this whole world of knowledge from which I am excluded, and which I can win my way into, with luck and endurance.

That’s it exactly. The Minnesota course list does not indicate a whole world of knowledge. It indicates a miscellany of short-lived faculty enthusiasms.

More than two decades ago Alvin Kernan complained that English study “fail[s] to meet the academic requirement that true knowledge define the object it studies and systematize its analytic method to at least some modest degree,” but by then the failure itself was already two decades old. About the only thing English professors have agreed upon since the early ’seventies is that they agree on nothing, and besides, agreement is beside the question. Teaching the disagreement: that’s about as close as anyone has come to restoring a sense of order to English.

In 1952, at the height of his fame, F. R. Leavis entitled a collection of essays The Common Pursuit. It was his name for the academic study of literature. No one takes the idea seriously any more, nor does anyone ask the obvious follow-up. If English literature is not a common pursuit—not a “great tradition,” to use Leavis’s other famous title—then what is it doing in the curriculum? What is the rationale for studying it?

My own career (so called) suggests the answer. Namely: where there is no common body of knowledge, no common disciplinary conceptions, there is nothing that is indispensable. Any claim to expertise is arbitrary and subject to dismissal. After twenty-four years of patiently acquiring literary knowledge—plus the five years spent in graduate school at Northwestern, “exult[ing] over triumphs so minor,” as Larry McMurtry says in Moving On, “they would have been unnoticeable in any other context”—I have been informed that my knowledge is no longer needed. As Cardinal Newman warned, knowledge really is an end in itself. I fill no gap in the department, because there is no shimmering and comprehensive surface of knowledge in which any gaps might appear. Like everyone else in English, I am an extra, and the offloading of an extra is never reported or experienced as a loss.

I feel the loss, keenly, of my self-image. For twenty-four years I have been an English professor. Come the spring, what will I be? My colleagues will barely notice that I am gone, but what they have yet to grasp is that the rest of the university will barely notice when they too are gone, or at least severely reduced in numbers—within the decade, I’d say.

Tuesday, November 12, 2013

Lessons in human dignity

Victor Brombert, Musings on Mortality: From Tolstoy to Primo Levi (Chicago: University of Chicago Press, 2013). 188 pages.

Victor Brombert, who just turned ninety, is one of the last great comparativists in literary scholarship. A younger, more present-minded scholar would decide upon his “approach” before starting a book like this, and whether the “approach” is even relevant to his texts would be of less moment than establishing himself, for a few months at least, ahead of the curve. For Brombert, the first question is what books to read. The Death of Ivan Ilych, Death in Venice, “A Hunger Artist” and “The Metamorphosis,” To the Lighthouse, The Garden of the Finzi-Continis, Waiting for the Barbarians, The Plague, and Survival in Auschwitz—the historical stretch (from 1886 to 1980 and later), the linguistic range (Russian, German, French, and Italian in addition to English), are why the comparativist is worth listening to.

Musings on Mortality, the eleventh book in an academic career that began in 1949, is like sitting in on a late-afternoon graduate seminar in the oak-paneled honors room with the comfortable chairs. The distinguished professor emeritus from Princeton, author of books on Stendhal and Flaubert, has no thesis to grind; he is blessedly “atheoretical,” as the graduate students who are impatient for their guild cards tend to complain. He describes the “foreshadowing” in The Garden of the Finzi-Continis, he speaks of Primo Levi’s “telling what [Auschwitz] was like,” without a trace of self-consciousness. He never quotes a text without giving both the original and the translation (usually his own). Indeed, he will not discuss a book unless he can read it in the original language. This self-limitation is not modesty, although it has that effect, but scholarly integrity. The first commandment of comparative literature is that texts must be studied in the original to be understood properly. He remains faithful to the comparative method from first to last.

There are disadvantages to the method. Brombert’s unfamiliarity with Jewish languages and traditional Jewish sources suspends Primo Levi from a significant portion of his literary heritage, and raises questions about Brombert’s knowledge of Holocaust literature. He explains why Levi “chose to devote an entire chapter [in Survival in Auschwitz] to a canto of Dante’s Divine Comedy”—a literary choice that disturbed his students, Brombert reports—concluding that the “recourse to lines of poetry buried in the memory, but not really forgotten, carried a humanistic message.”

How much the analysis would have benefitted from a comparison to another Holocaust memoir! In The Book and the Sword (1996), the Talmudic scholar David Weiss Halivni tells about the day in Auschwitz when he saw an SS guard eating a sandwich “wrapped in a page of Orach Chaim, a volume of the Shulchan Aruch, Pesil Balaban’s edition.” With tears in his eyes, Halivni begs the guard to give him “this bletl, this page,” as a souvenir:

On the Sundays we had off, we now had not only Oral Torah [to study] but Written Torah as well. The bletl became a visible symbol of a connection between the camp and the activities of Jews throughout history. . . . The bletl became a rallying point. We looked forward to studying it whenever we had free time. . . . It was the bletl, parts of which had to be deciphered because the grease made some letters illegible, that summoned our attention. Most of those who came to listen didn’t understand the subject matter, but that was irrelevant. They all perceived the symbolic significance of the bletl.

The comparativist is welcome to prefer the humanistic message, but cut off from “the activities of Jews throughout history,” it begins to feel a little thin and undifferentiated, a synthetic product of the comparativist’s own method. If the reader can accept this limitation—if he can read Brombert’s book in the spirit of its title—Musings on Mortality will succeed on its terms, gently stroking the reader into wonderment.

Thus the confrontation with mortality leads Ivan Ilych “[f]rom self-love to pity and compassion,” a “trajectory” which is “immense.” Thomas Mann warned that the “attraction to the abyss of immensity and darkness, to the unorganized and immeasurable,” conceals a “longing for nothingness.” Kafka toyed with the “idea of liberation through death.” According to Virginia Woolf, art is intimate with death: “It immobilizes the vitally changeable and thereby projects an already posthumous view.” Camus may have been in love with life, but he was forever aware of encroaching death and stressed “the importance of remaining supremely conscious at the point of death.” J. M. Coetzee is “equally elusive and paradoxical” about his own beliefs in the face of death. “I have beliefs,” as one of his characters says, “but I do not believe in them.” Brombert permits his writers to speak for themselves, and if they pull back from the edge of definitiveness, so does he. He excels at summary; he is capable of following the scent of a theme throughout an entire life’s work, flashing the writer’s phrases whenever possible. Each chapter of Musings on Mortality is an education in itself.

Such a book is neither right nor wrong. Although the language breathes heavily sometimes from the academic lifting (“Kafka quickly deconstructs the fabric of his own mythotheological motifs”), this is both unusual for Brombert, who would sooner write in the straightforward tones of paraphrase, and yet also weirdly appropriate. Musings on Mortality is an invitation to learn gladly from a deeply cultured man who would gladly teach. His lesson, to use his own words about Primo Levi, is a “lesson in human dignity.” And among the dignities of man, as Victor Brombert convincingly demonstrates, is the serious discussion of serious literature, which treats it as having something worth saying to those who would only listen.

Tuesday, October 22, 2013

A sapphire anniversary

Sunday was the fifth anniversary of this Commonplace Blog. My very first post, appropriately enough given my sworn allegiance to him, was a review of Philip Roth. Few people read it, although I was happy and relieved to publish it here.

The fall of 2008—I was on sabbatical from Texas A&M University, Hurricane Ike wiped out much of the semester, and all of my interest in my current research came down with the power lines. I had begun a book that I was calling Battle Cry of Theory, a history of French theory’s invasion of English departments from the early ’seventies to the present. But as I felt my time slipping away—I’d been diagnosed with terminal cancer just one year before—suddenly I did not relish the thought of spending my last months over the pages of Paul de Man, Jonathan Culler, J. Hillis Miller, Geoffrey Hartman, and the camp followers of the “Yale critics.”

Or perhaps it was merely that, when my family escaped Houston for a few days at a Jewish youth camp in the Hill Country, it did not occur to me to take any theory along for the ride. Instead I immersed myself in Roth’s new novel Indignation, and having finished it much too quickly, borrowed my wife’s copy of The Brass Verdict by the crime novelist Michael Connelly. Back-to-back reviews to commence my career as a book blogger.

I’d been writing book reviews professionally—that is, for low pay—since 1981, when I reviewed Philip Appleman’s Shame the Devil for New York Newsday. Within two years I had attracted the notice of Mel Watkins, the editor of the New York Times Book Review, who put me to work writing short assessments of the novelists that more prominent critics wanted nothing to do with—Katherine Govier (my first), Sheila Bosworth (my first jacket blurb), Whitley Strieber, Jack Higgins, James Alexander Thom, Ernest K. Gann. When Mr Watkins left the Book Review in 1985 (I could never bring myself to call him “Mel”), the new editor quietly dropped me as a regular contributor.

For the next two decades I reviewed little fiction. My PhD was in the history of criticism, especially the history of American criticism, and The Elephants Teach, my first book, was intended as a contribution to that subject.

My original intent, when I had gone off to Northwestern University, was to write a biographical and critical study of the writers grouped around Yvor Winters—his wife Janet Lewis, his best and most famous student J. V. Cunningham, and writers largely forgotten and not typically associated with him, including John Williams, the author of Stoner. I wanted to bring some attention to obscure poets of moving perfection—Helen Pinkerton, for example—and I planned to call my book Peers of Tradition. The phrase was Cunningham’s. The idea was what set these writers apart.

But though Gerald Graff, my PhD advisor, had himself studied under Winters at Stanford, he vetoed my project. Jerry was working on the book that would become Professing Literature, the first history of English departments in America, and I was enlisted to assist him on the research. He suggested that I write a sort of companion volume. Thus was my story of creative writing workshops, in print for seventeen years now, first conceived.

Until I started A Commonplace Blog five years ago, I didn’t fully realize how gaunt and unhealthy-looking my prose had become under the influence of academic writing. The blog format proved unexpectedly congenial. I had no inkling, when I blindly began, that blogging would be so liberating. Not only was I freed from begging letters to editors (if I wanted to review a book, I could review it without anyone’s permission), but I no longer had to worry about what the chairman of the English department referred to as “career logic,” wherein every printed word must contribute to the building of a limited but national reputation.

Other than the stray political or scandal-mongering post, which always accumulates more “hits,” my five most popular literary essays of the past five years have been these:

(1.) Review of Tim Winton’s novel Breath, probably because the novel’s subject (surfing) causes my review to pop up in search engines.

(2.) My lament “What Became of Literary History?” which mourns the success of New Criticism in reducing the study of literature to “close reading.”

(3.) “Darlings of Oblivion”—a reflection on cancer and the small struggles of daily living, inspired by a phrase from Nabokov.

(4.) My most popular list—“The 10 Worst Prize-Winning American Novels of All Time.” From Jerzy Kosinski to John Updike.

(5.) A reconsideration of Vladimir Jabotinsky’s Samson, a novel that is hard to find, despite Ruth R. Wisse’s inclusion of it in The Modern Jewish Canon. My essay on it is one of the few in “print.”

That two of the five are reviews or review-essays is oddly cheering. Book pages may be dying (and they never gave their reviewers enough space or pay to begin with), and reader reviews may be squeezing out professional reviewers, but I remain convinced that readers are starved for intelligent and serious book-talk. I am proud to have contributed my share over the past five years.

Friday, October 18, 2013

Remembering JVC

Yesterday the Powerline blog—a politically conservative blog out of the Twin Cities—linked to my essay on Mario Puzo’s novel The Godfather. Over a thousand first-time readers descended upon A Commonplace Blog, although few lingered long enough to poke around in the remains of my literary thought. One who did was the photographer and printmaker William Porter, who had been a classics scholar in another life. From 1979 to 1982, he had held a Mellon postdoctoral fellowship in Renaissance studies at Brandeis University. It was there that he became friends with J. V. Cunningham.

Porter soon discovered Cunningham’s significance to me as well. Four-and-a-half years ago on this blog I published my notes from a course in the history of literary criticism that Cunningham taught at Washington University in St. Louis, where he was the Hurst Visiting Professor in 1976. (I also reproduced a rare early photograph of Cunningham.) And of course I have repeated to anyone who would listen that John Williams’s brilliant minor novel Stoner, a testament to the scholarly life, is based on the life and personality of JVC.

Porter shared his own memories. (He has given me permission to quote them here.) Cunningham, he told me,

ended up writing an important letter for my dossier that helped me a lot when I moved on in 1982. I was also a poet and translator, and in particular fancied myself a writer of epigrams—so I had that to share with Cunningham as well. I got to know him and his lovely wife and visited his house out in Sudbury. I learned more from “hanging out” with Cunningham drinking coffee than I had from nearly any of the teachers with whom I’d studied for semesters or even years.

Our experiences are oddly parallel. I too spent several happy afternoons with Cunningham and his wife Jessie MacGregor Campbell, an Austen scholar recently retired from Clark University, at their home in Sudbury. (Mrs Cunningham never failed to serve me carrot cake. Cunningham would not take a piece. He had given up sugar, he explained. Why? “I found that it was easier,” he said.)

For me too he wrote a recommendation, and though I doubt that it helped me very much—by the time I entered the profession of English in the late ’eighties, he was considered a reactionary by those to whom he was not obscure—the letter was precious to me. I have always wished I could use one line of it as a blurb to all my writing: “Mr Myers,” he said, “writes a prose that is always distinctive, and sometimes even distinguished.” Anyone who knows anything at all about Cunningham knows just how high this praise is. After having such a thing said about me (and by him!), there was no possible way for me to stop writing.

Porter himself turned away from the life of scholarship a decade and a half ago. “I wanted to stop reading other people’s footnotes,” he says, “and didn’t fancy lecturing Honors freshmen on Homer and Sophocles.” I have read few indictments of the humanities at the turn of the century that are more devastating, and in fewer words. Cunningham would have admired its epigrammatic quality. Harried by student complaints that my grades are too low and the Jewish holidays are “too many,” I am tempted to follow Porter into a less puerile life.

Why I stay, though, can be directly attributed to Cunningham. I have described before on this blog a scene from his course in the history of criticism. (Link provided lest my close readers fear that I have forgotten the earlier account.) One day in class, Cunningham asked the dozen or so graduate students enrolled to fill in the blank in an epigram by Sir Henry Wotton:

He first deceased; she for a little tried
To live without him, ________, and died.

The other students in the class struggled valiantly to rise to the occasion, devising all manner of poeticisms to satisfy the missing cretic foot. I was dull and embarrassed by my dullness. I wrote in resignation:

He first deceased; she for a little tried
To live without him, went to bed, and died.

Wotton’s original, of course, is far more distinguished:

He first deceased; she for a little tried
To live without him, liked it not, and died.

Cunningham read my pitiful effort aloud to the class and said, “In twenty-five years of teaching, this is the best wrong answer I have ever received.” Porter’s reaction to my anecdote is worth quoting in full:

“The best wrong answer I have ever received.” Sounds just like the man. Seems to say very little, but in fact prompts one (well, if one is attentive) to start wondering about lots of things. That’s what I remember about my conversations with him. There was a lot of silence, but when talking was done, he’d let me do more than my share. This of course encouraged me to think what I was saying must be interesting or important. And then he’d drop some little comment that would keep me awake at night for a week. Without a doubt the most efficient teacher I ever knew.

Yes, exactly. Every word of Cunningham’s was measured. (The pun is intentional.) His speech was as packed and pointed as his famous epigrams. (See here and here and here for examples.) He never belabored a point, because he expected you to reach the understanding, upon further reflection, that what he said was necessary and true.

Cunningham’s comment in class has kept me awake for three and a half decades. Only after corresponding with William Porter, though, did I realize the meaning of his “prompt” in my life.

In one sentence, Cunningham defined the scholarly life. It is not a matter of formulating correct answers, which is something that undergraduates, with their obsession over grades, cannot seem to grasp. It is a matter of so inhabiting other men’s minds, other men’s time, that your wrong answers are very nearly their own thinking.

I have never become disgusted with “other people’s footnotes,” because I have never wasted much attention upon them. I have been distracted by greater minds. Of course, I’ve never had a very successful academic career, and this in part is why. Despite my professional failure, though, I have remained in the university to pursue a scholarly life. And why? Because the difficulty of entering greater minds, whether they are the founders of creative writing or the Roth to whom I keep returning, is a challenge that has never grown stale for me.

There are only so many footnotes that a person can read. There are, however, an inexhaustible number of lines of verse to get almost right.