Wednesday, April 09, 2014

Dying is a 12-step program

My seven-year-old son Isaac was listening to Gil Roth’s interview with me on the Virtual Memories Show. “He will be dead from prostate cancer within the next two years,” Roth said in introducing me. “You’re dying?” Isaac cried. Isaac is named after Isaac Rosenfeld, of whom the critic Ted Solotaroff said that “his very name itself still seems to possess an incantatory power: some of his friends speak it as though ‘Isaac’ were a magic word for joy and wit. . . .” My son too is a merry prankster, the family’s stand-up comic. He was not prepared to think of his father as dying, and not only because he is just seven years old.

Dying is the problem, not death. As an Orthodox Jew, I believe with perfect faith in the resurrection of the dead, but until that happens, death is the termination of consciousness. No peeking back into life. I won’t get to keep a scorecard of who is crying at my funeral, who is dry-eyed, who never bothered to show up. If I want someone to cry at my funeral, I need to patch things up with him before the last weak images flicker out.

In the past few weeks I have been approaching ex-friends whom I have damaged to ask their forgiveness. I’ve been behaving, in short, as if dying were a twelve-step program. Step 8: “Made a list of all persons we had harmed and became willing to make amends to them all.” Step 9: “Made direct amends to such people wherever possible, except when to do so would injure them or others.” Not that I mind having enemies. One person whom I approached recently accused me of “basking in self-importance,” which is one possible way, I suppose, of describing the tireless knowledge that death is near. But there are other persons, including some with whom I have had very public fallings-out, whom I don’t want as enemies when I pass away. To die without accepting responsibility for the damage I have done to relationships that were once meaningful to me would be shameful and undeniably self-important.

The remaining ten steps can be revised somewhat to suit the dying:

• “We admitted we were powerless over our dying and that prolonging our lives had become unmanageable—by us.”

• “Came to believe that a power greater than ourselves could restore us to acceptance of our death.”

• “Made a decision to turn our last remaining days, the peace and torment, over to the care of God as we understood Him.”

• “Made a searching and fearless moral inventory of ourselves.” (This one needs no revision.)

• “Admitted to God, to ourselves, and to another human being the exact nature of our regrets and reasons for happiness.”

• “Were entirely ready to have God receive us exactly as we have become, without the opportunity for additional effort or success.”

• “Humbly asked Him to make light of our failures.”

• “Continued to take personal inventory and when we indulged in magical thinking about death, promptly stopped it.”

• “Sought through prayer and meditation—and, sometimes, through literary exertions—to improve our conscious contact with God as we understood Him, praying (and, sometimes, writing) for knowledge of life under the shadow of death and the power to endure it.”

• “Having had a spiritual awakening as the result of these steps, we tried to carry this message to the dying and to practice these principles in our daily lives, even if we occasionally suffered dark nights of the soul, which we tried our best not to carry over into the next morning.”

The difference, of course, is that dying is an addiction from which there is no recovery. But the similarity is this. Dying is a mental discipline, even if the goal is not to be clean and sober, but simply to be ready.

Wednesday, February 12, 2014

The greatest debuts

In English-language prose fiction, that is. Here without much further ado or explanation is the list of the twenty-five greatest literary debuts, which I posted to Twitter earlier today. John Wilson of Books and Culture asked me to put the list in one place, and so I have.

As Darin Strauss recognized, the list is something of a jeu, recklessly tossing together great books that happened to be first books along with books that defined (and, in some cases, foreshortened) a literary career. (A couple of changes have been made to the original list, removing Charles Portis’s True Grit—in actuality, his second novel—and Joyce’s Dubliners and including Invisible Man, which I unaccountably overlooked the first time around.) At all events, the titles on this list are characterized as much by splash as by merit.

  1. Samuel Richardson, Pamela (1740)
  2. Charlotte Brontë, Jane Eyre (1847)
  3. Kingsley Amis, Lucky Jim (1954)
  4. Joseph Heller, Catch-22 (1961)
  5. Ralph Ellison, Invisible Man (1952)
  6. William Golding, Lord of the Flies (1954)
  7. Charles Dickens, The Pickwick Papers (1836–37)
  8. J. D. Salinger, The Catcher in the Rye (1951)
  9. Margaret Mitchell, Gone With the Wind (1936)
10. Thomas Wolfe, Look Homeward, Angel (1929)
11. Theodore Dreiser, Sister Carrie (1900)
12. Walker Percy, The Moviegoer (1961)
13. Ken Kesey, One Flew over the Cuckoo’s Nest (1962)
14. Thomas Pynchon, V. (1963)
15. Philip Roth, Goodbye, Columbus (1959)
16. John O’Hara, Appointment in Samarra (1934)
17. Marilynne Robinson, Housekeeping (1980)
18. Carson McCullers, The Heart Is a Lonely Hunter (1940)
19. Harper Lee, To Kill a Mockingbird (1960)
20. Flannery O’Connor, Wise Blood (1952)
21. John Kennedy Toole, A Confederacy of Dunces (1980)
22. Raymond Chandler, The Big Sleep (1939)
23. Henry Roth, Call It Sleep (1934)
24. Michael Chabon, The Mysteries of Pittsburgh (1988)
25. (tie) Erica Jong, Fear of Flying (1973)
       (tie) Donna Tartt, The Secret History (1992)

Update: Honorable mention (that is, suggestions from readers)—Emily Brontë, Wuthering Heights (1847); Anita Loos, Gentlemen Prefer Blondes (1925); James Jones, From Here to Eternity (1951); Richard Yates, Revolutionary Road (1961); George V. Higgins, The Friends of Eddie Coyle (1970); Tom Wolfe, The Bonfire of the Vanities (1987); Zadie Smith, White Teeth (2000); Jhumpa Lahiri, Interpreter of Maladies (1999); ZZ Packer, Drinking Coffee Elsewhere (2003); Chad Harbach, The Art of Fielding (2011).

Update, II: Patrick Kurp’s additions (in his order): Stevie Smith, Novel on Yellow Paper; Laurence Sterne, Tristram Shandy; Philip Larkin, Jill; Herman Melville, Typee; Anthony Powell, Afternoon Men; Evelyn Waugh, Decline and Fall; Tobias Smollett, The Adventures of Roderick Random; Ivy Compton-Burnett, Pastors and Masters; and Henry Green, Blindness.

Tuesday, February 04, 2014

Bibliographing the ’sixties

You might say that my bibliography of ’sixties fiction has been a lifetime in the making. The first hardback book that I ever bought with my own money was Allen Drury’s 1968 novel Preserve and Protect, the fourth and last volume of the tetralogy about American politics that Drury had begun with his Pulitzer Prize-winning Advise and Consent. (Spoiler: Drury never duplicated the mastery of that first volume.)

By the next year I was bolting down Portnoy’s Complaint and writing a celebration of it for Ramona High School’s literary magazine. (The faculty adviser rejected it on the basis of its sensational subject matter and even more sensational language.) I came of age on the fiction of the ’sixties—Roth, Bellow, Malamud, Stanley Elkin, Wright Morris, Walker Percy, Peter De Vries, J. F. Powers, Mark Harris, Evan Connell, Thomas Berger, E. L. Doctorow, Maureen Howard, Wilfrid Sheed, R. V. Cassill, John Barth, Joan Didion, even Madison Jones. These were the writers who lined the bookshelves of my early self-education. I filled my head with useless details about publication order and publishing houses and copyright dates. Sitting down to compile my bibliography four decades later, I found myself doing much of the work from memory.

One reason I wanted to compile it was to leave a record of, even a testament to, my useless literary learning. I am struggling against self-pity when I say that learning is no longer considered the sine qua non of the scholar, especially not in English departments. At one time, as J. V. Cunningham wrote in a 1964 Carleton Miscellany symposium on graduate education in English, bibliography was numbered among the specialized disciplines of literary study—that is, every literary scholar was assumed to be a capable hand at it, if not an adept. Now, however, what is prized in English departments is theoretical sophistication, interpretive cunning; being up to the minute, but not necessarily knowing “the impervious facts/ So well you can dispense with them” (to quote again from Cunningham), is what is sought in the bright young hires.

No one will ever again accuse me of being bright or young. As a dinosaur, though, perhaps I am in a good position to watch the meteor of an unsustainable economic model wipe out the last of my species. As Clay Shirky wrote in a brilliant essay last Wednesday, “The [university] faculty has stopped being a guild, divided into junior and senior members, and become a caste system, divided into haves and have-nots.” And nothing distinguishes the haves from the have-nots except for tenure—certainly not learning and not even theoretical sophistication. The idea that American society will go on indefinitely subsidizing an elite caste of low-responsibility intellectuals, who demand the leisure to teach advanced subjects while underpaid assistants perform the hard work of educating most of the undergraduate students in a university, is absurd.

Dedicated only to preserving its leisure and elite status, the university faculty has betrayed the ideal of learning. The “higher education bubble” (as Glenn Harlan Reynolds calls it) will burst. The university caste system will be swept away, along with the last subsidies for the last remaining scholars. At that point, scholarship will operate on the model of the blog—it will be a gift offered to an indifferent world in the hope that someone else might value it as highly as I myself, for example, value the fiction of the ’sixties.

Tuesday, January 28, 2014

The romance of certain old books

Among the distempers of learning in our day is the habit of reading canonical fiction as if it were the only fiction in existence. In the recent n+1 pamphlet No Regrets, for example, the blogger and novelist Emily Gould complains about the “midcentury misogynists”—Bellow, Kerouac, Mailer, Roth, Updike—who populate what Amanda Hess describes in Slate as the “hypermasculine literary canon.”

The unexamined assumption is that misogyny was the stock-in-trade of these “midcentury” writers. No one feels obligated to defend the proposition nor even to examine the misogyny in any detail. What becomes clear, in leafing through women writers’ grievances against the books that “rejected” them, is that male novelists from an earlier generation are being judged by an anachronism, a criterion they could not possibly have known—feminism’s current dictates about respect for women. The moral complacency and self-congratulation implicit in the judgments worry exactly no one.

But “presentism” or “present-mindedness” is merely one fallacy behind such exercises in reading current moral fashion back into literary history. Just as bad is the radical abbreviation of an entire age’s literature by studying only those figures who now appear to be dominant. For an account of “midcentury” literary misogyny to have any validity whatever, more than a handful of writers will have to be read. (One is struck by how often the name of Philip Roth comes up, as if the postwar era should be known as the Age of Roth.)

You will familiarize yourself with all manner of generalizations about postwar American fiction in 21st-century literary journalism without ever encountering the names of Paul Horgan, Allan Seager, Willard Motley, Wright Morris, William Bradford Huie, Hortense Calisher, William Eastlake, J. F. Powers, John Leggett, George P. Elliott, Mary Lee Settle, Isaac Rosenfeld, James B. Hall, Thomas Gallagher, R. V. Cassill, Mario Puzo, Oakley Hall, Warren Miller, John Williams, Vance Bourjaily, Mark Harris, Chandler Brossard, Harry Mark Petrakis, Herbert Gold, Evan S. Connell Jr., Thomas Berger, Leo Litwak, Jack Matthews, Alison Lurie, Wallace Markfield, Edward Lewis Wallant, or Richard Yates.

I can’t be alone (can I?) in finding something romantic about the “forgotten” or “neglected” books of the past. While I love Bellow and Roth as much as the next critic—more, probably, since I named a son after Bellow—it is precisely their importance to me, their centrality in my thinking, that makes me want to know (in Trilling’s phrase) the “hum and buzz of implication” from which they emerged. I don’t read their books to feel accepted or rejected, to have my lifestyle choices affirmed, but to appreciate their distance from me, their difference. And no method of literary study is more effective at making them strange again—those natives of the foreign country that is the past—than understanding them as conventional (or not) for their times.

Books could be time machines, but rarely are. They are sadly familiar to us, because they are canonical; that is, because we read them in the present, with the standards and expectations of the present, as towering figures of the present. To be borne into the past, boats beating against the current, the best books are those which are least familiar: the books no one is assigned on any syllabus, the books discussed in no classroom. If nothing else, you have to read these “forgotten” or “neglected” books in editions from the period in which they were originally published, since many of them have never been reprinted. The cover art, the dust-jacket copy, the yellowing pages, the formal typography, the out-of-fashion author photos—even as physical objects, the books are visitors from another time and place.

Besides, there is the intellectual challenge in deciding for yourself whether a book is any good. The celebrated titles of this publishing season are surrounded by publicity; even an independent judgment sounds like an echo of the blurbs. And no one is ever surprised if you like Roth (or don’t). But what about Allan Seager or James B. Hall? Will Amos Berry or Racers to the Sun repay your time, or only waste it? Are you willing to accept the risk of recommending either of them to a friend? If you take seriously the adventure of reading you must involve yourself, sooner or later, in the romance of certain old books.

Monday, January 27, 2014

Five Books of cancer

A season into my sixth year of living with Stage IV metastatic prostate cancer, I am finally writing a book on the experience. Or, rather, what began as a wider-ranging memoir has refocused itself as a cancer book. My working title is Life on Planet Cancer (© the author). A literary critic by profession, I will be including some glances at the very best cancer writing. Where, then, if I were advising readers, would I begin on the literature?

• Mark Harris, Bang the Drum Slowly (1956). Harris tried his best to convince people that his second Henry Wiggen novel after The Southpaw (1953) was not a baseball novel. He was unsuccessful, largely because his descriptions of baseball prefer the plain speech of inside dope to syrupy lyricism (“The damn trouble [with hitting] is that knowing what is coming is only half the trick”). Harris’s story is about a third-string catcher on a major league team who is diagnosed with Hodgkin’s lymphoma when the disease was still incurable (the five-year survival rate is now above 80%). A Southerner who is prone to ignorance and racism, Bruce Pearson is the butt of cruel fun on the team until the news of his cancer slowly spreads through the roster, bringing the New York Mammoths together. Bruce’s attitude toward his own illness, lacking in self-pity, is pitch perfect. And its effect on hardened professional athletes, who do not permit any softness or sentimentality in their lives, is utterly convincing. The result may be the best single account of a death from cancer ever written.

• Peter De Vries, The Blood of the Lamb (1961). If Harris’s is not the best account of a death from cancer ever written then De Vries’s is. Many readers will prefer De Vries’s, because it is the more profound. (I will not shy from the word if you won’t.) Based on the death of De Vries’s own ten-year-old daughter Emily from leukemia, The Blood of the Lamb is the work of a deeply religious man, a Calvinist, who believed that God need not exist to save us. This wintry faith, as Martin Marty calls it in A Cry of Absence (another fine cancer book), a faith intimate with God’s absence, is strange and unfamiliar to most Americans, who are more used to the flush-faced, hallelujah, pass-the-collection-plate religious conviction of evangelicalism. Don Wanderhope, De Vries’s narrator, the father of the dying girl, concludes that “man’s search for meaning” is doomed to disappointment. But if “Human life ‘means’ nothing” it doesn’t follow that it is without truth. “Blessed are they that comfort, for they too have mourned, may be more likely the human truth”—this is Wanderhope’s conclusion in his desperate grief. One of the most eviscerating books you will ever read.

• Aleksandr Solzhenitsyn, Cancer Ward (1968). The first thing everyone says about Solzhenitsyn’s great 500-page novel is that it treats cancer as a metaphor for the totalitarian state. Perhaps it is time to turn the commonplace inside out: totalitarianism is, for Solzhenitsyn, a metaphor for cancer. He himself suffered from an undiagnosed cancer in the early ’fifties while incarcerated in a camp for political prisoners in Kazakhstan. Cancer is, he writes in the novel, a “perpetual exile.” There is no returning from it to a happy life of uncomplicated freedom. A peculiarly Russian vision, reeking of Dostoyevskian tragedy and pessimism? (Also the emotional byproduct of a third-rate medical system, which saved few and palliated the suffering of even fewer?) Yes, and all the more worth being soaked in as a consequence. The popular American attitude toward cancer is a dance to a sentimental tune about hope.

• Siddhartha Mukherjee, The Emperor of All Maladies: A Biography of Cancer (2010). An oncologist and cancer researcher at Columbia University Medical Center, Mukherjee (no relation to the novelist Bharati Mukherjee) gave his 470-page book a misleading subtitle. The Emperor of All Maladies is less cancer’s life-story than an informal and anecdotal survey of cancer research and treatment since the Second World War. Although it would have been improved by a tighter structure and perhaps a more exhaustive aim, its engaging tone and focus on the personalities involved in the “war on cancer” guaranteed the book a Pulitzer Prize. There is, however, no reason to read it from cover to cover. Like an oral history, it can be read a chapter here and then a chapter fifty pages on without loss or confusion. Mukherjee is good at cramming information into small spaces and clarifying the sometimes daunting language of medicine for general readers. He succeeds in his ambition to make cancer research into a modern adventure, and if this is not the same as writing the biography of cancer, it is as close as we are likely to get for a while; and not without value and pleasure.

• Christopher Hitchens, Mortality (2012). First diagnosed with esophageal cancer in June 2010, Hitchens died a year and a half later. In his last months he wrote a series of seven articles for Vanity Fair on his experience. These were collected and published in a short 93-page book along with some pages of notes toward a last unfinished article, which should probably have been discarded. The essays are characterized by Hitchens’s distinctive brand of honesty (“the novelty of a diagnosis of malignant cancer has the tendency to wear off”) and a unique ability to notice things that other writers on cancer have overlooked (for a cancer sufferer, for example—Hitchens’s preferred term—undergoing blood tests goes from being an easy routine to a painful ordeal). No other cancer book has quite the tone of immediacy that Hitchens’s has.

There are several memoirs that might also be mentioned, especially Lucy Grealy’s Autobiography of a Face (1994), Reynolds Price’s A Whole New Life (1994), Gillian Rose’s Love’s Work (1995), and Wilfrid Sheed’s In Love with Daylight (1995), and they are perhaps the next books that should be read. Philip Roth’s Patrimony (1991) is about the suffering from cancer as watched helplessly from outside—Roth’s father Herman died of a brain tumor. The American poets L. E. Sissman and Jane Kenyon, both of whom died from cancer, wrote sharply and movingly of the disease. And I have discussed Anatole Broyard’s Intoxicated by My Illness (1993) at some length elsewhere, because Broyard died of the same cancer I am living with. If I could bring only five books with me to the hospital, though, these are the five I would bring.

Entrepreneurs of the spirit

Will Wilkinson laments the decline of “old school blogging,” the original style of blogging—before the media outlets launched their group blogs and bought up the first-generation “personal bloggers”—in which the blogger composed a self, day by day, “put[ting] things out there, broadcast[ing] bits of [his] mind,” and in return finding a place for himself “as a node in the social world.”

What Wilkinson has to say about the self is provocative and largely true, I think. The self is a convergence of loyalties and enthusiasms and beliefs and habits. That there is a “stable” self, which persists through the flux of illness and health and better and worse, is an “illusion.” Wilkinson’s best line is that the “self is more like a URL,” an “address in a web of obligation and social expectation.”

But I am even more interested in what Wilkinson has to say—or suggest, really—about the economics of blogging. “Old school blogging,” as he calls it, belongs to a “personal gift economy.” The blogger gives away his reflections, and “in return maybe you get some attention, which is nice, and some gratitude, which is even nicer.”

The minute a blogger joins the staff of a magazine, though, everything changes. Everyone likes to get paid for what he does—I am no exception—but blogging for pay changes forever the blogger’s relation to his audience. The “web of obligation and social expectation,” into which the blogger-for-free inserts himself, is narrowed and focused. In reality, his audience shrinks to one (or, at most, a handful): his boss or bosses.

When the blogger becomes a “channel” for a media organization (to use Wilkinson’s term for it), he must adhere to more than the house style. He must also trim his judgment to suit the editorial fashions of his employer. Even where the blogger thinks of himself as a member of the magazine’s family, as I thought of myself at Commentary, conflict is inevitable.

As a literary practice, blogging is fundamentally an exercise of intellectual independence. A blogger no more thinks in a house style than Thoreau did in writing his journal. Writing as a staff member of a magazine, though (even when, as I did at Commentary, you are writing a one-person blog), you must second-guess yourself with regularity, asking whether you are setting yourself, even if accidentally, at odds with editorial policy.

One of the incidents that soured my working relationship with John Podhoretz, Commentary’s editor, was when I reviewed Hillel Halkin’s novel Melisande! What Are Dreams? Halkin’s novel was released in England by Granta, but was not being published in America. It never occurred to me that this would be an issue—Halkin was a longtime contributor to the magazine, the novel was brilliant and memorable—but Podhoretz was justifiably annoyed with me, because the magazine’s policy was not to review books that are not published in this country.

In taking on Halkin’s novel, I acted like a blogger, not a staff writer. I failed to recognize that, when you write for pay, you no longer write for yourself. To the reading public, you do not even write under your own name. When I praised Stone Arabia in a review on the Literary Commentary blog, Dana Spiotta’s publisher whipped my praise into a blurb and attributed it to Commentary. My name went poof!

The other day a trio of journalism students at Ohio State University came by to interview me for a class project. “What would you say to my generation about the future of journalism?” one of them asked to wind up the interview. “I’d say the future is both exciting and frightening,” I replied—“or maybe that’s the same thing.” The internet has made it possible for anyone to set up as a journalist—that is, to write regularly, on any subject that catches a fancy, as if keeping a journal. No one can tell anyone else what to write or not to write, or in what style. The marvel is freedom. The problem, as always, is how to monetize the work.

I had no practical solutions, beyond repeating the naïve ’sixties slogan “If you do the right thing money will come” and telling them about the novelist Roland Merullo, who worked as a carpenter while writing his first novels. Complete editorial freedom is available for perhaps the first time in the history of journalism, I told the students—but only if they were willing (God help me) to become “entrepreneurs of the spirit” and not employees.

Whether the “old school” and “personal” bloggers can return to their first spiritual entrepreneurship, after having their literary thinking altered forever by writing for pay, is a question that may concern more than themselves alone. The answer may also suggest something about the future of journalistic freedom.

Wednesday, January 15, 2014

Reply to critics of “Academe Quits Me”

A few days ago, the economist Thomas Sowell found himself obligated to write an op-ed column in which he pointed out that “trickle-down economics”—the economic policy of the political right, according to the political left—is non-existent. It is attacked widely on the left, Sowell observed, but “none of those who denounce a ‘trickle-down’ theory can quote anybody who actually advocated it.”

I thought of Sowell’s column yesterday when I studied the readers’ comments on my essay “Academe Quits Me,” reprinted at Inside Higher Ed. No fewer than eight, nine, ten commentators were quick to denounce me for “rehash[ing] the canon wars of a previous generation,” in the words of one, or “likening [my] experience of being let go with the Grand Fall of the English Canon,” as another said.

I’ve reread my essay closely several times now and for the life of me I can’t find the word canon anywhere in it. Is it possible I wrote the word in my sleep? One or two commentators acknowledged (though they couldn’t bring themselves to say so outright) that I never actually wrote what I was being denounced for writing. But the denunciations were valid anyhow, because my essay, in the words of one commentator, “sounds like someone who feels that English departments should only teach courses that discuss White men and eurocentric studies,” and “Myers implied that voices and opinions should be excluded,” as another said.

By the magic of sounds like and implies, a text can be made to say anything the critic wants it to say! I can’t think of a stronger case for improving the teaching of English than the example of such wild-eyed readers, who project their bogies and night sweats into texts that spook them.

Even if it is their habit to express themselves in talking points and received ideas, though, it doesn’t follow that everyone else lives by the same habit. I have been writing publicly for more than a quarter century now, and nowhere in anything I have written do I call for a return to the canon. I mean nowhere. If there has been one consistency in my writing it has been this. For more than two-and-a-half decades I have dissented from both sides in the canon wars.

One of my first published essays—published in the Sewanee Review in 1989 as I was just beginning my academic career—was called “The Bogey of the Canon.” The title summarizes my argument. To spell it out further:

To the revising of canons there is no end. But the canon, the “old canon,” the “patriarchal canon,” the “restricted, canonical list,” the “fixed repertory”—this is a bogey. It has never existed. It has merely changed, from critic to critic and generation to generation; it bears no marks of persistence as well as change. . . . Those who fear canons have seen a pattern where there is only randomness, and have mistaken a selection for a principle. The name they have given to this is “the canon,” but there is not enough of an identity among canons for there to be any one canon. It cannot be said to be a substantial entity.

In light of the comments to my essay yesterday, I’d go one step farther now. The canon is the name by which calls for the restoration of order and coherence to literary study are misunderstood in advance and rejected out of hand without additional examination.

What I actually wrote in “Academe Quits Me” is that academic literary study is no longer a “common pursuit.” It does not represent a “common body of knowledge.” It lacks “common disciplinary conceptions.” Does it say more about me or about my commentators that the only common pursuit they can imagine, the only common disciplinary conception, is a “canon” of “dead white males”?

Most English professors secretly know that I am right, however, even if they would never permit themselves to say so publicly. In my teaching, I have learned that I cannot assume any common background knowledge, not even in English majors.

Last spring I taught one of those boutique courses that could have been offered at the University of Minnesota this semester: an honors seminar on Evil in the Postwar American Novel. Among the books I taught was Cormac McCarthy’s Blood Meridian. I began the discussion by raising the question of Faulkner’s influence upon McCarthy. My students looked at me blankly. “How many of you have read Faulkner?” I asked. No one raised a hand. “How many of you have heard of Faulkner?” Three hands went up. In an upper-division seminar on Philip Roth, pretty much the same thing. Not one student had read Saul Bellow.

In “Academe Quits Me,” I warn that the loss of a common tradition in English study leaves every English professor exposed. No one is indispensable to a university, because no curricular subject, no great author, is indispensable. When he was in college not long ago, a younger friend wrote to me privately yesterday, “you could take Shakespeare’s Treatment of Women, but not Shakespeare.”

I’m not opposed to the inclusion of underrepresented voices—in principle I’m not even opposed to film studies as a part of English—but my critics have failed to grasp my warning. Where nothing is considered essential knowledge, then nothing (not even film or underrepresented voices) is guaranteed a niche in the world of institutionalized scholarship. What my tenured colleagues fail to realize is that their sense of having a secure and permanent place in English is an illusion created by tenure. Nothing else protects them, because nothing else they contribute to scholarship or the academic community is considered necessary, not even by them. They themselves, by acquiescing in the loss of the common pursuit, have made themselves superfluous.

And if they think that a university cannot take away their salaries and their offices while continuing to recognize their tenure—if they think that entire English departments cannot be eliminated—they had better think again. Because such things have already happened at more than one university in this country.

Wednesday, January 08, 2014

Academe quits me

Tomorrow I will step into a classroom to begin the last semester of a 24-year teaching career. Don’t get me wrong. I am not retiring. I am not “burned out.” The truth is rather more banal. Ohio State University will not be renewing my three-year contract when it expires in the spring. The problem is tenure: with another three-year contract, I would become eligible for tenure. In an era of tight budgets, there is neither money nor place for a 61-year-old white male professor who has never really fit in nor tried very hard to. (Leave aside my heterodox politics and hard-to-credit publication record.) My feelings are like glue that will not set. The pieces fall apart in my hands.

This essay is not a contribution to the I-Quit-Academe genre. (A more accurate title in my case would be Academe Quits Me.) Although I have become uncomfortably aware that I am out of step with the purposeful march of the 21st-century university (or maybe I just never adjusted to Ohio State), gladly would I have learned and gladly continued to teach for as long as my students would have had me. The decision, though, was not my students’ to make. And I’m not at all sure that a majority would have voted to keep me around, even if they had been polled. My salary may not be large (a rounding error above the median income for white families in the U.S.), but the university can offer part-time work to three desperate adjuncts for what it pays me. A lifetime of learning has never been cost-effective, and in today’s university—at least on the side of campus where the humanities are badly housed—no other criterion is thinkable.

My experience is a prelude to what will be happening, sooner rather than later, to many of my colleagues. Humanities course enrollments are down to seven percent of full-time student hours, but humanities professors make up forty-five percent of the faculty. The imbalance cannot last. PhD programs go on awarding PhD’s to young men and women who will never find an academic job at a living wage. (A nearby university—a university with a solid ranking from U.S. News and World Report—pays adjuncts $1,500 per course. Just to toe the poverty line a young professor with a husband and a child would have to teach thirteen courses a year.) If only as retribution for the decades-long exploitation of part-time adjuncts and graduate assistants, nine of every ten PhD programs in English should be closed down—immediately. Meanwhile, the senior faculty fiddles away its time teaching precious specialties.

Consider some of the undergraduate courses being offered in English this semester at the University of Minnesota:

• Poems about Cities
• Studies in Narrative: The End of the World in Literature & History
• Studies in Film: Seductions: Film/Gender/Desire
• The Original Walking Dead in Victorian England
• Contemporary Literatures and Cultures: North American Imperialisms and Colonialisms
• Gay, Lesbian, Bisexual, and Transgendered Literature: Family as Origin and Invention
• Women Writing: Nags, Hags, and Vixens
• The Image on the Page
• Bodies, Selves, Texts
• Consumer Culture and Globalization
• The Western: Looking Awry
• Dreams and Middle English Dream Visions

To be fair, there are also four sections of Shakespeare being offered there this semester, although these are outnumbered by five sections of Literature of Public Life (whatever that is). Maybe I’m missing something, but this course list does not make me salivate to enroll at Minnesota the way that Addison Schacht salivates to enroll in classics at the University of Chicago in Sam Munson’s 2010 novel The November Criminals:

I could study the major texts of Latin literature, to say nothing of higher-level philological pursuits, all the time. Do you know how much that excites me? Not having to do classes whose subjects are hugely, impossibly vague—like World History, like English [like Literature of Public Life]. You know, to anchor them? So they don’t dissolve because of their meaninglessness? I’ve looked through the sample [U of C] catalog. Holy fuck! Satire and the Silver Age. The Roman Novel. Love and Death: Eros and Transformation in Ovid. The Founding of Epic Meter. I salivated when I saw these names, because they indicate this whole world of knowledge from which I am excluded, and which I can win my way into, with luck and endurance.

That’s it exactly. The Minnesota course list does not indicate a whole world of knowledge. It indicates a miscellany of short-lived faculty enthusiasms.

More than two decades ago Alvin Kernan complained that English study “fail[s] to meet the academic requirement that true knowledge define the object it studies and systematize its analytic method to at least some modest degree,” but by then the failure itself was already two decades old. About the only thing English professors have agreed upon since the early ’seventies is that they agree on nothing, and besides, agreement is beside the question. Teaching the disagreement: that’s about as close as anyone has come to restoring a sense of order to English.

In 1952, at the height of his fame, F. R. Leavis entitled a collection of essays The Common Pursuit. It was his name for the academic study of literature. No one takes the idea seriously any more, but neither does anyone ask the obvious follow-up. If English literature is not a common pursuit—not a “great tradition,” to use Leavis’s other famous title—then what is it doing in the curriculum? What is the rationale for studying it?

My own career (so called) suggests the answer. Namely: where there is no common body of knowledge, no common disciplinary conceptions, there is nothing that is indispensable. Any claim to expertise is arbitrary and subject to dismissal. After twenty-four years of patiently acquiring literary knowledge—plus the five years spent in graduate school at Northwestern, “exult[ing] over triumphs so minor,” as Larry McMurtry says in Moving On, “they would have been unnoticeable in any other context”—I have been informed that my knowledge is no longer needed. As Cardinal Newman warned, knowledge really is an end in itself. I fill no gap in the department, because there is no shimmering and comprehensive surface of knowledge in which any gaps might appear. Like everyone else in English, I am an extra, and the offloading of an extra is never reported or experienced as a loss.

I feel the loss, keenly, of my self-image. For twenty-four years I have been an English professor. Come the spring, what will I be? My colleagues will barely notice that I am gone, but what they have yet to grasp is that the rest of the university will barely notice when they too are gone, or at least severely reduced in numbers—within the decade, I’d say.

Tuesday, November 12, 2013

Lessons in human dignity

Victor Brombert, Musings on Mortality: From Tolstoy to Primo Levi (Chicago: University of Chicago Press, 2013). 188 pages.

Victor Brombert, who just turned ninety, is one of the last great comparativists in literary scholarship. A younger, more present-minded scholar would decide upon his “approach” before starting a book like this, and whether the “approach” is even relevant to his texts would be of less moment than establishing himself, for a few months at least, ahead of the curve. For Brombert, the first question is what books to read. The Death of Ivan Ilych, Death in Venice, “A Hunger Artist” and “The Metamorphosis,” To the Lighthouse, The Garden of the Finzi-Continis, Waiting for the Barbarians, The Plague, and Survival in Auschwitz—the historical stretch (from 1886 to 1980 and later), the linguistic range (Russian, German, French, and Italian in addition to English), are why the comparativist is worth listening to.

Musings on Mortality, the eleventh book in an academic career that began in 1949, is like sitting in on a late-afternoon graduate seminar in the oak-paneled honors room with the comfortable chairs. The distinguished professor emeritus from Princeton, author of books on Stendhal and Flaubert, has no thesis to grind; he is blessedly “atheoretical,” as the graduate students who are impatient for their guild cards tend to complain. He describes the “foreshadowing” in The Garden of the Finzi-Continis, he speaks of Primo Levi’s “telling what [Auschwitz] was like,” without a trace of self-consciousness. He never quotes a text without giving both the original and the translation (usually his own). Indeed, he will not discuss a book unless he can read it in the original language. This self-limitation is not modesty, although its effect is that, but scholarly integrity. The first commandment of comparative literature is that texts must be studied in the original to be understood properly. He remains faithful to the comparative method from first to last.

There are disadvantages to the method. Brombert’s unfamiliarity with Jewish languages and traditional Jewish sources suspends Primo Levi from a significant portion of his literary heritage, and raises questions about Brombert’s knowledge of Holocaust literature. He explains why Levi “chose to devote an entire chapter [in Survival in Auschwitz] to a canto of Dante’s Divine Comedy”—a literary choice that disturbed his students, Brombert reports—concluding that the “recourse to lines of poetry buried in the memory, but not really forgotten, carried a humanistic message.”

How much the analysis would have benefitted from a comparison to another Holocaust memoir! In The Book and the Sword (1996), the Talmudic scholar David Weiss Halivni tells about the day in Auschwitz when he saw an SS guard eating a sandwich “wrapped in a page of Orach Chaim, a volume of the Shulchan Aruch, Pesil Balaban’s edition.” With tears in his eyes, Halivni begs the guard to give him “this bletl, this page,” as a souvenir:

On the Sundays we had off, we now had not only Oral Torah [to study] but Written Torah as well. The bletl became a visible symbol of a connection between the camp and the activities of Jews throughout history. . . . The bletl became a rallying point. We looked forward to studying it whenever we had free time. . . . It was the bletl, parts of which had to be deciphered because the grease made some letters illegible, that summoned our attention. Most of those who came to listen didn’t understand the subject matter, but that was irrelevant. They all perceived the symbolic significance of the bletl.

The comparativist is welcome to prefer the humanistic message, but cut off from “the activities of Jews throughout history,” it begins to feel a little thin and undifferentiated, a synthetic product of the comparativist’s own method. If the reader can accept this limitation—if he can read Brombert’s book in the spirit of its title—Musings on Mortality will succeed on its terms, gently stroking the reader into wonderment.

Thus the confrontation with mortality leads Ivan Ilych “[f]rom self-love to pity and compassion,” a “trajectory” which is “immense.” Thomas Mann warned that the “attraction to the abyss of immensity and darkness, to the unorganized and immeasurable,” conceals a “longing for nothingness.” Kafka toyed with the “idea of liberation through death.” According to Virginia Woolf, art is intimate with death: “It immobilizes the vitally changeable and thereby projects an already posthumous view.” Camus may have been in love with life, but he was forever aware of encroaching death and stressed “the importance of remaining supremely conscious at the point of death.” J. M. Coetzee is “equally elusive and paradoxical” about his own beliefs in the face of death. “I have beliefs,” as one of his characters says, “but I do not believe in them.” Brombert permits his writers to speak for themselves, and if they pull back from the edge of definitiveness, so does he. He excels at summary; he is capable of following the scent of a theme throughout an entire life’s work, flashing the writer’s phrases whenever possible. Each chapter of Musings on Mortality is an education in itself.

Such a book is neither right nor wrong. Although the language breathes heavily sometimes from the academic lifting (“Kafka quickly deconstructs the fabric of his own mythotheological motifs”), this is both unusual for Brombert, who would sooner write in the straightforward tones of paraphrase, and yet also weirdly appropriate. Musings on Mortality is an invitation to learn gladly from a deeply cultured man who would gladly teach. His lesson, to use his own words about Primo Levi, is a “lesson in human dignity.” And among the dignities of man, as Victor Brombert convincingly demonstrates, is the serious discussion of serious literature, which treats it as having something worth saying to those who would only listen.

Tuesday, October 22, 2013

A sapphire anniversary

Sunday was the fifth anniversary of this Commonplace Blog. My very first post, appropriately enough given my sworn allegiance to him, was a review of Philip Roth. Few people read it, although I was happy and relieved to publish it here.

The fall of 2008—I was on sabbatical from Texas A&M University, Hurricane Ike wiped out much of the semester, and all of my interest in my current research came down with the power lines. I had begun a book that I was calling Battle Cry of Theory, a history of French theory’s invasion of English departments from the early ’seventies to the present. But as I felt my time slipping away—I’d been diagnosed with terminal cancer just one year before—suddenly I did not relish the thought of spending my last months over the pages of Paul de Man, Jonathan Culler, J. Hillis Miller, Geoffrey Hartman, and the camp followers of the “Yale critics.”

Or perhaps it was merely that, when my family escaped Houston for a few days at a Jewish youth camp in the Hill Country, it did not occur to me to take any theory along for the ride. Instead I immersed myself in Roth’s new novel Indignation, and having finished it much too quickly, borrowed my wife’s copy of The Brass Verdict by the crime novelist Michael Connelly. Back-to-back reviews to commence my career as a book blogger.

I’d been writing book reviews professionally—that is, for low pay—since 1981, when I reviewed Philip Appleman’s Shame the Devil for New York Newsday. Within two years I had attracted the notice of Mel Watkins, the editor of the New York Times Book Review, who put me to work writing short assessments of the novelists that more prominent critics wanted nothing to do with—Katherine Govier (my first), Sheila Bosworth (my first jacket blurb), Whitley Strieber, Jack Higgins, James Alexander Thom, Ernest K. Gann. When Mr Watkins left the Book Review in 1985 (I could never bring myself to call him “Mel”), the new editor quietly dropped me as a regular contributor.

For the next two decades I reviewed little fiction. My PhD was in the history of criticism, especially the history of American criticism, and The Elephants Teach, my first book, was intended as a contribution to that subject.

My original intent, when I had gone off to Northwestern University, was to write a biographical and critical study of the writers grouped around Yvor Winters—his wife Janet Lewis, his best and most famous student J. V. Cunningham, and writers largely forgotten and not typically associated with him, including John Williams, the author of Stoner. I wanted to bring some attention to obscure poets of moving perfection—Helen Pinkerton, for example—and I planned to call my book Peers of Tradition. The phrase was Cunningham’s. The idea was what set these writers apart.

But though Gerald Graff, my PhD advisor, had himself studied under Winters at Stanford, he vetoed my project. Jerry was working on the book that would become Professing Literature, the first history of English departments in America, and I was enlisted to assist him on the research. He suggested that I write a sort of companion volume. Thus was my story of creative writing workshops, in print for seventeen years now, first conceived.

Until I started A Commonplace Blog five years ago, I didn’t fully realize how gaunt and unhealthy-looking my prose had become under the influence of academic writing. The blog format proved unexpectedly congenial. I had no inkling, when I blindly began, that blogging would be so liberating. Not only was I freed from begging letters to editors (if I wanted to review a book, I could review it without anyone’s permission), but I also no longer had to worry about what the chairman of the English department referred to as “career logic,” wherein every printed word must contribute to the building of a limited but national reputation.

Other than the stray political or scandal-mongering post, which always accumulates more “hits,” my five most popular literary essays of the past five years have been these:

(1.) Review of Tim Winton’s novel Breath, probably because the novel’s subject (surfing) causes my review to pop up in search engines.

(2.) My lament “What Became of Literary History?” which mourns the success of New Criticism in reducing the study of literature to “close reading.”

(3.) “Darlings of Oblivion”—a reflection on cancer and the small struggles of daily living, inspired by a phrase from Nabokov.

(4.) My most popular list—“The 10 Worst Prize-Winning American Novels of All Time.” From Jerzy Kosinski to John Updike.

(5.) A reconsideration of Vladimir Jabotinsky’s Samson, a novel that is hard to find, despite Ruth R. Wisse’s inclusion of it in The Modern Jewish Canon. My essay on it is one of the few in “print.”

That two of the five are reviews or review-essays is oddly cheering. Book pages may be dying (and they never gave their reviewers enough space or pay to begin with), and reader reviews may be squeezing out professional reviewers, but I remain convinced that readers are starved for intelligent and serious book-talk. I am proud to have contributed my share over the past five years.

Friday, October 18, 2013

Remembering JVC

Yesterday the Powerline blog—a politically conservative blog out of the Twin Cities—linked to my essay on Mario Puzo’s novel The Godfather. Over a thousand first-time readers descended upon A Commonplace Blog, although few lingered long enough to poke around in the remains of my literary thought. One who did was the photographer and printmaker William Porter, who had been a classics scholar in another life. From 1979 to 1982, he had held a Mellon postdoctoral fellowship in Renaissance studies at Brandeis University. It was there that he became friends with J. V. Cunningham.

Porter soon discovered Cunningham’s significance to me as well. Four-and-a-half years ago on this blog I published my notes from a course in the history of literary criticism that Cunningham taught at Washington University in St. Louis, where he was the Hurst Visiting Professor in 1976. (I also reproduced a rare early photograph of Cunningham.) And of course I have repeated to anyone who would listen that John Williams’s brilliant minor novel Stoner, a testament to the scholarly life, is based on the life and personality of JVC.

Porter shared his own memories. (He has given me permission to quote them here.) Cunningham, he told me,

ended up writing an important letter for my dossier that helped me a lot when I moved on in 1982. I was also a poet and translator, and in particular fancied myself a writer of epigrams—so I had that to share with Cunningham as well. I got to know him and his lovely wife and visited his house out in Sudbury. I learned more from “hanging out” with Cunningham drinking coffee than I had from nearly any of the teachers with whom I’d studied for semesters or even years.

Our experiences are oddly parallel. I too spent several happy afternoons with Cunningham and his wife Jessie MacGregor Campbell, an Austen scholar recently retired from Clark University, at their home in Sudbury. (Mrs Cunningham never failed to serve me carrot cake. Cunningham would not take a piece. He had given up sugar, he explained. Why? “I found that it was easier,” he said.)

For me too he wrote a recommendation, and though I doubt that it helped me very much—by the time I entered the profession of English in the late ’eighties, he was considered a reactionary by those to whom he was not obscure—the letter was precious to me. I have always wished I could use one line of it as a blurb to all my writing: “Mr Myers,” he said, “writes a prose that is always distinctive, and sometimes even distinguished.” Anyone who knows anything at all about Cunningham knows just how high this praise is. After having such a thing said about me (and by him!), there was no possible way for me to stop writing.

Porter himself turned away from the life of scholarship a decade and a half ago. “I wanted to stop reading other people’s footnotes,” he says, “and didn’t fancy lecturing Honors freshmen on Homer and Sophocles.” I have read few indictments of the humanities at the turn of the century that are more devastating, and in fewer words. Cunningham would have admired its epigrammatic quality. Harried by student complaints that my grades are too low and the Jewish holidays are “too many,” I am tempted to follow Porter into a less puerile life.

Why I stay, though, can be directly attributed to Cunningham. I have described before on this blog a scene from his course in the history of criticism. (Link provided lest my close readers fear that I have forgotten the earlier account.) One day in class, Cunningham asked the dozen or so graduate students enrolled to fill the blank in an epigram by Sir Henry Wotton:

He first deceased; she for a little tried
To live without him, ________, and died.

The other students in the class struggled valiantly to rise to the occasion, devising all manner of poeticisms to satisfy the missing cretic foot. I was dull and embarrassed by my dullness. I wrote in resignation:

He first deceased; she for a little tried
To live without him, went to bed, and died.

Wotton’s original, of course, is far more distinguished:

He first deceased; she for a little tried
To live without him, liked it not, and died.

Cunningham read my pitiful effort aloud to the class and said, “In twenty-five years of teaching, this is the best wrong answer I have ever received.” Porter’s reaction to my anecdote is worth quoting in full:

“The best wrong answer I have ever received.” Sounds just like the man. Seems to say very little, but in fact prompts one (well, if one is attentive) to start wondering about lots of things. That’s what I remember about my conversations with him. There was a lot of silence, but when talking was done, he’d let me do more than my share. This of course encouraged me to think what I was saying must be interesting or important. And then he’d drop some little comment that would keep me awake at night for a week. Without a doubt the most efficient teacher I ever knew.

Yes, exactly. Every word of Cunningham’s was measured. (The pun is intentional.) His speech was as packed and pointed as his famous epigrams. (See here and here and here for examples.) He never belabored a point, because he expected you to reach the understanding, upon further reflection, that what he said was necessary and true.

Cunningham’s comment in class has kept me awake for three and a half decades. Only after corresponding with William Porter, though, did I realize the meaning of his “prompt” in my life.

In one sentence, Cunningham defined the scholarly life. It is not a matter of formulating correct answers, which is something that undergraduates, with their obsession over grades, cannot seem to grasp. It is a matter of so inhabiting other men’s minds, other men’s time, that your wrong answers are very nearly their own thinking.

I have never become disgusted with “other people’s footnotes,” because I have never wasted much attention upon them. I have been distracted by greater minds. Of course, I’ve never had a very successful academic career, and this in part is why. Despite my professional failure, though, I have remained in the university to pursue a scholarly life. And why? Because the difficulty of entering greater minds, whether they are the founders of creative writing or the Roth to whom I keep returning, is a challenge that has never grown stale for me.

There are only so many footnotes that a person can read. There are, however, an inexhaustible number of lines of verse to get almost right.

Tuesday, October 15, 2013

Mario Puzo’s Mafia novel

The reshaping of American literary culture from the early 1950’s to the early 1970’s might be captured in one historical image. James Jones’s From Here to Eternity, the massive blockbuster about the Regular Army in the last months before Pearl Harbor, was awarded the 1952 National Book Award in fiction. Not quite two decades later, Mario Puzo’s The Godfather was not even nominated. Joyce Carol Oates was honored for Them, her long aimless narrative of poor whites adrift in riot-torn Detroit.

Puzo had treated the Mafia in his novel in much the same way that Jones had treated the army—as an autonomous social institution with its pressures for conformity, where there is no place for a man with any real integrity.

From this it does not follow, however, that Puzo’s theme is what Gay Talese described in the Washington Post in reviewing the novel:

Whether men’s ambitions are fulfilled in the arena of politics or banking, business or crime, it makes little difference—the rules are often the same; it is a game of power and money; might makes right; and the most brutal acts are easily justified in the name of necessity and honor. Governments fight world wars for honor, drop atomic bombs for peace, stage bloody brawls for Christ; and the Mafia, on a mini-scale, acts out similar aggressions for similar goals—profit, prestige and justice as they see it.

The Godfather is not, in short, an anti-Vietnam War novel in disguise. Rather, it is a novel that belongs to the same class as From Here to Eternity. It adopts the techniques of literary naturalism—the detailed social observations, the tone of moral detachment, the long sojourn among an underclass—to tell the story of an institution not immediately associated with the degradation of man.*

And like Jones, Puzo fills his pages with man after man—dozens of them, including the occasional woman—all of whom are distinct individuals, with individual histories and traits. No character is introduced without a backstory and a chapter to himself. This is the method of the blockbuster. In 1952 it was possible to win a major American literary award with a naturalistic blockbuster; by 1970 a novel had to be a “holy vessel of the imagination” to receive official recognition.

If it is read at all any more, Puzo’s The Godfather is probably read as the “novelization” of Francis Ford Coppola’s famous film of the same title, which the American Film Institute has rated the third greatest American film of all time. It took Coppola three years to bring the novel to the screen (or about the same length of time that Fred Zinnemann took to film From Here to Eternity). According to literary gossip, Puzo molded and trimmed his work-in-progress to satisfy the demands of Paramount Pictures. If there is any truth to the rumor, however, it is startling that the most important scene in the novel, in which “Don Corleone gave the speech that would be long remembered” and in which “he coined a phrase that was to become as famous in its way as Churchill’s Iron Curtain”—the phrase that inspired S. Neil Fujita’s dust jacket illustration, reproduced on the movie posters—makes it into the film version only in heavily abbreviated form.

After the Don is shot on the streets outside Genco Olive Oil, after Michael Corleone guns down the police captain Mark McCluskey and the drug smuggler Virgil Sollozzo, after Sonny Corleone has been murdered in retaliation, Vito Corleone calls a meeting of New York’s Five Families with “invitations to Families all over the United States” in order to sue for peace. The meeting is filmed by Coppola, and so too is the Don’s speech. But its central passage is not recorded:

Let me say that we [in the Mafia] must always look to our interests. We are all men who have refused to be fools, who have refused to be puppets dancing on a string pulled by the men on high. . . . Who is to say we should obey the laws they make for their own interest and to our hurt? Sonna cosa nostra . . . these are our affairs. We will manage our world for ourselves because it is our world, cosa nostra. And so we have to stick together to guard against outside meddlers. Otherwise they will put the ring in our nose as they have put the ring in the nose of all the millions of Neapolitans and other Italians in this country.

Coppola does not include this speech, because it does not express his message. Coppola’s message is delivered by Al Pacino (in “a part too demanding for him,” according to the late Stanley Kauffmann). When Michael Corleone returns from hiding in Sicily after the murders of Sollozzo and Captain McCluskey, he finally goes to see his old flame Kay Adams.

Michael tells her that he is working for his father now. “But I thought you weren’t going to become a man like your father,” Kay says; “you told me.” “My father's no different from any other powerful man,” Michael replies—“any man who’s responsible for other people, like a senator or president.” “Do you know how naïve you sound?” Kay asks with a smile; “senators and presidents don’t have men killed.” “Oh,” Michael says; “who’s being naïve, Kay?” Or, in other words, Gay Talese had it right after all. The Mafia differs from the U.S. government only in the extent and reach of its power. This is a view that can be enjoyed by libertarian and political radical alike, but it is not the view of Puzo’s novel.

In the novel, Michael’s speech to Kay is rather different:

You’ve got the wrong idea of my father and the Corleone Family. I’ll make a final explanation and this one will be really final. My father is a businessman trying to provide for his wife and children and those friends he might need someday in a time of trouble. He doesn’t accept the rules of the society we live in because those rules would have condemned him to a life not suitable to a man like himself, a man of extraordinary force and character. What you have to understand is that he considers himself the equal of all those great men like Presidents and Prime Ministers and Supreme Court Justices and Governors of the States. He refuses to accept their will over his own. He refuses to live by rules set up by others, rules which condemn him to a defeated life. But his ultimate aim is to enter that society with a certain power since society doesn’t really protect its members who do not have their own individual power. In the meantime he operates on a code of ethics he considers far superior to the legal structures of society.

I’d be tempted to characterize this view as fundamental to Italian fascism if Benito Mussolini had not been an intense and triumphant foe of the Mafia and its “separate authority.” At all events, it is not a view that is affirmed by Mario Puzo. In a small passage tucked away in a seemingly unimportant scene, Puzo makes his own view clear in his own voice. In contrasting Sonny Corleone to his brother-in-law Carlo Rizzi, Puzo writes that Sonny

was a man who could, with the naturalness of an animal, kill another man, while [Carlo] himself would have to call up all his courage, all his will, to commit murder. It never occurred to Carlo that because of this he was a better man than Sonny Corleone, if such terms could be used;

a better man, even if he also beats his wife. (Puzo could not get away with such a distinction in 2013.) The mere fact that Carlo Rizzi recognizes a moral authority that is separate from his own, if only in restraining him from murder, means that he is a moral advance over Sonny. The Mafiosi may consider themselves “far superior” to the rest of society, but by Puzo’s lights, they are lesser men.

Puzo’s prose rarely flashes, but it rarely loses its balance either. The Godfather may not have been the best American novel of 1969, or even the third best (although it is easily better than Oates’s Them and two of the other novels nominated for the National Book Award, Leonard Michaels’s Going Places and Kurt Vonnegut’s dull and tendentious Slaughterhouse-Five), but it remains a novel worth reading, if only for its ambition of copia or completeness.

The Godfather is a full picture of the Mafia, but it does not glamorize it. Puzo represents the Mafia as the social institutionalization of violence. This is not an accidental feature of “refusing to live by rules set up by others,” but its very essence. Nor does Puzo suggest a superficial and sloganeering moral equivalence between the Mafia and governments or businesses. His Mafia is a unique institution that uniquely degrades men, when it does not murder them.
____________________

* Raffi Magarik, a graduate student in English at Berkeley and a regular reader of A Commonplace Blog, writes to register his unhappiness with this phrase: an institution not immediately associated with the degradation of man. I admit to not being entirely pleased with it myself. What I was thinking is that (a.) prior to Puzo’s novel, the Mafia was not usually thought of as a social institution, and (b.) in Mafia fiction, it is more usually associated with beatings and murder than with human degradation (and certainly not the degradation of the men who become Mafiosi).

Magarik mentions W. R. Burnett’s Little Caesar (1929), perhaps the only earlier American novel about the Mafia. It chronicles the rise of Rico (a character modeled on Al Capone) from mob gunman to mob chieftain. From first to last, though, Rico remains a sociopath. He is vain about his hair, proud of his ability with a gun, and fair in splitting the take from robberies with his subordinates. His rise to power does not degrade him, however; he seizes an opportunity and holds on to power through violence. Puzo was the first American novelist who understood the Mafia as something different from a mere criminal gang—a complex social organism with a “separate authority” and its own code of ethics. His Mafia, in fact, differs only in social detail from James Jones’s army.

Magarik concludes, rather brilliantly in my opinion, that your politics may determine whether you prefer Puzo’s Mafia or Coppola’s. Coming to the novel and film from my left, he writes that “the Coppola is better than the Puzo just because the mob seems, on its own terms, too easy a target for naturalistic critique.” As a conservative, I prefer Puzo’s moral vision.

Thursday, October 10, 2013

Alice Munro, the 13th woman

Alice Munro became the first writer whose reputation derives almost exclusively from short stories—and the first Canadian—to win the Nobel Prize in literature.

What is relevant about her for literature, though, escaped the notice of the New York Times, which touted her as “the 13th woman to win the prize.” She is also the nineteenth English-language writer since 1944, the fourth writer in her eighties, the fifteenth avowed leftist (and the fifth in the last decade) to win the literature Prize, but these are not facts worth mentioning in the newspaper of record. That the Times was merely recycling the language on the literature Prize’s homepage (“Alice Munro is the 13th woman awarded the Nobel Prize in Literature so far”) suggests that, for the literary culture, counting by gender is now automatic and unreflective.*

It is true that other Nobel winners have written many distinguished short stories, perhaps most conspicuously Isaac Bashevis Singer. But it is also true that Singer wrote nineteen novels, and in awarding him the Prize in 1978, the Nobel committee cited his “impassioned narrative art which, with roots in a Polish-Jewish cultural tradition, brings universal human conditions to life.” The other great story writers who took home the Prize—Heinrich Böll, Sh. Y. Agnon, Ernest Hemingway—were also prolific novelists who conceived of themselves as novelists.

Alice Munro is the first Nobel winner whose entire career has been devoted to the story. To overlook this fact about her, to consign her to being the thirteenth of her gender rather than the first of her genre, is to overlook her importance in literary history. It is, in fact, to dishonor her.

When she was a young writer, just starting out, Munro hoped (like any fiction writer who hopes for greatness) to write novels. In her modesty, she later claimed that her labors as a wife and mother kept her too busy for anything longer than stories. “A child’s illness, relatives coming to stay, a pileup of unavoidable household jobs, can swallow a work-in-progress as surely as a power failure used to destroy a piece of work in the computer,” she said. “You’re better to stick with something you can keep in mind and hope to do in a few weeks, or a couple of months at most.” As I wrote in my Commentary essay on her last year, though, this modest explanation should also be understood as a sly apologia for the short story, perhaps the ideal form of literature for the busyness of career-driven postmodern lives.

Few have ever been better at the form, and in awarding Alice Munro the literature Prize, the Nobel Committee is also (at last) recognizing the short story’s essential place in modern writing.
____________________

* When I was a newspaper reporter, the recycling of press-release language in a story’s lead would have been cause for a severe dressing down. But perhaps journalistic standards are different now at the Times.

Thursday, October 03, 2013

The unshakable confidence

“He most honors my style,” Whitman wrote in Song of Myself, “who learns under it to destroy the teacher.” The best thing about teaching is not merely the company of young minds, but the opportunity to be instructed by them. In my class at the Ohio State University on The Great Gatsby and the art of criticism, I have repeatedly (and, I’m afraid, rather tiresomely) denounced the commonplace interpretation, advanced by nearly every English teacher across the dark fields of this republic, that the novel dramatizes the conflict between old money and new money. The phrases themselves, I like to point out, do not appear—nothing like the phrases themselves appears—anywhere in Fitzgerald’s text. When one of my impertinent students repeated my claim in another class in which Gatsby is assigned, the professor (my colleague) sputtered, “We don’t believe in the fallacy of authorial intent.” (He clearly meant that he does believe in the intentional fallacy, even if the variant in which he believes is a vulgar misrepresentation of Wimsatt and Beardsley’s actual views.)

Yesterday in class another two of my students demonstrated that they have already outgrown my tutelage and are ready to take on my colleagues without any assistance from me. We are now studying Trimalchio together, that “early version” of Gatsby edited by James L. W. West III. We are having great fun comparing Fitzgerald’s changes from draft to published copy. Reading Trimalchio under the privacy of my lamp, I realized that I was making the unexamined assumption that Gatsby is obviously superior—that every change from draft to published copy was for the better. But there is no warrant for such an assumption. The two versions are different, with different aims and effects. Fitzgerald did not revise the draft for the sake of a generalized and unfocused improvement, but for specific results. Our working hypothesis, I told the class after we examined several of his revisions, is that Fitzgerald changed Trimalchio into The Great Gatsby for the sake of a consistent artistic effect. To be explicit: a tragic effect.

(Trimalchio, by contrast, is a good old-fashioned “marriage plot” in which two men are involved in a power struggle over a woman, who—as in Austen, Eliot, and James—consults her own mind in making her choice between men. Daisy announces to Nick that she is leaving Tom, and a few days later she shows up at Gatsby’s mansion with her suitcases packed. For an entire page, Gatsby rehearses the practical reasons why she cannot leave her husband quite yet. “In other words you’ve got her,” Nick comments—“and now you don’t want her.”)

I asked the class for examples of textual changes so that we might test our working hypothesis. A young woman pointed out that, in the roll call of partygoers that Nick writes in the “empty spaces of a time-table” at the beginning of Chapter IV, every time a character in Trimalchio is said to come “from West Egg” he comes instead from East Egg in the published version of Gatsby (and vice versa). The flip-flops are predictable and uniform: every mention of an Egg goes over from one to the other. The student looked at me expectantly, hoping for a learnèd explanation. I looked back at her blankly. Although I had noticed the flip-flops too, I was stumped by them. Not so another young woman in class. “They demolish the claim that the novel is about old money versus new money,” she said, “because what they show is that it’s completely arbitrary where the money in the book comes from.” So much for the commonplace interpretation, which ought never again be repeated with a straight face.

Fitzgerald’s revisions also show, as I have written before in this space, that the concept of a fixed and unified text is the unspoken presupposition behind all literary criticism, despite the clear evidence that, for writers themselves, texts are forever in a state of becoming. Most critics in our day are estranged from religion, and yet their mental habits derive from the traditional study of the Bible, and the perfect faith that “nothing can be added to [a literary text], nor anything taken from it” (Eccl 3.14).

Monday, September 23, 2013

Magical thinking about death

Anyone who has ever read The Adventures of Tom Sawyer, which used to be pretty much every boy in America, remembers the scene in which Tom, Huck, and Joe Harper attend their own funeral:

As the service proceeded, the clergyman drew such pictures of the graces, the winning ways, and the rare promise of the lost lads that every soul there, thinking he recognized these pictures, felt a pang in remembering that he had persistently blinded himself to them always before, and had as persistently seen only faults and flaws in the poor boys. The minister related many a touching incident in the lives of the departed, too, which illustrated their sweet, generous natures, and the people could easily see, now, how noble and beautiful those episodes were, and remembered with grief that at the time they occurred they had seemed rank rascalities, well deserving of the cowhide. The congregation became more and more moved, as the pathetic tale went on, till at last the whole company broke down and joined the weeping mourners in a chorus of anguished sobs, the preacher himself giving way to his feelings, and crying in the pulpit.

Everyone remembers the scene because it cannily expresses what is among the commonest of human fantasies—the dream of peeking back into life after death to gauge just how much one is mourned and missed.

It’s appropriate the scene should occur in a boys’ book, because the fantasy is destructive of human maturity and the reality principle (which amount to the same thing). Perhaps none of my opinions makes people angrier than my insistence that daydreaming about life after death, whether it takes the form of wish-fulfillment fantasies about one’s own funeral or the delusion that one can ever be released from suffering, is a self-indulgence the dying cannot afford. We don’t encourage our children to believe they can grow up to become superheroes capable of leaping tall buildings in a single bound, and we should not encourage the terminally ill to pin their hopes upon something they will never experience in this lifetime.

To tell them that their suffering will be relieved by death—here, let me help you die—is a lie told for the benefit of the liar, because the dead do not know relief. They don’t know anything. They are dead. The relief is sought by those who must watch the dying suffer, and they will be the only ones to feel it. Relief of suffering, like funeral services, belongs to the living. The dead are excluded from both.

As usual it is Emily Dickinson, the poet laureate of death, who gets it exactly right:

That short — potential stir
That each can make but once —
That Bustle so illustrious
’Tis almost Consequence —

Is the éclat of Death —
Oh, thou unknown Renown
That not a Beggar would accept
Had he the power to spurn —
Her first editors, Mabel Loomis Todd and Thomas Wentworth Higginson, assumed that she was describing a funeral, and hung the title like a wreath on the doorway of her poem. But Dickinson is not saying that a funeral is “the éclat of Death,” its moment of brilliant success, but rather that death’s only achievement, its “unknown Renown” (because no one who knows it can return to bask in it), is a “short potential stir.”

We think too much of death and not nearly enough of dying. There is a reason for that. Dying is a mental discipline, which entails many hours of training in (among other things) the renunciation of fantasies that death will be anything other than it is—the cessation of consciousness—and the bitter facing up to that reality. Those who prefer daydreams of impossible release from what awaits them will leave themselves (and those they love) tragically unprepared for the conclusive Bustle, which is “almost Consequence.”

Wednesday, August 28, 2013

Baz Luhrmann’s final paper

Baz Luhrmann substitutes a high-school English paper for F. Scott Fitzgerald’s novel in scripting his film version of The Great Gatsby, released earlier this year and now available on DVD from Warner Home Video. The conflict between “old money” and “new money” and the symbolism of T. J. Eckleburg’s eyes as the “eyes of God,” those English-class favorites, are carefully enunciated and repeated by the actors just in case an unwary moviegoer might be under the illusion that Luhrmann’s purpose in remaking Gatsby is to scrape off the critical clichés and restore a classic to its original condition. The phrases appear nowhere in Fitzgerald’s text. They are, however, fixed as securely to the popular consciousness as Hamlet’s indecisiveness, which belongs not to Shakespeare’s play but to A. C. Bradley’s 1904 lectures on Shakespearean tragedy. No one who knows the commonplaces needs to read the texts with any attentiveness, because they have already been “read” for him—by general agreement.

The mistakes pile up. As the film opens, Nick Carraway is in a sanitarium, years after the events about to be shown, diagnosed as “morbidly alcoholic.” (In the novel, Nick says, “I have been drunk just twice in my life,” and Tobey Maguire, who plays Nick with demonstrative broadness, even repeats the line—as if oblivious to the nonsense the rest of the film makes of it.) Muttering aloud, Luhrmann’s Nick says, “Back then, we all drank too much. The more in tune with the times we were, the more we drank.” Fitzgerald’s Nick drinks too little, and tunes himself to the times in other ways, but Luhrmann’s mind is on finishing his English paper. He requires explanations, not subtleties. The movie’s Nick talks and talks in a voiceover that goes quickly from being intrusive to annoying. Why is he talking so much? He is addressing the psychiatrist who is treating him. “You see, Doctor,” he says at one point, dying to himself and being reborn as Alexander Portnoy.

The geography of the film is intentionally cartoonish. The Valley of Ashes, instead of being a narrow ash dump about the size of Flushing Meadow Park, is a monstrous waste land that has swallowed Queens whole. In an overhead shot reminiscent of Saul Steinberg’s View of the World from 9th Avenue, the lush green of Long Island ends abruptly in black-and-white, stretching from one edge of the frame to the other, with Manhattan glittering beyond it in the distance. Despite being a waste land, though, it is crawling with people. There are so many people milling around Wilson’s garage, in fact, that Gatsby is lucky he doesn’t hit someone long before his car runs over Myrtle Wilson.

Perhaps the worst thing about the film is Gatsby’s parties. Luhrmann has himself confused with Flo Ziegfeld. The parties are theatrical extravaganzas with chorus girls dancing in unison, dueling orchestras, announcers bellowing into microphones, streamers and confetti falling from the ceiling as if at a political convention, explosions of fireworks that must have kept the neighbors awake every night, and hundreds upon hundreds of guests packed so tightly in Gatsby’s rooms that they look like squirming maggots when viewed from above.

In the novel, when Nick attends one of Gatsby’s parties for the first time, he finds himself in conversation with “two girls in twin yellow dresses” and “three Mr. Mumbles,” all of them guessing at the truth about Gatsby. They lean forward confidentially and whisper to one another: “Somebody told me they thought he killed a man once,” “he was a German spy during the war.” Nick reflects knowingly: “It was testimony to the romantic speculation he inspired that there were whispers about him from those who had found little that it was necessary to whisper about in this world.” At Luhrmann’s parties, the guests find it necessary to shout. Anyone who whispered a romantic speculation would not have been heard even by himself.

In researching his final paper, Luhrmann must have learned that Fitzgerald planned originally to call his novel Trimalchio after the character in Petronius’ Satyricon who is famous for his immoderate dinner parties. Not that Luhrmann knows anything about Petronius. As a filmmaker with ambitions to greatness, though, he surely knows Fellini’s Satyricon. The parties in his film owe a deeper and more obvious debt to Fellini than to anything in Fitzgerald. The riots of sight and sound are proof merely that Luhrmann can do Fellini in the twenty-first century. They are the Folies Luhrmann, fantasies of pure excessive spectacle that have nothing whatever to do with the plot of Gatsby. In the novel, Gatsby throws his parties in the hope that Daisy will wander in one night. In the film, Gatsby would not be able to pick Daisy out of the swarm, even if she did happen to wander in.

But, really, I have been saving the worst for last. The worst is Luhrmann’s decision to make Nick Carraway into a writer. The “text” of his film is a large blank book that Nick’s psychiatrist gives him as part of his cure. “Write about it,” his doctor says. “You said yourself that writing brought you solace.” Nick has already admitted that he wanted to be a writer when he was at Yale. He picks up a green hardback copy of Ulysses to drive the point home, even though Ulysses was not published until 1922, the same year as the events in Gatsby. But why let an anachronism stand in the way of reconceiving Nick Carraway as a modernist genius?

I can think of two reasons. First, if Nick is a writer with visions of Joyce dancing in his head, then he is that most tedious of creatures—the unreliable narrator. Maybe that’s why he can be “morbidly alcoholic” and also “drunk just twice in [his] life.” Whatever he says about himself is not entirely to be trusted. In the novel, Nick says at one point: “Reading over what I have written so far I see I have given the impression that the events of three nights several weeks apart were all that absorbed me”—that is, he has given a false impression. If he is unreliable, though, is the impression false or is the claim about its falsity false? In Luhrmann’s script, this line becomes: “Looking over my story so far, I’m reminded that, for the second time that summer, I was guarding other people’s secrets.” The shift to the word story is unconscious, I would wager, because Luhrmann and his co-author Craig Pearce never for a moment imagined Gatsby as anything other than make-believe. The line about guarding other people’s secrets, which appears nowhere in the book, is also a reminder never to invite comparison with a better writer’s prose.

The second reason is the more important. A few days ago I argued that Nick is not a writer, at least not in the modernist sense, but a kind of confessor who is “privy to the secret griefs of wild, unknown men” like Jay Gatsby. If he is what Elias Canetti called an earwitness to the “intimate revelations” by and about Gatsby—not composing, merely listening—then the splendor of the book’s prose belongs, not to him, but to Gatsby and his dreams. To take the “creative passion” of the style away from Gatsby and bestow it upon Nick, a “normal person” who can only wonder at “what a man will store up in his ghostly heart,” is to inflate Nick into something he is not, rob Gatsby of his greatness, and get The Great Gatsby wrong at the most fundamental level. I grant you this is what high-school English students are routinely assigned to do, but that doesn’t make it right.

Friday, August 23, 2013

Nick Carraway’s fiction

The discussion of The Great Gatsby begins today in my course on it—an entire course devoted to a 180-page book—and in rereading it, I was struck for the first time by the apparent irrelevance of the opening paragraphs. You’ll remember them. Nick Carraway, who has not yet divulged his name, quotes advice from his father, warning him, in much the same way the Los Angeles Review of Books warns reviewers of first-time authors, not to say anything if he can’t say something nice. Nick reflects:

[My father] didn’t say any more, but we’ve always been unusually communicative in a reserved way, and I understood that he meant a great deal more than that. In consequence, I’m inclined to reserve all judgments, a habit that has opened up many curious natures to me and also made me the victim of not a few veteran bores. The abnormal mind is quick to detect and attach itself to this quality when it appears in a normal person, and so it came about that in college I was unjustly accused of being a politician, because I was privy to the secret griefs of wild, unknown men. Most of the confidences were unsought—frequently I have feigned sleep, preoccupation, or a hostile levity when I realized by some unmistakable sign that an intimate revelation was quivering on the horizon; for the intimate revelations of young men, or at least the terms in which they express them, are usually plagiaristic and marred by obvious suppressions. Reserving judgments is a matter of infinite hope. I am still a little afraid of missing something if I forget that, as my father snobbishly suggested, and I snobbishly repeat, a sense of the fundamental decencies is parcelled out unequally at birth.

What exactly is this passage doing in the novel? On the first page, to boot? What function (if any) does it perform?

Fitzgerald himself, according to Matthew J. Bruccoli’s biography, received eight votes in a poll of Princeton’s class of 1917 for Thinks He Is Biggest Politician (he also received seven votes for Wittiest, and fifteen for Thinks He Is Wittiest).[1] In at least one respect, then, the passage is grounded in autobiography. And perhaps this accounts for the pseudo-intellectual tone of the general observations: “The abnormal mind is quick to detect and attach itself to this quality when it appears in a normal person,” “the intimate revelations of young men . . . are usually plagiaristic and marred by obvious suppressions.” These sound like the grand pronouncements of a recent undergraduate, although Carraway (as we learn later) is just about to turn thirty.

Fitzgerald’s social and academic position at Princeton was anxious and unsettled. Unlike Carraway’s, his family was neither “prominent” nor “well-to-do”: his father was a failure in business. He was one of the few Catholics at Princeton—the only Catholic whom Edmund Wilson knew there—and though he did not flunk out, he never took a degree. Because he was never in good academic standing, he was never able to hold office in the Triangle Club, the undergraduate dramatic guild (he coveted the presidency).[2] Fitzgerald must have been exquisitely sensitive to accusations of “being a politician”—a suck-up, as we might now say, a brown-noser.

What he has done in this early paragraph is to transfigure the social anxiety into the plausible explanation for his narrative—the fiction of the fiction of Gatsby. Few critics have bothered to answer the question that the Amateur Reader recently asked about the novel: “Why is Carraway writing?” How does he account for his 180-page manuscript?

The answer is that Nick Carraway has the “habit” of listening to the “secret griefs of wild, unknown men”—men like Gatsby, for instance. His “inclination” to reserve judgment has repeatedly cast him in the role of onlooker narrator, although perhaps third-party narrator or confessor narrator is more apt in his case. An “intimate revelation” like Gatsby’s is nothing new to him. Why, even the sensation he reports later upon listening to Gatsby’s war stories (“like skimming hastily through a dozen magazines”) is nothing new. Gatsby too offers up revelations that are “plagiaristic” and “marred by suppressions.”

It has generally been recognized that, at least as far as Nick Carraway understands what he is doing, The Great Gatsby is not a novel. “[O]n Gatsby’s side, and alone,” he is “responsible, because no one else was interested.” As far as he knows, he is telling a true story—as true as he can make it. But if he is not writing a novel, what, then, is his writing? He is transcribing an intimate revelation as confessed to a “normal person,” an unremarkable person, who is accustomed to reserving judgment even for the unsought monologues of “curious natures” and “veteran bores.” The fiction behind The Great Gatsby, the device that plausibly explains its place in the world, has rarely been remarked upon, because Fitzgerald has been as masterful in concealing it as in devising it.
____________________

[1] Matthew J. Bruccoli, Some Sort of Epic Grandeur: The Life of F. Scott Fitzgerald, 2nd rev. ed. (Columbia: University of South Carolina Press, 2002), p. 72.

[2] Bruccoli, p. 57.