Tuesday, November 12, 2013

Lessons in human dignity

Victor Brombert, Musings on Mortality: From Tolstoy to Primo Levi (Chicago: University of Chicago Press, 2013). 188 pages.

Victor Brombert, who just turned ninety, is one of the last great comparativists in literary scholarship. A younger, more present-minded scholar would decide upon his “approach” before starting a book like this, and whether the “approach” is even relevant to his texts would be of less moment than establishing himself, for a few months at least, ahead of the curve. For Brombert, the first question is what books to read. The Death of Ivan Ilych, Death in Venice, “A Hunger Artist” and “The Metamorphosis,” To the Lighthouse, The Garden of the Finzi-Continis, Waiting for the Barbarians, The Plague, and Survival in Auschwitz—the historical stretch (from 1886 to 1980 and later), the linguistic range (Russian, German, French, and Italian in addition to English), are why the comparativist is worth listening to.

Musings on Mortality, the eleventh book in an academic career that began in 1949, is like sitting in on a late-afternoon graduate seminar in the oak-paneled honors room with the comfortable chairs. The distinguished professor emeritus from Princeton, author of books on Stendhal and Flaubert, has no thesis to grind; he is blessedly “atheoretical,” as the graduate students who are impatient for their guild cards tend to complain. He describes the “foreshadowing” in The Garden of the Finzi-Continis, he speaks of Primo Levi’s “telling what [Auschwitz] was like,” without a trace of self-consciousness. He never quotes a text without giving both the original and the translation (usually his own). Indeed, he will not discuss a book unless he can read it in the original language. This self-limitation is not modesty, though it has that effect, but scholarly integrity. The first commandment of comparative literature is that texts must be studied in the original to be understood properly. He remains faithful to the comparative method from first to last.

There are disadvantages to the method. Brombert’s unfamiliarity with Jewish languages and traditional Jewish sources cuts Primo Levi off from a significant portion of his literary heritage, and raises questions about Brombert’s knowledge of Holocaust literature. He explains why Levi “chose to devote an entire chapter [in Survival in Auschwitz] to a canto of Dante’s Divine Comedy”—a literary choice that disturbed his students, Brombert reports—concluding that the “recourse to lines of poetry buried in the memory, but not really forgotten, carried a humanistic message.”

How much the analysis would have benefitted from a comparison to another Holocaust memoir! In The Book and the Sword (1996), the Talmudic scholar David Weiss Halivni tells about the day in Auschwitz when he saw an SS guard eating a sandwich “wrapped in a page of Orach Chaim, a volume of the Shulchan Aruch, Pesil Balaban’s edition.” With tears in his eyes, Halivni begs the guard to give him “this bletl, this page,” as a souvenir:

On the Sundays we had off, we now had not only Oral Torah [to study] but Written Torah as well. The bletl became a visible symbol of a connection between the camp and the activities of Jews throughout history. . . . The bletl became a rallying point. We looked forward to studying it whenever we had free time. . . . It was the bletl, parts of which had to be deciphered because the grease made some letters illegible, that summoned our attention. Most of those who came to listen didn’t understand the subject matter, but that was irrelevant. They all perceived the symbolic significance of the bletl.

The comparativist is welcome to prefer the humanistic message, but cut off from “the activities of Jews throughout history,” it begins to feel a little thin and undifferentiated, a synthetic product of the comparativist’s own method. If the reader can accept this limitation—if he can read Brombert’s book in the spirit of its title—Musings on Mortality will succeed on its terms, gently stroking the reader into wonderment.

Thus the confrontation with mortality leads Ivan Ilych “[f]rom self-love to pity and compassion,” a “trajectory” which is “immense.” Thomas Mann warned that the “attraction to the abyss of immensity and darkness, to the unorganized and immeasurable,” conceals a “longing for nothingness.” Kafka toyed with the “idea of liberation through death.” According to Virginia Woolf, art is intimate with death: “It immobilizes the vitally changeable and thereby projects an already posthumous view.” Camus may have been in love with life, but he was forever aware of encroaching death and stressed “the importance of remaining supremely conscious at the point of death.” J. M. Coetzee is “equally elusive and paradoxical” about his own beliefs in the face of death. “I have beliefs,” as one of his characters says, “but I do not believe in them.” Brombert permits his writers to speak for themselves, and if they pull back from the edge of definitiveness, so does he. He excels at summary; he is capable of following the scent of a theme throughout an entire life’s work, flashing the writer’s phrases whenever possible. Each chapter of Musings on Mortality is an education in itself.

Such a book is neither right nor wrong. Although the language breathes heavily sometimes from the academic lifting (“Kafka quickly deconstructs the fabric of his own mythotheological motifs”), this is both unusual for Brombert, who would sooner write in the straightforward tones of paraphrase, and yet also weirdly appropriate. Musings on Mortality is an invitation to learn gladly from a deeply cultured man who would gladly teach. His lesson, to use his own words about Primo Levi, is a “lesson in human dignity.” And among the dignities of man, as Victor Brombert convincingly demonstrates, is the serious discussion of serious literature, which treats it as having something worth saying to those who would only listen.

Tuesday, October 22, 2013

A sapphire anniversary

Sunday was the fifth anniversary of this Commonplace Blog. My very first post, appropriately enough given my sworn allegiance to him, was a review of Philip Roth. Few people read it, although I was happy and relieved to publish it here.

The fall of 2008—I was on sabbatical from Texas A&M University, Hurricane Ike wiped out much of the semester, and all of my interest in my current research came down with the power lines. I had begun a book that I was calling Battle Cry of Theory, a history of French theory’s invasion of English departments from the early ’seventies to the present. But as I felt my time slipping away—I’d been diagnosed with terminal cancer just one year before—suddenly I did not relish the thought of spending my last months over the pages of Paul de Man, Jonathan Culler, J. Hillis Miller, Geoffrey Hartman, and the camp followers of the “Yale critics.”

Or perhaps it was merely that, when my family escaped Houston for a few days at a Jewish youth camp in the Hill Country, it did not occur to me to take any theory along for the ride. Instead I immersed myself in Roth’s new novel Indignation, and having finished it much too quickly, borrowed my wife’s copy of The Brass Verdict by the crime novelist Michael Connelly. Back-to-back reviews to commence my career as a book blogger.

I’d been writing book reviews professionally—that is, for low pay—since 1981, when I reviewed Philip Appleman’s Shame the Devil for New York Newsday. Within two years I had attracted the notice of Mel Watkins, the editor of the New York Times Book Review, who put me to work writing short assessments of the novelists that more prominent critics wanted nothing to do with—Katherine Govier (my first), Sheila Bosworth (my first jacket blurb), Whitley Strieber, Jack Higgins, James Alexander Thom, Ernest K. Gann. When Mr Watkins left the Book Review in 1985 (I could never bring myself to call him “Mel”), the new editor quietly dropped me as a regular contributor.

For the next two decades I reviewed little fiction. My PhD was in the history of criticism, especially the history of American criticism, and The Elephants Teach, my first book, was intended as a contribution to that subject.

My original intent, when I had gone off to Northwestern University, was to write a biographical and critical study of the writers grouped around Yvor Winters—his wife Janet Lewis, his best and most famous student J. V. Cunningham, and writers largely forgotten and not typically associated with him, including John Williams, the author of Stoner. I wanted to bring some attention to obscure poets of moving perfection—Helen Pinkerton, for example—and I planned to call my book Peers of Tradition. The phrase was Cunningham’s. The idea was what set these writers apart.

But though Gerald Graff, my PhD advisor, had himself studied under Winters at Stanford, he vetoed my project. Jerry was working on the book that would become Professing Literature, the first history of English departments in America, and I was enlisted to assist him on the research. He suggested that I write a sort of companion volume. Thus was my story of creative writing workshops, in print for seventeen years now, first conceived.

Until I started A Commonplace Blog five years ago, I didn’t fully realize how gaunt and unhealthy-looking my prose had become under the influence of academic writing. The blog format proved unexpectedly congenial. I had no inkling, when I blindly began, that blogging would be so liberating. Not only was I freed from begging letters to editors (if I wanted to review a book, I could review it without anyone’s permission); I no longer had to worry about what the chairman of the English department referred to as “career logic,” wherein every printed word must contribute to the building of a limited but national reputation.

Other than the stray political or scandal-mongering post, which always accumulates more “hits,” my five most popular literary essays of the past five years have been these:

(1.) Review of Tim Winton’s novel Breath, probably because the novel’s subject (surfing) causes my review to pop up in search engines.

(2.) My lament “What Became of Literary History?” which mourns the success of New Criticism in reducing the study of literature to “close reading.”

(3.) “Darlings of Oblivion”—a reflection on cancer and the small struggles of daily living, inspired by a phrase from Nabokov.

(4.) My most popular list—“The 10 Worst Prize-Winning American Novels of All Time.” From Jerzy Kosinski to John Updike.

(5.) A reconsideration of Vladimir Jabotinsky’s Samson, a novel that is hard to find, despite Ruth R. Wisse’s inclusion of it in The Modern Jewish Canon. My essay on it is one of the few in “print.”

That two of the five are reviews or review-essays is oddly cheering. Book pages may be dying (and they never gave their reviewers enough space or pay to begin with), and reader reviews may be squeezing out professional reviewers, but I remain convinced that readers are starved for intelligent and serious book-talk. I am proud to have contributed my share over the past five years.

Friday, October 18, 2013

Remembering JVC

Yesterday the Powerline blog—a politically conservative blog out of the Twin Cities—linked to my essay on Mario Puzo’s novel The Godfather. Over a thousand first-time readers descended upon A Commonplace Blog, although few lingered long enough to poke around in the remains of my literary thought. One who did was the photographer and printmaker William Porter, who had been a classics scholar in another life. From 1979 to 1982, he had held a Mellon postdoctoral fellowship in Renaissance studies at Brandeis University. It was there that he became friends with J. V. Cunningham.

Porter soon discovered Cunningham’s significance to me as well. Four-and-a-half years ago on this blog I published my notes from a course in the history of literary criticism that Cunningham taught at Washington University in St. Louis, where he was the Hurst Visiting Professor in 1976. (I also reproduced a rare early photograph of Cunningham.) And of course I have repeated to anyone who would listen that John Williams’s brilliant minor novel Stoner, a testament to the scholarly life, is based on the life and personality of JVC.

Porter shared his own memories. (He has given me permission to quote them here.) Cunningham, he told me,

ended up writing an important letter for my dossier that helped me a lot when I moved on in 1982. I was also a poet and translator, and in particular fancied myself a writer of epigrams—so I had that to share with Cunningham as well. I got to know him and his lovely wife and visited his house out in Sudbury. I learned more from “hanging out” with Cunningham drinking coffee than I had from nearly any of the teachers with whom I’d studied for semesters or even years.

Our experiences are oddly parallel. I too spent several happy afternoons with Cunningham and his wife Jessie MacGregor Campbell, an Austen scholar recently retired from Clark University, at their home in Sudbury. (Mrs Cunningham never failed to serve me carrot cake. Cunningham would not take a piece. He had given up sugar, he explained. Why? “I found that it was easier,” he said.)

For me too he wrote a recommendation, and though I doubt that it helped me very much—by the time I entered the profession of English in the late ’eighties, he was considered a reactionary by those to whom he was not obscure—the letter was precious to me. I have always wished I could use one line of it as a blurb to all my writing: “Mr Myers,” he said, “writes a prose that is always distinctive, and sometimes even distinguished.” Anyone who knows anything at all about Cunningham knows just how high this praise is. After having such a thing said about me (and by him!), there was no possible way for me to stop writing.

Porter himself turned away from the life of scholarship a decade and a half ago. “I wanted to stop reading other people’s footnotes,” he says, “and didn’t fancy lecturing Honors freshmen on Homer and Sophocles.” I have read few indictments of the humanities at the turn of the century that are more devastating, and in fewer words. Cunningham would have admired its epigrammatic quality. Harried by student complaints that my grades are too low and the Jewish holidays are “too many,” I am tempted to follow Porter into a less puerile life.

Why I stay, though, can be directly attributed to Cunningham. I have described before on this blog a scene from his course in the history of criticism. (Link provided lest my close readers fear that I have forgotten the earlier account.) One day in class, Cunningham asked the dozen or so graduate students enrolled to fill the blank in an epigram by Sir Henry Wotton:

He first deceased; she for a little tried
To live without him, ________, and died.

The other students in the class struggled valiantly to rise to the occasion, devising all manner of poeticisms to satisfy the missing cretic foot. I was dull and embarrassed by my dullness. I wrote in resignation:

He first deceased; she for a little tried
To live without him, went to bed, and died.

Wotton’s original, of course, is far more distinguished:

He first deceased; she for a little tried
To live without him, liked it not, and died.

Cunningham read my pitiful effort aloud to the class and said, “In twenty-five years of teaching, this is the best wrong answer I have ever received.” Porter’s reaction to my anecdote is worth quoting in full:

“The best wrong answer I have ever received.” Sounds just like the man. Seems to say very little, but in fact prompts one (well, if one is attentive) to start wondering about lots of things. That’s what I remember about my conversations with him. There was a lot of silence, but when talking was done, he’d let me do more than my share. This of course encouraged me to think what I was saying must be interesting or important. And then he’d drop some little comment that would keep me awake at night for a week. Without a doubt the most efficient teacher I ever knew.

Yes, exactly. Every word of Cunningham’s was measured. (The pun is intentional.) His speech was as packed and pointed as his famous epigrams. (See here and here and here for examples.) He never belabored a point, because he expected you to reach the understanding, upon further reflection, that what he said was necessary and true.

Cunningham’s comment in class has kept me awake for three and a half decades. Only after corresponding with William Porter, though, did I realize the meaning of his “prompt” in my life.

In one sentence, Cunningham defined the scholarly life. It is not a matter of formulating correct answers, which is something that undergraduates, with their obsession over grades, cannot seem to grasp. It is a matter of so inhabiting other men’s minds, other men’s time, that your wrong answers are very nearly their own thinking.

I have never become disgusted with “other people’s footnotes,” because I have never wasted much attention upon them. I have been distracted by greater minds. Of course, I’ve never had a very successful academic career, and this in part is why. Despite my professional failure, though, I have remained in the university to pursue a scholarly life. And why? Because the difficulty of entering greater minds, whether they are the founders of creative writing or the Roth to whom I keep returning, is a challenge that has never grown stale for me.

There are only so many footnotes that a person can read. There are, however, an inexhaustible number of lines of verse to get almost right.

Tuesday, October 15, 2013

Mario Puzo’s Mafia novel

The reshaping of American literary culture from the early 1950’s to the early 1970’s might be captured in one historical image. James Jones’s From Here to Eternity, the massive blockbuster about the Regular Army in the last months before Pearl Harbor, was awarded the 1952 National Book Award in fiction. Not quite two decades later, Mario Puzo’s The Godfather was not even nominated. Joyce Carol Oates was honored for Them, her long aimless narrative of poor whites adrift in riot-torn Detroit.

Puzo had treated the Mafia in his novel in much the same way that Jones had treated the army—as an autonomous social institution with its pressures for conformity, where there is no place for a man with any real integrity.

From this it does not follow, however, that Puzo’s theme is what Gay Talese described in the Washington Post in reviewing the novel:

Whether men’s ambitions are fulfilled in the arena of politics or banking, business or crime, it makes little difference—the rules are often the same; it is a game of power and money; might makes right; and the most brutal acts are easily justified in the name of necessity and honor. Governments fight world wars for honor, drop atomic bombs for peace, stage bloody brawls for Christ; and the Mafia, on a mini-scale, acts out similar aggressions for similar goals—profit, prestige and justice as they see it.

The Godfather is not, in short, an anti-Vietnam War novel in disguise. Rather, it is a novel that belongs to the same class as From Here to Eternity. It adopts the techniques of literary naturalism—the detailed social observations, the tone of moral detachment, the long sojourn among an underclass—to tell the story of an institution not immediately associated with the degradation of man.*

And like Jones, Puzo fills his pages with man after man—dozens of them, including the occasional woman—all of whom are distinct individuals, with individual histories and traits. No character is introduced without a backstory and a chapter to himself. This is the method of the blockbuster. In 1952 it was possible to win a major American literary award with a naturalistic blockbuster; by 1970 a novel had to be a “holy vessel of the imagination” to receive official recognition.

If it is read at all any more, Puzo’s The Godfather is probably read as the “novelization” of Francis Ford Coppola’s famous film of the same title, which was rated the third greatest American film of all time. It took Coppola three years to bring the novel to the screen (or about the same length of time that Fred Zinnemann took to film From Here to Eternity). According to literary gossip, Puzo molded and trimmed his work-in-progress to satisfy the demands of Paramount Pictures. If there is any truth to the rumor, however, it is startling that the most important scene in the novel, in which “Don Corleone gave the speech that would be long remembered” and in which “he coined a phrase that was to become as famous in its way as Churchill’s Iron Curtain”—the phrase that inspired the dust jacket illustration by S. Neil Fujita that was reproduced on the movie posters—only makes it into the film version in heavily abbreviated form.

After the Don is shot on the streets outside Genco Olive Oil, after Michael Corleone guns down the police captain Mark McCluskey and the drug smuggler Virgil Sollozzo, after Sonny Corleone has been murdered in retaliation, Vito Corleone calls a meeting of New York’s Five Families with “invitations to Families all over the United States” in order to sue for peace. The meeting is filmed by Coppola, and so too is the Don’s speech. But its central passage is not recorded:

Let me say that we [in the Mafia] must always look to our interests. We are all men who have refused to be fools, who have refused to be puppets dancing on a string pulled by the men on high. . . . Who is to say we should obey the laws they make for their own interest and to our hurt? Sonna cosa nostra . . . these are our affairs. We will manage our world for ourselves because it is our world, cosa nostra. And so we have to stick together to guard against outside meddlers. Otherwise they will put the ring in our nose as they have put the ring in the nose of all the millions of Neapolitans and other Italians in this country.

Coppola does not include this speech, because it does not express his message. Coppola’s message is delivered by Al Pacino (in “a part too demanding for him,” according to the late Stanley Kauffmann). When Michael Corleone returns from hiding in Sicily after the murders of Sollozzo and Captain McCluskey, he finally goes to see his old flame Kay Adams.

Michael tells her that he is working for his father now. “But I thought you weren’t going to become a man like your father,” Kay says; “you told me.” “My father's no different from any other powerful man,” Michael replies—“any man who’s responsible for other people, like a senator or president.” “Do you know how naïve you sound?” Kay asks with a smile; “senators and presidents don’t have men killed.” “Oh,” Michael says; “who’s being naïve, Kay?” Or, in other words, Gay Talese had it right after all. The Mafia differs from the U.S. government only in the extent and reach of its power. This is a view that can be enjoyed by libertarian and political radical alike, but it is not the view of Puzo’s novel.

In the novel, Michael’s speech to Kay is rather different:

You’ve got the wrong idea of my father and the Corleone Family. I’ll make a final explanation and this one will be really final. My father is a businessman trying to provide for his wife and children and those friends he might need someday in a time of trouble. He doesn’t accept the rules of the society we live in because those rules would have condemned him to a life not suitable to a man like himself, a man of extraordinary force and character. What you have to understand is that he considers himself the equal of all those great men like Presidents and Prime Ministers and Supreme Court Justices and Governors of the States. He refuses to accept their will over his own. He refuses to live by rules set up by others, rules which condemn him to a defeated life. But his ultimate aim is to enter that society with a certain power since society doesn’t really protect its members who do not have their own individual power. In the meantime he operates on a code of ethics he considers far superior to the legal structures of society.

I’d be tempted to characterize this view as fundamental to Italian fascism if Benito Mussolini had not been an intense and triumphant foe of the Mafia and its “separate authority.” At all events, it is not a view that is affirmed by Mario Puzo. In a small passage tucked away in a seemingly unimportant scene, Puzo makes his own view clear in his own voice. In contrasting Sonny Corleone to his brother-in-law Carlo Rizzi, Puzo writes that Sonny

was a man who could, with the naturalness of an animal, kill another man, while [Carlo] himself would have to call up all his courage, all his will, to commit murder. It never occurred to Carlo that because of this he was a better man than Sonny Corleone, if such terms could be used;

a better man, even if he also beats his wife. (Puzo could not get away with such a distinction in 2013.)
The mere fact that Carlo Rizzi recognizes a moral authority that is separate from his own, if only in restraining him from murder, means that he is a moral advance over Sonny. The Mafiosi may consider themselves “far superior” to the rest of society, but by Puzo’s lights, they are lesser men.

Puzo’s prose rarely flashes, but it rarely loses its balance either. The Godfather may not have been the best American novel of 1969, or even the third best (although it is easily better than Oates’s Them and two of the other novels nominated for the National Book Award, Leonard Michaels’s Going Places and Kurt Vonnegut’s dull and tendentious Slaughterhouse-Five), but it remains a novel worth reading, if only for its ambition of copia or completeness.

The Godfather is a full picture of the Mafia, but it does not glamorize it. Puzo represents the Mafia as the social institutionalization of violence. This is not an accidental feature of “refusing to live by rules set up by others,” but its very essence. Nor does Puzo suggest a superficial and sloganeering moral equivalence between the Mafia and governments or businesses. His Mafia is a unique institution that uniquely degrades men, when it does not murder them.

* Raffi Magarik, a graduate student in English at Berkeley and a regular reader of A Commonplace Blog, writes to register his unhappiness with this phrase: an institution not immediately associated with the degradation of man. I admit to not being entirely pleased with it myself. What I was thinking is that (a.) prior to Puzo’s novel, the Mafia was not usually thought of as a social institution, and (b.) in Mafia fiction, it is more usually associated with beatings and murder than with human degradation (and certainly not the degradation of the men who become Mafiosi).

Magarik mentions W. R. Burnett’s Little Caesar (1929), perhaps the only earlier American novel about the Mafia. It chronicles the rise of Rico (a character modeled on Al Capone) from mob gunman to mob chieftain. From first to last, though, Rico remains a sociopath. He is vain about his hair, proud of his ability with a gun, and fair in splitting the take from robberies with his subordinates. His rise to power does not degrade him, however; he seizes an opportunity and holds on to power through violence. Puzo was the first American novelist who understood the Mafia as something different from a mere criminal gang—a complex social organism with a “separate authority” and its own code of ethics. His Mafia, in fact, differs only in social detail from James Jones’s army.

Magarik concludes, rather brilliantly, in my opinion, that your politics may determine whether you prefer Puzo’s Mafia or Coppola’s. Coming to the novel and film from my left, he concludes that “the Coppola is better than the Puzo just because the mob seems, on its own terms, too easy a target for naturalistic critique.” As a conservative, I prefer Puzo’s moral vision.

Thursday, October 10, 2013

Alice Munro, the 13th woman

Alice Munro became the first writer whose reputation derives almost exclusively from short stories—and the first Canadian—to win the Nobel Prize in literature.

What is relevant about her for literature, though, escaped the notice of the New York Times, which touted her as “the 13th woman to win the prize.” She is also the nineteenth English-language writer since 1944, the fourth writer in her eighties, the fifteenth avowed leftist (and the fifth in the last decade) to win the literature Prize, but these are not facts worth mentioning in the newspaper of record. That the Times was merely recycling the language on the literature Prize’s homepage (“Alice Munro is the 13th woman awarded the Nobel Prize in Literature so far”) suggests that, for the literary culture, counting by gender is now automatic and unreflective.*

It is true that other Nobel winners have written many distinguished short stories, perhaps most conspicuously Isaac Bashevis Singer. But it is also true that Singer wrote nineteen novels, and in awarding him the Prize in 1978, the Nobel committee cited his “impassioned narrative art which, with roots in a Polish-Jewish cultural tradition, brings universal human conditions to life.” The other great story writers who took home the Prize—Heinrich Böll, Sh. Y. Agnon, Ernest Hemingway—were also prolific novelists who conceived of themselves as novelists.

Alice Munro is the first Nobel winner whose entire career has been devoted to the story. To overlook this fact about her, to consign her to being the thirteenth of her gender rather than the first of her genre, is to overlook her importance in literary history. It is, in fact, to dishonor her.

When she was a young writer, just starting out, Munro hoped (like any fiction writer who hopes for greatness) to write novels. In her modesty, she later claimed that her labors as a wife and mother kept her too busy for anything longer than stories. “A child’s illness, relatives coming to stay, a pileup of unavoidable household jobs, can swallow a work-in-progress as surely as a power failure used to destroy a piece of work in the computer,” she said. “You’re better to stick with something you can keep in mind and hope to do in a few weeks, or a couple of months at most.” As I wrote in my Commentary essay on her last year, though, this modest explanation should also be understood as a sly apologia for the short story, perhaps the ideal form of literature for the busyness of career-driven postmodern lives.

Few have ever been better at the form, and in awarding Alice Munro the literature Prize, the Nobel Committee is also (at last) recognizing the short story’s essential place in modern writing.

* When I was a newspaper reporter, the recycling of press-release language in a story’s lead would have been cause for a severe dressing down. But perhaps journalistic standards are different now at the Times.

Thursday, October 03, 2013

The unshakable confidence

“He most honors my style,” Whitman wrote in Song of Myself, “who learns under it to destroy the teacher.” The best thing about teaching is not merely the company of young minds, but the opportunity to be instructed by them. In my class at the Ohio State University on The Great Gatsby and the art of criticism, I have repeatedly (and, I’m afraid, rather tiresomely) denounced the commonplace interpretation, advanced by nearly every English teacher across the dark fields of this republic, that the novel dramatizes the conflict between old money and new money. The phrases themselves, I like to point out, do not appear—nothing like the phrases themselves appear—anywhere in Fitzgerald’s text. When one of my impertinent students repeated my claim in another class in which Gatsby is assigned, the professor (my colleague) sputtered, “We don’t believe in the fallacy of authorial intent.” (He clearly meant that he does believe in the intentional fallacy, even if the variant in which he believes is a vulgar misrepresentation of Wimsatt and Beardsley’s actual views.)

Yesterday in class another two of my students demonstrated that they have already outgrown my tutelage and are ready to take on my colleagues without any assistance from me. We are now studying Trimalchio together, that “early version” of Gatsby edited by James L. W. West III. We are having great fun comparing Fitzgerald’s changes from draft to published copy. Reading Trimalchio under the privacy of my lamp, I realized that I was making the unexamined assumption that Gatsby is obviously superior—that every change from draft to published copy was for the better. But there is no warrant for such an assumption. The two versions are different, with different aims and effects. Fitzgerald did not revise the draft for the sake of a generalized and unfocused improvement, but for specific results. Our working hypothesis, I told the class after we examined several of his revisions, is that Fitzgerald changed Trimalchio into The Great Gatsby for the sake of a consistent artistic effect. To be explicit: a tragic effect.

(Trimalchio, by contrast, is a good old-fashioned “marriage plot” in which two men are involved in a power struggle over a woman, who—as in Austen, Eliot, and James—consults her own mind in making her choice between men. Daisy announces to Nick that she is leaving Tom, and a few days later she shows up at Gatsby’s mansion with her suitcases packed. For an entire page, Gatsby rehearses the practical reasons why she cannot leave her husband quite yet. “In other words you’ve got her,” Nick comments—“and now you don’t want her.”)

I asked the class for examples of textual changes so that we might test our working hypothesis. A young woman pointed out that, in the roll call of partygoers that Nick writes in the “empty spaces of a time-table” at the beginning of Chapter IV, every time a character in Trimalchio is said to come “from West Egg” he comes instead from East Egg in the published version of Gatsby (and vice versa). The flip-flops are predictable and uniform: every mention of an Egg goes over from one to the other. The student looked at me expectantly, hoping for a learnèd explanation. I looked back at her blankly. Although I had noticed the flip-flops too, I was stumped by them. Not so another young woman in the class. “They demolish the claim that the novel is about old money versus new money,” she said, “because what they show is that it’s completely arbitrary where the money in the book comes from.” So much for the commonplace interpretation, which ought never again be repeated with a straight face.

Fitzgerald’s revisions also show, as I have written before in this space, that the concept of a fixed and unified text is the unspoken presupposition behind all literary criticism, despite the clear evidence that, for writers themselves, texts are forever in a state of becoming. Most critics in our day are estranged from religion, and yet their mental habits derive from the traditional study of the Bible, and the perfect faith that “nothing can be added to [a literary text], nor anything taken from it” (Eccl 3.14).

Monday, September 23, 2013

Magical thinking about death

Anyone who has ever read The Adventures of Tom Sawyer, which used to be pretty much every boy in America, remembers the scene in which Tom, Huck, and Joe Harper attend their own funeral:

As the service proceeded, the clergyman drew such pictures of the graces, the winning ways, and the rare promise of the lost lads that every soul there, thinking he recognized these pictures, felt a pang in remembering that he had persistently blinded himself to them always before, and had as persistently seen only faults and flaws in the poor boys. The minister related many a touching incident in the lives of the departed, too, which illustrated their sweet, generous natures, and the people could easily see, now, how noble and beautiful those episodes were, and remembered with grief that at the time they occurred they had seemed rank rascalities, well deserving of the cowhide. The congregation became more and more moved, as the pathetic tale went on, till at last the whole company broke down and joined the weeping mourners in a chorus of anguished sobs, the preacher himself giving way to his feelings, and crying in the pulpit.

Everyone remembers the scene because it cannily expresses what is among the commonest of human fantasies—the dream of peeking back into life after death to gauge just how much one is mourned and missed.

It’s appropriate the scene should occur in a boys’ book, because the fantasy is destructive of human maturity and the reality principle (which amount to the same thing). Perhaps none of my opinions makes people angrier than my insistence that daydreaming about life after death, whether it takes the form of wish-fulfillment fantasies about one’s own funeral or the delusion that one can ever be released from suffering, is a self-indulgence the dying cannot afford. We don’t encourage our children to believe they can grow up to become superheroes capable of leaping tall buildings in a single bound, and we should not encourage the terminally ill to pin their hopes upon something they will never experience in this lifetime.

To tell them that their suffering will be relieved by death—here, let me help you die—is a lie told for the benefit of the liar, because the dead do not know relief. They don’t know anything. They are dead. The relief is sought by those who must watch the dying suffer, and they will be the only ones to feel the relief. Relief of suffering, like funeral services, belongs to the living. The dead are excluded from them.

As usual it is Emily Dickinson, the poet laureate of death, who gets it exactly right:

That short — potential stir
That each can make but once —
That Bustle so illustrious
’Tis almost Consequence —

Is the éclat of Death —
Oh, thou unknown Renown
That not a Beggar would accept
Had he the power to spurn —
Her first editors, Mabel Loomis Todd and Thomas Wentworth Higginson, assumed that she was describing a funeral, and hung the title like a wreath on the doorway of her poem. But Dickinson is not saying that a funeral is “the éclat of Death,” its moment of brilliant success, but rather that death’s only achievement, its “unknown Renown” (because no one who knows it can return to bask in it), is a “short potential stir.”

We think too much of death and not nearly enough of dying. There is a reason for that. Dying is a mental discipline, which entails many hours of training in (among other things) the renunciation of fantasies that death will be anything other than it is—the cessation of consciousness—and the bitter facing up to the reality of that fact. Those who prefer daydreams of impossible release from what awaits them will leave themselves (and those they love) tragically unprepared for the conclusive Bustle, which is “almost consequence.”

Wednesday, August 28, 2013

Baz Luhrmann’s final paper

Baz Luhrmann substitutes a high-school English paper for F. Scott Fitzgerald’s novel in scripting his film version of The Great Gatsby, released earlier this year and now available on DVD from Warner Home Video. The conflict between “old money” and “new money” and the symbolism of T. J. Eckleburg’s eyes as the “eyes of God,” those English-class favorites, are carefully enunciated and repeated by the actors just in case an unwary moviegoer might be under the illusion that Luhrmann’s purpose in remaking Gatsby is to scrape off the critical clichés and restore a classic to its original condition. The phrases appear nowhere in Fitzgerald’s text. They are, however, fixed as securely to the popular consciousness as Hamlet’s indecisiveness, which belongs not to Shakespeare’s play but to A. C. Bradley’s 1904 lectures on Shakespearean tragedy. No one who knows the commonplaces needs to read the texts with any attentiveness, because they have already been “read” for him—by general agreement.

The mistakes pile up. As the film opens, Nick Carraway is in a sanitarium, years after the events about to be shown, diagnosed as “morbidly alcoholic.” (In the novel, Nick says, “I have been drunk just twice in my life,” and Tobey Maguire, who plays Nick with demonstrative broadness, even repeats the line—as if oblivious to the nonsense the rest of the film makes of it.) Muttering aloud, Luhrmann’s Nick says, “Back then, we all drank too much. The more in tune with the times we were, the more we drank.” Fitzgerald’s Nick drinks too little, and tunes himself to the times in other ways, but Luhrmann’s mind is on finishing his English paper. He requires explanations, not subtleties. The movie’s Nick talks and talks in a voiceover that goes quickly from being intrusive to annoying. Why is he talking so much? He is addressing the psychiatrist who is treating him. “You see, Doctor,” he says at one point, dying to himself and being reborn as Alexander Portnoy.

The geography of the film is intentionally cartoonish. The Valley of Ashes, instead of being a narrow ash dump about the size of Flushing Meadow Park, is a monstrous waste land that has swallowed Queens whole. In an overhead shot reminiscent of Saul Steinberg’s View of the World from 9th Avenue, the lush green of Long Island ends abruptly in black-and-white, stretching from one edge of the frame to the other, with Manhattan glittering beyond it in the distance. Despite being a waste land, though, it is crawling with people. There are so many people milling around Wilson’s garage, in fact, that Gatsby is lucky he doesn’t hit someone long before his car runs over Myrtle Wilson.

Perhaps the worst thing about the film is Gatsby’s parties. Luhrmann has himself confused with Flo Ziegfeld. The parties are theatrical extravaganzas with chorus girls dancing in unison, dueling orchestras, announcers bellowing into microphones, streamers and confetti falling from the ceiling as if at a political convention, explosions of fireworks that must have kept the neighbors awake every night, and hundreds upon hundreds of guests packed so tightly in Gatsby’s rooms that they look like squirming maggots when viewed from above.

In the novel, when Nick attends one of Gatsby’s parties for the first time, he finds himself in conversation with “two girls in twin yellow dresses” and “three Mr. Mumbles,” all of them guessing at the truth about Gatsby. They lean forward confidentially and whisper to one another: “Somebody told me they thought he killed a man once,” “he was a German spy during the war.” Nick reflects knowingly: “It was testimony to the romantic speculation he inspired that there were whispers about him from those who had found little that it was necessary to whisper about in this world.” At Luhrmann’s parties, the guests find it necessary to shout. Anyone who whispered a romantic speculation would not have been heard even by himself.

In researching his final paper, Luhrmann must have learned that Fitzgerald planned originally to call his novel Trimalchio after the character in Petronius’ Satyricon who is famous for his immoderate dinner parties. Not that Luhrmann knows anything about Petronius. As a filmmaker with ambitions to greatness, though, he surely knows Fellini’s Satyricon. The parties in his film owe a deeper and more obvious debt to Fellini than to anything in Fitzgerald. The riots of sight and sound are proof merely that Luhrmann can do Fellini in the twenty-first century. They are the Folies Luhrmann, fantasies of pure excessive spectacle that have nothing whatever to do with the plot of Gatsby. In the novel, Gatsby throws his parties in the hope that Daisy will wander in one night. In the film, Gatsby would not be able to pick Daisy out of the swarm, even if she did happen to wander in.

But, really, I have been saving the worst for last. The worst is Luhrmann’s decision to make Nick Carraway into a writer. The “text” of his film is a large blank book that Nick’s psychiatrist gives him as part of his cure. “Write about it,” his doctor says. “You said yourself that writing brought you solace.” Nick has already admitted that he wanted to be a writer when he was at Yale. He picks up a green hardback copy of Ulysses to drive the point home, even though Ulysses was not published until 1922, the same year as the events in Gatsby. But why let an anachronism stand in the way of reconceiving Nick Carraway as a modernist genius?

I can think of two reasons. First, if Nick is a writer with visions of Joyce dancing in his head then he is that most tedious of creatures—the unreliable narrator. Maybe that’s why he can be “morbidly alcoholic” and also “drunk just twice in [his] life.” Whatever he says about himself is not entirely to be trusted. In the novel, Nick says at one point: “Reading over what I have written so far I see I have given the impression that the events of three nights several weeks apart were all that absorbed me”—that is, he has given a false impression. If he is unreliable, though, is the impression false or is the claim about its falsity false? In Luhrmann’s script, this line becomes: “Looking over my story so far, I’m reminded that, for the second time that summer, I was guarding other people’s secrets.” The shift to the word story is unconscious, I would wager, because Luhrmann and his co-author Craig Pearce never for a moment imagined Gatsby as anything other than make-believe. The line about guarding other people’s secrets, which appears nowhere in the book, is also a reminder never to invite comparison with a better writer’s prose.

The second reason is the more important. A few days ago I argued that Nick is not a writer, at least not in the modernist sense, but a kind of confessor who is “privy to the secret griefs of wild, unknown men” like Jay Gatsby. If he is what Elias Canetti called an earwitness to the “intimate revelations” by and about Gatsby—not composing, merely listening—then the splendor of the book’s prose belongs, not to him, but to Gatsby and his dreams. To take the “creative passion” of the style away from Gatsby and bestow it upon Nick, a “normal person” who can only wonder at “what a man will store up in his ghostly heart,” is to inflate Nick into something he is not, rob Gatsby of his greatness, and get The Great Gatsby wrong at the most fundamental level. I grant you this is what high-school English students are routinely assigned to do, but that doesn’t make it right.

Friday, August 23, 2013

Nick Carraway’s fiction

The discussion of The Great Gatsby begins today in my course on it—an entire course devoted to a 180-page book—and in rereading it, I was struck for the first time by the apparent irrelevance of the opening paragraphs. You’ll remember them. Nick Carraway, who has not yet divulged his name, quotes advice from his father, warning him, in much the same way the Los Angeles Review of Books warns reviewers of first-time authors, not to say anything if he can’t say something nice. Nick reflects:

[My father] didn’t say any more, but we’ve always been unusually communicative in a reserved way, and I understood that he meant a great deal more than that. In consequence, I’m inclined to reserve all judgments, a habit that has opened up many curious natures to me and also made me the victim of not a few veteran bores. The abnormal mind is quick to detect and attach itself to this quality when it appears in a normal person, and so it came about that in college I was unjustly accused of being a politician, because I was privy to the secret griefs of wild, unknown men. Most of the confidences were unsought—frequently I have feigned sleep, preoccupation, or a hostile levity when I realized by some unmistakable sign that an intimate revelation was quivering on the horizon; for the intimate revelations of young men, or at least the terms in which they express them, are usually plagiaristic and marred by obvious suppressions. Reserving judgments is a matter of infinite hope. I am still a little afraid of missing something if I forget that, as my father snobbishly suggested, and I snobbishly repeat, a sense of the fundamental decencies is parcelled out unequally at birth.

What exactly is this passage doing in the novel? On the first page, to boot? What function (if any) does it perform?

Fitzgerald himself, according to Matthew J. Bruccoli’s biography, received eight votes in a poll of Princeton’s class of 1917 for Thinks He Is Biggest Politician (he also received seven votes for Wittiest, and fifteen for Thinks He Is Wittiest).[1] In at least one respect, then, the passage is grounded in autobiography. And perhaps this accounts for the pseudo-intellectual tone of the general observations: “The abnormal mind is quick to detect and attach itself to this quality when it appears in a normal person,” “the intimate revelations of young men . . . are usually plagiaristic and marred by obvious suppressions.” These sound like the grand pronouncements of a recent undergraduate, although Carraway (as we learn later) is just about to turn thirty.

Fitzgerald’s social and academic position at Princeton was anxious and unsettled. Unlike Carraway’s, his family was neither “prominent” nor “well-to-do”: his father was a failure in business. He was one of the few Catholics at Princeton—the only Catholic whom Edmund Wilson knew there—and though he did not flunk out, he never took a degree. Because he was never in good academic standing, he was never able to hold office in the Triangle Club, the undergraduate dramatic guild (he coveted the presidency).[2] Fitzgerald must have been exquisitely sensitive to accusations of “being a politician”—a suck-up, as we might now say, a brown-noser.

What he has done in this early paragraph is to transfigure the social anxiety into the plausible explanation for his narrative—the fiction of the fiction of Gatsby. Few critics have bothered to answer the question that the Amateur Reader recently asked about the novel: “Why is Carraway writing?” How does he account for his 180-page manuscript?

The answer is that Nick Carraway has the “habit” of listening to the “secret griefs of wild, unknown men”—men like Gatsby, for instance. His “inclination” to reserve judgment has repeatedly cast him in the role of onlooker narrator, although perhaps third-party narrator or confessor narrator is more apt in his case. An “intimate revelation” like Gatsby’s is nothing new to him. Why, even the sensation he reports later upon listening to Gatsby’s war stories (“like skimming hastily through a dozen magazines”) is nothing new. Gatsby too offers up revelations that are “plagiaristic” and “marred by suppressions.”

It has generally been recognized that, at least as far as Nick Carraway understands what he is doing, The Great Gatsby is not a novel. “[O]n Gatsby’s side, and alone,” he is “responsible, because no one else was interested.” As far as he knows, he is telling a true story—as true as he can make it. But if he is not writing a novel, then, what is his writing? He is transcribing an intimate revelation as confessed to a “normal person,” an unremarkable person, who is accustomed to reserving judgment even for the unsought monologues of “curious natures” and “veteran bores.” The fiction behind The Great Gatsby, the device that plausibly explains its place in the world, has rarely been remarked upon, because Fitzgerald has been as masterful in concealing it as in devising it.

[1] Matthew J. Bruccoli, Some Sort of Epic Grandeur: The Life of F. Scott Fitzgerald, 2nd rev. ed. (Columbia: University of South Carolina Press, 2002), p. 72.

[2] Bruccoli, p. 57.

Wednesday, August 21, 2013

J. F. Powers and Elmore Leonard

My review of Katherine A. Powers’s edition of her father’s letters, Suitable Accommodations, appeared yesterday—publication day for the book—in the Daily Beast. The book’s publication was overshadowed by the death, earlier in the day, of the crime novelist Elmore Leonard. The Beast itself carried no fewer than three stories relating to Leonard.

Overshadowed by other writers so often during his lifetime, Powers would have had something ironic and painful to say on the subject. Maybe there was a historical lesson in the coincidence, though. Leonard was convinced that at least some of his fiction would last. How long, he didn’t say. But as for Powers: “His style is superb and his characterization sublime,” the novelist Mary Gordon, an admirer, told Portland magazine, “but it’s sad because I don’t think he’ll be remembered.”

Predictions of literary immortality are a fool’s game. Let’s assume, though, that Leonard and Gordon are both right—Leonard will be remembered, while Powers will be forgotten. The former is customarily (and lazily) described as a “genre” writer; the latter is a “literary” writer. In other words, they differed mainly in their subject matter. Leonard wrote about criminals; Powers, about Catholic priests. It’s a safe bet, then, that readers of the future will prefer criminals to priests? Or is Powers’s handicap, as Joseph Bottum said in First Things, that the “catastrophic collapse of religious vocations through the 1970s” robbed Powers’s subject of its immediacy? Priests will no longer interest readers in the future, because readers will no longer be interested in the religious problems of priests—or religious problems of any kind, for that matter. But the criminals you always have with you.

To a critical eye, however, the most obvious difference between them lies elsewhere. Leonard was eight years younger, and Powers got a six-year head start, but Leonard soon outdistanced him, publishing forty-three novels in his lifetime, plus many short stories, while Powers was lucky, as I said in my review of Suitable Accommodations, to average 6,000 words a year. Although Leonard is much admired for the quality of his writing (especially his dialogue), the raw truth is that his prose is much rougher, far less careful, far less polished, than Powers’s. Leonard depends upon narrative effects, which is why so many of his novels and stories have been turned quickly and painlessly into films, while Powers depends upon the barely audible clicking of sentences.

Here, for example, is a passage, conveniently reproduced by NPR, from Leonard’s novel Road Dogs. Notice, first, how three of the first four paragraphs begin with the third-person plural personal pronoun. While the first two refer to a hidden “they,” however (the authorities, presumably), the third refers to the inmates. The shift is handled awkwardly. (The defect mars the fifth paragraph too, which is otherwise a sharp descriptive paragraph.)

The eleventh paragraph, which violates Leonard’s own rule against using patois, is free indirect discourse or even stream of consciousness, attempting to reproduce Foley’s thoughts as he eats. At the same time, Leonard tries to use the paragraph to fill in Foley’s backstory. The result is clumsy and unconvincing. After he finishes eating, Foley is confronted by the Cuban whom he had defended against the white supremacists. Foley takes his measure: “This little bit of a guy acting tough.” There is nothing about the sentence that isn’t the recitation of a formula.

Here, by contrast, selected more or less at random, is a passage from Wheat That Springeth Green, Powers’s last novel:

       Joe ran back to the stand, holding the saucers and spoons so they wouldn’t rattle. Father Stock unwrapped them, stuck the napkin in his hip pocket, motioned Sister Agatha and Sister Margaret up to the stand, and said to one of the women behind the counter, “Double dips for the Sisters. No charge.”
       Joe moved away from the stand, away from an old smelly man who looked like a tramp and said to the woman who’d handed him a cone, “No charge.” A joke?
       “Pay, Father,” the woman said.
       The old man licked the cone.
       “Father,” the woman said.
       Father Stock said, “Five cents, mister.”
       The old man licked the cone. “Try and git it.”
       Joe was astonished to see Father Stock lie across the counter and, like a swimmer doing the breast stroke, swat the cone to the ground, the ice cream, only one dip, coming out of the cone and settling in the grass.
What should immediately strike you is the greater specificity, the heightened exactness, of Powers’s prose. While Leonard relies primarily upon nouns to do his stylistic work, Powers is ambidextrous with nouns and verbs. Although Leonard has the reputation for fiction with plenty of action, it is Powers’s scene in which, less menacing though it may be, the action is tightly focused: rattle, unwrap, stick, motion, lick, swat, settle. Both writers characterize an entire history of human relationships in a few paragraphs, but Powers does so with greater economy and precision.

Please don’t misunderstand me. I am not saying that his perfectionism (his own word) makes Powers the better writer. Quite the opposite. I am beginning to wonder if the obsession with specificity and exactness, with perfecting a verbal surface, was not simply a fashion which has passed from the literary scene, and not an article of artistic faith at all. If Powers will not be remembered perhaps the reason is that his principle of style, like a green felt hat trimmed with sequins and gold braid and covered with a black lace veil, belongs to a past that is irrecoverably past. Call it the Age of Finish, a closed chapter of literary history. And Leonard, if he is remembered, will be remembered by an age that is not so fussy with its words.

Update, 8/22/13: After writing the above, I received a warm note from Katherine A. Powers. She includes, as usual for her, some striking literary observations, which she has kindly given me permission to quote. She writes: “JFP had a genius for the mot juste and for causing the words he chose to resonate ineffably with a mood or character or situation that I think goes beyond perfecting a verbal surface, that exactness and specificity, though it most certainly includes them. I read my mother’s attempts to write exactly like that passage, to capture every little move and find a simile that adds something slightly comic—and she does, sort of, but the result lacks a sense of ease (and I feel bad saying it). You could date her work almost precisely as belonging to 1940s and 50s, even though her novel was finally published in 1969—I mean belonging in the sense that that is when her literary sensibility and ambition were formed in all its Pride.”

Betty Wahl’s novel Rafferty & Co. was issued by Farrar, Straus & Giroux. In her Afterword to Suitable Accommodations, Katherine A. Powers describes the novel as being “based in a gentle way, far too gentle, I would say, on life in Ireland with a man something like [J. F. Powers].” Betty Wahl died in 1988, eleven years before her husband.

Tuesday, August 13, 2013

Casual slander and reckless clichés

A “warped ex-faculty member of Texas A&M that enables Johnny Manziel”—according to the Washington Post sportswriter Mike Wise, that’s what I deserve to be called for crying foul when he says in a column that Manziel, the Heisman Trophy-winning quarterback for the Texas Aggies, is “about a trailer park away from Tonya Harding.” (To be fair, I called Wise a moron for making the comparison.)

Those who are new to the Manziel story may wince at the snobbery of Wise’s “trailer park” crack. All it really proves is that Wise, a graduate of Cal State Fresno, is anxious to shed his class origins and join the East Coast élite. Reading the comparison to Tonya Harding, though, uninformed readers are going to assume that, like her, Manziel must have done something criminal. After all, Harding arranged an assault on Nancy Kerrigan, her skating opponent. Harding acted less like an athlete, seeking to defeat her competition, than like a gangster who wanted to maim a gangland rival.

Although Harding avoided prison through a plea bargain, Manziel may not be so lucky. Or so at least you would be right to assume after reading Wise’s column on him. And what exactly did Manziel do wrong, then? Hire thugs to break the legs of A. J. McCarron, the quarterback for Alabama? What else could be comparable to criminal assault? Here is what Manziel stands accused of: apparently he sold his autograph. For filthy lucre. ESPN has the incriminating photo of him signing his name. Signing his name? The monster!

For a college athlete to profit from his possibly short-lived fame is a violation of regulations set down by the National Collegiate Athletic Association (NCAA). If he sold his autograph then Manziel admittedly “broke the rules.” What few in the sportswriting world are prepared to do, however, is to step back and look hard at the NCAA rule that prohibits athletes from trading on their own identity. Yet how is it even legal for an organization to prohibit someone from selling his own autograph? Doesn’t it belong by rights to him?

The truth is that the NCAA rule is an unenforceable contract seeking to restrain trade. It is an illegal and unethical maneuver to prevent competition from athletes, who might cut into the NCAA’s own profits—and the profits of its member schools—if they were allowed to trade freely in memorabilia. In economic terms, the NCAA is a cartel not unlike OPEC, which colludes to set prices and squeeze out competitors. Why else do college football coaches, middle-aged men, overwhelmingly white, earn seven-figure salaries while the players, who actually win and lose the games on the field, risk career-ending injury for a “scholarship” (tuition waiver, room and board, school supplies) that is “worth” less than the median household income in the U.S.?

Wise explains why the system is fair and why Manziel is as rotten as Tonya Harding:

Prominent athletes who stand to reap great rewards from their considerable physical talent and personal appeal have to understand, even at 20, that they are held to a higher standard of decency and behavior than other kids hitting the kegger in the back of the dorm room. Signing a scholarship with a school of Texas A&M’s caliber means you literally sign up for that double standard.

You’ve got to love the language. Manziel “stands to reap great rewards” someday. So he should shut up and accept the “double standard” by which he earns 1.3% of what Kevin Sumlin, his coach, is paid for inserting Manziel into the lineup. Because, you know, Johnny Football (as he is called) stands to reap the rewards later. Unless he gets injured, of course. Or unless sportswriting hacks like Mike Wise succeed in ruining his reputation.

You will read Wise’s column and not learn any of this, because despite the fact that Wise is paid to be a journalist, he is less interested in facts than in casual slander and reckless clichés.

Monday, August 12, 2013

No one left to whack

Just recently, in homage to the late James Gandolfini, I watched all six seasons of HBO’s crime drama The Sopranos for the first time. Originally running from 1999 to 2007, The Sopranos was the first television production conceived as a season-long “narrative arc” rather than a bona-fide “series” of self-contained episodes connected to one another only by the recurring characters of a regular cast—the model of the sit-com. (Ed.: Levi Asher insists that David Lynch’s Twin Peaks was first, and warns that any readers who mistake my claim for the truth might get their asses kicked in a bar if they repeat it. Although I can’t imagine Twin Peaks fans kicking the asses of Commonplace Blog fans, I think it’s only fair to record Asher’s dissent here.)

The police procedurals Hill Street Blues (1981–’87), NYPD Blue (1993–2005), and Homicide: Life on the Street (1993–’99) combined both methods, telling complete-in-themselves 40- to 45-minute stories while also incorporating a larger “narrative arc” that clamped some or all of the episodes together. And of course even sit-coms depend upon running gags or character quirks, which the viewers were expected to know ahead of time.

The Sopranos was the first, however, to use the “arc” as its principle of structure. Or, to say it otherwise, if David Chase had never persuaded HBO officials to introduce Tony Soprano to the American television public, there would never have been the later (and, frankly, better) dramas The Wire (2002–’08) and Deadwood (2004–’06).

It has been little remarked upon that the dramatic model behind The Sopranos is the soap opera. For six seasons Chase, Gandolfini, and company struggled against the soap-operish qualities of the Soprano family saga. Will Carmela sleep with Furio? Will Tony and Carmela divorce? Will Christopher make an honest woman of Adriana? Will Meadow and A. J. ever grow up and start acting like adults? It wouldn’t be too wide of the mark, in fact, to describe The Sopranos as a foul-mouthed soap opera with murders—sixty-five of them over the show’s eighty-six episodes.

In the end, though, The Sopranos is about Tony Soprano, the boss of a New Jersey crime organization. As Jeff Halperin says in the best thing I’ve read about it, The Sopranos is “about a single mind,” with ambitions to lay bare “the inner workings of a single person’s mind.” (In Halperin’s opinion, this ambition raises it above The Wire, which aims to “demonstrate the inner workings of society.” He could have added that Deadwood has an even grander ambition—to chronicle the rise of a civilization.)

Halperin’s account also has the advantage of explaining the controversial ending of The Sopranos. A smiling and relaxed Tony is meeting his family at a New Jersey diner when the screen abruptly goes black. What has happened is that Tony has been whacked. He never saw the hit coming, never noticed the hitman enter the restaurant and shoot glances at him, never noticed him slip into the bathroom—just as Michael Corleone does in The Godfather—as a prelude to the hit.

Chase was scrupulous (even overly scrupulous) in planting the clues. In the episode called “Soprano Home Movies,” Tony’s brother-in-law Bobby Baccalieri had reflected on getting whacked: “You probably don't even hear it when it happens, right?” (Bobby himself is noisily whacked in the next-to-last episode, flopping around a model-train store before finally expiring in the middle of a store display with trains crashing off bridges and bystanders clutching their ears and screaming.)

To make sure the viewer gets the point, Chase included a flashback of Bobby’s remark (untrue in his case) early in the last episode. The screen goes black because Tony has been shot in the back of the head; he didn’t even hear it when it happened. The Sopranos ends when Tony dies, when there is no one left to whack. (Only Paulie Walnuts, of the original Soprano “crew,” remains alive at the end of the show.) As the anonymous author of the “Definitive Explanation of ‘The End’ ” says, “Once Tony is dead, there is no show. If Tony was to die it had to be the last moment of the series. The show ends where Tony’s consciousness ends.”

In an amusing effort to review all six seasons in twenty-five epigrammatic nuggets, Edwin Turner wrote a couple of years ago that he views The Sopranos as a “study in existential nihilism”:

To put it in the series’ own terms, life is “all a big nothing.” In the series’ final scene in a diner, we’re reminded that the best we can hope for is to enjoy the “good times,” to focus on those moments of peace and happiness with our families. But ultimately, the series suggests nihilism, the “big nothing,” a void signaled in its famous closing shot of extended, abyssal blackness.

What Turner overlooks is how neatly Chase, who wrote and directed both the pilot episode and its finale, completes the circle of the show’s six-year run.

The Sopranos starts off by borrowing the storytelling device of Portnoy’s Complaint. Tony begins his own “narrative arc” by addressing Dr. Jennifer Melfi in psychotherapy. He tells her about a pair of ducks, “from Canada or someplace,” which took up residence in his pool. They gave birth to ducklings and taught them to fly. “It was amazing,” he says happily. The scene shifts to the interior of his house where his family is having breakfast on his son Anthony Jr.’s thirteenth birthday. Tony comes in, chuffs his son, pats his wife on the backside, picks up an oversized bird encyclopedia. There is nothing to suggest that Tony is anything other than an ordinary family man until scenes of violence and meetings with “associates,” narrated less than candidly to Dr Melfi, identify him as a Mafioso.

“Do you feel depressed?” she asks without warning. “Since the ducks left,” Tony confesses. “What is it about those ducks that meant so much to you?” she asks in another session late in the episode. He begins crying. “When the ducks gave birth to those babies,” she points out, “they became a family.” Tony has a moment of insight:

You’re right. That’s the link. The connection. I’m afraid I’m going to lose my family like I lost the ducks. That’s what I’m full of dread about. It’s always with me.

For eight years it is with him. Finally, though, after all the murders and suicides and attempted suicides, the adultery and the drugs and the meaningless violence, after the longstanding feud with the New York-based Lupertazzi crime organization that claimed so many lives, Tony has his family around him again, intact and healthy. He is listening to Journey’s “Don’t Stop Believin’ ” on the jukebox. And then it ends. Just like that.

Tony Soprano is not a tragic figure. His death, presumably, is violent, but the violence is not dramatized. A man’s death, as Roman Tsivkin has quipped, does not happen in his lifetime. For the tragic pleasure, though, it must occur in the drama. In the terms of The Sopranos, Tony may be a great man, but his fate and flaw and fall are not what The Sopranos is ultimately about.

As trivial as it may sound, The Sopranos is about the completion of its own design. Tony’s dread, enunciated in the first episode, is dissipated in the finale. Tony’s other family, the DiMeo crime organization, has largely been destroyed. Dr Melfi has terminated their therapy, saying, “I don’t think I can help you.” There is nothing more to tell. The “narrative arc” has been closed.

And to me, this accounts for both the dramatic success and dramatic failure of The Sopranos. On the one hand, its clear focus on a “single mind” and its strong narrative design are what keep you watching—they make, as the phrase goes, for good television. On the other hand, Tony Soprano is a miserable human being. He is a murderer and an adulterer without a conscience, the textbook definition of a sociopath. (That’s the reason, by the way, Dr Melfi finally concedes she can do nothing for him “therapeutically.”)

He has one moment when he appears to be on the path to something like redemption. Reconciled with Carmela after a nearly year-and-a-half-long separation, he rejects the sexual advances of a drug-addicted commercial realtor who is aroused by the deal they have consummated together. But the moment passes. Tony regrets his self-restraint and returns to his customary ways. The most horrifying moment in the series occurs just one month later, in the fourth-to-last episode, when he murders Christopher Moltisanti, the younger cousin whom he once loved like a son, smothering him after a car accident. Christopher’s death, he tells Dr Melfi, leaves him feeling relieved.

There is nothing, in short, to identify with in Tony Soprano. What keeps you watching is the narrative force, the technical expertise of eye-catching storytelling. After a while, though, The Sopranos begins to feel like a dependency, if not an addiction. Like Dr Melfi, you feel a growing uneasiness and hostility toward Tony Soprano. As good a filmmaker as David Chase is, he is unable to turn your moral sense to the point where you begin to pull for the murderer, as you do, say, in Alfred Hitchcock’s Dial M for Murder.

The result is a dramatic spectacle. That it isn’t ninety minutes of explosions and fit young men defying gravity by leaping impossible lengths is a cinematic advance, I suppose, when compared to what Hollywood has been bringing to theaters in the years since The Sopranos debuted. But the appeal is, in its genes, much the same. When the screen goes black in the final episode, you feel neither fear nor pity. Nor do you sit back and contemplate the meaning of existential nihilism.

No, you look for something else to watch.

Tuesday, August 06, 2013

The babble of literary gossip

Over at Gawker this morning J. K. Trotter dishes the dirt on Philip Roth’s The Human Stain. Or, rather, its dust jacket. Trotter reports that the sliver of the anonymous letter pictured on the cover, which Delphine Roux sends to Coleman Silk in the novel—

Everyone knows you’re
sexually exploiting an
abused, illiterate
woman half your
—is the reproduction of an actual letter that Philip Roth himself received. From the rival novelist Francine du Plessix Gray (pictured below), a neighbor to Roth’s ex-wife Claire Bloom, if the former FBI agent hired by Roth to track down the anonymous letter’s sender is to be believed.

Gray denies sending any such letter, of course. And it never occurs to Trotter to ask the obvious question. If Roth actually received an anonymous note like Coleman Silk’s then who was his Faunia Farley, the “abused, illiterate woman” he was supposed to be sexually exploiting?

There is an even more obvious question. What difference, for an understanding and appreciation of the novel, could it possibly make? Like so many of those who hang around the fringes of literature, Trotter is more interested in gossip, the easy externals, than in the working machinery of fiction.

This isn’t the first time Roth has been obliged to defend The Human Stain from “the babble of literary gossip.” Last year he wrote an open letter to Wikipedia in which he patiently explained that the late Anatole Broyard, a critic for the New York Times who “passed” as white despite black parentage, was not the original of Coleman Silk. Like Mickey Sabbath and Swede Levov, his main character was “invented from scratch.” Roth explained how fiction works:

Novel writing is for the novelist a game of let’s pretend. Like most every other novelist I know, once I had what Henry James called “the germ” [which was indeed an actual event] I proceeded to pretend and to invent Faunia Farley; Les Farley; Coleman Silk; Coleman’s family background; the girlfriends of his youth; his brief professional career as a boxer; the college where he rises to be a dean. . . [etc.].

I get cramps when I repeat myself, but for the benefit of those who would like me to blog more frequently I will: the only question in fiction is how consistently and well the writer adheres to the self-determined rules of his own game. Or, as J. V. Cunningham put it with rather more elegance, his “one theme is his allegiance to his scheme.”

Thus it may be the fact that John Williams was “inspired” to write his brilliant novel Stoner by a “real-life feud” in the English department at the University of Missouri, as two journalists claim in a recent article in Vox, or it may be the fact that J. V. Cunningham was the original for William Stoner, as the late Donald E. Stanford told me when I visited him in Baton Rouge (and as I told Bryan Appleyard of the Sunday Times twenty years later). But the actual facts are irrelevant to the fiction, which depends on how Williams transmuted them into a coherent world of new and interdependent facts.

Dust jackets and the originals of fictional characters are what we babble about when we don’t know how to talk about fiction. Sportswriters do much the same, yammering away about Johnny Manziel’s partying or Tim Tebow’s praying to avoid the effort of understanding difficult games from the inside. The gossip is harmless except when it masquerades as knowledge. In literature, it threatens to reduce every novel ever written to a roman à clef. When that happens, the only thing readers will need is a key.

Wednesday, July 31, 2013

Down and out in Newport

Allison Lynn, The Exiles (Boston: New Harvest/Houghton Mifflin Harcourt, 2013). 336 pages.

Perhaps no famous quotation from literature is more contested than F. Scott Fitzgerald’s remark that there are no second acts in American life. You can add Allison Lynn’s name to the list of those who disagree, although only at the very end of her second novel and only after her characters have done everything in their power to prove Fitzgerald right.

The Exiles is the story of two young unmarried parents who own “nothing except an expensive New York lifestyle.” When the opportunity presents itself to exile themselves from the city, they grab it. A Wall Street broker, Nate Bedecker is offered a job at his firm’s satellite office in Newport, Rhode Island. His girlfriend Emily Latham has already walked away from a job in “experiential advertising” when she became appalled at herself for expending her best creative energies on a sales campaign for a potato chip. Having bought “a ’60s-era faux-Victorian” on a “postage-stamp lot,” they load their baby gear and financial papers into a Jeep Cherokee on the Friday of Columbus Day weekend, and drive the two hundred miles to their real-estate lawyer’s office to sign the final papers and pick up the keys. “They might not have a marriage license, but now they had a kid and a house to bind them,” Emily reflects. “It was the real thing.”

Things get more real when they leave the lawyer’s office to find that the Cherokee has been stolen. Stranded with the eighty-five dollars they have in their wallets, they must fend for themselves until the banks open again on Tuesday after the long holiday weekend. Almost immediately they begin making bad decisions. They cancel their credit cards, for example, before thinking to pay for a hotel room. Even if only temporarily, they find they are exiles from the 21st-century economy. Although they display some ingenuity in navigating the expenses of a strange city, their estrangement is keen. In a souvenir shop, “packed to capacity with nearly indistinguishable tchotchkes,” Emily realizes that there were people in Newport with cash to burn:

Emily no longer believed that someday she’d be one of those people herself. As a child in Cambridge, Emily had fantasized about growing up and owning a town house in Boston. In her dreams, the urban castle featured a game room, a screening room for movies, and a minifridge full of Grape Crush. She’d since given up that dream (she didn’t want it all anymore, she just wanted enough to remain consistently in the black) along with so many others. What she got, in return, was Nate.

Hardly a ringing endorsement of her relationship to the father of her child! The truth is that Emily is a deeply unhappy woman, less in exile than in isolation—and from no one so much as Nate. Her lack of attachment to him is betrayed by her reluctance to marry him. “She didn’t want to wed just because they had a child,” she tells herself. “She wanted to wed because Nate was her soul mate.” Not that he is a loser, exactly. He is a “middle-feeder,” as Lynn calls him, who “pull[s] in a base-level salary and negligible bonus.” Although he is the son of a famous architect, Nate has little ambition and less interior life. Even his baby son Trevor kicks him at night to get away from him.

Nate and Emily would be unpromising subjects for a novel if Lynn had not knotted her plot so deftly. Both of them arrive in Newport with a secret they are keeping from the other. Emily has stolen an expensive painting by a hot young New York artist from a dinner party at the apartment of rich friends. The theft is merely gossip from the New York life they have left behind—breathless email messages from friends speculating on the identity of the culprit, phone calls from the NYPD asking for interviews—until Nate discovers the painting folded up and hidden in an inside pocket of the baby’s diaper bag. Emily can no more explain why she took it than Hurstwood can explain how he stole ten thousand dollars in Sister Carrie. Forces stronger than either of them prompt the thefts. When she finds the painting in a stack of canvases in her friends' study, Emily thinks:

This piece might be worth nothing in a hundred years, but today it could fund nearly anything. Paint slopped on a piece of stretched cotton by an imbecile, yet the person who owned it possessed a slice of power. Power: Emily had so little of it herself that she’d been essentially evicted from Manhattan, the epicenter of power. Here, though, was capital on a canvas. As Emily gazed at the [painting] . . . more than anything, she simply wanted a piece of the power. She simply wanted a taste. She simply wanted a whiff of what fell in everyone else’s lap. She simply wanted.

Nate’s secret is less revealing of character, but in an age in which health is confused with morality, it is the more devastating. Both his grandfather and his father were afflicted by Huntington’s disease, an inherited disorder that causes the slow degeneration of nerve cells in the brain. There is no cure. Nate has never had himself tested, and knowing that he might be carrying the Huntington’s gene, he fathers a child with Emily without telling her of the chances that Trevor will inherit a grisly death sentence.

If her characters are not particularly admirable (or even likable), Lynn gives you something that is far more interesting. Nate and Emily make a series of choices that will have you shouting at them in frustration. Nate takes the baby and hitchhikes across Narragansett Bay to find his grandfather’s locked-up house. Emily lies to the NYPD about the night on which the painting was stolen. Nate puts the painting back in the diaper bag without saying anything to Emily. She refuses to ask old New York friends for any help, although they keep calling and emailing her. You identify, not with them as persons (limited and defective as they are), but with the decisions they make. You second-guess them. You call out what you would do in their circumstances, as if they could hear you. They drive you batty. If their problems were not real problems—stolen car, stolen art, the prospect of terminal disease—you would turn away from them quickly.

“The reader’s identification is rooted in the characters’ decisions,” Umberto Eco has said; “he either supports them or rejects them. The ethical response to a text is rooted in this identification.” Lynn is one of the few young American novelists to grasp this narrative principle instinctively. In the end, though, she blinks. She likes her characters more than they deserve. She wants them to have the second chapter their bad decisions ought to deprive them of. “They would be all right,” she concludes. “Tomorrow they’d make a fresh start, absent the traumatic evidence of their life before.” But the woman is an art thief! you cry. Surely there must be some moral consequence to her brazen covetousness! Lynn and her characters shrug as the novel closes, however. “Anything is possible,” Lynn writes. And though you suspect that she began The Exiles once she had conceived her characters’ dilemma (but not its resolution), you are willing to forgive Allison Lynn almost anything, including her last few pages, because the first three hundred are ethically mesmerizing in a way that little contemporary fiction is anymore.