Among the distempers of learning in our day is the habit of reading canonical fiction as if it were the only fiction in existence. In the recent n+1 pamphlet No Regrets, for example, the blogger and novelist Emily Gould complains about the “midcentury misogynists”—Bellow, Kerouac, Mailer, Roth, Updike—who populate what Amanda Hess describes in Slate as the “hypermasculine literary canon.”
The unexamined assumption is that misogyny was the stock-in-trade of these “midcentury” writers. No one feels obligated to defend the proposition, or even to examine the misogyny in any detail. What becomes clear, in leafing through women writers’ grievances against the books that “rejected” them, is that male novelists from an earlier generation are being judged by an anachronism, a criterion they could not possibly have known—feminism’s current dictates about respect for women. The moral complacency and self-congratulation implicit in the judgments worry exactly no one.
But “presentism” or “present-mindedness” is merely one fallacy behind such exercises in reading current moral fashion back into literary history. Just as bad is the radical abbreviation of an entire age’s literature by studying only those figures who now appear to be dominant. For an account of “midcentury” literary misogyny to have any validity whatever, more than a handful of writers will have to be read. (One is struck by how often the name of Philip Roth comes up, as if the postwar era should be known as the Age of Roth.)
You will familiarize yourself with all manner of generalizations about postwar American fiction in 21st-century literary journalism without ever encountering the names of Paul Horgan, Allan Seager, Willard Motley, Wright Morris, William Bradford Huie, Hortense Calisher, William Eastlake, J. F. Powers, John Leggett, George P. Elliott, Mary Lee Settle, Isaac Rosenfeld, James B. Hall, Thomas Gallagher, R. V. Cassill, Mario Puzo, Oakley Hall, Warren Miller, John Williams, Vance Bourjaily, Mark Harris, Chandler Brossard, Harry Mark Petrakis, Herbert Gold, Evan S. Connell Jr., Thomas Berger, Leo Litwak, Jack Matthews, Alison Lurie, Wallace Markfield, Edward Lewis Wallant, or Richard Yates.
I can’t be alone (can I?) in finding something romantic about the “forgotten” or “neglected” books of the past. While I love Bellow and Roth as much as the next critic—more, probably, since I named a son after Bellow—it is precisely their importance to me, their centrality in my thinking, that makes me want to know (in Trilling’s phrase) the “hum and buzz of implication” from which they emerged. I don’t read their books to feel accepted or rejected, to have my lifestyle choices affirmed, but to appreciate their distance from me, their difference. And no method of literary study is more effective at making them strange again—those natives of the foreign country that is the past—than understanding them as conventional (or not) for their times.
Books could be time machines, but rarely are. They are sadly familiar to us, because they are canonical; that is, because we read them in the present, with the standards and expectations of the present, as towering figures of the present. To be borne into the past, boats beating against the current, the best books are those which are least familiar: the books no one is assigned on any syllabus, the books discussed in no classroom. If nothing else, you have to read these “forgotten” or “neglected” books in editions from the period in which they were originally published, since many of them have never been reprinted. The cover art, the dust-jacket copy, the yellowing pages, the formal typography, the out-of-fashion author photos—even as physical objects, the books are visitors from another time and place.
Besides, there is the intellectual challenge in deciding for yourself whether a book is any good. The celebrated titles of this publishing season are surrounded by publicity; even an independent judgment sounds like an echo of the blurbs. And no one is ever surprised if you like Roth (or don’t). But what about Allan Seager or James B. Hall? Will Amos Berry or Racers to the Sun repay your time, or only waste it? Are you willing to accept the risk of recommending either of them to a friend? If you take seriously the adventure of reading you must involve yourself, sooner or later, in the romance of certain old books.
Monday, January 27, 2014
Five Books of cancer
A season into my sixth year of living with Stage IV metastatic prostate cancer, I am finally writing a book on the experience. Or, rather, what began as a wider-ranging memoir has refocused itself as a cancer book. My working title is Life on Planet Cancer (© the author). A literary critic by profession, I will be including some glances at the very best cancer writing. Where, then, if I were advising readers, would I begin on the literature?
• Mark Harris, Bang the Drum Slowly (1956). Harris tried his best to convince people that his second Henry Wiggen novel after The Southpaw (1953) was not a baseball novel. He was unsuccessful, largely because his descriptions of baseball prefer the plain speech of inside dope to syrupy lyricism (“The damn trouble [with hitting] is that knowing what is coming is only half the trick”). Harris’s story is about a third-string catcher on a major league team who is diagnosed with Hodgkin’s lymphoma at a time when the disease was still incurable (the five-year survival rate is now above 80%). A Southerner who is prone to ignorance and racism, Bruce Pearson is the butt of cruel fun on the team until the news of his cancer slowly spreads through the roster, bringing the New York Mammoths together. Bruce’s attitude toward his own illness, lacking in self-pity, is pitch perfect. And its effect on hardened professional athletes, who do not permit any softness or sentimentality in their lives, is utterly convincing. The result may be the best single account of a death from cancer ever written.
• Peter De Vries, The Blood of the Lamb (1961). If Harris’s is not the best account of a death from cancer ever written then De Vries’s is. Many readers will prefer De Vries’s, because it is the more profound. (I will not shy from the word if you won’t.) Based on the death of De Vries’s own ten-year-old daughter Emily from leukemia, The Blood of the Lamb is the work of a deeply religious man, a Calvinist, who believed that God need not exist to save us. This wintry faith, as Martin Marty calls it in A Cry of Absence (another fine cancer book), a faith intimate with God’s absence, is strange and unfamiliar to most Americans, who are more used to the flush-faced, hallelujah, pass-the-collection-plate religious conviction of evangelicalism. Don Wanderhope, De Vries’s narrator, the father of the dying girl, concludes that “man’s search for meaning” is doomed to disappointment. But if “Human life ‘means’ nothing” it doesn’t follow that it is without truth. “Blessed are they that comfort, for they too have mourned, may be more likely the human truth”—this is Wanderhope’s conclusion in his desperate grief. One of the most eviscerating books you will ever read.
• Aleksandr Solzhenitsyn, Cancer Ward (1968). The first thing everyone says about Solzhenitsyn’s great 500-page novel is that it treats cancer as a metaphor for the totalitarian state. Perhaps it is time to turn the commonplace inside out: totalitarianism is, for Solzhenitsyn, a metaphor for cancer. He himself suffered from an undiagnosed cancer in the early ’fifties while incarcerated in a camp for political prisoners in Kazakhstan. Cancer is, he writes in the novel, a “perpetual exile.” There is no returning from it to a happy life of uncomplicated freedom. A peculiarly Russian vision, reeking of Dostoyevskian tragedy and pessimism? (Also the emotional byproduct of a third-rate medical system, which saved few and palliated the suffering of even fewer?) Yes, and all the more worth being soaked in as a consequence. The popular American attitude toward cancer is a dance to a sentimental tune about hope.
• Siddhartha Mukherjee, The Emperor of All Maladies: A Biography of Cancer (2010). An oncologist and cancer researcher at Columbia University Medical Center, Mukherjee (no relation to the novelist Bharati Mukherjee) gave his 470-page book a misleading subtitle. The Emperor of All Maladies is less cancer’s life-story than an informal and anecdotal survey of cancer research and treatment since the Second World War. Although it would have been improved by a tighter structure and perhaps a more exhaustive aim, its engaging tone and focus on the personalities involved in the “war on cancer” guaranteed the book a Pulitzer Prize. There is, however, no reason to read it from cover to cover. Like an oral history, it can be read a chapter here and then a chapter fifty pages on without loss or confusion. Mukherjee is good at cramming information into small spaces and clarifying the sometimes daunting language of medicine for general readers. He succeeds in his ambition to make cancer research into a modern adventure, and if this is not the same as writing the biography of cancer, it is as close as we are likely to get for a while; and not without value and pleasure.
• Christopher Hitchens, Mortality (2012). First diagnosed with esophageal cancer in June 2010, Hitchens died a year and a half later. In his last months he wrote a series of seven articles for Vanity Fair on his experience. These were collected and published in a short 93-page book along with some pages of notes toward a last unfinished article, which should probably have been discarded. The essays are characterized by Hitchens’s distinctive brand of honesty (“the novelty of a diagnosis of malignant cancer has the tendency to wear off”) and a unique ability to notice things that other writers on cancer have overlooked (for a cancer sufferer, for example—Hitchens’s preferred term—undergoing blood tests goes from being an easy routine to a painful ordeal). No other cancer book has quite the tone of immediacy that Hitchens’s has.
There are several memoirs that might also be mentioned, especially Lucy Grealy’s Autobiography of a Face (1994), Reynolds Price’s A Whole New Life (1994), Gillian Rose’s Love’s Work (1995), and Wilfrid Sheed’s In Love with Daylight (1995), and they are perhaps the next books that should be read. Philip Roth’s Patrimony (1991) is about the suffering from cancer as watched helplessly from outside—Roth’s father Herman died of a brain tumor. The American poets L. E. Sissman and Jane Kenyon, both of whom died from cancer, wrote sharply and movingly of the disease. And I have discussed Anatole Broyard’s Intoxicated by My Illness (1993) at some length elsewhere, because Broyard died of the same cancer I am living with. If I could bring only five books with me to the hospital, though, these are the five I would bring.
Entrepreneurs of the spirit
Will Wilkinson laments the decline of “old school blogging,” the original style of blogging—before the media outlets launched their group blogs and bought up the first-generation “personal bloggers”—in which the blogger composed a self, day by day, “put[ting] things out there, broadcast[ing] bits of [his] mind,” and in return finding a place for himself “as a node in the social world.”
What Wilkinson has to say about the self is provocative and largely true, I think. The self is a convergence of loyalties and enthusiasms and beliefs and habits. That there is a “stable” self, which persists through the flux of illness and health and better and worse, is an “illusion.” Wilkinson’s best line is that the “self is more like a URL,” an “address in a web of obligation and social expectation.”
But I am even more interested in what Wilkinson has to say—or suggest, really—about the economics of blogging. “Old school blogging,” as he calls it, belongs to a “personal gift economy.” The blogger gives away his reflections, and “in return maybe you get some attention, which is nice, and some gratitude, which is even nicer.”
The minute a blogger joins the staff of a magazine, though, everything changes. Everyone likes to get paid for what he does—I am no exception—but blogging for pay changes forever the blogger’s relation to his audience. The “web of obligation and social expectation,” into which the blogger-for-free inserts himself, is narrowed and focused. In reality, his audience shrinks to one (or, at most, a handful): his boss or bosses.
When the blogger becomes a “channel” for a media organization (to use Wilkinson’s term for it), he must adhere to more than the house style. He must also trim his judgment to suit the editorial fashions of his employer. Even where the blogger thinks of himself as a member of the magazine’s family, as I thought of myself at Commentary, conflict is inevitable.
As a literary practice, blogging is fundamentally an exercise of intellectual independence. A blogger no more thinks in a house style than Thoreau did in writing his journal. Writing as a staff member of a magazine, though (even when, as I did at Commentary, you are writing a one-person blog), you must second-guess yourself with regularity, asking whether you are setting yourself, even if accidentally, at odds with editorial policy.
One of the incidents that soured my working relationship with John Podhoretz, Commentary’s editor, was when I reviewed Hillel Halkin’s novel Melisande! What Are Dreams? Halkin’s novel was released in England by Granta, but was not being published in America. It never occurred to me that this would be an issue—Halkin was a longtime contributor to the magazine, the novel was brilliant and memorable—but Podhoretz was justifiably annoyed with me, because the magazine’s policy was not to review books that are not published in this country.
In taking on Halkin’s novel, I acted like a blogger, not a staff writer. I failed to recognize that, when you write for pay, you no longer write for yourself. To the reading public, you do not even write under your own name. When I praised Stone Arabia in a review on the Literary Commentary blog, Dana Spiotta’s publisher whipped my praise into a blurb and attributed it to Commentary. My name went poof!
The other day a trio of journalism students at Ohio State University came by to interview me for a class project. “What would you say to my generation about the future of journalism?” one of them asked to wind up the interview. “I’d say the future is both exciting and frightening,” I replied—“or maybe that’s the same thing.” The internet has made it possible for anyone to set up as a journalist—that is, to write regularly, on any subject that catches his fancy, as if keeping a journal. No one can tell anyone else what to write or not to write, or in what style. The marvel is freedom. The problem, as always, is how to monetize the work.
I had no practical solutions, beyond repeating the naïve ’sixties slogan “If you do the right thing money will come” and telling them about the novelist Roland Merullo, who worked as a carpenter while writing his first novels. Complete editorial freedom is available for perhaps the first time in the history of journalism, I told the students—but only if they were willing (God help me) to become “entrepreneurs of the spirit” and not employees.
Whether the “old school” and “personal” bloggers can return to their first spiritual entrepreneurship, after having their literary thinking altered forever by writing for pay, is a question that may concern more than themselves alone. The answer may also suggest something about the future of journalistic freedom.
Wednesday, January 15, 2014
Reply to critics of “Academe Quits Me”
A few days ago, the economist Thomas Sowell found himself obligated to write an op-ed column in which he pointed out that “trickle-down economics”—the economic policy of the political right, according to the political left—is non-existent. It is attacked widely on the left, Sowell observed, but “none of those who denounce a ‘trickle-down’ theory can quote anybody who actually advocated it.”
I thought of Sowell’s column yesterday when I studied the readers’ comments on my essay “Academe Quits Me,” reprinted at Inside Higher Ed. No fewer than ten commentators were quick to denounce me for “rehash[ing] the canon wars of a previous generation,” in the words of one, or “likening [my] experience of being let go with the Grand Fall of the English Canon,” as another said.
I’ve reread my essay closely several times now and for the life of me I can’t find the word canon anywhere in it. Is it possible I wrote the word in my sleep? One or two commentators acknowledged (if they couldn’t bring themselves to say so outright) that I never actually wrote what I was being denounced for writing. But the denunciations were valid anyhow, because my essay, in the words of one commentator, “sounds like someone who feels that English departments should only teach courses that discuss White men and eurocentric studies,” and “Myers implied that voices and opinions should be excluded,” as another said.
By the magic of sounds like and implies, a text can be made to say anything the critic wants it to say! I can’t think of a stronger case for improving the teaching of English than the example of such wild-eyed readers, who project their bogies and night sweats into texts that spook them.
Even if it is their habit to express themselves in talking points and received ideas, though, it doesn’t follow that everyone else lives by the same habit. I have been writing publicly for more than a quarter century now, and nowhere in anything I have written do I call for a return to the canon. I mean nowhere. If there has been one consistency in my writing it has been this. For more than two-and-a-half decades I have dissented from both sides in the canon wars.
One of my first published essays—published in the Sewanee Review in 1989 as I was just beginning my academic career—was called “The Bogey of the Canon.” The title summarizes my argument. To spell it out further:

To the revising of canons there is no end. But the canon, the “old canon,” the “patriarchal canon,” the “restricted, canonical list,” the “fixed repertory”—this is a bogey. It has never existed. It has merely changed, from critic to critic and generation to generation; it bears no marks of persistence as well as change. . . . Those who fear canons have seen a pattern where there is only randomness, and have mistaken a selection for a principle. The name they have given to this is “the canon,” but there is not enough of an identity among canons for there to be any one canon. It cannot be said to be a substantial entity.

In light of the comments to my essay yesterday, I’d go one step farther now. The canon is the name by which calls for the restoration of order and coherence to literary study are misunderstood in advance and rejected out of hand without additional examination.
What I actually wrote in “Academe Quits Me” is that academic literary study is no longer a “common pursuit.” It does not represent a “common body of knowledge.” It lacks “common disciplinary conceptions.” Does it say more about me or about my commentators that the only common pursuit they can imagine, the only common disciplinary conception, is a “canon” of “dead white males”?
Most English professors secretly know that I am right, however, even if they would never permit themselves to say so publicly. In my teaching, I have learned that I cannot assume any common background knowledge, not even in English majors.
Last spring I taught one of those boutique courses that could have been offered at the University of Minnesota this semester: an honors seminar on Evil in the Postwar American Novel. Among the books I taught was Cormac McCarthy’s Blood Meridian. I began the discussion by raising the question of Faulkner’s influence upon McCarthy. My students looked at me blankly. “How many of you have read Faulkner?” I asked. No one raised a hand. “How many of you have heard of Faulkner?” Three hands went up. In an upper-division seminar on Philip Roth, pretty much the same thing. Not one student had read Saul Bellow.
In “Academe Quits Me,” I warn that the loss of a common tradition in English study leaves every English professor exposed. No one is indispensable to a university, because no curricular subject, no great author, is indispensable. A younger friend wrote to me privately yesterday that when he was in college not long ago “you could take Shakespeare’s Treatment of Women, but not Shakespeare.”
I’m not opposed to the inclusion of underrepresented voices—in principle I’m not even opposed to film studies as a part of English—but my critics have failed to grasp my warning. Where nothing is considered essential knowledge, then nothing (not even film or underrepresented voices) is guaranteed a niche in the world of institutionalized scholarship. What my tenured colleagues fail to realize is that their sense of having a secure and permanent place in English is an illusion created by tenure. Nothing else protects them, because nothing else they contribute to scholarship or the academic community is considered necessary, not even by them. They themselves, by acquiescing in the loss of the common pursuit, have made themselves superfluous.
And if they think that a university cannot take away their salaries and their offices while continuing to recognize their tenure—if they think that entire English departments cannot be eliminated—they had better think again. Because such things have already happened at more than one university in this country.
Wednesday, January 08, 2014
Academe quits me
Tomorrow I will step into a classroom to begin the last semester of a 24-year teaching career. Don’t get me wrong. I am not retiring. I am not “burned out.” The truth is rather more banal. Ohio State University will not be renewing my three-year contract when it expires in the spring. The problem is tenure: with another three-year contract, I would become eligible for tenure. In an era of tight budgets, there is neither money nor place for a 61-year-old white male professor who has never really fit in nor tried very hard to. (Leave aside my heterodox politics and hard-to-credit publication record.) My feelings are like glue that will not set. The pieces fall apart in my hands.
This essay is not a contribution to the I-Quit-Academe genre. (A more accurate title in my case would be Academe Quits Me.) Although I have become uncomfortably aware that I am out of step with the purposeful march of the 21st-century university (or maybe I just never adjusted to Ohio State), gladly would I have learned and gladly continued to teach for as long as my students would have had me. The decision, though, was not my students’ to make. And I’m not at all sure that a majority would have voted to keep me around, even if they had been polled. My salary may not be large (a rounding error above the median income for white families in the U.S.), but the university can offer part-time work to three desperate adjuncts for what it pays me. A lifetime of learning has never been cost-effective, and in today’s university—at least on the side of campus where the humanities are badly housed—no other criterion is thinkable.
My experience is a prelude to what will be happening, sooner rather than later, to many of my colleagues. Humanities course enrollments are down to seven percent of full-time student hours, but humanities professors make up forty-five percent of the faculty. The imbalance cannot last. PhD programs go on awarding PhD’s to young men and women who will never find an academic job at a living wage. (A nearby university—a university with a solid ranking from U.S. News and World Report—pays adjuncts $1,500 per course. Just to toe the poverty line a young professor with a husband and a child would have to teach thirteen courses a year.) If only as retribution for the decades-long exploitation of part-time adjuncts and graduate assistants, nine of every ten PhD programs in English should be closed down—immediately. Meanwhile, the senior faculty fiddles away its time teaching precious specialties.
Consider some of the undergraduate courses being offered in English this semester at the University of Minnesota:
• Poems about Cities
• Studies in Narrative: The End of the World in Literature & History
• Studies in Film: Seductions: Film/Gender/Desire
• The Original Walking Dead in Victorian England
• Contemporary Literatures and Cultures: North American Imperialisms and Colonialisms
• Gay, Lesbian, Bisexual, and Transgendered Literature: Family as Origin and Invention
• Women Writing: Nags, Hags, and Vixens
• The Image on the Page
• Bodies, Selves, Texts
• Consumer Culture and Globalization
• The Western: Looking Awry
• Dreams and Middle English Dream Visions

To be fair, there are also four sections of Shakespeare being offered there this semester, although these are outnumbered by five sections of Literature of Public Life (whatever that is). Maybe I’m missing something, but this course list does not make me salivate to enroll at Minnesota the way that Addison Schacht salivates to enroll in classics at the University of Chicago in Sam Munson’s 2010 novel The November Criminals:

I could study the major texts of Latin literature, to say nothing of higher-level philological pursuits, all the time. Do you know how much that excites me? Not having to do classes whose subjects are hugely, impossibly vague—like World History, like English [like Literature of Public Life]. You know, to anchor them? So they don’t dissolve because of their meaninglessness? I’ve looked through the sample [U of C] catalog. Holy fuck! Satire and the Silver Age. The Roman Novel. Love and Death: Eros and Transformation in Ovid. The Founding of Epic Meter. I salivated when I saw these names, because they indicate this whole world of knowledge from which I am excluded, and which I can win my way into, with luck and endurance.

That’s it exactly. The Minnesota course list does not indicate a whole world of knowledge. It indicates a miscellany of short-lived faculty enthusiasms.
More than two decades ago Alvin Kernan complained that English study “fail[s] to meet the academic requirement that true knowledge define the object it studies and systematize its analytic method to at least some modest degree,” but by then the failure itself was already two decades old. About the only thing English professors have agreed upon since the early ’seventies is that they agree on nothing, and besides, agreement is beside the question. Teaching the disagreement: that’s about as close as anyone has come to restoring a sense of order to English.
In 1952, at the height of his fame, F. R. Leavis entitled a collection of essays The Common Pursuit. It was his name for the academic study of literature. No one takes the idea seriously any more, but nor does anyone ask the obvious followup. If English literature is not a common pursuit—not a “great tradition,” to use Leavis’s other famous title—then what is it doing in the curriculum? What is the rationale for studying it?
My own career (so called) suggests the answer. Namely: where there is no common body of knowledge, no common disciplinary conceptions, there is nothing that is indispensable. Any claim to expertise is arbitrary and subject to dismissal. After twenty-four years of patiently acquiring literary knowledge—plus the five years spent in graduate school at Northwestern, “exult[ing] over triumphs so minor,” as Larry McMurtry says in Moving On, “they would have been unnoticeable in any other context”—I have been informed that my knowledge is no longer needed. As Cardinal Newman warned, knowledge really is an end in itself. I fill no gap in the department, because there is no shimmering and comprehensive surface of knowledge in which any gaps might appear. Like everyone else in English, I am an extra, and the offloading of an extra is never reported or experienced as a loss.
I feel the loss, keenly, of my self-image. For twenty-four years I have been an English professor. Come the spring, what will I be? My colleagues will barely notice that I am gone, but what they have yet to grasp is that the rest of the university will barely notice when they too are gone, or at least severely reduced in numbers—within the decade, I’d say.