I have a dream—a recurring dream in which I am fired from my job teaching history at a public university. Specifically, my dismissal has to do with Joseph Conrad’s novella Heart of Darkness. Some nights, I am fired for teaching the book in class in spite of the discomfort the narrator’s racist language causes some of my students. On other nights, though, I am fired for not teaching it despite its place in the canon of great works of Western literature. But the result is the same: I am fired from a job that I thought was protected by tenure. It turns out, however, that tenure is no more timeless than I am, and I find that I am uncertain whether this is cause for regret or rejoicing.
Since the 1970s, the percentage of academics holding tenure—the guarantee of lifetime job security for full-time faculty at public and private universities—has declined precipitously. Over the past four decades, warns a recent report from the American Association of University Professors (AAUP):
[T]he proportion of the academic labor force holding full-time tenured positions has declined by 26 percent and the share holding full-time tenure-track positions has declined by an astonishing 50 percent. Conversely, there has been a 62 percent increase in full-time non-tenure-track faculty appointments and a 70 percent increase in part-time instructional faculty appointments.
As the presence of tenure shrinks, however, its shadow lengthens across academe. But this is a paradox only for those unaware of this country’s long slog across the battlefields of our never-ending culture war. Long before drag queens and kneeling football players arrived on the scene, subversive academics were a favorite piñata of American conservatives. Lately, though, the professoriate has also come under attack from the American Left. In a recent essay for the Chronicle of Higher Education, the progressive literary critic William Egginton argued that tenure “promotes unjust labor relations; discourages risky and innovative thinking during scholars’ most productive years; and intensifies the tendency of faculty to reproduce themselves, not only by area and interest, but also by gender, race, and class.”
The upshot is that the tenured academic now falls somewhere between the blobfish and the funnel-eared bat on the list of the planet’s most repellent endangered species. Who could blame those who simply shrug their shoulders at our imminent demise or actively cheer it on? In an age of multiplying existential threats, the future of tenure seems a marginal concern at most. It appears even more marginal, if not wholly inconsequential, given the growing possibility that artificial intelligence will soon render tenured and adjunct professors alike redundant.
But what seems marginal can, at times, be consequential. Consider the origins of tenure. In 1900, a sociologist at Stanford University named Edward Ross was fired at the behest of Jane Stanford, the widow of the university’s founder, after Ross criticized her railroad company’s use of Chinese immigrant workers. Stanford’s president no doubt concluded that firing Ross, though it prompted a few of his colleagues to resign in protest, had solved the problem. Instead, the controversy sparked a national debate over freedom of expression in the academy, which, in 1915, led to the creation of the AAUP.
Led by the philosophers Arthur Lovejoy and John Dewey, the AAUP insisted that tenure was needed to protect the right of professors to speak their minds without fear of retribution. In 1940, the AAUP reiterated but also revised this foundational claim. In its “Statement of Principles on Academic Freedom and Tenure,” the association again called for the guarantee of academic freedom, but reminded its rank and file to avoid “introducing controversial matter which has no relation to their subject.”
One might as well remind a blobfish to avoid sagging jowls. While the AAUP’s caution made sense for professors of mathematics and management, it made less sense for professors of the humanities. This helps to explain the ire of many conservative critics who believe that historians should stick to the great events and people that have made this country great. This is no doubt what is driving the efforts in states like Florida and Texas to limit or even abolish tenure in public universities. In Texas (my home state), Lieutenant Governor Dan Patrick is preparing legislation to eliminate tenure for all new university hires and to make the teaching of, yes, Critical Race Theory “cause for a tenured professor to be dismissed.”
Progressive critics, on the other hand, insist that historians stick to the grim events and gruesome people that made this country unequal and unjust. This camp’s worldview reflects what George Packer, in a brilliant essay for The Atlantic, has called the new historical fatalism, a method that combines inevitability and essentialism: “The present is forever trapped in the past and defined by the worst of it.” Or, more bluntly—and with apologies to Faulkner—a worldview in which the past is never dead or even past, but alive and devouring our future.
Yet while these battles rage on our small screens, it is business as usual for most tenured professors. Ironically, this might be our profession’s greatest threat. We continue to teach a couple of courses every semester, but we do so to an ever-shrinking number of students. In 2020, if we subtract students majoring in “communications”—a popular major where students, based on my own experience, communicate mostly in memes and TikTok videos—scarcely four percent of university graduates majored in one of the four remaining humanities disciplines: history, English, philosophy, and languages. According to the staid Hechinger Report, these “trends are all heading downward and showing no signs of bottoming out or stabilizing.”
We spur this flight from our classrooms by continuing to write the occasional peer-reviewed article and even more occasional monograph. Once published or pixelated, the journal article or book will perhaps be read by one or two specialists in the same sub-field. There is an even slighter chance that it will be read by a promotion and tenure committee. A sobering 2011 study by Mark Bauerlein, a professor of English at Emory University, looked at the publication of peer-reviewed articles by the English faculty at four large public universities. These scholars were all impressively fertile, churning out dozens of articles in this or that quarterly every year. But they were also impressively futile: as Bauerlein’s graphs reveal, most of these articles failed to receive a single citation in another (and, no doubt, just as widely ignored) article or monograph.
This is a strange situation. If no one reads a book or article, does it really exist? One does not need to be a Zen Buddhist to figure out that our profession will itself soon cease to exist if we continue to write only for one another. We seem to take perverse pride in not writing for or connecting with non-academic readers whose taxes pay our salaries. Consider the observation made in 2008 by the former president of the American Historical Association, Gabrielle Spiegel:
In defending the practice of history, or the humanities more generally, academics who have dedicated their lives to such study tend to rely on old shibboleths about the importance of understanding history, art, languages, and so on, and understanding what it means to be “human” … But as the term “shibboleth” implies, we are often, I think, simply talking to each other.
We continue to talk to each other about these shibboleths well into our 70s and 80s, and we do so while charging universities for our time. In 1994, the federal exemption that had allowed universities to require professors to retire at the age of 70 was allowed to sunset. Consequently, the sun need never set on professors who cling to their posts—until, that is, we are wheeled off to the nearest sunset facility. In 2013, a New York University study found that, since the removal of the mandatory age ceiling, the average retirement age of professors had risen from 69 to 73. No doubt that trend continues apace, as does the all-too-human refusal to accept that our best days are behind us.
In her AHA speech, Spiegel worried that, because of our refusal to change our ways, “arguments for the importance of history and the humanities are losing their purchase.” Fifteen years later, it is a purchase we have nearly lost, in part because tenure has kept us in a state of professional sedation. Rather than respond reflexively, circling the wagons around tenure when culture warriors like Patrick and DeSantis come gunning for us, we should respond reflectively.
Yes, tenure does allow us, without fear of losing our livelihood, to pursue inconvenient facts unearthed in historical archives or inconvenient claims unveiled by dialectical reasoning. But tenure is also allowing us to write and teach our profession into well-earned irrelevance. Unless we find a way to reform a practice meant to safeguard our discipline, we risk not just the future of tenure, but also the future of the humanities.