I began teaching introductory U.S. history classes at the college level five years ago. These courses are always well-attended, as they fulfill a graduation requirement for other (presumably more worthwhile) majors. But the enrollment numbers are misleading: Across the United States, student interest in the Humanities is approaching all-time lows, with history, it seems, often faring the worst. In my classes, I frequently make the mistake of testing these trends, opening with surveys that ask students about history as a possible major. Excepting the occasional “LOL,” the answer is always no. Administrative fiat, not student choice, explains why our seats are full.
Rock bottom usually carries with it some opportunity, however. As schools begin to take the justifiable and entirely predictable step of officially shuttering humanities classes (and even whole departments) in response to this decline in student interest, these introductory courses—long the bane of professors everywhere (one of the best parts of making tenure is that you no longer have to teach them)—have taken on increased importance. They represent our best opportunity to change students’ minds about history and, if the stars align, to recruit a new major to our field every once in a while. In fact, the survival of our departments may end up depending in large part on the success of these classes, the majority of which will be taught, if patterns hold, by younger professors and adjuncts such as myself, who also happen to be the least financially and vocationally secure members of their departments.
For my part, I was happy to be assigned these courses, and anxious to see if I could find a way to teach history better. And in my own way, I think I succeeded—even if, as explained below, I did so at the cost of compromising my value on the academic job market. Bruised by the “LOLs,” and desperate to improve student experience in these classes, I adopted a potentially controversial way to approach teaching at the college level.
The choice of what to teach in an introductory history class is fraught with political implications. Taken broadly, most of the classes in this field exist somewhere between two interpretative extremes. One is the traditional “Glory” approach to U.S. history, a positive, even triumphant, narrative that was more commonly taught in the first half of the 20th century. At the other extreme, you have the “Gory” approach—not positive and certainly not triumphant. That perspective focuses on the contradictions and hypocrisies embedded in U.S. history, with particular emphasis on the groups that were excluded, brutalized or even decimated. With my classes, I would favor Gory—which in retrospect was a thoroughly conventional decision. With this move, my classes would hew more closely to my own interpretation of American history and also, I assumed, meet my students closer to their own systems of belief. This, I thought, would be the best way to address the “crisis” within my field. I would be in for an unpleasant surprise.
In the 1980s and 1990s, Lynne Cheney—chairman of the National Endowment for the Humanities from 1986 to 1993, a leading conservative voice in the United States, and the wife of then-future vice president Dick Cheney—came out swinging against the Gory version of U.S. history. She championed national high-school teaching standards that emphasized the Glory narrative. Much to her dismay, however, when the actual proposed standards came out, they seemed to bolster the other side. In her reading, for instance, it was acceptable to praise Mansa Musa, a famous African emperor, for his wealth, but not John D. Rockefeller. It also was fine to celebrate the Aztecs for their technological innovations without paying too much attention to the practice of human sacrifice. As Cheney saw it, this represented a double standard.
She pointed out that abolitionist Harriet Tubman was mentioned six times in the materials, while two of Tubman’s white male contemporaries, President Ulysses S. Grant and Confederate leader Robert E. Lee, got only one reference between the two of them. Such imbalances, she argued, made U.S. history too “grim and gloomy,” and pushed the “principle of inclusion” to such a point that a “new type of exclusion [had] developed.” She called it “the end of history.”
The architects of the standards, mostly college professors, pushed back against Cheney’s criticism. The two sides might as well have been from different planets. Academia, a bastion of liberalism for most of the 20th century, had redefined itself in the 1960s. The left on college campuses had become the “New Left,” shifting its focus from workers and capital to larger issues of equality and justice. In his response to Cheney, Gary Nash, a UCLA history professor and lead author of the standards, called the historians’ craft evidence-based and strenuous, but also “inherently subjective,” the best analogy being “the work of a lawyer who gathers evidence and builds a case to present to the jury.” Harriet Tubman appearing six times in the standards was no accident. College professors controlled the narrative in their classes, and they had a different story to tell.
Cheney succeeded in pushing the Senate to a 99-1 vote recommending that no federal body certify the standards, which crippled their influence to a degree, but this had little impact on how history was taught at the college level. Many applauded the direction college teaching had taken, and for good reason: The changes in favor of more diversity were exciting, and Cheney came across as completely out of touch. No sane modern academic could apply for a teaching job at a mainstream college on a Lynne Cheney platform and expect a friendly response. So when I began my college teaching career five years ago, Cheney’s story of America, not surprisingly, influenced my course content only by omission. It was like a negative-space drawing, in which I carefully filled in all the space outside of the Glory narrative. This emphasis on the Gory side was done in the name of diversity; the more diverse, the better. If my classes struggled at all, in fact, my working assumption was that I had not made them Gory enough.
These teaching decisions seemed all the more obvious given that I teach at Tennessee State University, which is one of about 100 historically black colleges and universities (HBCUs) in the United States. My students reliably resist almost all generalizations that come their way, but I’m comfortable asserting that they aren’t going to lose any sleep over Robert E. Lee being pushed out of a high school curriculum in favor of Harriet Tubman. Indeed, there is still palpable frustration among many of my students with the traditional curriculum elements that remain. Before one class on the Second World War, for instance, one of my best students told me, “This is great and all—but you need to know that I don’t give a s**t about World War II.” She explained that her high school curriculum had emphasized this topic—and others like it—to the exclusion of, by way of example, the history of slavery. In this setting, the Gory narrative seemed right on point.
And early on, I did have some optimism that my classes were going well, but I wanted to be sure. I already had the baseline (and soul-crushing) survey results from the beginning of class. Now I would add a second survey at the end, with the expectation that more students (I was starting at zero, after all) would express interest in a history major. But enthusiasm was still nonexistent. All my Goriness hadn’t moved the needle at all. I considered this a failure.
One explanation, of course, was operator error: I might have been the problem, not my curriculum choices. Here I will leave it to the reader to ascribe the appropriate amount of importance to the fact that I am a white male teaching predominantly black students.
My focus, though, would be on the things I could change, not the things I couldn’t. So I began to talk to my students ad nauseam about what they wanted a history class to accomplish. It turns out that the Glory-vs.-Gory debate was much more my fight than theirs. The New Left curriculum simply wasn’t doing the same intellectual work for my students that it was doing for me.
This was a startling revelation for me, difficult at first to digest. The New Left was not a suit of clothes I put on just to teach. I believed in it. Because of the New Left, in my opinion, we now have a better and more complete understanding of our world. It also once seemed to work well in the classroom: Interest in the humanities spiked in the 1960s and sustained much of that appeal even into the 1980s and 1990s. What had changed?
To answer this question, it’s important to remember that, notwithstanding the academy’s repudiation of the Lynne Cheney perspective, the traditional approach she represented had not, at that time, completely disappeared from the public consciousness, even if it was nowhere to be found on a college syllabus. You did not have to teach her outlook. Students already had learned or absorbed it to varying degrees through popular culture, which is why the New Left remained, at its heart, a revisionist project. The New Left in the classroom did not exist in isolation, no matter how liberal the campus became. It was always in conversation with opposing knowledge that emerged in everything from Hollywood movies to the nightly news. Pedagogically, this was enormously helpful—as the humanities always have been animated by an aspiration to foster critical-thinking skills.
But the broader circumstances in which those skills are to be exercised have changed. Whether we think of our current cultural moment as the information age, the post-truth age, or (more accurately, if more awkwardly) the death-of-consensus age, we are all experiencing a reckoning with human subjectivity. Conventional knowledge is not what it used to be. As people increasingly enter feedback loops to get news and develop their ideas, the center, epistemologically, has not held. Knowledge has fragmented, which means the New Left has lost its foil. The term “critical thinking” is meaningful only if one has some important and widespread system of thought to criticize. But there are few cultural inputs in a modern student’s life that we can safely assume will serve as targets for New Leftist critique.
I am hardly the only one concerned that the New Left may have lost its edge in the classroom. Stanford professor Sam Wineburg—a leading innovator in the field of history education—recently critiqued Howard Zinn’s foundational 1980 text A People’s History of the United States, once regarded as a crowning achievement of the New Left movement in academia. Zinn (1922-2010) was a champion of labor rights who presented America as a military aggressor with a deeply embedded history of racism. Yet A People’s History, Wineburg argues, is no better pedagogically than the textbooks it was meant to replace. The lack of footnotes is particularly problematic for Wineburg, who frames Zinn’s masterwork as trading one ideologically driven narrative of the past for another. With this text, students still act mostly as observers instead of doing the intellectual work for themselves.
Such critiques would have once been unthinkable. But it is not just the lack of footnotes that is the problem. When first published, Zinn’s book was a disruptive and influential text, which would have made it a wonderful teaching tool then—but circumstances have changed. I know this because my classes were all Zinn, all the time (even though his textbook itself never made an actual appearance). In my head, this made my classes all about counter-narrative. But to my own students, my classes were just plain old…narrative.
The best New Left teaching moments from years past, the ones most likely to shake up student preconceptions, are now more likely to work in the opposite direction, striking students as just more examples of one-sided advocacy for a particular point of view. Indeed, much of what Zinn wrote in A People’s History may strike today’s students as largely inert compared to what they’re getting from YouTube or Reddit, because what Zinn’s work served to challenge no longer exists with any sort of coherence.
If the problem was that my curriculum had lost its foil, I needed to find one. In these efforts, I benefited from the appearance of a slew of studies pointing to the educational value of reading fiction: Novels can present students with characters complex enough to disrupt readers’ assumptions (or, in psychological terms, their “schema”), thereby leading them to revise or refine their thinking. Many novels establish the schema to be disrupted within the text itself, by including both less complicated (or “flat”) schema-affirming villains and bystanders, and more complex (or “rounded”) schema-challenging protagonists who attract a reader’s fascination and empathy.
So I reorganized my courses. Half of my class material stayed mostly the same; it still reflected a liberal outlook. But I stopped calling what I was talking about the “truth,” and I no longer pretended that there wasn’t a teacher a few doors down the hallway making completely different (and equally legitimate) curriculum choices.
The other half of the class would offer something entirely new: I would confront my students with something approaching a fully realized human being, whose behavior would not push clearly in one direction or the other. My lecture on European colonization, for instance, relies on a reading of The Orenda, a 2013 novel by Canadian writer Joseph Boyden about a 17th-century Huron warrior and a young Iroquois girl. The book presents both Indigenous and European narrators. My students take an interest in each, but there is always one character, in particular, that gets their attention. That character is a Jesuit priest and willing agent of the French empire, who also seems to mean well and suffers mightily throughout the book. My students invariably want to talk about him—and not because they harbor some pro-colonization agenda. Just the opposite: What triggers their critical-thinking reflexes is the fact that they know I’m pushing them to be anti-colonization in their outlook.
Students today do struggle in some areas in comparison to their predecessors. But on critical thinking, in my experience, they are light years ahead. In the readings that I assign, students always find their way to characters whose experiences and inner monologues produce tension with my in-class narrative. As far as I can tell, this move has nothing to do with whether they are liberal or conservative. If I painted a rosier portrait of colonization, I am convinced that the classroom conversations would gravitate, in turn, toward the Indigenous protagonists. Critical thinking flourishes in my classes in the tension that exists between a viewpoint and a counterexample. So too, potentially, does student interest, which could offer one broader explanation for why some liberal college students at institutions around the country seem to be engaging with more conservative authors, but only on the sly and in their free time.
Every school is different, as is every teacher and every student. But for me, the realization described above has proven transformative. Students talk more in my classes now, they read more, and they disagree with each other (and me) more. They demonstrate a variety of opinion that had always been there, but had rarely surfaced in my earlier classes. They surprise me more now, something that almost never happened before. Besides being more enjoyable, which is no small thing, this teaches me a lot about my students, flooding me with new curriculum ideas and possibilities.
At this point, I cannot even fathom going back to the way I used to teach. My old classes made my students flat, but now they get to be round, just like the fictive protagonists who fascinate them. For the first time in my teaching career, my classes are moving the needle, as some of my students are beginning, ever so tentatively, to consider history as a major. It’s an uphill battle, to be sure. After handing in her final exam, one student recently told me, with a smile on her face: “You almost did it—you almost made me not hate history.”
The Humanities are down right now, but the game is far from over. In fact, I have become convinced that the Humanities are actually in a wonderful position to matter now more than ever.
The academic job market is extremely competitive in the United States. Look around, and you’re likely to find heavily credentialed PhDs bagging groceries or driving for Uber. While that’s bad for recent graduates, it’s a buyer’s market for schools, which makes the current state of the Humanities all the more worrisome: Schools have more talent to choose from than ever before, and yet the Humanities are still in free fall. What happens when this pool of highly qualified scholars begins to dry up or, even worse, when school administrators respond to the sustained drop in student interest by scrapping history as a general education requirement?
As a temporary faculty member, I apply for jobs every year, and in the process I have learned not only about my own shortcomings as a job candidate (which are many), but also about what exactly academia’s priorities are at the moment. In this setting, it is telling that my teaching approach, backed by nearly unanimous positive reviews at this point, is seen as a mixed bag. I had sensed as much in conversations with colleagues and in feedback from reviewers of my pedagogical writings. But a recent meeting with a career counselor at a prestigious university—who recommended straight out that I drop any detailed description of my new teaching methods from my cover letter—left little doubt. His suggestion was to talk about my teaching only in private, one-on-one conversations, where I could better gauge how these ideas were being received.
Given these vocational realities, I’ve had to reconsider my classes yet again, but this time from the perspective of my fellow academics, not my students. And it is clear that, in embracing “schema disruption,” I have lost some ideological control of my classes.
Just as the New Left used to engage students more fully the farther an example strayed from conventional knowledge, my classes seemed to thrive most when students were exploring ideas that cut hard against the presumptions of my own New Left dogma. As a result, I now find myself assigning readings that humanize people on the “wrong” side of New Left history. Students frequently take the readings in directions that make me uncomfortable. I rarely stop them. Sometimes, they even say and write things that would earn appreciative nods from Lynne Cheney. That has never been the goal, but it has clearly been one result of this teaching style.
On some level, I always knew this was going to be a potential problem. One of the growing trends in academia of late is to draw linkages between a scholar’s personal beliefs and the thrust of their research, and to hold that person accountable for how others may use their ideas, no matter how antithetical those uses may be to the author’s intentions. For instance, when Jonathan Haidt and Greg Lukianoff published The Coddling of the American Mind in 2018, tracing many of the problems in schools to the coddling that children receive at home and calling for more viewpoint diversity on campus, The Guardian published an academic counterblast that presented the book as little more than two white men complaining about their lost privilege (which, in turn, elicited a lengthy takedown by Conor Friedersdorf in The Atlantic). Others argued that the two authors, one a liberal academic and the other a longtime ACLU lawyer, were merely willing tools of the alt-right. Such attacks generally were more about the authors’ supposed political intentions and biological identity than about the soundness of their methodology or conclusions.
In anticipation of such critiques, I ensured that the first three books I assigned as I developed my teaching approach all came from diverse authors: Toni Morrison’s A Mercy, Yaa Gyasi’s Homegoing, and James McBride’s The Good Lord Bird. I picked these books, first and foremost, because they’re great. But the choice was also part of a conscious effort to head off any suspicion that my classes were meant to undermine the goal of diversity, or were a reflexive attempt simply to protect white male privilege.
In retrospect, this seems incredibly naïve, as many academics now see intellectual diversity itself as inherently problematic, or at least quickly cave to those who do. Consider the example of Roxane Gay, an English professor at Purdue University, a New York Times best-selling writer and a self-proclaimed “bad feminist.” She recently shared a conference stage with Christina Hoff Sommers, a resident scholar at the American Enterprise Institute who sometimes calls herself a “factual feminist” or “equity feminist” and is a frequent critic of what she refers to as “victim feminism.” The list of disagreements between the two is long, and Gay admitted in a recent interview that she knew many people would take issue with her participation in the event: “I think [Sommers] has some incendiary ideas, and engaging those ideas and treating them as legitimate can probably be seen as controversial, sure. [But] I feel fine about it because I don’t think anyone should live in a vacuum or an echo chamber.”
There are two values in conflict here. One holds that engaging with an opposing idea legitimizes it and makes it stronger. The other holds that engaging with opposing ideas makes your own ideas stronger. Gay chose the latter approach, at some risk. Indeed, the prevailing logic among many is that by debating Sommers, Gay became just as “offensive” as Sommers herself.
It is hard to imagine anybody with more progressive bona fides than Gay, yet even she anticipated criticism for engaging with an ideological adversary. This push toward freezing out controversial ideas is a deadly threat to any teaching approach that even comes close to my own. That career counselor was absolutely correct to recommend dropping any detailed discussion of my methods from my cover letter.
Gay is an established and influential scholar, who perhaps can take the risk of offending colleagues. Aspiring faculty members on the job market, however, are in a far different situation. Research and teaching obviously matter, but everyone strives to come across as a good potential colleague, somebody whom members of a prospective department would want to spend time with. David French, in the National Review, has discussed how prioritizing social concerns is pushing workplaces towards ideological conformity, with academia being particularly susceptible. If anything you do runs the risk of offending even a single member of a hiring committee (or the entire department for that matter), then good luck to you. A comparably qualified, more palatable candidate will surely be the better hire.
The state of the humanities defies a single, all-encompassing solution, and other approaches may be workable. Professors at Yale, for instance, claim they are getting positive results by moving in precisely the opposite direction from the one I am advocating. Yale’s history department has seen a jump in numbers. And the university’s English department is enthusiastically responding to its students’ demands to further “decolonize” the department’s course offerings and shift the curriculum away from “the writings of aristocratic white men.” On that campus, it seems, a further leftward turn is helping to save the Humanities, which is great to hear. And if that approach is indeed representative of how the humanities can stay relevant in the modern university, then everybody concerned can take a deep breath. Everything is going to be fine. We just need to continue doing the same thing we’ve been doing for the last 50 years. It would be like asking Lynne Cheney if she wouldn’t mind saving the day by talking just a little bit more about George Washington.
My own view is closer to that of Van Jones, the former advisor to Barack Obama and CNN political contributor, who decried the “ascendant” and “terrible” idea that people in college should aspire to be ideologically safe at all times. As a liberal activist, he argued that ivory tower liberalism dies “as soon as it crosses the street into the real world.” I’d go further than that, in fact: Ivory tower liberalism may be dying inside the ivory tower itself. I tried to make a New Left class work, and I couldn’t do it. If the crisis continues, academia eventually will have to decide whether the only appropriate response to Howard Zinn fatigue in our classes is to get even Zinnier.
Conservative critics of academia have a longstanding tradition of depicting liberal professors as ideologues first and educators second. In my opinion, they are wrong. For decades, liberal professors simply had no need to choose between these two roles, because their revisionist ideology also was pedagogically effective. But that has now changed. And if we don’t acknowledge that fact, then the fate not only of my own field, but of the entire Humanities, may hang in the balance.