
Why It’s OK to Speak Your Mind

The following is an excerpt from Why It’s OK to Speak Your Mind, by Hrishikesh Joshi. Routledge, 196 pages. (March 2021)

The division of cognitive labor

Modern society is only possible because of the division of labor. Without division of labor, the most we could achieve is a very meager standard of living. Imagine you had to make everything you use, by yourself, from scratch—without tools created by others, without water and food provided by others, without medicines invented by others. Most of us would not survive for a month, if that. Division of labor makes modern standards of living possible because with individuals specializing in one area, society as a whole is able to be much more productive.

Adam Smith illustrated and developed this idea in his Wealth of Nations by using the example of a pin factory. Imagine 10 people tasked with making pins. If each person had to make a whole pin, perhaps each might make 10 pins a day. Making a whole pin involves several distinct processes. Let’s suppose it involves 10 different tasks. Well, if one person had to do all these tasks, we can expect that there would be time lost as that person transitioned from one task to another. Furthermore, it would be hard to become skilled at all these different tasks—that would require lots of training and effort. But what if each person in the factory focused on just one of the 10 tasks instead? Time would be saved in a myriad of ways, and the factory would be able to produce a lot more pins—though no person by himself would be making a whole pin. As a result, the factory might produce 10,000 pins total per day, whereas it would have produced only 100 without specialization. Modern society is like this pin factory writ large.1
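To make the arithmetic of this stylized example explicit (the specific numbers are illustrative, in the spirit of Smith’s example, not historical measurements):

$$
\underbrace{10 \text{ workers} \times 10 \text{ pins/day}}_{\text{no specialization}} = 100 \text{ pins/day} \qquad \text{vs.} \qquad \underbrace{10{,}000 \text{ pins/day}}_{\text{specialized factory}}
$$

Per worker, output rises from 10 pins a day to 1,000: a hundredfold gain purely from dividing the labor.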

But division of labor in modern life is not limited to the production of physical goods. The other face of specialization is the division of cognitive labor. Our institutions of knowledge production (universities, think tanks, private research labs) reflect this feature: researchers inevitably specialize in one or two tiny sub-subfields in order to make new discoveries. Yet the division of cognitive labor has deep implications. What we are able to know is inextricably tied to what I will call the epistemic commons—the stock of facts, ideas, and perspectives that are alive in society’s discourse.

In their book, The Knowledge Illusion, cognitive scientists Steven Sloman and Philip Fernbach write: “Language, memory, attention—indeed, all mental functions—can be thought of as operating in a way that is distributed across a community according to a division of cognitive labor.”2 The authors argue that we know very little, but take ourselves to know a lot because the relevant facts are easily accessible to us. If Sloman and Fernbach are right, then our epistemic health as individuals—i.e., the extent to which our beliefs accurately represent the world—is inextricably tied to the health of the epistemic commons.

Consider the following. Do you understand how a zipper works? How about a flush toilet? These objects seem basic enough. Knowing how they work isn’t exactly rocket science. But people drastically overestimate their understanding of how these simple items function. In one study, Leon Rozenblit and Frank Keil asked people to rate from one to seven how well they understood the workings of such objects. They then asked participants to actually explain in detail how the objects worked. Many were simply unable to do so. And so when asked to revisit the question of how well they understood, subjects drastically lowered their ratings. Psychologist Rebecca Lawson performed a similar experiment where students were asked to explain, by sketching out the mechanism, how a bicycle works. The results were striking—most people were unable to complete the task, even though a bicycle is such a familiar object in our daily lives. This phenomenon, of people thinking they know much more than they actually do, has come to be known as the illusion of explanatory depth.3

Why might we fall prey to this illusion? Well, for one, the relevant information is easily accessible. If you want to know how a zipper really works, a simple Internet search will give you all the details you need. Though you may not at this moment actually know the workings of a zipper, the knowledge is “at your fingertips,” as it were. What this suggests is that our representation of the world is like a low-resolution map: “zooming in” gives a clear picture only insofar as we are able to rely upon the knowledge others have. With respect to most areas of the map, we are unable to zoom in by ourselves—and if we do, we’ll just see large pixels that don’t look like anything. The division of cognitive labor, then, renders our epistemic lives intricately tied to the efforts and contributions of others.

Furthermore, the very coarse-grained picture we have of the world will itself depend on which perspectives are “alive” in the discourse within our milieu. Consider, for example, a teenager within a deeply religious sect living in a small village. Suppose that this sect does not believe in Darwinian evolution. The arguments for evolution are not discussed, and when the topic is broached, people quickly dismiss it as an unsubstantiated theory. Some might raise what they take to be decisive counterarguments like: “How come we don’t see monkeys turning into humans now?” or “Where are the missing links?” and so on. Now the teenager might be able, in principle, to discover the powerful arguments in favor of evolution by natural selection. There is a copy of the Origin of Species at the local library, and she could also spend time delving into encyclopedias and biology textbooks. But for all intents and purposes, her map of the world has a large hole in it. What’s more, given that there are ample other constraints on her time, she might simply not find it worthwhile to inquire further.

In this way, there are lots of questions that we might lack the time or imagination to inquire about if the people we’re surrounded by consider the issue settled. Division of cognitive labor means we simply cannot independently verify all the claims we take for granted. But that in turn means that if the view our community settles on is mistaken or impoverished, the distortion easily transfers to us. Our epistemic health thus depends on the epistemic health of our milieu.

The 19th-century mathematician and philosopher W.K. Clifford underscored this social, interconnected nature of our ability to understand and describe the world in his landmark essay on the ethics of belief:

Our lives are guided by that general conception of the course of things which has been created by society for social purposes. Our words, our phrases, our forms and processes and modes of thought, are common property, fashioned and perfected from age to age; an heirloom which every succeeding generation inherits as a precious deposit and a sacred trust to be handed on to the next one, not unchanged but enlarged and purified, with some clear marks of its proper handiwork. Into this, for good or ill, is woven every belief of every man who has speech of his fellows. An awful privilege, and an awful responsibility, that we should help to create the world in which posterity will live.4

For Clifford, this meant that each of us has an important ethical responsibility: namely, to believe only on the basis of proper evidence. As I will be arguing in the next chapter, if our epistemic situation is a common resource in this way, then we all have a duty to do what we can to preserve the integrity of this resource. However, believing on the basis of proper evidence, though important in its own right, is not enough—we also have a duty to speak our minds.

Blind spots and social pressure

To set the stage for that argument, it is necessary to examine the way in which the epistemic commons is vulnerable. Tragedies of the commons arise because common resources are often susceptible to damage and degradation.5 For example, industrial pollution can destroy river ecosystems. Analogously, I will argue below, social pressure can degrade the epistemic commons.

Consider again the village described above. Why might reasons to accept evolution be systematically repressed here? Presumably because publicly defending such reasons will come at some cost to one’s social status, the maintenance of which is a strong motivation for most people.6 Somebody discussing evidence in favor of Darwinian evolution might be seen as deviant, and perhaps not a true believer in the religion. Furthermore, accusations of heresy or disbelief can invite severe repercussions in many deeply religious societies—even if such accusations end up being untrue. Thus, even if somebody were to encounter or think of a reason to believe in evolution, they might keep that thought to themselves, especially if they’re unsure of the soundness of the reason. Why risk your reputation and social standing (or worse, in many places and times) just to voice some reason you’re unsure of?

In this way, social pressure can systematically filter out reasons to believe a particular claim. The reasons that don’t get filtered out will make it look like that claim ought to be rejected—even if, had there been no such filtering, people would have been justified in believing it. In other words, filtering processes created by social pressure allow reasons to pile up on one side of an argument while those on the other side get discarded. Yet the overall balance of reasons, had open discourse prevailed, might well have supported the other side. Any time we observe social pressure to avoid giving some kinds of reasons, then, we should suspect that a worrisome blind spot exists in some form or another.

Importantly, we can’t dismiss the existence of such distortions simply by surveying the first-order evidence (i.e., evidence directly relevant to the issue at hand) presented to us. The problem is created precisely because evidence is filtered in such a way as to support one conclusion. It’s then no good to simply look at the evidence that is presented and say: “but the conclusion is obviously right!” The conclusion looks obviously right because countervailing evidence is not allowed to surface and accumulate, due to the presence of social pressure. A collective blind spot can exist in this way even if the members of a community respond rationally to the first-order evidence they have.

Lessons from the 20th century

Evidential situations like these can lead to catastrophe. If information is not freely shared within a group due to social pressure, deliberation on very important issues can be distorted, and when the suppressed information matters enough, the result can ruin many lives.

Moreover, this is not simply an exercise of the imagination. Many avoidable disasters have occurred because there was pressure not to share certain kinds of information. The Chernobyl disaster, in which a nuclear power plant malfunctioned and exploded in what is now Ukraine, is perhaps a paradigm example. Due to the authoritarian, top-down government in place at the time, individuals had incentives not to raise alarms about radiation levels, the nature of the explosion, substandard materials, etc. The result was devastating for thousands of people, many of whom continue to feel the effects of radiation poisoning to this day. The HBO series Chernobyl offers a detailed look at the deliberations and actions of various individuals as they grappled with the situation, in a way that brings out the incentives they had to distort or suppress information.7

Democracies typically do a better job of avoiding unnecessary disasters and missteps like this. The victory of the Allies in World War II can be partly attributed to the nature of information flow within democratic decision-making.8 In the democracies, members of the army were relatively more able and willing to offer information that would lead to course-correction by the upper chain of command. By contrast, within the German army and air force, people were much more hesitant to displease their superior officers with news, information, or strategic perspectives that might be seen to dampen the war effort.

Democracies are also able to allow the spread of key information through a more open media. Journalists are less prone to intimidation by the government, and thus can quickly disseminate crucial news to civilians and government officials alike. Luther Gulick, who served as a high-level American official during World War II, explained that in contrast, decisions within authoritarian governments are “hatched in secret by a small group of partially informed men and then enforced through dictatorial authority.”9 Democracies are thus able to avoid some of the epistemic pitfalls that beset authoritarian regimes because the channels of information are much freer.

This is no cause for complacency, however. Democracies are not immune to such problems. For example, the infamous Bay of Pigs fiasco, a failed US-backed landing attempt on Cuba in 1961, resulted in part because those who had doubts about the plan suppressed their reservations.10 Moreover, social pressure need not always come from government authorities. Think of college students who feel pressure to binge drink, the many of us who feel pressure to dress in particular ways, teenagers who (used to) feel pressure to smoke cigarettes—or, what’s more relevant here, people who feel pressure not to publicly express certain social or political opinions. Such forms of social pressure do not come top-down, from some governmental chain of command. Rather, they are much more spontaneous and organic. These pressures emerge from the incentives, interactions, and choices of millions of people who shape a particular culture. Democracy, then, does not solve all the informational problems endemic to authoritarian regimes.

The danger today

In her groundbreaking work on the dynamics of public opinion, political scientist Elisabeth Noelle-Neumann argued that fear of isolation can create a “spiral of silence,” where only one side of an issue is publicly defended. The core mechanism she identifies is this. People don’t want to say things that they believe might risk eliciting the disapproval of others; they don’t want to potentially lose friends and get pushed out of their social groups. There is a fear of isolation. So, instead of saying what they really think about a particular issue, such individuals keep mum. Once the process is set in motion, more and more people become silent about their true opinions.11

Spirals like these typically occur with regard to contentious, emotionally laden moral and political issues. A spiral of silence can drive even the majority opinion underground if the minority is sufficiently vocal, and especially if the mass media repeatedly and concordantly come down on one side of the issue. Eventually, the spiral of silence causes the majority opinion to effectively disappear, while the previously minority opinion becomes the dominant societal assumption.12

What does this mean for us, now? Well, for one, we shouldn’t assume, for all the reasons explored so far, that spirals of silence induced by social pressure (real or perceived) are going to line up with the truth all the time, or even most of the time. Spirals of silence are sensitive to social forces, not to the truth. Thus, they can cause society to settle on opinions that are quite misguided.

However, in order to know what policies to support or how to remedy various social problems, we need to have an accurate idea of what the social world is like. The very best of intentions can have terrible consequences if those intentions are not supplemented with an accurate picture of the world. (Indeed, under some description, more or less all of the worst actors and movements in history can be said to have “good intentions.”) But social pressure can warp our collective picture of the world without individuals being in good positions to detect the distortion. So, the more we allow spirals of silence to occur, the more chance there is for the road to hell to be paved with good intentions.

The danger we face today is that many of us have quite confident views about lots of contentious issues, as well as lots of issues that have been “settled,” not via a process of institutionalized disconfirmation, but rather through spirals of silence. But this means that the steps we might take to mitigate economic and social problems could backfire, making things worse. The risk grows the more radical, as opposed to piecemeal, the solutions we embrace. We might also be misdiagnosing what the problems are in the first place. And we might be missing various forests for the trees. Our Chernobyl, so to speak, might not involve a nuclear power plant, but might instead manifest itself in the way we conceive of and try to solve social and economic problems.

One way to respond to this predicament is to encourage epistemic humility.13 Perhaps we should all just check ourselves. This, however, is far easier said than done. Knowing our epistemic limitations in abstract terms may not actually induce humility in us (especially the loudest among us) when the rubber meets the road. The only way to properly mitigate our dangerous blind spots is for courageous individuals to speak their minds and refuse to buckle to social pressure. This is not to say that epistemic humility and other tools for critical thinking are not important or worth cultivating. But if knowledge is a collective enterprise, individual epistemic humility can only go so far. This humility, for instance, cannot prevent a Chernobyl—only people sharing their evidence can.

Institutions of knowledge production

Social pressure creates blind spots by making it costly to provide evidence on one side of an issue, while making it costless or even beneficial to provide evidence on the other side. Whenever such incentives exist, we should suspect that our resulting view of the world is warped in some way. These incentives are particularly important to address within the institutions responsible for knowledge production and dissemination: research groups of various sorts, and the fields and academic departments within the university system.

Given modern division of labor, such institutions specialize in knowledge production; the rest of society thus relies upon them for providing an accurate picture of the world. Other individuals in society, however, do not have the time or resources to check all the work produced by such institutions, and so an element of trust is necessary. Analogously, you don’t have the time or wherewithal to check all the work done by your lawyer, doctor, or accountant—when it comes to your interaction with such specialists, then, an element of trust is involved.

However, social pressures within institutions responsible for knowledge production can undermine their mission and distort their product. Science works well only in a context of institutionalized disconfirmation: that is, a situation wherein researchers are free and even incentivized to disconfirm any and all hypotheses that are in contention.

Over time, science has disconfirmed hypotheses that would seem exceedingly natural to humans observing their world. Many things that seem intuitive to us turned out to be false. The Earth, it turned out, is roughly spherical, though it looks flat from our vantage point. And while the sun looks like it goes around the Earth, the reverse is true. In the 17th century, Galileo Galilei suffered persecution at the hands of the Catholic Church for defending this idea. Science naturally works best when such costs are absent—so that it doesn’t take a Galilean personality to seek the truth.14

Modern physics has upended our intuitive picture of the world even further. The things that look “solid” to us—tables, rocks, books, etc.—are actually made mostly of empty space.15 And the fundamental units of physical reality have both particle-like and wavelike properties. Albert Einstein famously showed that time is not absolute. Whether or not two spatially distant events are simultaneous depends on the observer’s frame of reference. He further showed that space and time are intertwined in such a way that it’s best to think of them as spacetime. According to the best models we currently have to explain the behavior of large objects, gravity is the result of spacetime “bending” around massive objects.16 Trippy stuff!

How has science made these remarkable discoveries that are so far from our intuitive sense of the world? Science is a collaborative effort, and no one person can do it all by themselves, even within a sub-sub-field. Science involves enormous division of labor. But for us to be able to trust the products of science, the incentives have to be right. The incentives that individual scientists face must be aligned with finding the truth, wherever it may lie. Generally, this is the case, and that is why science has been on the whole very successful. In physics or chemistry, if you are able to find experimental data that disconfirms an important and commonly accepted hypothesis, you will receive many professional goods—you’ll likely get published in prestigious journals like Nature or Science, you might get big grants in the future, an endowed chair, maybe even the Nobel Prize.

Given these incentives, physics and chemistry are self-correcting. If a hypothesis is easily disconfirmed, it won’t last for long. Researchers, incentivized to disconfirm it, will quickly design experiments to show why the hypothesis doesn’t hold. Sloman and Fernbach write: “Scientific claims can be checked. If scientists are not telling the truth about a result or if they make a mistake, eventually they are likely to be found out because, if the issue is important enough, someone will try and fail to replicate their result.”17 Many scientists have echoed the importance of this feature of science over the years. Any time the accepted wisdom strays from the truth, then, a course-correction will quickly follow.

Understanding knowledge production as a collective endeavor, one that relies heavily on a well-maintained epistemic commons, helps us appreciate why John Stuart Mill defended his somewhat radical-sounding account of the justification of our scientific beliefs in On Liberty. He wrote:

If even the Newtonian philosophy were not permitted to be questioned, mankind could not feel as complete assurance of its truth as they now do. The beliefs which we have most warrant for, have no safeguard to rest on, but a standing invitation to the whole world to prove them unfounded.18

Thus, imagine if critics of Newton’s physics found themselves unemployable or subject to censure, threats, etc., as soon as they challenged part of the view. Could a person living in Mill’s time, the mid-19th century, trust the science of physics? Could he have reasonably believed in Newton’s laws if people faced a steep uphill battle in trying to disconfirm them, and he knew about this situation? Plausibly not. For, especially if this person is not a physicist, he lacks the wherewithal to check the researchers’ work. For all he knows, there may be good reasons to reject Newtonian physics that are just not allowed to surface.

Indeed, as it turns out, Newtonian physics was accurate only as an approximation. For macroscopic objects traveling at relatively low speeds, i.e., well below the speed of light, Newton’s laws allow us to make approximately true predictions. However, as Einstein later showed, some decades after Mill had passed away, Newtonian physics breaks down when it comes to objects moving close to the speed of light. Furthermore, while Newton assumed that space, time, and mass are absolute, Einstein showed that they are relative. Which events are simultaneous, how long an object is, and how much mass it has all depend on the observer’s frame of reference. If you are traveling at, say, half the speed of light relative to where I stand, then the length of a particular table will be quite different for you than for me. Hence, even Newtonian physics, which was by Mill’s time well established and confirmed by countless experiments, turned out not to be sacrosanct.
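To give a sense of the magnitude, the standard length-contraction formula of special relativity (a textbook result, supplied here for illustration rather than drawn from the excerpt) says that an object with rest length $L_0$ has length $L$ in a frame moving at speed $v$ along it:

$$
L = L_0 \sqrt{1 - \frac{v^2}{c^2}}, \qquad v = \frac{c}{2} \;\Rightarrow\; L = L_0 \sqrt{1 - \tfrac{1}{4}} = \frac{\sqrt{3}}{2}\, L_0 \approx 0.87\, L_0.
$$

So a table one meter long in my frame would measure roughly 87 centimeters in yours; the effect is real, but it becomes appreciable only at speeds far beyond everyday experience.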

The scientific process, then, must be structured in a certain way for it to merit our trust and reliance. If there were contrary evidence to be found, would it be discovered, published, and incorporated into the mainstream scientific consensus? The answer to this question must be yes.

In some sense, the scientific enterprise must be objective. What does such objectivity mean? Philosopher Helen Longino argues that it requires an openness to what she calls transformative critique. For Longino, science is fundamentally a social practice, and it is precisely due to this fact that its objectivity can be secured. Individual researchers are bound to have their idiosyncratic perspectives and biases. However, “science” is not simply the aggregation of the findings of individual scientists. Science is fundamentally practiced by social groups, not lone individuals. What gets counted as scientific knowledge results from social processes like peer review, attempts at replication, citation patterns, and clashes between defenders of alternative hypotheses and paradigms. This is a feature, not a bug. “Only if the products of inquiry are understood to be formed by the kind of critical discussion that is possible among a plurality of individuals about a commonly accessible phenomenon,” says Longino, “can we see how they count as knowledge rather than opinion.”19 Consequently, the more diverse points of view there are within a scientific community, the more objective the process is likely to be.

These lessons are not limited to science. Philosophy or literary criticism can be objective in this way too, according to Longino. However, the objectivity essentially depends on whether the social conditions within the field allow for robust critical discussion. A healthy field of inquiry, one whose product we have reason to take seriously, has to be one where people are incentivized to critique and disagree with ideas, such that no idea is sacred or beyond criticism.

To fix ideas, consider the philosophical field of metaethics. This subdiscipline asks foundational questions about the nature and epistemology of moral claims. These questions include, but are not limited to, the following. Are there any moral facts? If there are moral facts, are they subjective or objective? Would such facts be the sort of thing that can be discovered and investigated by the methods of natural science? How might we come to possess moral knowledge? When we say “murder is wrong,” are we expressing something more like a belief or something more like an emotion?20

Now metaethics, given my own impression of it, is a good example of a field that is working reasonably well. People defending a wide range of positions—naturalism, non-naturalism, error theory, expressivism, constructivism, Humeanism—have climbed to the top of the profession, winning prestigious awards and endowed chairs, working at elite universities, and so on. A variety of perspectives and styles of argument can thus exist and flourish within the discipline. There’s no stigma, as far as I can tell, attached to working on either side of the various debates in metaethics. Consequently, younger members of the profession feel free to follow the argument where it leads. And so many different kinds of positions within the logical space have renowned and well-respected defenders.21

When we look at the product of this discipline, then, we can be fairly confident that few stones have been left unturned. If there was an easy argument to be made against some position, it has likely already been made; the remaining fruits on the tree will probably be pretty high up. We don’t have to worry about reasons piling up on one side of the debate while being filtered out and discarded on the other. Part of why metaethics works as well as it does might have to do with the fact that its subject matter—though fascinating and stimulating—does not “excite the passions.” People just aren’t going to get mad at you for defending non-naturalism or expressivism.

Due to the absence of such social pressure, we find each position having several defenders. This in turn reinforces the willingness of metaethicists to follow the argument where it leads. There’s a kind of strength in numbers. Contrast this with a hypothetical scenario where there are 100 naturalists (i.e., those who believe that moral properties are natural properties, in principle investigable by natural science) for every non-naturalist (those who deny naturalism). In such a case, it is hard to imagine that a defender of non-naturalism would not feel isolated or socially pressured. Such pressure, whether real or perceived, would especially impact early-career researchers, such as graduate students, whose future careers are uncertain. A promising graduate student who is inclined to defend non-naturalism might think twice. The fact that naturalists are in the overwhelming majority may be taken by such a student—whether consciously or subconsciously, rightly or wrongly—to suggest that defending non-naturalism is a bad career move.

Suppose now we add a stigma to this. Imagine that defenders of non-naturalism were publicly censured and ascribed bad character traits. We can see how this would cause reasons to pile up on one side of the debate. It would create perverse incentives that would undermine the trust we could otherwise place in the products of this research community. Fortunately, as it stands, such pressures do not exist within metaethics. In fact, it would be considered grossly unprofessional to publicly ascribe bad character traits to one’s intellectual opponents within the field. A person who engaged in ad hominem attacks would quickly lose standing in the profession.

I have been describing modern physics, chemistry, and metaethics as fields that model healthy atmospheres of research (though of course they may not be perfect). But is this true across the board with respect to our institutions of knowledge production? Along with others, economist Glenn Loury suggests there is reason to worry. In a provocative 1994 article called “Self-Censorship in Public Discourse,” he writes:

Some areas of social science inquiry are so closely linked in the public mind to sensitive issues of policy that an objective, scholarly discussion of them is now impossible. Instead of open debate—where participants are prepared to be persuaded by arguments and evidence contrary to their initial presumptions—we have become accustomed to rhetorical contests, where competing camps fire volleys of data and tendentious analyses back and forth at each other.22

In a later passage, Loury claims that perverse incentives within a community of research can reduce the degree to which we should take its output seriously:

The notion of objective research—on the employment effects of the minimum wage, say, or the influence of maternal employment on child development—can have no meaning if, when the results are reported, other “scientists” are mainly concerned to pose the ad hominem query: “Just what kind of economist, sociologist, and so on would say this?” Not only will investigators be induced to censor themselves, the very way in which research is evaluated and in which consensus about “the facts” is formed will be altered. If when a study yields unpopular conclusions it is subjected to greater scrutiny, and more effort is expended toward its refutation, an obvious bias to “find what the community is looking for” will have been introduced. Thus the very way in which knowledge of the world around us is constituted can be influenced by the phenomenon of strategic expression.23

To the extent Loury is right, our epistemic condition with respect to the output of politicized fields like these is shaky. Given the mountains of evidence relevant to all these policy-adjacent debates, none of us has the time, energy, or expertise required to dig through everything and properly make up our own minds. We inevitably have to rely on the journals, textbooks, and public lectures of the practitioners of these fields. But if the incentives within these fields are skewed in the way Loury describes, then such reliance will expose us to a lopsided selection and analysis of the facts. Depending on the case, this may well put us in a worse position epistemically than either ignorance or suspension of judgment on certain topics. It would be like a jury being made to hear hours of arguments from the prosecution and none from the defense. Likely, the jury would have been better off before, when they had no opinion on the case!

All this puts us in a serious predicament, especially because, unlike metaethics (sorry metaethicists), the kinds of research Loury alludes to are extremely important to get right from a practical, policy-making perspective. The proper maintenance of the epistemic commons, when it comes to such fields of knowledge, then, is all the more important.

References:

1 Smith, An Inquiry into the Nature and Causes of the Wealth of Nations.
2 Sloman and Fernbach, The Knowledge Illusion, 121.
3 Rozenblit and Keil, “The Misunderstood Limits of Folk Science: An Illusion of Explanatory Depth”; Lawson, “The Science of Cycology: Failures to Understand How Everyday Objects Work.”
4 Clifford, “The Ethics of Belief,” 292.
5 This idea was first introduced in Hardin, “The Tragedy of the Commons.”
6 Anderson, Hildreth, and Howland, “Is the Desire for Status a Fundamental Human Motive? A Review of the Empirical Literature.”
7 Parts of the presentation are fictionalized for dramatic purposes, but the core features of the events are preserved.
8 For more on this, see Sunstein, Conformity.
9 Gulick, Administrative Reflections from World War II, 125.
10 For further analysis of the fiasco, see Janis, Groupthink; Sunstein and Hastie, Wiser: Getting Beyond Groupthink to Make Groups Smarter.
11 The worry is not merely theoretical. For instance, a recent poll shows that 62 percent of Americans say they have political opinions they are afraid to share, and 32 percent worry about potential lost job opportunities if their political views become known; see Jenkins, “Poll: 62% of Americans Say They Have Political Views They’re Afraid to Share.” Another study shows that the proportion of Americans who do not feel that they can speak their minds has tripled since the height of McCarthyism and the Red Scare of the 1950s: Gibson and Sutherland, “Keeping Your Mouth Shut: Spiraling Self-Censorship in the United States.” The study further finds that the tendency to self-censor increases with level of education.
12 Noelle-Neumann, “The Spiral of Silence: A Theory of Public Opinion.”
13 For a thorough recent treatment of how individuals should regulate their epistemic lives in the potential presence of epistemic defeaters, see Ballantyne, Knowing Our Limits.
14 For an exploration of “Galilean personalities” in the context of modern science, see Dreger, Galileo’s Middle Finger: Heretics, Activists, and One Scholar’s Search for Justice.
15 Straightforwardly so at least on one interpretation of quantum mechanics. The other interpretations, however, are even more counterintuitive. For more on the interpretive issues, see: Maudlin, Philosophy of Physics: Quantum Theory.
16 See Maudlin, Philosophy of Physics: Space and Time. One notorious persisting problem here is that quantum mechanics and general relativity do not cohere. So, the search for a unified theory continues.
17 Sloman and Fernbach, The Knowledge Illusion, 224.
18 Mill, On Liberty and Other Essays, 26.
19 Longino, Science as Social Knowledge, 74.
20 For a short but comprehensive overview of the field, see Finlay, “Four Faces of Moral Realism.”
21 For example, the late Derek Parfit, a non-naturalist, was a renowned professor at All Souls College, Oxford. But Peter Railton and Michael Smith, both metaethical naturalists, hold professorships at the University of Michigan and Princeton University respectively, and are similarly well-regarded in the profession. Other prominent metaethicists include Sharon Street at NYU and Christine Korsgaard at Harvard who defend constructivism, Mark Schroeder at USC who defends Humeanism, and Allan Gibbard (emeritus) at Michigan who has defended expressivism. Though, my colleague Kevin Vallier thinks that divine command theory often gets short shrift in the teaching and research practices of the field.
22 Loury, “Self-Censorship in Public Discourse,” 452.
23 Loury, 453.
