
Communicating Science in an Era of Post-Truth



A resurgent populist politics has galvanized public skepticism of the scientific community by portraying academics as a remote elite, distinct from the American populace. This delineation leverages our innate tribal instincts to breed distrust of expertise, providing fertile ground for bad actors to sow seeds of doubt about what we know to be true of the world. Naturally, an army of professional scientists have stood up to this rhetoric, amplifying their voices on social media and assembling political action committees (such as 314 Action) to help fight the onslaught against truth. But too often, scientists and science advocates fail to connect with their audiences. And with social divisions exacerbated since the election of Donald Trump, efforts to reach the lay public have only grown more difficult.

So how can we have meaningful conversations about science in a postmodern age in which both the populist Left and the populist Right insist that expertise is suspect and truth is relative?

For decades, it was assumed that a lack of scientific literacy contributed to the public’s adversarial attitudes towards politically polarizing—yet empirically incontrovertible—science. In other words, it was thought that the public did not value science because they did not understand science, and that ignorance was to blame for controversies surrounding science and science-based policy. This is what is known as the deficit model of science communication. The solution seemed to be clear: educate the public and they will accept the science.

Yet the deficit model cuts against a mounting body of evidence that suggests literacy is not the primary driver of the public’s attitude towards science. For instance, a group of researchers led by Yale Professor Dan Kahan conducted a survey of over 1,500 U.S. adults to assess the relationship between the public’s understanding of climate change and their assessment of the risk it poses to our society [1]. They discovered that participants with an extensive understanding of the science were actually less concerned about the potential devastation of climate change, a finding that directly conflicted with the predictions of the deficit model. In its place, Kahan offered what he called the cultural cognition thesis, which holds that an individual’s perception of science—and, in turn, assessment of risk—is primarily shaped by their social identity.

Kahan’s theory is supported by a recent study conducted at the Philipp University of Marburg in Germany, where researchers demonstrated that a participant’s interpretation of science—as well as their opinion of scientists—is significantly influenced by their perceived membership in a social community [2]. When presented with evidence that conflicted with their pre-existing worldview, participants were more likely to doubt the integrity of the science and the credibility of the scientist. The implication here is that when our deepest convictions are challenged by contradictory evidence, we tend to dismiss the science and cling to our beliefs with even greater vehemence—a phenomenon known as the backfire effect.

These studies suggest that the implementation of the deficit model and its associated attempts at ‘educating’ the public only exacerbate the divide between experts and lay citizens. Such attempts antagonize the very people with whom professional scientists need to connect, reinforcing the perception of an ivory tower and further isolating academics from the general populace. The deficit model approach, then, cannot be the solution to the polarization we see today. As Will Storr succinctly put it in his 2014 book The Unpersuadables: “Reason is no magic bullet.”

However, these studies do reveal something rather profound about us. Our opinions on science are not governed by how much we know. They are governed by how we see ourselves.

Our perception of reality is influenced by the values we hold, and our values are shaped—to an appreciable degree—by our identity. One need not go beyond politically contentious issues such as climate change, evolution, or embryonic stem cell research to see this. But the influence our values hold over our interpretation of science goes beyond political affiliation; it extends into the realm of racial, ethnic, and cultural identity as well [3]. For example, Kahan’s climate change survey uncovered differences in the perception of science between people who hold hierarchical (traditionally Eastern) and individualistic (traditionally Western) worldviews [1]. Twenty years of U.S. survey data have also revealed a burgeoning racial and ethnic gap in opinions on climate change—one now almost comparable in size to the ideological divide [4].

This should not come as a surprise to us. We have evolved to be a tribal species. Our associations with different communities inform our worldviews, and we cement those worldviews into our psyches without much critical thought. We cherish our beliefs because they inform our sense of identity and make us who we are. Will Storr put it best:

Belief is the heart of who we are and how we live our lives. And yet it is not what we think it is: not a product of intelligence or education or logic. There are invisible forces at play here.

These forces frequently overpower our critical faculties and pull our intuitions from one conviction to the next. On its face, this seems like a hopeless message for scientists eager to galvanize the public on issues that require immediate action, such as climate change. But, as strong as our innate biases are, there are bugs in our evolutionary programming that we can exploit to counteract them. It is true that our social identity defines how we interpret scientific information. However, our interpretation of that information can be shaped by the way in which it is framed [5]. By leveraging cultural values, scientists can emphasize some points over others and customize information so that it is more palatable to a particular audience. And the science is clear: we are more receptive to information we can relate to [6].

It follows, then, that to communicate science effectively in our modern, socially compartmentalized society, scientists must tailor their messages to meet the concerns, priorities, and values of those they wish to reach. By reframing the science to meet the needs of the general public, communicators can transcend our faulty evolutionary design—tribalism, belief, our affinity for emotionally laden thinking—and leverage their influence over our information processing, much like a Trojan Horse that allows facts to clear the mental barriers we erect against uncomfortable truths. I call this the adaptive model of science communication.

A few recent examples illustrate the method’s utility and success [7]. In an effort to connect with evangelicals about the importance of environmental conservation, the entomologist and author E. O. Wilson argued in his book The Creation that, as the species granted dominion over this world by God, we have a moral duty to act as responsible stewards of the environment. Using this moral framework, Dr. Wilson was able to reach a new audience by carefully linking the urgency of ecological preservation to the values enshrined in the Bible.

In a similar attempt to reach the religious community, the National Academies and the Institute of Medicine framed their joint report on the teaching of evolution in science classrooms by moving away from the antagonistic ‘science vs. religion’ narrative and suggesting that science and religious faith can be reconciled. By highlighting religious scientists (such as NIH Director Dr. Francis Collins) and religious figures who accept evolutionary science (such as the Archbishop of Canterbury Justin Welby), the report sought to persuade religious communities that they do not have to choose between empirical evidence and their religious identity. This is a controversial idea, but a useful tool nonetheless.

Framing can be utilized to ease tensions in the debate over genetic engineering as well. Research has shown that emphasizing the market competitiveness and economic benefits of genetically modified crops can help dispel doubts on the Right, while underscoring their resilience to climate change can help alleviate concerns on the Left.

Arguably, one of the greatest benefits of the adaptive model is the potential for dialogue. Communication becomes possible in its truest sense. The insufficiency of the deficit model lay in its one-way transmission of information, or what American University Professor Matthew Nisbet called a “top-down persuasion campaign” [7]. The two-way adaptive model allows the scientist and the layperson to engage in a conversation and exchange ideas and values in an effort to reach a mutual understanding of, and solution to, some of society’s most challenging problems. It is not the job of the scientist to ‘sell’ the public on science. Instead, by engaging the public in an open way, scientists can learn about what motivates people to act. In turn, the public can learn how to act from those who understand the problem best.

Although our social and cultural identity shapes the way in which we view the world, it does not shape science. Truth is not relative. There will come a time when some communities will be confronted with empirical evidence that demands a revision of their worldview. But the benefit of the adaptive model is that scientists, by connecting the science to a community’s local issues and concerns, can gently guide its members to these provocative truths while still maintaining fidelity to the relevant facts. The ultimate success of this technique does, however, hinge on one very important aspect of the relationship between scientist and layperson: trust.

Trust is the linchpin that facilitates the smooth operation of the adaptive model. The process of relating to a community about its priorities and concerns cannot occur when the public views scientists as strangers. Scientists must make it a priority in their conversations to dissolve the illusion of the ivory tower and remind their audiences that they, too, are members of the community. We are not a set of intellectual elites whose priorities are categorically opposed to those of the rest of the populace. We are people. We care for our friends, cherish our families, and love our partners just as everyone else does. We all share a common goal: to solve the challenges of today to ensure the well-being of our species tomorrow. And it is these commonalities that will help to mend the tears in our social fabric.

References:

[1] Kahan, Dan M., et al. “The polarizing impact of science literacy and numeracy on perceived climate change risks.” Nature Climate Change (2012).
[2] Nauroth, Peter, et al. “The effects of social identity threat and social identity affirmation on laypersons’ perception of scientists.” Public Understanding of Science (2017).
[3] Pearson, Adam R., and Jonathon P. Schuldt. “Bridging climate communication divides: beyond the partisan gap.” Science Communication (2015).
[4] Guber, Deborah Lynn. “A cooling climate for change? Party polarization and the politics of global warming.” American Behavioral Scientist (2013).
[5] Nisbet, M. C. “Communicating climate change: Why frames matter for public engagement.” Environment: Science and Policy for Sustainable Development (2009).
[6] Rattan, A., K. Savani, and R. Romero-Canyas. “Motivating environmental behavior by framing carbon offset requests using culturally-relevant frames.” Paper presented at the Association for Psychological Science, New York, NY (2015).
[7] Nisbet, Matthew C., and Dietram A. Scheufele. “What’s next for science communication? Promising directions and lingering distractions.” American Journal of Botany (2009).