The Science of Why People Reject Science


Accounting for social and ideological factors in science communication can improve the quality of debates and decision-making when it comes to climate change and other hot-button scientific issues.


 

In his 1748 treatise An Enquiry Concerning Human Understanding, David Hume famously wrote, “A wise man…proportions his belief to the evidence.” This sentiment has become a familiar fixture of our modern lexicon, a basic standard for intellectual honesty and a prescription for belief formation in an ideal world.

Of course, it was also Hume who, in an earlier volume, wrote that reason is “the slave of the passions.” The original context of the passage concerns moral action and the precedence of our goals, motives, and desires over reason; specifically, Hume considered reason an ex post facto force in thrall to our moral impulses. But the concept extends naturally to the psychology of belief formation more generally. Indeed, once we factor in culture, ideology, human psychology, and emotionally laden values, the image of the dispassionate reasoner responsive to the best evidence and argument gets cut down to size.

The ease with which the reasoning self can be subverted should clue us in that there is more going on behind the scenes than cold-blooded evaluation of facts. That emotion and ideology so routinely triumph over reason and evidence is a feature of human psychology we should all be mindful of, but its underlying causes should be of special interest to proponents of science and other veterans of fact-based intervention.

In particular, what we know about cognition has great import for the information deficit model, the quixotic and woefully under-supported idea that the more factual information one is exposed to, the more likely one is to change one’s mind. As it turns out, our neural wiring tends to lead us in the opposite direction: our emotional attachment to our beliefs and values prods us to double down and organize contradictory information in a way that is consistent with whatever beliefs we already hold — the so-called “backfire effect.” As Chris Mooney put it in 2011: “In other words, when we think we’re reasoning, we may instead be rationalizing.”

I think that on some level anyone who has dealt with science denial for an extended period of time knows this intuitively. Adopting the role of serial debunker, whether online or off, can be a thankless undertaking. We figure that one more “hockey stick” graph, one more image of the vanishing Arctic, one more year of record-topping warmth, one more report on ocean acidification or extreme weather will settle the matter. But it rarely does; discredited claims persist, and frustration sets in.

That these efforts so often end in failure demonstrates the sheer poverty of “Just the facts, ma’am” approaches to persuasion. Summarizing a study on this very topic, Marty Kaplan writes: “It turns out that in the public realm, a lack of information isn’t the real problem. The hurdle is how our minds work, no matter how smart we think we are.” While access to more and higher-quality information sounds like a fundamental feature of consensus building, it ultimately matters less than how we situate that information within our preexisting social matrix.
 

 
This is not to say that fighting denial with facts never works, or that people are incapable of changing their minds on anything of consequence. For instance, someone who is well trained in logic, critical thinking, and the evidence-based evaluation of claims in one domain may yield to the facts in another, even when those facts conflict with their prior beliefs. This does, however, appear to be the exception rather than the rule. For the most part, the rationality we so aspire to succumbs to inherent constraints and limitations in our cognitive makeup.

What’s become increasingly clear is that scientific literacy alone can’t solve the problem. Indeed, clashes with creationists and climate deniers have us banging our collective heads against the wall not because we don’t have the facts on our side, but because we’re bumping up against deeply entrenched cultural norms and attitudes. Gaining traction in disputes over settled science, it seems, is less about closing the information gap than about translating the science so that the recipient can process it without feeling that their social identity and worldview are at risk. Science denial thus stems from the felt sense that scientific beliefs are incompatible with the received wisdom of one’s social sphere.

The research supporting this view is now extensive. One of the pioneers in the study of the connection between beliefs, evidence, and culture — what we might call ‘value-based belief formation’ — is Dan Kahan, a professor of law and psychology at Yale. The theory he and his colleagues have proposed as a more robust alternative to the deficit model is identity-protective cognition, otherwise known as cultural cognition.

According to Kahan and others, we perceive and interpret scientific facts largely as symbols of cultural affinities. Certain facts especially — think climate change, evolution, GMOs, stem cell research, vaccination — carry, by dint of being politicized, cultural meaning that has nothing to do with the validity of the underlying science. Depending on the strength of one’s political biases, facts may never enter the analysis at all, except to be argued against and swept aside in order to reinforce a fixed ideological position. The more invested we are in a certain cultural identity, and the more politicized the scientific issue, the more reliably these impulses operate.

Espousing beliefs that are out of step with one’s social group, moreover, carries social and interpersonal risks (e.g., loss of trust from peers) for those for whom ideology or ‘party’ has become a deep and meaningful part of their self-concept. This is what prompts the identity-protective mechanism: we selectively credit and discredit evidence in response to those risks. In short, objective evidence is merely subjective fuel for the ideologically beholden. The psychology of group affiliation and competition for social status frequently overrides rational assessment of scientific knowledge when it comes to climate change, evolution, vaccine safety, and so forth.

TL;DR: It’s about cultural identity and values, not facts. Echoing Hume, we are not ruled by reason.

I see Kahan & Co.’s research as a powerful commentary on the hyper-polarization of American politics and culture, and a cogent, if ineluctably depressing, counter-narrative to the information deficit model practiced by so many science communicators today. This leaves us on somewhat insecure footing. We cannot allow misinformation to spread unchallenged, but neither can we continue to labor under the faulty expectation that dousing the flames of denial with more facts and figures will extinguish them; if anything, it fans them. To the extent this body of research paints an accurate picture of how we form beliefs, we would do well to absorb its insights and apply them in the arena of partisan politics and antiscience contrarianism.

As Chris Mooney wrote in 2012: “A more scientific understanding of persuasion, then, should not be seen as threatening. It’s actually an opportunity to do better — to be more effective and politically successful.” Charging forward with false notions of human psychology dooms our efforts before they get off the ground. That our ideological commitments often prod us to double down and organize dissonant information in ways that cohere with the expectations of our cultural group — and, indeed, that the more informed and literate we are, the more susceptible we are to this effect — is invaluable intel for countering the war on facts.

What does this look like in practice? The strategy urged by Kahan and his colleagues is to lead with values, appeal to common concepts and desires, and emphasize shared goals. Scientific minutiae are no one’s friend in these conversations. Instead of a rote rehearsal of the facts, explain why those facts should matter to the listener. Above all, avoid framing issues in terms of left vs. right or science vs. denialism. In essence, steer the discussion away from ideological pressure points that are likely to trigger ingroup-outgroup dynamics.

By no means is this an easy or surefire path to success. For one, it requires much more preparation in terms of tailoring your arguments to your audience. But the evidence suggests this approach is more productive than throwing fact after fact at the wall and seeing what sticks.

Some have taken cultural cognition theory to imply that facts are culturally determined — to support a kind of postmodernist understanding of truth. This couldn’t be more mistaken. Identity-protective cognition is an explanatory model for how beliefs — particularly those existing at the intersection of science and politics — are formed. It is both descriptive and prescriptive: it contends that our cultural experiences and personal identity influence the way we approach and interpret facts, and it points us toward new modes of engagement. It should not be construed as a recommendation for how to form reliably accurate views about the world, an excuse for abandoning fact-based decision-making, or an embrace of a post-truth era. Scientific facts are still culturally independent descriptions of nature, and the physical laws of the universe don’t change depending on who’s measuring them.

Rather, this research, like all good psychology, brings to light the imperfect workings of our innate cognitive circuitry. The more we learn about cognition, the more fleeting rationality and reasoned thinking appear to be, and the more vigilant we must be to avoid the pitfalls so ingrained in our neurochemistry. After all, the penchant for tribalism and partisanship is more akin to a feature than a bug in the human operating system; such shortcomings have been with us from the beginning. Only by recognizing these features and adopting communication strategies that account for them can we hope to effectively counter entrenched resistance to established science and help guide society away from the pre-Enlightenment mindset toward which we seem to be regressing.
 

 
Assembled below is a collection of studies and articles on cultural cognition:

The Cultural Cognition Project at Yale Law School

Cultural Cognition of Scientific Consensus (mirrored here; pdf). Kahan et al. 2010:

“Why do members of the public disagree—sharply and persistently—about facts on which expert scientists largely agree? We designed a study to test a distinctive explanation: the cultural cognition of scientific consensus. The “cultural cognition of risk” refers to the tendency of individuals to form risk perceptions that are congenial to their values. The study presents both correlational and experimental evidence confirming that cultural cognition shapes individuals’ beliefs about the existence of scientific consensus, and the process by which they form such beliefs, relating to climate change…”

 

The Tragedy of the Risk-Perception Commons: Culture Conflict, Rationality Conflict, and Climate Change. Kahan et al. 2011:

“The conventional explanation for controversy over climate change emphasizes impediments to public understanding: Limited popular knowledge of science, the inability of ordinary citizens to assess technical information, and the resulting widespread use of unreliable cognitive heuristics to assess risk. A large survey of U.S. adults (N = 1540) found little support for this account. On the whole, the most scientifically literate and numerate subjects were slightly less likely, not more, to see climate change as a serious threat than the least scientifically literate and numerate ones. More importantly, greater scientific literacy and numeracy were associated with greater cultural polarization: Respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence became more concerned, as science literacy and numeracy increased.”

 

The polarizing impact of science literacy and numeracy on perceived climate change risks. Kahan et al. 2012:

“Members of the public with the highest degrees of science literacy and technical reasoning capacity were not the most concerned about climate change. Rather, they were the ones among whom cultural polarization was greatest.”

 

Striving for a Climate Change:

“When it comes to the role of science in political debates, Kahan is no Pollyanna. While the public gladly accepts scientific advice on most topics—there’s no political debate about the public-health merits of pasteurization—a few issues, like climate change, have become polluted with cultural debris. This pollution defies education and intelligence, he’s shown; such smarts make people even more talented at rearranging facts to fit their views.”

 

Fixing the communications failure. Kahan 2010:

“People endorse whichever position reinforces their connection to others with whom they share important commitments…People’s grasp of scientific debates can improve if communicators build on the fact that cultural values influence what and whom we believe…We need to learn more about how to present information in forms that are agreeable to culturally diverse groups, and how to structure debate so that it avoids cultural polarization. If we want democratic policy-making to be backed by the best available science, we need a theory of risk communication that takes full account of the effects of culture on our decision-making.”

 

Most Depressing Brain Finding Ever (study by Kahan et al. 2013 here): 

“Maybe climate change denial isn’t the right term; it implies a psychological disorder. More and better facts don’t turn low-information voters into well-equipped citizens. It just makes them more committed to their misperceptions. When there’s a conflict between partisan beliefs and plain evidence, it’s the beliefs that win. The power of emotion over reason isn’t a bug in our human operating systems, it’s a feature.”

 

Letting Animals Vote:

“We Homo sapiens respond more to stories than to statistics, more to feelings than to facts, more to images than to issues, more to drums than to debates, more to intuition than to information. This is not a failing of our character. It is a characteristic of our species. And in America, we bipeds get to vote.”

 

The Meaning of Scientific “Truth” in the Presidential Election. Kahan 2016:

“Probably the most important insight from the science of science communication is that factual beliefs on contested science issues lead a double life. At least part of the time, for at least some people, [facts] furnish guides for action that depend on the best available evidence. But for many more people, a much greater part of the time, factual beliefs on climate change, evolution, and the like are symbols used to communicate membership in and loyalty to groups embroiled in a competition for social status. The psychological process by which people form and persist in the latter species of belief is known as identity-protective cognition.”

 

The Hyper-Polarization of America:

“My own research repeatedly shows that partisanship for many Americans today takes the form of a visceral, even subconscious, attachment to a party group. Our party becomes a part of our self-concept in deep and meaningful ways. This linkage of party and “self” changes the way we judge the parties and incorporate and receive new information.”

 

The Science of Truthiness: Why Conservatives Deny Global Warming:

“So in sum, we need a nature-nurture, or a combined psychological and environmental account of the conservative denial of global warming. And only then do we see why they are so doggedly espousing a set of beliefs that are so wildly dangerous to the planet.”

 

The ugly delusions of the educated conservative:

“Indeed, if we believe in evidence then we should also welcome the evidence showing its limited power to persuade—especially in politicized areas where deep emotions are involved. Before you start off your next argument with a fact, then, first think about what the facts say about that strategy. If you’re a liberal who is emotionally wedded to the idea that rationality wins the day—well, then, it’s high time to listen to reason.”

 

The Science of Why We Don’t Believe Science:

“Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.”


 

This post was featured on HuffPost’s Contributor platform.

Feature image via The Guardian; photo credit: Josh Edelson/AFP/Getty Images