Tuesday’s “Talk of the Nation” program on NPR discusses the phenomenon of “backfire” in people’s responses to facts that contradict their strongly held beliefs.
We’d like to believe that most of what we know is accurate and that if presented with facts to prove we’re wrong, we would sheepishly accept the truth and change our views accordingly.
A new body of research out of the University of Michigan suggests that’s not what happens: we base our opinions on beliefs, and when presented with contradictory facts, we adhere to our original beliefs even more strongly.
The discussion centered on political beliefs, but I think it holds for other strongly held beliefs, such as religious beliefs. The original paper is here, and the abstract explains:
We conducted four experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction. Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a “backfire effect” in which corrections actually increase misperceptions among the group in question.
The authors note:
Recent work has shown that most citizens appear to lack factual knowledge about political matters (see, e.g., Delli Carpini and Keeter 1996) and that this deficit affects the issue opinions that they express (Althaus 1998, Kuklinski et al. 2000, Gilens 2001). Some scholars respond that citizens can successfully use heuristics, or information shortcuts, as a substitute for detailed factual information in some circumstances (Popkin 1991; Sniderman, Brody and Tetlock 1991, Lupia 1994; Lupia and McCubbins 1998).
In other words, when we lack factual knowledge, we fill in the blanks with shortcuts that adhere to our political beliefs. I’m reminded of the woman who said she opposed the healthcare bill because it took away Americans’ rights. When asked which parts of the bill did that, she couldn’t answer. This, of course, happens on both sides of the political spectrum, as the authors note.
During the program, I was thinking about how we do this with our religious beliefs, too. Think of average LDS church members who aren’t well versed in church history and apologetics. When confronted with problematic information they are unfamiliar with, many will fill in the blanks according to a “faithful” approach. I think we doubters or exmos are all familiar with the responses we have gotten to the issues that bother us: it’s an “anti-Mormon” lie, we don’t know the real story, there’s nothing to be troubled about, etc.
One problem is that facts are not shared in a vacuum but rather are thrown into a highly contested environment.
People typically receive corrective information within “objective” news reports pitting two sides of an argument against each other, which is significantly more ambiguous than receiving a correct answer from an omniscient source. In such cases, citizens are likely to resist or reject arguments and evidence contradicting their opinions – a view that is consistent with a wide array of research (e.g. Lord, Ross, and Lepper 1979; Edwards and Smith 1996; Redlawsk 2002; Taber and Lodge 2006).
In the same way, people get information about problematic church issues that is argued over, defended, and spun by both sides. Forced to choose a side, most will side with their previously held beliefs (in this sense Scott Lloyd is right that it’s a choice to believe). It’s not surprising that very few people exposed to criticism of the LDS church “switch sides.”
They also note something I’ve known for many years: “The least informed people expressed the highest confidence in their answers; … providing the relevant facts to respondents had no effect on their issue opinions.”
In the end, it was the person’s ideology, not the truth of the information presented, that determined the person’s understanding of “facts”:
Political beliefs about controversial factual questions in politics are often closely linked with one’s ideological preferences or partisan beliefs. As such, we expect that the reactions we observe to corrective information will be influenced by those preferences. In particular, we draw on an extensive literature in psychology that shows humans are goal-directed information processors who tend to evaluate information with a directional bias toward reinforcing their pre-existing views (for reviews, see Kunda 1990 and Molden and Higgins 2005).
So far, they’re just talking about confirmation bias, which we see daily on the message boards. But the authors show that, in some cases, facts that contradict one’s beliefs might actually strengthen those beliefs:
Individuals who receive unwelcome information may not simply resist challenges to their views. Instead, they may come to support their original opinion even more strongly – what we call a “backfire effect.”
They suggest that the effort that people expend to argue against those facts reinforces the belief:
We follow Lodge and Taber (2000) and Redlawsk (2002) in interpreting backfire effects as a possible result of the process by which people counterargue preference-incongruent information and bolster their preexisting views. If people counterargue unwelcome information vigorously enough, they may end up with “more attitudinally congruent information in mind than before the debate” (Lodge and Taber 2000: 209), which in turn leads them to report opinions that are more extreme than they otherwise would have had.
I think we’ve all seen this in some of our interactions on the Internet. I sometimes think that, as a believer, I participated on apologetic boards as a way to bolster my faith, believing that if I could find reasonable responses to criticism, I could still believe. I particularly liked this quote from Nyhan on the NPR program:
And what’s interesting is in some of these cases, it’s the people who are most sophisticated who are best able to defend their beliefs and keep coming up with more elaborate reasons why 9/11 was really a conspiracy or how the weapons of mass destruction were actually smuggled to Syria or whatever the case may be.
So this isn’t a question of education, necessarily, or sophistication. It’s really about, it’s really about preserving that belief that we initially held.
All of this means that it’s extremely difficult to reconsider one’s cherished beliefs, even in the face of contradictory facts. I’m of the opinion that a paradigm shift, to borrow from Thomas Kuhn, comes only when the facts overwhelm one’s beliefs.
Some apologists sneer at ex-Mormons for believing that the evidence against the truth-claims of Mormonism is overwhelming, but I would imagine that for most of us, the evidence was indeed overwhelming; otherwise we never would have changed our belief systems.