Facts don’t do what I want them to

Tuesday’s “Talk of the Nation” program on NPR discussed the phenomenon of “backfire” in people’s responses to facts that contradict their strongly held beliefs.

We’d like to believe that most of what we know is accurate and that if presented with facts to prove we’re wrong, we would sheepishly accept the truth and change our views accordingly.

A new body of research out of the University of Michigan suggests that’s not what happens: we base our opinions on beliefs, and when presented with contradictory facts, we adhere to our original beliefs even more strongly.

The discussion centered on political beliefs, but I think it holds for other strongly held beliefs, such as religious beliefs. The original paper is here, and the abstract explains:

We conducted four experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction. Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a “backfire effect” in which corrections actually increase misperceptions among the group in question.

The authors note:

Recent work has shown that most citizens appear to lack factual knowledge about political matters (see, e.g., Delli Carpini and Keeter 1996) and that this deficit affects the issue opinions that they express (Althaus 1998, Kuklinski et al. 2000, Gilens 2001). Some scholars respond that citizens can successfully use heuristics, or information shortcuts, as a substitute for detailed factual information in some circumstances (Popkin 1991; Sniderman, Brody and Tetlock 1991, Lupia 1994; Lupia and McCubbins 1998).

In other words, when we lack factual knowledge, we fill in the blanks with shortcuts that conform to our political beliefs. I’m reminded of the woman who said she opposed the healthcare bill because it took away Americans’ rights. When asked which specific parts of the bill did that, she couldn’t answer. This, of course, happens on both sides of the political spectrum, as the authors note.

During the program, I was thinking about how we do this with our religious beliefs, too. Think of average LDS church members who aren’t well versed in church history and apologetics. When confronted with problematic information they are unfamiliar with, many will fill in the blanks according to a “faithful” approach. I think we doubters or exmos are all familiar with the responses we have gotten to the issues that bother us: it’s an “anti-Mormon” lie, we don’t know the real story, there’s nothing to be troubled about, etc.

One problem is that facts are not shared in a vacuum but rather are thrown into a highly contested environment.

People typically receive corrective information within “objective” news reports pitting two sides of an argument against each other, which is significantly more ambiguous than receiving a correct answer from an omniscient source. In such cases, citizens are likely to resist or reject arguments and evidence contradicting their opinions – a view that is consistent with a wide array of research (e.g. Lord, Ross, and Lepper 1979; Edwards and Smith 1996; Redlawsk 2002; Taber and Lodge 2006).

In the same way, people get information about problematic church issues that is argued over, defended, and spun by both sides. Forced to choose a side, most will side with their previously held beliefs (in this sense Scott Lloyd is right that it’s a choice to believe). It’s not surprising that very few people exposed to criticism of the LDS church “switch sides.”

They also note something I’ve known for many years: “The least informed people expressed the highest confidence in their answers; … providing the relevant facts to respondents had no effect on their issue opinions.”

In the end, it was the person’s ideology, not the truth of the information presented, that determined the person’s understanding of “facts”:

Political beliefs about controversial factual questions in politics are often closely linked with one’s ideological preferences or partisan beliefs. As such, we expect that the reactions we observe to corrective information will be influenced by those preferences. In particular, we draw on an extensive literature in psychology that shows humans are goal-directed information processors who tend to evaluate information with a directional bias toward reinforcing their pre-existing views (for reviews, see Kunda 1990 and Molden and Higgins 2005).

So far, they’re just talking about confirmation bias, which we see daily on the message boards. But the authors show that, in some cases, facts that contradict one’s beliefs might actually strengthen those beliefs:

Individuals who receive unwelcome information may not simply resist challenges to their views. Instead, they may come to support their original opinion even more strongly – what we call a “backfire effect.”

They suggest that the effort that people expend to argue against those facts reinforces the belief:

We follow Lodge and Taber (2000) and Redlawsk (2002) in interpreting backfire effects as a possible result of the process by which people counterargue preference-incongruent information and bolster their preexisting views. If people counterargue unwelcome information vigorously enough, they may end up with “more attitudinally congruent information in mind than before the debate” (Lodge and Taber 2000: 209), which in turn leads them to report opinions that are more extreme than they otherwise would have had.

I think we’ve all seen this in some of our interactions on the Internet. I sometimes think that, as a believer, I participated on apologetic boards as a way to bolster my faith, believing that if I could find reasonable responses to criticism, I could still believe. I particularly liked this quote from Nyhan on the NPR program:

And what’s interesting is in some of these cases, it’s the people who are most sophisticated who are best able to defend their beliefs and keep coming up with more elaborate reasons why 9/11 was really a conspiracy or how the weapons of mass destruction were actually smuggled to Syria or whatever the case may be.

So this isn’t a question of education, necessarily, or sophistication. It’s really about, it’s really about preserving that belief that we initially held.

All of this means that it’s extremely difficult to reconsider one’s cherished beliefs, even in the face of contradictory facts. I’m of the opinion that a paradigm shift, to borrow from Thomas Kuhn, comes only when the facts overwhelm one’s beliefs.

Some apologists sneer at ex-Mormons for believing that the evidence against the truth-claims of Mormonism is overwhelming, but I would imagine that for most of us, the information is overwhelming; otherwise we never would have changed our belief systems.


7 Responses to Facts don’t do what I want them to

  1. Tim says:

    Doesn’t there have to be some other kind of dissatisfaction to cause the intellectual reasons to be persuasive?

    For instance a person who is dissatisfied with the culture or religious experience is the kind of person who is going to be more likely to take the facts and leave. A person happy with where they are is going to become a NOM or find some way to reconfigure the facts.

  2. aerin says:

    I would be interested to see if they looked at people of different ages. I think people who are younger are more willing to look at new ideas/different ways of looking at things. The older people get, the less willing they are to look at new ideas/ways of being. I think it takes a certain type of personality (at any age) to rock the boat/examine different information.

    And your point about this happening on both sides of the political spectrum (and everywhere in between) is not lost on me either. It is most certainly true. The voices that I have the most respect for are those who will at times admit that the other side may have a point. Those who will never admit the other side has a valid point – well, usually those conversations end badly.

  3. […] everybody is talking about this new study that suggests that misinformed people rarely change their minds when presented with the […]

  4. Odell says:

    I wonder what I would have thought about previously known facts regarding the LDS church and faith had I not been burned out by years of church service?

    Years of useless and time consuming service may have allowed me to break through the bias.

  5. Reed Manson says:

    We believe what we want to believe. We believe whatever promotes our own well-being. The rationalizations only serve the needs of the creature.

    Apostasy is what happens when we realize the church is taking more out of us than it is putting in, not when we discover it is based on logical fallacy or fraudulence.

    If someone really wants to believe they will usually find a way.

  6. MC says:

    There seems to be a contradiction within the analysis of this phenomenon. On the one hand, the study says that “most citizens appear to lack factual knowledge about political matters and that this deficit affects the issue opinions that they express.” Their opinions, then, are the product of ignorance and intellectual laziness (i.e., heuristic devices). Their reluctance to revise their opinions can be explained by such intellectual laziness, since it requires effort to genuinely re-evaluate a position.

    But then the researchers say that “in some of these cases, it’s the people who are most sophisticated who are best able to defend their beliefs.” So it can’t be intellectual laziness, or ignorance of the issues involved.

    It seems to me that intelligence or level of information about a topic are largely irrelevant. I think the common trait among those who stubbornly adhere to a position, regardless of new or conflicting information, is a reliance upon the emotional component of a topic. The tea party activists, for example, are riled up because they have an emotional response to current events. They experience fear and anger and look for some external cause for those emotions. When exposed to tea-party rhetoric, they latch onto it because they can draw a connection between their internal emotional state and an external, understandable cause.

    When confronted with facts that contradict their new worldview (they’re actually paying LESS in taxes; tax rates are at a historically low point, etc) it doesn’t matter to them because that emotional response still exists; you can’t argue against it.

    That’s what’s happening with religious believers. They have an emotional response with respect to their religion, and as long as it “feels” true – as long as their emotions tell them it’s real – external evidence can do nothing to change their mind.

    • Reed Manson says:

      I think you’ve summed things up pretty well.
      We decide how we want to feel, then pursue beliefs that allow us to feel that way. We don’t slavishly follow the truth wherever it may lead us.

      We are emotion-driven creatures. The analytic brain is far less influential in how we direct our lives than we acknowledge.

      It is surprising how few people can intelligently defend their beliefs with any persuasiveness.

      The typical Mormon testimony is nothing more than the bare assertion that the church makes me feel good and so must be true.
