People say they’d change their minds, but they mostly don’t
No matter where you stand on America’s political spectrum, you probably think of yourself as a rational human being. You base your opinions on facts, not on unverified anecdotes or the dictates of wannabe leaders. When credible evidence proves your “facts” wrong, of course you adjust your beliefs accordingly.
Only there’s a very good chance you don’t, according to a working paper by Duke’s Matthew Lilley and UCLA Anderson’s Brian Wheaton. On both sides of the proverbial political fence, their work suggests, people are quick to reject information that contradicts their beliefs.
“We find that individuals claim to be quite open-minded; they claim their beliefs would change in response to the information,” the researchers write. “In actuality, no such thing happens.”
Do People Really Disregard Facts?
The study addresses what’s known among academics as “motivated reasoning” (perhaps thickheadedness in other circles): a tendency to uncritically accept new information that matches one’s worldview while critically analyzing and discarding information that does not. Lilley and Wheaton raised the possibility that motivated reasoning around politics, a well-documented topic, is not as pervasive as the research makes it appear. Sometimes people may ignore new information because they distrust its source, the authors note, and it’s not bias to hold on to core beliefs in the face of questionable data. Much of the existing research on the topic, they argue, did not fully separate rational distrust of new information from actual motivated reasoning.
But Lilley and Wheaton’s work suggests that, even when faced with facts from sources they trust, many people cling stubbornly to their original, contrary beliefs.
The paper analyzes answers from 2,000 volunteers recruited to complete surveys about several loaded political issues, such as racial bias in policing, gun control and climate change. Initially, each participant was asked questions assessing their general beliefs about these topics. The researchers then measured change (or lack of change) in those beliefs after presenting participants with new, relevant facts about their assigned issue.
Murder Rates and Political Parties
For example, some subjects were asked to guess the 2020 murder rate in the two largest cities with Republican mayors (Jacksonville, Florida, and Fort Worth, Texas). For context, they were told that the FBI reports the rate in the two largest cities with Democratic mayors (New York City and Los Angeles) at 6.7 murders per 100,000 population.
Participants sorted into the “information” group were reminded of their guess and told the correct answer: 13.5 murders per 100,000 for the Republican-led cities, roughly double the rate in the Democratic-led ones. They were then asked whether they believed the policies of Democratic mayors lead to more crime, and whether they supported criminal justice policies that reduce incarceration rates by focusing on rehabilitation over punishment.
The information group answered these belief questions identically to the control group, which received no new information. In other words, respondents did not actually change their minds in response to the information.
That could still reflect rational distrust of the information, though. So another group was not immediately told the right answer. Instead, they were asked: If told, hypothetically, that FBI data pegs the Republican-city murder rate at 13.5 murders per 100,000, would you believe that Democratic policies lead to more crime?
Most participants responded to this hypothetical with beliefs quite different from those in the information and control groups, beliefs more sympathetic to Democratic mayors. In other words, participants claimed they would be open-minded and change their minds in response to the information, which suggests they found its source sound. The lack of any shift in beliefs in the information group therefore reflects actual motivated reasoning, not mere distrust of the information.
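The identification logic behind this three-group comparison is simple enough to sketch in code. Below is a minimal Python simulation; it is our illustration, not the authors’ code, and the belief scale, group means and all numbers are invented. It shows how comparing actual belief shifts (information vs. control) against claimed shifts (hypothetical vs. control) separates motivated reasoning from distrust of the source.

```python
# Illustrative sketch of the three-arm design's logic.
# Not the authors' code; all numbers below are invented.
import random

random.seed(0)

def belief_score(group: str) -> float:
    """Simulated post-survey belief on a 0-10 scale
    (10 = 'Democratic mayors' policies strongly increase crime')."""
    if group == "control":        # no new information given
        return random.gauss(7.0, 1.0)
    if group == "information":    # told the true murder rate
        return random.gauss(7.0, 1.0)   # ~no shift, as in the study
    if group == "hypothetical":   # asked what they *would* believe
        return random.gauss(4.5, 1.0)   # large claimed shift
    raise ValueError(group)

def mean_belief(group: str, n: int = 500) -> float:
    return sum(belief_score(group) for _ in range(n)) / n

control = mean_belief("control")
information = mean_belief("information")
hypothetical = mean_belief("hypothetical")

# Actual updating: information vs. control (near zero in the study).
actual_shift = control - information
# Claimed updating: hypothetical vs. control (large in the study).
claimed_shift = control - hypothetical

# The gap between claimed and actual updating is the motivated-reasoning
# signal: people say the fact would move them, but it doesn't.
print(f"actual shift:            {actual_shift:.2f}")
print(f"claimed shift:           {claimed_shift:.2f}")
print(f"motivated-reasoning gap: {claimed_shift - actual_shift:.2f}")
```

Because the hypothetical group says the fact would move their beliefs, distrust of the source can’t explain why the information group doesn’t budge; the gap between the two shifts is the study’s measure of motivated reasoning.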
“Essentially, they say they’re going to change their views. A lot,” Wheaton says in a phone interview. “But then they actually don’t do that at all.”
For this issue, where the fact about murder rates was ideologically unfavorable to Republicans, it was mostly Republican respondents who engaged in motivated reasoning. For issues where the fact presented was ideologically unfavorable to Democrats, it was mostly Democratic respondents who did.
Subjects asked their hypothetical beliefs before being told the actual information were the most likely to change their views, provided they were asked the exact same belief question after receiving the information. Wheaton suggests this narrow setup “ties their hands,” making it difficult for them to justify their original belief after indicating they were open to new facts on that specific question. Lilley and Wheaton write that this reveals one possible, albeit limited, way of de-biasing individuals from engaging in motivated reasoning.
Featured Faculty
Brian Wheaton
Assistant Professor of Global Economics and Management
About the Research
Lilley, M., & Wheaton, B. (2024). Are Preconceptions Postconceptions? Evidence on Motivated Political Reasoning. Working paper.