Why False Beliefs Are Hard to Shake

Flat-Earthers believe that the Earth is a flat disc ringed by an ice wall. (Image credit: Elena Schweitzer/Shutterstock)

Once a belief takes hold, it can be hard to make it budge, even with reams of data and evidence. Now, a new study hints at one reason why: When a person gets just a few jolts of positive feedback for their belief, they feel very certain they're right.

That certainty persists even if the overall body of evidence suggests the person is wrong, researchers reported Aug. 16 in the open-access journal Open Mind. It can also be a curiosity killer, said study co-author Louis Marti, a doctoral student at the University of California, Berkeley.

"If the answers you have happen to be wrong, but you have a very high certainty that you're correct, you're probably not going to go out and seek out other information," Marti told Live Science.

False beliefs

Marti and his colleagues were interested in how misinformation takes hold, a hot topic in an era when false information spreads rapidly online. In the face of firm evidence, people hang onto false beliefs such as the idea that the Earth is flat or that climate change is a hoax, with obvious implications for politics and policy.

The researchers knew from previous studies that curiosity drives the search for new information. The question, then, was this: What keeps people from becoming curious? How do they become so certain that they already know it all?

To find out, the team ran three experiments, each with more than 500 participants recruited through Amazon's pay-by-the-gig website, Mechanical Turk. The researchers presented a variety of colorful shapes on a computer screen and asked whether each was a "daxxy." A "daxxy" was defined by a particular combination of color, shape and size, but the participants had no idea which combination was right. They had to guess, then use feedback on whether they were right to reason their way to the correct definition of "daxxy."

The advantage of this method, said study co-author Celeste Kidd, a professor of psychology at UC Berkeley, is that the researchers could statistically determine how certain any given participant should be about the definition of "daxxy" at any given point, based on how much information had been presented. They could then ask the participants how certain they felt and compare the two answers.
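To make that comparison concrete, here is a minimal Python sketch of an "ideal" observer for a toy version of the task, in which a daxxy is assumed to be anything bearing one particular feature (a certain color, shape or size). The feature list, function names and certainty rule below are illustrative assumptions, not the study's actual stimuli or model; the point is only that, given the feedback seen so far, there is a statistically appropriate level of certainty that can be computed and then compared against what a participant reports feeling.

```python
# Toy version of the task: a "daxxy" is anything with one particular
# feature (a color, a shape, OR a size). The hypothesis space is the
# set of single features; the real study's stimuli and model differ.
FEATURES = ["red", "blue", "green",          # colors
            "circle", "square", "triangle",  # shapes
            "small", "large"]                # sizes

def is_daxxy_under(hypothesis, stimulus):
    """Under a given hypothesis, a stimulus is a daxxy iff it has that feature."""
    return hypothesis in stimulus

def ideal_certainty(feedback):
    """How certain an ideal observer *should* be, given all feedback so far.

    `feedback` is a list of (stimulus, was_daxxy) pairs, where a stimulus
    is a tuple like ("red", "circle", "small"). With a uniform prior, the
    observer's certainty in its best guess is 1 / (number of hypotheses
    still consistent with every piece of feedback).
    """
    survivors = [h for h in FEATURES
                 if all(is_daxxy_under(h, s) == label for s, label in feedback)]
    return 1.0 / len(survivors) if survivors else 0.0

# One positive example rules out only the features the stimulus lacks,
# so the ideal observer should still be far from certain.
feedback = [(("red", "circle", "small"), True)]
print(ideal_certainty(feedback))          # 1/3: three features remain possible

feedback.append((("red", "square", "large"), False))
print(ideal_certainty(feedback))          # 1/2: only "circle" or "small" remain
```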

Certainty uncertainty

People are pretty good at using feedback to figure out what "daxxy" means, Marti said. But it turns out they're not so good at knowing when they've got the answer right.

The main factor determining how certain someone was in their definition, Marti said, was how well they'd done in their most recent answers — no matter how abysmally they'd performed otherwise.

"You might get the first 19 trials wrong but get the last five trials right," Marti said, "And if that happens to you, you're probably going to say you're certain, even though you got 19 wrong."

Something like climate change denial or belief in a flat Earth is likely more complicated than a simple learning task like the daxxy experiment, Marti said. But this confusion over certainty might still matter in those cases, because it could keep people from seeking out new information that might upset their preconceived notions.

Take a flat-Earth believer, Kidd said. Their belief can explain why the horizon looks flat from most vantage points and why it doesn't feel like you're spinning through space. That positive feedback might be enough to keep someone from searching for the real explanations (the Earth is so large that its curvature is imperceptible from the ground, and its rotation is so steady that we don't feel it, respectively).

The findings pertain to more than just fringe conspiracy theorists, though. Everyone holds false beliefs of one kind or another, Marti said. He added that he now hopes to study whether there is any way to "snap people out of" their misconceptions about certainty.

"If we can get people to realize there is a gap in their information, our theory would predict that would then raise their curiosity, which would then make them more likely to research things themselves," he said.

