Most methods for squashing conspiracy theories don't work, study finds. Here's what does.
A systematic review of conspiracy theory interventions shows that most traditional approaches have little impact, but certain alternatives show promise.
Debunking conspiracy theories with counterarguments is often a fruitless effort — but according to a new scientific review, there may be alternative strategies that can successfully fend off conspiratorial beliefs.
Having already grown over the past 10 years, interest in conspiracy theories skyrocketed during the pandemic, when failure to comply with public health recommendations was sometimes associated with conspiracy beliefs. For example, proponents of the anti-vax movement may avoid vaccinations for themselves or their children on the basis that some hazardous outcome of vaccination is being covered up. Although increasingly prominent in public discourse, conspiracy theories have proved a difficult mindset to shift.
"I wouldn't have a Ph.D. in this project if conspiracy theories were easy to counteract," said Cian O'Mahony, a doctoral candidate in psychology at University College Cork in Ireland who led the systematic review reported in the journal PLOS One. The review doesn't reveal "a silver bullet" for countering conspiracy theories, he said, but "we have found some interesting avenues for future research that we should follow up."
The review is the first of its kind, as previous studies have been more concerned with understanding the psychological underpinnings of conspiracy beliefs, O'Mahony told Live Science. Research into designing interventions to combat conspiracy is still relatively new. "When we did the review, we found that there's only a handful of papers that are actually published on this topic," he said.
Related: Belief that COVID-19 was a hoax is a gateway drug to other conspiracy theories
O'Mahony described a conspiracy theory as "a belief that explains events by invoking malicious groups working in secret." The role of some underground organization distinguishes conspiracy theories from general misinformation and "fake news." For instance, the statement "Bigfoot exists" would not be a conspiracy theory unless qualified by adding "and a particular organization is trying to keep it a secret."
The new review suggested that many methods for changing conspiracy beliefs are ineffective — particularly those that involve straightforwardly arguing against a person's beliefs after they're already entrenched. However, the review also highlighted some emerging practices that might be successfully wielded against conspiracy theories.
The most promising was training that teaches people how to critically analyze information so they can distinguish pseudoscience from the real thing. Even generic "analytic priming" — nudging a study participant into a more alert mental state, for example by presenting text in a hard-to-read font — was found to reduce the likelihood of their falling for a conspiracy theory they saw shortly afterward.
Finally, "information inoculation" can also be effective. In this strategy, counterarguments to a conspiracy theory are presented, along with a warning that misinformation is coming, before the subject encounters the theory itself. The approach is likened to the way a vaccine exposes someone to a fragment or weakened form of a virus so that they are resistant to the disease when they encounter it.
(Unfortunately, this same approach can also be used to spread conspiracy theories, if someone "inoculates" an audience with a conspiratorial explanation first, O'Mahony noted.)
"While it is not overly optimistic, this review points out several potentially promising" lines of research, Iris Žeželj, a professor of social psychology at the University of Belgrade who was not involved in the new review, said in an email.
However, she highlighted the need to replicate the studies demonstrating successful intervention, as well as the challenge of scaling them up into policies. O'Mahony noted these same caveats and also pointed out the current lack of evidence that any of these interventions have a lasting impact.
Valerie van Mulukom, a researcher at the Centre for Trust, Peace and Social Relations at Coventry University in the U.K. who was not involved with the review, described it as a "timely endeavour" but emphasized that it is important to consider the spread of conspiracy beliefs as a social process.
"Interventions may decrease belief in certain conspiracy theories by pointing out issues in the information presented, but they do not take away the social causes underlying belief," she said in an email. Factors like people's personalities, paranoia, need for closure, financial insecurity and feelings of marginalization may all influence which conspiracy theories they subscribe to and which interventions work on them.
"It is not the case that everyone with lower levels of analytical or scientific reasoning believes in conspiracy theories," van Mulukom noted.
As a follow-up to their review, O'Mahony and his colleagues are developing a video game aimed at honing players' critical thinking skills. Such games have already been shown to be effective in combating fake news.
"This might sound a little avant-garde, but we're finding that this is a potentially promising avenue for teaching people to apply critical thinking skills to conspiracy theories," he said.
Anna Demming is a freelance science journalist and editor. She has a PhD from King’s College London in physics, specifically nanophotonics and how light interacts with the very small. She began her editorial career working for Nature Publishing Group in Tokyo in 2006. She has since worked as an editor for Physics World and New Scientist. Publications she has contributed to on a freelance basis include The Guardian, New Scientist, Chemistry World, and Physics World, among others. She loves all science generally, but particularly materials science and physics, such as quantum physics and condensed matter.
Do they admit that they were wrong? Do they double down on what they believed to be true, or do they just slink away in shame and hope that no one points it out?
My money is on a fourth alternative, though: they act like it does not matter.
No shame about having misinformed the public at all.
They got their money and are happy; the rest can burn for all they care.
Real scientists always seek facts and evidence and alter their conclusions accordingly.
Conspiracy theorists move the goalposts.
(examples CENSORED by auto-mod)
Anyone who doesn't believe in conspiracy theories is a fool. We have hard proof that conspiracies exist.
The FBI tried to get MLK to off himself. They ran a massive domestic spying operation that included intentionally provoking violent riots to discredit the civil rights movement and opposition to the Vietnam War. Why do people believe that the same people who did all that... are suddenly being completely honest now?
Conspiracy thinking is caused by feelings of powerlessness. It is amplified by mockery and censorship. It is mitigated by having open, welcoming two-way dialogues which emphasize free choice and empowerment.
In other words, YOU people are responsible for conspiracy thinking. The "experts" in one field who totally failed to learn anything about interacting with actual humans before they opened their smug, ignorant mouths. Dehumanizing the most disenfranchised people in society and silencing their voices actively makes the problem worse.
In 2003, people who believed that Saddam Hussein did NOT have a secret stash of nuclear weapons were "conspiracy theorists".
In 1986, people who believed that the U.S. was funding Nicaraguan Contras with arms sales to Iran were "conspiracy theorists".
And until the Church Committee in 1975, only "conspiracy theorists" believed that the U.S. government was running mind control experiments using psychedelic drugs that produced several serial killers, in addition to a host of other atrocities.