The Conspiracy Paradox
New research from Cian O’Mahony and colleagues at University College Cork dives deep into a stubborn problem: why conspiracy theories take hold, and what it actually takes to loosen their grip.
Published in Advances in Psychology (2024), their study tested four popular approaches to reducing belief in conspiracy theories. Some were clever. Some were common sense.
Only one truly worked.
And surprisingly, the solution wasn’t shouting louder or fact-checking harder. It was teaching people how to think, not just what to reject.
This article breaks down the study’s findings in plain language—with optional deeper dives along the way.
Whether you're a media professional, educator, or just tired of misinformation in your group chat, you'll come away with a clearer understanding of why conspiracies stick, and what might actually unstick them.
Why Conspiracy Beliefs Stick
Let’s be honest—conspiracy theories can feel comforting.
They offer neat explanations in a messy world. When things go wrong, it’s easier to believe that someone, somewhere is pulling the strings than to face the randomness of life or the failures of complex systems.
Psychologists call this a response to epistemic uncertainty—our basic human discomfort with not knowing who to trust or what to believe.
But it’s not just about facts. It’s about feelings.
Research has shown that people who gravitate toward conspiracy theories often feel:
- Powerless or anxious
- Distrustful of institutions
- Alienated from mainstream sources of information
These beliefs aren’t just intellectual—they’re emotional. And that makes them sticky.
Deeper dive:
Conspiracy beliefs often satisfy epistemic (need for understanding), existential (need for safety), and social (need for identity) motives.
As O’Mahony et al. note, conspiracy theories “are a means for people to make sense of the world in times and situations of uncertainty” (2024, p. 2).
That emotional appeal makes them resistant to correction—especially when corrections come from perceived outsiders or authorities.
So if yelling "That’s fake!" doesn’t work... what does? That’s exactly what the researchers set out to test.
The Study – Four Approaches Put to the Test
In a pair of large experiments involving over 1,700 participants, researchers from University College Cork tried to answer one key question:
What’s the most effective way to reduce belief in conspiracy theories—without teaching people to reject everything that sounds suspicious?
To find out, they tested four popular intervention strategies—each with a different approach to tackling misinformation.
Some were fast and simple. Others were more interactive. Only one proved truly effective across the board.
Here’s a breakdown of the four methods they used, explained simply:
Priming (a simple warning)
A short message warned participants that conspiracy theories can be misleading, emotionally appealing, and factually wrong. Think of it as a road sign that says, “Caution: Fake News Ahead.” Did it work? Not really. This “light touch” approach didn’t help participants think more critically or change their beliefs.
Inoculation (teaching the tricks)
Inspired by the logic of vaccines, this method taught people to recognize common rhetorical fallacies in conspiracy theories—like fake experts, logical leaps, or the “who benefits?” trap. Did it work? Somewhat. It reduced belief in implausible conspiracy theories (like “the moon landing was faked”), but had no effect on more believable ones.
Active Inoculation (learning by doing)
Same fallacy training, but with an added quiz to help people practice applying what they learned. Imagine a mini bootcamp in spotting BS. Did it work? Yes—but modestly. It slightly improved critical thinking, mostly for easy-to-spot conspiracy theories.
Discernment Training (how to think, not what to think)
This method taught participants how to evaluate claims based on logic, evidence, and plausibility—without automatically rejecting all conspiracies. They practiced distinguishing between reasonable concerns and far-fetched paranoia. Did it work? Yes—and it was the most effective overall. This was the only intervention that significantly improved participants’ ability to judge both plausible and implausible conspiracy theories.
Deep detail:
The researchers used a tool called CTAC (Critical Thinking About Conspiracies), a test that presents realistic conspiracy scenarios and asks participants to choose the most logical interpretation. Discernment-trained participants consistently scored higher.
Bottom line?
The most powerful intervention wasn’t about telling people what’s false. It was about showing them how to think clearly—even when a theory sounds convincing.
The Big Insight: Thinking vs. Dismissing
Here’s what the researchers realized: It’s not enough to teach people to reject conspiracy theories. We have to teach them how to reason through them.
Think about it. Some conspiracy theories are totally baseless—like claims that lizard people secretly run the government.
Others, though, are rooted in real events and historical wrongdoing—like COINTELPRO, MKUltra, or the NSA’s surveillance programs.
So the real question becomes:
Can you tell the difference between wild speculation and a legitimate concern?
That’s where most anti-misinformation strategies fall short. They train people to develop a reflexive no—to treat anything “conspiratorial” as inherently false.
But that kind of blanket scepticism can be just as dangerous as blind belief.
Deeper dive:
O’Mahony et al. warn that interventions aimed only at “reducing belief” may “negatively impact participants’ ability to critically reason about plausible conspiracy theories” (2024, p. 1).
In other words: we might be teaching people to reject everything, including the truth. This is where discernment training shines.
It avoids the trap of dogmatic dismissal and instead equips people with tools to:
- Spot logical fallacies
- Ask the right questions
- Weigh evidence
- Withhold judgment when appropriate
In short, it builds cognitive flexibility, not just cognitive defense.
The Danger of Blind Scepticism
In trying to protect people from conspiracy theories, are we accidentally teaching them to ignore the truth?
That’s the warning behind one of the study’s most powerful insights.
While many interventions succeed in reducing belief in clearly false claims, they can also push people toward an unhealthy mindset where everything is suspicious, nothing is credible, and no claim—no matter how well-supported—is taken seriously.
“Just because it’s a conspiracy theory,” the authors note, “doesn’t mean they’re not out to get you.” —O’Mahony et al., 2024
Why this matters:
Some conspiracy theories—especially the plausible ones—are about real abuses of power, uncovered through journalism, whistleblowing, or activism. Dismissing these outright can:
- Undermine democratic accountability
- Distract from genuine corruption
- Erode trust in legitimate dissent
Historical context:
- MKUltra – The CIA’s illegal mind control program (1950s–70s)
- COINTELPRO – FBI surveillance and disruption of civil rights leaders and other political groups (1956–71)
- NSA Surveillance – Mass data collection revealed by Edward Snowden (2013)
None of these started with government press releases. They were exposed—slowly, painfully—through evidence and persistence. And yes, they all sounded like conspiracy theories at first.
The lesson:
If we train people to instinctively reject anything “that smells like a conspiracy,” we risk creating a public that’s not just misinformed—but disarmed. Discernment, not dismissal, is the way forward.
So What Actually Works?
The big takeaway from the study is simple—but powerful:
The best way to protect people from conspiracy thinking isn’t by telling them what’s false. It’s by helping them figure it out for themselves.
Among the four methods tested, discernment training stood out. It didn’t just lower belief in wild theories—it also helped participants critically evaluate more plausible, real-world scenarios.
And that matters. Because in real life, conspiracy narratives don’t always come with tinfoil hats.
They might show up as:
- Viral claims about vaccine side effects
- Misleading videos during political protests
- Accusations of media manipulation during elections
In these situations, the most dangerous response isn't belief or disbelief in itself. It's automatic belief or automatic rejection.
People need the mental tools to pause, assess, and ask:
- “What’s the evidence?”
- “Is this logically sound?”
- “What would I need to know to make a judgment here?”
What discernment training looks like in practice:
- Teaching how to spot rhetorical tricks like the “who benefits?” fallacy
- Understanding what counts as strong vs. weak evidence
- Using structured reasoning tools (like the CTAC test used in the study; a rough sketch of what such a tool could look like follows this list)
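For readers who build media-literacy exercises or tools, here is a minimal sketch of what a CTAC-style discernment item and its scoring could look like in code. The scenario, answer options, and scoring rule below are illustrative assumptions made for this article, not items or methods taken from the published CTAC instrument.

```python
# Illustrative sketch of a CTAC-style discernment exercise.
# The scenario, options, and scoring here are assumptions for this article,
# not items from the instrument used by O'Mahony et al. (2024).

from dataclasses import dataclass


@dataclass
class DiscernmentItem:
    scenario: str        # a short, realistic claim or situation
    options: list[str]   # candidate interpretations of the scenario
    best_option: int     # index of the most logically sound interpretation


ITEMS = [
    DiscernmentItem(
        scenario=(
            "A leaked memo shows a company privately discussed a product risk "
            "months before disclosing it publicly."
        ),
        options=[
            "This proves a vast, coordinated global cover-up.",
            "This is a plausible concern that warrants further investigation.",
            "Leaked documents should always be ignored.",
        ],
        best_option=1,  # neither credulous acceptance nor blanket dismissal
    ),
]


def score_responses(items: list[DiscernmentItem], responses: list[int]) -> float:
    """Return the share of items where the respondent chose the most logical option."""
    correct = sum(
        1 for item, choice in zip(items, responses) if choice == item.best_option
    )
    return correct / len(items)


if __name__ == "__main__":
    print(f"Discernment score: {score_responses(ITEMS, [1]):.2f}")  # 1.00
```

The design choice worth noticing: the highest-scoring answer is not automatic rejection of the claim, but the interpretation that best fits the available evidence, which is exactly the distinction discernment training tries to build.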
Who could benefit from this?
- Schools: Media literacy curriculums can go beyond “check your sources” to teach real reasoning.
- Newsrooms: Journalists can help audiences understand why a claim is problematic—not just that it is.
- Social platforms: Instead of just slapping a “misinformation” label, platforms could offer interactive logic tools or mini lessons.
Smart people still fall for bad ideas when they don’t have good thinking habits. That’s what this research is really about.
Takeaway – Building Cognitive Immunity, Not Just Fact Armor
Think of your mind like your body. You don’t stay healthy just by avoiding germs—you stay healthy by building a strong immune system.
The same goes for navigating conspiracy theories and misinformation. You can’t just rely on fact-checks, warning labels, or quick debunks.
Those are like masks and hand sanitizer: helpful, but surface-level.
What really works—according to this new research—is building cognitive immunity: the ability to spot sketchy logic, weigh evidence, and resist emotional manipulation.
“We argue that a successful intervention should improve participants’ ability to selectively reject unreasonable conspiracy theories—not simply encourage them to disavow anything that sounds conspiratorial.” —O’Mahony et al., 2024
In other words:
Don’t just slap armor on people. Teach them how to fight. That means moving beyond slogans like “Trust the science” or “Don’t believe everything you hear.”
Instead, we need to build:
- Mental flexibility, not just scepticism
- Curiosity, not cynicism
- Critical thinking, not automatic doubt
Because in an information war, your sharpest weapon isn’t a louder voice—it’s a sharper mind.
Closing – The Future of Truth
We live in a world where everyone has a megaphone—and not everyone is telling the truth.
Conspiracy theories can spread faster than facts, powered by fear, uncertainty, and emotion.
But if this research teaches us anything, it’s that shouting “You’re wrong!” isn’t a strategy. It’s a reaction. The real solution is slower, smarter, and more empowering:
Teach people to think better, not just believe better.
That’s how we build a society that can handle complex truths without falling into simplistic lies. Not by telling people what to believe—but by helping them understand why some beliefs hold up and others fall apart.
The fight against disinformation isn’t just about defending facts. It’s about cultivating minds that can question with clarity, doubt with discipline, and discern with depth.
That’s the future of truth—if we choose to build it.
Sources & Further Reading
- O’Mahony, C., Murphy, G., & Linehan, C. (2024). True discernment or blind scepticism? Comparing the effectiveness of four conspiracy belief interventions. Advances in Psychology. DOI: 10.56296/aip00030
- Douglas, K. M., Sutton, R. M., & Cichocka, A. (2017). The psychology of conspiracy theories. Current Directions in Psychological Science, 26(6), 538–542.
- Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.
- McGuire, W. J. (1964). Inducing resistance to persuasion. In Advances in Experimental Social Psychology (Vol. 1, pp. 191–229). Academic Press.