Misinformation—defined here as any information that turns out to be false—isn’t new.

Romans engraved it on coins. The Nazis weaponized it through cinema. But today’s version comes turbocharged with Wi-Fi and algorithms.

We’re talking about lies that travel faster than the truth and can be tailored, eerily, to your exact psychological profile. Yes, your brain is being sold back to you, one misleading meme at a time.

But here’s the twist: knowing something’s false doesn’t mean your brain will act like it knows. Ever heard that myth about vaccines causing autism? Or the one about weapons of mass destruction being found in Iraq?

Even if we’ve heard the corrections, that misinformation has a nasty habit of sticking around.

Psychologists call this the “continued influence effect”—but let’s call it what it really is: the ghost of a bad idea that won’t leave your house.

So why do we believe wrong things in the first place?

First up, meet your brain’s favorite shortcut: intuition. Instead of slow, careful thinking, we often go with gut vibes. If something feels familiar or flows smoothly when we read it, we’re more likely to believe it, even if it’s a flaming pile of hooey.

Repetition makes it worse. Say a lie often enough and it starts to feel like truth. This is called the illusory truth effect, and it doesn’t care whether you’re Mensa-level smart or not.

Add in your worldview, and now you’ve got trouble. If a headline matches your beliefs—say, your political leanings—it’s not just accepted; it’s welcomed like a long-lost cousin.

People even form false memories about stuff that fits their identity. And if someone tries to correct it?

That can feel like an attack, not a fact-check.

And don’t even get us started on emotions. If you’re angry, happy, or feeling left out, you’re more likely to believe and share false stuff.

Emotion fuels virality, and social media loves a good drama. Cue moral outrage, and suddenly that sketchy tweet is getting retweeted into oblivion.

Now, let’s talk corrections. The old-school approach, just giving people the facts, is like handing someone a bucket of water for a grease fire: it feels helpful, but it’s the wrong tool. You need to do more.

Researchers say the best debunking happens when you:

  1. Provide a clear, plausible alternative (“No, the wildfire wasn’t arson; it was lightning, and here’s the meteorological data to prove it.”).
  2. Use simple, relatable language (not “short-term trend is statistically insignificant,” but “wobbles happen, the Earth is still warming”).
  3. Respect the person’s identity while correcting them—don't steamroll their values.
  4. Deliver corrections early, often, and from credible sources.

Better yet? Catch the lie before it spreads. That’s prebunking, what researchers call psychological inoculation.

Warn people that lies are coming, show them the tricks (like cherry-picking or fake experts), and boom: they build resistance.

Want to level up? Use games. Seriously—playing a game where you pretend to spread disinformation can help you spot it in real life.

Still, none of this will solve the whole problem. Social media platforms are engineered for engagement, which spreads misinformation like glitter in a kindergarten class: fast, sticky, and everywhere. So while individuals can get smarter, policymakers have a role too.

Think regulations, algorithm accountability, and media diversity.

At the end of the day, the psychology of misinformation tells us this: lies don’t just spread because people are dumb.

They spread because people are human. We crave meaning, belong to groups, feel emotions, take shortcuts. Understanding that—really understanding that—is the first step toward doing better.

So next time you’re doomscrolling and come across something that feels a little too perfect, pause. Think. Ask questions. And remember: your brain is a wonderfully messy place.

Keep the doors open to new info, but maybe install a better bouncer at the entrance.


Source: The psychological drivers of misinformation belief and its resistance to correction | Nature Reviews Psychology