Disinformation is a deliberate assault on our shared values, targeting the progress humanity has made since World War II in justice, equality, and civil rights.
But how do we fight back?
The Carnegie Endowment for International Peace recently released an in-depth report titled Countering Disinformation Effectively: An Evidence-Based Policy Guide.
This guide lays out the tools, strategies, and hard realities of combating disinformation—and why this fight is essential for safeguarding democracy and social justice.
Here, we break it down, showing not only what’s at stake but also what we can do to stop this insidious war on truth.
The Bigger Picture: Why Disinformation Matters
Disinformation is not just a collection of fake news stories or wild conspiracy theories.
It is a strategic tool wielded by actors like Russia, autocratic regimes, extremist groups, and even domestic elites with vested interests.
Its goals are clear:
- Erode Trust: By spreading lies and half-truths, disinformation undermines trust in institutions like the media, courts, and democratic elections.
- Fracture Society: It exploits divisions along political, racial, and socioeconomic lines, creating polarization that weakens collective action.
- Suppress Civil Rights: Confusion and cynicism make it harder to mobilize for justice, fairness, and equality. For example, false narratives about elections disproportionately target marginalized communities to suppress voter turnout.
Russia’s role is particularly egregious.
Through state-sponsored campaigns, the Kremlin has interfered in elections across the globe, from the United States to Europe, using disinformation to sow discord and destabilize democratic governments.
But Russia is not alone.
Powerful elites and private actors have adopted these same tactics to entrench their power and protect their wealth, rolling back decades of progress in social justice and civil rights.
How Disinformation Spreads: The Tools of the Trade
Disinformation thrives on a potent combination of supply, demand, and technology:
- The Supply Side: State actors, extremist groups, and profit-driven organizations deliberately create and spread falsehoods. Their tactics include deepfakes, troll farms, and coordinated networks of fake accounts.
- The Demand Side: People are naturally drawn to disinformation because it often appeals to their biases, fears, or frustrations. Repetition, emotional triggers, and group identity make false narratives particularly persuasive.
- Social Media Platforms: Algorithms designed to maximize engagement amplify divisive and sensational content. These platforms reward outrage, making disinformation cheap to produce and easy to spread.
The consequences are profound.
Movements like "Stop the Steal" in the U.S. and anti-vaccine conspiracies have destabilized societies and undermined trust in critical institutions.
And the rise of generative AI threatens to make things even worse, enabling the mass production of hyper-realistic fake content.
Fighting Back: Ten Interventions to Counter Disinformation
The Carnegie report identifies ten key interventions—some tactical, others structural—to fight disinformation.
Here’s what you need to know:
1. Supporting Local Journalism
Local news is a cornerstone of democracy, keeping citizens informed and holding power to account.
Yet, since 2004, a quarter of U.S. newspapers have closed, creating "news deserts" that leave communities vulnerable to disinformation.
- The Solution: Public funding, philanthropy, and policies like Australia’s News Media Bargaining Code (which requires large tech platforms to pay news publishers) can help revitalize local journalism.
- The Challenge: Scaling this solution is costly. Experts estimate it would take billions annually to restore the local news ecosystem in the U.S.
2. Media Literacy Education
Teaching people to critically evaluate information is one of the most promising long-term strategies.
- Success Stories: Programs like IREX’s "Learn to Discern" in Ukraine have empowered individuals to spot disinformation and develop healthier media consumption habits.
- The Challenge: Reaching large, vulnerable populations—including those most susceptible to disinformation—requires significant investment in schools, libraries, and community programs.
- Lateral Reading as a Tool: Training individuals to cross-check claims across multiple sources, rather than relying on a single site, improves their ability to judge what is accurate.
3. Fact-Checking
Fact-checking can debunk false claims and provide accurate information. Research shows it is particularly effective at reducing belief in specific falsehoods.
- Strengths: Platforms like Facebook have integrated fact-checking, labeling disputed posts to curb their spread.
- Limitations: Fact-checking doesn’t always change behaviors or deeply held beliefs, particularly on polarized topics.
4. Labeling Social Media Content
Adding context or warnings to misleading posts reduces their credibility and discourages sharing.
- Best Practices: Bold, clear labels work best. Reminders to consider accuracy before sharing are also effective.
- Risks: Over-reliance on labels can breed skepticism of the labels themselves, or an "implied truth" effect in which unlabeled false content appears more credible than it is.
5. Counter-Messaging Strategies
Counter-messaging uses emotional appeals and storytelling to challenge disinformation. Successful campaigns resonate with people’s values and aspirations.
- Examples: Empathetic narratives that promote unity and civic engagement.
- Challenges: These campaigns are resource-intensive and must be carefully tailored to specific audiences.
6–10 (In Brief):
- Cybersecurity for Elections: Protect systems from hacks and leaks that fuel disinformation.
- Statecraft: Sanctions and public education to deter foreign disinformation campaigns.
- Removing Inauthentic Accounts: Deplatforming fake networks reduces their influence.
- Reducing Data Collection: Limiting microtargeting weakens the precision of disinformation campaigns.
- Algorithm Changes: Tweaking recommendation systems to prioritize credible content over sensationalism.
The Role of Generative AI: A Double-Edged Sword
Generative AI, from chatbots like ChatGPT to deepfake tools, presents both risks and opportunities.
On the one hand, it can make disinformation more realistic and harder to detect.
On the other hand, AI-powered tools can assist fact-checkers, monitor disinformation in real time, and help craft accurate counter-narratives.
- AI’s Role in Solutions: Platforms can use AI to identify coordinated disinformation campaigns and surface credible content (a rough illustration of one such signal follows this list).
- AI as a Threat: Hyper-realistic fake content could overwhelm current counter-disinformation measures if left unchecked.
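To make "identifying coordinated campaigns" a bit more concrete, here is a minimal, hypothetical sketch in Python of one simple signal a platform might compute: many distinct accounts posting near-identical text within a short time window. The data structure, thresholds, and function names are illustrative assumptions, not anything prescribed by the Carnegie report; real detection systems combine many such signals with human review.

```python
# Illustrative sketch only: flag clusters of near-identical posts published by
# many different accounts in a short window -- one rough signal of possible
# coordinated inauthentic behavior. Thresholds and data shapes are assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta
from difflib import SequenceMatcher
from typing import List

@dataclass
class Post:
    account_id: str
    timestamp: datetime
    text: str

def _similar(a: str, b: str, threshold: float = 0.9) -> bool:
    """Treat two posts as duplicates if their normalized text is ~90% similar."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio() >= threshold

def flag_coordinated_clusters(posts: List[Post],
                              window: timedelta = timedelta(minutes=30),
                              min_accounts: int = 5) -> List[List[Post]]:
    """Group near-duplicate posts, then keep clusters involving many distinct
    accounts within a short time span -- candidates for human review."""
    posts = sorted(posts, key=lambda p: p.timestamp)
    clusters: List[List[Post]] = []
    for post in posts:
        for cluster in clusters:
            if _similar(post.text, cluster[0].text):
                cluster.append(post)
                break
        else:
            clusters.append([post])

    flagged = []
    for cluster in clusters:
        accounts = {p.account_id for p in cluster}
        span = cluster[-1].timestamp - cluster[0].timestamp
        if len(accounts) >= min_accounts and span <= window:
            flagged.append(cluster)
    return flagged
```

Run against a feed of posts, a heuristic like this would only surface clusters for reviewers to examine; on its own it is far too crude to act on automatically.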
Challenges Ahead: Why Progress Is Hard but Necessary
Fighting disinformation is a generational challenge.
Success is hampered by:
- Limited data on disinformation’s true impact.
- Political resistance from those who benefit from chaos and polarization.
- The difficulty of tailoring solutions across different cultures and regions.
But progress is possible. Just as previous generations fought for civil rights and democracy, we can rise to this challenge—provided we act with urgency and determination.
What You Can Do
- Support Local Journalism: Subscribe, donate, or advocate for funding.
- Think Critically: Fact-check claims before sharing and avoid knee-jerk reactions online.
- Advocate for Policy Change: Push for stronger platform regulations and media literacy programs.
- Stay Engaged: Recognize disinformation as a threat to your rights and democracy.
Conclusion: The Fight for Truth Is Everyone’s Fight
Disinformation is not just a media problem—it’s a societal problem.
It affects our rights, our communities, and the future of democracy itself.
But we are not powerless.
By supporting evidence-based solutions and working together, we can turn the tide.
The truth is worth fighting for. And it’s a fight we can win.
Sources and Further Reading
- Jon Bateman and Dean Jackson, Countering Disinformation Effectively: An Evidence-Based Policy Guide, Carnegie Endowment for International Peace, 2024.
- International Research & Exchanges Board (IREX), "Learn to Discern" media literacy program.
- Australian News Media Bargaining Code and journalism support: ABC News coverage.
- Stanford History Education Group, research on lateral reading and media literacy.
- OpenAI, generative AI tools: OpenAI website.
- Swedish Psychological Defence Agency: government resource.