Why Disinformation Matters

The 2020 Global Inventory of Organized Social Media Manipulation, published by the Oxford Internet Institute, offers an alarming look at the scale of disinformation worldwide.

During the study, researchers found that 81 countries had used social media to manipulate public opinion, up from 70 in 2019.

These countries include authoritarian regimes like China and Russia, which use disinformation to undermine foreign democracies through strategies like spreading anti-democratic narratives and amplifying distrust in institutions.

In democracies like the United States, disinformation is often deployed during election cycles to polarize voters and sow confusion, with campaigns targeting specific groups to suppress voter turnout or mislead public opinion.

For example, Russian-backed campaigns during the 2020 U.S. election amplified narratives aimed at dividing the electorate.


Meanwhile, in Myanmar, coordinated social media manipulation was used to justify state-led violence against the Rohingya population, highlighting how disinformation can have devastating real-world consequences.


From spreading false information during elections to undermining trust in health officials during the COVID-19 pandemic, the tactics of computational propaganda are growing more sophisticated and pervasive.


This article will unpack these findings and explain why this issue threatens not only democracy but also the world we leave behind for future generations.


What Is Disinformation and Why Should You Care?

Disinformation is the intentional spread of false or misleading information to manipulate public opinion.

Unlike misinformation, which is shared unintentionally, disinformation is a deliberate tactic used by governments, political parties, and private actors to achieve specific goals.


How does it work?


Disinformation campaigns often rely on:

  • Fake accounts: Bots and trolls flood social media with posts that amplify false narratives. For example, in 2020, automated accounts in 57 countries were used to drown out dissenting voices.
  • Manipulated media: Doctored photos, videos, and memes are crafted to deceive. In Argentina, a video was edited to make a government official appear intoxicated, spreading distrust during a critical election.
  • Micro-targeted ads: Political advertisements, tailored to exploit personal data, spread falsehoods to specific audiences. During the 2019 UK General Election, 90% of the Conservative Party’s Facebook ads were flagged as misleading.

For the average social media user, this means that every scroll comes with the risk of encountering manipulated content. Whether it's false health advice during a pandemic or fake news about a political candidate, disinformation aims to sow confusion and mistrust.


The Global Scale of Disinformation

The Oxford report reveals the staggering scale of disinformation:

  • 81 countries engaged in cyber troop activities to manipulate public opinion.
  • Social media platforms removed over 317,000 accounts between 2019 and 2020 linked to these activities.
  • Cyber troops spent roughly $10 million on political ads globally in just two years, much of it on platforms like Facebook. In the Philippines, for instance, cyber troops promoted government propaganda through targeted political ads, while in Brazil similar ads amplified divisive narratives during elections. This underscores how platforms serve as both battlegrounds and amplifiers for disinformation.

Even more alarming is the rise of private firms specializing in disinformation.

In 48 countries, private companies were contracted to run manipulation campaigns.


Notable examples include the Israeli-based Archimedes Group, which orchestrated campaigns across Africa, Latin America, and Southeast Asia, and the Spanish firm Eliminalia, which focused on local elections in countries like Colombia and Ecuador.


Both firms deployed armies of sock puppets (fake accounts operated by humans), bots, and data-driven strategies to spread polarizing messages and amplify their clients' narratives, often earning lucrative contracts for their efforts. The consolidation of these activities within private firms highlights the growing commercialization of disinformation.

Since 2009, state actors have spent at least $60 million hiring these firms, though the actual figure is likely much higher.


Who Is Behind It?

The actors behind disinformation campaigns are diverse:

  1. Governments:
    • Countries like Russia and China lead global disinformation efforts, often targeting foreign audiences. For example, during the COVID-19 pandemic, Russia pushed disinformation narratives aimed at amplifying vaccine skepticism in Western countries, while China promoted propaganda to downplay its role in the virus’s origins and highlight its pandemic response as superior. These campaigns undermined trust in public health measures and sowed confusion globally.
  2. Political Parties:
    • In 61 democracies, political parties used disinformation to attack opponents and sway elections. In Tunisia, fake Facebook pages amplified polarizing content ahead of the 2019 elections.
  3. Private Firms:
    • Companies like the Israeli-based Archimedes Group ran campaigns across Africa, Latin America, and Southeast Asia. Another firm, Eliminalia, supported local elections in Colombia and Ecuador.
  4. Citizen Influencers:
    • Some campaigns rely on ideologically aligned volunteers or influencers. In Indonesia, "buzzer groups" worked with political campaigns during the 2019 elections, amplifying pro-government narratives.

How Disinformation Harms Everyone

Disinformation affects all aspects of society:

  • Polarization: Troll farms in Nigeria spread conspiracies designed to divide citizens along social, ethnic, or political lines. Similar tactics are used globally to deepen divides.
  • Harassment: Activists, journalists, and politicians are frequent targets of online trolling and doxing. In Tajikistan, university professors were recruited to run state-sanctioned smear campaigns against government critics, disseminating false accusations on social media to discredit their targets. By enlisting trusted educators, these campaigns not only suppressed dissent and intimidated activists but also eroded public confidence in academic institutions, blurring the line between truth and propaganda and fostering an environment of fear and self-censorship among intellectuals.
  • Public Health Risks: COVID-19 disinformation led to vaccine hesitancy and confusion about public health measures, costing lives.

These harms extend beyond individual victims. They erode trust in institutions, weaken democratic norms, and make it harder for societies to address collective challenges.


Why It’s Hard to Fight Back

Despite efforts by social media platforms, combating disinformation remains an uphill battle:

  • Platforms like Facebook and Twitter have taken down thousands of fake accounts, but their enforcement is inconsistent and often reactive. In 2020 alone, Facebook removed over 150 coordinated networks linked to disinformation campaigns targeting countries such as the United States, Iran, and Russia, and Twitter disclosed that it had removed more than 32,000 accounts associated with state-backed operations during the same period. These actions only scratched the surface, however, as many fake accounts evade detection or re-emerge under different aliases.
  • Advanced techniques, such as deepfakes, make it harder to distinguish real content from fake.
  • Many campaigns operate covertly, using encrypted apps or the dark web to evade detection.

These challenges underscore the need for systemic solutions, including better technology for detecting fake content, stricter regulations on political ads, and improved digital literacy among users.


What Can Be Done?

As individuals, we can take steps to protect ourselves and our communities from disinformation.

For example, Finland’s national media literacy program serves as a powerful model.

By integrating critical thinking and digital literacy into school curricula, the country has successfully reduced the spread of disinformation among its citizens.

Such initiatives demonstrate that proactive education can empower communities to recognize and reject false narratives.


  1. Be skeptical: Verify the source of any controversial or shocking post before sharing.
  2. Use fact-checking tools: Websites like Snopes or FactCheck.org can help debunk false claims.
  3. Support trusted organizations: Groups like the Oxford Internet Institute and Global Disinformation Index work to expose and counteract disinformation.
  4. Educate others: Encourage friends and family to think critically about what they see online.

Governments and platforms also have a role to play by enforcing stricter regulations, investing in detection technology, and increasing transparency around content moderation.


Disinformation is not just a digital nuisance; it is a direct threat to democracy, public health, and the social fabric of our communities.

The findings of the Oxford report show that this problem is global, growing, and deeply entrenched in the way we consume information online.

But there is hope. By raising awareness, supporting counter-disinformation efforts, and promoting digital literacy, we can push back against the tide of manipulation.

The question we must all ask ourselves is: what kind of world do we want to create—one based on truth or one shaped by lies?

The choice is ours to make, but the time to act is now.


Sources:

  • Bradshaw, S., Bailey, H., & Howard, P. N. (2021). Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation. Oxford Internet Institute.
  • Additional insights from the Computational Propaganda Project at the University of Oxford.