A recent study by Tom Buchanan, published in The Social Science Journal (Trust, personality, and belief as determinants of the organic reach of political disinformation on social media, 2021), sheds light on this very issue.


It reveals how personality, trust, and belief drive the spread of disinformation online, often in surprising ways.

The findings show that disinformation isn’t just spread by bots or bad actors; ordinary people, maybe even you, play a significant role.

Let’s explore the key insights—and how they might apply to your own online behavior.


The Problem: Why Disinformation Spreads So Easily

Disinformation—false information deliberately created to mislead—has become a critical issue.

Fake stories can influence elections, harm public health, and fuel social division. But here’s the catch: most disinformation doesn’t spread because of algorithms alone.

Human behavior plays a key role. When you share, like, or even comment on a post, you amplify its reach.

This person-to-person amplification is what "organic reach" refers to (the audience a post gains without paid promotion), and it's how disinformation thrives.

Research has shown that even a small fraction of people engaging with false material can push it to massive audiences.
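
To see why a small engagement rate is enough, here is a minimal back-of-the-envelope simulation. It is not from Buchanan's study; the follower counts and the 1% reshare rate are invented purely for illustration, using a simple branching model of reshares:

  import random

  def simulate_reach(seed_followers=1000, reshare_prob=0.01,
                     followers_per_sharer=250, generations=5, seed=42):
      """Rough branching-model estimate of how far one post can travel
      when only ~1% of viewers reshare it. All numbers are invented."""
      rng = random.Random(seed)
      exposed = seed_followers                    # people who see the original post
      sharers = sum(rng.random() < reshare_prob   # viewers who pass it on
                    for _ in range(seed_followers))
      for _ in range(generations):
          if sharers == 0:
              break
          new_views = sharers * followers_per_sharer
          exposed += new_views
          # each newly exposed viewer reshares with the same small probability
          sharers = sum(rng.random() < reshare_prob for _ in range(new_views))
      return exposed

  print(f"Approximate total reach: {simulate_reach():,}")

Even with these modest made-up numbers, each sharer exposes a fresh audience, so the post's reach compounds generation after generation and quickly dwarfs the original poster's own follower count.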


But why do some of us share these posts while others scroll past?

Buchanan’s study identifies three critical factors:

  1. Trust in the source
  2. Personality traits
  3. Belief in the content’s truthfulness

Key Finding 1: Trust Doesn’t Matter as Much as You Think

Most of us assume that if a story comes from a reputable source like the BBC or a verified account, we’re more likely to share it.

Surprisingly, Buchanan’s study found that the perceived trustworthiness of the original source of a post had little impact on whether people shared it.

What mattered more was who shared it with you.

If a trusted friend posted the story, you might be more likely to engage with it—regardless of whether it originated from a credible outlet or a sketchy meme account.

This aligns with prior research showing that people tend to trust the person who brings a story to their attention, not necessarily the source behind the story itself.


Takeaway: The next time you see a story shared by a friend, pause and ask: "Do I trust the source as much as I trust this person?"


Key Finding 2: Your Personality Shapes What You Share

Ever wondered why some people seem to share anything and everything online, while others are more cautious?

The study highlighted one personality trait that stood out: conscientiousness.

People lower in conscientiousness (those who are less detail-oriented or careful) were significantly more likely to share disinformation.

Why? They might not take the time to fact-check or think critically before hitting "share."

Interestingly, agreeableness—a trait associated with being kind and cooperative—didn’t predict sharing behavior in this study, despite past research suggesting it might.

Instead, attentiveness and caution (or the lack thereof) were the key factors.


Takeaway: If you’re someone who tends to act impulsively online, try slowing down. Double-check before you share, especially if a story seems too good (or bad) to be true.


Key Finding 3: Belief Drives Sharing

Here’s the most straightforward insight: people are more likely to share content if they believe it’s true.

This may sound obvious, but it’s crucial for understanding why disinformation spreads.

If a story aligns with your existing beliefs or "feels" right, you’re more likely to amplify it—even if it’s false.

For example, imagine seeing a headline that confirms something you’ve always suspected about a political figure or hot-button issue.

The emotional pull of "this must be true!" can override your usual skepticism.


Takeaway: Before sharing, ask yourself: Do I believe this because it’s true, or because it aligns with what I already think?


So, What Can You Do?

If you’re worried you might have shared disinformation in the past, don’t panic. You’re not alone.

But there are simple steps you can take to avoid doing so in the future:

  1. Pause and think: Before you share, ask, "Is this true?" Take a moment to verify the story with a quick search.
  2. Check the source: Look beyond who shared it with you and consider whether the original source is credible.
  3. Be mindful of your habits: If you know you’re impulsive or tend to act without thinking online, make an effort to slow down.
  4. Fact-check emotional stories: If a headline makes you feel outraged, elated, or vindicated, that’s a red flag to double-check its accuracy.

Why It Matters

What you share online shapes the world we all live in.

Disinformation isn’t just an abstract problem; it influences the decisions we make as individuals and communities.

By being thoughtful about what you amplify, you can help curb the spread of harmful falsehoods and make social media a smarter, more trustworthy space.

As Buchanan’s study reminds us, sharing isn’t just about the content—it’s about us.


By understanding how our personalities, beliefs, and trust affect what we share, we can all play a part in building a better online world.


Sources and Further Reading

  • Buchanan, T. (2021). Trust, personality, and belief as determinants of the organic reach of political disinformation on social media. The Social Science Journal. DOI:10.1080/03623319.2021.1975085
  • Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science. DOI:10.1126/science.aap9559
  • Newman, N., et al. (2020). Reuters Institute Digital News Report 2020. Reuters Institute for the Study of Journalism.

Study Overview


Title and Focus

The paper, "Trust, Personality, and Belief as Determinants of the Organic Reach of Political Disinformation on Social Media" by Tom Buchanan, explores why some people are more likely to engage with and spread disinformation online.

Specifically, it examines the roles of:

  1. Trust in the source of information
  2. Personality traits
  3. Belief in the truthfulness of disinformation

Key Findings

  1. Trustworthiness of the source:
    • Surprisingly, the trust level of the original source of disinformation (e.g., BBC vs. random fake account) did not significantly influence people's likelihood of sharing. This contrasts with earlier findings that trusted sources encourage sharing.
    • Instead, your relationship with the person who shares the post directly with you (e.g., a friend) appears to play a stronger role.
  2. Personality traits:
    • Conscientiousness: Lower conscientiousness (being less careful and diligent) was significantly linked to a higher likelihood of sharing disinformation. This supports the idea that inattentiveness may drive the sharing of false information.
    • Agreeableness: Unlike prior research, agreeableness did not influence whether people spread disinformation.
  3. Belief in the content's truthfulness:
    • People who believed the disinformation stories were true were significantly more likely to engage with or share them. Belief in the truth of false material is thus a key driver of its spread.
  4. Age:
    • Older participants were less likely to believe the disinformation stories but did not differ in their likelihood of spreading them compared to younger participants. This contradicts earlier studies linking older age to higher rates of disinformation sharing.

Methodology

  • Participants: 172 UK-based individuals recruited online via the Prolific platform, with a mix of ages and educational levels.
  • Experiment Setup:
    • Participants were shown three examples of disinformation on social media (e.g., a fabricated UN migration story) with sources categorized as high, medium, or low trustworthiness.
    • Participants rated:
      • How likely they would be to share, like, or comment on the posts.
      • How truthful they believed the content to be.
    • Personality traits were measured using a validated scale.
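
To make the shape of this design concrete, here is a purely hypothetical sketch in Python. It is not the paper's materials, measures, or analysis; it just generates synthetic ratings laid out as described above (three posts per participant, a source-trust condition, belief and sharing-likelihood ratings, a conscientiousness score) and computes simple correlations. The generative rule is invented and deliberately mirrors the direction of the reported findings, so the printed numbers illustrate the data layout rather than any real result:

  import random
  from statistics import mean

  TRUST_LEVELS = ["high", "medium", "low"]   # source-trust conditions described above
  rng = random.Random(1)

  def pearson(xs, ys):
      """Plain Pearson correlation, to avoid external dependencies."""
      mx, my = mean(xs), mean(ys)
      cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
      sx = sum((x - mx) ** 2 for x in xs) ** 0.5
      sy = sum((y - my) ** 2 for y in ys) ** 0.5
      return cov / (sx * sy)

  rows = []
  for participant in range(172):                # sample size reported above
      conscientiousness = rng.uniform(1, 5)     # hypothetical 1-5 trait score
      for trust in TRUST_LEVELS:                # three disinformation posts
          belief = rng.uniform(1, 7)            # 1 = surely false, 7 = surely true
          # Invented rule: sharing tracks belief and low conscientiousness,
          # but not source trust (the pattern the study reports).
          share = 0.6 * belief + 0.5 * (5 - conscientiousness) + rng.gauss(0, 1)
          rows.append({"trust": trust, "belief": belief,
                       "conscientiousness": conscientiousness, "share": share})

  print("belief vs. sharing:            r =",
        round(pearson([r["belief"] for r in rows],
                      [r["share"] for r in rows]), 2))
  print("conscientiousness vs. sharing: r =",
        round(pearson([r["conscientiousness"] for r in rows],
                      [r["share"] for r in rows]), 2))
  for level in TRUST_LEVELS:
      shares = [r["share"] for r in rows if r["trust"] == level]
      print(f"mean sharing likelihood ({level}-trust source): {mean(shares):.2f}")

In the study itself, of course, the ratings came from real participants and were analyzed with inferential statistics; the sketch only shows how such a dataset is structured.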

Limitations

  1. The study relied on self-reported likelihood to share/engage with disinformation rather than actual observed behavior.
  2. The framing of content as shared by “a friend” might have influenced results, as this relationship was not varied experimentally.
  3. Sampling bias: Most participants had post-secondary education, which might reduce generalizability.

Practical Implications

  1. Fact-checking interventions: Teaching people to recognize trustworthy sources may have limited impact since source trustworthiness wasn't a key factor in this study.
  2. Focus on individual differences: Targeting individuals low in conscientiousness or those prone to believing disinformation may be more effective.
  3. Belief vs. sharing motivations: Strategies to counter disinformation need to address both belief in falsehoods and motivations to share them, as these might not always align.