In an age where more young people use TikTok as a search engine than Google, this issue is critical.
If TikTok’s algorithms can manipulate the information people encounter, they can influence elections, political opinions, and public discourse without users even realizing it.
This article breaks down the study’s findings, explains why it matters to everyone—including those who don’t use TikTok—and explores what we can do to demand transparency and accountability.
TikTok: A Search Engine for the Young Generation
Social media is no longer just for entertainment.
TikTok, with its over 1 billion active users, has become a primary news source for many young people.
In Germany, 67% of TikTok users aged 18-25 use the search feature frequently to find political information.
This shift is happening globally. Young people increasingly turn to social media for news instead of traditional outlets.
The study found that TikTok’s "Others Searched For" feature plays a key role in shaping what they see, often directing users toward politically charged or misleading content.
But what happens when search results aren't neutral? Instead of leading users to fact-based information, the platform can funnel them into specific narratives, often in ways that benefit certain political groups over others.
To understand how this works, researchers from the TikTok Audit Team and AI Forensics conducted a systematic study to examine search behavior on the platform.
Their research focused on how the "Others Searched For" feature influenced search outcomes when users looked up political parties or candidates in Germany ahead of the 2024 European Parliament elections.
The results were concerning.
What Researchers Found: How TikTok Steers Users Toward the Far Right
The research team analyzed 89,780 search suggestions collected over a month before the 2024 European Parliament elections.
Using a framework called risk-scenario-based auditing (RSBA), they categorized these suggestions into eight types, including Clickbait, Diversion, Dog Whistles, and Suspicious Content.
Here’s what they discovered:
Far-Right Overrepresentation in Search Suggestions
One of the study's most striking findings was the disproportionate visibility of the far-right Alternative für Deutschland (AfD) party in search suggestions.
For example, when users searched for mainstream parties like SPD or Die Grünen, they were often led to far-right content instead.
A simple query for "SPD" suggested "AfD aktuell" (AfD news), while searches for "Die PARTEI" led to results for "bsw partei", the Bündnis Sahra Wagenknecht, an entirely different party.
The study notes that this phenomenon is not a coincidence.
AfD has been one of the most active political parties on TikTok, taking advantage of the platform’s engagement-driven algorithm.
This means that users looking for neutral political information are more likely to be directed towards far-right narratives instead.
Clickbait & Conspiracy-Laden Suggestions
Beyond political redirection, TikTok's "Others Searched For" feature frequently suggested sensationalist, misleading, or conspiratorial content.
For example:
- Searching "CDU" returned "cdu politiker jung" (CDU politician young)—implying a scandal or gossip.
- "Die Grünen" brought up "habeck frau weg" (Habeck's wife left)—a false rumor about a Green politician.
- "Alternative für Deutschland" led to "ganz deutschland weint" (All of Germany cries), a phrase often used in nationalist sentiment campaigns.
These types of suggestions fuel misinformation and guide users toward emotional, knee-jerk reactions rather than informed political engagement.
Dog Whistles & Nationalist Messaging
One of the most concerning aspects of the study was the presence of dog whistles—subtle phrases used to promote nationalist or extremist views.
For example:
- Searching "Sozialdemokratische Partei Deutschland" (Social Democratic Party of Germany) suggested "deutschland muss deutsch bleiben" (Germany must stay German).
- "Alternative für Deutschland" returned "warnung deutschland 2024" (Warning Germany 2024), echoing nationalist alarmism.
The study found that these coded messages were particularly prevalent in searches for mainstream left-wing and centrist parties, reinforcing the idea that these parties are "threats" to national identity.
Selective Moderation: Who Gets Blocked?
Another striking finding was the inconsistency in TikTok’s content moderation.
Certain political figures had no search suggestions at all, while others had misleading or damaging search results associated with their names.
For instance:
- Searches for AfD politicians Maximilian Krah and Petr Bystron returned no search suggestions.
- Meanwhile, mainstream parties and left-leaning politicians were not protected from misleading suggestions.
This raises major concerns about TikTok’s algorithmic bias and content moderation policies. Why are some figures shielded while others are targeted?
Why This Matters—Even If You Don’t Use TikTok
Even if you personally don’t use TikTok, millions of young people do.
That includes your siblings, children, and friends—many of whom rely on it for political information.
When algorithms distort reality, they influence voting behavior, public debates, and democracy itself.
The study shows that TikTok's search suggestions are not just reflecting user interests, but actively shaping them.
This means that political narratives can be manipulated at scale simply by tweaking which search suggestions appear.
What Needs to Change
To address these issues, experts recommend:
- Greater transparency in how TikTok generates search suggestions.
- Better content moderation to ensure that misleading or extremist content is not amplified.
- Public accountability so that social media companies are held responsible for their impact on political discourse.
The Fight for a Fair Digital Landscape
TikTok’s search algorithms are not neutral.
They actively shape political discourse and influence public opinion.
Without intervention, platforms like TikTok will continue to guide users toward extremist narratives, distort elections, and undermine democracy.
We must demand transparency, stronger regulations, and media literacy to ensure that digital platforms serve truth, not manipulation.
Sources & Further Reading
- TikTok Audit Team Study: https://tiktok-audit.com/blog/2024/Search-Suggestions/
- How Algorithms Shape Public Opinion – Wired: https://www.wired.com/story/creating-transparent-ai-algorithms-machine-learning/
- Disinformation & Elections – European Digital Rights (EDRi): https://edri.org/topics/disinformation-and-electoral-interference/
- The Influence of Social Media on Young Voters – The Guardian: https://www.theguardian.com/news/datablog/2015/mar/10/a-third-of-young-people-think-social-media-will-influence-their-vote
- TikTok and Political Manipulation – The New York Times: https://www.nytimes.com/2020/10/09/technology/tiktok-election-misinformation.html
Did you learn something new today? Do you enjoy my work?
Keep it going for just $2! 🎉
Grab a membership, buy me a coffee, or support via PayPal or GoFundMe. Every bit helps! 🙌🔥
BMAC: https://buymeacoffee.com/nafoforum/membership
PP: https://www.paypal.com/donate/?hosted_button_id=STDDZAF88ZRNL
GoFundMe: https://www.gofundme.com/f/support-disinformation-education-public-education-forum
Study overview
Summary of the TikTok Audit Team & AI Forensics Study on Search Suggestions
Objective
The study conducted by the TikTok Audit Team in collaboration with AI Forensics aimed to analyze the impact of TikTok’s "Others Searched For" feature on political discourse, with a focus on the 2024 European Parliament elections in Germany. The objective was to determine whether TikTok’s search suggestions contributed to misinformation, polarization, or the disproportionate visibility of certain political parties.
Methodology
- Data Collection: Researchers collected 89,780 search suggestions over a period of one month before the elections.
- Sample Queries: The study focused on search queries related to nine major German political parties and 51 politicians.
- Search Suggestion Classification: Each search suggestion was categorized into eight types:
- No Suggestion
- Unrelated
- Diversion
- Insights
- Clickbait
- Suspicious
- Dog Whistle
- Other
- User Behavior Analysis: A survey of 1,643 TikTok users in Germany (ages 18-25) examined how frequently young users engage with search suggestions and their influence on political perceptions.
- Technical Evaluation: Researchers assessed whether TikTok’s algorithmic personalization influenced the consistency of search suggestions across different user sessions.
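The classification step above can be pictured as a simple labeling pass over (query, suggestion) pairs. The sketch below is a hypothetical illustration only: the study does not publish its RSBA classifier, and the keyword rules and the `classify_suggestion` helper here are invented for demonstration.

```python
# Hypothetical sketch of the RSBA classification step: assign each collected
# search suggestion to one of the study's eight categories.
# The keyword rules below are illustrative, NOT the study's actual rules.

CATEGORIES = [
    "No Suggestion", "Unrelated", "Diversion", "Insights",
    "Clickbait", "Suspicious", "Dog Whistle", "Other",
]

# Toy phrase lists that would flag a suggestion as a given type.
KEYWORD_RULES = {
    "Dog Whistle": ["deutsch bleiben", "warnung deutschland"],
    "Clickbait": ["frau weg", "weint", "skandal"],
}

def classify_suggestion(query: str, suggestion: str) -> str:
    """Map one (query, suggestion) pair to a category label."""
    if not suggestion:
        return "No Suggestion"
    text = suggestion.lower()
    for category, phrases in KEYWORD_RULES.items():
        if any(phrase in text for phrase in phrases):
            return category
    # A suggestion naming a party absent from the query counts as Diversion.
    if "afd" in text and "afd" not in query.lower():
        return "Diversion"
    return "Other"

print(classify_suggestion("SPD", "deutschland muss deutsch bleiben"))  # Dog Whistle
print(classify_suggestion("SPD", "AfD aktuell"))                       # Diversion
print(classify_suggestion("Maximilian Krah", ""))                      # No Suggestion
```

In practice such a pipeline would run over all 89,780 collected suggestions and tally category counts per party, which is how aggregate findings like the AfD's overrepresentation could be quantified.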
Key Findings
- Overrepresentation of Far-Right Content
- Searches for mainstream parties like SPD and Die Grünen frequently suggested content related to the far-right Alternative für Deutschland (AfD).
- The AfD’s high visibility in search suggestions was inconsistent with the search queries, indicating a potential bias in content amplification.
- Misleading & Clickbait Suggestions
- A significant portion of search suggestions contained clickbait headlines, unverified claims, or outright misinformation.
- Examples:
- Searching for "Die Grünen" suggested "Habeck Frau weg" (Habeck's wife left), a false rumor.
- Searching for "CDU" led to "CDU Politiker jung" (CDU politician young), implying scandal or controversy.
- Dog Whistles & Nationalist Narratives
- Researchers identified coded language that appeared to subtly reinforce nationalist or far-right ideologies.
- Example: Searching for "Sozialdemokratische Partei Deutschland" (SPD) suggested "Deutschland muss deutsch bleiben" (Germany must stay German).
- These phrases were disproportionately linked to left-wing or centrist parties, framing them in a negative light.
- Algorithmic Influence & Personalization
- The study found that TikTok’s algorithm prioritizes engagement over factual accuracy, amplifying high-engagement content regardless of its validity.
- Repeated exposure to misleading suggestions can create an altered perception of political discourse among users.
- Selective Moderation & Content Suppression
- Certain political figures, particularly from the AfD, had no search suggestions available, while others faced disproportionately negative or misleading suggestions.
- There was no clear pattern in TikTok’s moderation decisions, suggesting an inconsistent or non-transparent content governance model.
Implications
- The study concludes that TikTok’s search algorithm is not neutral and actively shapes user perceptions by prioritizing high-engagement but misleading content.
- The lack of transparency in search suggestion moderation raises concerns about potential election interference and manipulation of public opinion.
- There is an urgent need for regulatory oversight and algorithmic transparency to ensure that search suggestions do not unduly influence democratic processes.
Recommendations
- Increased Transparency: TikTok should disclose how search suggestions are generated and allow independent audits.
- Improved Content Moderation: More consistent application of moderation policies across all political parties.
- User Education: Raising awareness of how algorithmic bias influences search results and political perceptions.
- Regulatory Oversight: Policymakers should implement guidelines for social media search algorithms to prevent misinformation and undue political influence.
Conclusion
The TikTok Audit Team and AI Forensics study highlights systemic risks posed by TikTok’s search suggestion algorithm in shaping political discourse. The findings demonstrate that misleading content, disproportionate visibility of far-right narratives, and selective moderation contribute to potential election interference. Addressing these issues requires greater transparency, improved moderation practices, and increased public awareness of how algorithms influence political information consumption.