TikTok Algorithm: Pro-Republican Bias in 2024?

A study finds TikTok’s algorithm favored Republican content in the 2024 election. Learn how political bias shaped recommendations.

  • 📊 A recent study found TikTok’s algorithm favors Republican-leaning content, offering it 11.8% more often than Democratic-leaning content.
  • 🔄 Democratic-seeded accounts were exposed to 7.5% more opposite-party content than Republican-seeded accounts.
  • 🎭 Negative partisanship—content criticizing the opposing party—was a key driver of ideological recommendations, with anti-Democratic content being pushed more aggressively.
  • 📢 Donald Trump’s TikTok videos were recommended to Democratic-leaning users 27% of the time, while Kamala Harris’s videos reached Republican-leaning users only 15.3% of the time.
  • ⚖️ The algorithm’s bias may deepen political polarization, shape voter perceptions, and ultimately influence the outcome of the 2024 election.


TikTok’s Growing Political Influence

With over a billion active users worldwide, TikTok has become a powerful driver of political conversations, especially among younger audiences. As the platform serves as a primary news source for millions, concerns have emerged regarding potential biases in its recommendation algorithm. A recent study found that TikTok’s algorithm systematically promoted Republican-leaning content more strongly than Democratic-leaning content, raising questions about the platform’s role in shaping perceptions ahead of the 2024 U.S. presidential election.


How Social Media Algorithms Can Shape Political Discourse

Understanding Algorithmic Personalization

All major social media platforms rely on content recommendation algorithms that learn from users’ behavior to personalize their experiences. These algorithms analyze engagement patterns—such as watch time, likes, shares, and comments—to predict and serve content that will likely maintain user attention.

While this personalization improves user experience, it also creates filter bubbles and echo chambers, reinforcing users’ preexisting opinions and limiting exposure to diverse perspectives. TikTok, with its highly sophisticated For You Page (FYP) algorithm, arguably delivers one of the most aggressive forms of content personalization, making it a crucial space for political messaging.
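To make the mechanism concrete, here is a minimal sketch of engagement-weighted ranking in Python. The signal names and weights are illustrative assumptions for the example; TikTok’s actual model is proprietary and far more complex.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignals:
    """Signals a recommender might track for one (user, video) pair."""
    watch_fraction: float  # share of the video actually watched, 0.0-1.0
    liked: bool
    shared: bool
    commented: bool

# Illustrative weights: shares and comments are costlier actions than likes,
# so engagement-driven systems typically weight them more heavily.
W_WATCH, W_LIKE, W_SHARE, W_COMMENT = 1.0, 0.5, 1.5, 1.2

def engagement_score(s: EngagementSignals) -> float:
    """Predict how strongly a video will hold the user's attention."""
    return (W_WATCH * s.watch_fraction + W_LIKE * s.liked
            + W_SHARE * s.shared + W_COMMENT * s.commented)

# The feedback loop: the feed is re-ranked by predicted engagement, so the
# user mostly generates new signals on whatever ranked highly last time.
candidates = {
    "video_a": EngagementSignals(0.9, liked=True, shared=False, commented=False),
    "video_b": EngagementSignals(0.3, liked=False, shared=False, commented=False),
}
feed = sorted(candidates, key=lambda v: engagement_score(candidates[v]), reverse=True)
print(feed)  # ['video_a', 'video_b']
```

Because the loop only ever reinforces what already ranked highly, small initial differences in what a user engages with can compound into the filter bubbles described above.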

Algorithmic Bias in Other Platforms

TikTok is not the only platform under scrutiny for political bias in recommendation algorithms:

  • YouTube’s AI-driven recommendations have been found to lean left in the U.S., pushing more liberal-leaning content (Zaki, 2023).
  • Facebook has been criticized for amplifying misinformation and polarization through its engagement-driven feed.
  • Twitter (now X) has faced investigations into algorithmic favoritism of right-leaning viewpoints, particularly after Elon Musk’s acquisition.

With TikTok dominating young voters’ media diets, its potential to affect political opinions has become a key concern, especially heading into a critical election year.


Methodology: How the Study Tested TikTok’s Bias

A research team from New York University Abu Dhabi designed an audit study using 323 simulated TikTok accounts (“sock puppets”) to systematically measure the political bias of TikTok’s recommendations.

1. Simulated Political Demographics

The study created bot accounts of three types:

  • Republican-seeded accounts – Initially exposed to right-wing content
  • Democratic-seeded accounts – Initially exposed to left-wing content
  • Neutral accounts – Served as a control group

These accounts were deployed in politically diverse states (Texas, New York, Georgia) to control for geographic influence.

2. Two-Phase Experiment

The experiment was divided into:

  1. Conditioning Phase – Each account initially engaged with ideologically aligned content (up to 400 videos) to “train” TikTok’s algorithm.
  2. Recommendation Phase – Accounts then passively browsed their “For You” feeds while researchers logged over 394,000 recommended videos and analyzed each one’s ideological slant.

This approach allowed researchers to evaluate whether TikTok favored a political ideology disproportionately in its recommendations.
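The sketch below compresses that two-phase design into Python. The account object and the watch_next_video and classify_slant helpers are hypothetical stand-ins for the browser automation and labeling pipeline the researchers actually used.

```python
CONDITIONING_BUDGET = 400  # per the study: up to 400 aligned videos per account

def run_audit(account, seed_ideology, watch_next_video, classify_slant,
              observation_budget=1000):
    """Two-phase audit: condition the account, then log its recommendations.

    account, watch_next_video, and classify_slant are hypothetical stand-ins
    for the automation and labeling steps used in the actual study.
    """
    # Phase 1 (conditioning): engage only with seed-aligned content so the
    # algorithm learns the account's apparent ideology.
    watched = 0
    while watched < CONDITIONING_BUDGET:
        video = watch_next_video(account)
        if classify_slant(video) == seed_ideology:
            account.watch_fully(video)   # strong positive engagement signal
            watched += 1
        else:
            account.skip(video)          # weak or no signal

    # Phase 2 (recommendation): browse the "For You" feed passively and
    # record each recommendation's slant; the full study logged 394,000+
    # videos across its 323 accounts.
    log = []
    for _ in range(observation_budget):
        video = watch_next_video(account)
        log.append({"id": video.id, "slant": classify_slant(video)})
    return log
```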


Key Findings: Evidence of Pro-Republican Bias in TikTok’s Algorithm

1. Republican-Seeded Accounts Received More Aligned Content

The data showed that:

  • Republican-seeded accounts were fed 11.8% more aligned content than Democratic-seeded accounts (Ibrahim et al., 2024).
  • Democratic-seeded accounts encountered 7.5% more opposing content on average.

🔍 Implication: The algorithm’s heavier alignment with Republican content suggests a bias that provides conservative users with a more politically insulated experience, while liberal users see more opposing viewpoints.
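As a rough illustration, percentages like these can be computed from a labeled recommendation log. The toy data below is invented purely for the example; only the method of tallying slant shares reflects the study’s approach.

```python
from collections import Counter

def slant_shares(log):
    """Fraction of recommended videos per ideological label."""
    counts = Counter(entry["slant"] for entry in log)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

# Toy logs standing in for the study's 394,000+ labeled recommendations.
rep_log = [{"slant": s} for s in
           ["republican"] * 62 + ["democratic"] * 18 + ["neutral"] * 20]
dem_log = [{"slant": s} for s in
           ["democratic"] * 55 + ["republican"] * 25 + ["neutral"] * 20]

rep_shares, dem_shares = slant_shares(rep_log), slant_shares(dem_log)
aligned_gap = rep_shares["republican"] - dem_shares["democratic"]  # +0.07 here
cross_gap = dem_shares["republican"] - rep_shares["democratic"]    # +0.07 here
print(f"Aligned-content gap: {aligned_gap:+.1%}, cross-party gap: {cross_gap:+.1%}")
```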

2. Negative Partisanship Drives Recommendations

TikTok’s algorithm was not merely favoring one party—it was aggressively pushing negative partisan content (content that criticizes the opposing party).

  • Both left- and right-leaning accounts were served more “attack-style” content than positive content.
  • Republican-seeded accounts were specifically targeted with anti-Democratic videos.

🔍 Implication: This supports the theory that controversial, polarizing content drives higher engagement, which the algorithm prioritizes.

3. Unequal Exposure to Key Political Figures

Notably, the study found that content from Donald Trump and Kamala Harris was recommended at vastly different rates:

  • Trump-related videos were shown to Democratic-seeded accounts 27% of the time.
  • Kamala Harris videos appeared on Republican-seeded accounts only 15.3% of the time.

🔍 Implication: The algorithm appears to expose Democratic-leaning users to opposition content more frequently, potentially influencing how they perceive opposing candidates.

What Topics Are Being Pushed? Analyzing the Content Landscape

The study outlined which political topics were most frequently recommended to users based on their ideological affiliations:

  • Republican-seeded accounts: Immigration, Foreign Policy, Ukraine War, Crime
  • Democratic-seeded accounts: Abortion, Climate Change

Interestingly, certain topics—immigration, crime, the Gaza conflict, and foreign policy—were disproportionately recommended as ideological mismatches to Democratic-leaning users.

🔍 Implication: This suggests a strategic emphasis on Republican-favored narratives, potentially influencing undecided or left-leaning viewers by exposing them to content that highlights conservative concerns.
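Tallying a topic breakdown like the one above works the same way as tallying slant shares. The sketch below assumes each logged video already carries a topic label from the study’s classification step; the counts are invented for the example.

```python
from collections import Counter

def top_topics(log, n=4):
    """Most frequently recommended topics in one account group's log."""
    return Counter(entry["topic"] for entry in log).most_common(n)

# Invented counts purely to show the tallying step.
rep_log = [{"topic": t} for t in
           ["immigration"] * 8 + ["foreign policy"] * 6
           + ["ukraine war"] * 5 + ["crime"] * 4]
print(top_topics(rep_log))
# [('immigration', 8), ('foreign policy', 6), ('ukraine war', 5), ('crime', 4)]
```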


The Psychological Effects of Algorithm-Driven Political Bias

1. Reinforcement of Confirmation Bias

By consistently recommending politically aligned content, algorithms strengthen confirmation bias, the tendency to prefer and trust information that confirms preexisting beliefs. This leads to:

  • More extreme political attitudes as beliefs are reaffirmed.
  • Less exposure to opposing viewpoints, reducing cross-party dialogue.

2. Reactance Theory & Exposure to Opposing Content

Interestingly, Democratic users were exposed to more opposing viewpoints, but research in political psychology suggests this can backfire, causing “reactance”—a defensive resistance to contradictory information. Instead of broadening perspectives, this might actually deepen polarization.


What Could This Mean for the 2024 Election?

With social media playing an ever-greater role in shaping voter opinion, TikTok’s pro-Republican skew raises critical concerns:

  • It could subtly influence undecided voters by exposing them to more Republican-leaning narratives.
  • It may contribute to deepening U.S. political polarization, reinforcing one-sided perspectives.
  • It opens the door for misinformation and agenda-driven curation, possibly affecting electoral outcomes.


What Can Be Done? Addressing Algorithmic Bias

1. Increase Transparency in Recommendation Systems

Researchers advocate for more transparency in how TikTok and other platforms determine what content to push. Companies should:

  • Publicly disclose how political content is weighted.
  • Allow third-party audits to assess biases in content recommendations.

2. Implement More Balanced Content Curation

To prevent ideological silos, platforms should balance political content exposure by:

  • Encouraging cross-ideological exposure in ways that avoid triggering reactance.
  • Reducing the emphasis on polarizing, negative partisan content.

3. Strengthen Media Literacy Efforts

Users need to engage critically with content. Digital literacy initiatives can:

  • Teach users to identify biased recommendations.
  • Encourage cross-platform news consumption to counteract algorithmic filtering.

Why Algorithmic Scrutiny Matters

TikTok’s subtle yet measurable favoring of Republican content reveals a broader issue: social media algorithms shape democratic discourse—often in ways users don’t realize. As the 2024 election nears, understanding and challenging these biases is essential to maintaining an informed electorate and balanced political conversation.

💡 The key takeaway? Don’t let algorithms determine your political worldview—seek diverse sources and question your feed.


FAQs

What were the key findings of the recent study on TikTok’s algorithm and the 2024 U.S. election?

The study found that TikTok’s algorithm recommended more Republican-aligned content than Democratic-aligned content, with Republican-leaning accounts receiving an 11.8% higher proportion of ideologically matched videos.

How did the study measure political bias in content recommendations?

Researchers created sock puppet accounts with differing political leanings and analyzed their video recommendations using large language models like GPT-4 and Gemini-Pro.
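For readers curious how an LLM can serve as a labeler, here is a minimal sketch using the OpenAI Python client. The prompt wording, label set, and model choice are assumptions for the example, not the study’s exact protocol.

```python
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY set

client = OpenAI()

PROMPT = (
    "Classify the political slant of this TikTok transcript as exactly one "
    "of: pro-republican, anti-republican, pro-democratic, anti-democratic, "
    "neutral.\n\nTranscript: {transcript}\n\nAnswer with the label only."
)

def classify_slant(transcript: str) -> str:
    """Ask the model for a single ideological label for one transcript."""
    response = client.chat.completions.create(
        model="gpt-4o",  # any capable model; the study used GPT-4 and Gemini-Pro
        messages=[{"role": "user", "content": PROMPT.format(transcript=transcript)}],
        temperature=0,   # deterministic labels make the audit easier to reproduce
    )
    return response.choices[0].message.content.strip().lower()
```

In practice, researchers cross-check labels from multiple models and validate a sample against human annotators before trusting them at scale.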

Why did TikTok’s algorithm promote more Republican-leaning content over Democratic-leaning content?

The bias appeared to stem from the algorithm’s preference for negative partisanship, favoring anti-Democratic content more frequently than anti-Republican content.

What are the implications of biased algorithmic recommendations on political discourse?

Biased recommendations can enhance political polarization, reinforce echo chambers, and shape election outcomes by influencing public perceptions of candidates and issues.

How does algorithm-driven exposure to partisan content affect cognitive biases and voter perceptions?

Repeated exposure to aligned content strengthens confirmation bias, while exposure to opposing viewpoints can cause reactance, reinforcing preexisting beliefs rather than challenging them.

What does this study reveal about the broader role of algorithmic curation in influencing political attitudes?

It highlights how automated recommendation systems can subtly shape political views, demonstrating the need for increased transparency and regulation in social media algorithms.


Citations

  • Ibrahim, H., Jang, H. D., Aldahoul, N., Kaufman, A. R., Rahwan, T., & Zaki, Y. (2024). TikTok’s recommendations skewed towards Republican content during the 2024 U.S. presidential race. Retrieved from https://arxiv.org/abs/2501.17831.
  • Rahwan, T. (2024). Commentary on algorithmic bias and political content recommendations.
  • Zaki, Y. (2023). YouTube’s recommendation system exhibits left-leaning bias in the U.S. Retrieved from https://academic.oup.com/pnasnexus/article/2/8/pgad264/7242446.