YouTube's algorithm more likely to recommend right-wing and religious content to users, research finds

YouTube has a pattern of recommending right-leaning and Christian videos, even to users who haven’t previously interacted with that kind of content, according to a recent study of the platform’s suggestions to users.

The four-part research project, conducted by the Institute for Strategic Dialogue, a London-based nonprofit organization that researches extremism, explored video recommendations served to accounts designed to mimic users interested in four topic areas: gaming, male lifestyle gurus, mommy vloggers and Spanish-language news.

“We wanted to, for the most part, look at topics that don’t generally direct people into extremist worlds or anything along those lines,” said Aoife Gallagher, the project’s lead analyst.

Researchers created accounts and built mock user personas by searching for content, subscribing to channels and watching videos with those accounts. After building the personas over five days, researchers recorded the video recommendations displayed on each account’s homepage for a month.
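As a rough illustration of the kind of tallying this approach implies (counting which channels appear most often among each persona’s logged homepage recommendations), here is a minimal, hypothetical Python sketch. The persona labels, log format and counting logic are assumptions made for demonstration only; they are not the Institute for Strategic Dialogue’s actual tooling or data.

```python
# Hypothetical sketch: tally which channels appear most often in logged
# homepage recommendations for each mock persona. The records below are
# invented for illustration; this is not the study's dataset or code.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Recommendation:
    persona: str      # e.g. "male_lifestyle_adult" (assumed label)
    channel: str      # channel shown on the account's homepage
    video_title: str  # title of the recommended video


# In the study, recommendations were recorded daily for a month after a
# five-day persona-building phase; here we use a tiny made-up log.
log = [
    Recommendation("male_lifestyle_adult", "Channel A", "Video 1"),
    Recommendation("male_lifestyle_adult", "Channel A", "Video 2"),
    Recommendation("male_lifestyle_adult", "Channel B", "Video 3"),
    Recommendation("mommy_vlogger_right", "Channel C", "Video 4"),
]


def top_channels(records, persona, n=5):
    """Return the n most frequently recommended channels for one persona."""
    counts = Counter(r.channel for r in records if r.persona == persona)
    return counts.most_common(n)


if __name__ == "__main__":
    print(top_channels(log, "male_lifestyle_adult"))
    # [('Channel A', 2), ('Channel B', 1)]
```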

The study noted that YouTube’s recommendation algorithm “drives 70% of all video views.”

In one investigation, the most frequently recommended news channel for both child and adult accounts interested in “male lifestyle guru” content was Fox News, even though neither account had watched Fox News during the persona-building stage. Instead, the accounts watched Joe Rogan and Jordan Peterson and searched for the term “alpha male.”

“This suggests that YouTube associated male lifestyle videos and creators with conservative topics,” the study said.

In another experiment, researchers created two accounts interested in mommy vloggers — mothers who make video diaries about parenting — that they trained to have different political biases. One of the accounts watched Fox News, and the other watched MSNBC. Although the two accounts watched their respective channels for equal amounts of time, the right-leaning account was later recommended Fox News more frequently than the left-leaning account was recommended MSNBC.

For the left-leaning account, the most frequently recommended channel was a mommy vlogger channel it had already subscribed to.

“These results suggest that right-leaning news content is more frequently recommended than left-leaning,” the study said. Both accounts were also recommended videos by an anti-vaccine influencer.

Jessie Daniels, a professor of sociology at Hunter College, part of the City University of New York, and the author of a 2018 article titled “The Algorithmic Rise of the ‘Alt-Right,’” said the project’s main findings were in line with her previous research. She has examined the rise of the internet in the 1990s and how the far right saw an opening to share its beliefs with larger audiences by bypassing the traditional media gatekeepers.

Daniels said she believes the findings suggest that YouTube has made continued engagement and profit its top priorities, rather than addressing concerns about reinforcing existing political biases or echo chambers.

Videos with religious themes — primarily Christianity — were also recommended to all the accounts, even though none of them had watched religious content during the persona-building stage. The accounts interested in mommy vloggers, for example, were shown videos with Bible verses.

The researchers also found that YouTube recommended videos containing sexually explicit content to the child account, as well as videos featuring influencer Andrew Tate, who has been charged in Romania with human trafficking and rape (allegations he has denied), even though he is banned from the platform.

Heading into this year’s presidential race, concerns about the spread of election misinformation on social media are only growing. In 2022, a study by researchers at New York University found that after the last presidential election, YouTube recommended videos that pushed voter fraud claims to Donald Trump supporters.

“One of the main issues that we’re seeing is polarization across society, and I think that social media is contributing an awful lot to that kind of polarization,” Gallagher said.

This isn’t the first time YouTube has faced scrutiny for its algorithm. Researchers have repeatedly found that YouTube has recommended extremist and conspiracy theory videos to users.

“We welcome research on our recommendation system, but it’s difficult to draw conclusions based on the test accounts created by the researchers, which may not be consistent with the behavior of real people,” YouTube spokesperson Elena Hernandez said in a statement to NBC News. “YouTube’s recommendation system is trained to raise high-quality content on the home page, in search results, and the Watch Next panel for viewers of all ages across the platform. We continue to invest significantly in the policies, products, and practices to protect people from harmful content, especially younger viewers.”

For years, there have also been concerns that social media platforms may create echo chambers where users engage only in content that reinforces their beliefs. However, other recent research has also suggested that users’ own preferences, not the YouTube recommendation system, play the primary role in what they decide to watch and that YouTube may even have a moderating influence.

“This goes back to a lack of transparency and a lack of access that we have to data on YouTube,” Gallagher said. “YouTube is one of the most cloaked of the platforms. It’s very, very difficult to analyze YouTube at scale.”
