YouTube comments show how platform helps radicalize viewers, study says

Video publishing platform says it has already made changes

YouTube helps radicalize people by leading them toward increasingly extreme content, a study presented at a tech research conference this week shows.

A group of researchers from Swiss and Brazilian universities analyzed more than 330,000 videos posted on 349 channels and more than 72 million comments to track how YouTube’s algorithmic recommendations suggested increasingly radical videos, moving viewers from milder contrarian content to more extreme videos embracing racist, anti-Semitic and white supremacist ideologies.

The paper, “Auditing Radicalization Pathways on YouTube,” was presented at the 2020 Conference on Fairness, Accountability and Transparency in Barcelona, Spain. One of the paper’s authors, Manoel Horta Ribeiro, said YouTube isn’t solely to blame for radicalizing people, but that it is responsible for hosting the extreme communities on its platform, TechCrunch reported.

“We do find evident traces of user radicalization, and I guess the question asks, ‘Why is YouTube responsible for this?’” Ribeiro said, according to TechCrunch. “And I guess the answer would be because many of these communities, they live on YouTube and they have a lot of their content on YouTube and that’s why YouTube is so deeply associated with it.”

The authors of the study collected data between May and July 2019. During that period, The New York Times published a story detailing one man’s experience being radicalized on YouTube, and the company said it was taking steps to address the issue. The platform discontinued its “related channels” feature in May, and in June it announced it was removing more hateful and supremacist content and making changes to reduce the spread of “borderline” content such as phony miracle cures and flat Earth conspiracy videos.

“Over the past few years, we’ve invested heavily in the policies, resources and products needed to protect the YouTube community,” a YouTube spokesperson said in a written statement. “We changed our search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations and begun reducing recommendations of borderline content and videos that could misinform users in harmful ways.”

A woman interacts with the YouTube app on an iPhone. (iStock)

The company said the research didn’t reflect the changes it had made. And another study, published in December, claimed the opposite. Its authors concluded that YouTube’s algorithm “actively discourages viewers from visiting radicalizing or extremist content” and instead favors “mainstream media and cable news content over independent YouTube channels.”

But that study, which examined even more videos, drew criticism from some in the tech community because its authors reportedly assumed, incorrectly, that YouTube’s algorithm would behave the same for logged-in and anonymous users, so they apparently never logged in.

Princeton computer science professor Arvind Narayanan wrote on Twitter, in response to that study, that this is not how the algorithm works. But it is difficult for anyone to properly study YouTube radicalization, because content creators on YouTube are constantly adapting to the algorithm and to trends, which skews the results of any attempt to study the topic.

“After tussling with these complexities, my students and I ended up with nothing publishable because we realized that there’s no good way for external researchers to quantitatively study radicalization,” he wrote. “I think YouTube can study it internally, but only in a very limited way.”

In his comments at the conference, Ribeiro also acknowledged that numerous factors could contribute to radicalization, according to TechCrunch.

“In a sense, I do agree that it’s very hard to make the claim that the radicalization is due to YouTube or due to some recommender system or that the platform is responsible for that,” he said, according to the report. “It could be that something else is leading to this radicalization, and in that sense, I think that the analysis that we make — it shows there is this process of users going from milder channels to more extreme ones. And this [is] solid evidence towards radicalization because people that were not exposed to this radical content become exposed. But it’s hard to make strong causal claims, like YouTube is responsible for that.”
