YouTube’s algorithm recommends videos that violate its own policies

YouTube's recommendation algorithm suggests videos that don't comply with the company's own content guidelines


YouTube’s algorithm recommends videos that violate the company’s own policies on inappropriate content, according to a crowdsourced study.

The not-for-profit Mozilla asked users of its Firefox web browser to install a browser extension called RegretsReporter, which tracked the YouTube videos they watched and asked them whether they regretted watching each one.

Between July 2020 and May 2021, 37,380 users flagged 3362 videos they viewed as regrettable – a fraction of 1 per cent of all those they watched. The rate of reports was highest in Brazil, where about 22 out of every 10,000 videos viewed were logged as regrettable.

The Mozilla researchers then watched the reported videos and checked them against YouTube's content guidelines. They found that 12.2 per cent of the reported videos either shouldn't be on YouTube at all or shouldn't be recommended by its algorithm.

About a fifth of the reported videos would fall under what YouTube’s rules classify as misinformation, and a further 12 per cent spread covid-19 misinformation, say the researchers. Other issues flagged in the survey included violent or graphic content and hate speech.

“Some of our findings, if scaled up to the size of YouTube’s user base, would raise significant questions and be really concerning,” says Brandi Geurkink at Mozilla in Germany. “What we’ve found is the tip of the iceberg.”

Most of the contentious videos were delivered to users by YouTube's recommendation algorithm, which suggests videos from channels a user doesn't necessarily follow and hasn't searched for. Seven in 10 of the regret reports were tied to recommended videos, and videos recommended by YouTube were 40 per cent more likely to be regretted than those users actively searched for, say the Mozilla researchers.

Non-English-language videos were 60 per cent more likely to be reported as regrettable, which the researchers believe may be because YouTube's algorithms are trained primarily on English-language videos.

“This highlights the need to tailor moderation decisions on a per-country level, and make sure YouTube has expert moderators that know what is happening in each country,” says Savvas Zannettou at the Max Planck Institute for Informatics in Germany.

Geurkink says YouTube's lack of transparency over its algorithm is "unacceptable", especially given that years of research have raised concerns about its impact on society.

A YouTube spokesperson said: “The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone.”

The company added it had made changes to its recommendation system in the last year that reduced consumption of “borderline content” to less than 1 per cent of all videos.
