For several years, YouTube has claimed it wants to fight “inappropriate content”.

However, a recent study by the Mozilla Foundation reveals that the platform's algorithm itself offers such content to its users.

The study is based on a survey conducted between July 2020 and May 2021 by the foundation behind Mozilla Firefox, a competitor of Google (which owns YouTube), reports New Scientist.

Mozilla released a browser extension, RegretsReporter, which allowed users to report “regrettable” content they had encountered.

"This is only the tip of the iceberg"

In total, the 37,380 users who took part in the study reported 3,362 videos, or 1% of all content viewed, reports Slate.

After analysis, it turned out that 12% of them should never have been hosted by YouTube at all, because they directly violated the platform's rules.

“What we've discovered is just the tip of the iceberg,” a Mozilla executive told New Scientist.

Not only did 20% of the reported videos qualify as disinformation, but according to the study's conclusions, 70% of the reports concerned videos that YouTube itself had recommended to users.

Faced with these results, the video platform dodged the question.

“The purpose of our recommendation system is to connect users with content they like,” a spokesperson said simply. “Every day, over 200 million videos are recommended on the homepage alone.”
