YouTube wants to end hate comments.
Taking a cue from Twitter, Google's video platform has just introduced a system that encourages users to moderate their own comments.
Now, when a user is about to post a comment that the site deems hateful or violent, an alert window appears on screen, encouraging them to revise the message so that it is more appropriate and less aggressive.
"Keep the comments respectful," invites the social network in these cases.
A new filtering system
When shown this warning, the user can rephrase the comment in more appropriate language.
But since the filtering system is not flawless, Google also allows users to override the warning and post the comment anyway.
Users are therefore not "censored" when the algorithm makes a mistake, and the alert window includes an option to report such an error.
On its blog, Google also states that it has developed new technology to detect hate speech.
The algorithm takes into account both the video's topic and the context of the comment.
In the last quarter, YouTube shut down 1.8 million channels that violated its policy.
Of those, 54,000 terminations were for hate speech.