San Francisco (AFP)

YouTube announced on Wednesday that insults, personal threats and harassment in general will no longer be tolerated, the latest step in a clean-up effort the video platform has been pursuing for several years.

"We will no longer allow malicious content that insults someone based on their personal characteristics, such as race, gender, or sexual orientation," said Matt Halprin, vice president of Google's subsidiary, in a statement. online.

"This applies to everyone, to individuals and content creators and policy makers," he adds.

YouTube, like the other major social platforms, has been widely criticized for being lax about the harmful content spread by its users.

Google has invested heavily since 2017 to strengthen its detection and removal of problematic videos and comments, using a combination of automated systems (artificial intelligence) and human reviewers.

The focus so far has been on hate speech, terrorist imagery, pedophile comments and the like.

Many insults and threats were also removed, but YouTube has decided to toughen its rules.

"Now, implied or implied threats will be banned as well as explicit threats," says Halprin.

"This includes content that simulates violence against an individual or terms that imply that physical violence may occur," he explains.

Brandishing a gun while talking about someone, or superimposing their face onto footage from a violent video game, will no longer be acceptable, for example.

YouTube plans a series of penalties for channels that repeatedly violate the anti-harassment rules: loss of the ability to generate revenue, removal of content...

The tightened rules also apply to comments. The platform says it removed more than 16 million of them in the third quarter of this year alone, specifically for harassment.

As a result, "we estimate that this figure will increase in the coming quarters," says YouTube.

© 2019 AFP