San Francisco (AFP)

YouTube announced on Wednesday that insults, personal threats and harassment in general would no longer be tolerated, a new step in the cleanup effort the video platform has been pursuing for several years.

"We will no longer allow malicious content that insults someone based on their personal characteristics, such as race, gender, or sexual orientation," said Matt Halprin, vice president of Google's subsidiary, in a statement. online.

"This applies to everyone, to individuals and content creators and policy makers," he adds.

YouTube, like other major social platforms (Facebook, Twitter, Instagram...), has been criticized as lax about harmful content spread by its users.

Last June the service was criticized for failing to remove videos by far-right commentator Steven Crowder, who had repeatedly mocked the origins and sexual orientation of Carlos Maza, a gay journalist of Hispanic origin who works for the news site Vox.

In addition to insults, YouTube is expanding its definition of harassment.

"Now, implied or implied threats will be banned as well as explicit threats," says Halprin.

"This includes content that simulates violence against an individual or terms that imply that physical violence may occur," he explains.

Brandishing a gun while talking about someone, or superimposing their face onto footage of a violent video game, will no longer be acceptable, for example.

YouTube is planning a series of sanctions against channels that repeatedly violate the anti-harassment code of conduct: loss of the ability to generate revenue, removal of content...

The platform can go so far as to delete repeat-offender channels.

- Diverting attention -

But this tightening of the rules did not convince Carlos Maza. "Malicious insults were already prohibited (...) YouTube rolls out measures like this to distract reporters from the real issue: its failure to enforce its own rules," he reacted on Twitter.

The journalist believes that hateful content in general is more of a problem than personal insults, and points to political commentators who "call for hatred with a smile" against groups of people, such as Muslims or migrants.

In June, faced with criticism, YouTube cut off Steven Crowder's videos from advertising revenue.

But "the demonetization does not work on YouTube," insists Carlos Maza. "People like Crowder make money from sales of derivatives and donations. (...) As long as YouTube offers them a free platform to find new customers, they will continue to break the rules."

Since 2017, spurred by various controversies and scandals, Google, YouTube's parent company, has invested heavily in strengthening its processes for detecting and removing problematic videos and comments, with a mix of automated systems (artificial intelligence) and human reviewers.

The focus was initially on hate speech, terrorist imagery and pedophile comments.

The tightened rules presented on Wednesday also apply to comments. The platform says it removed more than 16 million comments in the third quarter of this year specifically for harassment, and expects that figure to rise in the coming quarters.

"We also give creators the power to direct the conversation on their channels," says Matt Halprin. "When we're not sure if a comment violates our rules, but seems potentially inappropriate, creators have the option of approving it before it's posted under their video."

This tool, which has been in testing for a few months, will be enabled by default on most channels by the end of the year, though YouTubers will still be able to opt out of it.

© 2019 AFP