Several internal Facebook documents were published by the Wall Street Journal this week.

They first revealed less strict moderation for certain "VIP" accounts on Instagram and Facebook, then the harm Instagram does to young girls.

On Wednesday, it was the social network's recommendation algorithm that these documents put in the spotlight, BFMTV reports.

In 2018, Facebook changed the recommendation algorithm that populates users' news feeds.

Mark Zuckerberg's teams wanted to emphasize "meaningful social interactions", that is, interactions between friends, family and loved ones.

The aim was to reduce users' exposure to "professional" content, which can harm their mental health.

"Our approach had unhealthy side effects"

But in an internal report obtained by the Wall Street Journal, a group of researchers within the American company demonstrated that the algorithm actually behaves quite differently.

Rather than protecting users, it highlights content deemed violent, toxic or even fake news.

"Disinformation, toxicity and violent content are abnormally prevalent in re-shared content," he wrote in one of the memos published by the American daily. “Our approach has had unhealthy side effects on important parts of the content, especially in politics and news. Our responsibility is growing, ”add the researchers.

According to the findings of this investigation, certain political parties and media outlets have consequently shifted their editorial line towards sensationalism and outrage.

The objective: to generate "engagement", a term covering the shares, "likes" and comments attached to a post.

"Many interlocutors told us that they feared, in the long term, the negative effects that this algorithm can have on democracy," said a researcher in an internal report.
