"The council is concerned about the way Meta has put its economic interests before content moderation," said the entity described as independent, but financed by the company.

In its report, it calls for a "significant overhaul" of the double-review program known as "cross-check" to make it more transparent, more responsive and fairer.

Currently, when posts or images that potentially violate Facebook or Instagram policies are flagged, they are promptly removed if they are deemed high risk and come from ordinary users.

But if their author is on the "whitelist", the content remains online while it is examined more closely, a process that usually takes several days and sometimes several months.

This two-tier, "inequitable" system has therefore "afforded additional protection to the expression of certain users, selected in part on the basis of Meta's economic interests", the report explains.

Because of "cross-check", "content identified as contrary to Meta rules remains visible on Facebook and Instagram, while they are spreading virally and could cause damage", admonishes the supervisory board .

It recommends speeding up secondary reviews of content from public figures who may be posting important human rights messages, and removing high-risk content while the internal review is pending.

It also asks the company to publish the eligibility criteria for the program and to publicly identify, on its platforms, the accounts of the users concerned.

"We created the double-check system to avoid potentially overly harsh decisions (when we take action on content or accounts that don't actually violate our policies) and to double-check cases where there might be a higher risk. error rate or when the potential impact of an error is particularly severe," said Nick Clegg, Meta's head of international affairs, in a statement.

The company has also committed to providing a fuller response to the oversight board's recommendations within 90 days.

The board is made up of 20 international members: journalists, lawyers, human rights defenders and former political leaders.

It was created in 2020 at the suggestion of chief executive Mark Zuckerberg and is responsible for evaluating the Californian group's content moderation policies.

Members launched a review of "cross-check" in October 2021, following revelations from whistleblower Frances Haugen, who leaked internal documents to the press and accused the company of putting "profits before the safety" of its users.

Meta's responses during the review were "insufficient at times", and "its own understanding of the practical implications of the program leaves something to be desired", the board said.

© 2022 AFP