Washington (AFP)

Facebook's oversight board has called the platform's content moderation rules "vague" and has overturned four of the five content-removal decisions it examined, judging the network's choices problematic.

One of the reversals concerns a post in France that the company believed posed "an imminent risk ... of physical danger", according to a statement dated Thursday.

The October 2020 post consisted of a video and accompanying French-language text, shared in a Facebook group devoted to Covid-19.

The post reported an alleged scandal at France's national medicines agency, the health sector regulator, which had refused to authorize hydroxychloroquine as a treatment for Covid-19.

The user criticized the absence of a health strategy in France and argued that hydroxychloroquine could save lives.

The board, whose mission is to review the platform's decisions on certain contentious content, asked Facebook to restore the post, finding that the site's rules on disinformation and imminent physical danger were "unduly vague".

The cases examined involve posts that Facebook removed in October and November, drawn from four continents: Europe, North America, Asia and South America.

The board, which began its work in December, took up six cases in this first batch.

They relate to content deemed to have violated the network's rules on hate speech, dangerous organizations or nudity.

Besides the French case, they concern political tensions in Burma, the conflict between Azerbaijan and Armenia, and a quote from a Nazi leader posted by a Facebook user in the United States.

In one of the cases, the oversight board said it had weighed the following question: did the social network have the right to remove a post containing a hateful slur in the context of an "armed conflict"?

"In several cases, members wondered if Facebook's rules were clear enough," adds the "council of wise men", which has issued several recommendations, including asking Mark Zuckerberg's group to be more transparent on how it moderates disinformation.

Facebook must comply with the decisions of this body.

The plan for a sort of "Supreme Court" with the final say on controversial content on Facebook and Instagram was announced by Mr. Zuckerberg at the end of 2019.

The head of the California-based company has faced years of criticism and scrutiny over a moderation policy often deemed too lax.

The platform has been criticized for reacting too slowly to certain calls for violence, linked for example to the massacres of the Rohingya in Burma in 2017 and the recent murder of the teacher Samuel Paty in France.

© 2021 AFP