San Francisco (AFP)

Facebook is adding a tool to its anti-disinformation arsenal: the network will now give priority to substantiated articles based on first-hand reporting and written by identified journalists.

When several articles cover the same story, the algorithm will identify the one that "is most often cited as the original source of the information" and rank it first, the social-networking giant announced on Tuesday, as it tries to purge its platform of various kinds of problematic content.

In doing so, Facebook is targeting the spread of articles and videos designed not to inform but to deceive or entrap users, for political or financial ends.

Often presented in a sensationalist way to generate "views", "clicks" and shares, such content may be churned out by content farms and piggyback on reporting by news organizations that invested real resources to uncover the information.

Facebook says it wants to give priority to "original news reporting", which "plays an important role in informing people around the world, from breaking news and in-depth investigations to uncovering new facts and data, relaying the latest information in times of crisis and disseminating eyewitness accounts."

- Bylines, please -

There will also be a bonus for publications that identify their authors: "We will demote news content that is not transparent" about the journalists behind it.

This signal will be determined from visible bylines and the staff lists published on media outlets' websites.

"We have established that publications that do not provide this kind of information often lack credibility and produce content just to get clicks," said the platform to 1.73 billion daily users.

The announcement drew positive, if often ironic, reactions from observers ("Funny how none of this was a priority before," Gavin Sheridan, formerly of Storyful, commented on Twitter).

Others point to gray areas in the process, such as how difficult it is for artificial intelligence to separate the wheat from the chaff.

A reputable magazine like The Economist, for example, chooses not to byline its articles, while other publications use pseudonyms, which are no guarantee of reliability.

This is not, however, a sweeping overhaul of the news feed, which mixes family photos, videos, Facebook suggestions, articles, shared content, advertisements, birthdays, comments and more.

The algorithm change applies only to news, and the social network made clear that users' personal choices would continue to take precedence.

- Friends first -

"Most of the information people see on their + news feed comes from sources they follow or from sources that their friends follow, and that will not change."

Even for news outlets, the network does not expect the measure to have a major impact.

"First-hand information and well-sourced articles may see their distribution increase (...) but it is important to remember that + news feed + uses a wide variety of signals to prioritize content".

In 2018, Facebook undertook a major overhaul of its news feed, which has since prioritized posts shared by family and friends at the expense of news sources.

Yet for a substantial share of users, the platform has supplanted television and other media as the gateway to news.

According to a 2019 Pew Research Center study, 55% of American adults get news "often" or "sometimes" from social media.

Awareness of this issue, and of the responsibility borne by a juggernaut like Facebook, came in 2018, when scandals erupted over the 2016 election, which had been marked by large-scale disinformation campaigns directed from abroad.

With the American presidential election approaching, the Californian giant has rolled out an arsenal of measures, from cybersecurity to stricter moderation rules, to avoid a repeat of that catastrophic scenario.

© 2020 AFP