New York (AFP)

As many observers had anticipated, spurious messages and calls for violence have proliferated on major digital platforms since the US presidential election.

In dealing with it, Facebook, Twitter and YouTube have taken different approaches.

Facebook and Twitter have been quick to put up warnings or remove false posts, including unproven claims by President Donald Trump and his supporters that large-scale fraud is taking place in key states.

The feeds of the Republican billionaire on the two social networks are now strewn with labels warning about the veracity of the content.

According to Daniel Kreiss, a researcher at the Center for Information, Technology and Public Life at the University of North Carolina, "Facebook and Twitter had a clear strategy to monitor certain accounts" and had prepared to crack down at the slightest offense.

This was not the case with YouTube, notes the academic.

Media Matters, a media watchdog group, has compiled a list of deceptive videos circulating on Google's video platform.

They have been viewed over a million times in total this week and some of them are still online.

"YouTube videos promoting disinformation about the results of the 2020 presidential election have garnered a high number of views despite the platform's policy of banning content that aims to mislead people about the electoral process," Alex Kaplan wrote Thursday on the Media Matters blog.

The analyst notably cites false allegations by conservative host Steven Crowder and the far-right television channel One America News Network (OANN).

YouTube has not been completely idle, however, removing some problematic content, including an episode of Steve Bannon's channel in which the former Donald Trump adviser called for the heads of infectious disease specialist Dr. Anthony Fauci and FBI director Christopher Wray to be put on spikes.

"We have removed this video, which went against our rules against incitement to violence. We will continue to be vigilant in applying our policies during the post-election period," said YouTube spokesperson Alex Joseph.

Twitter went a step further, permanently suspending the account of Mr. Bannon's show.

- "Start at the top" -

Overall, YouTube does not seem as tough or as responsive as the major social networks in its moderation efforts.

This difference may be due to the nature of the content published there.

“The problem with videos, especially live video, is that it's difficult for artificial intelligence (AI) to spot a problem,” notes Adam Chiara, professor of communications at the University of Hartford.

"For text or photos, the AI can search for keywords or duplicate images. It is not so easy with video," adds the specialist.

This could explain why other video-sharing platforms have been inundated with attempts at disinformation.

In particular, Media Matters identified several videos posted on TikTok, an app popular with young users, by supporters of the QAnon movement, which has developed a dark conspiracy theory around alleged bogus ballots cast by Democrats.

After being viewed over 237,000 times, these videos have been deleted.

But for Mr. Kreiss, YouTube could still have acted differently.

"Any serious enforcement of the rules should start with institutional political accounts," the researcher says.

"You have to start at the top. I think the rules should apply to the president and to other representatives of the elites who seek to undermine the credibility of the elections."

© 2020 AFP