After the attack on two mosques in Christchurch, many hoped such a scene would never be repeated, but the online broadcast of Wednesday's attack on a synagogue in Halle, Germany, shows how hard it is for internet platforms to keep killers from showcasing their crimes.
The shooter, whose Yom Kippur attack left two dead and two seriously wounded in eastern Germany, managed to post footage of his assault on Twitch, the live-streaming platform where video game and esports fans normally gather.
How long did the footage stay online? How many people saw it? Twitch says it does not yet know and is continuing its investigation.
"We acted as quickly as possible to remove this content, and we will suspend any accounts that post or repost footage of this abhorrent act," a Twitch spokesperson told AFP.
Although the death toll was lower, the modus operandi in Halle immediately recalled that of the attack on two mosques in Christchurch, New Zealand, in March.
An Australian far-right extremist killed 51 people before being arrested, but managed to live-stream his attack for about 17 minutes on Facebook before the broadcast was cut off.
That long delay drew fierce criticism of Facebook and prompted calls from all sides for immediate action.
- Artificial intelligence -
Facebook has enlisted police forces on both sides of the Atlantic to train its artificial intelligence tools in an effort to prevent such a disaster from recurring.
The difficulty is that the "artificial brain" must be able to tell the difference between a real attack and a scene from a film or video game.
If the AI filtering is not precise enough, users who share such virtual footage could be wrongly banned from the network.
Facebook has partnered with London's police on one such initiative: footage filmed by body cameras worn by "Met" units during firearms training now feeds the network's existing image bank.
Artificial intelligence tools need huge amounts of data - in this case, footage of shootings - to learn to correctly identify, sort and ultimately delete such content.
This initiative, announced in mid-September, is part of a broader set of measures to clean up "hateful and extremist" content and to act against white supremacist movements and individuals.
The company has also tightened restrictions on access to Facebook Live.
- Alliance against extremism -
Facebook and its partners also announced the creation of a new organization on the sidelines of the UN General Assembly in New York at the end of September.
It is to include Facebook, Microsoft, Twitter and Google (via YouTube), as well as Amazon and platforms such as LinkedIn (owned by Microsoft) and WhatsApp (owned by Facebook).
The new structure will "thwart the increasingly sophisticated attempts of terrorists and violent extremists to use digital platforms".
"We are trying to create a civil defense mechanism: just as we respond to natural emergencies such as fires and floods, we have to be ready to react to a crisis like the one we have experienced," said New Zealand Prime Minister Jacinda Ardern.
After the Christchurch attack, Facebook removed 1.5 million copies of the video, including 1.2 million that were blocked before anyone viewed them.
"The gap between 1.2 million and 1.5 million is where we recognize that we need to do better," said Facebook's number two, Sheryl Sandberg.
The structure will have an independent staff.
Non-governmental actors will lead an advisory committee, and the governments of the United States, France, the United Kingdom, Canada, New Zealand and Japan will also have a consultative role, as will experts from the UN and the European Union.
© 2019 AFP