Questions and answers: Christchurch video shows limits of online controls

ZEIT ONLINE | News, background and debates

Berlin / Christchurch (AP) - The extremely brutal video of the attack on two mosques in New Zealand is proving hard to remove from the internet. The case shows how difficult it is to control the spread of such violent videos online.

How do online platforms generally combat videos with prohibited content?

Originally, platforms relied on users reporting such content to them. By now they also use software to automatically detect child pornography or violence, and they maintain databases of photos and videos that have already been identified. These databases store a kind of digital fingerprint of the files, with the goal of recognizing and removing them immediately when they are uploaded again.
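The fingerprint-matching idea described above can be sketched in a few lines. This is a minimal illustration only: real systems use robust perceptual hashes (such as Microsoft's PhotoDNA for images), while a plain SHA-256 stands in here, and the class name and sample data are invented for the example.

```python
import hashlib

class FingerprintDatabase:
    """Toy stand-in for a platform's database of banned-content fingerprints."""

    def __init__(self):
        self._known = set()

    def fingerprint(self, data: bytes) -> str:
        # Real systems use perceptual fingerprints; SHA-256 is illustrative.
        return hashlib.sha256(data).hexdigest()

    def flag(self, data: bytes) -> None:
        """Record the fingerprint of content already found to be prohibited."""
        self._known.add(self.fingerprint(data))

    def is_reupload(self, data: bytes) -> bool:
        """Check an incoming upload against the known fingerprints."""
        return self.fingerprint(data) in self._known

db = FingerprintDatabase()
db.flag(b"<banned video bytes>")
print(db.is_reupload(b"<banned video bytes>"))  # True: exact copy is caught
print(db.is_reupload(b"<re-encoded video>"))    # False: modified copy slips through
```

The second check already hints at the weakness discussed later in the article: an exact hash matches only byte-identical copies, which is why re-encoded or edited versions evade it.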

How well did the system work in the Christchurch case?

The attacker broadcast his killing spree via Facebook Live, the online network's platform on which any video can be streamed in real time. Facebook deleted the 17-minute video after a tip-off from the police. But by then, users had already made recordings and uploaded them elsewhere, for example to YouTube. The services struggled to remove the constantly reappearing copies.

What is the scale of the problem?

Facebook alone removed 1.5 million videos depicting the attack, according to its own figures from Sunday - and that was in the first 24 hours alone. In 1.2 million of those cases, the upload was blocked outright. Since then the situation has been monitored closely. "We are working around the clock," a spokeswoman said. Facebook's chief operating officer Sheryl Sandberg has been in contact with New Zealand's Prime Minister Jacinda Ardern. The two now want to examine how something like this can be prevented in the future. Ardern, however, said on Sunday: "The problem goes far beyond New Zealand."

Can the video be removed from the internet for good?

No. Even if the internet giants Facebook, YouTube and Twitter succeed, there are other platforms that do little about it. And in critics' view, the US companies do act in principle, but far too little. In practice, moderation is often outsourced to subcontractors that employ low-paid workers in poorer countries. One hub is Manila, the capital of the Philippines, where many thousands of "cleaners" work from morning to night removing filth from the internet.

What does the case say about the services' ability to keep forbidden content off their platforms?

It is almost impossible to have humans review the sheer mass of newly shared content. But Facebook's software is not yet capable of handling this entirely on its own. YouTube, too, revealed shortcomings in the software that is supposed to keep already-known videos off the platform.

Why is that?

One problem may be that algorithms can still be tricked - for example, when someone makes changes to the images. Another challenge is that fragments of such videos may also appear in media reports, which should not be deleted. For now, a human almost always has to evaluate the context.
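Why small edits can defeat matching, and why more robust fingerprints only partly help, can be shown with a toy "perceptual" hash: threshold each pixel value at the image mean and compare hashes by Hamming distance. This is purely illustrative - the function names and the one-dimensional "pixel" data are invented, and real video-matching systems are far more sophisticated.

```python
def phash(pixels):
    """Toy perceptual hash: 1 bit per pixel, thresholded at the mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p >= mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 220, 15, 210, 25, 240]
slightly_edited = [12, 198, 33, 221, 14, 208, 27, 238]  # small brightness noise
mirrored = list(reversed(original))                      # a simple evasion trick

print(hamming(phash(original), phash(slightly_edited)))  # 0: still matched
print(hamming(phash(original), phash(mirrored)))         # 8: match is lost
```

A fingerprint that tolerates noise still catches lightly re-encoded copies, but a deliberate transformation such as mirroring defeats even that - which is one reason human review of context remains necessary.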

Is this all surprising?

No. Facebook Live has repeatedly made negative headlines because violence was broadcast on it. One case in the spring of 2017 shocked the public: a man in the US filmed himself murdering a retiree. The recording of the livestream initially remained available for viewing. Facebook came in for criticism because, although the video was deleted 23 minutes after the first user report, it had remained online for over two hours in total. An earlier video, in which the act had been announced, went unnoticed.

What could be the consequences?

For years, the online platforms have been under massive pressure from politicians to remove posts containing violence, hate speech or terrorist propaganda more quickly. They regularly point out how much better this now works, and that much of the content is deleted before a single user has seen it. Christchurch exposes the remaining deficits - and could at the same time reduce politicians' willingness to tolerate those weaknesses. As influential US Senator Mark Warner told the website The Hill, the rapid spread of the video shows how easily the largest platforms can still be misused.

Are there any nuances in dealing with violence in videos?

In 2016, the livestream of a police stop in which the American Philando Castile was killed documented a case of possibly unjustified violence that might otherwise never have been proven. That served the public interest - but it also meant watching a person die.
