Barack Obama was also on display at the European Police Congress in Berlin - but as a fake: in a presentation, investigators and security officials were shown how easily fraudsters can create so-called deep fakes - deceptively realistic, computer-generated videos of people saying or doing things that never happened. For investigators, this could become a serious challenge in the future.

At the same time, similar technology also makes investigations easier for the authorities and plays a growing role in German policing. Examples include facial recognition software, video analytics based on artificial intelligence (AI), and programs that search data for patterns or predict the likelihood of crimes.

For example, police units are using image recognition software to search image and video recordings for suspects - either after a crime has been committed or live, matching faces against a wanted list, as in the now-completed pilot project at Berlin's Südkreuz station.
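
The general idea behind such a live match can be illustrated with a minimal sketch: each face detected in the video stream is turned into a numerical embedding and compared against the embeddings of people on a wanted list, and an alert fires only when the similarity exceeds a threshold. The embedding vectors, the threshold of 0.6 and the names below are illustrative assumptions, not details of the Südkreuz system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding, watchlist, threshold=0.6):
    """Return (name, score) of the best hit above the threshold, else None.

    `watchlist` maps names to reference embeddings; the threshold of 0.6
    is an arbitrary illustrative value, not a calibrated operating point.
    """
    best = None
    for name, reference in watchlist.items():
        score = cosine_similarity(face_embedding, reference)
        if score >= threshold and (best is None or score > best[1]):
            best = (name, score)
    return best

# Toy example: random vectors stand in for embeddings that a real system
# would compute with a trained face recognition model.
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
detected_face = watchlist["person_a"] + 0.05 * rng.normal(size=128)
print(match_against_watchlist(detected_face, watchlist))
```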

Thomas Striethörster, president of the Federal Police in Berlin, argued at the police congress for "biometric facial recognition at selected train stations and airports". It is up to policymakers to decide whether facial recognition should be permitted in everyday life in the future. "We have enough surveillance cameras, but we have had situations where we did not recognize the perpetrator," says Striethörster. The attacker Anis Amri also walked past a camera but was not recognized.

More than two false alarms per hour

Striethörster also addressed the criticism of the error rate in the Berlin pilot project's video surveillance: "An error rate of 0.25 percent sounds good at first, but 1,000 people ride down the escalator every hour, and two and a half times per hour a person is falsely identified," said the head of the Berlin Federal Police.
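
Striethörster's arithmetic is easy to reproduce: a false positive rate of 0.25 percent applied to roughly 1,000 passers-by per hour yields 2.5 expected false alarms per hour. The back-of-envelope check below assumes constant throughput around the clock, which is of course a simplification.

```python
# Back-of-envelope check of the figures quoted by Striethörster.
false_positive_rate = 0.0025   # 0.25 percent, as cited for the Südkreuz pilot
people_per_hour = 1000         # passers-by on the escalator per hour

false_alarms_per_hour = false_positive_rate * people_per_hour
print(f"Expected false alarms per hour: {false_alarms_per_hour}")       # 2.5
# Assuming constant throughput around the clock (a simplification):
print(f"Expected false alarms per day:  {false_alarms_per_hour * 24}")  # 60.0
```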

One idea under consideration is to run two facial recognition systems in parallel and network them: an alarm would be triggered only if both systems flag a match. In a second test phase at Südkreuz over the coming months, it will be examined how well software can detect potentially dangerous situations, such as abandoned objects, unusual movements in crowds or a person lying on the ground.
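
The rationale for networking two systems is that requiring both to agree multiplies their individual false positive rates - at least under the strong assumption that their errors are statistically independent. A hedged sketch using the 0.25 percent figure from the pilot for both systems:

```python
# Combining two recognizers with a logical AND: an alarm is raised only
# if both flag a match. Under the (strong) assumption of independent
# errors, the combined false positive rate is the product of the two.
rate_system_a = 0.0025   # 0.25 percent, the figure cited for the pilot
rate_system_b = 0.0025   # assumed to be of the same order

combined_rate = rate_system_a * rate_system_b
people_per_hour = 1000

print(f"Combined false positive rate:   {combined_rate:.8f}")                # 0.00000625
print(f"Expected false alarms per hour: {combined_rate * people_per_hour}")  # 0.00625
# Caveats: two systems looking at the same images make correlated errors,
# so the real gain is smaller, and requiring both to agree also lowers the
# chance of catching a genuinely wanted person.
```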

The Hamburg police are currently having image and video footage searched automatically for suspects who were involved in the riots during the G20 summit in July 2017. A reference database stores the biometric facial data of people who appear in footage from surveillance cameras, but also in mobile phone videos provided by witnesses. However, this also includes many people who were, for example, simply on their way to work in Hamburg during the summit. Hamburg's data protection commissioner Johannes Caspar has therefore ordered the database to be deleted.

Masked offenders make it difficult for the software

"In the context of the G20 summit, a total of 100 TB of video and video footage was incurred." Forcing a clerk to look through everything would be 60 years, "explains Thomas Radszuweit of the investigation unit" Schwarzer Block "at the LKA Hamburg Use.

It took seven weeks just to import 17 terabytes into the system. With masked perpetrators, however, the software reaches its limits - investigators then have to compare clues such as clothing across different recordings.
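
The quoted figures can be roughly sanity-checked. The average bitrate of the footage and an investigator's working hours are assumptions in the sketch below, so it only shows that 60 person-years is a plausible order of magnitude for 100 terabytes, and what sustained throughput a seven-week import of 17 terabytes implies.

```python
# Rough plausibility check of the Hamburg LKA figures; bitrate and working
# time per year are assumptions, not reported values.
footage_tb = 100
assumed_gb_per_hour = 1.0          # assumed average bitrate of the material
hours_of_footage = footage_tb * 1000 / assumed_gb_per_hour   # ~100,000 h

work_hours_per_year = 8 * 220      # assumed: 8-hour days, 220 working days
print(f"Person-years to view everything: {hours_of_footage / work_hours_per_year:.0f}")
# ~57 person-years, in line with the quoted "60 years".

# Sustained throughput implied by importing 17 TB in seven weeks:
import_tb, import_weeks = 17, 7
mb_per_second = import_tb * 1e6 / (import_weeks * 7 * 24 * 3600)
print(f"Average import rate: {mb_per_second:.1f} MB/s")   # ~4 MB/s
```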

In Bavaria, too, facial recognition helps identify suspects from photos, but the digital assessment is always checked by an expert: "The face is measured, but only the forensic image expert can say with 99.9 percent certainty whether it is actually that person," says Bernhard Egger of the Bavarian LKA. 148 preliminary investigations were resolved this way last year.

When it comes to the use of automation software, Germany lags far behind other countries such as the USA. The security authorities are "in the initial phase", says Tobias Knobloch, an expert on administrative digitization at Capgemini. "First steps, such as predictive policing, are being taken; of course more is possible in the security field, and more is already being done, though we do not know much about it because of secrecy."

Critics demand guard rails for the use of technology

Forecasting software calculates the likelihood of burglaries and reveals patterns in perpetrators' behavior. The effect of such software is difficult to prove, but Knobloch sees a positive side effect: "In order to automate or digitize processes, I have to make them explicit, take them apart, reinterpret, question, simplify and optimize them - all of this would not happen without the pressure of digitization," he told SPIEGEL.
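
How commercial forecasting tools actually score areas is not public. One common idea behind such systems, however, is the "near repeat" pattern, under which a burglary temporarily raises the risk for its immediate neighborhood. The deliberately simplified sketch below illustrates only that general idea; the grid, radius and decay constant are assumptions, not the method of any specific product.

```python
import math

# Simplified "near repeat" scoring: each past burglary raises the risk
# score of nearby grid cells, with the effect decaying over time.
def risk_score(cell, incidents, today, radius=2, decay_days=14.0):
    """cell: (x, y) grid coordinates; incidents: list of (x, y, day)."""
    score = 0.0
    for x, y, day in incidents:
        distance = math.hypot(cell[0] - x, cell[1] - y)
        age = today - day
        if distance <= radius and age >= 0:
            score += math.exp(-age / decay_days) / (1.0 + distance)
    return score

# Toy data: three past burglaries on a 10x10 grid, days counted upward.
incidents = [(3, 4, 100), (4, 4, 103), (8, 1, 90)]
hotspots = sorted(((risk_score((x, y), incidents, today=105), (x, y))
                   for x in range(10) for y in range(10)), reverse=True)
print(hotspots[:3])   # cells with the highest current risk score
```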

The police congress also featured AI-based software such as "Terror Cell Identification" (TCI), which is designed to calculate radicalization patterns in terrorism suspects. The Hesse LKA is already using the analysis platform "Hessendata", which is based on the Gotham system from the controversial analytics company Palantir. It evaluates data from various sources such as police databases and social media and visualizes networks. In the future, the software is also to be used in areas such as organized crime and serious offenses such as murder.
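
What "evaluating data from various sources and visualizing networks" can mean in practice is illustrated by the small graph sketch below. The record layout and the library choice are assumptions made for illustration only and say nothing about how Hessendata or Gotham are actually built.

```python
import networkx as nx

# Illustrative only: merge records from different sources into one graph so
# that links between persons, phone numbers and cases become visible.
records = [
    {"source": "police_db", "person": "Person A", "case": "Case 17"},
    {"source": "police_db", "person": "Person B", "case": "Case 17"},
    {"source": "phone_data", "person": "Person B", "phone": "+49-000-111"},
    {"source": "social_media", "person": "Person C", "phone": "+49-000-111"},
]

graph = nx.Graph()
for record in records:
    person = record["person"]
    for key in ("case", "phone"):
        if key in record:
            graph.add_edge(person, record[key], source=record["source"])

# Persons indirectly linked to "Person A" via shared cases or phone numbers:
print(sorted(nx.node_connected_component(graph, "Person A")))
```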

The use of such technologies also meets with criticism - even at the police congress: Andreas Kleinknecht, a member of the management of Microsoft Germany, warned that facial recognition and the services built on it are "prone to abuse". Policymakers should not "become a repair shop for misguided AI developments," said Kleinknecht; guard rails for the use of the technology, including in police work, should therefore be put in place as soon as possible.