The "systems have proven themselves": with this conclusion, the Ministry of the Interior draws a line under one of Germany's most controversial surveillance projects: the pilot project for automated facial recognition carried out at Berlin's Südkreuz station.

The project began on August 1, 2017 and ran for a total of one year. Those responsible have now published their final report. It rates the tests at Südkreuz as a success, despite massive concerns.

Test results for #FaceRecognition at #Südkreuz station published. #Seehofer: The systems have proven themselves impressively, so that a broad introduction is possible. This can improve security for citizens.
To the report: https://t.co/OckZqvXZnu

- Federal Ministry of the Interior, Building and Community (@BMI_Bund) October 11, 2018

The test showed that "facial recognition systems can provide significant added value for police work, especially for the Federal Police," according to the ministry's press release. Interior Minister Horst Seehofer (CSU) is quoted as saying that "a broad introduction" is now possible.

Test area at Südkreuz (photo: DPA)

How the test worked

During the test in Berlin, three cameras filmed certain areas of the Südkreuz interchange station. Three programs from different manufacturers were then set loose on the live video footage. The software's task: it searched for particular faces, those of target persons defined in the system. As soon as a program believed it had spotted a familiar face in the crowd, it reported a match.

Around 300 volunteers had signed up for the project, offering themselves as target persons to be picked out of the crowd. Originally, the test was supposed to last only half a year; it was then extended to the summer of 2018. All other passers-by who crossed the designated areas became incidental captures of the system.

The Ministry of the Interior had prepared the project together with the Federal Police and the Federal Criminal Police Office (BKA); it was carried out by the Federal Police. In times of terrorist attacks, those involved hope the technology will bring more security, and that it could help catch offenders faster.

How many false hits did the programs produce?

The crux of the test, however, was the question of how reliably the software recognizes faces, and above all, how many false hits it produces. The background: if the software reports a match that does not exist, not only does an uninvolved passer-by come under suspicion of being a wanted target; with too many such false positives, the system would also become unusable for the police.

According to the final report, the best program in the test achieved a hit rate of a good 80 percent. That is, in more than 80 percent of cases, test subjects were recognized by the system as they passed through the station; whether a person wore glasses or a scarf made no difference. However, the figure also means that even the most successful system did not always recognize target persons: on average, 20 percent of station visits went unrecognized. Investigators speak in this context of a "false negative".
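The relationship between these two figures is simple arithmetic, which the following sketch illustrates. The visit counts used here are hypothetical examples, not numbers from the report; only the roughly 80/20 split reflects the reported result.

```python
# Illustrative only: how the report's hit rate ("true positive rate") and
# miss rate ("false negative rate") relate. The visit counts are hypothetical.
def hit_and_miss_rates(detected_visits: int, total_visits: int) -> tuple[float, float]:
    """Return (hit rate, miss rate) as fractions of target-person station visits."""
    hit_rate = detected_visits / total_visits
    return hit_rate, 1.0 - hit_rate

# E.g. if targets had been detected on 810 of 1,000 recorded passes:
hits, misses = hit_and_miss_rates(810, 1000)
print(f"hit rate {hits:.0%}, false-negative rate {misses:.0%}")  # hit rate 81%, false-negative rate 19%
```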

One person in a thousand comes under false suspicion

There were also "false positives", that is, false matches. According to the report, this false-hit rate is under 0.1 percent: for every 1,000 comparisons the program performs, there is one reported match that is not a match at all. When thousands of commuters pass through the station every day, this means that several random people are reported as suspicious each day, simply because the system confuses them with a target person.
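A back-of-the-envelope calculation shows why a rate under 0.1 percent still produces daily false alarms. The number of passers-by in the monitored areas used below is an assumption for illustration; only the 0.1 percent rate comes from the report.

```python
# Rough sketch: even a sub-0.1 percent false-positive rate yields several
# false alarms a day at a busy station. The daily count is hypothetical.
def expected_false_alarms(daily_passers_by: int, false_positive_rate: float) -> float:
    """Expected number of uninvolved people wrongly flagged per day."""
    return daily_passers_by * false_positive_rate

# Assume 5,000 passers-by cross the monitored camera areas each day:
print(expected_false_alarms(5_000, 0.001))  # 5.0 wrongly flagged per day
```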

From the ministry's point of view, that figure is acceptable. It also notes that the rate can be reduced still further, to "a negligible degree", by combining two different programs.

What must now be decided is "under what conditions and to what extent the technology should be used in the future", the Interior Ministry continues. The "whether", it seems, is already settled from the ministry's perspective.

In the final report of the Federal Police Headquarters, an extended, permanent use of the facial recognition software is explicitly named as a recommended course of action. The video technology should be used "at selected passenger stations" as a "supporting instrument for police searches". Cameras that are already installed should be examined to see whether they can be upgraded.

Massive interference in fundamental rights

The pilot had been accompanied by protests and warnings from privacy advocates. The German Bar Association had criticized the lack of a legal basis for this type of video surveillance. "When the faces of respectable citizens are scanned en masse at train stations, the state interferes seriously with fundamental rights," the association said.

Camera at Südkreuz (photo: DPA)

Germany's top privacy official, Federal Data Protection Commissioner Andrea Voßhoff, had criticized the test run from the start: "If such systems were to go into live operation, this would be a significant encroachment on fundamental rights."

The then Interior Minister Thomas de Maizière had brushed such concerns aside from the beginning of the test and spoken of a possible widespread use of the technology. When searching for terrorists and serious criminals, he said, he could not imagine any constitutional objections; the proportionality of any surveillance measure would, however, always have to be examined.