SVT has previously reported that the Data Inspectorate is launching an audit to find out whether Swedish authorities use the facial recognition technology provided by the company Clearview AI.

The background is a customer list reported to have leaked from Clearview AI, from which the online publication BuzzFeed News published information. There, Sweden appears on a long list of countries where authorities are reported to have used the service.

BuzzFeed has also published a map, said to come from a Clearview AI presentation, in which Sweden is marked as one of the countries to which the company has expanded or plans to expand.

After Swedish Radio's Ekot put questions about Clearview AI to the police's press service, the authority's data protection officer carried out a quick review of the legal situation surrounding the facial recognition service.

"Many and difficult" questions

SVT has reviewed the assessment, which states that Clearview AI "raises many and difficult legal issues" and that "using the services in question should not come into question". If it nevertheless happens, "the Police Authority will be in breach of a number of provisions in data protection legislation, which may, among other things, lead to administrative fines".

The police's National Forensic Centre, NFC, has told the data protection officer that the service is not used there. Nor has it received any request to make legal assessments.

The assessment nevertheless ends with a recommendation: "To ensure that the Clearview AI software and corresponding external services are not used within the Police Authority for the time being, NFC and NOA should clarify and communicate that this is not allowed."

Following the data protection officer's review, internal information is now being sent out to police employees stating that the use of Clearview AI is not permitted.

Trial accounts may explain

The data protection officer notes that anyone can sign up for a 30-day trial of Clearview AI, and that this may explain why organizations are unaware that their employees are using the software.

Both Interpol and the Danish police have said that email addresses linked to their respective authorities may have ended up on Clearview's customer list in exactly this way: when individual employees tested the tool.

A lawyer representing Clearview AI has told the Danish daily Politiken that the leaked list contains several errors, but declined to comment further, citing an ongoing investigation.