Apple wants to scan photos on iPhones in the fight against child pornography, for now only in the United States.

Despite the noble intentions, the initiative has also drawn criticism over the past week.

In the Netherlands, networks and servers of websites can be scanned for the presence of child pornography using a similar method.

But that also brings with it dilemmas.

Apple's plan has been met with opposition from experts.

They fear a slippery slope: today it is child pornography; tomorrow a regime may demand checks on whether citizens' iPhones contain images of, for example, homosexuality or lèse-majesté.

Apple promises to use the software only to hand over "valuable and useful information" about child pornography to the police.

The company says it will not give in to pressure to expand the system.

Experts are not convinced.

Moreover, a mandatory reporting requirement when child pornography is found, such as applies in the US, is counterproductive, says Arda Gerkens, director of the Online Child Abuse Expertise Agency (EOKM), in conversation with NU.nl.

She points out that Apple will scan devices for material already known to authorities.

"Imagine what tens of thousands of reports per year mean for the workload of the police. They have to do an investigation, a search and the confiscation of data carriers. And it is not the case that one detective is working on that."

“You are not going to save victims with it and you are not going to find perpetrators with it.”

Arda Gerkens, director EOKM about a reporting obligation

"While it may just be one person in a remote cabin who was sent an image and whose phone is set to download it automatically," Gerkens outlines.

"It concerns known material. So you are not going to save victims with it and you are not going to find perpetrators with it. Meanwhile, thousands of people gather in other online places to exchange child pornography. The police could spend that time much better tracing those networks. That's where the real perpetrators and victims are."

Scan for cleaning, not for discovery

The EOKM offers a similar tool to detect child pornography.

Companies and organizations can voluntarily use the scanner to check their online platforms, networks and servers for the presence of known child pornography material.

The software essentially works the same as the method that Apple wants to apply.

A code is assigned to known images via a mathematical formula.

The same is done with all photos in the location being searched for child pornography.

If a code there matches a code from the database of child abuse or child pornography images, the alarm bells go off.

The service of the EOKM is called the HashCheckService.

It refers to the method of converting the image material into a unique code: a hash.
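The matching step described above can be sketched in a few lines. In reality, services like this use perceptual hashes so that resized or re-encoded copies still match; this minimal illustration uses an ordinary cryptographic hash instead, and all file names, image bytes and the "known" database are hypothetical.

```python
import hashlib

def hash_image(image_bytes: bytes) -> str:
    # Convert the image material into a unique code: a hash.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of already-known illegal images.
known_hashes = {hash_image(b"known-image-bytes")}

def scan(photos: dict[str, bytes]) -> list[str]:
    # Hash every photo in the scanned location and report the ones
    # whose code matches a code in the known-material database.
    return [name for name, data in photos.items()
            if hash_image(data) in known_hashes]

photos = {
    "holiday.jpg": b"harmless-image-bytes",
    "flagged.jpg": b"known-image-bytes",
}
print(scan(photos))  # only the photo matching the database is flagged
```

Note that such a scan can only ever recognize material that is already in the database, which is exactly Gerkens' point: it finds known images, not new victims or perpetrators.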

“We want to be able to guarantee that it is not going in the wrong direction.”

Marco Edelman, spokesperson for TransIP

Gerkens emphasizes that the service is not intended for tracing perpetrators.

"There is no obligation to report it to the police. In fact, it is not necessary, because the material is already known. The most important thing is that the internet becomes clean."

Handful of incidents with Dutch scanning software

The risks and dilemmas of an approach like Apple's are illustrated by a recent incident in which the HashCheckService removed legitimate images.

The service consists of several databases.

Connected to it are not only child pornography images from the Dutch police, but also those from the American NCMEC, the international police organization Interpol and the Canadian police.

"The Canadian database consists of three categories: illegal, questionable and other, so not illegal," explains Gerkens.

"The latter set contains, for example, residual material that is found during an investigation. That should not have been included."

Because of the incident, TransIP, a company that provides server space for websites and makes the HashCheckService available, informed its customers in June that it would temporarily disable the service.

"We are struggling with it," says TransIP spokesperson Marco Edelman.

"We joined the service with a view to improving the world, but we also want to be able to guarantee that the lists are correct and that it is not going in the wrong direction."

In addition to this incident, the HashCheckService has incorrectly removed images "four to six times," says Gerkens, who adds that the service is in use at about thirty companies.

"Where human work is involved, you can never guarantee 100 percent that everything will always go well," says Edelman of TransIP.

"We're still having a bit of trouble with that right now."

The company is now waiting for the EOKM to make improvements before sitting down with the organization again.
