Apple is taking a radical step in the fight against child pornography.

Starting in the autumn, the company will initially compare photos on the devices of US users against a list of known child pornographic material when they use its in-house online storage service, iCloud.

On Thursday, Apple presented a complex procedure for this that is meant to safeguard privacy.

For the comparison, a file containing so-called "hashes" of already known child pornographic content - a kind of digital fingerprint of the images - is to be loaded onto the devices.

Special matching processes can identify a copy of a photo from its hash, but the original image cannot be reconstructed from it.
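To make the mechanism concrete, here is a minimal sketch of how matching against a hash list works in principle. It is not Apple's implementation: Apple uses a perceptual "NeuralHash" that also recognizes slightly altered copies, while this illustration substitutes an ordinary SHA-256, which only matches byte-identical files; the hash list here is an empty placeholder.

    # Minimal sketch of hash-list matching, not Apple's actual system.
    # Apple uses a perceptual "NeuralHash"; this stand-in uses SHA-256,
    # which only recognizes byte-identical copies.
    import hashlib

    # Hypothetical list of known hashes; in the real system the list
    # ships as part of the operating system.
    KNOWN_HASHES: set = set()

    def image_hash(data: bytes) -> str:
        # The fingerprint identifies a copy of the photo, but the
        # original image cannot be reconstructed from it.
        return hashlib.sha256(data).hexdigest()

    def is_known(data: bytes) -> bool:
        return image_hash(data) in KNOWN_HASHES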

If there is a match, the suspicious image is given a certificate that, by way of exception, allows Apple to open it after it has been uploaded to iCloud and subject it to review.

The system only raises the alarm once a certain number of hits is reached.

How many hits are required is not being made public.
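As a rough illustration of this threshold logic, assuming a made-up threshold value (the real number is not public): each match only attaches a marker, and review becomes possible once enough of them accumulate for an account.

    # Sketch of the threshold logic. THRESHOLD is an arbitrary stand-in:
    # Apple does not disclose the real number of required hits.
    THRESHOLD = 10

    class Voucher:
        """Stand-in for the certificate attached to a matched image."""
        def __init__(self, image_id: str):
            self.image_id = image_id

    def review_allowed(vouchers: list) -> bool:
        # Apple can only open the flagged images once the number of
        # matches for an account reaches the threshold.
        return len(vouchers) >= THRESHOLD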

When material is found, the alarm is raised

If child pornographic material is actually discovered during the check, Apple reports this to the American non-governmental organization NCMEC (National Center for Missing & Exploited Children), which in turn can involve the authorities.

While the function is only activated for Apple customers with US accounts, the file with the hashes is an integral part of the operating system.

It is to be loaded onto all iPhones running this version of the operating system.

The list is to be updated on the devices when new versions of the operating systems for iPhones and iPad tablets are released.

Before the function can be introduced internationally, the legal requirements must first be clarified.

Users on whose devices known child pornographic material is found by the comparison will not be informed of this.

However, their accounts will be blocked.

The comparison via hashes is also used, for example, by online platforms to discover such content while it is being uploaded and to prevent it from being published.

According to the industry, the process works practically flawlessly for photos - but it does not yet work for videos.

Back doors demanded for authorities

Critics of the encryption of private communication that is now standard in chat services and on smartphones often cite the fight against child sexual abuse as an argument for demanding back doors for the authorities.

Apple's newly announced system is an attempt to solve the problem in a different way.

The company has repeatedly fought off demands by US security authorities to crack the encryption of its devices during investigations.

The focus on hashes of already known photos also means that new content created on the devices goes undetected.

Apple published analyses by several experts who welcomed the privacy protections built into the process.

At the same time, Matthew Green, a cryptography expert at Johns Hopkins University in the United States, criticized on Twitter that the ability to scan files on the devices was being created at all.

Specifically, he sees the danger that someone could smuggle hashes of other content onto the devices - and that authoritarian governments could enact rules requiring searches for other content in this way.

Another function will in future allow parents to receive a warning if their child receives or sends nude photos in Apple's iMessage chat service.

The nudity in the pictures is detected by software on the device.

Apple itself learns nothing about it.
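Put as a sketch, and assuming a hypothetical on-device classifier (detect_nudity below is a placeholder, not a real API): both the check and the parental warning happen locally, and no report is sent to Apple.

    # Hypothetical sketch of the iMessage check. detect_nudity stands in
    # for the on-device machine-learning model, whose details are not public.
    def detect_nudity(image: bytes) -> bool:
        return False  # placeholder: the real classifier runs on the device

    def handle_image(image: bytes, warn_parent) -> None:
        if detect_nudity(image):
            # The warning goes to the parents' devices only.
            warn_parent("A nude photo was detected in your child's iMessage.")
        # In neither case is anything reported back to Apple.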