On Thursday, August 5, the computer giant Apple unveiled several tools intended to better identify images of a sexual nature involving children on its products and servers.

"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the dissemination of child pornography," the group said on its site.

To do this, Apple plans to use cryptographic tools to compare photos uploaded to its iCloud server with those stored in a file maintained by the National Center for Missing and Exploited Children (NCMEC).

The group says it does not have direct access to the images themselves.

When a photo looks similar to one in the file, Apple will manually review it, disable the user's account if necessary, and send a report to the Center.
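Apple's actual system relies on a proprietary perceptual hash ("NeuralHash") and cryptographic protocols that keep both the photos and the NCMEC list private, none of which is public. As a much-simplified illustration of the general idea, here is a sketch that checks exact SHA-256 digests of image bytes against a hypothetical list of known digests (real perceptual hashing also matches near-duplicates, which exact hashing does not):

```python
import hashlib

# Hypothetical list of digests of known images. In Apple's real system the
# list comes from NCMEC, uses perceptual (not exact) hashes, and is never
# exposed in plaintext on the device.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a known image
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_digest(data: bytes) -> str:
    """Compute an exact SHA-256 digest of the raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Flag the image only if its digest appears in the known list."""
    return image_digest(data) in KNOWN_HASHES

print(matches_known(b"test"))     # matches the listed digest
print(matches_known(b"holiday"))  # unknown image, no match
```

In the sketch, a flagged match would then trigger the human review step the article describes; a miss means the photo is never inspected.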

Scanning of images received by message, parental controls...

The group also plans to scan images sent or received via the iMessage messaging service on children's accounts linked to a family subscription.

When explicit photos are identified, they will be blurred and the child will be shown prevention messages before the photos can be opened or sent.

Parents can, if they wish, choose to receive a notification when their child receives or sends such photos.

The voice assistant Siri will also be trained to "intervene" when users search for child pornography, warning them that such content is problematic.

These tools will be rolled out gradually with upcoming updates to the operating systems for the iPhone, iPad, Apple Watch and Mac in the United States.

A control system that "opens the door to other abuses"

These changes "mark a significant departure from long-established privacy and security protocols," said the Center for Democracy and Technology (CDT).

"Apple is replacing its end-to-end encrypted messaging system with a surveillance and censorship infrastructure, which will be vulnerable to abuse and misuse not only in the United States, but around the world," said Greg Nojeim of the CDT in a message sent to AFP.

According to him, the group should "abandon these changes and restore the confidence of its users in the security and integrity of their data stored on Apple devices and services."

The computer giant has a reputation for defending the privacy of its customers in the face of pressure from certain authorities seeking to access user data in the name of the fight against crime or terrorism.

"Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to fight it," said India McKinney and Erica Portnoy of the Electronic Frontier Foundation (EFF), an NGO that defends freedoms on the Internet.

But even when developed with the best of intentions, a system designed to detect child pornography "opens the door to other abuses," they warn in a blog post.

Apple would only need to tweak the settings slightly to detect other types of content, or to scan not just children's accounts but everyone's, they explain.

With AFP
