New York (AFP)

Apple on Thursday unveiled new tools to better identify sexually explicit images involving children on its iPhones, iPads and iCloud servers in the United States, raising concerns among online privacy advocates.

"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the dissemination of child pornography," the group said on its site.

To do this, Apple plans to use cryptographic tools to compare photos uploaded to its iCloud servers against a database of known images maintained by the National Center for Missing and Exploited Children (NCMEC).

The company says it does not have direct access to the images themselves.

When a photo is flagged as similar to one in the database, an Apple employee will manually review it, disable the user's account if necessary and send a report to the Center.
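The flow described above, matching fingerprints of uploaded photos against a database of known images and sending matches on to human review, can be illustrated with a simplified sketch. This is not Apple's implementation: the real system reportedly relies on a perceptual "NeuralHash" and on-device cryptographic matching, while the sketch below uses a plain SHA-256 digest and invented names purely for illustration.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of the matching flow described in the article.
// An exact SHA-256 digest stands in for Apple's perceptual image fingerprint.
struct UploadScanner {
    // Fingerprints of known abusive images (in reality derived from NCMEC's database).
    let knownFingerprints: Set<String>

    // Compute a stand-in fingerprint for an image.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Return the uploads whose fingerprints match the database; only these
    // would go on to manual review, possible account suspension and a report.
    func imagesNeedingReview(in uploads: [Data]) -> [Data] {
        uploads.filter { knownFingerprints.contains(fingerprint(of: $0)) }
    }
}
```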

The company also plans to scan images sent or received via its iMessage messaging service on children's accounts linked to a family plan.

When explicit photos are detected, they will be blurred and the child will be shown prevention messages before the images can be opened or sent.

Parents can also opt to receive an alert when their child receives or sends such photos.
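The iMessage behaviour described above can be sketched in the same hypothetical spirit. The types and the `isExplicit` flag below are stand-ins for whatever on-device classifier Apple actually ships; the logic simply mirrors the blur-warn-notify sequence reported here.

```swift
// Hypothetical sketch of the child-account message flow, not Apple's API.
struct ChildAccountSettings {
    // Whether the parents opted in to alerts about explicit images.
    let parentalNotificationsEnabled: Bool
}

enum IncomingImageAction {
    case showNormally
    case blurWithWarning(notifyParents: Bool)
}

// Decide how to present an incoming image on a child's account.
func handleIncomingImage(isExplicit: Bool, settings: ChildAccountSettings) -> IncomingImageAction {
    guard isExplicit else { return .showNormally }
    // The child first sees a blurred preview and prevention messages;
    // parents are notified only if they chose to receive such alerts.
    return .blurWithWarning(notifyParents: settings.parentalNotificationsEnabled)
}
```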

The voice assistant Siri will also be trained to "intervene" when users search for child pornography, warning them that such content is problematic.

These tools will be rolled out gradually with upcoming updates to the operating systems of the iPhone, iPad, Apple Watch and Mac in the United States.

- Privacy -

These changes "mark a significant departure from long-established privacy and security protocols," said the Center for Democracy and Technology (CDT).

"Apple is replacing its end-to-end encrypted messaging system with a surveillance and censorship infrastructure, which will be vulnerable to abuse and abuse not only in the United States, but around the world," said Greg Nojeim of CDT in a message sent to AFP.

According to him, the company should "abandon these changes and restore the confidence of its users in the security and integrity of their data stored on Apple devices and services".

The tech giant has a reputation for defending its customers' privacy in the face of pressure from authorities seeking access to user data in the name of fighting crime or terrorism.

"The exploitation of children is a serious problem, and Apple is not the first technology company to change its position on the protection of privacy in an attempt to combat it," said India McKinney and Erica Portnoy of the NGO for the protection of freedoms on the Internet Electronic Frontier Foundation (EEF).

But even if developed with the best of intentions, a system designed to detect child pornography "opens the door to other abuses," they warned in a blog post.

Apple would only need to tweak its settings slightly to search for other types of content, or to scan the accounts not just of children but of everyone, they explained.

© 2021 AFP