Apple on Thursday unveiled new tools to better identify sexual images involving children on its iPhones and iPads and on its iCloud servers in the United States, drawing concern from internet privacy advocates.

"We want to help protect children against predators who use communication tools to recruit and exploit them, and limit the dissemination of child pornography," said the group on its site.

To do this, Apple plans to use cryptographic tools to compare photos uploaded to its iCloud servers against a database of known images maintained by the National Center for Missing and Exploited Children (NCMEC).

The company says it does not have direct access to the images themselves.

When a photo matches one in the database, an Apple employee will review it manually, disable the user's account if necessary, and send a report to the Center.
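Apple's technical summary describes this matching as a perceptual hash ("NeuralHash") combined with a private set intersection protocol, the details of which go well beyond this article. As a rough illustration only, the hypothetical Swift sketch below shows the basic shape of matching an image against a set of known hashes, using an exact SHA-256 digest as a deliberately simplified stand-in; `knownDigests` holds a placeholder value, not real NCMEC data.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch: exact-hash matching against a known-image set.
// Apple's announced system uses a perceptual hash plus private set
// intersection, which is far more involved; this is illustration only.

// Placeholder digest standing in for the NCMEC-derived database.
let knownDigests: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Returns true if the photo's SHA-256 digest appears in the known set.
func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}
```

A real deployment would use a perceptual hash so that resized or re-encoded copies of a known image still match, which exact cryptographic hashing cannot do.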

Apple: Think Different.


Users: Great!


Apple: About encryption and privacy.


Users: Yeah!!


Apple: We're scanning your private messages, and we're gonna tell your parents.


Users: https://t.co/WDKtt8r6I7

- EFF (@EFF) August 5, 2021

"An infrastructure of surveillance and censorship"

The company also plans to scan images sent or received via the iMessage messaging service on children's accounts linked to a Family Sharing plan.

When explicit photos are detected, they will be blurred and the child will be shown prevention messages before the image can be opened or sent.

Parents can also choose to receive an alert when their child receives or sends such photos.
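Apple has not published the client-side implementation; the hypothetical Swift sketch below only illustrates the flow described above, with `isExplicit(_:)` standing in for the undocumented on-device classifier and the blur done with a standard Core Image filter.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical stand-in for the on-device classifier; Apple has not
// documented its model, so this placeholder always returns false.
func isExplicit(_ image: CIImage) -> Bool {
    false
}

/// Blurs a flagged image and signals that a prevention message
/// should be shown before the child can reveal or send it.
func prepareForDisplay(_ image: CIImage) -> (shown: CIImage, needsWarning: Bool) {
    guard isExplicit(image) else { return (image, false) }
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = image
    blur.radius = 30 // heavy blur so the content is unrecognizable
    return (blur.outputImage ?? image, true)
}
```

The parental notification described above would be a separate step, triggered only if the child chooses to view or send the image anyway.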

The voice assistant Siri will also be trained to “intervene” when users search for child sexual abuse images, warning them that such content is problematic.

These tools will be rolled out gradually with the next updates to the operating systems for the iPhone, iPad, Apple Watch and Mac in the United States.

These changes "mark a significant departure from long-established privacy and security protocols," said the Center for Democracy and Technology (CDT).

"Apple is replacing its end-to-end encrypted messaging system with a surveillance and censorship infrastructure, which will be vulnerable to abuse and abuse not only in the United States, but around the world," said Greg Nojeim of CDT.

"The exploitation of children is a serious problem"

According to him, the company should "abandon these changes and restore the confidence of its users in the security and integrity of their data stored on Apple devices and services."

The tech giant has built a reputation for defending its customers' privacy against pressure from authorities seeking access to user data in the name of fighting crime or terrorism.

"The exploitation of children is a serious problem, and Apple is not the first technology company to change its position on the protection of privacy in an attempt to combat it," said India McKinney and Erica Portnoy of the NGO for the protection of freedoms on the Internet Electronic Frontier Foundation (EEF).

But even when developed with the best of intentions, a system designed to detect child sexual abuse images "opens the door to other abuses," they warn in a blog post.

Apple would only need to tweak a few settings to search for other types of content, or to scan the accounts not just of children but of everyone, they explain.
