
Apple has delayed the implementation of its technology for detecting and combating the distribution of images of child sexual exploitation through the company's services for a few months, and will introduce changes before its launch.

The American technology giant had planned to roll out the new technology to family iCloud accounts with iOS 15, iPadOS 15 and macOS Monterey, but it will now be delayed for a few months due to "feedback from customers, advocacy groups, researchers and others," as Apple acknowledged in a statement sent to 9to5Mac.

"We have decided to take more time over the next few months to gather information and make improvements before launching these child safety features," said a company spokesperson.

This technology, which Apple had announced in August but had not yet implemented, was designed to protect children from sexual predators who use the company's communication tools to contact and exploit minors, as well as to prevent the spread of such content.

These "new cryptographic applications" make it possible to detect images of this type that are stored in iCloud.

This method does not scan images in the cloud; instead, before photos are uploaded to iCloud, the device compares them against a database of known images provided by child safety organizations.

What is compared is not the image itself but the images' 'hashes', a kind of digital fingerprint.
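As a rough illustration of that idea, the sketch below derives a fingerprint for a photo on the device and checks it against a set of known fingerprints before upload. It is only a sketch: Apple's system relies on a perceptual hash called NeuralHash and a blinded, encrypted database, while the SHA-256 function, the placeholder set and the function names used here are stand-ins.

    import Foundation
    import CryptoKit

    // Placeholder: in the real system, the database of known-image hashes is
    // supplied (blinded and encrypted) by child safety organizations, not
    // hard-coded on the device.
    let knownImageHashes: Set<String> = []

    // Hex-encode a SHA-256 digest as a stand-in "fingerprint" of the image.
    // Apple's NeuralHash is a perceptual hash, which SHA-256 is not.
    func fingerprint(of photoData: Data) -> String {
        SHA256.hash(data: photoData).map { String(format: "%02x", $0) }.joined()
    }

    // In Apple's design the device never learns the outcome of the comparison;
    // returning it directly here is a deliberate simplification.
    func matchesKnownImage(_ photoData: Data) -> Bool {
        knownImageHashes.contains(fingerprint(of: photoData))
    }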

A cryptographic technique called 'private set intersection' determines whether there is a match without revealing the result, and that encrypted result is attached to the image when it is uploaded to iCloud.

A second technique, 'threshold secret sharing', ensures that these results can only be interpreted once an account accumulates a high number of matches; only then does Apple receive an alert so that its human review teams can examine the case.

If the material is confirmed, the user's account is deactivated and a report is sent to the relevant child protection organizations and to law enforcement.
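The threshold step can be pictured, very loosely, as the gate sketched below: nothing is flagged for review until enough matches have accumulated. In the real scheme the cryptography itself prevents Apple from reading any voucher before the threshold is crossed, so this plain counter, the struct name and the threshold value of 30 are all illustrative assumptions.

    // Loose sketch of the threshold gate. In Apple's scheme, threshold secret
    // sharing means the match vouchers cannot even be decrypted until the
    // number of matches exceeds the threshold; this counter is a simplification.
    struct SafetyVoucherReview {
        let threshold: Int   // illustrative value, not Apple's published figure
        var matchCount = 0

        // Record one voucher; return true once human review should be triggered.
        mutating func record(matched: Bool) -> Bool {
            if matched { matchCount += 1 }
            return matchCount >= threshold
        }
    }

    var review = SafetyVoucherReview(threshold: 30)       // assumed threshold
    let needsHumanReview = review.record(matched: true)   // false until crossed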

Likewise, through a second technology built into its Messages app, parents or guardians can optionally receive a notification each time a child sends or receives an image with sexually explicit content. This happens only after the child has seen a warning explaining that, if they proceed to view the image, their parents will be notified.
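That flow can be summarised in a few lines of code. Again, this is only a sketch of the behaviour described above; the names are hypothetical and this is not Apple's actual Messages API.

    // Hypothetical sketch of the Messages flow as described: the child is warned
    // first, and the (optional) parent notification is sent only if the child
    // chooses to view the image anyway.
    enum ChildChoice { case viewAnyway, doNotView }

    func shouldNotifyParents(parentNotificationsEnabled: Bool,
                             childChoice: ChildChoice) -> Bool {
        // The child always sees the warning before deciding.
        print("Warning: if you view this image, your parents will be notified.")
        guard childChoice == .viewAnyway else { return false }
        return parentNotificationsEnabled
    }

    let notifyParents = shouldNotifyParents(parentNotificationsEnabled: true,
                                            childChoice: .viewAnyway)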

For now, Apple has not specified when the technology will launch following the delay, or what changes it intends to make to it.

The decision comes after criticism from experts and organizations who warned that the system violated users' privacy and could be used as a back door to spy on people.
