Apple will launch its system to check photos for images of child sexual abuse on a country-by-country basis, depending on local laws, the company said Friday.

A day earlier, Apple said it would implement a system that screens photos for such images before they are uploaded from iPhones in the United States to its iCloud storage.

Child safety groups praised Apple when it joined with Facebook, Microsoft and Google in taking such steps.

But Apple's checking of photos on the iPhone itself has raised concerns that the company is probing its users' devices in a way that could be exploited by governments.

Many other tech companies check photos only after they are uploaded to servers.

At a media briefing on Friday, Apple said it would make plans to expand the service based on the laws of each country where it operates. The company said nuances in its system, such as "safety vouchers" passed from the iPhone to Apple's servers that do not contain useful data, will protect Apple from government pressure to identify material other than child abuse images.

Apple also has a human review process that acts as a backstop against government abuse, it added.

The company will not pass reports from its photo-checking system to law enforcement if the review does not find images of child abuse.

Regulators are increasingly demanding that tech companies do more to remove illegal content.

For the past several years, law enforcement agencies and politicians have invoked the scourge of child abuse material to argue against strong encryption, just as they had previously cited the need to curb terrorism.

Some resulting laws, including in Britain, could be used to force tech companies to act secretly against their users.


While Apple's strategy may deflect government meddling by showing its proactivity, or comply with anticipated directives in Europe, many security experts said the privacy champion was making a big mistake by showing its willingness to reach into customers' phones.


"It may have deflected the attention of US regulators from this issue, but it will attract regulators internationally to do the same thing with terrorist and extremist content," said Riana Pfefferkorn, a researcher at the Stanford Internet Observatory. "Politically influential copyright holders in Hollywood and elsewhere might even argue that their digital rights should be enforced that way."


WhatsApp, the world's largest fully encrypted messaging service, is also under pressure from governments that want to see what people are saying, and it fears that pressure will now increase.

WhatsApp head Will Cathcart tweeted that he was "concerned," calling the approach a step backward for people's privacy.


"We have had personal computers for decades, and there has never been a mandate to scan the private content of all desktops, laptops or phones around the world for illegal content," he wrote. "This is not how technology built in free countries works."


Apple experts argued that the company wasn't really reaching into people's phones, because data sent from its devices must clear multiple hurdles. For example, banned material is flagged by watchdog groups, and the identifiers are bundled into Apple's operating systems worldwide, making them harder to manipulate.

