More than a dozen leading cybersecurity experts on Thursday criticized plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

In a 46-page study, the researchers wrote that Apple's proposal to detect images of child sexual abuse on iPhones, as well as a similar idea under consideration in the European Union to scan encrypted devices for such abuse and terrorist imagery, relies on "dangerous technology."

According to the researchers, "resisting attempts to spy on and influence law-abiding citizens should be a national security priority."

This technology, known as client-side scanning, would allow Apple, and potentially European law enforcement officials, to detect child sexual abuse images on a person's phone by scanning images as they are uploaded to Apple's iCloud storage service.

When Apple announced its planned tool in August, it said it would compare so-called fingerprints of photos against a database of known child sexual abuse material to look for potential matches.
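The matching step described above rests on perceptual fingerprinting: an image is reduced to a compact hash that stays stable under small changes such as resizing, and that hash is compared against a database of known material. Apple's actual system, NeuralHash, uses a learned hash; the sketch below instead uses a much simpler "difference hash" to illustrate the idea, and the function names and tiny in-memory database here are illustrative assumptions, not Apple's implementation.

```python
# Simplified sketch of perceptual-hash matching (NOT Apple's NeuralHash).
# A "difference hash" fingerprints an image by comparing adjacent pixel
# brightness; near-duplicate images tend to yield nearby hashes.

def dhash(gray, hash_w=8, hash_h=8):
    """Compute a difference hash from a 2D grayscale image (list of rows).

    The image is shrunk to (hash_w + 1) x hash_h by nearest-neighbor
    sampling, then each bit records whether a pixel is brighter than
    its right-hand neighbor.
    """
    h, w = len(gray), len(gray[0])
    small = [
        [gray[r * h // hash_h][c * w // (hash_w + 1)] for c in range(hash_w + 1)]
        for r in range(hash_h)
    ]
    bits = 0
    for row in small:
        for c in range(hash_w):
            bits = (bits << 1) | (1 if row[c] > row[c + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(fingerprint, known_hashes, max_distance=4):
    """Flag an image if its fingerprint is within max_distance bits of
    any known hash -- a stand-in for the database lookup."""
    return any(hamming(fingerprint, k) <= max_distance for k in known_hashes)

# Illustrative usage with a synthetic 16x16 gradient "image".
image = [[(r * 16 + c) % 256 for c in range(16)] for r in range(16)]
fp = dhash(image)
database = {fp}                       # pretend this hash is "known"
print(matches_database(fp, database))            # True: exact match
print(matches_database(fp ^ 0b111, database))    # True: only 3 bits differ
```

The tolerance for near-matches (the `max_distance` threshold) is also what the researchers' evasion critique targets: edits that push a hash just past the threshold defeat detection.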


But the plan has caused an uproar among privacy advocates, raising fears that the technology could undermine digital privacy and be used by authoritarian governments to track down political opponents and, eventually, other enemies.

Apple said it would reject any such requests from foreign governments, but the backlash led it to temporarily postpone the tool's release in September.

The company declined to comment on the report, which was released on Thursday.

Cybersecurity researchers said they began their study before Apple's announcement.

Documents released by the European Union and a meeting with EU officials over the past year have led them to believe that the EU leadership wants a similar program, not only to expose images of child sexual abuse, but also to look for signs of organized crime and indications of terrorist links.


The researchers believe a proposal to allow image screening in the European Union could come soon, possibly this year.

The researchers said they decided to publish their findings now to warn the European Union of the plan's risks, and because "the breadth of state surveillance powers really crosses a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from monitoring concerns, the researchers said their findings suggested the technology was not effective in identifying images of child sexual abuse.

Within days of Apple's announcement, they said, people had figured out ways to avoid detection by making minor edits to the photos.

"It allows scanning of a personal, private device without any probable cause that anything illegitimate is being done," added Susan Landau, a professor of cybersecurity and policy at Tufts University and another member of the group.

"It is very dangerous. It is dangerous for business, national security, public safety and privacy," she said.

© New York Times Foundation 2021

Translated into Arabic by Al Jazeera.