This was reported by Apple Insider.

Initially, the company planned to launch a system that could recognize photographs containing child pornography and notify Apple if such images were uploaded to iCloud.

The postponement of the tools for identifying prohibited content is reportedly linked to concerns that the new software could pose risks to users' personal data privacy.

“Based on feedback from customers, advocacy organizations, researchers and other stakeholders, we have decided to take additional time over the coming months to gather information and make improvements before releasing these critical tools for child safety,” Apple said in a statement. ...

The company did not say when the updates would launch.

Edward Snowden, a former employee of the US National Security Agency, previously said that when IT giants can directly inspect the personal information on users' devices, this violation of user rights creates the risk that governments will exploit the capability.