Voice Assistant: Apple stops evaluating Siri recordings

The company responds to criticism from privacy advocates. Worldwide, no voice assistant recordings will be evaluated unless users explicitly allow it.

Apple has stopped evaluating Siri recordings worldwide. For the time being, no recordings of the voice assistant will be evaluated, a company spokeswoman said. After a software update, users will be explicitly asked for permission.

The company responded to criticism from privacy advocates, who have long denounced the practice, which is also common at Amazon and Google. Google's parent company Alphabet has announced similar steps, but so far only for Europe.

Couples having sex

The Siri assistant software lets smartphone users send text messages, play music, or call someone by voice command. To improve the software, portions of the recordings are transcribed and evaluated by people. According to the company, this is meant to reduce errors caused, for example, by dialects, slips of the tongue, or sounds that Siri mistakenly interprets as speech.

A report by the British newspaper The Guardian had sparked outrage a few days earlier. According to the report, sensitive conversations recorded by Siri were listened to at Apple, for example about people's health or business details. Even recordings of couples having sex were said to have been evaluated, the newspaper reported, citing insiders.

A major problem is that Siri sometimes activates unintentionally because the software mistakenly hears the wake phrase "Hey Siri". As the Guardian reported, the word "Syria" is enough to wake the voice assistant. In some cases, Siri even interpreted the sound of a zipper as a command to record.

ref: Zeit
