Hey, Siri reviewer, no more listening to sex talk

Apple and Amazon to curtail human review of voice recordings

WASHINGTON • Apple and Amazon have announced that they will curtail the use of humans to review conversations on their digital voice assistants, a move that gives users more control over the privacy of their communications.

Apple said it would stop using contractors to listen in on users through Siri after an Apple whistle-blower told The Guardian that the contractors responsible for "grading" the digital assistant's accuracy regularly overheard conversations about doctors' appointments, drug deals and even couples having sex.

Their job was to determine what triggered Siri into action - whether the user had actually said, "Hey, Siri" or if it was something else, such as the sound of a zipper.

Apple said it would suspend the global analysis of those voice recordings while it reviewed the grading system. Users will be able to opt out of reviews during a future software update.

"We are committed to delivering a great Siri experience while protecting user privacy," said Apple spokesman Cat Franklin, in an e-mail to The Washington Post.

Amazon updated its privacy policy regarding voice recordings made by its Alexa service. It will now let users opt out of having humans review those recordings by selecting a new option in the settings of the Alexa smartphone app. Amazon employees listen to those recordings to help improve its speech-recognition technology.

The company tweaked Alexa privacy features in May, giving users the ability to delete recordings of their voices. And users could already opt out of letting Amazon develop new features with their voice recordings. Many smart-speaker owners do not realise that Siri, Alexa and, until recently, Google's Assistant keep recordings of everything they hear after their so-called "wake word" to help train their artificial intelligence.

Google quietly changed its defaults last year, and Assistant no longer automatically records what it hears after the prompt "Hey, Google". Apple said it uses the data "to help Siri and dictation... understand you better and recognise what you say". But this was not made clear to users in Apple's terms and conditions.

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," the Apple whistle-blower told The Guardian.

"These recordings are accompanied by user data showing location, contact details and app data."

In response, Apple said that the recordings accounted for only 1 per cent of Siri activations and lasted just a few seconds. They also were not linked to users' Apple IDs.

The Apple whistle-blower said the Apple Watch and the HomePod, a smart speaker, were especially prone to accidental activation.

Apple contractors in Ireland told The Guardian that they had been sent home for the weekend, having been told that the global grading system "was not working".

WASHINGTON POST


A version of this article appeared in the print edition of The Sunday Times on August 04, 2019, with the headline "Hey, Siri reviewer, no more listening to sex talk".