Apple is changing Siri in aftermath of audio recordings controversy

Apple 'sorry' that workers listened to Siri voice recordings

Apple has issued a formal apology for employing human contractors to listen to audio recordings of its users talking to Siri.

The company said these contractor reviews were meant to improve users’ experiences with its digital assistant. But after media reports revealed the practice and sparked privacy concerns among users around the world, Apple admitted it didn’t live up to its “high ideals”. Actions speak louder than words, though, right? Apple said it will require users to opt in to having their recordings listened to by human reviewers, rather than making this the default. And only Apple employees, rather than contract workers, will be allowed to listen to audio samples of Siri interactions.


The company also said it will, by default, no longer keep audio recordings of users’ interactions with Siri.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process,” Apple said in a blog post. “As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize.”

Why is Apple sorry?

Apple was recently caught using human contractors to review recordings from Siri – something it had never made explicitly clear to customers. The Guardian reported that those contractors had access to voice clips that were often captured through accidental Siri triggers. Workers reportedly listened to up to 1,000 recordings a day, and many of the clips were long enough to reveal private information.

How is Siri’s privacy policy changing?

Apple said it now plans to change Siri’s privacy policy. Here’s how:


  1. By default, Apple will no longer retain audio recordings of Siri interactions. It will continue to use computer-generated transcripts, however.
  2. Users will be able to opt in to a feature that allows Siri to “learn from” the audio samples of their requests. Apple hopes people will opt in, knowing that it “respects their data and has strong privacy controls in place”. Users can also opt out at any time.
  3. Only Apple employees will be allowed to listen to audio samples of Siri interactions. Apple is therefore no longer using human contractors, and said its team will “work to delete any recording which is determined to be an inadvertent trigger of Siri”.

Apple has halted Siri grading for now. It plans to resume this autumn, once its new privacy policy and a software update have rolled out to users.
