Apple tests opt-in feature for Siri recording review program

Apple is rolling out a new opt-in notice for Siri audio sample review with the beta of iOS 13.2. This opt-in feature was promised back in August, after reports that audio from Siri requests was being reviewed by contractors and that the audio could contain sensitive or personal information.

Apple had previously halted the grading process entirely while it updated the process by which it used the audio clips to “improve Siri.”

The new process will include an explicit opt-in for those users who want to have clips of commands transmitted to Apple to help improve how well Siri understands commands.

The update is out in beta for iOS 13.2, iPadOS 13.2, tvOS 13.2, watchOS 6.1 and macOS 10.15.1.

Some particulars of the new policy include:

  • An explicit opt-in.
  • Only Apple employees will be reviewing audio clips, not contractors.
  • Computer-generated transcripts will continue to be used for all Siri users. These are text only, with no audio, and are disassociated from identifying information by use of a random identifier.
  • These text transcripts, which Apple says include a small subset of requests, may be reviewed by employees or contractors.
  • Any user can opt out at any time by going to Settings > Privacy > Analytics and Improvements and turning off “Improve Siri and Dictation.”

Apple is also launching a new Delete Siri and Dictation History feature. Users can go to Settings > Siri and Search > Siri History to delete all data Apple has on their Siri requests. If Siri data is deleted within 24 hours of making a request, the audio and transcripts will not be made available for grading.

The new policies can be found at Settings > Privacy > Analytics and Improvements > About Siri in the iOS 13.2 beta. A key section details how these segments are used:

If one of your Siri or Dictation interactions is selected for review, the request, as well as the response Siri provided, will be analyzed to determine accuracy and to generally improve Siri, Dictation, and natural language processing functionality in Apple products and services. Depending on the context of your request, Apple employees may review Siri Data directly relevant to the request, in order to grade the effectiveness of Siri’s response. Only Apple employees, subject to strict confidentiality obligations, are able to access audio interactions with Siri and Dictation.

This is a solid set of updates to Siri protections that addresses user concerns. The continued use of text transcripts that may be reviewed by contractors is one sticking point, but the fact that they are text only, anonymized and separated from any background audio may appease some critics.

These were logical and necessary steps to make this process clearer to users, and to get an explicit opt-in from people who are fine with it happening.

The next logical update, in my opinion, would be a way for users to see and hear the text and audio that Apple captures from their Siri requests. If you could review, say, your last 100 requests as text or audio clips (the same information that may be reviewed by Apple employees or contractors), I think it would go a long way toward dispelling the concerns people have about this process.

This would fit with Apple’s stated policy of transparency when it comes to user privacy on its platforms. Being able to see the same things other people are seeing about your personal data, even if that data is anonymized, just seems fair.
