First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
According to a report by The Guardian, Apple contractors were listening to up to 1,000 recordings per day, many of which were triggered accidentally.
Following the report, Apple told The Verge that it would suspend the 'grading program' that governed these manual reviews. Until then, the company's policy had been to retain random audio clips from Siri for six months, after which they would be stripped of identifying information and kept for another two years.
Wednesday's announcement, however, goes further than a suspension: it ends both the default retention of recordings and the contractor-run grading program. The company will no longer keep Siri recordings unless a user opts in, and even for users who do opt in, only Apple employees, not contractors, will be able to review the audio.
The original Guardian report, based on a whistleblower's account, revealed that Siri sent audio of sexual encounters, embarrassing medical information, drug deals, and other private moments, recorded without users' knowledge, to human 'graders' for evaluation.