Apple has said it will no longer have human contractors listen to audio recordings of customers using its digital assistant Siri.
The tech giant’s apology comes about a month after a whistleblower told The Guardian about the practice, which Apple said in its statement Wednesday was part of the company’s “Siri quality evaluation process,” known as grading.
The report raised privacy concerns among customers, and Apple said Wednesday that it had immediately suspended the grading program.
“We realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said in the statement.
The company said it will resume the Siri grading program this fall, but with changes, including no longer retaining audio recordings of Siri interactions by default.
Customers will be able to opt in to let Apple review audio samples, but only Apple employees will listen to those recordings, and audio captured by “inadvertent triggers” of Siri will be deleted, the company said.
“Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy,” the company said. “We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.”
In July, The Guardian reported that Siri could be activated by something as mundane as the sound of a zipper, leaving any conversation open to recording, along with user data such as location and contact details.
Siri is not the only voice assistant whose recordings have been reviewed. In April, it was revealed that Amazon’s voice assistant Alexa was at times recording private conversations, and in July it emerged that Google’s Assistant was doing the same.
FOX Business’ Tuttie Dedvukaj contributed to this report.