A whistleblower claimed Apple contractors overheard private conversations through accidental Siri recordings.
Last week (26 July), a whistleblower anonymously aired privacy concerns about how Apple contractors grade recordings from voice assistant Siri.
The individual, who works for Apple, told The Guardian that contracted staff overheard private conversations through accidental Siri recordings. According to the source, some of these recordings featured confidential medical information, drug deals and couples having sex.
While the recordings were not associated with the user’s Apple ID, the individual who spoke to The Guardian said that it was not unusual for recordings to feature enough information to make the speaker identifiable. For instance, some clips featured names and addresses.
The Apple Watch and Apple’s HomePod were responsible for most of these accidental recordings, according to the whistleblower.
Seven days after publishing these claims, The Guardian shared an update on the situation.
It said that contractors working for Apple in Ireland showed up to work this morning (2 August) but “were sent home for the weekend after being told the system they used for the grading [Siri recordings] was ‘not working’ globally”. The Guardian reported that managers were allowed to remain on site, but contractors were left unsure what this meant for the current state of their work and the future of their employment.
In a statement to The Verge, Apple said: “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
The Verge also asked Apple if it would stop saving Siri recordings on its servers, but the company declined to comment. According to The Verge, Apple keeps the recordings for six months before removing identifying information from a copy that it may retain for two years or more.
The Guardian also spoke to UK director of Big Brother Watch, Silkie Carlo, who said: “Too often we see that so-called ‘smart assistants’ are in fact eavesdropping. We also see that they often collect and use people’s personal information in ways that people do not know about and cannot control.
“Apple’s record on privacy is really slipping. The current iOS does not allow users to opt out of face recognition on photos, and this revelation about Siri means our iPhones were listening to us without our knowledge.”