A whistleblower has told The Guardian that contractors working for Apple have overheard private conversations through accidental Siri recordings.
The whistleblower told the publication that these contractors are tasked with listening to a small proportion of Siri recordings and grading the virtual assistant's responses to queries.
According to the worker, who remained anonymous, these recordings are often triggered by accident, meaning contractors working for Apple have overheard “confidential medical information, drug deals and recordings of couples having sex”.
These recordings are not associated with the user’s Apple ID, but the whistleblower said that the short clips often divulge enough information to identify the speaker, sometimes including names and addresses.
“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings],” the whistleblower added.
When The Guardian reached out to Apple for comment, the company said: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under strict obligation to adhere to Apple’s strict confidentiality requirements.”
Apple said this small portion amounts to less than 1pc of all Siri requests. As hundreds of millions of devices use Siri regularly, that figure could still run to hundreds of thousands of recordings, TechCrunch estimated.
The anonymous individual who spoke to The Guardian said that the Apple Watch and the Apple HomePod were responsible for most of these accidental recordings.
Both Amazon and Google were criticised earlier this year for employing staff to listen to Alexa and Google Assistant recordings. Amazon, Google and Apple had all failed to disclose this practice to the public until forced to do so.
Apple’s customer-facing privacy documentation does state that some data is shared with third parties “using encrypted protocols”, but it does not tell users that their recordings may be listened to by a stranger.
The Verge pointed out that human review is necessary to determine whether Siri is being triggered by genuine requests or by false positives, as smart assistants cannot tell the difference on their own, and “if they could, it wouldn’t be a false positive”.
The Guardian also noted that while Amazon and Google let users opt out of some uses of their recordings, Apple does not. The only way to be entirely sure that a third party will not hear your Siri recordings is to disable the feature altogether.
Apple publicly tackled a separate privacy concern last week, when it re-enabled the Apple Watch app Walkie Talkie after resolving a security issue within the app that allowed eavesdropping.