Apple postpones new features to detect child sexual abuse material

3 Sep 2021


One month after announcing a new suite of CSAM features, Apple has put the brakes on rolling out the new technology.

Apple has made the decision to delay the roll-out of new features for the detection of child sexual abuse material (CSAM) following backlash from privacy advocates.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a statement shared with a number of news publications.

Apple announced the new features early last month with the intention of supporting child protection and limiting the spread of CSAM on its services, such as iCloud and Messages.

This included the introduction of barriers around searching for CSAM, parental alerts for explicit images on a child’s phone, and alerts to law enforcement if CSAM was detected in a user’s iCloud photos.

The search barriers would be introduced through new resources within Siri and Apple’s search functions.

The parental controls feature would detect and censor explicit images sent to users under the age of 13 on a family account. Apple said the images would be analysed via on-device machine learning and that the company would not get access to the messages.

The feature for CSAM detection, however, was the most technical of those proposed. This would use Apple’s new NeuralHash technology to flag material uploaded to iCloud that matched known CSAM in the database of the US National Center for Missing and Exploited Children (NCMEC).

Again, this process would be carried out on the user’s device. Each image would be translated into an unreadable string of letters and numbers, known as a hash, which would then be checked against a stored database of hashes from NCMEC and other child safety organisations.
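To make that flow concrete, the sketch below shows the general hash-then-look-up pattern in simplified Python. It is not Apple’s NeuralHash: NeuralHash is a learned perceptual hash designed so that visually similar images produce the same value, whereas this stand-in uses an ordinary cryptographic hash and a plain in-memory set, and the file names and example hash list are hypothetical.

```python
# Illustrative sketch only: an ordinary cryptographic hash stands in for
# Apple's NeuralHash, purely to show the on-device
# "hash the image, then check it against a list of known hashes" flow.
# The file names and the example hash set below are hypothetical.

import hashlib
from pathlib import Path

# Hypothetical database of known hashes (in the proposed system, supplied by
# NCMEC and other child safety organisations).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def image_hash(path: Path) -> str:
    """Translate an image file into an unreadable string of letters and numbers."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def matches_known_material(path: Path) -> bool:
    """Check the image's hash against the stored database of hashes."""
    return image_hash(path) in KNOWN_HASHES


if __name__ == "__main__":
    example = Path("upload.jpg")  # hypothetical photo queued for iCloud upload
    if example.exists():
        print("match" if matches_known_material(example) else "no match")
```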

While Apple attempted to assure its detractors that these measures would be privacy-preserving, concerns were raised that they could open a backdoor to widespread surveillance and content monitoring.

An open letter signed by industry figures such as Edward Snowden urged Apple not to go ahead with the new features.

As a tech service provider, Apple has come to pride itself on privacy and has used this position to differentiate itself from its competitors. In defending the CSAM detection tools, the company claimed its proposed features were more privacy-preserving than other tools in use by companies such as Google and Facebook.

Yet the company has spent the past month pushing back with assurances that its new tech would not be used for surveillance.

Owing to the backlash, Apple SVP Craig Federighi admitted that the company had caused confusion with its announcement of the CSAM detection features and that the messaging – and the company’s best intentions – got “jumbled”.

Elaine Burke is the host of For Tech’s Sake, a co-production from Silicon Republic and The HeadStuff Podcast Network. She was previously the editor of Silicon Republic.

editorial@siliconrepublic.com