Apple to launch child safety feature in the UK that scans images for nudity

21 Apr 2022

Image: © guteksk7/Stock.adobe.com

Apple has made a number of changes to its initial proposal for CSAM detection tools, which faced backlash due to privacy concerns.

Apple has announced that a new safety feature that scans images sent to and from children will soon be available on devices in the UK, The Guardian reported.

The feature – called ‘communication safety in Messages’ – is designed to warn children using Apple’s Messages app when they receive or send photos that contain nudity. It was launched in the US last December.

If the AI technology detects nudity, the image is blurred and the child is warned about the potential content, while being presented with options to message someone they trust for help.

An example of the feature when it detects an image with nudity: iPhones showing a blurred photo and warnings about sensitive content. Image: Apple
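The mechanics Apple has described – an on-device check, a blur, then a warning – can be pictured with a short, hypothetical Swift sketch. Apple has not published the model or code behind the feature, so the classifier name (‘NudityClassifier’), the label and the confidence threshold below are assumptions; the sketch only shows what an on-device check of this kind could look like using Apple’s Vision and Core ML frameworks.

```swift
import UIKit
import Vision
import CoreML
import CoreImage

// Illustrative sketch only: Apple's actual implementation inside Messages is private.
// "NudityClassifier" is a placeholder for a bundled Core ML image classifier.
func screenIncomingImage(_ image: UIImage, completion: @escaping (UIImage) -> Void) {
    guard let cgImage = image.cgImage,
          // Placeholder model class as generated from a .mlmodel file; not a real Apple asset.
          let mlModel = try? NudityClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        completion(image)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let observations = request.results as? [VNClassificationObservation] ?? []
        // Treat the image as sensitive if the top label is "nudity" with high confidence.
        let flagged = observations.first.map { $0.identifier == "nudity" && $0.confidence > 0.8 } ?? false
        completion(flagged ? blurred(image) : image)
    }

    // The request runs entirely on the device; no image data is sent anywhere.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}

// Blur a flagged image with Core Image before it is shown to the child.
func blurred(_ image: UIImage) -> UIImage {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIGaussianBlur") else { return image }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(25.0, forKey: kCIInputRadiusKey)
    guard let output = filter.outputImage,
          let rendered = CIContext().createCGImage(output, from: input.extent) else { return image }
    return UIImage(cgImage: rendered)
}
```

Keeping both the classification and the blurring on the device is what allows the result to stay private, a point Apple stresses in its own description of the feature below.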

Apple initially proposed a set of tools for the detection of child sexual abuse material (CSAM) last August, but the roll-out was postponed following a backlash from critics.

Alongside alerts for explicit images on a child’s phone, Apple also planned to introduce barriers around searching for CSAM and to alert law enforcement if CSAM was detected in a user’s iCloud photos.

While Apple said the measures would be privacy-preserving, concerns were raised as to how they could open a backdoor into widespread surveillance and monitoring of content.

Apple has made a number of changes to its image scanning feature since then. The initial announcement said parents of users under the age of 13 would automatically be notified if explicit images were detected, but that is no longer mentioned in the update.

The communication safety feature is also switched off by default and has to be turned on by parents.

“Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” Apple said about the feature. “The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”
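Taken at face value, that description implies the detection result is consumed purely by local UI state. A minimal SwiftUI sketch of such a flow, under that assumption, might look like the following – the view, its properties and button labels are illustrative, not Apple’s code – with no network call or parental notification anywhere in the path.

```swift
import SwiftUI
import UIKit

// Hypothetical local-only flow: the on-device flag simply drives the UI,
// and no report or notification is sent to Apple, a parent or anyone else.
struct IncomingPhotoView: View {
    let photo: UIImage
    let flaggedAsSensitive: Bool      // result of the on-device analysis
    @State private var revealed = false

    var body: some View {
        VStack(spacing: 12) {
            Image(uiImage: photo)
                .resizable()
                .scaledToFit()
                .blur(radius: flaggedAsSensitive && !revealed ? 30 : 0)

            if flaggedAsSensitive && !revealed {
                Text("This photo may contain nudity.")
                Button("Message someone you trust") {
                    // Hands the choice to the child; nothing is reported automatically.
                }
                Button("View photo anyway") {
                    revealed = true
                }
            }
        }
        .padding()
    }
}
```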

Apple is also adding protections to Siri, Spotlight and Safari Search, which will intervene if a user searches for terms related to child exploitation.

“These interventions explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple said.

Siri will also help users who ask how to report CSAM content, directing them to resources on how to file a report.


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com