Meta will blur nude images in Instagram DMs for under-18s

11 Apr 2024

Image: Meta

A host of new tools and features will encourage people ‘to think twice’ before sending nude images or engaging with people in a sexually explicit manner on the app.

Meta has announced a new nudity protection feature on Instagram that will blur any images containing nude content sent to users under the age of 18.

In an announcement today (11 April), Meta said it is introducing a set of new measures aimed at making it more difficult for potential scammers and criminals to interact with teens, particularly those attempting sextortion, a form of blackmail in which perpetrators threaten to share sexually explicit content obtained from the victim.

“While people overwhelmingly use DMs to share what they love with their friends, family or favourite creators, sextortion scammers may also use private messages to share or ask for intimate images,” the Instagram parent company wrote.

To help address this, Meta said it will soon start testing the new nudity protection feature in Instagram DMs that will blur any image detected to contain nudity. The objective is to encourage people “to think twice” before sending nude images.

“This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Meta added.

The feature will be turned on by default for all Instagram users under the age of 18, while a notification will encourage adults to turn it on. When activated, nudity protection means anyone sending a nude image will see a message reminding them to be “cautious” when sending sensitive photos.

“Anyone who tries to forward a nude image they’ve received will see a message encouraging them to reconsider,” Meta said.

“When someone receives an image containing nudity, it will be automatically blurred under a warning screen, meaning the recipient isn’t confronted with a nude image and they can choose whether or not to view it. We’ll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat.”
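Meta has not published any technical details of how this works, but the behaviour described above maps onto fairly simple client-side logic. The Python sketch below is purely illustrative: the detect_nudity classifier, its threshold and every other name here are hypothetical placeholders, not Meta’s actual API.

```python
from dataclasses import dataclass

# Illustrative sketch only; Meta has not disclosed its implementation.
NUDITY_THRESHOLD = 0.8  # hypothetical confidence cutoff


def detect_nudity(image_bytes: bytes) -> float:
    """Stand-in for the unpublished on-device classifier; returns a score in [0, 1]."""
    raise NotImplementedError


@dataclass
class User:
    age: int
    nudity_protection: bool = False


def apply_default_settings(user: User) -> None:
    # Per the article: on by default for under-18s; adults see a prompt to enable it.
    if user.age < 18:
        user.nudity_protection = True


def handle_incoming_image(recipient: User, image_bytes: bytes) -> str:
    """Decide how an incoming DM image is presented to the recipient."""
    if recipient.nudity_protection and detect_nudity(image_bytes) >= NUDITY_THRESHOLD:
        # Blurred under a warning screen; the recipient can choose to view it,
        # block the sender or report the chat.
        return "blurred_with_warning"
    return "shown_normally"
```

Running such a check on the recipient’s device rather than on a server is what would allow it to operate even inside encrypted chats, a point echoed in the statement below.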

Users will also be redirected to safety tips developed by experts about the potential risks involved in engaging with or sharing nude content. These include reminders that the photo may be screenshotted and forwarded without consent – or that a person’s relationship with the other user may change in the future.

“Companies have a responsibility to ensure the protection of minors who use their platforms,” said John Shehan, senior vice-president at the US-based National Center for Missing & Exploited Children.

“Meta’s proposed device-side safety measures within its encrypted environment are encouraging. We are hopeful these new measures will increase reporting by minors and curb the circulation of online child exploitation.”

Meta has long faced criticism from various groups for not doing enough to keep children safe across its family of apps, including Instagram and Facebook.

In 2021, whistleblower Frances Haugen, a former product manager at Facebook, revealed herself as the source of thousands of leaked internal documents that alleged a lax approach to teen safety at the company and claimed that Meta puts profit ahead of the public good.

Vish Gain is a journalist with Silicon Republic

editorial@siliconrepublic.com