Instagram announces ban on graphic self-harm images

11 Feb 2019

Instagram app on mobile. Image: AllaSerebrina/Depositphotos 

Instagram says it will ban graphic depictions of self-harm from its platform, following pressure from UK legislators.

Photo-sharing platform Instagram has announced it will ban all graphic self-harm images, one of a number of measures introduced following criticism in the aftermath of UK teenager Molly Russell’s death.

Last week, the company said it would remove graphic images or videos of self-harm, such as cutting, following extensive discussions with mental health experts.

Bereaved father criticised social media platforms

Ian Russell, father of Molly, had campaigned for the platform to improve its monitoring of such content since his daughter took her own life in 2017.

In Russell’s view, some social media firms were partially to blame for his daughter’s death. Family members found content relating to self-harm and suicide when they looked at her Instagram account after she died.

A few weeks ago, Russell told the BBC: “Some of that content seemed to be quite positive, perhaps groups of people who were trying to help each other out … but some of that content is shocking in that it encourages self-harm [and] it links self-harm to suicide.”

Instagram head announces new measures

After growing pressure from UK legislators, including health secretary Matt Hancock, Instagram head Adam Mosseri admitted the firm had not done enough in this area.

Following a meeting with Hancock and other social media firms, Mosseri announced a number of changes in an op-ed for The Telegraph published on 4 February, writing: “We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable.”

As well as this, Instagram said it would remove non-graphic images of self-harm from the most visible parts of its app, such as the Explore feed, searches and hashtags. The platform is also adding sensitivity screens to content that involves cutting, requiring users to tap through before they can view it.

Mosseri said: “We are not removing this type of content from Instagram entirely, as we don’t want to stigmatise or isolate people who may be in distress and posting self-harm related content as a cry for help.”

Mosseri noted that the platform mostly relies on community reporting of offending posts, but added that Instagram will be investing in technology to improve the flagging of these images before they reach users.

“Nothing is more important to me than the safety of the people who use Instagram,” he added. “I have a responsibility to get this right. We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.”

Speaking on BBC Radio 4, UK digital minister Margot James said the government would “have to keep the situation very closely under review to make sure that these commitments are made real – and as swiftly as possible”.


Ellen Tannam was a journalist with Silicon Republic, covering all manner of business and tech subjects.

editorial@siliconrepublic.com