Instagram Removes Fictional Depictions Of Self-Harm And Suicide

Instagram announced on Sunday that it’s banning drawings, memes, videos, and comics that portray self-harm and suicide, as well as “other imagery that may not show self-harm or suicide, but does include associated materials or methods.” This adds to its preexisting policy of prohibiting content that promotes or encourages self-harm and suicide.

Advocates have implored the platform to take stronger measures, especially after the well-publicised death of 14-year-old Molly Russell. In January of this year, her father publicly spoke about finding images of depression, self-harm, and suicide on her social media feeds, despite her having displayed “no obvious signs” of preexisting mental health issues. Moved by Russell’s story, UK health secretary Matt Hancock addressed a letter to Instagram, Facebook, Twitter, Google, Snapchat, Pinterest, and Apple urging them to remove graphic and triggering content. “It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people,” he wrote. “It is time for internet and social media providers to step up and purge this content once and for all.”

In February, Instagram head Adam Mosseri responded to the pressure by announcing in the Telegraph that the platform would launch “sensitivity screens” blurring images of self-harm, which users could click through to view. The company also updated its policy to remove all graphic images of self-harm, restrict the visibility of non-graphic self-harm content such as healed scars, and direct users who search for self-harm- and suicide-related content to support networks.

Subsequently, some users found that their images of healed scars accompanying stories of recovery had been blurred or removed, prompting the hashtag #youcantcensormyskin and raising the question of how to promote positive recovery stories while removing triggering images.

Data on such imagery is “limited and complex,” Samaritans, a UK-based charity and crisis hotline, noted in a statement to Gizmodo. The statement continues:

While we know that some content can be helpful and supportive, there is also evidence that content can be harmful, and that some imagery can glorify, sensationalise and normalise self-harm. This is very concerning, as we know that self-harm is a strong risk factor for future suicide. We also know that in recent years both the self-harm and suicide rates in young people have been rising, which is incredibly worrying.

Samaritans goes on to say that it will collaborate with social media platforms, including Instagram, and hear from creators in order to “maximise supportive content and minimise harmful content.”

Mosseri touched on that fine line in the policy announcement:

We understand that content which could be helpful to some may be harmful to others. In my conversations with young people who have struggled with these issues, I’ve heard that the same image might be helpful to someone one day, but triggering the next.

Some good examples of helpful content can be found on the Trevor Project’s Instagram, which features messages of hope and support from everyone from regular users to Jonathan Van Ness. The account welcomes people who’ve received support to share their stories without graphic imagery.

If you or someone you know is having a crisis, please call Lifeline on 13 11 14.
