It’s Instagram’s response to the public outcry over the death of British teenager Molly Russell, who killed herself in 2017 after viewing graphic content on the photo-sharing platform.
“We have expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery.
“We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods,” Adam Mosseri, Head of Instagram, wrote in a blog post on Sunday.
According to Instagram, nothing is more important to it than the safety of people who use the platform, particularly the most vulnerable.
“Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like ‘Explore’. And we’ll send more people more resources with localized helplines, like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the US,” Mosseri said.
After Russell’s death, her family discovered that Instagram and Pinterest had “suggested” disturbing posts to her about anxiety, depression, self-harm, and suicide, according to reports.