The move came after British Health Secretary Matt Hancock met with social media companies about doing more to safeguard the mental health of teenagers using their platforms.
British teenager Molly Russell was found dead in her bedroom in 2017. The 14-year-old had apparently taken her own life, and her Instagram account reportedly revealed she followed accounts related to depression and suicide.
“It is encouraging to see that decisive steps are now being taken to try to protect children from disturbing content on Instagram,” said the girl’s father, Ian Russell.
“It is now time for other social media platforms to take action to recognize the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people,” he added.
Instagram head Adam Mosseri said the platform has never allowed posts that promote or encourage suicide or self-harm. It will now also bar graphic images of self-harm, such as cutting, even in cases where such content would previously have been permitted as an admission of self-harm.
“We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help,” Mosseri said.
Instagram also plans to ramp up efforts to direct counseling and other resources to people who post or search for self-harm-related content.
Mosseri joined representatives from Facebook, Google, Snapchat, Twitter and other companies who met with Hancock to discuss the handling of content related to self-injury or suicide.