Mosseri announced that the Facebook-owned photo-sharing platform would no longer "allow any graphic images of self-harm, such as cutting on Instagram - even if it would previously have been allowed as admission". But it stopped short of removing all self-harm content altogether, saying such posts can actually help some people.
Instagram finally changed its view and said it had chosen to ban the sickening images following a "comprehensive review with global experts".
"We are not removing this type of content from Instagram entirely, as we don't want to stigmatise or isolate people who may be in distress and posting self-harm related content as a cry for help", he explained.
"We need to do more to consider the effect of these images on other people who might see them," he said. "We have never allowed posts that promote or encourage suicide or self-harm, and will continue to remove them when reported."
Instagram's boss, Adam Mosseri, has announced that all graphic images of self-harm will be removed from the social media platform following the death of 14-year-old Molly Russell, who took her own life in 2017.
However, not everyone is keen on Instagram's proposed changes.
Britain on Friday urged all social media platforms to join Instagram and curb self-harm posts after a United Kingdom teen who went online to read about suicide took her own life.
Jo Robinson: It's not simple, no, and as a parent myself, my heart goes out to any parent who's lost a child to suicide - it really does. But suicide and self-harm are terribly complex behaviours.
"It's the kind of thing that hits you in the chest and sticks with you," he said.
I think it's been a real challenge for social media companies actually to know how to respond properly and responsibly when young people share images and content about suicide and self-harm. So they've had to be very careful about simply shutting down conversations and removing content, and making people feel worse than they did in the first place. Instagram is also planning to blur self-harm content and put it behind a privacy screen so users cannot come across it accidentally.
Instagram's aim is to eliminate graphic self-injury and suicide-related imagery and significantly reduce the visibility of related content in the service's discovery features while remaining a supportive community, according to Mosseri. Britain's health secretary, meanwhile, said "We need to be led by what the clinicians and experts say need to be taken down", and that the government is prepared to legislate if necessary.
Molly's father, Mr Russell, was buoyed by Instagram's commitment, and in an interview with the BBC called on other social media platforms to follow suit.
"It is now time for other social media platforms to take action to recognise the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people", he told the BBC.
"What really matters is when children are on these sites they are safe".
If you are struggling and need to talk, you may benefit from seeking professional support.