Fourteen-year-old Molly Russell viewed images of self-harm and suicide on social media in the months before she died. A coroner has now concluded that the content was “unsafe” and that it contributed to her death.
“I’ve said it before and I’ll say it again and continue to say it, that I have no doubt that Instagram helped kill my daughter.”
Ian Russell speaking to Tortoise’s Imy Harper
This is Ian Russell.
In 2017, his daughter Molly killed herself. She was 14 years old.
Her parents describe her as a forward-looking and positive child.
In 2021, Ian Russell told Tortoise that, at first, they couldn’t understand what had happened to her.
“We didn’t understand how she could have found herself in such a place that she would contemplate ending her life. And we didn’t understand how she’d managed to keep her anguish hidden from her friends and her family so well. And that’s what led us to look into her social media accounts, on which we found horrifying content. It was obvious to us as a family, obvious to me as a parent that had profoundly affected how she was feeling.”
Ian Russell speaking to Tortoise’s Imy Harper
Molly’s parents found that she had been spending time on Instagram and Pinterest, and engaging with posts relating to depression, self-harm and suicide.
They think that once Molly started clicking on that sort of content, she fell down a rabbit hole. According to them, the social media algorithms started to recommend more and more harmful posts.
Since her death, Ian Russell has been campaigning for better regulation of social media. He says tech companies have a responsibility to monitor their content and their algorithms.
“From what I’ve seen, there’s a definite link between depression, self-harm, anxiety, eating disorders, suicidal ideation and what young people find online. And I think we need to do things – as much as we possibly can – to make the digital world a safer space.”
Ian Russell speaking at a Tortoise ThinkIn
Earlier this month, almost five years after Molly died, an inquest into her death began.
And while it wasn’t a criminal trial or a civil lawsuit, it was still a landmark inquiry.
“This is the first time ever that social media giants have been forced to speak under oath in court about whether or not their products contributed to the death of a child. It’s also the very first time that we as journalists have been able to see the content that Molly was engaging with before her death. And we heard today that out of the more than 16,000 posts that Molly saved, shared, or liked on Instagram in the six-month period before her death, more than 2,000 were depression, self-harm, or suicide related.”
Channel 4
The question at the centre of this inquest was whether the content Molly saw was “unsafe”, and at the end of last week the coroner delivered his conclusion.
***
The exchanges in court were heated.
Dr Navin Venugopal, a child psychiatrist, told the court that he had struggled to sleep for weeks after reviewing some of the “very disturbing and distressing” content that Molly had seen.
Her family’s lawyers also questioned representatives from social media companies.
“Today, an executive from the company defended its policy, but in a heated exchange at the inquest into Molly’s death, her family’s lawyer asked, ‘Why on earth do you do it?’ He shouted, ‘You are not a parent. You are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this.’”
Sky News
Elizabeth Lagone, head of wellbeing at Meta, Instagram’s parent company, apologised, saying that Molly had seen some content that violated the policies Instagram had in place in 2017.
But she also said that monitoring social media content is an evolving field, and she defended people’s right to post expressions of suicidal and depressive thoughts, because those who post might find solace in an online community.
Crucially, she also said the content that Molly had seen before she died was “safe” for children.
“In court, Meta described much of that content as nuanced and complex to deal with, and said it had expanded its policies since Molly’s death to include graphic content and fictional depictions of suicide and self-harm.”
Sky News
After coverage of Molly’s death, Meta changed its community guidelines. Elizabeth Lagone told the inquest that she’s proud of where Instagram’s policies are today.
But for Ian Russell, that isn’t enough.
“Change is all too slow. And while this is happening, in the UK alone official figures show that there are something like four school-aged children who end their own lives every week. There are four families like ours, four groups of friends like Molly’s friends, that have to put up with this and go through this, and whose lives will be forever different.”
Ian Russell speaking at a Tortoise ThinkIn
After two weeks of evidence, the coroner, Andrew Walker, concluded that it was likely that social media had contributed to Molly’s death. He said the content she viewed “was not safe” and “shouldn’t have been available for a child to see.”
He’ll now write a prevention of future deaths report outlining his concerns, and send it to Meta and Pinterest, as well as the government and Ofcom.
His conclusion will only strengthen Ian Russell’s view that social media companies need to change.
“I would say that if we move fast and mend things, to paraphrase the famous Facebook quote, and by moving fast, I mean doing it sensibly in a pace that will allow it to happen in a controlled way, then there is hope for the future.”
Ian Russell speaking at a Tortoise ThinkIn
This episode was written and mixed by Patricia Clarke.
Further listening

Secrets and lies
Two young children make horrific allegations in a London police station, lighting the fuse on one of the most serious British conspiracy theories in decades