Content on social media sites, including Instagram and Pinterest, is “likely” to have contributed to the death of British teenager Molly Russell, who took her own life after viewing thousands of posts about suicide, depression and self-harm, a coroner has ruled.
The result marks a reckoning for social media platforms, as legislators around the world grapple with how to make the internet safe for children, and will put renewed pressure on apps used by young people.
Delivering his conclusions almost five years after Russell’s death in November 2017 when aged 14, senior coroner Andrew Walker said she died from “an act of self-harm whilst suffering from depression and the negative effects of online content”.
Russell had engaged with 2,100 depression, suicide or self-harm posts on Meta-owned Instagram in the six months before she took her life and interacted with such content on all but 12 days over that period, unbeknown to her family.
Social media sites had not been “safe” at the time of Russell’s death, said Walker, adding that it was “likely that the materials used by Molly, already suffering from a depressive illness and vulnerable due to her age, affected her mental health in a negative way, and contributed to her death in a more than minimal way”.
Walker chose not to record a conclusion of suicide because of the severity of Russell’s mental illness and the fact that online content had “normalised her condition”. Posts Russell saw portrayed suicide as “an inevitable consequence of a condition that could not be recovered from”, said Walker.
The design of the platforms’ algorithms meant Russell was exposed to certain content without seeking it out, said Walker. Russell had been able to “binge” harmful videos, images and clips “some of which were selected and provided to [her] without Molly requesting them”, he said. Those “binge periods” were “likely to have had a negative impact on Molly”, he added.
“At the time these sites were viewed by Molly some of [them] were not safe as they allowed access to adult content that should not have been available for a 14-year-old to see,” said Walker.
The coroner said on Thursday: “It used to be the case that when a child came through the front door of their home they did so to a place of safety . . . where dangers were kept to a minimum, if at all.
“What we did with the internet was [bring] into our homes a source of risk and we did so without appreciating the extent of that risk.”
The coroner will prepare a report in the coming weeks aimed at preventing future deaths, which will be sent to Pinterest and Instagram. He indicated on Thursday that he believed children and adults needed separate social media sites.
The inquest heard that Russell was emailed by Pinterest “10 depression pins you might like” in the weeks after her death and that Instagram had suggested accounts to her linked to depression and self-harm.
The inquest heard clashes between the Russell family’s barrister Oliver Sanders KC and Meta’s head of wellbeing Elizabeth Lagone, who defended the company’s view that certain self-harm related content was “safe”.
Meta said it had banned all graphic self-harm and suicide content in 2019 and had always removed posts that encouraged or promoted it.
Pinterest’s head of community Jud Hoffman acknowledged during the inquest that even with upgrades to technology, the site remained “imperfect” and admitted it had not been safe at the time Russell used it.
Sanders told the coroner on Thursday that 14-year-old teenagers like Russell were in the “eye of the storm” of the potential damage caused by the rapid growth of social media sites.
Both Instagram and Pinterest have strengthened their policies since Russell’s death in 2017.
Meta said it was “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers” and “will carefully consider the coroner’s full report”.
Pinterest said: “Combating self-harm content is a priority for us as we work to ensure that Pinterest plays a positive role in people’s lives . . . Molly’s story has reinforced our commitment to create a safe and positive space for our [users].”
Anyone in the UK affected by the issues raised in this article can contact the Samaritans for free on 116 123.