Instagram allowed self-harm images so people could ‘cry for help’, inquest hears

A senior Meta executive told a British inquest the company had allowed “graphic” images of self-harm on its Instagram site at the time a teenager died by suicide because it wanted to enable users to “cry for help”.

Molly Russell from Harrow, London, died in November 2017 after viewing a large volume of posts on sites such as Meta-owned Instagram, and Pinterest, related to anxiety, depression, suicide and self-harm.

Meta’s head of health and wellbeing policy, Elizabeth Lagone, told the North London coroner’s court on Friday that graphic images Instagram allowed users to share at the time of Russell’s death could have been “cries for help”, and that the platform wanted people to “seek community”.

“Graphic promotion or encouragement [of suicide or self-harm] was never allowed,” she said, but added that “silencing [a poster’s] struggles” could cause “unbelievable harm”.

She said the issues were “complicated” and that expert understanding had evolved in recent years.

The court was shown a series of video clips that Russell had liked or saved on Instagram before she died, including close-ups of individuals cutting their wrists with razor blades, which senior coroner Andrew Walker said were “almost impossible to watch”.

The clips included close-up shots of people self-harming, falling from buildings and swallowing handfuls of pills, often spliced with loud music and negative messages. It was unclear whether they displayed real events or were taken from film and TV.

Walker said the content “appears to glamorise harm to young people” and was “of the most distressing nature”.

Lagone said Instagram had changed its policy in 2019 after experts advised it that graphic self-harm imagery could encourage users to hurt themselves. The company previously removed posts that glorified, encouraged or promoted self-harm but not posts that could have enabled users to admit their struggles and support each other.

After Russell’s death, experts advised the company that “some graphic images . . . could have the potential to promote self-injury,” according to part of Lagone’s witness statement read out in court.

When asked if Meta had undertaken research into the impact of self-harm content on users, Lagone said she was not aware of any and that it would have been difficult to conduct. “The impact of certain material can affect people in different ways at different times . . . It’s really complicated,” she said.

Molly Russell’s father, Ian Russell, told the inquest this week that social media algorithms had pushed his daughter towards disturbing posts and contributed to her death. He told the court that “social media helped kill my daughter”.

Instagram had recommended accounts to Molly Russell that included some related to depression and suicidal feelings.

Molly Russell had also been recommended content about depression by Pinterest, the inquest heard this week, including “ten depression pins you might like”. She continued to receive emails from Pinterest after her death, including one entitled “new ideas for you in depression”.

On Thursday a senior Pinterest executive admitted to the inquest that the site had not been safe at the time of Molly Russell’s death and was still “imperfect”, in spite of updates to its rules.

When asked by the Russell family’s barrister, Oliver Sanders KC, if Instagram’s policies had been “inadequate” when Molly Russell died, Lagone said: “We concluded that we needed to expand the policies and we did so.”

The hearing comes as the passage through parliament of the online safety bill, which aims to compel internet companies to keep their platforms safe, has been paused. Liz Truss, the new prime minister, is said to be considering relaxing a clause, controversial among tech lobbyists, that would make platforms responsible for removing content that is “legal but harmful”, such as bullying.
