UK

Police probe online ‘Suicide Squad’ as three girls are rescued… but Instagram won’t shut down teens’ ‘death chat’ because it doesn’t break their rules

  • Girls aged between 12 and 16 created a ‘pact to die by suicide’ 
  • They were part of a chat group that described its participants as a Suicide Squad
  • The girls were found by officers in the street after being reported missing 

Three missing girls were found ‘seriously unwell’ in the street after they created a ‘pact to die by suicide’, it has emerged.

The shocking details have been revealed by police investigating an online chat group that led to ‘suicidal crises’ among a young peer group.

Twelve girls, aged 12 to 16 from across the South of England, were part of a disturbing Instagram chat group that described its participants as a ‘Suicide Squad’.

Its existence was discovered when three of them travelled by train to Chingford, East London, and were found by officers in the street after being reported missing.

They were taken by ambulance to hospital for emergency treatment and one of the girls revealed they had first met online and created the pact. The girls’ phones were seized following the incident on March 1 and officers were able to trace another nine youngsters who were part of the same chat group.

Seven of the 12 girls had already self-harmed before they were identified by British Transport Police (BTP) officers.

Police officers from five forces have been part of an investigation into the incident, which was revealed after a briefing note was published online by a local authority.

The note, adapted from a letter sent by BTP’s Assistant Chief Constable Charlie Doyle and first reported by the BBC, said that ‘peer-to-peer influence increased suicidal ideation [formulation of ideas] among the children involved to the extent that several escalated to suicidal crises and serious self-harm’.

The memo also warned that ‘regrettably, it is likely that a similar scenario may come to notice in the future’. Shockingly, Facebook, which owns Instagram, said it had reviewed the reports but found no content that broke its rules, meaning the chat group has not been taken down.

Last night Dr Bernadka Dubicka, chair of the child and adolescent faculty at the Royal College of Psychiatrists, hit back at the social media firm and said its ‘rules are not good enough’.

‘They can’t be setting their own rules, it needs to be regulated,’ she said. ‘They need to be doing a lot more, acting a lot quicker and thinking of harmful effects on children and young people instead of their shareholders first.

‘If they feel they’ve not been breaking their own rules then their rules are not good enough.

‘It’s good that the police are investigating this but it’s the tip of the iceberg. It’s so easy to get access to suicide references on the internet.’

It is understood a second social media page, which reportedly encouraged harmful behaviour, is subject to a police investigation and has been removed. In November 2020, Instagram introduced technology to recognise self-harm and suicide content on its app.

Fears about the impact of this content on young and vulnerable people have been raised since 2017, when 14-year-old schoolgirl Molly Russell killed herself after viewing graphic images on the platform.

A Facebook spokeswoman said: ‘Mental health, suicide and self-harm are serious issues with devastating consequences, and our deepest sympathies are with anyone affected by them. We are co-operating with the police on this important investigation and reviewed reports but found no content that broke our rules, nor in fact any suicide or self-harm related content.’

A spokesperson for BTP said they were called in the early hours of March 1 ‘following concern for the welfare of three teenage girls’ who were taken to hospital before being discharged. They added: ‘An investigation is currently ongoing into a report of a social media page encouraging harmful behaviour, which is also believed to be linked to a second concern for welfare incident in London later that day. The page has now been deleted, and these are believed to be localised incidents.’

Anyone can contact Samaritans FREE any time from any phone on 116 123, even a mobile without credit. This number won’t show up on your phone bill. Or you can email [email protected] or visit www.samaritans.org.
