Police discover secret Instagram group where teenage girls as young as 12 planned suicide

Police have uncovered a secret Instagram group, whose name included the word ‘suicide’, involving twelve girls aged between 12 and 16 from across southern England, several of whom went on to experience “suicidal crises” and “serious self-harm”.

According to the BBC, police discovered the online group when three of the girls went missing and were found seriously unwell in London.

The girls had travelled by train to meet in London.

They were found seriously unwell in a street and taken by ambulance to hospital for emergency treatment.

According to the police briefing, one of the girls said they had first met one another online and discussed suicide.

Police officers then examined digital devices to identify the name of the online group and its other members.

Seven of the 12 girls had self-harmed before being traced by the police. Children’s social care services from seven different local authorities are involved in safeguarding children identified as members of the group.

Police said in a statement to the BBC that “peer-to-peer influence increased suicidal ideation amongst the youngsters involved to the extent that several escalated to suicidal crises and serious self-harm.”

Instagram says it found no evidence of its rules being broken, and that it uses artificial intelligence (AI) to find and block self-harm posts and groups.

Some of the youngsters had met on other social media platforms but were part of a closed Instagram group – a direct message thread – whose title used the words “suicide” and “missing”.

Facebook, which owns Instagram, does not deny that the name of the closed group referenced “suicide”, but says it has not been removed from the platform because the content of the messages does not break its rules.

In a statement, a company spokesperson said it had been co-operating with the police.

“We reviewed reports and found no evidence of the people involved breaking our rules around suicide and self-harm.

“We don’t allow graphic content, or content that promotes or encourages suicide or self-harm, and we remove it when we find it.

“We will continue to support the police and will respond to any valid legal request for information.”
