Meta announced today that it will notify parents if their teens search for content relating to self-harm or suicide.

Parents who have set up a child’s account through the app’s Teen Accounts feature will receive alerts following any search that suggests a teen may want to self-harm.

Notifications will be sent via text, email or WhatsApp, depending on the contact information available. Parents will also be able to view expert resources intended to help them approach sensitive conversations carefully.


In a blog post, Meta states: ‘These alerts are designed to give parents the information they need to support their teen.

‘We have strict policies against content that promotes or glorifies suicide or self-harm.’

The company is also working on similar notifications relating to conversations with AI.

‘We know teens are increasingly turning to AI for support. While our AI is already trained to respond safely to teens and provide appropriate resources, we are now building similar parental alerts for AI, which will be shared in the coming months.’


This comes amid government discussion of a social media ban for under-16s, which Australia has already implemented.

The NSPCC has proposed guidelines on how to help a child who may be struggling with anxiety, depression or suicidal thoughts. Suggestions include: talking to them over text or the phone if they’re unable to speak openly in person; creating healthy coping mechanisms such as yoga or mindfulness; and validating their feelings.

If you or someone you know is affected by the themes in this story, reach out to Samaritans on 116 123.