Headline: Instagram expands parental supervision to self-harm searches
Short headline: Parents to be notified by Instagram over searches
Standfirst: Instagram will alert parents if teens search for self-harm content, and sets its sights on AI.
Meta announced today that it will notify parents if their teens search for content relating to self-harm or suicide.
Parents who have set up a child’s account through the app’s Teen Accounts feature will receive an alert following any search suggesting a teen may be considering self-harm.
Notifications will be sent via text, email or WhatsApp, depending on the contact information available. Parents will also be able to view expert resources designed to help them approach sensitive conversations with care.
Mobile phone app logos for, from left, Facebook, Instagram and WhatsApp
In a blog post, Meta states: ‘These alerts are designed to give parents the information they need to support their teen. We have strict policies against content that promotes or glorifies suicide or self-harm.’
The company is also working on similar notifications for conversations with AI:
‘We know teens are increasingly turning to AI for support. While our AI is already trained to respond safely to teens and provide appropriate resources, we are now building similar parental alerts for AI, which will be shared in the coming months.’
The Instagram logo on a phone
The announcement comes amid government discussion of a social media ban for under-16s, a measure Australia has already implemented.
The NSPCC has published guidance on helping a child who may be struggling with anxiety, depression or suicidal thoughts. Suggestions include talking to them over text or the phone if they are unable to speak openly in person, encouraging healthy coping mechanisms such as yoga or mindfulness, and validating their feelings.
If you or someone you know is affected by the themes in this story, you can contact Samaritans on 116 123.