Meta Restricts Certain Content for Teens Amid Mental Health Concerns
Meta Will Restrict Content For Teens Around Self-Harm, Eating Disorders Amid Child Safety Lawsuits
Meta said Tuesday it is adding protections to teen users' accounts that hide age-inappropriate content from search results and Explore pages on Instagram and Facebook, and prompting teens to update their privacy settings, amid lawsuits from states over child safety and an upcoming Senate hearing.
Meta will hide suicide and eating disorder content from teens as government pressure mounts
Meta is restricting teens from viewing content that deals with topics like suicide, self-harm, and eating disorders, the company announced today. The content, which Meta says may not be “age appropriate” for young people, will not be visible even if it’s shared by someone a teen follows.
Instagram tightens its teen policy: Meta-owned app will now automatically hide content related to suicide
More than six years after the tragic death of Molly Russell, Instagram is finally hiding posts that could pose serious harm to children.
The Meta-owned app is blocking posts related to suicide, self-harm, eating disorders and other 'types of age-inappropriate content' for users under 18.
Anyone aged between 13 and 17 will automatically get the block on Instagram as well as Facebook and won't be able to turn it off, although the restriction will lift once they turn 18.