Meta announced plans to implement new privacy safeguards aimed at better shielding teens and minors from online content related to graphic violence, eating disorders, and self-harm. The policy update for both Instagram and Facebook, developed "in line with expert guidance," begins rolling out today and will be "fully in place… in the coming months," according to the company.
[Related: Social media drama can hit teens hard at different ages.]
All teen users' accounts will be automatically enrolled in the new protections (categorized as "Sensitive Content Control" on Instagram and "Reduce" on Facebook), and the same settings will be applied going forward to any newly created accounts of underage users. Accounts of users 18 and under will be unable to opt out of the content restrictions. Teens will also soon begin receiving semiregular notification prompts recommending additional privacy settings. Accepting these recommendations via a single opt-in toggle will automatically limit who can repost a minor's content, as well as restrict who is able to tag or mention them in their own posts.
"While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find," Meta explained in Tuesday's announcement. Now, search results related to eating disorders, self-harm, and suicide will be hidden for teens, with "expert resources" offered in their place. A screenshot provided by Meta in its newsroom post, for example, shows options to contact a helpline, message a friend, or "see suggestions from professionals outside of Meta."
[Related: Default end-to-end encryption is finally coming to Messenger and Facebook.]
Users currently must be at least 13 years old to sign up for Facebook and Instagram. In a 2021 explainer, the…