
Meta Expands Teen Safety Features to Facebook and Messenger in Major Update


Meta is taking a bigger step toward safeguarding young users online. The company is expanding its Teen Accounts program—initially introduced on Instagram—to now include Facebook and Messenger, bringing a more unified and controlled experience for teens and their guardians across its platforms.

This update is rolling out first in the US, UK, Australia, and Canada, with plans to expand globally “in the next couple of months.” While no exact timeline is given, Meta says the goal is to offer a safer and more positive digital experience for young users by default.

What’s Changing for Teens?

Under the new system, teens will see restrictions on who they can interact with, how long they can stay on the app, and what kind of content they can view or send. Here are some of the most notable features:

  • Instagram Live Restrictions: Teens under 16 will need parental permission to go live.

  • Night-Time Notifications Turned Off: Teens will no longer receive notifications at night, helping promote healthier app use and sleep habits.

  • 60-Minute Activity Reminders: The app will nudge teens to take a break after an hour of screen time.

  • Blurred Images in DMs: AI will automatically blur photos with suspected nudity, and teens will need parental approval to view them.

  • DM Limits: Teens can only message people they already follow, cutting down on unsolicited messages from strangers.

The AI Behind the Scenes

While Meta hasn’t provided a technical breakdown, the content moderation features—like the image blurring—appear to use AI-based detection built directly into the device. This includes monitoring for explicit images and other flagged content. If detected, the image is blurred, and the teen must request access through a parental control gate.
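To make that flow concrete, here is a minimal, purely illustrative sketch of how an on-device screening gate like this could work. This is not Meta's implementation: the classifier, threshold, and function names below are assumptions used only to show the general pattern of score, blur, then gate behind parental approval.

```python
# Illustrative sketch of an on-device image-screening gate.
# All names (classify_nudity, BLUR_THRESHOLD, etc.) are hypothetical,
# not Meta's actual code or API.

from dataclasses import dataclass


@dataclass
class ModerationResult:
    blurred: bool                # image hidden behind a blur overlay
    needs_parent_approval: bool  # teen must request access via parental controls


BLUR_THRESHOLD = 0.8  # assumed confidence cutoff for "suspected nudity"


def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for an on-device classifier returning a confidence score 0..1.

    A real system would run a local vision model here, so the image never
    has to leave the device for screening.
    """
    # Placeholder heuristic for demonstration only.
    return 0.0 if not image_bytes else 0.9


def screen_incoming_dm_image(image_bytes: bytes, is_teen_account: bool) -> ModerationResult:
    """Blur suspect images for teen accounts and route them to a parental gate."""
    score = classify_nudity(image_bytes)
    if is_teen_account and score >= BLUR_THRESHOLD:
        # Image stays blurred until a linked parent or guardian approves viewing.
        return ModerationResult(blurred=True, needs_parent_approval=True)
    return ModerationResult(blurred=False, needs_parent_approval=False)


if __name__ == "__main__":
    result = screen_incoming_dm_image(b"\x89PNG...", is_teen_account=True)
    print(result)  # ModerationResult(blurred=True, needs_parent_approval=True)
```

The key design point the sketch highlights is that the screening decision happens locally, before anything is shown to the teen, with the unblur action deferred to a separate parental-approval step.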

Success on Instagram Sets the Stage

Meta reports that the Teen Accounts program has already onboarded over 54 million active users globally, with the majority sticking to the built-in restrictions. According to a company survey:

  • 97% of teens stayed within Teen Account limits.

  • 94% of parents supported the program.

  • 85% believed it contributed to a better (or at least safer) social media experience.

This feedback has encouraged Meta to scale the approach across all its core platforms.

Why This Matters

Social media remains a double-edged sword for young people—connecting them to friends, ideas, and opportunities, while also exposing them to unwanted content and harmful behavior. Meta’s latest updates aim to give parents more oversight and teens more protection, without entirely blocking them from digital spaces.

Whether these changes will meaningfully shift the online experience for teens—or just serve as a PR shield for Meta—remains to be seen. But the move signals that Big Tech may finally be listening to long-standing concerns from parents, educators, and lawmakers.
