TEEN ACCOUNTS FOR FACEBOOK AND MESSENGER

Source: https://about.fb.com/ (teen accounts)

Meta made a significant move to safeguard younger users by bringing Instagram’s teen safety features to Facebook and Messenger. The change, going live worldwide in April 2025, brings more stringent privacy defaults, AI-driven content monitoring, and improved parental controls.

This change arrives as regulatory pressure mounts in the U.S. and EU, where politicians are urging tech giants to make the internet a safer place for children. But will it be enough? Here’s a rundown of what’s new—and what parents and teens need to know.

MAJOR CHANGES TO TEEN ACCOUNTS

1. Stricter Privacy Defaults

New teen accounts (under 18) will automatically have:

  • Posts set to “Friends only” (previously public by default).
  • DMs restricted to “Close Friends” (strangers can’t message teens).

Existing teen accounts will receive prompts to enable these settings.

Why it matters: Reduces exposure to strangers and unwanted contact.

2. AI-Backed Safety Measures

Meta is using machine learning to:

  • Hide suspicious adult accounts from teen searches/recommendations.
  • Flag high-risk behavior, such as adults sending bulk friend requests to minors, or teens posting self-harm or age-inappropriate content.
  • Scan messages for harmful content (even in Messenger’s encrypted secret chats).

Controversy: Privacy advocates worry about overreach in monitoring private chats.

3. Enhanced Parental Controls

Parents can now use Meta’s Family Center to:

  • Set daily screen time limits (with school-night/weekend modes).
  • Receive alerts if their teen blocks or reports someone.
  • Monitor app usage across Instagram, Facebook, and Messenger.

Catch: Tools are only available in 30+ countries at launch.

WHY IS META MAKING THESE CHANGES?

Mounting legal pressure and rising concern for minors online left Meta with little choice but to extend these protections.

  • U.S. Laws: California’s Age-Appropriate Design Code Act (effective July 2025) requires stricter child protections.
  • EU’s Digital Services Act (DSA): Fines companies that fail to safeguard minors.
  • Competitive Landscape: Rivals like TikTok and Snapchat have also rolled out teen safety features.
  • Public Backlash: Past lawsuits accused Meta of harming teen mental health.

LIMITATIONS & CONCERNS

  • Teens Can Override Settings: Protections are default-only—minors can manually disable them.
  • No Age Verification: Meta relies on self-reported ages, making it easy for kids to bypass restrictions.
  • Privacy Trade-Offs: AI scanning encrypted messages raises surveillance concerns.

PREVENTIVE MEASURES THAT PARENTS CAN TAKE

  • Enable Family Center to monitor activity.
  • Discuss online safety with teens—defaults aren’t foolproof.
  • Report suspicious accounts via Meta’s tools.

FINAL VERDICT

Meta’s expansion of teen safety features is a welcome change, but its effectiveness depends on:

  • Teens not disabling protections.
  • Better age verification.
  • Transparency around AI monitoring.

For now, parental vigilance remains crucial.