Parental controls fall short: Meta study shows teens still glued to social apps


Internal research at Meta suggests routine parental controls may not prevent teens from compulsively using social platforms — a finding that plaintiffs' attorneys in a high-profile Los Angeles trial are using to argue social networks contributed to youth harm. The study's conclusions, disclosed in court testimony, raise fresh questions about whether product design — not parental supervision — is the decisive factor in teen overuse.

How this surfaced in court

The study, known internally as Project MYST and conducted with the University of Chicago, was cited during testimony in a lawsuit filed by a teenage plaintiff identified in court as "KGM" and referred to as Kaley. She and family members allege that several major platforms built features that encourage excessive use and have left young users with anxiety, depression and other serious harms.

Kaley’s case names Meta, YouTube, ByteDance (TikTok) and Snap; the latter two reached settlements before the trial began. In Los Angeles County Superior Court this week, Kaley’s attorney relied on the internal Meta research to argue the company knew its products worsened risks for vulnerable teens yet did not issue warnings or change course.

What Project MYST found

The study surveyed roughly 1,000 teenagers and their parents to assess social media habits and perceived self-control. Key takeaways presented in court included:

  • Limited link between parental oversight and teen self-regulation: researchers reported minimal correlation between parental rules or monitoring and teens’ own reports of attentiveness to their social media use.
  • Agreement across respondents: both parents and teens in the survey tended to report the same lack of association between supervision and teens’ perceived ability to moderate use.
  • Higher vulnerability after adverse experiences: teens who had encountered more stressful or traumatic events—such as bullying or family instability—were likelier to report weaker attentiveness to their social media behavior.

If those conclusions hold, they suggest common tools like screen-time limits or in-app parental controls may not be sufficient to prevent compulsive engagement for some young people.

Conflicting interpretations on the stand

Instagram head Adam Mosseri testified he did not recall specifics about Project MYST beyond its name, though court documents indicate he had approved moving forward with the research. Mosseri also acknowledged one possible reason teens overuse platforms: many use social media to escape difficult real-world situations.

Meta’s lawyers emphasized that Project MYST measured teens’ feelings about their usage rather than proving clinical “addiction.” They argued the study was narrowly focused and pushed responsibility back toward family and environmental factors, noting elements of the plaintiff’s life such as parental divorce, bullying, and an abusive household.

By contrast, Kaley’s counsel framed the study as evidence that the platforms’ algorithmic feeds, notification systems and reward-based design mechanics actively exploit teen vulnerabilities — particularly for those already coping with trauma.

Notably, Project MYST’s findings have not been published publicly, and the company did not alert users or caregivers based on the work, testimony showed.

Why this matters now

The trial’s outcome could shape more than one family’s future: judges and juries will interpret how internal research influences legal responsibility and whether product design must change to better protect minors. A verdict against the companies could accelerate regulatory scrutiny and pressure platforms to alter recommendation engines, notification strategies and parental-control tools.

For parents and caregivers, the case signals a gap between available monitoring features and their effectiveness for some adolescents — particularly those facing stressful home or school environments. That suggests a need for broader approaches combining digital tools with mental health support and school- or community-based interventions.

Responding to requests for comment, Meta said the analysis did not demonstrate an effect of parental oversight on teens' behavior and stressed that caregivers repeatedly ask for monitoring tools, which the company continues to build. Meta also avoids labeling heavy use as "addiction," preferring terms such as "problematic use" to describe spending more time online than a person feels comfortable with.

How jurors weigh Project MYST and other internal studies alongside witness testimony will help determine whether platforms are held accountable for design choices that affect young users. The case continues in Los Angeles as both sides present expert evidence and argue over the balance between individual, parental and corporate responsibility.
