Meta CEO Mark Zuckerberg denied in a Los Angeles courtroom that Instagram targets children under 13, as he testified in a landmark trial examining social media’s impact on youth mental health.
The case centers on a California woman who alleges that Instagram and YouTube knowingly designed addictive features to hook young users, contributing to her depression and suicidal thoughts during childhood. She is seeking damages from both Meta and Google.
During cross-examination, plaintiff’s lawyer Mark Lanier confronted Zuckerberg with internal Meta documents referencing a “tween strategy,” including a 2018 presentation stating: “If we want to win big with teens, we must bring them in as tweens.”
Zuckerberg rejected the interpretation, reiterating that Meta does not allow users under 13 on its platforms. He acknowledged internal discussions about creating a child-safe version of Instagram but said such a product was never launched.
Age Verification and Revenue Impact
Jurors were shown internal communications, including an email from former Meta executive Nick Clegg stating that age limits were “unenforced (unenforceable?).”
Zuckerberg responded that age verification is technically difficult for app developers and argued that responsibility should also fall on device manufacturers.
He testified that teens account for less than 1% of Instagram’s revenue.
Allegations of Maximizing Screen Time
The CEO was also questioned about past internal emails outlining goals to increase user time spent on Instagram by double-digit percentages.
While acknowledging that earlier growth strategies included time-spent metrics, Zuckerberg stated that Meta’s current focus is on user experience rather than engagement maximization.
Jurors reviewed internal documents projecting an increase in daily average usage from 40 minutes in 2023 to 46 minutes in 2026. Zuckerberg described these figures as performance benchmarks rather than formal goals.
A Broader Regulatory Reckoning
The trial is part of a wider wave of litigation in the United States targeting major social media companies over alleged harm to minors.
Snap and TikTok settled with the plaintiff prior to trial.
A verdict against Meta or Google could significantly reshape the legal shield long provided to technology companies under U.S. liability protections, particularly as lawsuits shift focus from user-generated content to platform design decisions.
Globally, governments are tightening restrictions on youth access to social media. Australia has banned users under 16, while Florida has prohibited platforms from allowing users under 14 — legislation currently being challenged in court.
Why It Matters for CXOs
This case goes beyond content moderation. It targets platform architecture, engagement incentives, and product design decisions.
For digital leaders, the implications are clear:
- Platform design can become a liability risk
- Engagement optimization strategies may face regulatory scrutiny
- Age verification standards are likely to tighten globally
The outcome could redefine accountability standards for digital product leadership.