Lead: On Wednesday, March 25, 2026, a Los Angeles jury found Meta’s Instagram and Google-owned YouTube negligent in a first-of-its-kind lawsuit brought by a 20-year-old plaintiff identified as KGM (Kaley). Jurors concluded the platforms were designed in ways that helped create addictive usage patterns for minors and awarded $3 million in compensatory damages and a recommended $3 million in punitive damages. The jury apportioned responsibility 70% to Meta and 30% to YouTube; a judge will set the final monetary award and rule on any adjustments. The verdict follows a separate New Mexico jury decision earlier that week that also found Meta liable for harms to children’s mental health.
Key Takeaways
- A Los Angeles jury on March 25, 2026 found Meta and YouTube negligent, concluding each company’s conduct was a substantial factor in harm to plaintiff KGM, who is 20 years old.
- The jury awarded $3 million in compensatory damages and recommended an additional $3 million in punitive damages, totaling $6 million before judicial review.
- Liability was split 70% to Meta ($2.1 million of the recommended punitive damages) and 30% to YouTube ($900,000).
- Jurors deliberated for more than 40 hours; agreement by nine of the 12 jurors was required on each claim, and that threshold was met on seven claims against each company.
- Key testimony included the plaintiff’s account of heavy childhood use (YouTube from age 6; Instagram from age 9) and testimony from Meta executives Mark Zuckerberg and Adam Mosseri; YouTube CEO Neal Mohan did not testify.
- Claims targeted platform design elements—endless feeds, autoplay and notifications—while defendants pointed to user safety tools and alternative explanations for the plaintiff’s struggles.
- TikTok and Snap settled before trial; the Los Angeles ruling is viewed as a potential bellwether for thousands of pending suits nationwide.
Background
The lawsuit, filed in California, charged major tech platforms with designing features that encourage prolonged use by minors and failing to warn about associated risks. Plaintiffs’ lawyers focused on product mechanics—such as infinite scroll and autoplay—arguing these features were intentionally tuned to maximize engagement among young users. Defendants invoked platform protections, parental responsibility, and legal limits on liability for third-party content under Section 230 of the Communications Decency Act (1996).
Social and political pressure on platforms has increased as clinicians, lawmakers and families cite rising adolescent mental health concerns since the 2010s. Courts and regulators have begun to treat product design as central to harm questions, drawing analogies in public debate to past liability fights in industries such as tobacco. Prior to this trial, TikTok and Snap settled; two separate jury verdicts in the same week (including the New Mexico decision) added momentum to plaintiffs’ legal strategy.
Main Event
The jury trial in Los Angeles ran several weeks with testimony from experts, former employees, engineers and the plaintiff, KGM (who testified she used YouTube from age 6 and Instagram from age 9 and was on social platforms “all day long” as a child). Lawyers for the plaintiff demonstrated product features they said were designed to keep young users continuously engaged. Jurors were instructed not to base their findings on specific post content, because Section 230 limits platform liability for third-party posts.
Meta and YouTube defended themselves by citing safety features, parental controls and alternative contributors to the plaintiff’s mental-health history. Meta emphasized the complexity of teen mental health and noted therapists did not expressly attribute her condition solely to social media. YouTube argued it functions more like a video-streaming service and pointed to internal data showing Kaley’s average use of Shorts was about one minute per day.
After more than 40 hours of deliberation, the jury concluded that each company’s negligence was a substantial factor in Kaley’s harm. Two jurors persistently dissented from the majority on liability questions, but nine jurors agreed on the necessary claims, producing the compensatory award and subsequent punitive recommendation. Jurors explained to reporters that testimony from Mark Zuckerberg influenced their view of the companies’ intentions and practices.
Analysis & Implications
Legally, the verdict signals that courts are increasingly willing to evaluate product design choices as potential bases for tort liability. Plaintiffs in similar cases need only show that social platforms were a substantial factor in harm, not the sole cause—a lower evidentiary bar that may benefit future suits. The division of responsibility (70/30) provides a concrete apportionment model plaintiffs could cite in subsequent cases.
Practically, the ruling may accelerate settlements by defendants seeking to limit precedent-setting jury decisions, or it could prompt vigorous appeals and post-trial motions. Tech companies have signaled they will explore legal options; appeals could raise questions about the applicability of negligence law to complex digital design and the continued reach of Section 230 protections.
Regulators and legislators watching the case may interpret the verdict as support for stronger oversight or mandated product safety features for minors. Platforms could respond by modifying default settings, restricting features for underage accounts, or enhancing parental tools—changes that would shift business design and potential revenue models, and that could ripple across international markets with different legal regimes.
Comparison & Data
| Company | Liability Share | Punitive Damages (juror recommendation) | Notes |
|---|---|---|---|
| Meta (Instagram/Facebook) | 70% | $2,100,000 | Found more responsible; Zuckerberg testified |
| YouTube (Google) | 30% | $900,000 | Argued platform is a video service; CEO not called to testify |
| Total | — | $3,000,000 | Plus $3,000,000 compensatory; judge will determine final award |
The table summarizes the jury’s monetary allocation and responsibility split. The recommended $3 million in punitive damages mirrors the $3 million compensatory award, producing a pre-judicial total of $6 million. The jury’s use of company-specific shares clarifies how jurors translated fault into dollar figures.
Reactions & Quotes
Company spokespeople issued immediate statements rejecting the verdict and announcing plans to explore legal options, including appeals. Both companies emphasized existing safety features and disputed that a single app can be blamed for complex mental-health outcomes.
> “The verdict misrepresents YouTube, which is a responsibly built streaming platform, not a social media site.” (Google spokesperson Jose Castañeda)
This statement frames YouTube as categorically different from social platforms and signals a legal strategy to distinguish streamed video from interactive social networks. The company also noted usage trends showing limited Shorts viewing by the plaintiff.
> “Teen mental health is profoundly complex and cannot be linked to a single app.” (Meta spokesperson)
Meta’s brief response underscores its defense: multiple factors contribute to mental-health outcomes and platform design is not the sole determinant. Meta and other defendants are likely to lean on this complexity in both trial and appellate briefs.
> “We wanted them to feel it.” (Juror, partial name withheld)
A juror told reporters the award reflected a desire to send a message to platforms about practices the jury found unacceptable. Jurors also said internal testimony—particularly from top Meta executives—affected their assessment of intent and corporate awareness.
Unconfirmed
- Whether the presiding judge will fully adopt, reduce, or increase the jury’s punitive recommendation remains pending until a formal damages hearing; the final figure is unconfirmed.
- How appellate courts will treat the jury’s interpretation of product design liability and the interplay with Section 230 is unresolved and subject to appeal.
- Whether and how Meta or YouTube will change product defaults or design features in response to the verdict has not been confirmed by the companies beyond general statements of disagreement.
Bottom Line
The Los Angeles verdict marks a consequential legal development: juries can, under current tort law, attribute substantial responsibility to major platforms for product-design choices that courts may find harmful to minors. While the monetary award must be finalized by a judge and will almost certainly be contested on appeal, the decision strengthens plaintiffs’ leverage in the many similar suits pending nationwide.
Beyond this single case, the ruling raises practical questions about how platforms balance engagement-driven product design against safety concerns for young users. Policymakers, companies and families will watch the post-trial process—judicial rulings, appeals, potential settlements, and any product or policy changes—for signals about the future responsibilities of digital platforms toward minors.
Sources
- AP News — news/press report with trial coverage and quotations.