A Los Angeles jury has found Meta and Google liable for the mental health harm Instagram and YouTube caused a young woman who became addicted to the platforms as a child, awarding $6 million in total damages in the first U.S. social media addiction trial to reach a verdict.

The March 25 ruling marks a turning point for more than 2,400 pending lawsuits against social media companies from families, schools, and state governments — and it arrived just one day after a separate New Mexico jury ordered Meta to pay $375 million for endangering children on its platforms.

The California Verdict

After nine days of deliberation, the jury found both Meta (owner of Instagram) and Google (owner of YouTube) negligent in how they designed their platforms. The plaintiff, identified as K.G.M. and now 20 years old, testified that she became addicted to YouTube at age six and Instagram at age nine.

Her legal team argued the companies deliberately built features — autoplay, infinite scroll, beauty filters, algorithmic recommendations — to maximize engagement without regard for user wellbeing, despite internal research showing the harm to young users.

Key Facts
  • Compensatory damages: $3 million
  • Punitive damages: $3 million (jury found "malice, oppression, or fraud")
  • Meta's share: 70% ($4.2 million total)
  • Google's share: 30% ($1.8 million total)
  • Settled before trial: TikTok and Snap (undisclosed amounts)

The New Mexico Verdict: $375 Million

One day before the California verdict, a New Mexico jury delivered an even larger financial blow to Meta. The state's attorney general, Raúl Torrez, had filed suit in 2023 after an undercover investigation revealed that Meta's platforms were "prime locations" for predators to target minors.

  • $375 million: total penalty against Meta in New Mexico
  • 75,000: individual violations found by the jury
  • $5,000: penalty per violation, the state maximum (75,000 × $5,000 = $375 million)

What Makes These Cases Different

Social media companies have faced criticism for years, but legal liability has been rare. Section 230 of the Communications Decency Act has historically shielded platforms from responsibility for user-generated content. These trials succeeded by targeting something different: product design.

Traditional approach vs. new legal strategy:
  • Sue over content posted by users → sue over platform design choices
  • Blocked by Section 230 immunity → bypasses Section 230 entirely
  • Focuses on individual posts → focuses on features like autoplay, infinite scroll, and algorithms
  • Hard to prove causation → supported by expert testimony on addictive design patterns

Plaintiffs argued that features such as autoplay videos, infinite scrolling feeds, push notifications, and engagement-maximizing algorithms constitute a defective product — not protected speech. Courts have increasingly accepted this framing, allowing cases to proceed past motions to dismiss.

The Scale of Litigation

These two verdicts are the leading edge of an enormous legal wave. As of March 2026, the federal multidistrict litigation (MDL) includes at least 2,407 claims, with hundreds more in state courts.

  • 2023: New Mexico AG files suit against Meta after an undercover investigation
  • 2023–2024: Thousands of families and school districts file addiction lawsuits
  • Early 2026: TikTok and Snap settle with plaintiff K.G.M. before trial
  • March 24, 2026: New Mexico jury orders Meta to pay $375 million
  • March 25, 2026: California jury finds Meta and Google liable, awards $6 million
  • 2026 (upcoming): Federal MDL bellwether trials begin, with school district claims prioritized

How Meta and Google Are Responding

Both companies have stated they disagree with the verdicts and plan to appeal.

Meta has pointed to safety features it has introduced, including teen account restrictions, time limit reminders, and parental supervision tools. The company has also argued that parents bear responsibility for monitoring children's screen time.

Google has emphasized its YouTube Kids app and restrictions on autoplay for minors. The company maintains that its platform provides educational value and that the verdict mischaracterizes how the product works.

Both verdicts will likely be appealed and could take years to resolve. However, the "malice, oppression, or fraud" finding in California makes reversal on appeal more difficult: juries rarely reach that standard without strong evidence.

What This Means for Parents and Users

The immediate practical impact is limited — Instagram and YouTube are not shutting down, and neither company is being forced to redesign its products through these rulings. But the legal momentum is building pressure:

  • More settlements are likely. TikTok and Snap already settled before trial. With two liability verdicts in two days, remaining defendants have stronger incentives to negotiate.
  • Platform changes may accelerate. Both Meta and Google have been gradually adding teen safety features, but the threat of billions in potential damages across thousands of cases creates urgency.
  • Legislative action may follow. Several U.S. states have passed or proposed laws restricting minors' access to social media. These verdicts strengthen the argument for federal regulation.

The Big Picture

Advocates are calling this social media's "Big Tobacco moment" — a reference to the 1990s lawsuits that ultimately forced the tobacco industry to pay hundreds of billions in settlements and fundamentally changed how cigarettes were marketed and sold.

The parallel is imperfect. Tobacco companies lied about health risks for decades. Social media companies, by contrast, documented the risks in their own internal research (some of which whistleblower Frances Haugen leaked in 2021), even if they have been slow to act on those findings.

But the legal framework is now established: social media platforms can be held liable not for what users post, but for how the product itself is designed. With 2,400 cases waiting in the pipeline and the first bellwether trials approaching, the financial exposure for Meta, Google, TikTok, and Snap could eventually reach tens of billions of dollars.

Both companies have the resources and legal teams to fight these cases for years. But the question has shifted from whether social media companies can be held accountable to how much accountability will cost them.