Jury finds Meta and YouTube liable in landmark social media addiction case

A California jury found Meta and YouTube liable for negligent design of addictive platforms, awarding $6 million in damages to a 20-year-old plaintiff.

Objective Facts

A California jury found Meta and YouTube liable on all counts in a landmark case accusing the tech giants of intentionally addicting a young woman and injuring her mental health. The jury ordered the companies to pay a total of $3 million in compensatory damages and recommended an additional $900,000 in punitive damages from YouTube and $2.1 million from Meta. Jurors found that Meta bears 70% of the responsibility for the plaintiff's harms and YouTube 30%. The jury concluded that Meta's apps, including Instagram, and Google's YouTube were deliberately built to be addictive, and that the companies' executives knew this and failed to protect their youngest users. The landmark verdict may influence the outcome of 2,000 other pending lawsuits.

Left-Leaning Perspective

Left-leaning outlets and advocates framed the verdict as a watershed moment holding Big Tech accountable for deliberately addictive design. Parents, child safety experts and some lawmakers called the finding of liability a long-overdue moment of accountability, describing the verdict as social media's Big Tobacco moment. Numerous parents who attributed their children's deaths to social media attended the trial in Los Angeles, and Parents for Safe Online Spaces called the jury's decision a rare and momentous win in a years-long fight. The left emphasizes corporate knowledge and intent to harm: attorneys said this is the first time a jury has heard testimony from executives and seen internal documents proving these companies chose profits over children. Digital rights advocates said the new evidence and testimony pulled back the curtain and validated the harms young people and parents have been describing for years, noting that these products were purposefully designed to harm and addict millions of young people. Democratic Senator Ed Markey pushed for congressional action, saying Congress must impose real guardrails on these platforms. Left-leaning voices stress that the verdict breaks new legal ground and exposes systemic corporate wrongdoing. Legal experts note this is new legal territory that could reshape an industry long shielded by Section 230, and that platforms will have to rethink their focus on engagement at any cost. These accounts downplay the modest damages awarded and focus on the precedent-setting liability finding, while largely avoiding the trial evidence that the plaintiff had serious pre-existing mental health challenges stemming from her home environment.

Right-Leaning Perspective

Right-leaning outlets and tech company spokespersons emphasize the modest financial impact and the legal uncertainty ahead. The $6 million in damages is a small price for companies as large as Meta and Google, and both plan to appeal the decision, with no guarantee that subsequent cases will go the same way. Meta said teen mental health is profoundly complex and cannot be linked to a single app. The industry defense stresses complexity and alternative causal factors: during the trial, Meta and YouTube denied that the plaintiff's social media use caused her mental health issues, arguing that her family history, difficulties at home and school, and learning disabilities played a more significant role in her psychological and emotional struggles. Google argued the case misunderstands YouTube, which it describes as a responsibly built streaming platform, not a social media site. While the tech giants argue they have already invested heavily in youth safety features, some experts compare the legal pressure to Big Tech's Big Tobacco moment. Right-oriented analysis focuses on appeal prospects and the shape of future litigation: in the short term, Meta and Google are expected to appeal aggressively, likely taking the case to higher courts, and in the long term the companies will likely pivot away from addictive engagement toward meaningful-interaction models.

Deep Dive

This verdict represents a significant but narrow legal breakthrough. The case succeeded by shifting the legal focus from user-generated content (protected by Section 230) to platform design architecture: infinite scroll, autoplay, notifications, and algorithmic recommendations. The verdict validated the plaintiff's lawyers' approach of shifting the legal target; instead of focusing on the content people see on social media, the case put the spotlight on how social media services were designed, comparable to the legal crusade against Big Tobacco in the 1990s. This strategic reframing may prove durable across the pending litigation.

However, the trial record reveals genuine factual complexity. Internal Meta documents showed that 11-year-olds were four times as likely to keep coming back to Instagram, despite the platform requiring users to be at least 13, supporting claims that minors were intentionally targeted. At the same time, Meta argued that the difficult childhood of the plaintiff, Kaley, not social media, caused her mental health challenges; her lawyer countered that those challenges simply raised the stakes for the companies to protect children. The verdict required jurors to find only that platform negligence was a "substantial factor" in the harm, not its sole cause. Both sides' evidence had merit, which explains why jurors deliberated for nine days, at one point telling the judge they were struggling to reach consensus, before a majority ultimately voted to hold both companies liable.

What happens next will determine the verdict's real-world significance. The companies plan to appeal the decision, and Meta plans to appeal the New Mexico case as well. The precedent matters most if the appeals fail or if judges order injunctive relief, meaning mandatory design changes, rather than just damages. In the federal Track Two process, a judge could issue such an order, forcing Meta and Google to fundamentally change their app designs, potentially banning infinite scroll for minors or mandating chronological feeds. Until that happens, this remains a symbolic win with moderate financial sting. Both narratives have defensible merit: the left correctly identifies internal awareness of harm risks; the right correctly notes that teen mental health involves multiple causes and that $6 million is marginal for billion-dollar companies.


Mar 25, 2026 · Updated Mar 26, 2026

Left says: The verdict represents social media's Big Tobacco moment—the harm these companies intentionally cause children has been proven in a court of law. A jury of regular people has managed to do what Congress and even state legislatures have not: Hold Meta and Google accountable for addicting young people to their products.
Right says: Meta said teen mental health is profoundly complex and cannot be linked to a single app, adding that the company remains confident in its record of protecting teens online. The companies plan to appeal the decision, and there's no guarantee that subsequent cases will go the same way.
✓ Common Ground
Multiple sources across the spectrum acknowledge that for years, parents, pediatricians, educators and whistleblowers have pushed claims that social media harms young people's mental health, and for the first time, juries in two states took their side—finding both Meta and YouTube liable in Los Angeles and Meta liable in New Mexico.
Both sides acknowledge that when major companies lose significant court cases, those cases often bring product and operational changes, even if companies survive financially.
Several voices across ideological lines recognize the precedent value of focusing litigation on platform design rather than user-generated content, as a way to navigate Section 230 protections.
Both Democratic and Republican lawmakers, including Senator Marsha Blackburn, agree the verdict should propel legislative action on online safety, with calls to pass the Kids Online Safety Act.

◈ Tone Comparison

Left-leaning outlets use triumphant, accountability-focused language emphasizing "finally" and "breakthrough," with moral weight attached to jury validation. Right-leaning/tech defense commentary adopts a more legalistic, skeptical tone emphasizing financial insignificance, appeal prospects, and the complexity of causality—language designed to diminish the verdict's immediate impact while preparing audiences for appeals.

✕ Key Disagreements
Whether the platforms intentionally designed addictive features or merely created engaging products
Left: Plaintiffs showed internal Meta documents in which CEO Mark Zuckerberg and other executives described efforts to attract and keep kids and teens on the platforms, with one document stating "If we wanna win big with teens, we must bring them in as tweens." The left argues this proves the platforms were intentionally designed to addict minors.
Right: Meta and YouTube argued that Kaley's family history, difficulties at home and school, and learning disabilities played a more significant role in her struggles, with Meta noting that not one of her therapists identified social media as the cause of her mental health issues.
The causality and scope of social media's role in mental health harm
Left: Advocates contend these products were purposefully designed to harm and addict millions of young people, and lead to lifelong mental health consequences.
Right: The companies insisted throughout the case that there is no scientific proof that social media causes mental health issues, suggesting they are being used as a scapegoat for multi-faceted emotional issues children face.
Whether this verdict represents transformative change or a limited financial setback
Left: The Big Tobacco moment analogy highlights a pivotal shift, with advocates asserting this legal outcome proves the harm these companies intentionally cause children has been proven in a court of law.
Right: The $6 million in damages is a small price for companies as large as Meta and Google, and the companies plan to appeal with no guarantee that subsequent cases will go the same way.