Jury finds Meta liable in child sexual exploitation case in New Mexico

A New Mexico jury found Meta violated state law and ordered it to pay $375 million for failing to protect children from sexual predators on its platforms.

Mar 24, 2026 · Updated Mar 25, 2026

Objective Facts

On Tuesday, March 24, 2026, a jury found Meta violated New Mexico law and ordered the company to pay $375 million in damages for willfully engaging in unfair, deceptive, and unconscionable trade practices. New Mexico Attorney General Raúl Torrez sued Meta in 2023 for allegedly creating a "breeding ground" for child predators on Facebook and Instagram. This is the first time the company has been held liable by a jury over these issues. A second phase of the case, to be tried directly before the judge in May, could also force Meta to make changes to its platforms and pay additional penalties. Meta said it "respectfully" disagrees and plans to appeal the decision.

Left-Leaning Perspective

Left-leaning and watchdog outlets framed the verdict as a landmark for child safety accountability. ParentsSOS, a coalition of families who have lost children to social media harm, called the verdict a "watershed moment" and applauded "this rare and momentous milestone in the years-long fight to hold Big Tech accountable for the dangers their products pose to our kids." New Mexico Attorney General Raúl Torrez declared the verdict "a historic victory for every child and family," stating, "Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough."

These outlets emphasized internal evidence of Meta's knowledge of the harms. New Mexico prosecutors revealed legal filings detailing internal messages in which Meta employees discussed how CEO Mark Zuckerberg's 2019 announcement that Facebook Messenger would become end-to-end encrypted by default would affect the company's ability to disclose some 7.5 million child sexual abuse material reports to law enforcement. According to prosecution arguments, Meta's internal research showed "one-in-three teens experienced problematic use."

Left outlets also noted the verdict's broader implications. More than 40 state attorneys general have filed lawsuits against Meta, alleging that it is contributing to a mental health crisis among young people by deliberately designing addictive features into Instagram and Facebook. Torrez said he was "really focused on how we can change the design features of these products" and noted his team was "able to overcome section 230 motions" in both the Meta and Snap cases.

Right-Leaning Perspective

Right-leaning voices and Meta's own statements focused on the investigation's ethical problems and on Meta's safety and transparency efforts. Meta spokesperson Andy Stone characterized the investigation as "ethically compromised," claiming Torrez "led an ethically compromised investigation into Meta that knowingly put real children at risk." Stone pointed to the investigation's "use of child photos on proxy accounts, delays by prosecutors in reporting child sexual abuse material and the disposal of data from devices used in the investigation."

Meta and its legal team emphasized the company's safety investments and its transparency about the platforms' limitations. Meta argued the lawsuit "makes sensationalist, irrelevant and distracting arguments by cherry picking select documents," stressed that 40,000 people at Meta are responsible for safety, and said "the company invests heavily in measures to protect young users." Meta executives emphasized at trial that "the company continuously improves safety and addresses compulsive social media use without infringing on free speech or censoring users."

Right-leaning commentary also raised legal and structural concerns about the verdict. Meta had argued it was shielded from liability by free-speech protections and Section 230, claiming "the state's allegations of harm cannot be separated from the content on the platforms, because its algorithms and design features serve to publish content," though the New Mexico judge rejected this argument. Meta stock was up 5% in after-hours trading following the verdict, suggesting shareholders viewed the penalty as manageable for a $1.5 trillion company.

✓ Common Ground
Meta has faced longstanding concerns from parents, whistleblowers, advocates, and lawmakers about risks to kids and teens on its platforms. Both sides acknowledge this context, though they dispute the adequacy of Meta's response.
Attorneys for Meta acknowledge that "some bad material gets through its safety net," and both sides agree that perfect content moderation on platforms of this scale is technically difficult, though they disagree over whether Meta did enough.
Both sides recognize that social media giants are facing hundreds of other cases from individuals, school districts, and state attorneys general, some of which are set to go to trial later this year. This verdict will likely influence those proceedings.
Several commentators across perspectives note that Meta announced midway through the trial that it would stop supporting end-to-end encrypted messaging on Instagram later this year, indicating the company made concessions during litigation.
Objective Deep Dive

This verdict represents a significant legal inflection point in the ongoing tension between tech company autonomy and child safety regulation. For years, Meta and other platforms relied heavily on Section 230 immunity and First Amendment protections to shield themselves from liability for platform harms. The judge in New Mexico rejected Meta's Section 230 arguments, allowing the case to go to trial and bypassing what tech companies have long considered their primary legal firewall. The jury's decision to hold Meta liable suggests that consumer protection law, which targets misleading statements and unfair practices rather than content itself, may give states a viable path to regulating platform design without running into Section 230 immunity.

Both sides largely acknowledge the underlying reality: Meta's platforms do expose children to sexual exploitation, and Meta's internal documents show the company was aware of these risks. The genuine disagreement centers on whether that awareness amounted to intentional deception or to an acknowledged, unavoidable risk of operating platforms at this scale, and on whether Meta's responses were inadequate or reasonable. The trial also highlighted the tension between end-to-end encryption as a privacy measure and its use by predators, which led Meta to announce it would stop supporting such encryption on Instagram. This concession, made during the trial, undercuts Meta's narrative of having done all it reasonably could.

The May bench trial phase may prove even more consequential than this jury verdict. The second phase, scheduled for May 4, could result in court-mandated changes to Meta's platforms, including age verification requirements and new protections for minors. Such structural changes would go beyond monetary penalties and set a template for other states. The verdict's long-term impact depends less on the $375 million fine, which is manageable for a $1.5 trillion company, than on whether courts begin mandating specific platform design changes nationwide. With more than 40 state attorneys general pursuing similar lawsuits, the New Mexico win establishes a legal roadmap. Meta's appeal will likely center on Section 230, the First Amendment, and whether consumer protection laws apply to platform design, questions that may ultimately reach the U.S. Supreme Court.

◈ Tone Comparison

Left-leaning outlets use language of moral victory and corporate accountability, emphasizing phrases like "historic victory," "profits over safety," and "enough is enough." Right-leaning sources and Meta focus on procedural integrity and business context, deploying terms like "cherry picking," "ethically compromised," and "robust disclosures." The tone difference reflects fundamentally different narratives: one of accountability for intentional harm versus one of reasonable efforts constrained by technical realities and improper prosecution tactics.

✕ Key Disagreements
Whether Meta knowingly concealed harmful effects from the public
Left: The prosecution argued "Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew." They presented internal research as evidence of Meta's knowledge.
Right: Meta's attorney argued "What the evidence shows is Meta's robust disclosures and tireless efforts to prevent harmful content. And these disclosures mean that Meta did not knowingly and intentionally lie to the public."
Whether the New Mexico investigation was legitimate and ethical
Left: Left-leaning sources noted that the attorney general's undercover investigation led to three arrests, and its details were discussed in court.
Right: Meta argued the state's investigation was "ethically compromised" by its "use of child photos on proxy accounts, delays by prosecutors in reporting child sexual abuse material and the disposal of data from devices used in the investigation."
Whether platform design features were intentionally addictive or reflect user preference
Left: Left sources argue the design features were deliberately engineered to maximize engagement at the expense of safety, and New Mexico prosecutors urged Meta to implement changes including "protecting minors from encrypted communications that shield bad actors."
Right: Meta emphasized it "continuously improves safety and addresses compulsive social media use without infringing on free speech or censoring users," framing design choices as balancing competing values rather than intentional harm.
The significance and precedential value of the $375 million penalty
Left: Prosecutors celebrated the verdict as sending "a clear message to big tech executives that no company is beyond the reach of the law" and as a "turning point in the fight for children's safety."
Right: Critics noted that at $5,000 per violation, the penalty "may seem paltry for a company valued at $1.5 trillion," though acknowledging "this is the first jury verdict of its kind against Meta over harm to young people."