UPDATE March 26, 2026 10:00 a.m. ET: In addition to the New Mexico jury verdict described in our original post below, a Los Angeles jury reached a $6 million verdict against Meta Platforms and Google in a lawsuit over social media addiction.
ORIGINAL POST, March 26, 2026 6:00 a.m. ET: A New Mexico jury has delivered a devastating verdict against Meta. The ruling is one of the clearest signs yet that what the company has long framed as a moderation challenge is increasingly being treated, in court, as a business-model and governance failure. According to Fox Business:
A New Mexico jury on Tuesday ordered Meta to pay $375 million after finding the company violated state law by misleading users about the safety of its platforms and allegedly enabling child sexual exploitation.
Jurors found the Facebook and Instagram parent company violated New Mexico’s consumer protection law following a lawsuit brought by Attorney General Raul Torrez, who accused Meta of failing to protect children from predators.
“The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” said New Mexico Attorney General Raúl Torrez. “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough”…
The lawsuit, filed in 2023 by the state, alleged Meta created a “breeding ground” for child predators and misled users about safety protections on Facebook, Instagram and WhatsApp.
This ruling lands only weeks after NLPC, in our recent post on the lawsuits challenging Meta’s platform design, highlighted the parallel California litigation over whether Meta’s platforms were deliberately engineered to be addictive for minors.
It also reinforces the case NLPC has already made that Meta’s soft reforms are not enough, because the company’s much-publicized protections for young users still leave the underlying access and incentive problem intact.
In 2024, we presented a shareholder proposal urging Meta to examine raising the minimum age for its platforms and submit that analysis to shareholders—because the evidence of harm is longstanding, and “engagement-first” incentives don’t self-correct.
For years, Meta has tried to present child-safety failures as isolated enforcement misses rather than the predictable result of a platform built to maximize engagement and growth. But when a jury concludes that the company misled users and violated state law, that becomes much harder to dismiss as activist rhetoric. Shareholders should see this verdict for what it is: a warning that weak oversight of youth safety is no longer just a reputational liability, but a material legal and governance risk — exactly as NLPC warned.
