A landmark California trial is putting Meta’s business model under oath—specifically, whether its Facebook and Instagram platforms are engineered to keep users compulsively engaged, even when harm is foreseeable, especially for kids. According to Fox Business:
An expert witness in a case brought by a California woman against Meta, the parent company of Facebook and Instagram, testified that the design features of its social media apps are addictive, likening them to a “drug,” especially when affecting youth.
The landmark case continued in a California courtroom on Tuesday with witness testimony.
Dr. Anna Lembke, a psychiatrist and Stanford University professor, told the court that after reviewing thousands of pages of internal documents and the social media companies’ own research, she determined the design features of social media are addictive.
The mother of four, who is the highest-ranking person overseeing addiction initiatives at the university, defined addiction as “the continued, compulsive use of a substance or a behavior despite harm to self or others.”
Lembke argued that Meta deploys “potent” features, such as Instagram’s “infinite scroll” and tailored-for-you algorithms, to stimulate dopamine release that “drugifies human connection.”
Meta’s executives are simultaneously trying to sand down the word “addictive” into something more palatable—and the courtroom exchange makes that tension explicit. According to the New York Post:
“Protecting minors over the long run is even good for the business and for profit,” [Adam Mosseri, head of Instagram], said.
[Plaintiff’s attorney Mark] Lanier then got personal, asking Mosseri about his three sons before flashing an internal Meta report warning that kids who’ve faced trauma are especially vulnerable to harm from social media, and asking whether platforms like his should be doing more to shield children with rocky upbringings from potential damage online.
This is exactly why NLPC has pressed Meta on governance-level accountability for child safety rather than cosmetic “controls.” In 2024, we presented a shareholder proposal urging Meta to examine raising the minimum age for its platforms and submit that analysis to shareholders—because the evidence of harm is longstanding, and “engagement-first” incentives don’t self-correct.
Since then, the pattern hasn’t changed: Meta announces new “protections,” while the underlying attention-extraction architecture remains intact. NLPC has documented that gap repeatedly. Meta is also in the midst of another lawsuit, brought by the state of New Mexico, accusing Meta of creating a marketplace and “breeding ground for predators who target children for sexual exploitation” and of failing to disclose what it knew about those harmful effects.
If these lawsuits prove Meta understood that kids are most vulnerable—and designed its platforms for maximum compulsion anyway—more shareholders will realize the company has more than a PR problem. It’s a board oversight failure with real legal, reputational, and long-term enterprise risk.
(Image above created via Gemini AI)
