Within 48 hours, the legal landscape governing social media and children shifted in ways that will take years to fully understand.
On March 24, 2026, a Santa Fe jury ordered Meta to pay US$375 million for violating New Mexico’s consumer protection laws. The next day, a Los Angeles jury found Meta and Google’s YouTube negligent in the design of their platforms, awarding almost $6 million in damages to a single plaintiff.
The dollar figures are drawing headlines, but a $375 million penalty against a company worth $1.5 trillion is a rounding error. The award is less than 2% of Meta’s $22.8 billion net income in 2025. Meta’s stock rose 5% on the day of the New Mexico verdict, a sign of how little weight the market gave the penalty.
Fines without structural change are more akin to licensing fees than accountability. As a technology policy and law scholar, I believe the question of whether these verdicts will produce real changes to the products that millions of children use every day is more consequential than the jury awards.
The answer is not yet, and not automatically. A financial penalty does not rewrite a single line of code, remove an algorithm or place a safety engineer in a role that was eliminated to protect a quarterly earnings report. Meta and Google have signaled they will appeal, with First Amendment challenges to the product-design theory the likely central battleground.
The companies’ lawyers are likely to argue, with some justification, that the science linking the design of platforms to mental health harm remains contested, and that the companies have already implemented safety measures. In the meantime, Instagram, Facebook and YouTube will continue to operate exactly as they did before the verdicts.
The verdicts against Meta pave the way for hundreds or even thousands of similar cases.
Consumer protection
Most coverage of the New Mexico verdict casts it as a child safety case. It is that, but it also has a second, more significant dimension: a consumer protection claim grounded in allegations of corporate deception. New Mexico Attorney General Raúl Torrez did not sue Meta for what users posted; instead, he sued Meta over its false statements about the safety of its own platforms, a novel legal approach.
For three decades, Section 230 of the Communications Decency Act has shielded internet platforms from liability for content generated by their users. Courts have interpreted Section 230 immunity broadly, and many earlier attempts to hold platforms accountable for child harm have foundered on it.
The New Mexico complaint, filed in December 2023, was drafted with explicit awareness of this obstacle. It asked a single question: Did Meta knowingly lie to New Mexico consumers about the safety of its products?
The jury’s answer was yes, on all counts, and its verdict rested on three distinct…



