Meta Hit With $375M Fine For Child Safety Failures

A jury in New Mexico has ordered Meta Platforms Inc. to pay $375 million for violating state consumer protection laws by misrepresenting the safety of its social media platforms and failing to shield children from harmful content.

The verdict was delivered Tuesday, March 24, 2026, ending a nearly seven‑week trial that began in early February.

The ruling marks a significant legal setback for the company behind Facebook, Instagram, and WhatsApp, and represents the first jury decision in the United States holding Meta responsible for exposing minors to risk while promoting its platforms as safe.

The lawsuit, initiated in December 2023 by New Mexico Attorney General Raúl Torrez, alleged Meta engaged in deceptive and unfair practices, claiming the company knowingly allowed harmful content and adult interactions to reach children while advertising safety features that were ineffective or misleading.

Jurors concluded that Meta made false statements about platform safety and allowed children to be exposed to predators and explicit content.


The $375 million award reflects the maximum fine permitted under state law, calculated at $5,000 per violation across thousands of instances.

During the trial, jurors heard evidence including undercover tests in which investigators created a teen profile that was quickly targeted with inappropriate content, testimony from former Meta employees, and internal warnings that company management had ignored.

Meta responded by stating it disagrees with the verdict and intends to appeal, highlighting its investments in parental controls and safety tools aimed at protecting young users.

Attorney General Torrez indicated that further legal actions are expected.

A second phase, set for May 2026, will be heard by a judge and could impose additional penalties or require structural changes to improve child safety across Meta platforms.

The decision adds to a growing wave of lawsuits nationwide against social media companies over youth safety and algorithm design, prompting broader questions about corporate responsibility in protecting minors online.
