A jury in New Mexico on Tuesday found tech company Meta liable for harming children and misleading consumers about the safety of its platforms, ordering the company to pay $375 million in civil penalties for violating consumer protection laws.
New Mexico Attorney General Raúl Torrez praised the result, saying:
New Mexico is proud to be the first state to hold Meta accountable in court for misleading parents, enabling child exploitation, and harming kids… Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew.
The jury found Meta liable on claims brought by the state of New Mexico under the Unfair Practices Act, including enabling child sexual exploitation and making false statements about children’s online safety. The act prohibits deceptive or unfair business practices, such as false advertising or misleading statements about goods and services, and violators may be fined up to $5,000 per violation.
The verdict follows an investigation into Meta’s platforms launched in 2023 by the New Mexico Department of Justice (NMDOJ), which examined dishonesty and harmful design choices that allowed the sexual abuse of children, online solicitation, and other harms. According to evidence presented by the NMDOJ, Meta’s design features enabled child predators to sexually exploit children on its social media platforms.
Furthermore, testimony from witnesses and experts demonstrated that, contrary to its public commitments, the company intentionally designed its platforms to increase engagement among young users and exposed them to dangerous content related to self-harm. Additional evidence revealed that Meta employees and external child safety experts repeatedly warned the company about these dangers, but the company prioritized financial profit over child safety.
Torrez called the decision a “historic victory” for children and their families, adding that the NMDOJ will seek additional financial penalties in the next phase of proceedings and will require Meta to change its platforms to enhance child protection. Measures include effective age verification, removal of predators from the platform, and protection of minors from encrypted communications that shield malicious actors. A bench trial is scheduled for May 4 to hear the NMDOJ’s final claim against Meta.
Meta has faced similar lawsuits related to consumer protection and child safety amid concerns about physical and mental harms from social media use. In 2024, the Australian legislature banned social media use for children under 16 years old. Additionally, four school boards in Canada sued social media companies for disrupting students’ education and creating addictive products marketed to children.
New York state lawmakers have passed legislation prohibiting social media companies from using addictive recommendation algorithms for users under 18. On February 10, 2026, the first case concerning tech companies’ liability for children becoming addicted to social media began in California.