New Mexico Jury Rules Against Meta, Finding Company Endangered Children's Mental Health and Safety

A New Mexico jury has found Meta liable for harmful and deceptive trade practices targeting children, recommending penalties totaling $375 million.

By Sophia Bennett | 6 min read

New Mexico Jury Delivers Landmark Verdict Against Meta

A New Mexico jury has ruled that Meta knowingly damaged children's mental health and deliberately concealed what it knew about child sexual exploitation occurring across its social media platforms. The verdict, reached after nearly seven weeks of trial proceedings, marks a significant turning point in how courts and governments are beginning to hold major tech companies accountable for the safety of young users.

Jury Finds Thousands of Violations

Jurors determined that Meta — the parent company of Instagram, Facebook, and WhatsApp — engaged in deceptive and "unconscionable" trade practices that exploited the inexperience and vulnerabilities of children, in direct violation of New Mexico's Unfair Practices Act. The jury identified thousands of individual violations, with each one carrying a separate penalty, ultimately arriving at a total recommended fine of $375 million.

While that figure is substantial, it represents less than one-fifth of what state prosecutors had originally sought. Juror Linda Payton, 38, explained that the jury reached a compromise on the estimated number of affected teenagers, but chose to apply the maximum $5,000 penalty per violation — a decision she said reflected her belief that every child deserved the highest possible consideration.

Profits Over Safety, Prosecutors Argued

New Mexico state prosecutors, led by Attorney General Raúl Torrez who filed the original lawsuit in 2023, argued that Meta consistently placed profit generation above the safety and well-being of its youngest users. Evidence presented during the trial included a wide range of internal company communications, whistleblower testimony, reports from psychiatric experts, and input from tech safety consultants.

Local public school educators also took the stand, sharing firsthand accounts of classroom disruptions linked to social media activity, including disturbing cases of sextortion schemes targeting minors. The trial also scrutinized specific public statements made by Meta CEO Mark Zuckerberg, Instagram chief Adam Mosseri, and Meta's global head of safety, Antigone Davis, with the jury evaluating whether those statements were misleading to users.

Additionally, jurors examined Meta's consistent failure to enforce its own minimum age requirement of 13, the role of its content-recommendation algorithms in amplifying harmful or sensational material, and the platforms' widespread circulation of content related to teen suicide.

Undercover Investigation Played Key Role

A significant portion of the state's case rested on an undercover investigation in which law enforcement agents created social media profiles posing as minors to document the volume and nature of sexual solicitations they received — and to observe how Meta responded. Prosecutors also argued that Meta failed to fully disclose or meaningfully address the risks associated with social media addiction, though the company itself has never acknowledged that such addiction exists. Meta executives did, however, concede at trial that "problematic use" of their platforms is a real concern.

Meta Disputes Verdict, Plans to Appeal

Meta's legal team pushed back throughout the trial, maintaining that the company actively works to detect and remove harmful content and bad actors, and that it discloses known risks to users. Attorney Kevin Huff argued in closing statements that Meta's applications are designed to help people build connections — not to serve as tools for predators.

Following the verdict, a Meta spokesperson confirmed the company intends to appeal and expressed disagreement with the jury's findings. "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content," the spokesperson said. "We remain confident in our record of protecting teens online."

Notably, Meta's stock rose approximately 5% in after-hours trading following the announcement — a sign that Wall Street investors were largely unmoved by the legal outcome. Meta's current market valuation stands at approximately $1.5 trillion.

What Comes Next

Despite the jury's decision, Meta will not be required to immediately alter its platform practices. A second phase of the trial, scheduled for May, will place the question of public nuisance before a judge rather than a jury. That phase will determine whether Meta's platforms have created a public nuisance and whether the company should be required to fund public programs designed to address the harms caused.

A Broader Wave of Legal Action

New Mexico's case was among the earliest to reach trial in what has become a sweeping wave of litigation targeting social media platforms over their impact on young people. More than 40 state attorneys general have filed separate lawsuits against Meta, claiming the company deliberately engineered addictive features in Instagram and Facebook that have contributed to a growing mental health crisis among American youth.

Meanwhile, jurors in a separate federal case in California have been deliberating for over a week on whether Meta and YouTube bear liability in a similar matter.

Sacha Haworth, executive director of The Tech Oversight Project, called the verdict a sign of shifting accountability. "Meta's house of cards is beginning to fall," she said, pointing to whistleblowers and unsealed documents as evidence of the company's long awareness of the dangers its platforms posed to children.

ParentsSOS, a coalition of families who have lost children to harms linked to social media, described the ruling as a "watershed moment" in the broader fight to hold technology companies responsible. "We applaud this rare and momentous milestone in the years-long fight to hold Big Tech accountable for the dangers their products pose to our kids," the group said in a statement.