Meta would prefer to exit New Mexico rather than redesign its applications for children.
A bench trial in Santa Fe could force changes to Meta's algorithms, require age verification, and establish a $3.7 billion mental health fund. Meta has responded by threatening to withdraw Facebook and Instagram from the state if it finds the requirements unacceptable.
In March, a New Mexico jury reached a groundbreaking verdict, finding that Meta, formerly known as Facebook, had violated the state’s consumer protection law by misrepresenting the safety of its platforms for young users. The jury imposed a $375 million penalty, the first victory for a state in a trial against a major US tech company over child endangerment.
The next phase, which begins on Monday before Judge Bryan Biedscheid in Santa Fe, will not involve a jury. Over roughly three weeks, the judge will hear what New Mexico Attorney General Raúl Torrez wants from Meta as a remedy for the harm the jury has already established, and then rule. As Reuters put it, this trial could demand changes to Facebook, Instagram, and other Meta platforms that the company has resisted for nearly a decade.
Meta's threat suggests it is taking the potential outcome seriously: the company has signaled that it may pull Facebook and Instagram from New Mexico if the court's orders become untenable.
What New Mexico is requesting is not merely symbolic. Court documents reviewed by Reuters and the Boston Globe show that the state is asking the court to require Meta to verify users' ages, alter its recommendation algorithm so it does not optimize for engagement with minors, disable autoplay and infinite scrolling for users under 18, suspend push notifications during school hours and at night, and cap children's usage at 90 hours per month.
Additionally, the state is requesting $3.7 billion to support teen mental health services throughout New Mexico, on top of the $375 million already awarded. Each of these proposed measures has been studied, lobbied over, and partially implemented by Meta in some form, often in response to pressure in markets the company considers more threatening than New Mexico. To date, however, no US court has enforced any of them.
If Judge Biedscheid were to grant even a portion of these requests, it would be unprecedented for a state court to actively alter the product specifications of a global social media platform.
The case predates the verdict: Torrez filed the lawsuit in late 2023, after his office ran an undercover operation in which investigators set up a fake Instagram profile for a 13-year-old girl. He later said the account was immediately bombarded with images and targeted solicitations from users attempting to exploit children. The state contends this was not merely a byproduct of the platform's scale but a structural feature of its recommendation system.
During the first phase of the trial, prosecutors presented evidence of internal communications from Meta discussing the implications of Mark Zuckerberg’s 2019 decision to default to end-to-end encryption for Facebook Messenger. Documents indicated that this change would hinder Meta’s ability to inform law enforcement about an estimated 7.5 million reports of child sexual abuse materials annually. According to NBC News, the jury deemed these communications pivotal in concluding that Meta knowingly endangered children, with the encryption decision being a particularly damaging piece of evidence during the trial.
Subsequently, the European Commission formally accused Meta under the Digital Services Act of failing to keep underage users off its platforms, the first such accusation against a mainstream social platform.
Meta's response, laid out in pre-trial filings and a public letter reported by The Washington Post and Source New Mexico, has been forceful. The company argues that some of New Mexico's proposed remedies are technically infeasible, would fracture its consistent operation across markets, and could ultimately force it to withdraw Facebook and Instagram from the state. Torrez described the threat as Meta “showing the world how little it cares about child safety,” a remark that drew widespread media attention on April 30.
Whether Meta would actually follow through on its threat is harder to judge. With roughly 2.1 million residents, New Mexico accounts for only a tiny fraction of Meta's global user base. The threat may be a negotiating tactic, meant to push the judge to weigh the broader consequences of any stringent order. It also reflects Meta's concern that platform-level changes in one jurisdiction could set a precedent for others.
Over 40 state attorneys general have initiated similar lawsuits against Meta, with significant trials scheduled through 2026. In this context, New Mexico is being treated as a test case.
Meta enters the second phase of the trial not from a position of inaction. In recent years it has rolled out numerous teen-safety features, including AI systems that monitor adult interactions with minors, notifications for excessive use, default privacy settings for users under 16, parental control tools, and advertising restrictions for teens. Many of these measures came in response to regulatory pressure from the EU, which has now implemented an age verification