Meta would rather leave New Mexico than redesign its apps for children.
A bench trial in Santa Fe could force changes to Meta's algorithms, mandatory age verification, and a $3.7 billion mental health fund. Meta has responded by threatening to pull Facebook and Instagram out of the state.
In March, a New Mexico jury found that Meta, the company formerly known as Facebook, had violated the state's consumer protection law by misrepresenting how safe Facebook and Instagram are for young users. The jury imposed a $375 million penalty, the first time a state has won a trial against a major U.S. tech company over harm to children.
That was the straightforward part.
On Monday, the second phase of the case begins before Judge Bryan Biedscheid in Santa Fe, this time without a jury. Over roughly three weeks, the judge will hear what remedies New Mexico Attorney General Raúl Torrez wants for the harm the jury has already found Meta caused, and then decide what to order.
As Reuters reported on May 2, this trial could mandate changes to Facebook, Instagram, and other Meta platforms that the company has resisted for nearly a decade.
Meta's response suggests it is taking that possibility seriously. The company has signaled that if the court's orders prove unacceptable, it may withdraw Facebook and Instagram from New Mexico entirely.
What the state requests
The remedies Torrez proposes are far-reaching. According to court documents reviewed by Reuters and the Boston Globe, New Mexico is asking the court to order Meta to verify users' ages, modify its recommendation algorithm so it does not optimize for engagement among minors, disable autoplay and infinite scrolling for users under 18, suspend push notifications during school hours and overnight, and cap children's use of its platforms at 90 hours per month.
On top of the $375 million already awarded, the state is also seeking $3.7 billion to fund mental health services for teens across New Mexico.
Meta has explored, tested, or partially implemented each of these measures in some form, often preemptively or in jurisdictions where it faces greater regulatory pressure than in New Mexico. None, however, has been mandated by a U.S. court order.
If Judge Biedscheid grants even a significant portion of Torrez's requests, it would be the first time a state court has directly dictated the product design of a global social media platform.
The origins of the case
Torrez filed the lawsuit in late 2023, after an undercover operation in which his office created a fictitious Instagram account for a 13-year-old girl. As Torrez later described to CNBC, the account was flooded with images and targeted solicitations from users attempting to exploit children. The state's core argument was that this was not merely a consequence of scale but a built-in flaw of the platform's recommendation system.
During the first phase of the trial, prosecutors introduced internal Meta communications discussing the consequences of Mark Zuckerberg's 2019 decision to move Facebook Messenger to end-to-end encryption by default. The documents showed that employees believed the change would prevent Meta from reporting roughly 7.5 million instances of child sexual abuse material to law enforcement each year.
The jury, as NBC News reported, considered these communications critical in determining that Meta knowingly endangered children. The encryption rollout, presented publicly as a privacy enhancement, became one of the most damaging pieces of evidence at trial.
Meta has since faced formal charges from the European Commission for not adequately keeping underage users off its platforms under the Digital Services Act, marking the first such accusation against a mainstream social media platform.
Meta's response, laid out in pre-trial filings and a public letter cited by The Washington Post and Source New Mexico, has been striking. The company argues that some of the remedies New Mexico seeks are technically infeasible, would undermine its ability to operate consistently across markets, and could ultimately force it to withdraw Facebook and Instagram from the state.
Torrez said the threat demonstrated "how little it cares about child safety," a comment widely reported on April 30.
Whether Meta would actually follow through is harder to judge. New Mexico has roughly 2.1 million residents, a small fraction of Meta's global user base. The threat reads partly as a negotiating tactic, designed to make the judge weigh the broader consequences of any aggressive order. But it also reflects a real concern: platform-level remedies imposed in one jurisdiction can become templates for others.
More than 40 state attorneys general have filed similar lawsuits against Meta, with bellwether trials expected through 2026. In that context, New Mexico is being treated as a test case.
Meta is not entering the second phase having ignored the issue. In recent years it has rolled out numerous teen safety features, including AI systems that flag adults messaging minors who don't follow them, "take a break" reminders after extended use, private-by-default accounts for users under 16, and parental controls.