SANTA FE, N.M. (KRQE) — In a closely watched trial in Santa Fe, New Mexico, the state argues that big social platforms should be treated as a public nuisance, while the company behind Instagram and Facebook pushes back hard. The state rested its case on Wednesday, and attorneys for Meta mounted a defense that aims to keep the company from being labeled responsible for the harms the state says flow from its products. The dispute could reshape how online speech and corporate responsibility are treated in state courts across the country.
The lawsuit accuses Meta of designing its apps in ways that nudge users toward harmful behavior and of profiting from engagement even when that meant exposing young users to risk. Attorneys for the state walked the court through internal research and alleged patterns, trying to show that corporate choices made harm more likely. For the state, the narrative is about accountability: platforms built features that increased time online, and that intensification, they say, created real-world harms.
On the other side, Meta rejects the public nuisance label and argues the law was never meant for this kind of case. The company paints a picture of platforms as tools used by billions of people for many different reasons, not as single-purpose machines that inevitably cause harm. Defense attorneys told the court that holding a tech company liable here would upend settled legal principles and chill online innovation.
Legal experts watching the trial point out that public nuisance law traditionally targets activities that interfere with public rights like health or safety in a direct, localized way. Applying that doctrine to a global social network is a leap, they say, because content decisions and user interactions cross lines that nuisance law never contemplated. That gap — between old legal categories and new tech realities — is what both sides are fighting over in court.
Evidence the state presented included internal reports and studies about teen mental health and engagement with social apps, framed to suggest corporate awareness of the risks. Lawyers for Meta countered with context, arguing that many factors drive behavior and that correlation is not causation. The company pointed to its investments in safety features and content moderation as part of a broader effort to address harms.
Courtroom strategy matters here. The state is aiming to persuade the court that the company's product design decisions created a foreseeable risk to public health and safety. Meta is focused on legal limits, arguing that courtroom remedies should not rewrite product design choices or substitute for legislative policymaking. Both sides are aware that a verdict could reverberate far beyond New Mexico.
If a court were to accept the public nuisance claim against Meta, the remedies could range from monetary damages to injunctive relief that changes how features work. That outcome raises practical questions: who decides what product changes are required, how compliance would be measured, and whether local courts are equipped to oversee a multinational platform. Skeptics worry about judicial micromanagement of complex technical systems.
The case also stokes broader policy debates about the balance between platform responsibility and free expression. Critics of big tech often want stronger consequences for design choices tied to harm, while defenders caution against overbroad legal theories that might restrict speech or innovation. Both sides frame the stakes as not just about one company, but about how society will regulate technology going forward.
Industry groups are watching closely because a loss for Meta could invite similar lawsuits nationwide, using public nuisance as a template. Lawmakers and regulators in other states have floated parallel approaches, and courts in other jurisdictions could look to this trial for precedent. That prospect has drawn close attention from legal scholars and civil society organizations concerned with how to craft future rules without unintended fallout.
For New Mexico, the trial is also a test of the state's willingness to push novel legal theories in pursuit of public safety aims. The state's attorneys argue they are filling a regulatory gap, seeking accountability where they believe legislative action has fallen short. Opponents say such big-picture policy choices belong to legislatures, not to judges or juries applying decades-old tort law in new ways.
No matter the outcome, the litigation is likely to accelerate conversations about what a modern, workable framework for platform accountability should look like. Lawmakers, judges, technology companies, parents, and educators will all have to grapple with the same reality: digital platforms shape behavior at scale, and figuring out who bears responsibility is both complicated and urgent. The courtroom phase in Santa Fe is just one chapter in a national debate that will continue long after the trial ends.