Meta Heads To Trial In New Mexico Over Claims It Enabled Child Exploitation On Social Platforms

Meta Platforms Inc. is set to stand trial in a New Mexico court over sweeping allegations that its social media services exposed minors to sexual exploitation and allowed harmful activity to flourish for profit.

The lawsuit, filed by New Mexico Attorney General Raúl Torrez, accuses the parent company of Facebook, Instagram, and WhatsApp of failing to adequately protect children and teenagers from online predators. According to the complaint, Meta’s platforms allegedly made it easier for adults to locate, communicate with, and exploit underage users, in some cases resulting in real-world abuse and human trafficking.

Proceedings are scheduled to begin Monday at the Santa Fe District Court, with the trial expected to span up to eight weeks. State prosecutors argue that Meta knowingly ignored systemic risks affecting young users while continuing to benefit financially from engagement-driven platform design.

Meta has strongly rejected the accusations, insisting that it has implemented extensive safety measures to protect minors. The company says it has intentionally limited engagement growth among young users and invested heavily in child safety tools, moderation systems, and reporting mechanisms designed to prevent abuse.

At the heart of the case are claims that Meta’s product architecture—particularly features such as infinite scrolling feeds and auto-playing video content—encourages compulsive use. The lawsuit alleges that these design choices increase the likelihood of addiction among children and teens, potentially contributing to mental health challenges including anxiety, depression, and self-harm.

Prosecutors further claim that internal Meta documents show company executives were aware of widespread sexual exploitation risks and the negative psychological impact on younger users. Despite this knowledge, the lawsuit argues, Meta failed to roll out fundamental safeguards such as robust age-verification systems.

The legal action stems in part from an undercover investigation dubbed “Operation MetaPhile.” During the operation, law enforcement officers created Facebook and Instagram accounts posing as children under the age of 14. According to court filings, these accounts were quickly targeted with sexually explicit content and contacted by adults soliciting similar material. The investigation ultimately led to criminal charges against three individuals.

In response, Meta has argued that it is shielded from liability under the First Amendment and Section 230 of the Communications Decency Act, which protects online platforms from being held responsible for user-generated content. The company has characterized the state's claims as exaggerated and misleading, saying they rely on selectively chosen internal documents taken out of context.

As the trial unfolds, the case is expected to draw national attention, raising broader questions about the responsibility of major technology companies to protect children in an era of algorithm-driven social media.