Meta Faces Scrutiny in Landmark Trial Over Child Safety
Santa Fe, N.M. — A high-stakes trial in New Mexico is examining the extent to which Meta, the parent company of Facebook, Instagram, and WhatsApp, understood and addressed the risks its platforms pose to children. State prosecutors allege Meta failed to adequately disclose these risks, including potential mental health problems and exposure to sexual exploitation. The trial, which began February 9th, is among the first in a growing wave of lawsuits against the social media giant.
Allegations of Prioritizing Profit Over Safety
New Mexico’s Attorney General, Raúl Torrez, argues that Meta knowingly prioritized user engagement and profit over the safety of young people. Evidence presented includes internal company documents, such as emails between Meta executives, flagging urgent concerns about exploitation on Facebook and Instagram. One email, read in court, stated that Instagram had turned into a “leading two-sided marketplace for human trafficking.”
Meta’s Defense
Meta’s legal team contends that the company has implemented safety measures, including protections for teenagers and content filtering. They acknowledge, however, that some harmful content still slips through these safeguards. Meta CEO Mark Zuckerberg, in a deposition played for the jury, stated that “safety is extremely important” and that the company stopped directly linking business performance goals to time spent on its platforms in 2017.
Potential Outcomes and Legal Arguments
If the jury finds Meta violated New Mexico’s consumer protection laws, the company could face penalties amounting to billions of dollars. Meta disputes that calculation and has urged the court to adopt a different method for assessing penalties.
The case centers on two counts of violating the New Mexico Unfair Trade Practices Act, which protects consumers from deceptive practices. A third count was dropped by the judge. Prosecutors are attempting to sidestep legal protections, such as Section 230 of the Communications Decency Act, which often shields tech companies from liability for user-generated content.
A second phase of the trial, scheduled for May before a judge without a jury, will determine whether Meta created a public nuisance with its platforms and if the company should fund programs to address the resulting harm. Prosecutors allege Meta’s platforms serve as a “breeding ground” for predators and negatively impact the mental health of teenagers.
Broader Implications
This trial is one of several legal battles Meta is facing regarding the safety of its platforms. A California jury is currently deliberating in a similar case, potentially setting a precedent for thousands of other lawsuits. The outcomes of these cases could lead to increased restrictions on smartphone use in schools and a broader reckoning for social media companies.