New Mexico jury finds Meta violated consumer protection law over child exploitation claims.

In a landmark decision that could reverberate throughout the technology industry, a New Mexico jury on Tuesday found social media behemoth Meta Platforms Inc. liable for harming children’s mental health and violating the state’s consumer protection laws. After a nearly seven-week trial, the jurors sided with state prosecutors, determining that Meta — the parent company of widely used platforms Instagram, Facebook, and WhatsApp — deliberately prioritized corporate profits over the safety and well-being of its youngest users. The verdict levies a penalty of $375 million against the company, reflecting thousands of individual violations of the state’s Unfair Practices Act.

The crux of the prosecution’s argument centered on accusations that Meta knowingly concealed critical information regarding the pervasive dangers of child sexual exploitation on its platforms and the detrimental effects on child mental health. The jury concurred with these allegations, concluding that Meta made false or misleading statements regarding the safety of its services. Furthermore, the jurors found that Meta engaged in "unconscionable" trade practices, leveraging the vulnerabilities and inexperience inherent in a child user base for its own commercial gain. This finding is particularly significant, as it sidesteps traditional legal protections for tech companies by focusing on Meta’s own conduct and misrepresentations rather than solely on third-party content.

Responding to the verdict, Meta spokesperson Andy Stone issued a statement to CBS News, saying, "We respectfully disagree with the verdict and will appeal." Stone reiterated Meta’s stance, asserting, "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online." The company’s immediate intent to appeal signals a prolonged legal battle, but the initial jury decision marks a significant setback for the tech giant.

This New Mexico case is one of the first trials to reach a verdict in a burgeoning wave of litigation targeting social media platforms over their impact on children and adolescents. The verdict arrives amid increasing societal scrutiny of Big Tech, legislative efforts to impose more restrictions on smartphone use among minors, and a growing public health discourse around digital well-being. Lawsuits against Meta and other social media companies are escalating, with numerous school districts and attorneys general across the nation seeking accountability for what they describe as a youth mental health crisis exacerbated by addictive platform designs.

The trial, which commenced on February 9, provided an unprecedented look into Meta’s internal workings. State prosecutors, led by New Mexico Attorney General Raúl Torrez, presented compelling evidence, including findings from a state undercover investigation. Agents created social media accounts posing as children, meticulously documenting instances of sexual solicitations and Meta’s often inadequate or delayed responses. This direct evidence of harmful content and the company’s handling of it proved pivotal in demonstrating Meta’s alleged failures. The lawsuit, initially filed by Torrez in 2023, also asserted that Meta had not fully disclosed or adequately addressed the dangers associated with social media addiction. While Meta executives have stopped short of acknowledging "addiction," they have conceded the existence of "problematic use" and maintained that their aim is for users to have positive experiences on their platforms.

During closing arguments, Meta attorney Kevin Huff argued, "Evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business. Meta designs its apps to help people connect with friends and family, not to try to connect predators." This defense sought to frame the company’s actions as responsible and aligned with user well-being. Historically, tech companies have largely been shielded from liability for user-generated content under Section 230 of the U.S. Communications Decency Act, a roughly 30-year-old provision, alongside First Amendment protections. New Mexico prosecutors navigated around these protections by focusing on Meta’s own active role in designing algorithms that allegedly proliferate harmful content, and on its own misleading statements about platform safety, rather than on the content itself.

Prosecution attorney Linda Singer underscored this point, stating, "We know the output is meant to be engagement and time spent for kids. That choice that Meta made has profound negative impacts on kids." This argument suggested that Meta’s core business model, driven by maximizing user engagement through complex algorithms, inherently creates an environment where harmful content can thrive and reach vulnerable young users.

The New Mexico verdict is just one piece of a larger legal puzzle for Meta. In a federal court in Southern California, another jury has been deliberating for over a week in a separate bellwether trial, weighing whether Meta and YouTube should be held liable for harms caused to children on their platforms. Bellwether cases are crucial as their outcomes can significantly influence the trajectory and settlement of thousands of similar lawsuits. Adding to the pressure, Meta CEO Mark Zuckerberg himself testified in the Los Angeles trial last month, acknowledging the difficulty of enforcing the under-13 age ban on Instagram, noting that "a meaningful number of people…lie about their age to use our services." This admission highlights a systemic challenge for platforms designed for broad public use.

Moreover, more than 40 state attorneys general have filed coordinated lawsuits against Meta, alleging that the company is a direct contributor to a burgeoning mental health crisis among young people. These lawsuits specifically target Meta’s deliberate design of Instagram and Facebook features, which are argued to be inherently addictive and harmful to developing minds.

The Santa Fe County jury, drawn from a politically progressive state capital city, carefully considered a raft of Meta’s internal correspondence, reports related to child safety, and extensive testimony. Jurors heard from Meta executives, platform engineers, former employees turned whistleblowers, psychiatric experts, and tech-safety consultants. Local public school educators also provided poignant testimony, detailing the daily struggles with disruptions linked to social media, including harrowing sextortion schemes targeting children within their communities.

Chief Deputy Attorney General James Grayson emphasized the core issue in his closing arguments: "What this case is about is one of the biggest tech companies in the world taking advantage of New Mexico teens." The jury meticulously reviewed a checklist of allegations from prosecutors. These included Meta’s alleged failure to disclose what it knew about problems with enforcing its age ban for users under 13, the prevalence of social media content related to teen suicide, and the pivotal role of Meta’s algorithms in prioritizing sensational or harmful content to maximize engagement. They also considered whether specific statements about platform safety made by high-ranking Meta executives—including Zuckerberg, Instagram head Adam Mosseri, and Meta global head of safety Antigone Davis—were false or misleading.

Looking ahead, a second phase of the New Mexico trial is anticipated, possibly in May. This phase, which will be heard by a judge without a jury, will determine whether Meta created a public nuisance and, if so, what remedies might be ordered. Such remedies could range from mandating significant changes in platform design and safety protocols to requiring Meta to fund mental health support and digital literacy programs for children. Beyond the financial penalty, the verdict sets a precedent, signaling a growing legal appetite to hold powerful tech companies accountable for the real-world consequences of their platforms, especially for the most vulnerable segments of their user base.
