The landmark verdict comes after a nearly seven-week trial, and as jurors in a California federal court have spent more than a week deliberating whether Meta and YouTube should be held liable in a similar case.
New Mexico jurors sided with state prosecutors who argued that Meta — which owns Instagram, Facebook and WhatsApp — prioritized profits over security, and violated parts of the state’s Unfair Practices Act.
The jury agreed with the allegations that Meta made false or misleading statements and also agreed that Meta engaged in “unconscionable” business practices that took advantage of children’s vulnerabilities and inexperience.
Jurors found there were thousands of violations, imposing a separate fine for each that together totaled $375 million. That’s less than a fifth of what prosecutors were seeking.
Meta is worth about $1.5 trillion, and the company’s stock rose 5% in early after-hours trading after the decision, a sign that shareholders had largely shrugged off the news.
Juror Linda Payton, 38, said the jury settled on the estimated number of teens affected by Meta’s platforms, while opting for the maximum penalty per violation. With a maximum fine of $5,000 for each violation, she said she felt every child deserved the maximum amount.
Meta will not be forced to immediately change its practices. It will be up to a judge — not a jury — to determine whether Meta’s social media platforms created a public nuisance and whether the company should pay to remedy the harms. That second phase of the trial is set for May.
A spokesperson for Meta said the company disagrees with the decision and will appeal.
“We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” the spokesperson said. “We will continue to defend ourselves vigorously, and we are confident in our record of protecting teens online.”
Meta’s lawyers said the company discloses risks and strives to eliminate harmful content and experiences, while acknowledging that some bad content does fall through its safety net.
The New Mexico case was one of the first to reach trial in a wave of litigation related to social media platforms and their impact on children.
More than 40 state attorneys general have filed a lawsuit against Meta, claiming it is knowingly contributing to a mental health crisis among youth by making Instagram and Facebook features addictive.
“The Meta house of cards is beginning to collapse,” said Sacha Howarth, executive director of the watchdog group The Tech Oversight Project. “For years, it has been abundantly clear that Meta has failed to prevent sexual predators from turning online interactions into real-world harm.”
Howarth pointed to whistleblowers such as Arturo Béjar, as well as unsealed documents and other evidence, which she said paint a dire picture.
The New Mexico case was based on an undercover investigation where agents created social media accounts impersonating children to document sexual solicitations and Meta’s response.
The lawsuit, filed in 2023 by New Mexico Atty. Gen. Raúl Torrez, also alleged that Meta did not fully disclose or address the dangers of social media addiction. Meta disputes that social media addiction exists, but company officials at trial acknowledged “problematic use” and said they wanted people to feel good about the time they spent on Meta’s platforms.
“The evidence shows not only that Meta invests in security because it is the right thing to do but also because it is good for business,” Meta attorney Kevin Huff told jurors in closing arguments. “Meta designs its apps to help people connect with friends and family, not to try to connect predators.”
Tech companies are protected from liability for content posted on their social media platforms under Section 230 of the U.S. Communications Decency Act, a 30-year-old provision, as well as by the First Amendment.
New Mexico prosecutors say Meta must still account for its role in pushing content through complex algorithms that disseminate material harmful to children.
“We know the output meant engagement and time spent by children,” said prosecuting attorney Linda Singer. “The decisions Meta made had a deeply negative impact on children.”
What the New Mexico jury reviewed
The New Mexico trial examined Meta’s internal correspondence and reports related to child safety. Jurors also heard testimony from Meta executives, platform engineers, whistleblowers who left the company, psychiatrists and technical security consultants.
The jury also heard testimony from local public school teachers who grappled with disruptions related to social media, including sextortion schemes targeting children.
In reaching the verdict, the jury considered whether social media users were misled by specific statements about platform security by Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri and Meta global security head Antigone Davis.
Jurors also considered Meta’s failure to enforce restrictions on users under the age of 13, the role of its algorithms in prioritizing sensationalist or harmful content, and the prevalence of social media content about teen suicide.
ParentsSOS, a coalition of families who have lost children to social media-related harm, called the decision “a watershed moment.”
“We parents who have experienced the unimaginable – the death of a child due to the harms of social media – applaud this rare and important milestone in the years-long fight to hold Big Tech accountable for the dangers their products pose to our children,” the group said in a statement.
Lee writes for the Associated Press.