Two juries just put social media companies on notice
When I was a kid, in 1989, 11-year-old Jacob Wetterling went missing, kidnapped near his home in rural Minnesota. His young face was on milk cartons everywhere.
An abduction like this was so unthinkable at the time that the tragedy is imprinted on my brain, as it is on the minds of millions of others in Minnesota and nationwide. About 27 years later, his killer pleaded guilty to federal child pornography charges and confessed that he had abducted, sexually assaulted and murdered Wetterling.
Predators haven't gone away. They still lurk and target children. But with the rise of social media, the risk has expanded into a new space, one that can also foster online addiction and feed feelings of depression.
Two landmark cases confirm this.
On March 24, a jury in New Mexico ordered Meta, the parent company of Facebook and Instagram, to pay $375 million after determining that the company violated state law by misleading users about platform safety and enabling child sexual exploitation.
On March 25, a jury in Los Angeles found Meta and Google's YouTube negligent in creating features that were addictive to a young woman and caused mental distress. Meta must pay $4.2 million in combined compensatory and punitive damages, and YouTube must pay $1.8 million.
These damning verdicts serve as a warning to Meta and other platforms about the lawsuits that could lie ahead, and to parents about the dangers that lurk on the digital frontier.
Meta is in hot water for good reason
In the first case about child exploitation, jurors found that Meta violated New Mexico’s consumer protection law. Attorney General Raúl Torrez sued and accused Meta of failing to protect children from predators.
"The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety," Torrez said.
California jurors deliberated for 40 hours before reaching a verdict in the first-of-its-kind case brought by a now 20-year-old woman identified as KGM, or "Kaley," as her attorneys referred to her. She testified at trial that she became addicted to social media as a child, and she accused social media companies of creating a product as addictive as cigarettes because of features like infinite scroll, which she said led to her mental health struggles.
Meta CEO Mark Zuckerberg, testifying in that trial, often said his words were being twisted, but he also defended the company's purpose: "I think the way we should build things is to build useful services for people to connect with their family and friends and learn about the world."
But in a 2024 Senate committee hearing about the harms of social media, Zuckerberg apologized to families.
In a company statement, Google spokesperson Jose Castañeda said that the case "misunderstands YouTube, which is a responsibly built streaming platform, not a social media site."
A platform designed to keep young users hooked, while lacking strong safeguards to stop predators from targeting them, is a dangerous combination indeed. That reality isn't new; what's new is seeing accusations and lawsuits finally lead to landmark verdicts. For parents and teens already struggling with social media, this simply confirms what they've long suspected. In many ways, this moment has been a long time coming for platforms like Facebook, Instagram and YouTube.
The Meta verdicts are a warning for other platforms, parents and kids
As a parent of four, I'm interested in these cases and in the short- and long-term effects of social media on kids (and adults), so I've written about this a lot. The fact that there is now legal and financial accountability for these companies is probably both a relief and a warning to a lot of parents who struggle to implement guardrails and boundaries with their kids.
Teens are especially susceptible to spending too much time on social media, and the data already showed us what these verdicts confirmed. Older teens are online "almost constantly," an average of nearly five hours a day, practically a part-time job. Heavy social media use also contributes to anxiety and depression, especially for girls, according to research.
I wouldn't be surprised if, as a result of these verdicts, tech companies redesign features that enable anonymous contact or facilitate grooming, or if there are stricter safety defaults, especially for minors. In December, Australia banned social media for kids under 16. I doubt that would happen here in the United States, but platforms will make changes, and new policies could also force their hand.
On one hand, I'm glad that Meta and other platforms will be held accountable for design and algorithms that contribute to harming minors, whether by making them more susceptible to addiction or by exposing them to online predators. However, I do think there is enough accountability and responsibility to go around.
I am not disputing the jury's findings, and I am not suggesting that addiction or sexual exploitation of a child online by a predator is entirely a parent's or a child's fault, either. Platforms have made the harms of social media strangely difficult to police, but monitoring what our kids do and how they fill their time is also part of a parent's job.
The verdicts in these cases should not diminish a parent's responsibilities toward their child. Parents must also continue to be vigilant and know what their child is doing online.
Big Tech companies and parents (and even teens themselves) must all be proactive in protecting kids from the dangers of this new digital world, just as we protect them from harm in the real world.
Nicole Russell is an opinion columnist with USA TODAY. She lives in Texas with her four kids. Sign up for her newsletter, The Right Track, and get it delivered to your inbox.
