Seattle attorney Matt Bergman (Image: Courtesy Social Media Victims Law Center Blog)

In a landmark, first-of-its-kind bellwether verdict on March 25, a California jury held Meta Platforms and Google legally responsible for harm a young plaintiff suffered from childhood use of their platforms, Instagram and YouTube. 

The jurors found that specific design features, such as endless scrolling and algorithm-driven content, were a substantial factor in a young woman’s depression and anxiety after years of use that began in childhood. The verdict includes $6 million in damages, with Meta responsible for about 70%, and marks a significant shift in how courts may evaluate the impact of social media on minors.

Learn more about social media impacts on kids: Check out our story “‘Can’t Look Away’ film puts spotlight on kids, mental health & big tech.” Watch the film for $7 online at Jolt.Film.

The case (K.G.M. v. Meta Platforms Inc. & YouTube LLC) was tried in Los Angeles Superior Court as part of California’s coordinated social media litigation. Meta and YouTube are headquartered in California. The 20-year-old plaintiff, however, was represented by Seattle’s Social Media Victims Law Center (SMVLC). 

SMVLC founder Matt Bergman praised the decision: “It is a watershed moment in the culmination of a four-year legal battle to hold these companies accountable for the carnage that social media is inflicting on young people. 

“It is the first time ever that a jury has even considered whether these companies can be held liable, and the verdict is a milestone in the quest for accountability,” Bergman said.

In bringing the case, K.G.M.’s attorneys argued that the platforms were engineered to maximize engagement through features such as autoplay, endless scrolling, and constant notifications. Among other impacts of her use of the platforms, Bergman said K.G.M. suffered mental distress and body dysmorphia. Meta and Google disputed that claim, arguing that adolescent mental health is shaped by many factors and that their platforms include safety tools and parental controls. 

What set this case apart was its focus on product design, not user-generated content, helping it sidestep the broad immunity companies often invoke under Section 230 of the 1996 Communications Decency Act, which provides immunity to websites, platforms, and users from liability for third-party content. 

Jurors were asked not whether social media caused harm outright, but whether the way these platforms are built contributed meaningfully to it. They said yes. 

Bergman said the verdict opens the door to hitting companies where they care most: money.

“We have learned that appealing to these companies’ moral sensibilities doesn’t work. They continue to profit too much to be swayed by moral sensibilities,” said Bergman. “Bad press doesn’t seem to bother them. They just make so much money by exploiting young people. So we think that when they have to start paying jury verdicts and have to start talking and explaining to juries why their profits are more important than our kids’ safety, their behavior will start to change.”

The implications reach far beyond one plaintiff. The Los Angeles case is a bellwether in a broader wave of social-media harm litigation, while a separate New Mexico case ended a day earlier with a jury ordering Meta to pay $375 million in civil penalties under that state’s consumer-protection law. Together, the rulings suggest that tech companies’ long-standing insulation from liability is facing new pressure in court. 

Appeals are expected. As a “bellwether” case, the verdict is tied to thousands of similar lawsuits filed by parents and school districts nationwide, all asking the same question: Are social media platforms products with known risks to children?
