A woman in California is suing Meta and Google’s YouTube, alleging Instagram and other platforms contributed to her mental health struggles and social media addiction.
Snapchat and TikTok have already reached pretrial settlements with the woman, who was 19 years old when the lawsuit was filed and is identified only as K.G.M.
This case marks the first time that CEOs of major social media companies will testify before a jury. Experts who track the industry say the outcome could represent a pivotal moment for social media platforms.
“Cars have to have seat belts, they have to have airbags and they have to have bumpers that protect you,” said Kyle Morse, deputy executive director of The Tech Oversight Project, a nonprofit that researches and exposes tactics of Big Tech giants. “We believe that social media platforms and the features that are designed into them should equally be as safe.”
In the past, social media corporations have argued that liability for content rests with the third parties who post it, leaning on the First Amendment, which protects freedom of speech, and on Section 230 of the federal Communications Decency Act, which shields online platforms from liability for third-party content on their services.
However, in the K.G.M. trial, many of the arguments center on the intentional design choices made by social media platforms rather than on third-party content.
Eric Goldman, a professor at Santa Clara University School of Law, said this strategy makes the First Amendment and Section 230 defenses irrelevant.
“The judge has already said that many of the arguments that the plaintiffs are making are not based on third-party content,” Goldman said. “They’re based on this other thing called design choices.”
The K.G.M. case also marks the first time social media companies will be tried under product liability law, Morse said.
“This trial is a real moment of reckoning because it’s clawing back the Section 230 protections that every major tech company hides behind,” Morse said.
Morse said a key takeaway of the lawsuit is the plaintiff’s claim that social media platforms are intentionally designed to harm people.
Meta did not respond to The DePaulia’s request for comment. Tech experts have noted, however, that in 2024 Meta launched Instagram Teen Accounts, an initiative to provide protections for teens using the app.
“These types of controls can be helpful, but I know some very real examples where they did not protect kids from predators and … cyberbullying,” said Lindsay Doyle, a mental health therapist and mother of four from East Lansing, Michigan.
She said, for instance, that her own son has been in a situation where teen account controls were ineffective.
Doyle said that while she thinks social media companies do care about the safety of kids, their primary goal is likely to keep children using these apps because it makes them money.
Marc Berkman, CEO of the nonprofit Organization for Social Media Safety, agreed that features designed to keep kids swiping for hours on these social media platforms show an intent to maximize engagement and revenue.
Berkman, an advocate for accountability, also argued that the companies have ignored clear risks by “pushing design … for the objective of maximizing revenue and profit at the expense of our children’s safety.”
Jury selection began Jan. 27, and opening statements are expected to follow soon.
Goldman said the K.G.M. trial will amount to a “battle of experts” with each side trying to persuade the jury. He added that a strong verdict in favor of K.G.M. would signal the public’s support for regulating social media platforms.
“That will only add fuel to the fire of the efforts to regulate social media,” Goldman said.