Documents filed by several U.S. school boards in a class-action lawsuit against Meta and other social media platforms show the company halted internal research after finding evidence that Facebook and Instagram can harm users’ mental health, particularly among teens.
According to the documents, in a 2020 internal study code-named "Project Mercury," Meta researchers worked with the research firm Nielsen to measure the effects of a one-week break from Facebook and Instagram. The results defied the company's expectations: users who stepped away for a week reported lower levels of depression, anxiety, loneliness, and social comparison.
Meta Concealed Study Showing Facebook and Instagram Harm Mental Health
Despite the significance of the findings, Meta did not publish the study or pursue follow-up research. According to the court filings, the company internally dismissed the results, attributing them to what it called a "negative media narrative" about Meta. The documents show, however, that internal researchers told Nick Clegg, then Meta's head of global public policy, that the results were accurate. One researcher wrote in an internal message, adding a frowning emoji, "The Nielsen study does indeed show a causal effect on self-comparison." Another employee likened the company's concealment of the findings to tobacco companies hiding the risks of smoking for decades.
Even with this evidence in hand, the plaintiffs allege, Meta told Congress that it could not measure the harm its products cause teenage girls.
In response to the allegations, Meta disputed the internal study's findings. Spokesman Andy Stone told Reuters on Saturday that the study was halted not because of its results but because of what the company described as "methodological flaws" in its design. Stone said Meta has spent years improving its safety tools and protections for teens through continuous updates. "For many years, we have listened to parents, relied on research, and implemented real changes aimed at enhancing youth safety," he said.
Major Lawsuit Claims Tech Firms Concealed Risks to Children
A lawsuit filed by the law firm Motley Rice widens the scrutiny beyond Meta to include Google, TikTok, and Snapchat, accusing the companies of concealing the known risks of their products from users, educators, and parents.
The complaint alleges that the platforms allowed children under 13 to access their services despite legal restrictions, failed to address content that sexually exploits children, and deliberately sought to boost teen engagement, including inside schools. The plaintiffs describe these actions as a systematic effort to deepen dependence on the platforms.
The documents also point to alleged attempts to influence child-focused organizations. The lawsuit claims TikTok sponsored the National Parent-Teacher Association (PTA) and that company officials boasted internally that the group would support TikTok's positions in public statements, with the group's CEO prepared to offer favorable commentary to the media. Plaintiffs say this calls into question the independence of such organizations and their ability to advocate effectively for families.
Meta Faces Allegations Over Security Vulnerabilities and Harmful Content
Of the companies named in the lawsuit, the allegations against Meta are the most detailed. Internal documents show that the company's youth safety features were limited in effectiveness and rarely used, and that testing of new safety features was blocked over concerns they might hurt platform growth. One document notes that a user could engage in attempted human trafficking on the platform up to 17 times before their account was suspended, a threshold employees described as "extremely high and unacceptable." The documents also reveal that Meta knowingly used algorithms that exposed teens to harmful content in order to boost engagement.
In leaked messages from 2021, CEO Mark Zuckerberg reportedly said child safety was not a "top priority" compared with major projects such as building the metaverse. According to the suit, this prompted Nick Clegg and other executives to push for more funding for child protection efforts, but they received no clear response.
Meta spokesperson Andy Stone countered that the lawsuit relies on “cherry-picked quotes and misleading statements.” He said the company’s current policy removes accounts immediately when human trafficking is reported and clarified that Meta’s objection to the release of documents was about how they were presented, which the company considers “misleading and taken out of context.”