We live in the age of social media, and there is now an entire generation that has grown up “with a smartphone in hand.” Yet mounting evidence suggests that social platforms may pose a greater threat to children’s development than previously assumed. The latest development is a class-action lawsuit filed against Meta, the owner of Facebook and other platforms.
In the lawsuit involving more than 1,800 plaintiffs — including children and parents, school districts, and state attorneys general — documents were disclosed describing an internal research program called “Project Mercury.” As part of this initiative, Meta scientists collaborated with the research firm Nielsen on a study in which users deactivated Facebook for a week. Participants reported lower levels of depression, anxiety, loneliness, and social comparison.
Despite these findings, the company — according to the lawsuit — abandoned further research, claiming the results were skewed by an “existing media narrative against the company.” Meta employees reportedly compared the company’s policy to the tobacco industry: “we are doing the research and we know our products are bad, and yet we’re not talking about it.” These are very strong words, but if the lawsuit’s allegations are accurate, they may not be far from the truth.
Court documents further suggest that the company designed youth-safety features in ways that made them ineffective and unlikely to be used, and that tests of more restrictive safeguards were blocked out of concern that they would negatively impact user-growth metrics. One document also describes a moderation system in which a user had to be “caught” 17 times attempting human-trafficking activity before their account would be removed — a threshold employees labeled “very, very, very high.”
The documents also include allegations that the company delayed efforts to limit interactions between pedophiles and minors, and that employees working on safety were pressured to “find arguments” to justify a lack of intervention. The most shocking revelation came from 2021 messages in which Mark Zuckerberg wrote that he could not say children’s safety was his priority, “when I have a number of other areas I’m more focused on like building the metaverse.” According to the lawsuit, Zuckerberg also ignored or rejected requests from Nick Clegg, then Meta’s head of public policy, for increased funding of child-protection efforts.
The filing additionally alleges that Meta delayed the rollout of dedicated privacy settings for teenage users, even though internal data suggested that features such as default-private accounts could significantly reduce the risk of inappropriate interactions with adults. The company allegedly abandoned these protections because they could negatively affect engagement metrics, despite warnings from internal teams that young users were highly exposed to harm.
Meta — through spokesman Andy Stone — firmly denies the allegations, claiming that Project Mercury was discontinued due to methodological flaws and that the company has introduced numerous safety features for teenage users over the years. The case is scheduled for trial on January 26, 2026, in the Northern District of California.
We will be watching closely for further developments — this is a highly important case concerning the safety and mental health of underage social media users.