# Meta Fights to Curb Evidence in Youth Safety Lawsuit
Meta faces mounting legal pressure as newly unsealed court filings reveal damning internal evidence about the company's knowledge of child safety risks on Instagram and Facebook. The tech giant is now fighting to limit the scope of evidence presented in unprecedented multidistrict litigation involving over 1,800 plaintiffs, including children, parents, school districts, and state attorneys general. These court battles will determine whether Meta can be held accountable for allegedly prioritizing engagement and profit over the protection of young users.
## Unsealed Evidence Reveals Meta's Internal Knowledge of Harms
The recently unsealed court filings paint a stark picture of Meta's internal awareness regarding serious risks to minors on its platforms[1]. According to the brief filed in the Northern District of California, Meta was aware that millions of adult strangers were contacting minors on its sites, that its products exacerbated mental health issues in teens, and that content related to eating disorders, suicide, and child sexual abuse was frequently detected, yet rarely removed[1].
The evidence suggests Meta had specific knowledge of predatory behavior on Instagram. In 2019, 3.5 million profiles engaged in "inappropriate interactions with children" via Instagram DMs, and by 2022, Instagram recommended 1.4 million teens to potential predators in a single day[2]. Internal surveys found that 13 percent of 13–15‑year‑olds received unwanted sexual advances on a weekly basis[2]. Despite these alarming statistics, Meta allegedly failed to disclose these harms to the public or to Congress[1].
Meta employees proposed multiple ways to mitigate these harms, but were repeatedly blocked by executives who feared that new safety features would hamper teen engagement or user growth[1]. One key recommendation involved defaulting teen accounts to private, which Meta's own research indicated would have prevented 5.4 million unwanted direct messages daily[2].
## The Sextortion Crisis and Tragic Consequences
One of the most troubling aspects of the litigation involves Instagram's role in sextortion schemes targeting minors. Court filings reveal that Meta understood Instagram's "Accounts You May Follow" feature was actively connecting adult strangers to children, potentially exposing millions to adult groomers worldwide[2].
These cases have resulted in tragic outcomes. In one documented case, a teenager was targeted by a predator posing as a young girl on Instagram, manipulated into sharing compromising photos, and then subjected to extortion demands. Overwhelmed by the situation, the teen died by suicide in August 2024[2]. The Social Media Victims Law Center argues that had Meta implemented its own internal safety recommendations, countless lives could have been saved[2].
## Meta's Defense Strategy and Legal Arguments
Meta has vigorously contested the allegations, arguing that plaintiffs' lawyers have selectively cited internal documents to construct a misleading narrative[3]. The company maintains that it has consistently put teen safety ahead of growth over more than a decade[3]. Meta points to specific safety measures it has implemented, including making all teen accounts private by default and allowing parents to place time restrictions on Instagram usage[3].
The company also highlights its cooperation with law enforcement. In 2024, Meta received over 9,000 emergency requests from US authorities and resolved them within an average of 67 minutes—responding even more quickly for cases involving child safety and suicide[3]. Additionally, Meta works with the Tech Coalition through the Lantern program, which allows participating companies to share signals about predatory accounts[3].
However, federal judges have ruled against Meta's efforts to dismiss key claims. In October 2024, U.S. District Judge Yvonne Gonzalez Rogers ruled that Meta must face lawsuits filed by state attorneys general, rejecting the company's argument that the alleged harms to children were not concrete and substantial[4]. The judge did dismiss some claims under Section 230 of the Communications Decency Act, which protects certain platform features, but allowed other claims to proceed[4].
## The Broader Implications for Social Media Accountability
The litigation represents an unprecedented challenge to social media companies' business models. More than 1,800 plaintiffs allege that Meta, along with TikTok, Snapchat, and YouTube, "relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children's mental and physical health"[1].
The core allegations extend beyond child safety to include claims that Meta deliberately designed addictive features targeting minors[4]. Plaintiffs argue that Meta's own internal research revealed troubling connections between platform use and rising rates of depression, anxiety, eating disorders, and self-harm among teens and young adults, yet meaningful safeguards were not implemented[5].
State attorneys general have taken the matter seriously, with dozens filing lawsuits in the federal multidistrict litigation alleging that Meta engaged in efforts "to misrepresent, conceal, and downplay the impact of [its] features on young users' mental and physical health"[4]. Recent public challenges to Meta's safety claims have emerged, including criticism from the New Mexico Attorney General over Instagram's teen "PG-13" rating system, which critics call misleading and insufficient[5].
## Frequently Asked Questions
### What specific harms did Meta allegedly know about but fail to address?
According to unsealed court filings, Meta was aware that millions of adult strangers were contacting minors on its platforms, that its products exacerbated mental health issues in teens, and that content related to eating disorders, suicide, and child sexual abuse was frequently detected yet rarely removed[1]. The company also knew that 3.5 million profiles were engaged in inappropriate interactions with children via Instagram direct messages as early as 2019[2].
### How many plaintiffs are involved in the Meta lawsuits?
More than 1,800 plaintiffs have joined together in the multidistrict litigation, including children, parents, school districts, and state attorneys general[1]. These plaintiffs are suing Meta and other social media companies, including TikTok, Snapchat, and YouTube.
### What is the "Accounts You May Follow" feature and why is it problematic?
According to court filings, Instagram's "Accounts You May Follow" feature actively connected adult strangers to children, potentially exposing millions of children to adult groomers worldwide[2]. Internal documents show that by 2022, Instagram recommended 1.4 million teens to potential predators in a single day[2].
### Has Meta won any legal victories in these cases?
Meta has had limited success. While Judge Yvonne Gonzalez Rogers dismissed some claims under Section 230 of the Communications Decency Act—which protects certain platform features like infinite scroll and displaying likes—she allowed the majority of claims to proceed[4]. The judge ruled that Meta must face lawsuits filed by state attorneys general and school districts.
### What safety measures has Meta implemented in response to these lawsuits?
Meta has made teen accounts private by default and allows parents to place time restrictions on Instagram usage, limiting use to 15 minutes a day or blocking access during school or nighttime hours[1][3]. The company also cooperates with law enforcement and participates in the Lantern program, which shares information about predatory accounts across tech companies[3].
### What role did sextortion play in the Meta litigation?
Sextortion cases have become a focal point of the litigation, with documented instances where predators posed as young girls on Instagram to manipulate teens into sharing compromising photos, then demanded money under threat of exposure[2]. At least one documented case resulted in a teen's suicide, prompting wrongful death lawsuits and highlighting the platform's vulnerability to predatory schemes[2][5].
🔄 Updated: 1/22/2026, 6:50:51 PM
**LIVE NEWS UPDATE: Meta's Technical Defenses in Youth Safety Lawsuit**
Meta is aggressively challenging evidence in the ongoing youth safety lawsuits, arguing that plaintiffs' lawyers have "selectively cited internal documents" with "cherry-picked quotes and snippets of conversations taken out of context," and is seeking dismissal of claims involving addictive features such as infinite scroll, visual filters that allegedly promote body dysmorphia, and the lack of age verification or parental controls[3][1][2]. Courts, however, have rejected Meta's Section 230 immunity arguments for alleged product defects such as absent usage limits, ineffective parental notifications, and manipulative content recommendations that exploit youth brain development, allowing those claims to proceed while dismissing some claims tied to user-generated content[4]. Implications include potential mandates for stricter age verification.
🔄 Updated: 1/22/2026, 7:00:52 PM
**LIVE NEWS UPDATE: Meta Battles to Exclude Key Evidence in Youth Safety Lawsuits**
Meta is aggressively challenging the admissibility of internal documents in the ongoing federal youth safety lawsuits, arguing in its January 2026 defense that plaintiffs' lawyers have "cherry-picked quotes and snippets of conversations taken out of context" to falsely claim its platforms prioritize growth over teen well-being[3]. The surviving claims spotlight alleged **design defects** such as absent age verification, inadequate parental controls, infinite scroll, and manipulative filters promoting body dysmorphia, features U.S. District Judge Yvonne Gonzalez Rogers ruled are not protected by Section 230, potentially forcing Meta to deploy costly safeguards such as usage limits and predator-reporting algorithms.
🔄 Updated: 1/22/2026, 7:11:06 PM
**NEWS UPDATE: Public outrage intensifies as Meta seeks to suppress evidence in youth safety lawsuit.** Parents and advocates slammed Meta's bid to block teen suicide stories, mental health research, and its own surveys showing 13% of 13-15-year-olds facing weekly sexual advances, with Social Media Victims Law Center attorney Matthew P. Bergman declaring, “Meta knew Instagram was a hunting ground for predators, yet chose to protect engagement metrics over children’s lives.”[1] New Mexico AG Raúl Torrez and bipartisan lawmakers decried the move amid lawsuits citing 3.5 million inappropriate child interactions in 2019 and 1.4 million daily teen recommendations to predators in 2022, demanding accountability over profits.[2]
🔄 Updated: 1/22/2026, 7:21:05 PM
Meta is attempting to block extensive evidence in its New Mexico child safety trial, requesting the court exclude research on social media's mental health impact, teen suicide cases, the company's finances, past privacy violations, and CEO Mark Zuckerberg's personal history[1]. Legal experts told Wired that Meta's effort to suppress information, including data about its AI chatbots and surveys documenting inappropriate content on its platforms, is unusually broad, though the company argues the material is irrelevant or could unfairly prejudice the jury[1]. The trial, set to begin February 2, represents the first state-level case of its kind, with New Mexico Attorney General Raúl Torrez accusing Meta of failing to protect minors from online predators, trafficking, and sexual abuse.
🔄 Updated: 1/22/2026, 7:31:05 PM
Meta is attempting to block extensive evidence in an upcoming New Mexico child safety trial scheduled to begin February 2, seeking to exclude research on social media's mental health effects, references to teen suicides, the surgeon general's public health warning, and even details about CEO Mark Zuckerberg's college years.[3] Legal experts told Wired that Meta's efforts to suppress information are unusually broad, though the company argues the excluded evidence is irrelevant or could unfairly prejudice the jury.[3] This case marks the first state-level trial of its kind, with New Mexico Attorney General Raúl Torrez accusing Meta of failing to protect minors from online predators, trafficking, and sexual abuse.
🔄 Updated: 1/22/2026, 7:41:08 PM
**NEWS UPDATE: Public Outrage Mounts as Meta Seeks to Suppress Youth Safety Evidence**
Consumer advocates and families devastated by Instagram-related sextortion suicides are fiercely criticizing Meta's bid to exclude damning evidence from its upcoming New Mexico child safety trial, including teen suicide stories and its own surveys showing 13% of 13-15-year-olds facing weekly unwanted sexual advances[3][4]. Social Media Victims Law Center founding attorney Matthew P. Bergman declared, “Meta knew Instagram was a hunting ground for predators, yet chose to protect engagement metrics over children’s lives,” spotlighting statistics such as 3.5 million profiles engaging in inappropriate interactions with children in 2019 and 1.4 million teens recommended to potential predators in a single day in 2022.
🔄 Updated: 1/22/2026, 7:51:11 PM
Meta is attempting to exclude substantial evidence from an upcoming New Mexico child safety trial scheduled to begin February 2, seeking to block research on social media's mental health impact, teen suicide stories, the company's financial records, past privacy violations, and mentions of CEO Mark Zuckerberg's college years.[1] Legal experts told Wired that Meta's effort to suppress information is unusually broad, including requests to prevent discussion of former US Surgeon General Vivek Murthy's public health warning on youth mental health and surveys about inappropriate content on its platforms; Meta argues such evidence is irrelevant or could unfairly prejudice the jury.[1] The New Mexico Attorney General filed the lawsuit in late 2023, accusing Meta of failing to protect minors from online predators, trafficking, and sexual abuse.
🔄 Updated: 1/22/2026, 8:01:16 PM
**LIVE NEWS UPDATE: Meta Platforms Stock Slides Amid Youth Safety Lawsuit Battle**
Meta Platforms (NASDAQ: META) shares traded at $615.52 as of January 22, 2026, after closing at $612.96 on January 21 following a 1.46% gain that day, leaving the stock well below its January 6 high of $660.62[3][5][6]. Investors appear wary of Meta's aggressive courtroom push to suppress evidence in the ongoing youth safety lawsuit, adding to pressure from prior Q3 concerns over plans for more than $100 billion in AI capital expenditure in 2026.
🔄 Updated: 1/22/2026, 8:11:18 PM
**LIVE UPDATE: Meta's Global Legal Battle Over Youth Safety Evidence Intensifies**
Meta's aggressive push to exclude youth mental health research, teen suicide stories, and CEO Mark Zuckerberg's Harvard history from its February 2 New Mexico trial, in which the company is accused of enabling predator access to minors, threatens to set precedents that could ripple through dozens of pending U.S. cases and beyond, including a 2023 multi-state lawsuit by more than three dozen states and a D.C. Attorney General suit alleging "long-lasting psychological damage" from addictive features.[1][2][4][7] Internationally, no direct foreign responses have been reported, but the case echoes global scrutiny as Meta defends its teen protections, such as private accounts by default and its resolution of more than 9,000 emergency requests from US authorities in 2024.
🔄 Updated: 1/22/2026, 8:21:20 PM
**NEWS UPDATE: Meta Fights to Curb Evidence in Youth Safety Lawsuit**
Meta is aggressively challenging evidence in the ongoing youth safety lawsuits by arguing that internal documents are cherry-picked and misleading, emphasizing instead its technical safeguards such as private-by-default teen accounts, AI protections against self-harm prompts, and rapid response to more than 9,000 US emergency requests in 2024 (resolved in 67 minutes on average)[3]. Courts have rejected Meta's motions to dismiss key claims over **defective designs** such as infinite scroll, manipulative filters promoting body dysmorphia, absent age verification, and weak parental controls, notably in U.S. District Judge Yvonne Gonzalez Rogers' October 2024 rulings, allowing those claims to proceed despite the company's objections.
🔄 Updated: 1/22/2026, 8:31:23 PM
**NEWS UPDATE: Meta's Aggressive Bid to Curb Evidence in Youth Safety Trial Draws Expert Scrutiny**
Legal experts speaking to Wired describe Meta's pretrial motions to exclude youth mental health research, teen suicide stories, its own surveys on inappropriate content, and former Surgeon General Vivek Murthy's public health warning as "unusually broad," far beyond standard efforts to narrow cases.[1][3][6] Industry observers note this reflects Meta's strategy to limit jury exposure in the February 2 New Mexico trial—the first state-level test of platform accountability for child exploitation—while Meta counters that plaintiffs rely on "cherry-picked quotes and snippets" misrepresenting its decade-long safety investments, like default private teen accounts.[5]
🔄 Updated: 1/22/2026, 8:41:19 PM
**NEWS UPDATE: Meta Fights to Curb Evidence in Youth Safety Lawsuit**
Legal experts describe Meta's aggressive motions to exclude youth mental health research, teen suicide stories, its own surveys on inappropriate content, former Surgeon General Vivek Murthy's public health warning, and even CEO Mark Zuckerberg's Harvard history as "unusually broad," going far beyond standard efforts to narrow cases, according to Wired-reviewed court documents ahead of the February 2 New Mexico trial[1][3]. Meta counters that plaintiffs' lawyers are "selectively citing internal documents to construct a misleading narrative," insisting its full record shows a commitment to teen safety, such as making teen accounts private by default and resolving more than 9,000 US emergency child safety requests in 2024 within an average of 67 minutes[3].
🔄 Updated: 1/22/2026, 8:51:19 PM
**NEWS UPDATE: Regulators Reject Meta's Bid to Dismiss Youth Safety Claims**
U.S. District Judge Yvonne Gonzalez Rogers ruled on October 24, 2024, that Meta must face lawsuits from 33 state attorneys general alleging its platforms harm children's mental health through addictive features, dismissing only parts protected under Section 230 of the Communications Decency Act while allowing core claims to proceed[1][2]. Meanwhile, the House Energy and Commerce Committee on December 2, 2025, advanced a 19-bill package including a revised Kids Online Safety Act (KOSA) without its original "duty of care" provision to sidestep First Amendment challenges, signaling ongoing federal pushes for child protections[4].
🔄 Updated: 1/22/2026, 9:01:22 PM
**BREAKING: Meta's Bid to Curb Evidence in Youth Safety Lawsuit Sparks Global Scrutiny.** As Meta seeks to exclude youth mental health research, teen suicide stories, and even CEO Mark Zuckerberg's college history from its upcoming New Mexico trial, a case filed by Attorney General Raúl Torrez in December 2023 alleging failures to protect children from exploitation, a parallel federal lawsuit by a bipartisan coalition of **33 U.S. state attorneys general**, led by California's Rob Bonta, accuses Meta of deploying addictive features that harm children, in violation of laws such as COPPA.[1][4][2] This escalating legal front, mirroring international calls for accountability given the platforms' global reach across **billions of young users**, has prompted Meta to defend what it calls a consistent, decade-long record of putting teen safety ahead of growth.
🔄 Updated: 1/22/2026, 9:11:15 PM
U.S. District Judge Yvonne Gonzalez Rogers ruled in October 2024 that Meta must face lawsuits filed by 42 state attorneys general, rejecting the company's attempt to have the cases dismissed on grounds that the alleged harms weren't concrete or substantial.[1][6] While Judge Gonzalez Rogers dismissed some claims under Section 230 of the Communications Decency Act, which may protect features like infinite scroll and displayed likes, she allowed other state claims to proceed, including those alleging Meta violated unfair business practices laws.[4] The states collectively argue that Meta knowingly prioritized corporate profits over youth well-being by deliberately designing addictive features and failing to implement available safety measures.