Four whistleblowers accuse Meta of hiding research on child safety risks in VR and online platforms

📅 Published: 9/8/2025
🔄 Updated: 9/8/2025, 8:01:17 PM
📊 15 updates
⏱️ 10 min read
📱 This article updates automatically every 10 minutes with breaking developments

Four whistleblowers have accused Meta, the parent company of Facebook and Instagram, of deliberately hiding research that revealed significant child safety risks within its virtual reality (VR) and online platforms. The allegations include claims that Meta suppressed internal findings about the dangers to children using its Horizon Worlds VR social platform and other services, potentially putting young users at risk[3][4].

According to whistleblowers who have shared internal documents with U.S. lawmakers, Meta was aware that children under 13 were accessing Horizon Worlds by misrepresenting their ages, violating federal laws designed to protect children’s online privacy. Rather than taking action to remove or protect these underage users, employees were allegedly instructed not to document the presence of children on the platform. This conduct reportedly violated the Children’s Online Privacy Protection Act (COPPA), which requires parental consent before collecting data from children under 13[1].

Fairplay for Kids, a child advocacy organization, conducted an investigation from July 2024 to April 2025 and found that children aged 10 to 12 were able to access Horizon Worlds through child accounts that required parental consent but still exposed them to risks such as voice chatting with older users. Despite Meta’s policy initially restricting Horizon Worlds to users aged 13 and older, the company began allowing child accounts in late 2024 without sufficient safeguards[1].

The whistleblower revelations come amid growing scrutiny of Meta’s handling of child safety across its platforms. Previous whistleblowers, including Frances Haugen and others, disclosed that Meta ignored internal research showing harm to teens from Instagram, such as worsening body image issues. Meta has been accused of prioritizing user engagement and advertising revenue over protecting young users from harmful content and interactions[2].

In response to the latest allegations, the U.S. Senate Judiciary Subcommittee on Privacy, Technology, and the Law has scheduled hearings to examine claims that Meta buried research exposing the risks its platforms pose to children. The hearing, titled "Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research," is set for September 9, 2025, and aims to assess the extent of Meta’s knowledge and its responsibility in safeguarding children online[4][5].

Meta has not provided a detailed public response to these specific whistleblower claims but has previously emphasized its work with child safety organizations and initiatives to improve online protections for young users. Critics argue, however, that more transparency and regulatory oversight are urgently needed to prevent harm to children who are increasingly engaging with immersive virtual environments and social media platforms[2][3].

This unfolding controversy highlights the ongoing challenges tech giants face in balancing innovation in virtual and social technologies with the ethical imperative to protect vulnerable users, especially children, from exposure to risks and exploitation.

🔄 Updated: 9/8/2025, 5:40:21 PM
Consumer and public reaction to the whistleblowers’ accusations against Meta has been highly critical, with significant concern over child safety in VR. Fairplay’s investigation found that in 24 of 26 visits to Horizon Worlds, minors used adult accounts lacking parental consent, with children comprising up to 52% of users in some spaces[3]. Users have described the platform as a "nursery," highlighting widespread unease about the presence of children in unsafe environments, while whistleblower Kelly Stonelake condemned Meta for knowingly allowing underage access and violating federal privacy laws[2][3].
🔄 Updated: 9/8/2025, 5:50:27 PM
Four whistleblowers with a combined 40 years of experience at Meta have accused the company of deliberately hiding research on child safety risks in its VR and online platforms, including Horizon Worlds[2][5]. One whistleblower, Kelly Stonelake, former Horizon Worlds Director of Product Marketing, revealed that Meta was aware that children under 13 accessed the platform by falsifying their age information, violating federal privacy laws, yet employees were instructed not to document this[1]. A Senate Judiciary subcommittee is set to hold a hearing on September 9, 2025, to examine these allegations in detail[3][4].
🔄 Updated: 9/8/2025, 6:00:45 PM
Regulators have intensified scrutiny of Meta following whistleblower claims that the company suppressed research on child safety risks in its VR platform Horizon Worlds. The U.S. Federal Trade Commission (FTC) is investigating Meta amid allegations that children under 13 accessed the platform using adult accounts lacking parental consent and COPPA protections, with reports showing minors comprised up to 52% of users in some VR spaces[3]. Congress has also received disclosures from whistleblowers accusing Meta’s legal team of editing or blocking studies highlighting harassment and virtual grooming risks to avoid regulatory action[1][5].
🔄 Updated: 9/8/2025, 6:10:32 PM
Four whistleblowers have accused Meta of suppressing internal research that exposed child safety risks in VR, prompting disclosures to U.S. Congress amid growing regulatory scrutiny[1]. The whistleblowers allege Meta’s legal team edited or blocked studies to avoid regulatory action and legal fallout, while the FTC has been targeted with complaints over Meta allowing children under 13 to use adult accounts on Horizon Worlds without proper protections[3]. These revelations have intensified governmental pressure on Meta regarding enforcement of child safety standards in virtual environments[1][3].
🔄 Updated: 9/8/2025, 6:20:33 PM
Four whistleblowers, including current and former Meta researchers, have accused Meta of suppressing internal research on child safety risks in its VR platform and other online services, revealing cases where children were sexually targeted through Meta’s virtual reality environments[1][2]. One researcher was instructed to delete recordings of a teen reporting sexual propositions involving a 10-year-old on Meta’s Horizon Worlds, while documented incidents include a 48-year-old sex offender soliciting a 9-year-old and a 25-year-old man abducting a 13-year-old after meeting them via Meta headsets[1][2]. These allegations come ahead of a Senate subcommittee hearing scheduled for September 9, 2025, to examine Meta’s handling of this research and its responsibility to safeguard children online.
🔄 Updated: 9/8/2025, 6:30:38 PM
Four whistleblowers with decades of experience at Meta have accused the company of deliberately suppressing internal research exposing serious child safety risks in its virtual reality platform Horizon Worlds, including reports of virtual grooming and harassment of children as young as 10[1][4]. These revelations have sparked international concern, prompting U.S. Congress hearings and a complaint to the FTC, as investigations revealed that up to 52% of users in some VR spaces were minors using adult accounts without parental consent, circumventing safety regulations such as COPPA[3]. Global child protection advocates and regulators are responding with increased pressure on Meta and other tech firms to ensure transparency and enforce stronger safeguards for children in digital environments.
🔄 Updated: 9/8/2025, 6:40:37 PM
Four whistleblowers revealed to U.S. Congress that Meta suppressed internal research exposing child safety risks in its VR platform Horizon Worlds, alleging Meta's legal team interfered to avoid regulatory scrutiny and lawsuits[1]. In response, the Federal Trade Commission (FTC) is investigating Meta following a formal complaint accusing the company of knowingly allowing underage users without proper protections, with studies showing 42-52% of users in certain VR spaces were minors bypassing safety measures[3]. Despite Meta’s introduction of child accounts in late 2024, concerns remain as children continue accessing adult accounts, prompting calls for stronger government oversight on VR child safety[3][1].
🔄 Updated: 9/8/2025, 6:50:41 PM
Four whistleblowers with a combined 40 years of Meta experience accused the company of deliberately suppressing VR research revealing child safety risks, alleging legal teams altered findings to minimize exposure of harassment and grooming dangers in Horizon Worlds[1][4]. Industry experts highlight that minors under 13 were found using adult accounts in nearly all examined VR experiences, with children comprising up to 52% of users in some spaces, raising serious concerns about Meta’s ineffective safety measures despite introducing child-specific accounts[3]. This suppression and continued exposure of children to risk have prompted sharp critiques that Meta prioritized profits and legal protection over child welfare[1][3].
🔄 Updated: 9/8/2025, 7:00:57 PM
Four whistleblowers accusing Meta of concealing research on child safety risks in VR and online platforms have prompted a U.S. Senate subcommittee to hold a hearing titled "Hidden Harms" on September 9, 2025, to examine these allegations in detail[3]. This government response signals increased regulatory scrutiny of Meta’s practices regarding youth protection in digital environments. The whistleblowers, with decades of combined Meta experience, have also submitted documents to Congress, spotlighting urgent concerns over Meta’s prioritization of profit over child safety[2].
🔄 Updated: 9/8/2025, 7:11:06 PM
Consumer and public reaction to the whistleblower revelations accusing Meta of hiding child safety research in VR has been one of significant concern and outrage. Parents express feeling "completely wrong" in their understanding of the risks, as current parental controls on VR are far less comprehensive than those on smartphones, leaving many "flying blind" about their children’s virtual experiences[1]. Advocacy groups and lawmakers are pressing for urgent Senate hearings, emphasizing the need for "all hands on deck" to regulate and ensure safer metaverse spaces amid growing fears that Meta prioritized profit over children’s safety[3].
🔄 Updated: 9/8/2025, 7:21:03 PM
Following the whistleblower allegations that Meta hid research on child safety risks in VR and online platforms, Meta’s stock experienced immediate market pressure, dropping about 4.3% in after-hours trading on September 8, 2025. Investors reacted sharply amid growing fears of regulatory backlash ahead of the Senate hearing scheduled for September 9, where lawmakers will scrutinize Meta’s handling of child safety data[3][4]. Market analysts cited increased legal risks and potential fines as key factors driving the sell-off.
🔄 Updated: 9/8/2025, 7:31:11 PM
Four whistleblowers with a combined 40 years of Meta research experience allege the company concealed critical data showing significant child safety risks in VR and online platforms, including the immersive metaverse environment[3]. Internal findings reveal that despite safety tools like 'Space Bubble' and 'Personal Boundary,' youth usage of these features is low, and parental controls lag behind smartphone standards, leaving children vulnerable and parents uninformed[1]. In one incident, Meta executives testing Horizon Worlds in October 2022 encountered loud child users disrupting sessions, leading them to isolate meetings in "closed" VR spaces to avoid the distraction rather than address underlying safety concerns[2].
🔄 Updated: 9/8/2025, 7:41:02 PM
Consumer and public reaction to the four whistleblowers accusing Meta of hiding child safety research in VR has been sharply critical, with widespread concern about the company prioritizing profit over children’s well-being. Parents express frustration over the lack of effective parental controls and transparency, describing themselves as "flying blind" in monitoring their children's virtual interactions[1]. Advocacy groups and lawmakers are intensifying calls for stricter regulation, with the Senate Judiciary subcommittee scheduled to investigate these claims, highlighting the urgency of addressing safety gaps in the metaverse[1][3].
🔄 Updated: 9/8/2025, 7:51:00 PM
Four whistleblowers have accused Meta of hiding internal research revealing significant child safety risks on its VR and online platforms, including Horizon Worlds, where children as young as 10 accessed the system despite age restrictions[1][2][5]. Meta reportedly deleted or altered data showing these dangers and instructed staff not to document the presence of underage users, violating federal laws like COPPA, as testified by former Horizon Worlds Director Kelly Stonelake[2][5]. A Senate subcommittee hearing to examine these allegations is scheduled for September 9, 2025, aiming to hold Meta accountable for burying research that highlights real harms, including FBI-documented cases of adults sexually exploiting children via Meta VR devices[3][4][1].
🔄 Updated: 9/8/2025, 8:01:17 PM
Four whistleblowers have accused Meta of repeatedly deleting or altering internal research revealing serious child safety risks in its VR and online platforms, including exposing users as young as 10 years old to harm[4]. This has sparked international concern, with regulatory bodies such as the U.S. Senate Judiciary subcommittee preparing to scrutinize Meta’s practices amid calls for stronger global VR safety regulations[1][4]. Investigations reveal that despite introducing child accounts, many minors bypass safety features by using adult accounts, highlighting Meta’s ongoing failure to protect underage users across multiple countries[3].