Lead: Two former Meta safety researchers told a US Senate committee on Tuesday that the company concealed evidence of risks to children tied to its virtual reality products, alleging internal pressure to delete or avoid research that could document sexual abuse and other harms. The testimony, given by Jason Sattizahn and Cayce Savage, followed reporting by the Washington Post and prompted denials from Meta, which called the claims false. Senators questioned whether the company’s policies and parental controls are adequate, and whether internal legal review shaped or suppressed findings. The exchange highlighted renewed scrutiny of Meta’s Reality Labs and its responsibility for youth safety on immersive platforms.
Key Takeaways
- Two former Meta researchers, Jason Sattizahn and Cayce Savage, testified to a US Senate committee on Tuesday that Meta discouraged or altered internal research into harms to minors on its VR platforms.
- Savage alleged she found coordinated child sexual exploitation taking place through Roblox, available in Meta’s VR app store, and said she urged Meta to remove the app; Roblox disputes the characterization.
- Meta denied the allegations, calling the reporting selective, and said it has approved nearly 180 Reality Labs studies on youth safety and well-being in recent years.
- Sattizahn, who worked at Meta from 2018 to 2024, told senators the company pruned and manipulated research that could show harm, calling Meta’s public rebuttal a “lie by avoidance.”
- Senators from both parties pressed the researchers on the accessibility and effectiveness of parental controls for Quest headsets and Horizon Worlds, with Josh Hawley (R-MO) and Ashley Moody (R-FL) among the questioners.
- This testimony joins earlier disclosures, most notably Frances Haugen’s 2021 leaks about Instagram, reinforcing ongoing oversight of tech platforms and child safety.
Background
Meta, the owner of Facebook, Instagram and WhatsApp, has invested heavily in immersive technology through Reality Labs, marketing headsets and social VR environments such as Horizon Worlds. Children’s experiences online have been a policy focus for regulators, child-safety advocates and lawmakers for several years, and scrutiny intensified after high-profile insider disclosures about platform harms. In 2021 Frances Haugen released internal documents indicating that Meta’s own research had linked Instagram to harms to teenage mental health, prompting hearings and public debate about whether corporate incentives override safety.
The current allegations center on internal research practices: Sattizahn and Savage say researchers were asked to remove or avoid evidence that could demonstrate sexual-exploitation risks or other harms to underage users. Meta counters that the claims come from selectively leaked documents and that it has not prohibited safety work; the company points to nearly 180 Reality Labs-related studies concerning youth that it says it has approved. The dispute surfaces amid a broader policy conversation about how platforms identify, document and mitigate emergent risks in fast-evolving virtual spaces.
Main Event
On Tuesday, before a Senate committee, Sattizahn and Savage detailed circumstances they say show managerial and legal interference with safety research. Savage testified that during her youth-focused VR work she documented instances in which Roblox appeared to be used by predatory actors who set up pay-to-perform scenarios using Robux, the platform’s virtual currency, and that she recommended the app be removed from Meta’s VR store. Roblox issued a statement rejecting Savage’s portrayal, saying it rested on outdated or inaccurate information.
Sattizahn, who worked at Meta from 2018 until 2024, said the company pressured researchers to remove or not pursue data that could demonstrate user harm. He described Meta’s public rebuttal to the Washington Post’s reporting as misleading and told senators the company was “pruning and manipulating” research outputs. Meta replied that the reporting was curated to create a false narrative and reiterated its record of supporting internal studies on youth safety.
Senators pressed for specifics. Republican Senator Josh Hawley asked about evidence and controls; Republican Senator Ashley Moody said she had personally struggled to find and use parental controls on Meta’s devices despite having litigated against the company as Florida’s attorney general. The former researchers said the controls exist but can be difficult to navigate, and that internal barriers limited the production of robust evidence about harms.
Analysis & Implications
The testimonies sharpen questions about research governance inside major tech firms: who decides what investigations proceed, how legal and product teams interact with safety teams, and whether commercial imperatives can bias internal science. If true, the allegations suggest structural incentives that may deprioritize or obscure risks, particularly when those risks could constrain product deployment or monetization of youth-facing experiences. Regulators will likely probe whether internal policies comply with consumer-protection statutes and whether nondisclosure or document-handling practices impeded outside scrutiny.
For parents and guardians, the episode underscores a practical gap: even where parental tools exist, their usability and transparency matter. Exchanges at the hearing highlighted complaints that Quest parental controls and Horizon Worlds settings may be hard to discover or confusing to operate, which could blunt their protective value. Policy responses may include mandates for clearer labeling, default safety settings for underage accounts, and independent audits of safety research and product impact.
Commercially, persistent allegations of suppressed safety findings could affect Meta’s brand and invite tighter regulatory oversight, potential fines, and restrictions on youth-targeted features. For the VR ecosystem broadly, the case may push app stores and platform operators to adopt stricter vetting, continuous monitoring, and stronger collaboration with law enforcement for exploitation reports. Internationally, other jurisdictions watching US oversight may adopt parallel measures, increasing compliance costs for immersive platforms operating across borders.
Comparison & Data
| Year | Platform | Allegation |
|---|---|---|
| 2021 | Instagram (Meta) | Internal research showed harms to teens; Frances Haugen leaked documents |
| 2024–2025 | Meta VR / Reality Labs | Former researchers allege suppression of findings on child safety and sexual-exploitation risks |
The table above places the current testimony alongside the better-known 2021 disclosures. Both episodes center on internal research that employees say identified risks to young users; both prompted public hearings. Meta has cited nearly 180 Reality Labs-related studies it approved in recent years as evidence of ongoing safety work, but the whistleblowers argue approval does not equal freedom to publish or pursue all lines of inquiry. That distinction, between permission to study and permission to surface damaging results, will be central to any oversight response.
Reactions & Quotes
“Meta has chosen to ignore the problems they created and bury evidence of users’ negative experiences,”
Jason Sattizahn, former Meta researcher
Context: Sattizahn said this in testimony to emphasize his view that internal processes limited truthful reporting about harms. Meta disputes the characterization, saying the company supports and publishes safety research.
“We strongly disagree with the allegations; safety is a top priority,”
Roblox spokesperson (company statement)
Context: Roblox responded to Savage’s testimony by rejecting the claim that the platform is a hub for coordinated pedophile rings and described ongoing moderation and enforcement efforts, including 24/7 moderation and law-enforcement referrals.
“The claims are nonsense,”
Meta spokesperson (company statement)
Context: Meta called the whistleblowers’ narrative selective and reiterated its record of approving youth-safety studies, arguing that internal documents were taken out of context.
Unconfirmed
- The specific claim that Meta instructed researchers to erase all evidence of sexual abuse risk remains an allegation; public documentation of a directive to delete such evidence has not been independently released.
- The assertion that Roblox was being used systematically by coordinated pedophile rings within Meta’s VR store is contested; Roblox says the description relies on outdated or incorrect information and points to its moderation systems.
- Reports that Meta lawyers directly reshaped internal research outcomes rest on internal documents reported by the Washington Post, which Meta characterizes as selectively leaked; the full set of internal communications has not been made public for independent review.
Bottom Line
The Senate testimony by two former Meta researchers intensifies scrutiny of how large tech firms monitor and report risks to children on emerging platforms like VR. Allegations that research was suppressed would, if substantiated, indicate governance failures with consequences for public safety and regulatory compliance. Meta’s categorical denials and its citation of numerous approved studies point to a contested factual record whose resolution will hinge on access to internal documents and, potentially, on independent audits.
For parents, lawmakers and platform operators, the episode highlights two imperatives: make safety tools discoverable and effective, and ensure independent oversight of safety research and product decisions. Expect policymakers to press for clearer transparency requirements, stronger parental defaults, and mechanisms that protect researchers who surface harms. The coming weeks may bring further document releases, additional testimony or formal inquiries that clarify which claims are verified and which remain contested.