Lead
Two former Meta researchers testified to a Senate subcommittee on Tuesday that Meta’s virtual reality (VR) environments exposed children to adult content and that the company impeded or erased internal research documenting those harms. Jason Sattizahn and Cayce Savage said they found minors encountering nudity, sexual solicitations and live sexual acts inside Meta VR, and that some of their work was blocked or deleted. Meta disputed parts of the testimony, saying its researchers have conducted dozens of youth-related studies and that privacy rules constrain some research. Senators from both parties signaled urgency about legislative remedies.
Key Takeaways
- Two former Meta employees, Jason Sattizahn and Cayce Savage, testified to the Senate Judiciary Subcommittee on Privacy, Technology and the Law on Tuesday about alleged harms to minors in Meta VR.
- Savage, who left Meta in 2023, told senators she observed children being solicited for nude photographs, bullied, and exposed to mature content including gambling and pornography.
- Sattizahn said adults in Meta VR sometimes transmitted sexual audio within earshot of minors, and that he was fired last year after raising concerns about limits on research.
- The witnesses alleged Meta blocked further research into the prevalence of harms and in one instance ordered deletion of evidence tied to underage sexual harassment.
- Meta spokesperson Andy Stone called the central claims “nonsense,” said there was never a blanket ban on youth research, and noted about three dozen youth-related social studies since 2022.
- Longstanding regulatory constraints, notably the Children’s Online Privacy Protection Act (COPPA) of 1998, restrict how companies collect and retain data on children under 13; Meta cites those rules when deleting some data.
- Lawmakers from both parties expressed frustration and renewed calls for new legislation this year to hold platforms accountable for child safety online.
Background
Meta has invested billions of dollars in virtual reality hardware and software since acquiring Oculus for $2 billion in 2014, and it renamed the parent company Meta in 2021 to emphasize its metaverse strategy. The company sells headsets and funds virtual spaces where users interact in real time through voice, spatial audio and avatars. Those interactions pose moderation and safety challenges distinct from those of text- and image-based social platforms.
Researchers, regulators and journalists have repeatedly raised concerns about how minors fare on Meta platforms, dating back to whistleblower disclosures about Instagram and teens in 2021. Federal privacy law, notably COPPA, restricts how companies collect data on children under 13 and requires deletion of some information gathered without verifiable parental consent, creating operational and compliance trade-offs for safety teams.
Main Event
On Tuesday, Sattizahn and Savage spoke to the Senate subcommittee and described casework and field observations from their time at Meta in which underage users were exposed to sexual content and predatory behavior inside VR. Savage testified that encounters with strip-club simulations, live pornography and solicitations for nude images were not rare in the spaces she studied.
Savage said she sought to quantify how frequently children experienced harm but that Meta blocked her attempts to run the necessary research. She told senators she could not report a percentage of affected children because she was prevented from completing the work. Savage left Meta in 2023 after raising these concerns internally, she said.
Sattizahn described audio-based harassment he observed, including instances in which adults appeared to be masturbating while minors were present and could hear them. He said he raised the issue internally and that doing so contributed to his termination last year. He also testified that teams were instructed to redact or remove certain evidence, including material tied to youth harassment in Germany.
Meta pushed back in public statements. Spokesperson Andy Stone said the claims were selectively framed and that the company has never imposed a blanket ban on youth research. Stone noted Meta has conducted roughly three dozen youth-related social studies since 2022 and stressed compliance with privacy rules governing children’s data.
Analysis & Implications
If the witnesses’ accounts are accurate, they point to a gap between product investment and safety resourcing: Meta has committed billions to building VR infrastructure but, according to the testimony, devoted comparatively little to systematically studying and mitigating risks to minors. The witnesses argued that commercial incentives help explain the gap, since acknowledging or excluding underage users would reduce engagement and advertising revenue.
Policy implications are immediate. Senators from both parties signaled support for legislation that would impose duties on platforms to protect children or make it easier for victims to sue. The Senate advanced such bills last year, including a proposed “duty of care” for platforms, though comparable measures stalled in the House. Any new law would need to reconcile child protection with free-speech and privacy concerns.
Operationally, COPPA and other privacy regimes complicate empirical study of harms: collecting data on users under 13 requires verifiable parental consent, so companies often avoid research designs that would gather identifiable data. That tension between the need to study harms and the legal limits on data collection risks undercounting or underreporting the scale of the problem.
Comparison & Data
| Year | Key Event | Relevant Detail |
|---|---|---|
| 2014 | Oculus acquisition | Meta (then Facebook) bought Oculus for $2 billion |
| 2021 | Corporate rebrand | Company became Meta, emphasizing VR/metaverse |
| 2022–present | Youth research | Meta says about three dozen youth-related social studies since 2022 |
This timeline highlights the mismatch witnesses described: large strategic investment in VR beginning in 2014 and accelerating after 2021, set against the comparatively limited, contested or legally constrained research into youth harms described in testimony. Quantifying the prevalence of harms remains an open empirical problem because, the witnesses say, access to the necessary datasets was often restricted.
Reactions & Quotes
Senators from both parties reacted sharply after the testimony, saying they had run out of patience with Meta’s management of child safety and calling for technical and legal remedies. The witnesses summarized their experience in stark terms.
“We were hired to make the platform safer for children, but what we found was a company that knew its products were unsafe and did not act,”
Jason Sattizahn and Cayce Savage, former Meta researchers (paraphrased)
Meta’s public response emphasized regulatory constraints and disputed what it characterized as selective leaking of internal documents.
“The claims at the heart of this hearing are nonsense; they’re based on selectively leaked internal documents,”
Andy Stone, Meta spokesperson (paraphrased)
Witnesses also described operational practices they said blocked research and limited cross-team sharing.
“We were directed how to write reports to limit risk to Meta, and internal groups were locked down,”
Jason Sattizahn (paraphrased)
Unconfirmed
- Exact prevalence: The percentage of children in Meta VR who experienced each specific harm (e.g., solicitation, exposure to pornography) has not been publicly quantified and witnesses said they were blocked from completing such prevalence research.
- Deletion specifics: While witnesses allege deletion of evidence tied to underage harassment in at least one case, independent verification of which files were deleted and on what legal basis remains pending.
- Scope across regions: Testimony cited incidents in Germany and elsewhere, but a comprehensive cross-national dataset showing incidence rates by country has not been made public.
Bottom Line
The Senate testimony by two former Meta researchers adds a detailed, internal perspective to longstanding public concerns about how children fare in immersive online spaces. Their allegations—if corroborated—underscore tensions between rapid product investment and the organizational safeguards needed to protect minors in real-time, multi-sensory environments.
Policy responses are likely: lawmakers signaled support for new statutes or legal tools to hold platforms accountable, and advocates will press for better transparency and independent research access. For families and regulators, the key questions remain how to measure harms reliably within privacy rules and how to align corporate incentives with child safety in the metaverse era.
Sources
- NBC News (news report; primary account of Tuesday’s Senate hearing)
- The Washington Post (news report; prior reporting on internal staff briefing to Congress)
- FTC — COPPA (official; U.S. federal regulation on children’s online data)