Meta suppressed more research on child safety, employees tell Washington Post

Meta continued to suppress studies that showed its products harmed kids, according to a Washington Post report, even after promising Congress it would change its ways four years ago.

Sept. 9, 2025 — A new report in the Washington Post adds to the growing body of evidence showing that Meta, the parent company of Facebook and Instagram, has suppressed or ignored evidence of the harm its products cause.

The Post reported yesterday that “a trove of documents from inside Meta…was recently disclosed to Congress by two current and two former employees who allege that Meta suppressed research that might have illuminated potential safety risks to children and teens on the company’s virtual reality and apps—an allegation the company has vehemently denied.”

The former Meta employees testified on Tuesday before the Senate Judiciary Committee.

Meta’s ongoing pattern of reckless harm

The newly disclosed actions follow a pattern of suppressed research and evidence of harm that Meta established four years ago.

In 2021, a leak of internal Facebook studies led to Congressional hearings. In those Oct. 2021 hearings, members of the Senate’s consumer protection subcommittee scolded Facebook’s global head of safety for withholding information about how its products hurt young people, and for not changing those products to reduce the harm.

“Senators accused the company of knowing for years that Instagram, its photo-sharing app, has caused mental and emotional harm,” the New York Times reported at the time.

New allegations on child safety

According to a joint statement the current and former Meta employees submitted to Congress in May 2025, those 2021 hearings led Meta’s legal team to “establish plausible deniability” about the negative effects of the company’s products.

The internal documents obtained by the Post include guidance from Meta’s legal team showing researchers how to avoid sensitive topics that could lead to negative publicity. The Post reported:

“In one 2023 message exchange, a Meta lawyer advised a user-experience researcher that ‘due to regulatory concerns,’ he should avoid collecting data that showed children were using its VR devices.”

In the documents, according to the Post, employees warned the company that children younger than 13 were bypassing age restrictions to use the company’s virtual reality services.

Further information

This week’s Post report and Senate hearing testimony extend Meta’s established pattern around child safety and digital products. For more information, see:

Meta suppressed research on child safety, employees say, Washington Post, Sept. 8, 2025.

Meta whistleblowers testify on child safety research before Senate Judiciary Committee, PBS News, Sept. 9, 2025.

Facebook disputes its own research showing harmful effects of Instagram on teens’ mental health, The Guardian, Sept. 30, 2021.

Whistleblower tells Congress that Facebook products harm kids and democracy, NPR News, Oct. 5, 2021.

How Facebook still targets surveillance ads to kids, Fairplay report, Nov. 2021.

Facebook grilled by Senators over its effect on children, New York Times, Oct. 4, 2021.
