Safeguarding American Children Online

December 1, 2025


Consider a prominent U.S. corporation, herein referred to as “Big Corp.” Despite being among the world’s most valuable enterprises, its business model fundamentally relies on fostering addiction to its products among children, leading to devastating consequences.

Internal research conducted by Big Corp reveals that this dependency contributes to increased anxiety, depression, eating disorders, and suicidal thoughts in young people. However, when these discoveries emerge, the company reportedly suppresses the studies and misleads Congress regarding the findings.

Concurrently, Big Corp allegedly subjects children to severe dangers, consistently failing, and frequently declining, to prevent such harm.

Such egregious conduct by an American corporation would typically bring swift consequences. Yet a recent legal brief filed on behalf of more than 1,800 plaintiffs, including parents and minors, alleges that Meta has engaged in exactly this conduct for years. The key distinction lies in the venue: instead of physically endangering children, the tech behemoth has allegedly compromised their safety for financial gain online.

This brief, highlighted by TIME, presents recent evidence detailing how technology firms like Meta, the parent company of Facebook and Instagram, have deliberately designed their platforms with addictive features aimed at children, disregarding potential harms.

The lawsuit, corroborated by testimony from Vaishnavi Jayakumar, Instagram’s former head of safety and well-being, states that under Meta’s policy, accounts involved in sex trafficking were allowed to remain on its platforms until users reported the offender at least 17 times.

Last September, the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, under my chairmanship, received testimony from brave former Meta employees, including Jayakumar, who conducted youth safety research. During a revealing hearing, they disclosed how Meta allegedly concealed internal studies indicating that children utilizing its virtual reality headsets were sexually solicited by adults within the company’s Metaverse.

The psychological and physiological impact of such abuse on a child is comparable to that of in-person incidents. However, whistleblowers and extensive internal documents suggest Meta pursued “plausible deniability” by destroying evidence that might have compelled the company to intervene. Reportedly, executives even cautioned researchers against using the term “kids” when referring to users on their VR platforms, preferring the euphemism “alleged minors with young sounding voices who may be underage.”

Meta defends its record and denies culpability, but such harmful practices have allegedly been commonplace within the company. Earlier this year, the Federal Trade Commission noted that in 2019, Instagram’s algorithmic recommendations reportedly facilitated connections between known “groomers” and minors. Even with knowledge of these perilous interactions, CEO Mark Zuckerberg reportedly opted against bolstering the platform’s safety teams, prioritizing cost savings.

The pattern with Meta and other major tech platforms is recurring: algorithms reportedly connect users with drug dealers and promote pro-suicide content; AI chatbots allegedly engage in role-playing fantasies with young users; and platform design features enable children to share their location on a map with anyone, including predators actively seeking to locate them.

The stark reality is that major tech corporations cannot be relied upon to design inherently safe platforms, as robust safety protocols would invariably diminish their profits. It is imperative that Congress intervenes to ensure these companies are held responsible for the widespread damage they have caused to a generation of young people.

Earlier this year, I reintroduced the bipartisan Kids Online Safety Act (KOSA), a measure designed to provide children with comparable protections from harm in the digital sphere as they receive in the physical world. The Senate’s iteration of KOSA would impose a clear duty of care on online platforms to counteract specific threats to minors, such as sexual exploitation, illicit drug access, and the promotion of suicide and eating disorders. Mandating that major tech firms assume responsibility for enhancing their products’ safety is crucial for safeguarding children and offering parents reassurance.

This legislation enjoys extensive bipartisan backing, having passed the Senate last year with an impressive 91-3 vote. This year, the bill has already secured a veto-proof majority, boasting 67 Senate co-sponsors. 

During a Senate Judiciary Committee hearing last year, Meta CEO Mark Zuckerberg faced numerous parents whose children suffered harm, and in some cases death, linked to social media. He stated, “I’m sorry for everything you’ve all gone through,” adding, “No one should have to go through the things that your families have suffered.”

However, this is not the first time he has offered such an apology. For parents grappling with profound loss, an apology devoid of concrete action rings hollow.

These families merit accountability. The Kids Online Safety Act has the potential to finally provide it.