TikTok knew depth of app’s risks to children, court document alleges

Internal communications of platform employees were revealed in a copy of a state lawsuit in which redacted portions were visible.

TikTok’s political woes deepened Friday after an inadvertent disclosure in a government lawsuit against the platform detailed internal research and communications suggesting that the app’s managers were aware of its alleged risks to children, airing explosive revelations as the Chinese-owned company tries to fend off a federal ban.

According to a lawsuit filed by the state of Kentucky this week, redacted portions of which were unintentionally left readable, the popular video-sharing platform’s internal research allegedly showed that children were particularly susceptible to its powerful algorithmic feeds and that excessive use of the site could lead to a series of mental health issues.

“As expected, across most engagement metrics, the younger the user, the better the performance,” stated one 2019 internal company report, according to the lawsuit.

The allegations are likely to serve as fodder for critics across government who argue that TikTok and other social media companies have prioritized their profits over the well-being of their most vulnerable users, particularly children and teens. And they add to the company’s legal headwinds, which include a law that aims to cut it off from its vast user base of some 170 million Americans.

On Tuesday, more than a dozen state attorneys general sued TikTok, accusing the company of harming children by using addictive product features such as autoplay and push notifications to keep kids hooked on the app, despite potential harms from excessive use.

The complaints, filed by 13 states, including Kentucky, and the District of Columbia, marked the most significant legal challenge against the company to date over allegations that it is contributing to a youth mental health crisis in the United States.

NPR first reported Friday that faulty redactions exposed dozens of pages from Kentucky’s complaint, featuring internal TikTok documents and communications. The Washington Post obtained and reviewed a copy of the document in which the redacted portions are readable. A state judge has since sealed the complaint.

According to the lawsuit, TikTok’s own research found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.”

In redacted comments, the complaint cites internal company surveys showing that TikTok users believed they spent too much time in the app. “The reason kids watch TikTok is because the algo[rithm] is really good,” one unnamed executive purportedly said. “But I think we need to be cognizant of what it might mean for other opportunities. And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at somebody in the eyes.”

They also show that some of the company’s attempts at resolving users’ concerns about overuse had only marginal impact. During one internal experiment that was cited (and redacted) in the complaint, the company found that its default screen-time-use reminders reduced teens’ average TikTok time per day only from 108.5 minutes to 107 minutes. The company promoted the feature anyway, Kentucky’s lawyers noted.

The complaint refers to an internal TikTok group known as “TikTank” that studied the app’s effect on users and wrote a report arguing that its recommendation algorithm and advertising-based business model encouraged “optimization for time spent in the app.”

Such optimization is common in social media, where apps compete for attention and cultural cachet. But the TikTank report also noted that “TikTok is particularly popular with younger users, who are particularly sensitive to reinforcement in the form of social reward and have minimal ability to self-regulate effectively,” according to the complaint.

In a statement, TikTok spokesman Alex Haurek called it “highly irresponsible” for news outlets to publish information that is under seal, and said the complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”

“We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16,” he added.

Responding to a request for comment, Kevin Grout, a spokesman for Kentucky Attorney General Russell Coleman, asked that The Post “refrain from publishing any information redacted in the complaint, which is subject to certain confidentiality agreements for the time being.”

Other states’ complaints were also heavily redacted when filed, but public officials have signaled they plan to push state courts to make more of them public.

“These unredacted documents prove that TikTok knows exactly what it’s doing to our kids–and the rot goes all the way to the top,” the Tech Oversight Project, a social media watchdog group that receives funding from philanthropic groups including the Omidyar Network, said in an X post Friday.

Earlier this week, TikTok pushed back on the state lawsuits, which Haurek said were based on “inaccurate and misleading claims.”

A coalition of states has been investigating the impact of social media companies on children’s mental health for over two years. Those efforts culminated in the flurry of lawsuits this week against TikTok and a prior volley last year against Meta, which over 40 states similarly accused of harming kids through addictive product features across its Facebook and Instagram platforms. Those cases are ongoing. Both sets of lawsuits are poised to test the novel argument states are advancing that links how businesses design their platforms to rising mental health harms among children.

The issue gained renewed attention in 2021 after Facebook whistleblower Frances Haugen disclosed internal research showing the company knew its products at times exacerbated mental health issues among some teens. The revelations spurred legislators and regulators into action, with federal and state lawmakers pushing for new guardrails to protect children online and enforcers launching new probes into tech companies’ practices.

The legislative efforts face major hurdles. Several state laws restricting children’s access to social media or requiring that tech companies take steps to mitigate harms to youth have been halted in court after facing challenges from industry groups arguing the rules violate users’ constitutional rights. At the federal level, legislation to expand privacy and safety protections for children online has cleared the Senate but faces political obstacles in the House.

In lieu of new regulations, state enforcers have gone after Meta and TikTok for allegedly violating existing consumer protection laws, including by misleading the public about how safe their platforms are for children.

TikTok is currently fighting a federal law that would force a sale or ban of the app, whose China-based parent company, ByteDance, has stoked national security concerns in the United States. TikTok argues the measure would infringe on the free speech rights of the app’s millions of U.S. users.

Under the law, TikTok can avoid a ban if ByteDance sells the app to non-Chinese owners before Jan. 19, a deadline the president can extend by 90 days if TikTok is making progress on a sale. Such a sale appears unlikely, given TikTok’s huge potential price tag of more than $100 billion and the short window for completing such a geopolitically sensitive deal. China has also said it would ban the sale and export of one of TikTok’s most critical components, its recommendation algorithm.

A panel of three judges on the U.S. Court of Appeals for the D.C. Circuit is expected to rule on TikTok’s challenge of the law. The company and the Justice Department have requested an expedited judgment by December, allowing time for a potential appeal to be filed with the Supreme Court before Jan. 19.

TikTok is separately facing a lawsuit by the Justice Department alleging the company broke federal children’s privacy laws by collecting data on millions of users under 13.

Eva Dou contributed to this report.