Internal documents from TikTok, accidentally made public this week, reveal that the company was aware its algorithm could be detrimental to children’s well-being. The revelation comes as attorneys general from 14 states have filed lawsuits against the popular video-sharing app, alleging harm to minors’ mental health.
The confidential information came to light when journalists at Kentucky Public Radio discovered that redacted portions of the court documents became legible when their text was copied and pasted into a new file. These documents were part of Kentucky’s contribution to the multi-state legal action initiated on Tuesday against TikTok.
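The failure appears to be a familiar one: some redaction workflows draw opaque boxes over a PDF’s text layer without deleting the underlying text, which then remains copyable. As a minimal sketch, assuming Python’s pypdf library and a hypothetical filename, the hidden text in such a document can be recovered in a few lines:

```python
# Minimal sketch: reading the text layer of a PDF whose redactions
# are only drawn on top of the text rather than removed from it.
# Assumes the pypdf library and a hypothetical file "filing.pdf".
from pypdf import PdfReader

reader = PdfReader("filing.pdf")
for page in reader.pages:
    # extract_text() returns the underlying text layer, including
    # any words hidden beneath cosmetic redaction boxes.
    print(page.extract_text())
```

This is, in effect, what the journalists reportedly stumbled on by copying the documents’ text into a new file.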
According to the exposed documents, TikTok’s parent company ByteDance conducted internal studies showing potential negative impacts on young users. The research suggested that users could develop an addiction to the app after viewing just 260 videos, which translates to approximately 35 minutes of use given the average video length of eight seconds.
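The 35-minute figure follows directly from the numbers cited in the documents; a quick back-of-envelope check, shown here in Python for concreteness:

```python
# Back-of-envelope check of the figures cited in the documents:
# 260 videos at an average length of 8 seconds each.
videos = 260
seconds_per_video = 8
minutes = videos * seconds_per_video / 60
print(round(minutes, 1))  # 34.7, i.e. approximately 35 minutes
```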
More alarmingly, the documents indicated that extensive use of TikTok correlated with various adverse mental health effects. These included diminished analytical skills, impaired memory formation, reduced contextual thinking and conversational depth, decreased empathy, and heightened anxiety.
The unintended disclosure also revealed that TikTok was aware of the limited effectiveness of its tools designed to restrict screen time for teenage users. Despite this knowledge, the company continued to promote these features publicly.
Furthermore, the documents exposed TikTok’s approach to underage users. While the platform’s policies prohibit accounts for children under 13, internal guidelines instruct moderators to exercise caution when removing accounts and to act on reports of underage users only when an account explicitly identifies its holder as under 13.
Legal experts and attorneys representing plaintiffs in related cases against social media companies view these revelations as consistent with their allegations. They argue that tech companies deliberately design their products to maximize user engagement and profits at the expense of young people’s mental health.
Jayne Conroy, an attorney representing plaintiffs in a class-action lawsuit against social media platforms, stated that the internal documents demonstrate how these companies intentionally create products that “relentlessly engage and exploit the adolescent brain.”
Matthew Bergman, founder of the Social Media Victims Law Center, echoed this sentiment, noting that social media platforms are designed to be addictive, often showing content that users “can’t look away from” rather than what they actually want to see.
In response to the leaked information, a TikTok spokesperson criticized the publication of sealed court documents as “highly irresponsible.” The company maintains that the complaint misrepresents its commitment to community safety by taking outdated documents out of context and cherry-picking misleading quotes.
TikTok asserts that it has implemented robust safeguards, including proactive removal of suspected underage users and voluntary safety features such as default screen time limits and enhanced privacy settings for minors under 16.
The accidental revelation of these internal documents has intensified the ongoing debate about the responsibility of social media platforms in protecting young users. As the legal battles unfold, the tech industry faces increasing scrutiny over its practices and the potential long-term impacts of its products on children’s mental health and well-being.