Recently unsealed internal documents reveal that TikTok prioritized its public image over effective measures to protect teens' mental health, even as it struggled to balance safety with user engagement.
The documents surfaced in an ongoing lawsuit against TikTok filed by the Kentucky Attorney General. Although portions were redacted, they were initially available for public review before the filing was resealed.
TikTok currently faces lawsuits from 14 state attorneys general, who accuse the platform of misleading advertising about its addictive algorithm and of jeopardizing children's well-being.
The lawsuits target several harmful features of the platform, including beauty filters, the For You Page (FYP), and TikTok Live. Alarmingly, TikTok's internal research indicated that users can become addicted after viewing just 260 videos, and it associated compulsive usage with negative mental health outcomes such as diminished analytical skills, memory problems, and heightened anxiety. The same research noted that the algorithm engages younger users more effectively.
Internal measures like screen time alerts and limits were introduced despite research suggesting they would have minimal effect. According to the documents, the limits reduced average daily app use by only about 1.5 minutes.
The documents also highlight the platform's adverse effects on body image: TikTok reportedly tuned the FYP algorithm to favor conventionally attractive users, and internal suggestions to add informational banners and awareness campaigns around beauty-filter content were ignored. Executives were aware that videos depicting suicidal ideation and eating disorders evaded moderation filters and frequently reached young users.
In response to these allegations, a TikTok spokesperson stated that the complaint selectively quoted outdated documents, misrepresenting the platform’s commitment to community safety. The spokesperson emphasized the company’s robust safeguards and voluntary safety features, including default screen time limits and privacy settings for users under 16.
TikTok's internal communications echo patterns reported at Meta, where leaders allegedly dismissed suggestions to address youth bullying and mental health concerns. These findings emerge amid a broader wave of legal challenges facing TikTok, Meta, and other social media platforms over their impact on young users' mental health.