TikTok Employees Flag Teen Safety Risks in Court Video

Key takeaways
  • Newly unsealed employee video footage reveals TikTok staff raised internal concerns about the app’s addictive design and its effects on teen wellbeing.
  • The evidence strengthens North Carolina’s lawsuit alleging TikTok misled families by concealing known risks tied to prolonged screen time and mental health.
  • TikTok disputes the claims, citing its safety features, but the revelations intensify scrutiny as the platform faces both legal and regulatory challenges.

Newly unsealed clips reveal internal concerns that contradict TikTok’s public safety claims.

TikTok’s algorithm has long been a source of fascination for its ability to keep users endlessly engaged. But a newly unsealed video in a North Carolina lawsuit has thrown the platform’s practices into sharp relief, raising serious questions about the company’s awareness of potential harm to teenagers.

The clips, featuring current and former employees, contradict TikTok’s repeated public assurances that its platform is safe for young people. With the lawsuit pressing forward, the revelations put TikTok at the center of a broader debate about whether its pursuit of engagement has come at the expense of adolescent mental health.

Employee Admissions: Engagement Above Wellbeing

The unsealed footage shows TikTok employees acknowledging the tension between business goals and user health. One former risk detection staffer noted that the platform “encourages content” that may not always be healthy, a reflection of how the recommendation system is designed to amplify what users find most engaging.

Another executive admitted the company’s “lofty goal” was to keep people on the app for as long as possible — even if those goals were “not necessarily congruent with good mental health.”

These statements suggest that TikTok’s leadership understood the addictive nature of its design while publicly insisting that safety and well-being were priorities. The gap between internal recognition of risks and external messaging now forms a core tension in the North Carolina case.

The Algorithm’s Role in Compulsive Use

Central to the lawsuit is the claim that TikTok’s algorithm is designed to be “highly addictive to minors.” Employees in the unsealed video highlighted how content pathways can escalate from seemingly benign topics into potentially harmful ones. Concerns included prolonged screen time, sleep disruption, and compulsive scrolling behaviors — all framed as side effects of a recommendation engine optimized for retention.

More troubling are employee accounts that the algorithm can feed vulnerable users streams of content related to disordered eating or negative mental health themes, with little opportunity to reset or escape.

[Embedded TikTok from @alliancefored, the National Alliance for Eating Disorders, sharing recovery resources including its free therapist-staffed helpline (866-662-1235) and support groups.]

This feeds into the state’s argument that TikTok failed to disclose the full extent of these risks to parents and young users.

TikTok’s Public Defense and Safety Measures

TikTok has dismissed the unsealed video as “misleading” and “taken out of context,” insisting that the conversations date back years and do not reflect its current operations. The company highlights more than 70 safety features — including default privacy settings for teens, restricted late-night notifications, and parental control options — as evidence of its commitment to protecting younger audiences.

However, critics argue these safeguards do not address the underlying algorithmic design that incentivizes compulsive use. Measures such as guided meditation prompts and screen-time nudges are seen as incremental, while the lawsuit points to structural decisions that embed addictive patterns at the core of the app’s business model.

The Stakes of the North Carolina Lawsuit

North Carolina’s attorney general has accused TikTok of unfair and deceptive trade practices, alleging that the company concealed its knowledge of the risks tied to its platform. The unsealed video bolsters the claim that TikTok knew about potential harm but continued to engineer features that maximized time spent in the app.

The lawsuit seeks financial penalties and a court order to restrict TikTok’s ability to market itself as safe for minors. Its outcome could set a precedent not only for TikTok but for how regulators and courts address addictive design features across the broader social media landscape.

A Platform Under Dual Pressure

The revelations come at a moment of heightened scrutiny for TikTok. In the United States, the app faces the looming possibility of a ban or forced sale under federal law, with a September deadline on the horizon. At the same time, growing pressure from state-level lawsuits signals that regulators are widening their focus from national security concerns to public health risks.

For TikTok, the unsealed video is more than a courtroom setback — it underscores the vulnerability of its reputation. As employee voices contradict corporate messaging, the platform’s ability to convince lawmakers, regulators, and parents that it can be trusted with young audiences grows more uncertain.

Trust Erodes as Contradictions Surface

The North Carolina lawsuit cuts to the heart of a longstanding question: can a platform built on maximizing engagement also safeguard the well-being of its youngest users? Employee testimony now in the public domain suggests TikTok knew more about these risks than it admitted.

Whether the courts hold the company accountable remains to be seen, but the revelations have already deepened skepticism about TikTok’s priorities. At stake is not only its U.S. future, but also the broader reckoning over how much responsibility tech companies bear for the psychological toll of their algorithms.

About the Author
Dan Atkins is a renowned SEO specialist and digital marketing consultant, recognized for boosting small business visibility online. With expertise in AdWords, ecommerce, and social media optimization, he has collaborated with numerous agencies, enhancing B2B lead generation strategies. His hands-on consulting experience empowers him to impart advanced insights and innovative tactics to his readers.