For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses to American teenagers, even as its own research validated many child safety concerns.
The confidential material was part of a more than two-year investigation into TikTok by 14 attorneys general that led to state officials suing the company on Tuesday. The lawsuit alleges that TikTok was designed with the express intention of addicting young people to the app. The states argue the multi-billion-dollar company deceived the public about the risks.
In each of the separate lawsuits state regulators filed, dozens of internal communications, documents and research data were redacted, blacked out from public view, because authorities had entered into confidentiality agreements with TikTok.
But in one of the lawsuits, filed by the Kentucky Attorney General’s Office, the redactions were faulty. This was revealed when Kentucky Public Radio copied and pasted excerpts of the redacted material, bringing to light some 30 pages of documents that had been kept secret.
After Kentucky Public Radio published excerpts of the redacted material, a state judge sealed the entire complaint following a request from the attorney general’s office “to ensure that any settlement documents and related information, confidential commercial and trade secret information, and other protected information was not improperly disseminated,” according to an emergency motion to seal the complaint filed on Wednesday by Kentucky officials.
NPR reviewed all the portions of the suit that were redacted, which highlight TikTok executives speaking candidly about a host of dangers for children on the wildly popular video app. The material, mostly summaries of internal studies and communications, shows the company knew that some remedial measures, like time-management tools, would produce only a negligible reduction in screen time. The company decided to release and tout the features anyway.
Separately, under a new law, TikTok has until January to divest from its Chinese parent company, ByteDance, or face a nationwide ban. TikTok is fighting the looming crackdown. Meanwhile, the new lawsuits from state authorities have cast scrutiny on the app and its ability to counter content that harms minors.
In a statement, TikTok spokesman Alex Haurek defended the company’s child safety record and condemned the disclosure of the once-redacted material, which has since been placed under court seal.
"It is highly irresponsible of NPR to publish information that is under a court seal,” Haurek said. “Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”
He continued: “We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”
Kentucky AG: TikTok users can become ‘addicted’ in 35 minutes
As TikTok’s 170 million U.S. users can attest, the platform’s hyper-personalized algorithm can be so engaging it becomes difficult to close the app. TikTok determined the precise amount of viewing it takes for someone to form a habit: 260 videos. After that, according to state investigators, a user “is likely to become addicted to the platform.”
In the previously redacted portion of the suit, Kentucky authorities explain: “While this may seem substantial, TikTok videos can be as short as 8 seconds and are played for viewers in rapid-fire succession, automatically,” the investigators wrote. “Thus, in under 35 minutes, an average user is likely to become addicted to the platform.”
Another internal document shows the company was aware that its many features designed to keep young people on the app created a constant, irresistible urge to keep reopening it.
TikTok’s own research states that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” according to the suit.
In addition, the documents show that TikTok was aware that “compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”
TikTok: Time-limit tool aimed at ‘improving public trust,’ not limiting app use
The unredacted documents show that TikTok employees were aware that too much time spent by teens on social media can be harmful to their mental health. The academic consensus recommends one hour or less of social media usage per day.
The app lets parents place time limits on their kids’ usage that range from 40 minutes to two hours per day. TikTok created a tool that set the default time prompt at 60 minutes per day.
Internal documents show that TikTok measured the success of this tool by how it was “improving public trust in the TikTok platform via media coverage,” rather than how it reduced the time teens spent on the app.
After tests, TikTok found the tool had little impact: a drop of about 1.5 minutes, from around 108.5 minutes per day before the tool to roughly 107 minutes with it. According to the attorney general’s complaint, TikTok did not revisit this issue.
One document shows one TikTok project manager saying, “Our goal is not to reduce the time spent.” In a chat message echoing that sentiment, another employee said the goal is to “contribute to DAU [daily active users] and retention” of users.
TikTok has publicized its “break” videos, which are prompts to get users to stop endlessly scrolling and take a break. Internally, however, it appears the company didn’t think the videos amounted to much. One executive said that they are “useful in a good talking point” with policymakers, but “they’re not altogether effective.”
Document: TikTok demoted people it deemed unattractive on its feed
The multi-state litigation against TikTok highlighted the company’s beauty filters, which users can overlay on videos to make themselves look thinner and younger or to have fuller lips and bigger eyes.
One popular feature, known as the Bold Glamour filter, uses artificial intelligence to rework people’s faces to resemble models with high cheekbones and strong jawlines.
TikTok is aware of the harm these beauty filters can cause young users, the documents show.
Employees suggested internally the company “provide users with educational resources about image disorders” and create a campaign “to raise awareness on issues with low self esteem (caused by the excessive filter use and other issues).”
They also suggested adding a banner or video to the filters that included “an awareness statement about filters and the importance of positive body image/mental health.”
This comes as the documents showcase another hidden facet of TikTok’s algorithm: the app prioritizes beautiful people.
One internal report that analyzed TikTok’s main video feed found that “a high volume of … not attractive subjects” was filling everyone’s app. Kentucky investigators found that, in response, TikTok retooled its algorithm to amplify users the company viewed as beautiful.
“By changing the TikTok algorithm to show fewer ‘not attractive subjects’ in the For You feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users,” the Kentucky authorities wrote.
This article was originally published by NPR.