New Report Exposes How Many Minutes It Takes to Get Addicted to TikTok
TikTok is well aware of just how harmful it is to young users, according to internal documents included in a lawsuit filed Tuesday.
Thirteen states are separately suing TikTok for misleading the public about the app’s potentially harmful effects. One of the lawsuits, filed by the Kentucky attorney general’s office, contained faulty redactions that revealed confidential internal documents uncovered during a two-year investigation into TikTok, according to NPR. The redacted information was first reported by Louisville Public Media before a judge resealed the suit.
State investigators found that it was possible to form a habit around using the app after watching 260 videos, which, at roughly eight seconds per video on an app as fast-paced as TikTok, can take less than 35 minutes.
Internal research at TikTok found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” according to the suit.
The harm was not only in how much teens used the app but also in what they were shown on it. TikTok actively demoted videos featuring people deemed unattractive and boosted videos of those using beauty filters. It’s not difficult to imagine how imposing, and rewarding, unattainable beauty standards could harm young users.
Internal documents in the suit also showed that TikTok would sort users into “filter bubbles” of content, where a user “encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual’s online experience.”
An internal document showed that users were “placed into ‘filter bubbles’ after 30 minutes of use in one sitting.”
This can be particularly harmful should a user end up in a bubble that is pushing negative content, such as pro-anorexia content disguised as “thinspiration,” which has recently had a major resurgence on the app. Videos featuring self-harm also made it past TikTok moderators.
Internal documents also showed just how easily young users can be led down a depressing rabbit hole after engaging with content in the “painhub” or “sadnotes” filter bubbles.
“After following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into ‘negative’ filter bubble,” one employee wrote. “The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life.”
When TikTok did tout new time-management tools to reduce kids’ usage, internal documents revealed that the company cared more about how the tools were perceived than about how well they worked. The documents showed that executives rated the tools’ success by whether they were “improving public trust in the TikTok platform via media coverage,” rather than by whether they actually reduced usage. The tools themselves were found to have a negligible impact on usage.
One executive said that the app’s “break” videos, which encourage users to consider leaving the app after long stretches of activity, were “useful in a good talking point” but “not altogether effective.” TikTok launched the feature anyway.
One executive gave a chilling description of what TikTok’s addictive algorithm could cost young users. “We need to be cognizant of what it might mean for other opportunities,” the unnamed executive said, according to court documents. “And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at someone in the eyes.”
TikTok spokesperson Alex Haurek criticized NPR for publishing the now-redacted information.
“It is highly irresponsible of NPR to publish information that is under a court seal,” said Haurek. “Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.
“We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16,” Haurek said.