The contents of the complaint that Kentucky filed against the app were reported by NPR. The documents in question were not properly redacted.
According to National Public Radio (NPR), TikTok executives and staff were well aware that the app’s features encourage compulsive use, as well as the serious mental health consequences associated with that behavior. The broadcaster reviewed case records filed by the Kentucky Attorney General’s Office and published by Kentucky Public Radio; the redactions in the documents were faulty, leaving the material readable. A few days ago, attorneys general from more than a dozen states filed lawsuits against TikTok, accusing the company of “falsely claiming [that] it is safe for young people.” Russell Coleman, the Attorney General of Kentucky, stated that the application was “specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control.”
While most of the documents filed in the litigation were redacted, the redactions in Kentucky’s filing were made incorrectly. TikTok’s own research apparently found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” TikTok’s management was also aware that excessive use can not only disrupt sleep but also interfere with responsibilities at work and school, and even get in the way of “connecting with loved ones.”
According to the reporting, executives also knew that the app’s time-management tool does little to curb young users’ time on the app. Although the tool caps daily use at sixty minutes, adolescents still spent an average of 107 minutes per day on the app with the tool in place, only 1.5 minutes less than the 108.5-minute daily average before it launched. According to internal records, TikTok judged the tool’s effectiveness by how it “improved public trust in the TikTok platform via media coverage.” As one company document put it, “[m]inors do not have executive function to control their screen time, while young adults do.” The company knew the feature would not work; another document noted that “across most engagement metrics, the younger the user, the better the performance.”
Furthermore, TikTok is allegedly aware of “filter bubbles” and the risks they can pose. According to the documents, employees ran internal tests and found themselves pulled into negative filter bubbles shortly after following certain accounts, such as those focused on painful (“painhub”) and sad (“sadnotes”) content. The company is also aware of content and accounts promoting “thinspiration,” a term common among people with eating disorders. TikTok’s own researchers found that the recommendation algorithm can place users inside a filter bubble after as little as thirty minutes of use in a single sitting.
The documents also suggest that TikTok is struggling with moderation. An internal inquiry found that underage girls on the app were receiving “gifts” and “coins” in exchange for live strip dancing. Higher-ups at the company have also allegedly told moderators not to remove users suspected of being under 13 unless their accounts explicitly state that they are under 13. According to NPR, TikTok has also acknowledged that a significant amount of content violating its guidelines slips past its moderation processes, including videos that normalize pedophilia and that glorify minor sexual assault and physical abuse.
Alex Haurek, a spokesman for TikTok, defended the company and told the outlet that the Kentucky Attorney General’s complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.” He added that TikTok has implemented “robust safeguards, which include proactively removing suspected underage users” and has “voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”