At least 83 clips featured children in sexualized content, according to a disturbing investigation.
An investigation published by Bloomberg paints a grim picture of the problems Twitch faces in moderating its livestreaming platform, particularly its Clips feature, which lets users save short videos. After analysing more than 1,100 clips, the outlet found that at least 83 of them contained sexually explicit content involving children. Twitch removed the videos once it was made aware of them, and a spokeswoman for the company told Newtechmania by email that it has since “invested heavily in enforcement tooling and preventative measures, and will continue to do so.”
Bloomberg highlighted one incident that illustrates the concern with Clips’ permanence on an otherwise ephemeral platform. It recounts the distressing story of a 12-year-old who went live on Twitch in the spring of last year “to eat a sandwich and play his French horn.” Almost immediately, he began taking requests from viewers, which, in a terrible mirror of behaviour that occurs online, led to the child pulling down his pants.
According to the outlet, the moment was over “in an instant.” Nevertheless, Clips’ recording capability allowed one viewer, described as following more than 100 accounts belonging to minors, to save it. The 20-second clip reportedly racked up more than 130 views before Twitch was alerted and removed it.
Clips was introduced in 2016 as a way to preserve moments on the platform that would otherwise be fleeting. The feature captures the 25 seconds before the record button is pressed and the five seconds after. An unintended consequence is that it gives predators the opportunity to capture a harmful moment and use it in another context.
Twitch plans to expand Clips this year as part of a strategy to generate more TikTok-style content on the platform. The company intends to launch a TikTok-like discovery feed where users can post their own short videos.
The Canadian Centre for Child Protection analysed the 83 exploitative clips cited in Bloomberg’s report and concluded that 34 of them showed young people exposing their genitals on camera, reportedly in many cases young boys between the ages of 5 and 12. An additional 49 clips contained sexually explicit content featuring minors “exposing body parts or being subjected to grooming efforts.”
According to the organisation, the 34 videos deemed “most egregious” were viewed a total of 2,700 times. The remainder had a combined 7,300 views.
Twitch’s response
“We take this issue extremely seriously and we do not tolerate any form of harm to young people that occurs anywhere online,” a spokeswoman for Twitch told Newtechmania by email. After being alerted to the child sexual abuse material (CSAM), the company says it developed new models to detect potential grooming behaviour and is updating its existing tools to more effectively identify and remove banned users who attempt to create new accounts, including for youth-safety issues.
Twitch also emphasises that its safety staff have stepped up enforcement of livestreams, the source from which Clips are created. “This means that when we disable a livestream that contains harmful content and suspend the channel, we are preventing the creation and spread of harmful clips at the source,” the company noted. “It is also important to note that we have made efforts to ensure that when we delete and disable clips that violate our community guidelines, those clips are not accessible through public domains or any other direct links.”
“We also acknowledge that, unfortunately, the dangers that can be found online continue to develop,” the representative stated. The company says it has updated the guidelines its internal safety teams use to identify some of these emerging threats, such as generative AI-enabled CSAM. Twitch has also expanded the number of external organisations it collaborates with in the hope of rooting out similar content in the future.
Problems with Twitch’s moderation system
According to Bloomberg, Clips has been one of the least-moderated areas of Twitch. In April 2023, the company cut 15 percent of its internal trust and safety team amid a brutal year of layoffs across the technology industry, and it has become increasingly reliant on external partners to combat CSAM.
Twitch’s emphasis on livestreaming makes moderation harder than on more conventional video platforms such as YouTube or Instagram. Those platforms can compare uploaded videos against hashes, digital fingerprints that identify previously known abusive files circulating online. “Hash technology looks for something that’s a match to something seen previously,” Lauren Coffren of the US National Center for Missing & Exploited Children told Bloomberg. Livestreamed content, by contrast, is brand new, so there is nothing on record for it to match against.
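To illustrate the limitation Coffren describes, here is a minimal, hypothetical sketch of hash matching in Python. It is not Twitch’s or NCMEC’s actual system; production tools rely on perceptual hashes that survive re-encoding and cropping, whereas this sketch uses a plain cryptographic hash purely to show the principle that only previously seen files can be matched.

```python
# Conceptual sketch of hash matching (not any platform's real system):
# compute a fingerprint for an uploaded file and check it against a
# set of fingerprints of previously identified files.
import hashlib

# Hypothetical database of known fingerprints (this entry is simply the
# MD5 of b"hello", standing in for a previously flagged file).
KNOWN_HASHES = {"5d41402abc4b2a76b9719d911017c592"}

def fingerprint(data: bytes) -> str:
    """Digital fingerprint of a file's exact bytes."""
    return hashlib.md5(data).hexdigest()

def is_previously_seen(data: bytes) -> bool:
    """A match can only succeed for material that has been seen before."""
    return fingerprint(data) in KNOWN_HASHES

print(is_previously_seen(b"hello"))      # True  -- file is in the database
print(is_previously_seen(b"new frame"))  # False -- livestreamed content is
                                         # brand new, so nothing matches
```

The second check is the crux of the problem the article describes: because a livestream is generated in real time, its frames have no prior fingerprint to compare against, so hash matching alone cannot catch abuse as it happens.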