According to The Wall Street Journal, teenage users don't even have to go looking for explicit Reels for Instagram to serve them up.
In separate tests conducted by The Wall Street Journal and Laura Edelson, a professor at Northeastern University, Instagram recommended sexually suggestive Reels to accounts registered as 13-year-olds, even when those accounts were not searching for racy videos. For the tests, run mostly between January and April of this year, both parties created fresh accounts with the age set to 13. From the start, Instagram appears to have served mildly sexualized videos, such as clips of women dancing suggestively or footage focused on bodies. Accounts that watched those videos while skipping other Reels then began receiving recommendations for more explicit content.
Some of the recommended Reels showed women miming sexual acts, while others promised to send nudes to users who commented on the creators' accounts. The test accounts were also reportedly shown videos of people exposing their genitals; in one case, the supposed underage user was served "video after video about anal sex." Sexual Reels began appearing as little as three minutes after the accounts were created, and after just 20 minutes of watching, the majority of the videos recommended in their Reels feeds came from creators producing sexual content.
Notably, The Journal and Edelson ran the same test on TikTok and Snapchat and found that neither platform pushed sexual videos to the teen accounts. Even after deliberately searching for age-inappropriate videos and following the creators who make them, the accounts still did not receive recommendations for that kind of content.
The Journal reports that Meta employees have flagged similar problems in the past, citing unpublished internal documents detailing research into the potentially harmful experiences young users have on Instagram. According to the paper, Meta's safety staff had previously run the same kind of test and gotten comparable results. Company spokesperson Andy Stone, however, dismissed the findings, telling The Journal, "This was an artificial experiment that does not match the reality of how teenagers use Instagram." He added that the company had "established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months."
In January, Meta rolled out significant safety updates for teen users, including automatically placing them under its most restrictive content-control settings, which they cannot opt out of. The Journal's tests were conducted after those updates shipped, and it was able to reproduce the results as recently as June. Meta announced the changes shortly after The Journal published an earlier investigation, which found that Instagram's Reels would serve "risqué footage of children as well as overtly sexual adult videos" to test accounts that exclusively followed teen and preteen influencers.