The action comes after nonconsensual pornographic deepfakes of the singer went viral last week.
This week, pornographic deepfakes of Taylor Swift began circulating on the site, and X has confirmed that it is blocking searches for the artist's name. On Saturday, visitors noticed that certain searches involving Swift's name returned only an error message. That evening, Joe Benarroch, X's head of business operations, told the Wall Street Journal in a statement, "This is a temporary action and done with an abundance of caution as we prioritize safety on this issue." The move comes several days after the problem first came to light.
X has drawn criticism for its handling of the issue from the start, particularly for being slow to stop the spread of the nonconsensual sexually explicit images. NBC News reported earlier this week that when the images went viral on Wednesday, Swift's fans took matters into their own hands to limit their visibility and get them removed, mass-reporting the accounts that posted the images and flooding hashtags related to the singer with positive content. Many of the offending accounts were eventually suspended, but not before some of the images had been viewed millions of times; according to a report published by The Verge on Thursday, a single post racked up more than 45 million views.
"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X, and we have a zero-tolerance policy towards such content," X said in a statement posted to its platform later that day. The company added that its teams were actively removing all of the images that had been identified and taking appropriate action against the accounts responsible for posting them, and that it was closely monitoring the situation to ensure any further violations were addressed swiftly. X said it is committed to maintaining a safe and respectful environment for all users.
Days later, however, the images could still be found on the platform. 404 Media determined that they most likely originated in a Telegram group known for using free tools, such as Microsoft Designer, to create AI-generated images of women without their consent. In an interview with NBC News' Lester Holt on Friday, Microsoft CEO Satya Nadella said the issue highlights the company's responsibility and "all of the guardrails that we need to place around the technology so that there is more safe content that is being produced." He went on to say that "there is a lot to be done there, and a lot is being done there," but added that the company needs to "move fast."