If it becomes law, the bill would give victims a ten-year window to file lawsuits for damages.
The United States Senate on Tuesday unanimously approved a measure aimed at holding accountable those who create or distribute deepfake pornography. The DEFIANCE Act, short for the Disrupt Explicit Forged Images and Non-Consensual Edits Act, would allow victims to sue people who create, share, or possess AI-generated sexual images or videos that use their likeness. The issue gained widespread public attention earlier this year, after explicit Taylor Swift deepfakes circulated widely online.
If passed, the bill would allow victims to sue for damages of up to $150,000. That figure rises to $250,000 in cases tied to attempted sexual assault, stalking, or harassment.
A companion bill, sponsored by Rep. Alexandria Ocasio-Cortez (D-NY), is awaiting a vote in the House of Representatives. If it passes there, which seems likely given the unanimous Senate vote, it would head to President Biden's desk to be signed into law.
“There’s a shock to seeing images of yourself that someone could think are real,” Ocasio-Cortez told Rolling Stone earlier this year. Once you have seen such an image, she said, you never forget it. She compared the intent behind deepfakes to that of sexual assault and rape: to humiliate, dominate, and exert power over the victim. Deepfakes, in her view, are a way of digitally inflicting that violent humiliation on others.
The bill, sponsored by Senator Dick Durbin (D-IL), treats sexual deepfakes as digital forgeries and gives victims the right to sue for damages. Its ten-year statute of limitations would run from the time a victim discovers the content or, for children, from the day they turn 18.
Last night, Senate Majority Leader Chuck Schumer (D-NY) said on the Senate floor: “As we all know, artificial intelligence plays a more significant role in our lives than it ever has before. While it does have many advantages, it is also easier than ever before to create sexually explicit deep fakes without the consent of a person.” Circulating these fabricated images online without any legal recourse, he said, is a heinous violation of victims’ privacy and dignity.
During his floor speech, Schumer cited Taylor Swift and Megan Thee Stallion as examples of celebrities affected by the kind of content the measure seeks to regulate. But as The Verge notes, online sexual deepfakes have also harmed people with far less influence (and money for lawyers) than A-list pop stars. High school girls, for instance, have discovered that classmates were circulating fabricated sexual images of them.
The measure also includes provisions to protect victims’ privacy during court proceedings and to let them recover their legal expenses. Schumer called the practice “grotesque,” adding that “victims of these deep fakes deserve justice.”