New Hampshire voters received a robocall impersonating the president.
According to Bloomberg, ElevenLabs, an artificial intelligence startup that offers voice cloning tools, has banned the user who created an audio deepfake of Joe Biden used in an attempt to disrupt the election. The robocall, sent to some New Hampshire voters last week, used a voice mimicking the president to urge them not to vote in their state's primary. It was initially unclear what technology had been used to imitate Biden's voice, but an in-depth analysis by the security company Pindrop concluded that the perpetrators had used tools developed by ElevenLabs.
Pindrop removed the background noise and cleaned up the robocall's audio before comparing it against samples from more than 120 voice synthesis engines used to make deepfakes. Pindrop CEO Vijay Balasubramaniyan told Wired that the result "came back well north of 99 percent that it was ElevenLabs." Bloomberg reports that ElevenLabs was informed of Pindrop's findings, has already identified and suspended the account responsible for the bogus audio, and is continuing its investigation. ElevenLabs told the news organization that it is unable to comment on the matter itself, but said it is "dedicated to preventing the misuse of audio AI tools and [that it takes] any incidents of misuse extremely seriously."
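The core idea behind this kind of attribution, comparing a clip's acoustic signature against reference samples from many synthesis engines and scoring the closest match, can be sketched in miniature. Pindrop's actual pipeline is proprietary; the function names, feature vectors, and engine labels below are invented for illustration, with cosine similarity standing in for whatever matching metric a real system would use.

```python
# Hypothetical sketch: attribute a voice clip to a synthesis engine by
# comparing its feature vector to per-engine reference fingerprints.
# All names and numbers are illustrative, not Pindrop's real method.
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def attribute_engine(clip_features, engine_fingerprints):
    """Return (engine_name, score) for the best-matching reference fingerprint."""
    return max(
        ((name, cosine_similarity(clip_features, fp))
         for name, fp in engine_fingerprints.items()),
        key=lambda pair: pair[1],
    )


# Toy fingerprints standing in for reference samples from each engine.
fingerprints = {
    "engine_a": [0.9, 0.1, 0.3],
    "engine_b": [0.2, 0.8, 0.5],
    "engine_c": [0.4, 0.4, 0.9],
}

# Features hypothetically extracted from the denoised robocall audio.
clip = [0.85, 0.15, 0.35]

best_engine, score = attribute_engine(clip, fingerprints)
# best_engine is "engine_a" with a similarity score above 0.99
```

A production system would extract real acoustic features (e.g. spectral embeddings) from the cleaned audio rather than hand-written vectors, and would compare against far more than three candidates, but the best-match-with-confidence structure is the same.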
With the US presidential election approaching, the deepfaked Biden robocall shows how technologies capable of imitating another person's likeness and voice could be used to manipulate votes. "This is kind of just the tip of the iceberg in what could be done with respect to voter suppression or attacks on election workers," Kathleen Carley, a professor at Carnegie Mellon University, told The Hill. "It was almost a harbinger of what all kinds of things we should be expecting over the next few months."
Within days of ElevenLabs releasing the beta version of its platform, people were already using it to create audio clips that sounded like celebrities reading or saying inappropriate things. The startup lets customers use its technology to clone voices for "artistic and political speech contributing to public debates." Its safety page warns users that they "cannot clone a voice for abusive purposes such as fraud, discrimination, hate speech, or for any form of online abuse without infringing the law." It is abundantly clear, however, that stronger safeguards are needed to keep malicious actors from using its tools to influence voters and manipulate elections around the world.