How AI Is Being Used to Influence and Disrupt the Election


With the rapid advancement of generative AI technology over the past few years, it’s no longer a question of whether artificial intelligence will have an impact on this fall’s rematch between Joe Biden and Donald Trump and on other races — but how much. There’s now an ever-growing number of AI tools that political campaigns, operatives, pranksters, and bad actors can use to influence voters and possibly disrupt the election. And as many experts are warning, in the absence of stronger regulation, things could get messy fast. Below, we’re keeping track of how this first U.S. election of the AI era is playing out, including the deepfakes and other ways AI has already been used for political gain, and what legislators and tech firms are doing about it (or at least say they are).

Two days before the New Hampshire primary in January, a robocall featuring an AI-generated imitation of President Biden’s voice was sent to thousands of people in the state urging them not to vote. The call was also spoofed to appear as if it had come from the phone of a former state Democratic Party official. Independent analysis later confirmed that the fake Biden voice had been created with ElevenLabs’ AI text-to-speech voice generator.

The New Hampshire attorney general’s office launched an investigation into the robocall and subsequently determined it had been sent to as many as 25,000 phone numbers by a Texas-based company called Life Corporation, which sells robocalling and other services to political organizations.

On February 23, NBC News reported that a New Orleans magician named Paul Carpenter had admitted to using ElevenLabs to create the fake Biden audio. Carpenter said he did it after being paid by Steve Kramer, a longtime political operative then working for Democratic presidential candidate (and AI proponent) Dean Phillips. The Phillips campaign has denied any knowledge of the effort.

“I was in a situation where someone offered me some money to do something, and I did it,” Carpenter said. “There was no malicious intent. I didn’t know how it was going to be distributed.” He told NBC he was admitting his role in part to call attention to how easy it was to create the audio:

Carpenter — who holds world records in fork-bending and straitjacket escapes, but has no fixed address — showed NBC News how he created the fake Biden audio and said he came forward because he regrets his involvement in the ordeal and wants to warn people about how easy it is to use AI to mislead. Creating the fake audio took less than 20 minutes and cost only $1, he said, for which he was paid $150, according to Venmo payments from Kramer and his father, Bruce Kramer, that he shared.

“It’s so scary that it’s this easy to do,” Carpenter said. “People aren’t ready for it.”

Kramer, who previously worked on Kanye West’s failed 2020 presidential campaign, was paid nearly $260,000 by the Phillips campaign in December and January for ballot-access work in Pennsylvania and New York. A Phillips campaign spokesperson told NBC News that the campaign played no part in the AI robocall.
