The voice in a robocall sounded like Biden but was actually created by a New Orleans magician, says the political consultant behind the scheme, who wanted to send a 'wake-up call' about the dangers of AI
The political consultant behind a robocall that mimicked President Joe Biden’s voice said Monday he was trying to send a wake-up call about the potential malign uses of artificial intelligence, not influence the outcome of last month’s New Hampshire primary.
Steve Kramer, in an interview days after he was publicly identified as the source of the calls, confirmed paying a New Orleans street magician $150 to create a recorded message that was sent to thousands of voters two days before the first-in-the-nation primary on Jan. 23. The messages played a voice similar to Biden’s that used his phrase “What a bunch of malarkey,” and falsely suggested that voting in the primary would preclude voters from casting a ballot in November.
“Maybe I’m a villain today, but I think in the end we get a better country and better democracy because of what I’ve done, deliberately,” Kramer said.
New Hampshire authorities have been investigating the calls as a potential violation of the state’s voter suppression law.
Kramer said he disagrees that his robocall suppressed turnout, noting that Biden won the Democratic primary by a wide margin as a write-in candidate. And though he did some ballot-access work for another Democratic presidential hopeful, Rep. Dean Phillips of Minnesota, Kramer said he acted alone to publicize the dangers of artificial intelligence.
While New Hampshire and federal authorities have issued cease-and-desist orders to two Texas companies involved in transmitting the calls, Kramer said neither of them knew what he was up to.
“Their entities had no idea what I was doing, and I don’t ask permission,” he said. “I let the chips fall where they may.”
Kramer, who owns a firm that specializes in get-out-the-vote projects, has decades of experience working on federal, state and local campaigns, many of them in New York. He said he had grown increasingly concerned since the 2022 midterm elections that campaigns, super PACs and others were poised to use artificial intelligence in harmful ways. Frustrated with the slow pace of regulation at the state and federal levels, he said, he made a New Year’s resolution to tackle the issue himself.
“One of the things I said is, I want to make a difference this year,” he said. “By deliberately doing it on Sunday night before the Tuesday primary when even people who aren’t involved in politics are at least casually watching what’s going on … gave me a way to wake up the whole country.”
Kramer said he planned to keep quiet until after last weekend’s South Carolina primary, but the magician he paid, Paul Carpenter, went to NBC News with his story. Carpenter, who specializes in card tricks and illusions, told The Associated Press on Friday that he thought Kramer worked for Biden and was surprised to learn about the criminal investigation.
“I created the gun. I didn’t shoot it,” Carpenter said.
The New Hampshire attorney general’s office declined to comment Monday. Kramer declined to say whether he has been contacted by state investigators, but said he has been subpoenaed by the Federal Communications Commission and will cooperate.
The FCC declined to comment Monday on whether it has subpoenaed Kramer, saying only that it is working diligently to combat the harmful misuse of AI.
“I wrestled in college, I’m ready for the fight,” Kramer said. “If they want to throw me in jail, good luck. Good luck, and I meant that.
“If they want to fine me for doing the right thing when they didn’t do the right thing, even though it’s been their job and they went to a fancy law school? Well, you’ve proven a point.”
Sophisticated generative AI tools, such as voice-cloning software and image generators, already are in use in elections in the U.S. and around the world, leading to concerns about the rapid spread of misinformation. Last year, as the U.S. presidential race got underway, several campaign advertisements used AI-generated audio or imagery, and some candidates experimented with using AI chatbots to communicate with voters.
Kramer estimates he spent about $500 to generate $5 million worth of media coverage.
Bipartisan efforts in Congress have sought to regulate AI in political campaigns, but no federal legislation has passed. Since the New Hampshire robocalls, however, the FCC has outlawed robocalls that contain voices generated by artificial intelligence, and major tech companies have signed a pact to adopt precautions voluntarily to prevent AI tools from being used to disrupt elections.
Kramer said he wants to see immediate action from all regulatory bodies and social platforms.
“I could care less if I pick up business or a business leaves me because of this,” he said. “I did the right thing.”