Deepfake robocall in New Hampshire brings AI-fueled disinformation to Election 2024
On January 21, a number of New Hampshire voters received a robocall that used deepfake technology to imitate the voice of President Biden.
“It’s important that you save your vote for the November election,” said the apparently AI-generated voice. “Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again. Your vote makes a difference in November, not this Tuesday.”
The message ended with the phone number of Kathy Sullivan, the former chair of the New Hampshire Democratic Party who now runs a super PAC urging voters to write Biden’s name in for the primary.
Robocalls and dirty tricks long predate artificial intelligence. But the ability to seemingly put words in the mouth of a prominent figure makes the task of disentangling true from false messages all the more difficult.
Although disinformation that leads voters to believe false claims is concerning, Center for Election Innovation & Research executive director David Becker sees a larger problem—a general erosion of trust.
“Our foreign adversaries in Russia and China and Iran, as well as domestic actors who’ve been lying about the election, don’t need to convince us that the lies are true,” says Becker. “They just need to convince us that there is no truth, that you can’t believe anything. Whether they use AI or not, they want us to generally distrust our overall election system and all information about it.”
In an environment of sophisticated AI deepfakes, the task of separating true from false can overwhelm the tools individuals rely on to judge what is real. “If Joe Biden says something in a speech, or Joe Biden says something to you in a robocall, you’re told that you should dismiss them both,” says Becker.
Far from being a one-off, the New Hampshire robocall is, in Becker’s view, “a canary in the coal mine for what we’re going to see.”