by Alan Wooten
More than 3 in 4 Americans fear that abuses of artificial intelligence will affect the 2024 presidential election, and many are unsure whether they will be able to detect faked photos, videos and audio files.
AI & Politics ’24, led by Lee Rainie and Jason Husser of Elon University, found that 78 percent believe artificial intelligence is likely to be used to influence the outcome of the contest between President Joe Biden and former President Donald Trump. Thirty-nine percent believe artificial intelligence will harm the electoral process, and only 5 percent think it will help.
“Voters believe this election will take place in an extremely challenging information environment,” said Rainie, director of Elon’s Imagining the Digital Future Center. “They predict that artificial intelligence will enable new kinds of misinformation, faked materials and voter-manipulation tactics. Worse, many are not sure they can sort out the fakery they know is contaminating campaign content.”
The poll of 1,020 adults nationwide aged 18 or older was conducted by the Imagining the Digital Future Center and the Elon University Poll. The survey was fielded April 19-21 and released Wednesday morning; it has a margin of error of +/-3.2% at a 95% confidence level.
Other significant findings include that 46 percent of respondents say candidates who maliciously alter or falsify photos, video or audio should be disqualified from holding office, and 69 percent are not confident they can detect faked photos.
Nearly one in four respondents, 23 percent, say they have used large language models or chatbots such as ChatGPT, Gemini or Claude. When asked whether those tools are politically biased, most Democrats, Republicans and independents said they were not sure.
When asked about their confidence in the voting process in this presidential election, 60 percent are “very” or “somewhat” confident that citizens’ votes will be accurately cast and counted. By party, 83 percent of Democrats are confident, while 60 percent of Republicans are not.
On the question of candidates’ misuse of AI, the survey offered four options: “If a political candidate is proven to have maliciously and intentionally digitally altered or faked photos, videos or audio files, which of these penalties should apply:”
Ninety-three percent wanted some punishment. The choices were no punishment (4%), a significant fine (12%), criminal prosecution (36%), and being barred from holding office or removed from office if the candidate wins the election (46%). Patterns in the data included women being more likely to favor barring offenders from office, while criminal prosecution drew more support from households earning more than $100,000 and from those with college degrees. Republicans (17%) were more likely than Democrats (8%) to favor a fine. The no-punishment option did not rise above single-digit support in either group.
The poll found that 61 percent of respondents are very or somewhat confident they can find correct and reliable news and information during the election. However, only 28 percent expressed that confidence about the majority of voters. In another question, 53 percent said it was very or somewhat easy “to get the political news and information you need” these days.
“Disinformation in elections has existed since before the invention of computers, but many people worry that in 2024, advances in artificial intelligence technologies will give bad actors an accessible tool to spread disinformation on an unprecedented scale,” said Husser, a professor of political science and director of polling at Elon University. “We know that most voters are aware of the risks of artificial intelligence for the 2024 elections. However, the behavioral consequences of this awareness will remain unclear until we see the repercussions of AI-generated disinformation.
“The optimistic hope is that risk-aware voters will approach information with greater caution in the 2024 cycle, making them more sophisticated consumers of political information. The pessimistic outlook is that concerns about AI-related disinformation may translate into reduced self-efficacy, institutional trust and civic engagement.”
– – –
Alan Wooten is an editor at The Center Square.
“Election Results” photo by Clay Banks.