The First AI Election: When Convincing Deepfakes Influence the Electorate

by Nolan Higdon | Oct 30, 2024 | Commentary, Featured

“We are so screwed, it’s beyond what most of us can imagine,” exclaimed Aviv Ovadya. “We were utterly screwed a year and a half ago, and we’re even more screwed now. And the further you look into the future, the worse it gets.” Ovadya, CEO of the AI & Democracy Foundation, was referring to advancements in artificial intelligence (AI), which he warned would cause an “Infocalypse.” He first issued this warning in 2016, and nearly a decade later, the U.S. finds itself in a presidential election where that warning has become a real factor. Much has been said about how the 2024 election is historically significant, with a former president and convicted felon running for office and a mixed-race woman within striking distance of the presidency. A less-discussed but equally important fact, however, is that this is the first artificial intelligence-driven election.

Much of Ovadya’s concern was rooted in the development of deepfakes: synthetic media that use AI to create hyper-realistic videos, audio, and images.

Deepfakes existed in previous presidential cycles, as seen in 2020 when videos circulated showing then-Speaker of the House Nancy Pelosi appearing drunk. Those videos were clearly fabricated or altered to the naked eye. In the years since, however, AI has advanced significantly. Companies like OpenAI, Alphabet’s Google, and Microsoft have developed user-friendly AI tools that allow individuals with little to no technical skill to create false or misleading images, videos, and written content that appear legitimate. For example, shortly after President Joe Biden announced his decision to run for re-election in 2023 (a decision he has since reversed), the Republican Party created and distributed an AI-generated video response that stoked fear about his continued presidency.

It is no surprise that in an election year, these new tools are being used to influence voters. For example, Trump supporters have circulated convincing AI-generated images showing Trump in a variety of heroic or divine roles, from praying under a heavenly light to rescuing people with a gun during the 2024 hurricanes. They have also distributed images of Vice President Kamala Harris speaking in what appears to be a Soviet Union-era venue, positioning her as Joseph Stalin incarnate, and similarly themed deepfakes purport to show Harris’ communist credentials. Relatedly, Elon Musk, a Trump supporter, shared on his X (formerly Twitter) platform a fake audio clip of Harris saying things she never actually said.

AI-generated images not only reflect partisan tactics but also expose the continued legacy of racism in the U.S. While AI-generated content featuring figures like Trump or Biden appears remarkably accurate, the same cannot be said for Harris: the images of her bear little resemblance to the real person. Scholars have noted that this discrepancy arises because AI models are largely trained on data from a predominantly white population, reflecting Silicon Valley’s demographic makeup, which is disproportionately white and male. While these errors may be corrected in the future, they highlight the racial inequities embedded in AI technology and its development.

Fake news, such as AI-generated content, misleads the electorate. However, the problem with AI content goes beyond its falsehood. Once people become aware that deepfakes exist, they can no longer take images, video, or audio at face value. This skepticism, though sometimes healthy, has been exploited by political campaigns. For example, Trump falsely claimed that a large crowd at a Harris rally was AI-generated. Journalists debunked the claim, but the fragmented media landscape and declining trust in journalism make it unlikely that the correction will reach or persuade the entire electorate.

The blending of truth, lies, and deepfakes can be used to gaslight the electorate, causing people to question reality when they can no longer distinguish fact from fiction. A notable example from the 2024 election was the Trump campaign’s release of deepfake images showing Taylor Swift fans, known as “Swifties,” wearing “Swifties for Trump” shirts. Social media posts claimed Swift had endorsed Trump. A few weeks later, the real Swift endorsed Harris, but this did not stop Trump supporters from continuing to spread AI-generated images suggesting otherwise. For voters who do not follow or trust the news, such mixed messages can be highly confusing.

We are just beginning to understand the role and influence of AI in electoral politics. Voters can take steps to mitigate the harms of AI, such as slowing down and investigating the source of content; not believing everything they see, hear, or read; corroborating media messages; and avoiding sharing content on social media until they are certain it is true. However, broader discussions at the governmental level are needed to address the challenges AI poses to democracy. These discussions should include revisiting copyright laws regarding people and brands, as well as exploring defamation, libel, and responsibility for content creation and dissemination. These challenges will likely grow more pronounced in the years ahead. The 2024 election is already giving us a glimpse of what’s to come, and it does not look good.


Nolan Higdon is an author, lecturer at Merrill College and the Education Department at the University of California, Santa Cruz, Project Censored National Judge, and founding member of the Critical Media Literacy Conference of the Americas. His work is available on Substack. Higdon is the author of The Anatomy of Fake News: A Critical News Literacy Education (2020); Let’s Agree to Disagree: A Critical Thinking Guide to Communication, Conflict Management, and Critical Media Literacy (2022); and The Media and Me: A Guide to Critical Media Literacy for Young People (2022). He is a regular source of expertise on media issues and current political events for CBS, NBC, The New York Times, and The San Francisco Chronicle.

