Byte-Sized Diplomacy: AI-Enabled Elections or Deepfake Democracy?

Lowy Institute


Details

Date Published
3 July 2024


Summary

The article examines the growing influence of artificial intelligence on democratic processes, with a focus on elections around the world. It highlights the increasing concern over AI-generated misinformation and deepfakes, which pose significant challenges to election integrity. Within the Australian context, it references warnings from Home Affairs Minister Clare O’Neil about the weakening of democratic fundamentals and the limited protections currently offered by electoral laws against AI threats. The article emphasizes the need for stronger governance frameworks and policies to address these challenges globally, while also noting the role of human agency and media in mitigating AI's impact on misinformation.

Body

Got a big question on technology and security for "Byte-sized Diplomacy"? Send it through here.

We are hearing a lot about AI and elections – should we be concerned?

Elections are a fundamental part of democracy. However, we can't afford to miss the impacts of AI across the practices of democracy, especially in what's been described as a "bumper election year" around the globe. Alongside polls, parties and politics, we need to understand neurotechnology, knowledge access, the data economy and corporate tech governance.

The latest call to curb harm from AI mis- and disinformation came from UN Secretary-General António Guterres. Trackers are identifying noteworthy instances of AI generally, and generative AI specifically, as it goes head to head with political campaigns and elections. In the United States, an AI voice clone of President Joe Biden targeted New Hampshire constituents, telling them not to vote. Countless examples of AI-generated images and text of political candidates have appeared, including deepfakes of Donald Trump and Narendra Modi, as well as "softfakes" generated to improve the candidate appeal of Prabowo Subianto and Imran Khan.

Australia's Home Affairs Minister Clare O'Neil warned last month that "technology is weakening democratic fundamentals such as free and fair elections and open political debate". Populations worldwide are worried, Australians among the most. The latest Digital News Report shows that concern in Australia about misinformation increased to 75% this year, up from 64% in 2022. The Ipsos AI Monitor shows 52% of Australians think AI will make disinformation on the internet worse. A 2024 Adobe study, Future of Trust, found that 78% of Australian respondents think misinformation and deepfakes will impact elections.

Australian electoral law presently offers only limited protections to combat deepfakes, disinformation and AI threats.
This has led to increasing calls to ban AI-generated election materials. Electoral Commissioner Tom Rogers noted that the Australian Electoral Commission (AEC) "does not possess the legislative tools, or internal technical capability, to deter, detect or adequately deal with false AI generated content concerning the election process". The AEC was not set up to establish truth in politics – who could? – but to maintain an impartial and independent electoral system.

Rogers also noted a decline in the AEC's previously very good relationship with social media companies across the past 18 months. This aligns with an overall deterioration of resourcing and responsiveness in dealing with serious challenges. Election integrity and enforcement priorities are in flux at some of the biggest social media platforms. Some platform policy changes have systematically amplified authoritarian state propaganda and fake news. Companies no longer appear to be acting together to disrupt foreign influence operations on their platforms. There seems to be an approach equating content moderation with censorship. Recent assessments of influence operations by Meta and OpenAI are a good start but require transparency.

The silencing of voices from public space is a primary concern. Women, LGBTQI and Indigenous people experience online hate speech at more than double the national average in Australia. Such experience, especially for women, is global.

AI threats to democratic processes and institutions are emerging across elections in campaigns, information and infrastructure, although the evidence of AI impacts on specific election results is limited, which is unsurprising given the recency of its widespread adoption. It is important not to unnecessarily hype this issue or suggest foreign interference when there isn't evidence.
But ensuring election integrity is not just about making sure voters have access to verified, truthful information, even though this is critical. Australian democratic process and bureaucratic institutions are based on four key ideas: active and engaged citizens, an inclusive and equitable society, free and franchised elections, and the rule of law for all. Australia not only has compulsory voting, but we also enforce it – exceptional in the current context.

Technological change cannot be ignored (Tiffany Tertipes/Unsplash)

However, rather than focusing only on elections, it is vital to improve the resilience and integrity of the information environment across democratic processes. The world is on the cusp of further technological and social changes, and ensuring the settings that support them reflect our values and interests is critical.

Two emerging threats should be top of mind: changes in the way we access and assess knowledge, and the expansion of consumer neurotechnology applications.

Many AI tools – and especially voice assistants – deliver outputs that remove the context that helps us to assess information accuracy, source and integrity. This is changing the way people interact with information and create knowledge. Understanding the inputs and processes of AI, its algorithmic influence and interference potential, and who shapes knowledge production is vital to mitigating its disastrous potential.

Tech companies developing foundational technologies such as AI are pushing the boundaries of acceptable and appropriate corporate governance. The choices of these companies and their processes, procedures and practices shape the information environment. But these companies are unelected and largely opaque. Systemic changes to big tech business models will take time, but all signs suggest they are impending. This will help get the digital backbone of our society right.

The concern around AI also misses the role of human agency.
Tighter restrictions on political content would help – but critical to its impact is the distribution of the message and who it is targeting. Watermarking genuine or AI-generated content is one great provenance suggestion, but the idea is only one of many needed.

Having an election doesn't guarantee a democracy, but you cannot have a democracy without an election. AI and emerging technologies are causing friction to democratic institutions and processes. In the face of this friction, we need to steady our institutions, forums and processes. Hopefully the focus and interest in AI in elections will translate into legislative, policy and industry changes to strengthen the integrity of our institutions and information environment.