On July 21, 2024, Joe Biden announced that he would be dropping out of the race, only a week after the assassination attempt on the now-official Republican candidate, Donald Trump. The electoral campaign, which has so far been characterized by immense uncertainty, provides a perfect breeding ground for the spread of disinformation about the candidates and the election process itself. As the stakes of the upcoming US elections remain incredibly high for both domestic and international politics, disinformation is becoming one of the most pressing challenges of our time.
Times of Uncertainty
On July 13, 2024, there was an assassination attempt on Donald Trump during a rally in Butler, Pennsylvania. As the motives behind the attempt remain unclear, disinformation and conspiracy theories surrounding the case have flooded the internet and social media platforms. Stories claiming that the assassination attempt was an inside job by the US Secret Service spread in parallel with those stating it was a performance orchestrated by Trump himself, further polarizing an already divided society.
Developments like this should not come as a surprise. Policymakers and scholars have long pointed to the potentially harmful effects of disinformation, especially during election periods. At the beginning of the year, the World Economic Forum’s Global Risks Report highlighted manipulated and falsified information as “the most severe short-term risk the world faces”. This threat to the election process, and to democratic processes more generally, will be especially challenging in this super election year, as more people than ever before head to the polls in 2024. The US is a particularly interesting case, as it already has a proven track record as a target of foreign electoral interference campaigns. German and EU policymakers, who are far from immune to such malicious activities, should therefore look closely at how disinformation impacts the US elections in order to better understand and mitigate similar risks.
First Efforts to Institutionalize the Fight Against Disinformation
By now, there is strong evidence that foreign countries, especially Russia, interfered in the 2016 US elections. Research by Allcott and Gentzkow has shown that disinformation reached a significant portion of the electorate in the months leading up to the polls, even though it is highly unlikely that it swayed the actual vote in 2016, since the average American voter did not rely on social media as their primary source of political news.
The 2016 election marked a crucial turning point, sparking increased interest in the study of disinformation and its effects. From a policy perspective, this election was also a pivotal moment: the US Congress introduced a bipartisan bill with the overarching goal of protecting the interests of the US and its allies from foreign information operations, such as propaganda and disinformation. The bill was signed into law in December 2016. It acknowledged the need for a “whole-of-government” approach to countering disinformation and called for the establishment of a center to lead and coordinate the analysis of foreign information warfare. Although no separate agency tasked solely with countering disinformation was created, the mission of the Global Engagement Center (GEC), originally established to fight extremism, was expanded to “recognizing, understanding, exposing, and countering foreign state and non-state propaganda and disinformation efforts”. Policy reports suggest that the GEC has used its $120 million budget to counter state-sponsored disinformation by funding civil society groups and private sector partners. This includes funds for research into disinformation and counter-disinformation tactics, support for journalists and fact-checkers, and backing for local counter-disinformation efforts.
Science Behind Disinformation
In parallel with government efforts to institutionalize the fight against disinformation, science has made progress in determining who is most likely to fall for disinformation, and why. Research suggests that confirmation bias plays an important, if not crucial, role: people tend to believe things that reinforce their prior political beliefs. Other factors, such as simple repetition or group identification (e.g., party affiliation during an election period), may further increase susceptibility to disinformation. Disinformation may not entirely change a pre-existing set of beliefs. However, mere exposure to disinformation that favors a candidate one already supports can make objective judgment increasingly difficult. In a society as polarized as the US, this is particularly worrisome: disinformation spreads easily, which in turn exacerbates polarization.
Addressing this issue is therefore of the utmost importance for maintaining electoral integrity. Singer and Brooking point out that the first step is taking the issue seriously. They further argue that investing in media literacy should be seen as a “national security imperative”. There is strong evidence that media literacy training is effective. Similarly, investing in (local) journalism should be an important goal, especially as trust in the media continues to decline. Neither measure provides a quick fix, as both are costly and take time to reach a significant number of people. Short- to medium-term solutions, such as fact-checking, therefore remain very important. However, fact-checking alone is not the most effective tool for combating disinformation, as it does not address the root of the problem. As Carnegie researchers point out in a report, there is no “silver bullet” for combating disinformation: governments should implement a variety of policy measures to manage the threat posed by disinformation campaigns in real time, while also pursuing more sustainable solutions that take time to implement but will make society more resilient in the long run.
Long Road Ahead
Since the passage of the 2016 bill meant to address propaganda and disinformation, little else has been done in the US to tackle the issue and actively implement the strategies mentioned above. The GEC has started funding fact-checking institutions and research on foreign propaganda and disinformation tactics. However, the center’s work has been set back by funding delays. Moreover, although the GEC has made some progress in its mission and has established partnerships with foreign countries, its existence is now under threat. Its current funding is set to expire at the end of this year and needs to be reauthorized by Congress, and members of the Republican Party have accused the GEC of operating within the US. The agency, however, makes clear that its mission is to tackle “foreign state and non-state propaganda and disinformation efforts”.
One of the latest additional efforts was the introduction of the Disinformation Governance Board (DGB), which briefly served as an advisory board to the US Department of Homeland Security under the Biden administration before being quickly dissolved. Only a few days after the board was officially announced, members of the Republican Party as well as some civil rights advocates criticized the DGB for a lack of transparency about how it would operate, and concerns were raised that the institution was partisan and would police content on the internet. While party politics may have played a role in shutting down the DGB before it could even get to work, reports studying effective ways of combating disinformation had already pointed out the dangers of government-mandated content control, arguing that it might be counterproductive due to the First Amendment issues it could raise.
AI and Deepfakes
As traditional approaches face challenges, new threats are emerging. Amid the rising tide of disinformation, the use of AI and deepfake technology is gaining traction and becoming a prominent feature of the upcoming US elections. One story that made many headlines was an AI-generated robocall mimicking the voice of Joe Biden, meant to discourage voters ahead of the New Hampshire primary. Deepfakes, and disinformation more generally, are nonetheless a bipartisan issue. As some research on deepfakes during the current US elections shows, AI-generated content targets both Biden and Trump. At the beginning of the year, for instance, a photo showing Trump and convicted sex offender Jeffrey Epstein with a young girl circulated thousands of times on the social media platform X; it was later proven to be AI-generated. Such content is particularly harmful because it further blurs the line between real and fake as AI gets better and better at simulating reality. Deepfakes have also begun targeting Kamala Harris, who emerged as the Democratic candidate when Biden dropped out of the race. AI-generated photos of her next to Epstein are now all over the internet, in a kind of irony, as it is the same kind of image-altering that targeted Trump at the beginning of the year. This shows that such content harms Democrats and Republicans alike and feeds on the already existing deep divide between the two parties and their voters.
Outlook
Democracies, especially in times of elections, rely on well-informed citizens. By now it is evident that disinformation poses a significant threat to the electoral process by fundamentally undermining trust in democratic processes and institutions. The recent assassination attempt on Donald Trump and Joe Biden’s decision to drop out of the race have only heightened the atmosphere of uncertainty and mistrust. In addition, the proliferation of AI-generated deepfakes is further polluting the information environment.
The 2016 elections served as a wake-up call, leading to increased scrutiny of disinformation and legislative efforts to protect against foreign interference. However, much remains to be done at the policy level to effectively combat this issue. The US government must fully embrace the “whole-of-government” approach outlined in the 2016 legislation. Cutting funding for the Global Engagement Center (GEC), the only interagency body dedicated to countering disinformation targeting the US and its allies, would be a significant setback. Instead, the GEC should be empowered to implement long-term, sustainable strategies that address disinformation beyond election cycles. Strengthening collaboration with international partners, such as the EU, is also crucial for tackling this global challenge effectively. As we navigate the complexities of the 2024 elections, policymakers, researchers, and the general public must remain vigilant and proactive in addressing the ever-evolving landscape of disinformation, for the sake of our democracies.