Fighting Disinformation in a Dangerous Year

In the spotlight: a recent viral deepfake of French politician Marine Le Pen illustrated the dangers around disinformation. Image: Associated Press / Alamy

With two billion people around the world expected to vote in elections this year, the battle to counter disinformation has never been more pressing.

As 2023 ended, a video of Marine Le Pen addressing her New Year’s wishes to French voters in Russian was viewed half a million times.

The AI-altered video, with the hashtag #MarinePoutine, played on the long-standing Kremlin links of Le Pen’s far-right National Rally. But the deepfake also caused embarrassment for President Emmanuel Macron’s centrist Renaissance party, whose spokesperson posted it ostensibly to show the impact of fake content – even though the government has been pushing for strict regulations on AI.

The video illustrates the dilemmas faced by democracies in an age of disinformation. But it also shows the need for new ways to inoculate citizens against this virus.

It followed a Washington Post investigation into Russia's efforts to undermine support for Ukraine and weaken NATO resolve through far-right parties such as National Rally. Notably, the article was based not just on interviews, but also on intelligence documents.

Kremlin documents obtained by an unnamed European security service and seen by the Washington Post revealed that top aides to Russian President Vladimir Putin directed messaging to the French public aiming to boost ‘the fear of World War III’ and to increase the number of people in France who are reluctant ‘to pay for another country’s war’ and who want ‘dialogue with Russia on the construction of a common European security architecture’. The documents also showed that in June 2023, a Kremlin strategist directed a Russian troll farm employee to create a ‘200-character comment by a middle-aged French person’ who considers Europe’s support for Ukraine to be ‘a stupid adventure’ promoted by the US, which leads to inflation and ‘falling living standards’.

Fertile Ground

National Rally called the Washington Post report ‘a cabal against [it]’, and the party continues to lead in opinion polls ahead of European Parliament elections in June.

More widely, concerns are growing that disinformation – turbocharged by Russia’s war in Ukraine, the erosion of trust in institutions, and the rapid rise of generative AI – will find fertile ground in this historic election year. As an unprecedented two billion people are expected to go to the polls, including in the EU, the UK, the US and India, the World Economic Forum’s annual risk report ranks misinformation and disinformation as the biggest short-term global risk.

In Europe, deepfakes were already deployed in last year's election campaigns in Slovakia and Poland. A study of Germany, Italy and Bulgaria tracked false claims aimed at stoking hostility towards Ukrainian refugees and blaming the war on NATO and Ukraine itself. While it is hard to establish a causal relationship between disinformation narratives and public attitudes, citizens of the three countries had more doubts about military support to Ukraine than the EU average. The study welcomed national counter-disinformation efforts, as well as the creation of the EDMO taskforce on European Parliament elections, but called for further steps to increase resilience.


In the US, a recent poll showed that nearly six in 10 adults believe that AI tools will increase the spread of false and misleading information during the presidential election in November. According to the poll, US citizens generally see preventing AI-generated disinformation as a shared responsibility and want more regulation. President Joe Biden has set in motion federal guidance for AI. Meanwhile, the EU is in the final stages of agreeing an AI Act.

From Debunking to Prebunking

Just as the disinformation landscape evolves, strategies to fight it are also evolving. NATO provides a case in point.

After Russia’s illegal annexation of Crimea in 2014, the alliance created Setting the Record Straight, an online portal dedicated to debunking Russian myths. The portal exposes Russia’s litany of false claims, countering them with facts based on verified sources. In 2021, as Russia started to prepare its full-fledged invasion of Ukraine, NATO moved up a gear from debunking to prebunking.

Simply put, prebunking aims to teach individuals how to spot false claims before encountering them, rather than trying to counter them afterwards. It was pioneered by William J McGuire in the 1960s as he sought a 'vaccine for brainwash' in response to the indoctrination of US prisoners during the Korean War. The idea was that just as the body gains immunity to a virus after a vaccine, exposure to a small dose of propaganda could immunise the mind.

Building on McGuire’s work, Sander van der Linden, a professor of social psychology at the University of Cambridge, and his colleagues have developed ways to pre-empt misinformation and disinformation related to elections, Covid-19 and climate change. They include games such as Bad News, Go Viral! (developed with the UK government) and Harmony Square (developed with the US Department of State's Global Engagement Center and the Department of Homeland Security). The games help build psychological resistance to online misinformation by getting players to use the techniques and types of content prevalent in the production of viral fake news, following the principle that prevention is better than cure.

Bold Moves

In the months preceding Russia's 2022 invasion of Ukraine, the US, the UK and NATO made bold moves to counter the Kremlin's claims that it had no intention of attacking and that, on the contrary, it was the West and Ukraine that were the aggressors. They did so through proactive communications, largely based on the declassification of an unprecedented amount of intelligence. For instance, the US outlined a Russian plan to create a phoney video that could be used as a pretext for invasion, while the UK revealed that Moscow would try to stage a coup in Ukraine and install a pro-Kremlin puppet.

At the same time, US officials disclosed, step by step, the concrete indicators of Russia’s invasion plans, such as moving blood supplies near Ukraine, while the UK Ministry of Defence started posting daily intelligence updates on military developments on the ground, which it continues to this day.


NATO Secretary General Jens Stoltenberg’s statements to the media were also a major driver of NATO communications, reaching billions of people around the world. For example, in November 2021, as Russia claimed it was conducting mere exercises, he warned of the unexplained ‘large and unusual’ concentration of Russian forces on Ukraine’s borders, with ‘heavy weapons, artillery, armoured units, drones, electronic warfare systems and tens of thousands of combat ready troops’, and called on Russia to show transparency and de-escalate.

Ten days before the invasion, following Putin’s claim that ‘it is not our plan to occupy Ukrainian territory’, the NATO Secretary General made clear that ‘everything is now in place’ for Russia to attack. He called again on Putin to step back from the brink, establishing firmly who the aggressor was. Putin’s decision to invade Ukraine had arguably been taken long before, so the main aim of Stoltenberg’s messaging was to maintain NATO unity and support for Ukraine.

While it is hard to attribute shifts in public opinion to specific communications, a survey commissioned by NATO, published in June 2022, showed support for alliance membership at an unprecedented 72%, with Russia viewed unfavourably by 68% of respondents, an increase of 27 percentage points on the previous year. Significantly, 67% perceived Russia's invasion as affecting the security and safety of their own countries.

The June 2023 survey shows similar results, with 65% of respondents in favour of continued support to Ukraine. Recent signs of 'Ukraine fatigue' will put these figures to the test, unless the US and other allied countries continue to make the case to their publics that Russia's war in Ukraine fundamentally affects their own current and future security.

A New Template

NATO communications in the run-up to Russia’s full-fledged invasion are part of a new template for fighting disinformation. But this is not a job for one government, one organisation, or just communicators alone.

To be effective, fighting disinformation must be a collective, layered and networked effort, including proactive communications of democratic values and goals; stronger regulation of AI; more robust content moderation by big digital platforms; and increased media literacy and societal resilience.

The year 2024 will be a defining one. Democracies must defend themselves by demonstrating strength and unity in what they say and do. They need to get savvier about technology, and use it based on democratic values. Otherwise, they risk further eroding public trust and enabling the rampant disinformation driven by divisive populists and authoritarian adversaries.

The views expressed in this Commentary are the author’s, and do not represent those of RUSI or any other institution.



Oana Lungescu

Distinguished Fellow


