Opinion: Reporting On Elections And How Reporters Can Dodge Disinformation

As violent conflict continues in Israel and Gaza, 2023 may go down as a watershed year for those tracking and countering fake news and videos, with a never-before-seen volume of misinformation.

It may give way to an equally challenging 2024 as large and critical parts of the world head into elections. The scale is striking both geographically - it is an election year in 40 countries, including Taiwan, the US, India, Russia and South Africa - and in impact: elections will take place in countries that account for over 60% of global GDP.

The concerns are not unfounded. The US presidential election in 2016 saw a surge in misinformation, and research from the 2020 election points to a series of instances of unsourced rumours from polling locations; fake and out-of-context screenshots, videos and images circulating in private groups; and a number of "repeat offender" X accounts with large followings.

Fact-checking organisations point to a plethora of misinformation pools that have grown across countries since then, something they fear may spike in an election year. Fake news has long been a problem in Indonesia, which has struggled with hoax news for many years; fact checkers there now report a surge in video-related disinformation, such as inaccurate or misleading text overlaid on an authentic video, or altered videos. (Fact checkers based in Indonesia spoke to the author on condition of anonymity.)

India, on the other hand, is seeing a different set of problems. Jency Jacob, fact checker and Managing Editor of boomlive, an India-based fact-checking organisation, says the country has seen a rise in voice-cloning tools and a discernible jump in audio-related misinformation. India has also seen a spurt in hyper-local 'deepfakes' more recently, especially in the last few months of 2023.

Types of misinformation during elections

The first question to ask in the misinformation haze we are living in is whether there is actually a demonstrable change or spike in misinformation around elections. Tom Drew OBE, Head of Counter Terrorism at Faculty, explains that it is very difficult to assess the scale of misinformation or disinformation around hyper-partisan topics such as elections - so assessing a change is also difficult. He does add that as media consumption and creation happen via increasingly diverse platforms, often without the controls of Big Tech or traditional media, we are likely to consume more misinformation around elections. Drew believes the big change we could have predicted, and are now seeing, is greater sophistication in the tactics used to spread deliberate and coordinated disinformation around elections. Manipulated or synthetic imagery (such as deepfakes) is easier to produce and far more convincing than at the time of the last major round of general elections, and he predicts a significant uptick in its use ahead of elections this year.

The second question - what are we looking for? While reporting on elections and guarding against misinformation, it is important to identify the types of disinformation that often surround them. Broadly, the categories include disinformation for or against candidates and political parties; disinformation about voters; misleading information about the electoral system; false narratives around migration; fake or false news about election-related violence; and false accusations of corruption or fraud during elections, such as booth-capturing allegations or claims of fraudulent activity by election officials. There are many more, as the ICFJ encapsulated before the 2022 US mid-term elections, including allegations that votes were cast in the name of dead persons and the creation of 'information chaos' on election day through falsehoods about the time or day of voting.

Drew points to another growing trend - the impact of Large Language Models (LLMs) on electoral disinformation. A natural technology for improving the targeting, tailoring and production of disinformation, LLMs have now been democratised through intuitive user interfaces like ChatGPT. Drew says one should expect most bad actors seeking to promote electoral instability through disinformation to use LLMs at some stage of their information operations.

What works?

The first and most obvious solution is engagement from both news organisations and platforms. Our own research in 2023 at the Reuters Institute for the Study of Journalism examined the question - does the news media exacerbate or reduce misinformation? The study found: "Our findings challenge the notion that news media in general, by drawing people's attention to false and misleading content (Tsfati et al. 2020), leave the public misinformed (Haber et al. 2021), and support the idea that news helps people become more informed about politics (Aalberg and Curran 2012), and in some cases, more resilient to misinformation (Humprecht et al. 2020). With some variation across countries and across categories of media outlets, news use increases political knowledge gain, and while it often broadens people's awareness of false and misleading claims, it does not increase the likelihood - and in several cases decreases the likelihood - of believing misinformation. In line with previous research, both comparative studies and work focused on individual outlets such as Fox News (Jamieson and Albarracin 2020; Simonov et al. 2020), we find that not all kinds of news media deliver these effects, and a few may sometimes have detrimental effects - such as online news in India."

Essentially, news can be a powerful ally in the fight against misinformation. When reporting is responsible and accurate, relying on news media helps people become more knowledgeable about politics, and it generally does not leave them more misinformed.

On AI and misinformation specifically, the RAND Corporation, a non-profit research organisation that develops public policy solutions, has collated a list of tools that fight disinformation online, ranging from information-verification tools to region-specific offerings such as factcheck.org, which monitors the factual accuracy of claims in American politics.

Fact checkers, another important community in the fight against misinformation, point to a 'back to basics' approach: double-check what sources are telling you; put more boots on the ground, that is, reporters covering, documenting and confirming events on location; and do not assume your audience will be canny or alert enough to spot the nuances that separate a fake video, image or piece of audio from a real one.

Even as much of this has played out in previous elections, there remain two critical points to consider as we begin 'election year 2024'. One, the biggest challenge for journalists covering and reporting on elections, while also trying to scale the walls of misinformation, is that the tools that help are still not accessible to those who need them the most. Now more than ever, news organisations, journalist communities and larger platforms must engage more deeply with the local journalists and newsrooms that will report on elections from, quite literally, every corner of the world. And two, it is equally important to guard against retrograde policy action by governments that ostensibly aims to check election-time disinformation but actually hampers the ability of journalists to report accurately without running the risk of legal or criminal action. In our zeal to clamp down on election misinformation, there is a greater risk of clamping down on vital information for voters and citizens.

(Mitali Mukherjee is Director of Journalist Programmes at the Reuters Institute for the Study of Journalism, University of Oxford. She is a political economy journalist with more than two decades of experience in TV, print and digital journalism.)

Disclaimer: These are the personal opinions of the author.
