
Are Chatbots Misinforming Us About the European Elections? Yes.

2024 has been called a super-election year, with more than 60 national elections taking place around the world. At the same time, this year has also seen significant advances in the sophistication and application of AI technologies. The potential impact of AI technologies on the electoral process, particularly on voters’ access to accurate information, has been a widespread concern.

Since the launch of OpenAI’s ChatGPT in late 2022, the power of AI has become tangible to the wider public, with major companies competing intensively to bring new AI products to the mass consumer market. Most prominent among these are AI-driven chatbots, powered by Large Language Models (LLMs) that “understand” and generate human-like text. As these chatbots grow in popularity and capability, with the ability to access real-time information and provide source links, they increasingly take over the function of search engines. Indeed, some of these chatbots, such as Microsoft’s Copilot, have already been integrated into internet search.

With chatbots emerging as a popular primary source of information, the impact they have on elections is no longer theoretical. Can these programs consistently provide accurate information about complicated, important topics like the electoral process? If not, do they at least refer users to authoritative sources?

This report investigates the accuracy of the four most popular chatbots’ responses to questions relating to the upcoming European Parliament elections. While the bots appear to have been relatively well-tuned to provide non-partisan responses to political topics, none of them provided reliably trustworthy answers to questions voters may pose about the electoral process.

This is problematic: when voters are misinformed about electoral requirements, they may be deterred from voting (for example, by believing the process is more complicated than it actually is), miss deadlines, or make other mistakes. In short, this unintentional misinformation can affect the right to vote and electoral outcomes.

Our findings also suggest that legal obligations under the EU’s Digital Services Act (DSA), such as proper risk assessment, testing, and training to mitigate risks to electoral processes, are not being fulfilled. These findings also run counter to commitments made by some companies under the EU’s Code of Practice on Disinformation to identify and mitigate risks of dis- and misinformation and to adopt safe design principles.