
The Threat of Disinformation

How should the EU deal with it? Towards the 2019 European Parliament elections and the way forward: policy options and recommendations

Newspapers © Pexels / Public Domain / Pixabay

According to the European Commission (2018), disinformation concerns “all the forms of false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit” (p.3). Disinformation as a tool against foes is far from new: throughout history, it has repeatedly been used in wartime to destabilize enemies and manipulate their perceptions. Over recent years, however, the complexity and scale of information pollution in a digitally connected, cross-border world has become an unprecedented challenge (Wardle, 2017, p.10). Mainstream political instability and the decline of leaders’ authority, bolstered by the massive expansion of technological means, are fomenting the spread of disinformation as never before. It is becoming cheaper, faster and more effective in its wide reach, and, peculiarly, it now spreads in peacetime.

The recognition of disinformation as a powerful political weapon is usually traced back to the aftermath of the 2016 US presidential election and the UK’s referendum on European Union membership, when official reports confirmed the involvement of external actors, most notably Russia, in both electoral campaigns (Lilkov, 2019, p.1). During the important national elections held across Europe in 2017, including those in France and Germany, several countries faced the same challenge (Fried & Polyakova, 2018, p.1). This was corroborated by a report on internet freedom by Freedom House (2017), which stated that “online manipulation and disinformation tactics played an important role in elections in at least 18 countries over the past year (...)”. The trend was further confirmed by a Eurobarometer survey (2018), which found that 83% of European citizens believe that online disinformation represents “a danger to democracy” (p.23). Disinformation tactics and electoral interference can take many forms and be driven by different motives: shaping voter choices, suppressing turnout, undermining public confidence in democracy, eroding social and political cohesion, or simply obtaining financial gain.

The EU strategy to counter disinformation

From an EU perspective, disinformation can be used by external actors to undermine the credibility of its institutions (European Commission, 2020a). The EU has actively confronted citizens’ exposure to large-scale disinformation since 2016. The Commission worked hard to implement actions aimed at meeting the challenge, mostly targeted at ensuring transparent European Parliament elections in 2019 (European Commission, 2020b). One of the several vulnerabilities of European Parliament elections is the potential suppression of voters through disincentives addressed to specific groups. This strategy can involve spreading incorrect online information about voting procedures and registration deadlines, or even more advanced efforts, such as targeting undecided voters with tailored posts reinforcing the idea that abstention is an effective protest against the political establishment (Lilkov, 2019, p.3). This is particularly significant for the European Parliament elections, where voter turnout is historically low. The 2014 elections confirmed this negative trend, with a turnout of 42.6%. As a result, political groups with anti-European sentiment or radical political views can be overrepresented thanks to the substantially lower number of votes required for a European Parliament mandate.

The EU strategy against disinformation in the run-up to the 2019 elections included several steps. Some of the major developments relate to the Code of Practice against Disinformation and the Joint Action Plan of 2018 (Viola, 2020). The former was a worldwide self-regulatory set of standards signed voluntarily by leading platforms, social networks and advertising industries. Before May 2019, the European Commission carried out targeted and regular monitoring of the implementation of the commitments set out in the Code (European Commission, 2019). The Action Plan’s main purpose, in turn, was to ensure the proper implementation of the Code of Practice ahead of the European Parliament elections, complementing it with further actions at the governmental level and with support for media literacy efforts, fact-checkers and researchers.

The strategy adopted, which included the steps shown in the figure above, proved to be necessary. Indeed, the June 2019 report on the implementation of the Action Plan confirmed the role of Russia as an influential source of disinformation, as documented by the East StratCom Task Force, as well as that of other actors increasingly deploying the same tactics. Several attempts were identified, some aimed at suppressing voter turnout through attacks on government websites and at spreading manipulative content. This was entirely predictable, given that the 2019 European Parliament elections were the most digital ever.

However, according to the Commission report that followed the May elections, the 2019 elections recorded the highest turnout in 25 years, with 50.6% of eligible EU citizens taking part. Moreover, the report explicitly stated that the Commission’s initiatives on disinformation contributed to securing the integrity of the electoral process and voters’ confidence in it (European Commission, 2020a). While these initiatives were an important step towards greater transparency in the electoral context, they are not entirely satisfactory. So far, the measures implemented by the EU, as well as by national governments and civil society organizations, to counter the impact of disinformation operations have been only partially adequate and are still limited, especially given the scale and continuously changing nature of the threat (Pelegatta, 2019).

Prospects, actors involved and policy options

It is undeniable that digital and cyber technologies are rapidly evolving; so are the speed and efficiency of influence operations, which are becoming ever less expensive (Fried & Polyakova, 2018, p.11). Nevertheless, it would be misleading to blame digital technologies as the primary driver of the expansion of disinformation campaigns. Indeed, these operations are specifically designed to sow mistrust and to sharpen existing socio-cultural divisions. In this regard, the deep crisis of trust in institutions is probably largely underestimated (Wardle, 2017, p.65), while it should be treated as a major explanation of this rapid development. To this end, the only truly effective way to fight the roots of disinformation would be to address the crisis of confidence in mainstream politics that is creating demand for alternative narratives. This should be the priority of any actor, the EU included. Excessive concern over the influence of Russia and other external actors on European democracies risks neglecting the home-grown threat represented by populist and far-right parties (Butcher, 2019, p.17). In effect, disinformation is already dramatically damaging public trust in democratic institutions and values. The leading long-term solution unquestionably lies in the creation of credible and persuasive narratives, able to replace those that are now losing legitimacy.

Simultaneously, however, mitigation tactics are necessary. Efforts should be made to treat the symptoms efficiently, and the whole of society should be considered to have an essential role in the fight against disinformation. In this regard, the EU has to deal with considerable limits on its capacity to act. Legislation, for instance, is a crucial dimension to take into consideration: while the phenomenon of disinformation is global in nature and practice, legislation normally has a territorial reach. In this respect, there are limits to what governments and international organizations can do, and this is why major efforts should probably be invested elsewhere. A second weakness of Western societies is associated with the concept of censorship and the differing perceptions in these countries of what is legitimate to say or print. What speech should be allowed? And who should be in charge of deciding? We are experiencing a “constitutional moment” for how to regulate the private infrastructure that supports free expression around the world (The Economist, 2020a). The very fine line that liberal democracies do not want to cross must be kept in mind, being both a strength and a vulnerability of the West.

The EU. These limits are evident in the EU strategy. In its activities, the EU tries to avoid interfering with member states. The actions taken against disinformation by its institutions run the risk of further alienating Euro-sceptics inclined to perceive them as censorship. Nevertheless, the EU, with its cross-border competences, can adopt a broader approach with more potential for long-term success (Butcher, 2019, p.13).

To this end, the 2018 Code of Practice is a significant starting point. It recognises these explicit limits, opting to place the burden of responsibility on the private sector and civil society rather than considering the EU an actor per se. The Code explicitly rules out policies encouraging the deletion of lawful content solely because it is considered ‘false’, in line with Article 10 of the European Convention on Human Rights, which guarantees the freedom of expression and information (p.13). The Code, however, has a major weakness: it remains entirely non-binding. Although all of the relevant internet media companies have signed up to it, signatories may withdraw at any time. While compulsion could be counter-productive, the European Commission could keep signatories bound to their commitments by keeping the option of regulatory measures on the table, foreseeing consequences should signatories withdraw from the Code or from specific commitments. Furthermore, the signatories’ reports could be made public, and the Commission could provide a regular public evaluation of progress, including recommendations for feasible improvements (p.17).

Private sector and social media. There are several things that social media companies can do to collaborate effectively with the EU and other stakeholders. In light of the strategy adopted by the EU and the private sector against disinformation during the pandemic, some initiatives to develop could be to:

A) Promote authoritative content at the EU and Member State level. To this end, Google Search gives prominence to articles published by EU fact-checking organisations, which generated more than 155 million impressions over the first half of 2020.

B) Improve users' awareness. In this regard, Facebook and Instagram directed more than 2 billion people to resources from health authorities during the pandemic.

C) Detect manipulative behaviour. For instance, Twitter challenged more than 3.4 million suspicious accounts targeting the Covid-19 discussion (European Commission, 2020c).

D) Introduce more competition within the sector. To sell more ads, the tech companies’ algorithms send users news that can grab their attention. A feasible remedy would be to change the tech firms’ business model by introducing more competition (The Economist, 2020b). Moreover, technology companies as well as advertising networks should develop ways to prevent purveyors of disinformation from gaining financially.

E) Grant the research community access to digital platforms’ data. The impossibility for journalists and researchers to understand and report on threats to the democratic debate compromises efforts to raise citizens’ awareness and build social resilience. A way to address this problem could be to convince platforms to allow increased access to data by researchers and by specific structures established for that purpose, in full respect of data protection requirements (European Commission, 2020a).

Civil society. The EU, along with national governments, should implement more civic-education and media-literacy courses. The sharing of standards can help to create the necessary common ground for supranational responses to disinformation, making it harder for foreign actors to divide and polarize communities within nations (Fried & Polyakova, 2018, p.13). However, responsibility is also related to media consumers themselves and the way they read the news: “by raising awareness of the importance of checking and comparing sources, applying skepticism to outrageous claims, and exercising informed judgement at all times, disinformation can be reduced from a dangerous political problem to just part of the background noise of our online experiences” (Butcher, 2019, p.19).

Likewise, the European Democracy Action Plan of 2020 recognises the importance of supporting and engaging civil society in concrete actions such as media literacy and fact-checking initiatives, to “strengthen cooperation amongst diverse actors of the civil society at the European level” (European Commission, 2020d). Civil society can be faster and more effective than most governments in identifying and countering disinformation campaigns. Several civil-society groups have shown an ability to identify, for instance, prominent Russian trolls and to expose significant campaigns run by them. In this regard, the Lithuanian Elves and the Ukrainian StopFake should absolutely be taken as examples on which to build further initiatives of this kind. At the same time, a decisive development would be to identify the sources of disinformation in real time, integrating fact-checking with source-checking. Indeed, “when bot accounts who originated a rumour appear to be based in a country other than the one connected with said rumour, it could prove to be a faster way of encouraging skepticism in the audience than debunking the fact itself” (Wardle, 2017, p.67).

The way forward

The challenge of disinformation is complex and alarming, and no magic recipe can decisively defeat it. Over recent years, it has become ever more efficient at undermining the credibility of democratic processes as well as social and political cohesion at large. Responsibility, resistance and resilience are the three main components on which to build response strategies based on the active involvement of all stakeholders: from the EU down to fact- and source-checkers and citizens. Democratic societies may be vulnerable in the short run, but history demonstrates that they have longer-term advantages, especially when supported by tools of transparency, limited regulation, an active civil society and the capacity to cooperate.

Alessandro Castellani is a 24-year-old Italian student. He graduated in Political Science and International Studies at the University of Florence (Italy). Currently, he is a master’s student in International Security Studies, a course offered jointly by the Sant’Anna School of Advanced Studies (Pisa, Italy) and the University of Trento. He is very interested in the EU and its politics and policies. You can find him on LinkedIn and Facebook.

The opinions expressed here are those of the writers and do not represent the views of European Guanxi.

Do you have an article you would like to share? Write for us.


Butcher, P. (2019). Disinformation and democracy: The home front in the information war. European politics and institutions programme. European Policy Centre.

Eurobarometer. (2018). Flash Eurobarometer 464: Fake news and disinformation online. Retrieved from:

European Commission. (2018). A multi-dimensional approach to disinformation. Report of the High Level Expert Group on Fake News and Online Disinformation. Retrieved from:

European Commission-Fact Sheet. (2019). Questions and Answers – Code of Practice against disinformation: Commission calls on signatories to intensify their efforts. Retrieved from:

European Commission. (2020a). Communication from the Commission to the European Parliament, the Council and the European Economic and Social Committee: report on the 2019 elections to the European Parliament. Retrieved from:

European Commission. (2020b). Tackling online disinformation. Retrieved from:

European Commission. (2020c). Tackling coronavirus disinformation. Retrieved from:

European Commission. (2020d). Roadmap European Democracy Action Plan. European Commission. Ref.Ares(2020)3624828

Freedom House. (2017, November 13). New Report – Freedom on the Net 2017: Manipulating Social Media to Undermine Democracy. Retrieved from:

Fried, D., & Polyakova, A. (2018). Democratic defense against disinformation. Atlantic Council Eurasia Center.

Haigh, M., Haigh, T., & Kozak, N. (2017). Stopping Fake News: The work practices of peer-to-peer counter propaganda. Journalism Studies, pp.1-26.

The Economist. (2020a). How to deal with free speech on social media. Retrieved October 22, from

Lilkov, D. (2019). European Parliament elections: The Disinformation Challenge. Wilfried Martens Centre for European Studies.

Peel, M. (2019). Fake news: How Lithuania’s “elves” take on Russian trolls. Financial Times. Retrieved from:

Pelegatta, A. (2019). Divide et Impera: The Art of Disinformation. ISPI.

The Economist. (2020b). Social media’s struggle with self-censorship. Retrieved October 22, from

Viola, R. (2020, March 8). Online disinformation: a major challenge for Europe. Shaping Europe’s digital future. Retrieved from:

Wardle, C. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe report DGI(2017)09.

Additional readings

Dutch Government, Ministry of Justice and Security. (2017). Χίμαιρα An analysis of the ‘hybrid threat’ phenomenon.

European Commission. (2018). Action Plan against Disinformation. European Commission contribution to the European Council.

Guess, A., & Lyons, B. (2020). Social Media and Democracy. Cambridge University Press, pp. 10-33.

Jack, C. (2017). Lexicon of Lies. Terms for Problematic Information. Data & Society. Retrieved from:

Missiroli, A. (2019). From hybrid warfare to “cybrid” campaigns: the new normal? NDC Policy Brief, No.19.

Nagasako, T. (2020). Global disinformation campaigns and legal challenges. Int. Cybersecur. Law Rev, pp.125–136.

Missiroli, A., Andersson, J.J., Gaub, F., Popescu, N., Wilkins, J.J., et al. (2016). Strategic Communications: East and South. European Union Institute for Security Studies.
