The actions of Elon Musk at Twitter and Mark Zuckerberg at Meta contributed to the effectiveness of the Kremlin's propaganda.
That is the conclusion reached by experts of the European Commission. According to their study, the influence of Russian accounts grew significantly across Europe in 2022, and especially in the first half of 2023.
During the first year of Russia’s illegal war in Ukraine, social media companies enabled the
Kremlin to run a large-scale disinformation campaign targeting the European Union and its
allies, reaching an aggregate audience of at least 165 million and generating at least 16 billion
views. Preliminary analysis suggests that the reach and influence of Kremlin-backed accounts have grown further in the first half of 2023, driven in particular by the dismantling of Twitter's safety standards.
The largest social media platforms made commitments to mitigate the reach and influence of
Kremlin-sponsored disinformation. Overall, these efforts were unsuccessful. Over the course of 2022, the audience and reach of Kremlin-aligned social media accounts increased
substantially all over Europe. These circumstances raise questions not only about the
European Union's defences against Russia's information warfare but also about the integrity of
the European elections in June 2024.
In the meantime, Europe has established new policies and laws to address these vulnerabilities.
In response to European Commission guidance, most major platforms signed a new Code of
Practice on Disinformation in June 2022. Shortly thereafter, the EU passed the Digital Services Act (DSA), a landmark regulation of online platforms that enters into force in 2023.
This study evaluates how the DSA’s rules can be used to guard against the Kremlin’s disinformation campaigns and protect the dignity, safety and free expression of EU citizens.
We evaluated Kremlin disinformation campaigns across all major platforms, in more than ten European languages, over a period of almost a year. These data sets were then analysed using
the compliance framework of Articles 34 and 35 of the DSA, which require risk
assessment and mitigation.
The conclusions are clear. We find that the Kremlin's ongoing disinformation campaign not
only forms an integral part of Russia's military agenda, but also poses risks to public security,
fundamental rights and electoral processes inside the European Union. Moreover, we observe
that disinformation is only one weapon in the Kremlin’s information warfare arsenal.
The Kremlin’s operations on online platforms often build on other inflammatory or deceptive
content, and a range of malign behaviours designed to silence opponents and suppress the
truth about the war in Ukraine.
The platforms mitigated these risks only intermittently, and only for particular aspects of
Russian disinformation about the war. Their efforts did not effectively impede the overall
growth and influence of Kremlin information warfare. Effective mitigation was not yet required by
law under the DSA during the period of study in 2022. However, most of the platforms were
signatories to the Code of Practice as of June 2022.
Under the Code, online platforms committed to a broad set of measures that could have
mitigated some of the Kremlin’s malign activities. However, the evidence suggests that online platforms failed to implement these measures at a systemic level. Moreover, the Code is not
designed to mitigate a full-scale, state-sponsored information war propagated by thousands of
accounts engaged in coordinated tactics. Consequently, the mitigation measures introduced by online platforms often failed to account for the Kremlin's malign intent and the full scope of information warfare tactics it employed. For instance, no platform introduced policies addressing all, or even most, Kremlin-operated accounts.
In addition, platforms largely ignored coordinated cross-platform campaigns.
As a result, the Russian Federation continues to operate vast networks of social media accounts propagating deceptive, dehumanising, and violent content and engaging in coordinated inauthentic behaviour. Indeed, we find that the reach of Kremlin-sponsored
disinformation inside the EU has grown since February 2022. In absolute numbers, pro-Kremlin accounts continue to reach the largest audiences on Meta's platforms.
Meanwhile, the audience size for Kremlin-backed accounts more than tripled on Telegram. In addition, we found that no platform consistently applied its terms of service in repeated tests of user notification systems in several Central and Eastern European languages.
The rules provided by the DSA hold great potential to rein in Kremlin disinformation
campaigns and other state-sponsored attacks on democratic integrity and fundamental rights.
But they must be applied quickly and effectively to help mitigate these coordinated attacks on European democracy.
