18 July 2024| doi: 10.5281/zenodo.12792898

Navigating platform power: From European elections to the regulatory future

June 2024 saw the first EU-wide election since the new EU rules on digital services and markets came into force. Six weeks after the ballots were cast, Ann-Kathrin Watolla and Matthias C. Kettemann took stock. How are the new EU rules against platform-based challenges to democracy working under real-life conditions? Has the vaunted Digital Services Act (DSA) delivered in terms of navigating platform power? This article shows that the challenges of implementing the DSA have only just begun to emerge.

The anticipated wave of digital challenges

In the weeks leading up to the European elections, news outlets, researchers, authorities and civil society actors alike warned about an anticipated wave of disinformation. Held most recently in early June 2024, EU elections are “a flagship of European democracy”, as the European Parliament puts it. However, experts across the globe feared the very real possibility that actors both inside and outside the European Union would try to undermine the democratic processes of European elections. Their main concern was the dissemination of false information about voting procedures and the sowing of division and polarisation within the EU. Apart from disinformation, the NGO Democracy Reporting International (DRI) identified hate speech, foreign interference and paid political ads (PPAs) as the most dominant digital threats to the EU elections. However, if there is one thing we can learn from DRI’s stakeholder meeting, it is that these threats do not come in one massive wave but rather as separate, smaller surges that slowly try to erode the foundations of democratic discourse. In an effort to halt these highly dispersed attacks, projects like Elections24Check, with its fact-checking database ahead of the 2024 EU elections, are important in counterbalancing online disinformation.

No (major) news is good news?

Now that the elections are behind us, we can safely say that the anticipated massive wave did not wash over the EU. According to the European Digital Media Observatory’s (EDMO) Task Force on the EU Parliament elections, there were no major disinformation incidents. Is this good news? Not quite. Just because there were no major incidents of disinformation does not mean that the digital threats identified by DRI have not become a reality. Let’s take disinformation as an example. Correctiv and DRI have identified how chatbots provided misinformation about the EU elections. Irrespective of the chatbot used – be it Google Gemini, Microsoft Copilot or ChatGPT – accurate information about the voting process was not always provided. This is all the more alarming as people also use chatbots like search engines. As DRI reminds us, “when voters are wrongly informed on electoral requirements, they may be deterred from voting (for example, thinking it is more complicated than it is), miss deadlines, or make other mistakes. In short, this unintentional misinformation can impact the right to vote and electoral outcomes.” (DRI, p. 2) This is why navigating platform power is crucial to protecting democratic principles.

But it’s not just that chatbots provided disinformation around the EU elections; studies also identify a variety of specific disinformation narratives, which often involve targeting politicians, stoking anti-EU sentiment or spreading fake election results.

Problems – meet laws

Since February 2024, a new set of EU rules has been in place to counteract these digital challenges and to create a safer and more equitable digital space where users’ fundamental rights are protected. In particular, the Digital Services Act (DSA) requires platforms to “put in place reasonable, proportionate and effective mitigation measures” (Art. 35 (1)) targeting, for example, “any actual or foreseeable negative effects on civic discourse and electoral processes, and public security” (Art. 34 (1) (c)). 

With the DSA now in force for five months, platforms are obliged to develop measures to counter the digital threats mentioned above. However, as a recent study suggests, we may not be quite there yet. Looking at the five very large online platforms (VLOPs) – Facebook, Instagram, TikTok, X and YouTube – the study, conducted by the Spanish fact-checking portal Maldita, found that in 45% of instances of disinformation content, the platforms took no visible action. With digital platforms as the “primary public opinion battleground“, this is not something to be taken lightly. And while the European Commission continues to open formal proceedings against service providers to ensure these obligations are met, we still have a long way to go in navigating platform power to create a safer digital environment.

Where do we go from here? 

While it may seem somewhat disappointing that the commencement of the DSA in February 2024 did not immediately solve all the challenges of the digital space, this does not mean that things are not changing. To begin with, recent analyses of how disinformation agendas were deployed in the context of the EU elections are already providing us with valuable insights about platform-based challenges to democracy. We anticipate that we will gain even more insights from the individual services’ systemic risk assessments (Art. 34) to be published in autumn 2024. Moving forward, further research can build on this to better understand “what the most pressing sources of systemic risk are, where common vulnerabilities arise, and what mitigations can effectively reduce negative effects”, as the Centre for Regulation in Europe (CERRE) points out. Since navigating platform power is a highly complex endeavour, we have built our DSA Research Network on the principles of communication and collaboration: bringing together stakeholders from NGOs, academia and regulatory bodies, we set out to provide early recommendations on how to implement the DSA properly and to identify areas in need of reform. With our Circle of Friends, we provide a unique space for merging diverse perspectives on the challenges and opportunities of the DSA that require further research, placing further emphasis on multi-stakeholder engagement.

As a collaborative initiative of the Alexander von Humboldt Institute for Internet and Society (HIIG), the Leibniz Institute for Media Research | Hans Bredow Institute and the DSA Observatory, funded by the Mercator Foundation, the project was built on an interdisciplinary approach from the design stage onwards. Bringing together legal and social-science perspectives, we aim to provide a comprehensive view on the implementation of the DSA. Therefore, we are currently working on three focus areas: 

  • Operationalisation of risk-based governance approaches
  • Impact of hybrid and risk-based governance on collective rights and values
  • Assessment of due diligence and hybrid governance from a fundamental rights perspective 

When our Circle of Friends came together for the first time in 2024, these focus areas served as the basis for discussing the areas of the DSA in need of further academic research. With the individual services’ risk assessments made available around that time as well, this provides a solid basis for better understanding and navigating platform power in the context of the DSA.

This post represents the views of the authors and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Ann-Kathrin Watolla, Dr.

Senior Researcher & Project Lead

Matthias C. Kettemann, Prof. Dr. LL.M. (Harvard)

Head of Research Group and Associate Researcher: Global Constitutionalism and the Internet


