Digital Democracy needs deliberation
31 October 2022 | doi: 10.5281/zenodo.7273446

Designing Digital Democracy

Germany’s most important philosopher of democracy and democratic discourse is unhappy. In his latest book, Jürgen Habermas argues that “half-publics” are taking the place of public spaces and that democratic discourse is challenged by heated online debates. But the character of these debates is not itself the central issue. The bigger challenge is making sure that the private rules and practices of the platforms that shape online debates are aligned with public values. A number of platforms and NGOs have started to develop deliberative approaches to platform rules. But designing digital democracy is challenging.

Jürgen Habermas is unhappy. When his Structural Transformation of the Public Sphere appeared 60 years ago, he saw individual communication and participatory culture endangered by the mass media of film, radio and television. Passive listeners and viewers would no longer engage democratically but only consume. Fast forward to 2022: he is unhappy again, as he eloquently explains in his New Structural Transformation of the Public Sphere and Deliberative Politics. Now the culprits are not passive listeners and overbearing mass media. Instead, it is the too many (and too active) speakers online and the platforms that allow them to post cat memes, hate speech and corona disinformation.

Much has changed in 60 years. The platforms themselves have become rule-makers, rule-enforcers, and judges of their own decisions. They have created communication spaces where discourse, which necessarily impacts democratic values, is subjected to the demands of the attention economy. Is it time for a reset? Should we include more societal groups in developing rules on what can be said online? The German Academies of Sciences and Humanities certainly think so. They recently called for the participation of “representatives of governmental and civil society bodies as well as (…) users (…) in decisions about principles and procedures of content curation”.

Democratic reset

It is therefore not very surprising that the current German government committed to “advancing the establishment of platform councils” (i.e., institutions that oversee the rules and practices of platforms) in its Coalition Agreement. But how should those councils be constructed? As mini-parliaments, supreme courts, councils of wise persons? Half a year later, not much has moved on the political side. In response to a formal query (Kleine Anfrage) from the CDU/CSU parliamentary group in June, the federal government replied that it was “actively involved in the development of concepts for setting up platform councils” and that platform councils could “represent a sensible addition to the legal framework”.

The signs that new bodies could improve the legitimacy of platform rules, practices, and decisions are there: Meta has created an Oversight Board to help with content decisions and algorithmic recommendations, and the same company is experimenting with deliberative processes at scale. A games publisher is experimenting with player councils to help its developers make design choices. German public television’s advisory council wants to create a people’s panel to ensure more input into programming decisions. The world’s largest online knowledge platform has, since its inception, let users (and user-editors) decide content-related conflicts. All of these examples share one fundamental goal: ensuring that decisions on communication rules, whether applied by people or mediated through algorithms, are better, more nuanced, and considered more legitimate through broader involvement.

Solving free speech? 

Scholars and NGOs have become increasingly involved in the debate as well. In 2021 I co-authored an introductory study on Social Media Councils, exploring the concept and its origins in media councils. Tech journalist Casey Newton suggests that to build trust, platforms should try a little democracy. David Kaye, together with ARTICLE 19 and Stanford’s GDPI, published a detailed study on the potential of Social Media Councils, which ARTICLE 19 then followed up with a report on its Social Media Council experiment in Ireland. At Harvard, Aviv Ovadya suggests that citizens’ assemblies can help take platform policymaking beyond corporate CEOs and partisan pressure.

What can democratic approaches to platform governance achieve? Will they “solve” the challenge of ensuring democratic discourse spaces while at the same time leaving platforms enough room to innovate and set internal rules? First of all, securing free speech is a regulatory challenge that cannot be solved once and for all; it is a so-called wicked problem, just as public health or climate change cannot be “solved”. To ensure freedom of expression and a lively political discourse (the institutional dimension of free speech is often forgotten), what is needed is precisely not less regulation and simply more freedom. If Elon Musk allows Donald Trump and Kanye West back on Twitter, this is only formally a gain in freedom of expression: West’s content on Instagram was reduced or removed because of anti-Semitic statements, and his posts on Twitter were removed for the same reason within one day of his return.

Difficulties of implementation

There are basically two choices, and neither is easy to implement. Sciences Po’s Rachel Griffin recently reminded us that to alleviate the legitimacy deficits of platform decision-making, platforms can choose a “multistakeholderist response to increase civil society’s influence in platform governance through transparency, consultation and participation” or a “rule of law response” extending “the platform/state analogy to argue that platform governance should follow the same rule of law principles as public institutions”. Or, of course, a mixture of the two, like, arguably, the Meta Oversight Board. Though it has been cited approvingly by the Special Rapporteur on Freedom of Expression in her most recent report (“Many other companies provide little or no information on their operations, much less a public channel of appeal and review”), scholars like Riku Neuvonen (Helsinki and Tampere Universities) and Esa Sirkkunen (Tampere University) show why the Board does not (yet) meet the democratic promise of social media councils (or count as a true “Supreme Court”).

Delivering deliberative democracy

Let’s return to the unhappy philosopher. Jürgen Habermas is concerned to see a society that is shattered into “semi-publics” and losing its common points of reference. The spaces in which communication takes place seem to take on a peculiar “anonymous intimacy”: by previous standards, they can be understood “neither as public nor as private, but most likely as a sphere of communication hitherto reserved for private correspondence, inflated into a public sphere.” We call these “hybrid spaces” because private rules and private algorithmic recommendation regimes shape and influence communications that are relevant for public values and interests. And it is precisely in these spaces that the future of digitally mediated democracy is being negotiated, and designs for digital democracy are being piloted.

Yet who is supposed to help implement new models of democratic decision-making in the digital age? For Jürgen Habermas the answer is clear: the state. In an essay also printed in Neuer Strukturwandel, he concludes with a reminder of the responsibility of constitutional law for the stabilisation of a society’s order of truth: “It is not just a political decision, but a constitutional imperative to maintain a media order that ensures the inclusive character of the public sphere and a deliberative character of the formation of public opinion and will”.

States matter

States matter. In today’s complex society, democratic states, at least, are not seen primarily as a threat to freedom; they are also its guarantor. Indeed, states have human rights-based obligations to respect, protect and fulfil/enable/ensure human rights. For a state to fulfil these obligations, it is not enough simply not to censor opinions. States have to actively design media orders that enable democratic discourses. Democracies are based on the communicative interaction of their citizens. This requires, constitutionally, a communication order that is institutionally protected. Freedom of communication and media freedoms are thus to be located within a system of various institutional guarantees. As the media law experts Keno Potthast and Wolfgang Schulz write in an expert opinion for the Berlin-Brandenburg Academy of Sciences and Humanities, democracy in the light of the Basic Law needs the state to ensure the functioning of a free and open, individual and public formation of opinion.

Activating all stakeholders

As the Academies of Sciences and Humanities note, designing digital democracy is a project for all stakeholders: “academia and the providers of digital infrastructures and services (… platforms and public service media), but also NGOs and start-ups. Science can develop and provide innovative concepts (…) for a democracy-friendly design [of online communication spaces].” As far as science is concerned, this is what the Platform://Democracy project, funded by Stiftung Mercator and carried out at HIIG, the Leibniz Institute for Media Research | Hans-Bredow-Institut and the University of Innsbruck, has set out to do.

A Platform for Digital Democracy

This blog post, which draws from a longer one published on te.ma, a platform for open science and civil discourse, is part of the Platform://Democracy project. It is funded by Stiftung Mercator and led by Matthias C. Kettemann with Josefa Francke and Christina Dinar. The project explores whether and how platform councils have the potential to align public values and private orders. In four regional research clinics, the project sheds light on how to provide the normative infrastructure for better rule-making, rule-enforcing, and rule-adjudication structures in (primarily online) hybrid communication spaces. In these clinics in Europe, Africa, Asia/Pacific/Australia and the Americas, participants will exchange experiences on models to increase the quality of deliberative democracy in online settings. 

Further reading

Interested in digital democracy? Do you understand the added societal value, and the potential drawbacks, of digitalization? Rules, and the normative order of the internet, are changing rapidly. But rules continue to matter in designing social innovation: do you know who rules the internet? And how you can contribute? To understand how democracy and the public sphere can be governed in the 21st century, a solid understanding of the role of social media in harnessing and projecting opinion power while fighting disinformation helps.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Matthias C. Kettemann, Prof. Dr. LL.M. (Harvard)

Head of Research Group and Associate Researcher: Global Constitutionalism and the Internet


