24 January 2024 | doi: 10.5281/zenodo.13221894

More Power to the People: How Platform Councils Can Make Online Communication More Democratic

Social media platforms have become an integral part of public and private opinion-forming. Through their terms of use and algorithmic moderation practices, platforms make decisions that shape the protection of human rights. This has triggered an important discussion: how can societal interests be integrated into these digital spaces? One potential answer to this complex challenge is the concept of platform councils, also known as social media councils (SMCs). But how can they ensure that public interests and democratic values are taken into account in platforms' regulatory processes? Together with 30 researchers from every continent, we explored precisely this question in our research. This blog post provides a brief overview of the topic.

Public discourse in digital spaces

Private companies essentially determine the framework conditions for the public exchange of opinions on online platforms. They do so through content moderation: the process by which users' posts, images and videos are reviewed and evaluated to ensure that they comply with the platform's community guidelines. Moderation thus decides which content is deemed inappropriate and which is deleted, edited or approved on the platforms. Inappropriate or harmful content includes, for example, insults, hate speech, glorification of violence, pornographic material, misinformation and spam.

Through their terms of use and the algorithmic moderation of content, social media platforms also influence public discourse. They ultimately decide which user contributions remain (and are disseminated) and which are removed (or hidden). As a result, they significantly shape how public opinion is formed in digital spaces, which in turn is crucial for citizens' democratic communication rights. Companies act as rule-makers, enforcers and, ultimately, judges of their own decisions.

Expanding corporate responsibility

Companies are not democracies. They are not run by democratically elected representatives. They essentially follow their profit interests. However, as the European Court of Human Rights emphasized in its 2015 Cengiz judgment, “the internet has become one of the most important means of exercising the right to freedom of information and expression by providing (…) essential tools for participation in activities and discussions on political issues and topics of general interest.” In the course of this social change, responsibility for inclusivity and the protection of human rights can therefore not be located exclusively at state level. Companies also have a decisive role to play. The interplay of state laws and private community standards, among other things, has created a hybrid regulatory framework for digital platforms.

Experts agree that companies in such powerful positions should also be held socially responsible for exercising that power appropriately and sustainably. This can create a conflict of interest, as such decisions may run counter to a company's profit interests. One possible way to resolve this conflict is to lend the decision-making process more legitimacy through platform councils.

Platform councils as a source of legitimacy

The idea behind platform councils is to make decision-making and the design of the communication space more inclusive. By involving people who do not act in the company's interests, fundamental rights and other important values are to be strengthened on the platforms. Meta's Oversight Board is seen as the first significant step towards external oversight of a commercial platform's decision-making processes. However, many other platforms remain reluctant to introduce similar governance structures.

However, there is as yet no consensus on how such councils should be designed to be effective. Platform councils are conceivable at different levels of regulation (national, regional and global) and can be set up in different constellations. Approaches also differ on what platform councils should decide. Should they merely set the broad lines of moderation practice through precedents, or should they act as a kind of court and review every decision that users contest?

Inclusivity as a key factor

The composition of the councils is a key question. Experts with technical expertise as well as elected representatives of users and minorities could be included. Involving marginalized groups is particularly important, as their interests would otherwise go unheard. However, a platform council's inclusivity can be at odds with its effectiveness: larger, more heterogeneous councils could strengthen legitimacy but at the same time struggle with inefficient decision-making. The more interests that have to be taken into account, the more time-consuming the decision-making process becomes.

Meta's Oversight Board was also set up with this in mind: great importance was attached to inclusivity in its design. However, our complex modern society creates representation problems that are almost impossible to solve. As a result, Meta's Oversight Board continues to be criticized for not taking cultural and social perspectives sufficiently into account.

Other potential drawbacks of such councils include the weakening of state regulatory authorities, a lack of clarity about responsibilities, a dilution of ethical standards, a normative cover-up effect, and a one-size-fits-all global approach to rules that should be set regionally, disregarding local practices.

Learning from others

A possible model for the complex establishment of the councils could be the European Commission for Democracy through Law (the so-called “Venice Commission”). This independent advisory body within the Council of Europe provides expertise on issues of constitutional law and democratic institutions, with a focus on best practices and minimum standards.

References

Kettemann, Matthias C. and Schulz, Wolfgang: Ground Rules for Platform Councils. https://graphite.page/platform-democracy-report/#read-full-article

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Matthias C. Kettemann, Prof. Dr. LL.M. (Harvard)

Head of Research Group and Associate Researcher: Global Constitutionalism and the Internet
