More important than ever! Social platform governance during and after the 2020 US presidential campaign
The 2020 US presidential campaign has shown us that the governance of social platforms, such as the practices of algorithmic and manual content moderation, is more important than ever. Yet the systems in use are highly opaque, even though they can have a decisive influence on the outcome of elections.
Social Platform Governance and the 2020 US Presidential Election
Four years ago, many people around the world woke up feeling surprised or shocked. While on election eve it seemed as though Hillary Clinton would win the 58th US presidential election, an early-morning look at the cellphone confronted sleepy eyes with a new reality: Donald J. Trump had won the electoral college by seizing several crucial swing states. Only months later, amid the revelations and allegations surrounding the Cambridge Analytica scandal, the role of targeted political advertising and other forms of social media campaigning in persuading undecided voters became the subject of much wider public debate. It was these voters who eventually tilted the election in favour of Donald Trump.
Much has happened over the past four years. To stay with the US campaign a moment longer: Brad Parscale, a former freelance online marketing specialist for Trump’s companies who led the 2016 Trump campaign’s digital strategy, was promoted to campaign manager for the 2020 campaign. He was, however, demoted in July 2020 after a failed rally. While Brad Parscale’s rise underlines the importance of digital political campaigning, his fall reflects how difficult it can be to transform online attention into large-scale offline mobilisation. Mr. Parscale effectively lost his position to this conundrum: only a fraction of the registered guests for the campaign event in Tulsa showed up, not least because TikTok users had launched a successful campaign to sabotage the event by registering multiple times.
To organise such events, the 2020 Trump campaign relied on an improved campaign app that itself acts as a platform, linking supporters directly to the President while generating politically and financially valuable data for Trump’s campaign and his company, which owns the application. Nevertheless, Americans spend a lot of time on Facebook, and 48% use it to consume political news, according to the Reuters Institute Digital News Report 2020. This makes Facebook the central online hub for public opinion formation and the most important platform for political advertising and digital campaigning.
A shift in discourse and increased awareness
The past years have increased public awareness of the challenges that digitalisation, datafication and algorithmic decision-making pose for society and democracy, but considering the trend towards algorithmic governance in the public and private sector, further work is needed. The Cambridge Analytica scandal and the growing awareness of social platforms’ role in society represent a public relations problem for platform companies, which have signalled understanding and a willingness to comply with regulation. In fact, they have often expressed the need to be regulated more closely or, failing that, to take more responsibility themselves. Facebook, for example, recently announced its decision to delete messages referring to QAnon as well as antisemitic content relating to Holocaust denial. The latter is not unlawful in the United States, yet is illegal in many European countries.
While these decisions may seem understandable to many, especially when considering the issue from a European perspective, they conflict with the US constitutional understanding of freedom of speech. In addition to such specific political decisions taken by high-level management at social platform companies, such as the moderation of the New York Post’s Biden story, millions of messages and accounts are deleted algorithmically every day in order to protect users from online harms. However, these safeguards can simultaneously conflict with democratic rights and skew the formation of public opinion by over-blocking legitimate content.
What is algorithmic moderation and why is it important for elections?
In a recent research sprint on AI and content moderation organised by the Humboldt Institute for Internet and Society (HIIG) and the Network of Centers, we focused on current developments in platform governance and observed a general increase in the use of algorithmic content moderation across many social media platforms throughout the coronavirus pandemic.
Building on foundational work by Grimmelmann (2015), who defined moderation as “the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse” (p. 47), I would underline that distinguishing between algorithmic systems for content recommendation and those for the detection of illegal or harmful content is necessary to develop a broader public understanding of the problem. Recommendation systems decide which content social platform users get to see and also shape online communities by recommending accounts to follow or groups to join; they have been criticised for leading users down rabbit holes or into groups that act as echo chambers, which may intensify political polarisation or even radicalisation and extremism. The toy sketch below illustrates the distinction.
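To make the distinction more concrete, here is a minimal sketch in Python contrasting the two system types. All data structures, scores and thresholds are invented for this illustration; it captures only the basic logic, not how any platform’s actual systems work.

```python
# Toy illustration of the two system types discussed above. All data
# structures, scores and thresholds are invented; real platform
# systems are vastly more complex.

# (1) Recommendation: ranks content by predicted relevance and
#     engagement, shaping what users see and which groups they join.
def recommend(user_interests: set, posts: list, top_k: int = 3) -> list:
    def score(post):
        overlap = len(user_interests & set(post["topics"]))
        return overlap * post["engagement"]
    return sorted(posts, key=score, reverse=True)[:top_k]

# (2) Detection: flags potentially harmful content, e.g. via a
#     classifier confidence score, and deletes it above a threshold.
def moderate(classifier_score: float, threshold: float = 0.9) -> str:
    if classifier_score >= threshold:
        return "remove"        # automated deletion
    if classifier_score >= 0.5:
        return "human review"  # route to a human moderator
    return "keep"

posts = [
    {"topics": ["politics"], "engagement": 120},
    {"topics": ["sports"], "engagement": 300},
]
print(recommend({"politics"}, posts, top_k=1))  # -> the politics post
print(moderate(0.95))                           # -> "remove"
```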
In the research sprint we focused on the deletion of illegal and potentially harmful content by algorithmic content moderation systems that “remain opaque, unaccountable and poorly understood” (Gorwa, Binns and Katzenbach, 2020: 2). This is problematic in itself, since the reasons why content was removed are not transparent; moreover, it makes an empirical investigation of how and to what extent the use of algorithmic content moderation affects public opinion formation in political campaigns extremely difficult. The opaqueness of algorithmic content moderation systems thus also hampers scientific policy advice and decision-making.
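For readers wondering what such systems do under the hood: Gorwa, Binns and Katzenbach (2020) broadly distinguish the matching of uploads against databases of previously identified material from the machine-learning classification of new content. The sketch below illustrates only the matching idea, using an exact cryptographic hash and an invented blocklist; production systems instead rely on perceptual hashing (e.g. PhotoDNA), which also catches near-duplicates.

```python
import hashlib

# Hypothetical blocklist of hashes of previously identified,
# disallowed items; the entry below is a placeholder, not a real
# database value.
BLOCKLIST = {"placeholder_hash_of_known_disallowed_item"}

def is_known_disallowed(content: bytes) -> bool:
    """Exact-match check of uploaded content against the blocklist."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in BLOCKLIST

# Any byte-level change defeats an exact hash, which is why real
# systems use perceptual hashes that also match altered copies.
print(is_known_disallowed(b"some uploaded file"))  # -> False
```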
Take-away
While platforms have started reporting on their ‘community guideline enforcement’, i.e. the human and algorithmic detection and deletion of content, in so-called transparency reports, the data included in these reports is fragmented and not available in machine-readable formats. If the aim is to genuinely improve the quality and inclusivity of online public discourse, civil society actors such as NGOs and research institutions must be granted better access to platform data, as the brief sketch below illustrates. This would allow an independent and effective assessment of the impact that algorithmic moderation decisions have on opinion formation during democratic election campaigns such as the 2020 US presidential election. In the coming weeks, the fellows of the HIIG research sprint will present three policy reports outlining recommendations on how to enhance transparency in algorithmic content moderation and better inform policy making on the governance of social media platforms.
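To illustrate what machine-readable reporting could enable, the following sketch assumes a hypothetical CSV schema for enforcement data (invented here; no platform currently publishes such a file) and shows how researchers could then compute automation and appeal-reversal rates directly.

```python
import csv
import io

# Hypothetical, machine-readable transparency data. This CSV schema
# is invented for illustration purposes only.
REPORT = """category,removed_total,removed_by_automation,appeals,restored
hate_speech,120000,94000,8000,2100
spam,4500000,4400000,12000,900
"""

for row in csv.DictReader(io.StringIO(REPORT)):
    automated = int(row["removed_by_automation"]) / int(row["removed_total"])
    restored = int(row["restored"]) / int(row["appeals"])
    print(f"{row['category']}: {automated:.0%} of removals automated, "
          f"{restored:.0%} of appealed decisions reversed")
```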
Philipp Darius is a PhD candidate at the Hertie School’s Centre for Digital Governance and a political consultant. In his dissertation he applies methods from computational social science and political data science to investigate the intersection of politics, technology and democratic governance.