17 December 2019 | doi: 10.5281/zenodo.3753022

TikTok: A kaleidoscope of images, data, and legal questions

Do social media platforms enable people to connect across borders, or are they the gatekeepers of digital communication spaces in which users must comply with strict, content-related rules? The recent example of TikTok shows that platforms still struggle to reconcile user-friendly services with principles of freedom of expression.


TikTok is the new social media platform “en vogue”. It hosts chatoyant user-generated pictures, GIFs, music, and videos, enhanced by filters and effects that set the user experience apart from usual services. TikTok’s image is young, fun, and easy, and the pace is commonly described as particularly fast, leaving users feeling somewhat dizzy. Currently, the TikTok application leads in downloads amongst teenagers and young adults. This confirms a trend towards increasingly audio-visual content dissemination and, consequently, towards visual communication. As one user describes it: “The service is supposed to help you create and to use your cinematographic imagination to make others smile”. But this enthusiasm among users is not fully shared by experts and freedom of expression advocates, and here’s why.

A platform like an amusement park

One of TikTok’s most important features is that it not only provides the tools to create exciting imagery and the platform infrastructure to publish it, but also operates a recommender system, based on human moderation and machine learning, that constantly supplies new content. If other large social media platforms are sometimes compared to a newsstand or a marketplace, TikTok is more like an amusement park: users have constant access to young, fun, and entertaining content. On its trending page, they will find suggestions on what they should watch and what they should post: the system nudges them both ways. The app and its recommender system have come under scrutiny mainly over questions of data protection, not least because the service is owned by the Chinese tech company ByteDance. Experts fear that user data from the U.S. or the E.U. might be collected and later analysed by Chinese authorities. This speculation would be even more troubling if the data were somehow used in combination with the Chinese social scoring system. There is a general concern about data protection on TikTok, especially because the app collects data from everyone, not only from registered users. The German newspaper Süddeutsche Zeitung reported that TikTok uses the ‘fingerprinting’ method to track users. U.S. authorities have now started investigating the company with regard to data protection, and experts confirm that the Chinese authorities might indeed be looking into user data from abroad as they expand their media control globally.
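The ‘fingerprinting’ technique mentioned above can be sketched in principle: a tracker combines stable device and browser attributes into a single identifier that persists even without cookies. The attribute names and hashing scheme below are illustrative assumptions only, not TikTok’s actual implementation.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Derive a stable identifier from device/browser attributes.

    Illustrative sketch: real fingerprinting scripts combine many more
    signals (canvas rendering, installed fonts, audio stack, etc.).
    """
    # Sort keys so the same attributes always yield the same hash.
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical visitor: these values alone can make a device identifiable.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "Europe/Berlin",
    "language": "de-DE",
}
fp = fingerprint(visitor)
```

Because the identifier is recomputed from the device itself on every visit, clearing cookies does not reset it, which is what makes the method relevant for data protection law.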

How user-generated content is perceived in China could be the reason for the company’s much-criticised content moderation policy. The community guidelines sound similar to those of other large social media platforms, but their wording is much vaguer and more generic. In addition, their enforcement is opaque to users. All in all, TikTok’s content moderation policy repeats the mistakes of previous platforms, both in what is removed and in how it is removed. First, TikTok’s community guidelines forbid the usual categories of unwanted content without further describing what is meant. There are inconsistencies, such as forbidding copyright infringements without providing any guidance on how to legally include third-party content, although the app is based on the dissemination of visuals and music snippets (and users rarely compose the music themselves). The guidelines ban any type of content showing nudity without mentioning exceptions such as breastfeeding or art. For the sake of space, I will not go into every possible point of critique, but so far TikTok’s community guidelines are simplistic and lead to a shallow treatment of the issue. As to the German peculiarity, that is, the Network Enforcement Act (NetzDG), TikTok provides an explanation which unfortunately contains misleading information: according to TikTok, content that was subject to a complaint but does not fulfil the criteria for immediate removal within 24 hours shall be removed within seven days. In reality, under the NetzDG, the content may as well not be removed at all if it is not punishable under German criminal law. In addition to providing a “NetzDG complaint” option next to each post (similar to YouTube), users can also access the complaint form via TikTok’s privacy and safety information page.

A secret change in moderation

A recent report by Netzpolitik.org gives unique insights into how content is moderated on TikTok. In an online spreadsheet, a secret source describes how the content moderation guidelines were changed after a critical piece by the Guardian (which, on a side note, demonstrates the importance of reporting bad practices), according to which the company instructs moderators to ‘censor’ content related to ‘Tiananmen Square, Tibetan independence, or Falun Gong’. Indeed, several users were locked out of their accounts for talking about detention camps in China. Another important take-away of Netzpolitik’s report is how reviewers ‘censor’ content, that is, not only by removing it but mainly by downgrading it. According to the community guidelines, content that violates the rules shall be removed or the account closed. Decreasing the visibility of user-generated content is not mentioned as a form of sanction, but it is apparently used as an easier way of hiding unwanted content without being exposed to the allegation of censorship. Investigations have shown that TikTok systematically downgraded videos of people with disabilities or who are overweight. The company tried to justify this behaviour as a way of protecting users from cyber-bullying. But for a platform that describes itself as an ‘inclusive community’, this argument seems, again, contradictory. It also conflicts with its (presumed) legal status as a neutral platform: curating and ranking content to this extent increasingly resembles the activities of a publisher.
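The difference between removing content and downgrading its visibility can be illustrated with a minimal ranking sketch. The field names, scores, and penalty factor below are hypothetical; real recommender systems are far more complex, but the mechanism is the same: flagged posts stay online yet sink in the feed.

```python
def rank_feed(posts, downgraded_ids, penalty=0.5):
    """Order posts for a feed by score, highest first.

    Posts flagged by moderators are not deleted; their score is merely
    multiplied by a penalty factor, so they drop in the ranking while
    remaining technically accessible. All names are illustrative.
    """
    def effective_score(post):
        score = post["score"]
        if post["id"] in downgraded_ids:
            score *= penalty  # visibility reduction instead of removal
        return score

    return sorted(posts, key=effective_score, reverse=True)

posts = [
    {"id": "a", "score": 10.0},  # flagged by a moderator
    {"id": "b", "score": 8.0},
]
feed = rank_feed(posts, downgraded_ids={"a"})
```

From the user’s perspective the flagged post simply never surfaces, which is why this form of sanction is so hard to detect and contest compared to an outright removal.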
To conclude, it seems impossible to keep politics out of the TikTok equation. Be it the alarming situation in Hong Kong, the creeping social scoring system, or the ongoing human rights violations against minorities in mainland China: one cannot help but wonder whether TikTok will be a space for more freedom of expression or a censorship agent, possibly controlled by a government (or its laws as a proxy). Its vague and yet restrictive content moderation policy is a source of concern for users, and not only because of the alleged link to the Chinese state. Moreover, the company is noticeably pressured by public opinion as well as by a rapidly changing market: if TikTok is unable to convince its users that it treats users worldwide respectfully, instead of removing and hiding unwanted content on the basis of opaque community standards, it might be outpaced by other services. For instance, Instagram (not to say that its policies are exemplary) has already launched an in-app feature very similar to TikTok, called Reels (Cenas in Brazil), and history might repeat itself, recalling how Instagram Stories took over Snapchat’s format.

This post reflects the opinion of the authors and neither necessarily nor exclusively the opinion of the institute. For more information on the content of these posts and the associated research projects, please contact info@hiig.de

Amélie Heldt

Former Associated Researcher: Platform Governance
