17 December 2019| doi: 10.5281/zenodo.3753022

TikTok, a kaleidoscope of visuals, data and legal questions

Are social media platforms empowering people to connect beyond borders, or are they custodians of digital communication spaces where users must comply with strict content moderation rules? The recent example of TikTok shows that platforms still struggle to combine user-friendly services with freedom of expression principles.


TikTok is the new social media platform “en vogue”. It hosts chatoyant user-generated pictures, GIFs, music, and videos, underlined by filters and effects that set the user experience apart from the usual services. TikTok’s image is young, fun, and easy, and the pace is commonly described as so fast it leaves users feeling dizzy. The TikTok application currently leads downloads among teenagers and young adults, confirming a trend towards increasingly audio-visual content dissemination and, subsequently, towards visual communication. As one user describes it: “The service is supposed to help you create and to use your cinematographic imagination to make others smile”. But this enthusiasm among users is not fully shared by experts and freedom of expression advocates, and here’s why.

A platform like an amusement park

One of TikTok’s most important features is that it provides not only the tools to create exciting imagery and the platform infrastructure to publish it, but also a recommender system, based on human moderation and machine learning, which constantly supplies new content. If other large social media platforms are sometimes compared to a newsstand or a marketplace, TikTok is more like an amusement park: users have constant access to young, fun, and entertaining content. On its trending page, they find suggestions on what they should watch and what they should post: the system nudges them both ways. The app and its recommender system have come under scrutiny mainly over data protection, also because the service is owned by the Chinese tech company ByteDance. Experts fear that user data from the U.S. or the E.U. might be collected and later analysed by Chinese authorities. This speculation would be even more troubling if the data were somehow used in combination with the Chinese social scoring system. There is a general concern about data protection on TikTok, especially because the app collects data from everyone, not only from registered users. The German newspaper Süddeutsche Zeitung reported that TikTok uses the ‘fingerprinting’ method to track users. U.S. authorities have now started investigating the company’s data protection practices, and experts confirm that the Chinese authorities might indeed be looking into user data from abroad as they expand their media control globally.

How user-generated content is perceived in China could explain the company’s criticised content moderation policy. The community guidelines sound similar to those of other large social media platforms, but their wording is much vaguer and more generic. In addition, their enforcement is opaque to users. All in all, TikTok’s content moderation policy repeats the mistakes of previous platforms, both in what is removed and in how. First, TikTok’s community guidelines forbid the usual categories of unwanted content without further describing what is meant. There are inconsistencies, such as forbidding copyright infringements without providing any guidance on how to legally include third-party content, although the app is based on the dissemination of visuals and music snippets (and users rarely compose the music themselves). The guidelines ban any type of content showing nudity without mentioning exceptions such as breastfeeding or art. For the sake of space, I will not go into every possible point of critique, but so far TikTok’s community guidelines are simplistic and lead to a shallow treatment of the issue. As to the German peculiarity, the Network Enforcement Act (NetzDG), TikTok provides an explanation which unfortunately contains misleading information: according to TikTok, content that was subject to a complaint but does not fulfil the criteria for immediate removal within 24 hours shall be removed within seven days. In reality, under the NetzDG, the content may as well not be removed at all if it is not punishable under German criminal law. In addition to a “NetzDG complaint” option next to each post (similar to YouTube), users can also access the complaint form via TikTok’s privacy and safety information page.

A secret change in moderation

A recent report by Netzpolitik.org gives unique insights into how content is moderated on TikTok. In this online spreadsheet, a secret source describes how the content moderation guidelines were changed after a critical piece by the Guardian (which, on a side note, demonstrates the importance of reporting bad practices), according to which the company instructed moderators to ‘censor’ content related to ‘Tiananmen Square, Tibetan independence, or Falun Gong’. Indeed, several users were locked out of their accounts for talking about detention camps in China. Another important takeaway of Netzpolitik’s report is how reviewers ‘censor’ content, i.e., not only by removing it but mainly by downgrading it. According to the community guidelines, content that violates the rules shall be removed or the account closed. Decreasing the visibility of user-generated content is not mentioned as a form of sanction, but it is apparently used as an easier way of hiding unwanted content without being exposed to the allegation of censorship. Investigations have shown that TikTok systematically downgraded images of people with disabilities or who are overweight. The company tried to justify this behaviour as a way of protecting users from cyber-bullying. But for a platform that describes itself as an ‘inclusive community’, this argument seems – again – contradictory. It also conflicts with its (presumed) legal status as a neutral platform: curating and ranking content to this extent increasingly resembles a publisher’s activities.
To conclude, it seems impossible to keep politics out of the TikTok equation. Be it the alarming situation in Hong Kong, the creeping social scoring system or the ongoing human rights violations against minorities in mainland China – one cannot help but wonder whether TikTok will be a space for more freedom of expression or a censorship agent, possibly controlled by a government (or its laws as a proxy). Its vague and yet restrictive content moderation policy is a source of concern for users, and not only because of the alleged link to the Chinese state. The company is also noticeably pressured by public opinion as well as by a rapidly changing market: if TikTok is unable to convince its users that the app treats people worldwide respectfully, instead of removing and hiding unwanted content on the basis of opaque community standards, it might be outpaced by other services. For instance, Instagram (not to say that its policies are exemplary) has already launched an in-app feature very similar to TikTok, called Reels (Cenas in Brazil), and history might repeat itself when one recalls how Instagram Stories took over Snapchat’s format.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Amélie Heldt

Former Associated Researcher: Platform Governance
