01 January 2019

#NSFW? Be yourself but don’t undress

The world’s biggest social network plans to launch a new dating application, but at the same time it is banning all types of potentially sexualised behaviour in a new version of its community standards, including communication via private groups or messages. This threatens not only users’ freedom of expression but also their freedom of personal development.

No more “sexual behaviour”

“Bringing the world closer together” is Facebook’s mission statement, which could soon include a new dating application. Making use of all the data collected over the past years, the world’s biggest social media platform is working on its own dating service and testing it internally as well as in Colombia. Apparently it could be more of an additional feature inside the existing structure than a new, separate application. At almost the same time, Facebook updated its community standards, banning all types of “sexual activity” with effect from 15 October 2018. In doing so, the social network expands its strict ban on nudity in pictures to all types of social interaction, including even private messages. Until now, the focus was mostly on pictures showing nudes or sexual interaction. According to the new community standards, any form of sexual speech that “goes beyond simply mentioning” a state of “sexual arousal” or a “sexual act”, as well as any sexual action, is forbidden. In practice, a post or message containing any type of speech that expresses the wish for sexual interaction, or that simply arranges a date in explicit terms, could be subject to deletion.

Facebook has a reputation for being conservative, expressing this vision in its community standards and enforcing them globally. But it is not the only one: Tumblr, which until now was quite liberal about content showing nudity (so-called “adult” content would be flagged as “not safe for work”, short #NSFW), recently announced it would take down adult content, that is, any media that depicts “real-life human genitals or female-presenting nipples”, starting 17 December. In contrast to Facebook’s, this ban does not include text: “Written content such as erotica, nudity related to political or newsworthy speech, and nudity found in art, such as sculptures and illustrations, are also stuff that can be freely posted on Tumblr.” Tumblr’s reaction to allegations regarding child pornography is perceived as a change of policy following Verizon’s takeover in 2017.

Please share your life

This development of content moderation policies with regard to speech that might contain sexual content is a change for the worse. Social media platforms invite users to share every single detail of their lives, even the most intimate. They rely on user-generated content to generate interaction and are constantly fighting for their users’ attention in order to keep them on board as long as possible. While the business model is to collect data produced by users, on the basis of which micro-targeting becomes more and more precise, users are steadily losing ground when it comes to choosing how expressive they wish to be. If the pictures, videos or texts they wish to share with their respective communities are not consistent with the platforms’ standards, they will be deleted. Under the pretext of creating a “safe space” for communication, social networks become more restrictive, regardless of the age of the users affected or the actual content.

Facebook’s strict policy on nudity and sexual speech has been subject to criticism in the past, especially because it is more severe than national laws and contradicts the company’s mission of making the world more open and connected. Several pictures of high historical, artistic and journalistic value have been deleted for “nudity”, as have pictures showing women breastfeeding. The controversies surrounding these cases aren’t new, and Facebook has been under attack for imposing austere morals on its users. This phenomenon could be amplified by the use of artificial intelligence in proactive content moderation if there is no longer a “human in the loop”. Indeed, experts confirm that the technology used to retrieve unwanted content isn’t yet living up to expectations. Reports show that algorithms and filters still struggle with the recognition of visual content (e.g. distinguishing naked skin from desert sand), even though deep-learning-powered image recognition algorithms perform well at recognising single items and activities. The main issue is that the context of visuals isn’t incorporated in the filtering process, so no distinction is made between pornographic pictures and photography featuring nudes. Furthermore, the context needs to be assessed according to the respective cultural codes in different parts of the world, making a “one size fits all” solution impossible.

Private ordering and human rights

The example of Facebook’s new community guidelines on sexual behaviour shows, once more, that social expectations are high when it comes to the respect and protection of human rights. Although social media platforms, just like other private companies, aren’t bound by fundamental rights in the same way public authorities are, users perceive take-down decisions as a violation of their rights. Social media platforms legally have the right to govern their contractual relationship with users, including setting up a list of unwanted content, even if the speech would be considered legal under the laws of the user’s country of residence. One should therefore refrain from calling community guidelines “censorship” unless they rely on some kind of state-driven action. In German constitutional law, prior restraint, i.e. a public institution reviewing content before its publication, is absolutely forbidden by Art. 5 of the Basic Law. The European Convention on Human Rights takes a less strict approach in Art. 10: prior restraint would need to be proportionate but isn’t forbidden per se.

Nonetheless, social media platforms fulfil a special function in the digital sphere, and the big players in particular, such as Facebook, YouTube and Twitter, are expected to comply with human rights standards even where national laws do not oblige them to. Applying a strict non-sexualised-content policy is a major restriction on the way users communicate via Facebook. Not only does it limit their freedom of expression when they post visuals that are likely to be filtered, it also patronises them in their behaviour and in their right to free personality development. Even though an increasing number of courts tend to rule in favour of users when it comes to the deletion of legal content, it remains unclear how far they will interfere with the platforms’ freedom of contract.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Amélie Heldt

Former Associated Researcher: Platform Governance
