01 January 2019

#NSFW? Be yourself – but not naked

The world’s biggest social network plans to launch a new dating application, but at the same time bans all types of potentially sexualised behaviour in the new version of its community standards, including communication via private groups or messages. This endangers not only users’ freedom of expression but also their personal development.

No more “sexual behaviour”

“Bringing the world closer together” is Facebook’s mission statement, which could soon include a new dating application. Making use of all the data collected over the past years, the world’s biggest social media platform is working on its own dating service and testing it internally as well as in Colombia. Apparently, it could end up as an additional feature inside the already existing structure rather than a new, separate application. At almost the same time, Facebook updated its community standards, banning all types of “sexual activity” with effect from 15 October 2018. In doing so, the social network extends its strict ban on nudity in pictures to all types of social interaction, even including private messages. Until now, the focus was mostly on pictures showing nudity or sexual interaction. According to the new community standards, any form of sexual speech that “goes beyond simply mentioning” a state of “sexual arousal” or a “sexual act” is forbidden. In practice, a post or message that expresses the wish for sexual interaction, or simply arranges a date in explicit terms, could be subject to deletion.

Facebook has a reputation for being conservative, expressing this vision in its community standards and enforcing them globally. But it is not the only one: Tumblr, which was until now quite liberal about content showing nudity (so-called “adult” content would be flagged as “not safe for work”, #NSFW for short), recently announced it would take down adult content, that is, any media that depicts “real-life human genitals or female-presenting nipples”, starting 17 December. In contrast to Facebook’s, this ban does not include text: “Written content such as erotica, nudity related to political or newsworthy speech, and nudity found in art, such as sculptures and illustrations, are also stuff that can be freely posted on Tumblr.” Tumblr’s reaction to allegations regarding child pornography is perceived as a change of policy following Verizon’s takeover in 2017.

Please share your life

This development of content moderation policies with regard to speech possibly containing sexual content is a change for the worse. Social media platforms invite users to share every single detail of their lives, even the most intimate ones. They rely on user-generated content to generate interaction and are constantly fighting for their users’ attention in order to keep them on board as long as possible. While the business model is to collect user-generated data that makes micro-targeting ever more precise, users are constantly losing ground when it comes to the freedom to express themselves. If the pictures, videos or texts they wish to share with their respective communities are not consistent with the platform’s standards, they will be deleted. Under the pretext of creating a “safe space” for communication, social networks are becoming more restrictive, regardless of the age of the users affected or the actual content.

Facebook’s strict policy on nudity and sexual speech has been subject to criticism in the past, especially because it is more severe than national laws and contradicts the company’s mission of making the world more open and connected. Several pictures of high historical, artistic and journalistic value have been deleted due to “nudity”, as well as pictures showing women breastfeeding. The controversies regarding these cases aren’t new, and Facebook has been under attack for imposing austere morals on its users. This phenomenon could be amplified by the use of artificial intelligence in proactive content moderation, if there is no longer a “human in the loop”. Indeed, experts confirm that the technology used to retrieve unwanted content isn’t yet living up to expectations. Reports show that algorithms and filters still struggle with the recognition of visual content (e.g. differentiating naked skin from deserts), even though deep-learning-powered image recognition algorithms perform well at recognising single items and activities. The main issue is that the context of visuals isn’t incorporated into the filtering process, which fails to distinguish pornographic pictures from photography showing nudes. Furthermore, the context needs to be assessed according to the respective cultural codes in different parts of the world, making a “one size fits all” solution impossible.
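To make that blind spot concrete, consider a deliberately simple, hypothetical filter that judges an image only by the share of skin-coloured pixels. The skin-tone rule, the threshold and the file name below are assumptions for the sake of illustration; real moderation systems rely on deep learning classifiers, but they share the same limitation of scoring pixels without any notion of context.

```python
# Minimal sketch of a pixel-only "nudity" heuristic. The skin-tone range,
# threshold and file name are illustrative assumptions, not a real
# moderation system. The point: such a filter only sees colour statistics
# and has no notion of context, so an artistic nude, a beach photo and a
# desert landscape can all score similarly.
from PIL import Image


def skin_pixel_ratio(path: str) -> float:
    """Return the fraction of pixels falling into a rough skin-tone RGB range."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    skin = sum(
        1
        for r, g, b in pixels
        if r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15
    )
    return skin / len(pixels)


def flag_as_nsfw(path: str, threshold: float = 0.4) -> bool:
    """Flag an image purely on pixel statistics; no context is considered."""
    return skin_pixel_ratio(path) > threshold


if __name__ == "__main__":
    # A desert photo or a Renaissance painting may well exceed the threshold,
    # while clearly sexual content with little visible skin may pass.
    print(flag_as_nsfw("example.jpg"))
```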

Private ordering and human rights

The example of Facebook’s new community guidelines on sexual behaviour shows – once more – that social expectations are high when it comes to the respect and protection of human rights. Although social media platforms, just like other private companies, aren’t bound by fundamental rights the way public authorities are, users perceive take-down decisions as a violation of their rights. Social media platforms legally have the right to govern their contractual relationship with users, including setting up a list of unwanted content, even if that speech would be considered legal under the laws of the user’s country of residence. One should therefore refrain from calling community guidelines “censorship”, unless their enforcement relies on some kind of state-driven action. In German constitutional law, prior restraint is absolutely forbidden by art. 5 Basic Law, but only where a public institution is involved and content is controlled before its publication. The European Convention on Human Rights has a less strict definition of censorship in art. 10, i.e. prior restraint would need to be proportionate but isn’t forbidden per se.

Nonetheless, social media platforms fulfil a special function in the digital sphere, and especially the big players such as Facebook, YouTube and Twitter are expected to comply with human rights standards, even if not obliged to by national laws. Applying a strict non-sexualised-content policy is a major restriction on the way users communicate via Facebook. Not only does it limit their freedom of expression when it comes to posting visuals that are likely to be filtered, it also patronises them in their behaviour and curtails their right to the free development of their personality. Even though an increasing number of courts tend to rule in favour of users when it comes to the deletion of legal content, it remains unclear how far they will interfere with the platforms’ freedom of contract.

This post reflects the opinion of the authors and neither necessarily nor exclusively the opinion of the institute. For more information about the content of these posts and the associated research projects, please contact info@hiig.de

Amélie Heldt

Former Associated Researcher: Platform Governance
