13 November 2019

Busted: Internet platforms are not liable for user-generated content

You will hear this myth a lot online: whatever people post does not fall under the responsibility of platform operators. Amélie Heldt takes a close look at this myth to determine whether there is any truth to it.

In time for this year’s Internet Governance Forum (IGF), Matthias C. Kettemann (HIIG) and Stephan Dreyer (Leibniz Institut für Medienforschung | Hans-Bredow-Institut (HBI)) will be publishing a volume called “Busted! The Truth About the 50 Most Common Internet Myths”. As an exclusive sneak peek, we are publishing a selection of these myths here on our blog – some of them have been busted by HIIG’s own researchers and associates. The entire volume will soon be accessible at internetmyths.eu.


Myth

Internet platforms are merely a conduit for user-generated content. They function like a pipe and do not look at the content itself, which is why they are neither liable nor responsible for unlawful content uploaded by their users.

Busted

Originally, Internet platforms were thought of as distributors, not publishers: content-neutral platforms that enable their users to share content without verifying or curating it. This principle was translated into a landmark piece of U.S. Internet legislation, Section 230 of the 1996 Communications Decency Act (hereinafter Sec. 230 CDA), according to which (simply put) no “interactive computer service” is to be treated as the publisher or speaker of user-generated content (UGC), and hence is not liable for what users express through it. Internet platforms remain liable for UGC only in cases of federal criminal law and intellectual property claims, or when they perform an editorial role. The origins of this rule lie in a U.S. Supreme Court decision (Smith v. California) concerning the liability of a bookstore owner as compared to that of an author or publisher: the Court held that the owner could not be held criminally liable without knowledge of the content, that is, merely for possessing a book containing obscene images. Imposing such liability was deemed unconstitutional under the First Amendment, even though obscene speech itself is not protected.

Other jurisdictions have adopted similar legislation, e.g. Section 79 of the Indian Information Technology Act, which provides qualified immunity for intermediaries, or Article 14 of the EU’s E-Commerce Directive 2000/31/EC.

However, there has been a noticeable change over the past five years: the EU judiciary and the European legislator have moved towards stricter liability for platforms. In the course of introducing more balanced responsibility-sharing, intermediary immunity has been increasingly restricted when it comes to hate speech, terrorist propaganda or copyright infringements, e.g. by Art. 17(3) of the 2019 EU Directive on Copyright in the Digital Single Market, under which online content-sharing service providers are now liable for copyright infringements. Under the German Network Enforcement Act (2017), platforms have to ensure an effective complaint procedure for “manifestly unlawful content”. Certain EU proposals would even make it de facto mandatory for platforms to proactively filter UGC in order to prevent illegal content from being uploaded, but these proposals are highly controversial. All in all, the question of platform liability highlights the differences in speech regulation between the US and the EU.

Apart from the limits imposed by lawmakers, the principle itself has come under attack. The basis of this relative intermediary immunity, namely the platforms’ neutrality vis-à-vis the content itself, is in most cases a chimera. Platforms classify, prioritize and moderate UGC, and technology increasingly enables them to identify and remove content before it is even flagged by a user, making the analogy of the uninformed bookstore owner obsolete.

Truth

Internet platforms are not merely neutral distributors of content they neither know nor care about. While platform liability in the US is narrowly limited, European law recognizes a more nuanced responsibility regime, especially with regard to the protection of intellectual property, clearly illegal content and serious offences such as the promotion of terrorism.


Sources

Daphne Keller, Toward a Clearer Conversation About Platform Liability, Knight First Amendment Institute’s “Emerging Threats” Essay Series (2018), https://knightcolumbia.org/content/toward-clearer-conversation-about-platform-liability.

Aleksandra Kuczerawy, Intermediary Liability & Freedom of Expression: Recent Developments in the EU Notice & Action Initiative (2014), Computer Law and Security Review 31 (2015) 1, 46-56; CiTiP Working Paper 21/2015, https://ssrn.com/abstract=2560257.

Amélie Heldt

Former Associated Researcher: Platform Governance
