14 July 2014

Information bubbles: are they a myth or part of our daily digital lives?

The internet, and social media in particular, are often seen as important elements of modern democracy, not only because they make one-to-one and one-to-many communication easier, but because they can expose us to viewpoints different from our own. Being exposed to challenging viewpoints matters, because informed voters make better decisions. According to this belief, social media allow anyone to make his/her voice heard, which was not possible in traditional media due to space limitations and editorial filters. Skeptics, however, claim that social media come with filters of their own. To prevent information overload, either users follow only like-minded users, or algorithms personalize incoming information and show us what we already like and agree with. Due to this “filter bubble” effect, a liberal user, for instance, reads more liberal news, deliberates mostly with other liberal users, and his/her world-view is never challenged. Viewpoint diversity online would thus diminish due to these self-created or imposed bubbles.

While the skeptics’ claim has received attention in the media, very few scientific studies have examined viewpoint diversity. Facebook [1] was one of the first to study information diversity, and it defined diversity as “novelty”. In a study of 250 million users, Facebook scientists modified* the news feeds of some users by removing certain incoming information. Facebook concluded that thanks to weak ties (users who are in our networks but with whom we are not closely connected), bubbles are burst: we receive novel information on Facebook that we would not get elsewhere. However, this study is rather thin on theory, as “novelty” is not the only metric that can be used to measure diversity. Thanks to Facebook we may encounter “novel” websites, but they do not necessarily contain challenging viewpoints.

Another study [2] focused on Twitter and source diversity, checking whether users’ incoming tweets contained items from all ends of the political spectrum. The authors concluded that, thanks to Twitter’s “retweet” feature, bubbles did not occur. However, in media and communication studies, where the concept of information diversity has been studied extensively for many years, diversity is not measured by the number of available sources alone. This is because (1) a high number of sources does not indicate diversity of information or viewpoints; (2) even when there are many sources, minorities and marginalized groups can find it hard to reach a larger audience due to power imbalances; and (3) even if a user’s incoming feed is diverse, that does not mean he/she will actually consume this information.
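To make point (1) concrete, here is a tiny, hypothetical Python sketch (the outlet names and data are invented for illustration): a feed drawn from many distinct sources can still represent a single viewpoint.

```python
# Hypothetical toy feed: ten distinct outlets, all sharing the same viewpoint.
feed = [{"source": f"outlet_{i}", "viewpoint": "liberal"} for i in range(10)]

distinct_sources = len({item["source"] for item in feed})        # 10 sources
distinct_viewpoints = len({item["viewpoint"] for item in feed})  # 1 viewpoint

print(distinct_sources, distinct_viewpoints)
# 10 1 -> "diverse" by source count, but not by viewpoint
```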

In a comparative study of Dutch and Turkish Twitter users, we used metrics from existing social media analytics studies, but also added new ones drawing on theory from media and communication studies. We first crawled the tweets of popular (seed) Twitter users who mainly tweeted about political matters. This list included politicians, political parties, newspapers, bloggers, journalists, etc. We then crawled the regular users who retweeted these seed users, along with the retweets they made. Finally, we labelled seed users belonging to a small political party as a “minority”; this made, for instance, the Kurdish party in Turkey and the Greens in the Netherlands “minority” users. Minorities accounted for 15% of all tweets created by the popular users in both countries. According to our research, on a scale of 0 to 1, source diversity is around 0.6 for both countries, so we cannot observe bubbles using this metric. However, when we look at “minority access”, i.e. whether minorities can reach a larger audience, we do observe bubbles for Turkish users: the minorities cannot reach more than half of the studied Turkish users. Another interesting finding is that, while the diversity of users’ incoming feeds is reasonable in both countries, the diversity of their outgoing feeds (items they choose to retweet or reply to) is rather low. So, while users receive diverse information, they are still biased in what they share.
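As an illustration of how such metrics can be operationalized, the Python sketch below computes a source diversity score (here, normalized Shannon entropy over the sources in one user’s incoming feed) and a minority access score (the fraction of users reached by at least one minority source). The entropy-based formulation, the function names and the toy data are assumptions for illustration, not the paper’s exact implementation.

```python
from collections import Counter
from math import log

def source_diversity(feed_sources):
    """Normalized Shannon entropy (0-1) of the sources in one user's incoming feed.

    `feed_sources` is a list of seed-user ids, one per incoming tweet.
    0 means a single source dominates; 1 means tweets are spread evenly
    over all sources present in the feed. (Assumed operationalization.)
    """
    counts = Counter(feed_sources)
    if len(counts) <= 1:
        return 0.0
    total = sum(counts.values())
    entropy = -sum((c / total) * log(c / total) for c in counts.values())
    return entropy / log(len(counts))  # divide by the maximum possible entropy

def minority_access(feeds, minority_sources):
    """Fraction of users whose incoming feed contains at least one tweet
    from a minority source (e.g. a small political party)."""
    if not feeds:
        return 0.0
    reached = sum(1 for feed in feeds if any(s in minority_sources for s in feed))
    return reached / len(feeds)

# Hypothetical toy data: three users, two mainstream outlets, one minority party.
feeds = [
    ["news_a", "news_b", "party_x"],  # user 1
    ["news_a", "news_a"],             # user 2
    ["news_b"],                       # user 3
]
print(source_diversity(feeds[0]))           # 1.0 -> evenly spread over 3 sources
print(minority_access(feeds, {"party_x"}))  # 0.33... -> party_x reaches 1 of 3 users
```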

The results of our study raise various questions. How can minorities reach a larger public? Can this be achieved by design? Identifying minorities and their valuable tweets is no easy task, and showing these items to “challenge-averse” users is a real challenge. So far, “pushing” challenging information to users or giving users feedback about their biased news consumption does not seem to have a significant effect [3]. More research is needed to understand how users’ reading behavior changes and to determine the conditions that would allow such a change. Further, normative questions arise when making design decisions. When designing diverse recommendation systems, it is definitely a challenge to determine which view is “valid”. For instance, should a recommendation system show all viewpoints in the climate change debate, even if some viewpoints are not empirically validated or are simply considered false by the majority of experts? Should a viewpoint get equal attention even if it provides no information or only contains fallacious arguments? Should information intermediaries be required to build diversity into their design? These questions need to be addressed by a thorough ethical analysis before design decisions for such systems are made.

The paper can be downloaded at http://www.sciencedirect.com/science/article/pii/S0747563214003069


* Facebook is now in the spotlight for a more recent experiment it conducted without notifying users. That experiment raises major ethical concerns as well, but these are out of scope for this post.

  1. Bakshy, E., Rosenn, I., Marlow, C., & Adamic, L. (2012). The role of social networks in information diffusion. In Proceedings of the 21st International Conference on World Wide Web (pp. 519–528). URL: http://dl.acm.org/citation.cfm?id=2187907.
  2. An, J., Cha, M., Gummadi, K., & Crowcroft, J. (2011). Media Landscape in Twitter: A World of New Conventions and Political Diversity. In Proceedings of the Fifth International AAAI Conference on Weblogs and Social Media (pp. 18–25).
  3. Munson, S. A., Lee, S. Y., & Resnick, P. (2013). Encouraging reading of diverse political viewpoints with a browser widget. In International conference on weblogs and social media (ICWSM). Boston.

The author of this post is Engin Bozdag, research fellow of the Alexander von Humboldt Institute for Internet and Society. The post does not necessarily represent the view of the Institute itself. For more information about the topics of these articles and associated research projects, please contact presse@hiig.de.


Engin Bozdag

Former Fellow: Internet and Media Regulation
