26 June 2018 | doi: 10.5281/zenodo.1303313

Social order in the digital society

In today’s digital economy, data is so important that many observers call it “the oil of the 21st century”. The corporations that collect the most of it gain a competitive advantage. The business model of internet giants such as Google, Amazon, and Facebook is built on facilitating user interaction and thereby generating huge amounts of data. Fourcade describes this basic structure of the data economy as a Faustian bargain: in exchange for free services, we give away our soul in the form of our privacy.

On May 7, 2018, Marion Fourcade continued the lecture series Making Sense of the Digital Society. Fourcade is a professor of sociology at UC Berkeley and an associate fellow of the Max Planck Sciences Po Center on Coping with Instability in Market Societies. Her upcoming book The Ordinal Society investigates new forms of social stratification and morality in the digital economy.

Only two days after Karl Marx’s 200th birthday, Fourcade fittingly focused on questions of social inequality and exclusion. In her talk on social order in the digital society, she addressed the social consequences of today’s data collection practices.

Fourcade notes that economic interest in personal data predates digitization. The credit industry in the United States started collecting information about merchants as early as the 1840s in order to evaluate creditworthiness, and consumer credit reporting began in the 1870s. In the 1970s and 80s, the credit industry concentrated rapidly. Together with computerization, this led to ever more precise financial profiles. The classification of individuals became increasingly differentiated and was used to decide on credit conditions: the credit score was born. This logic of quantification and efficiency is now spreading to further sectors, such as insurance, health, and the job market, and is thereby affecting our life chances in many different ways.

One societal implication of this development that Fourcade diagnoses is a regime of visibility. In the digital age, it has become ever harder to escape the quantification and evaluation of our actions; a society of transparent citizens is emerging. Transparency, however, is a very one-sided matter in the digital society: while we are becoming increasingly transparent, the corporations’ handling of our data is becoming increasingly opaque. Trying to escape does not seem to be a viable option either. On the contrary, according to Fourcade, “invisibility” can have negative consequences: being invisible is itself an evaluative category, and many classification systems treat it as untrustworthy.

For example, a US scientist who, as a self-experiment, tried to hide her pregnancy by paying exclusively in cash attracted the attention of the authorities after trying to buy an Amazon voucher worth 500 dollars. In the US, paying in cash is associated with low-income earners and, by extension, with crime such as money laundering. People who fall into this category therefore suffer structural discrimination. Drawing on Marx, Fourcade calls this group the “lumpenscoretariat”.

Much of what used to remain in the private realm is visible today. Referring to Pierre Bourdieu, Fourcade identifies a new form of capital in the digital society: “übercapital”. The term is an ironic allusion to Uber, but it also hints at the German word “über”, which translates as “above” and, in the sense of “meta”, points to our lack of self-determination. As an aggregated evaluation of our digital traces, “übercapital” determines our position in social space, yet it lies beyond our access and control. In this way it decides our access to goods and services and, ultimately, our life chances.

Efficiency and profits are generated in the digital society by classifying people based on algorithmic analyses. As a consequence, people are increasingly under pressure to adapt and optimize their behavior. Thus, the digital order of classification and ranking is a moral order as well.

At the same time, hidden structural forms of discrimination are built into the design and conception of algorithms. “Übercapital” therefore tends to reinforce existing inequalities. The Orwellian potential of these new digital means of social control can currently be observed in China, where the authorities are experimenting with a “social score” that integrates various databases in order to evaluate the behavior of companies, individuals, and organizations. The score ultimately determines access to goods and services.

The inequalities and forms of exclusion that result from the rating economy are harder to politicize than those of the past. People are no longer grouped on the basis of tangible status characteristics; they are sorted and ranked individually by a largely invisible and opaque classification system. This makes it difficult to form bonds of solidarity among individuals who share the same social status and experiences of exclusion. Instead, the moralizing logic of “übercapital” conveys the impression that individuals are personally responsible for their position and for any disadvantages they face.

Historically, people have always had to create the basis for collective action through a shared understanding of their common position; only on this basis could narratives be formed that enable political mobilization. That is why the current debate about digitization and its societal implications is so relevant: it is a first step towards mitigating new inequalities and forms of exclusion in the digital society.

The lecture series Making Sense of the Digital Society will continue on September 24, 2018 with Stephen Graham. If you want to stay up to date, you can subscribe to our event newsletter here.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Marc Pirogan

Student Assistant: Internet Policy and Governance

Felix Beer

Former Student Assistant: The evolving digital society
