15 December 2016

Datafication and Consumer Trust

We’re already living in a highly data-driven society. Algorithms shape the content displayed to us: the results we get from search engines, the news we read on Facebook and Twitter, the purchases we make via loyalty programs, the movies we watch on Netflix and the music we listen to on Spotify. Our behaviours are logged, collected and analysed by a growing number of operators for a growing number of purposes, and the degree of automation in this data-driven society keeps increasing.

Seen from a consumer perspective, this development naturally provides a number of services and innovations that we appreciate and are often prepared to pay for. But it also poses challenges, both to consumer protection and to how the authorities develop their supervisory methods and role. Digital consumer profiling is in many senses the foundation of the digital economy: an economy that includes mega operators such as Google and Facebook, but also media houses that own numerous newspapers and media websites, relative newcomers in the sharing economy, the marketing industry, and both e-commerce and traditional “brick-and-mortar” stores through, for example, loyalty cards and programs. In a new overview of digitalisation and consumer interest that I have written for the Swedish Consumer Agency, I highlight some emerging trends that need to be examined more closely. In short, how consumer data is managed, and which operators and tools are used for moderating, analysing and trading in that data, will be of crucial importance for consumers’ status in the digital economy.

The need for critical studies

First of all, we can conclude that in committing to data-driven innovation we also need to encourage a critical perspective on, and knowledge about, how to manage data-driven processes, algorithmically controlled processes and data analyses. We need to be able to recognise when consumers need protection and empowerment, identify winners and losers, and strive for transparency both with regard to how the machines work and with regard to the regulations we would like to see guiding their work. This places demands on legislators, politicians and supervisory authorities, as well as on industry organisations and academia. In brief:

  • Seen from a consumer protection policy perspective, datafication entails a growing information asymmetry between the consumers and the market operators. We need to develop consumer protection, but the supervisory authorities also need to develop their supervisory and collaborative roles, both within and between authorities. This is as much an issue of power as of integrity and privacy.
  • We, as consumers, commit to hundreds of agreements in the course of our daily digitalised lives. What are the implications of agreements that can only be understood and influenced by one of the parties? Seen from a consumer perspective, user agreements should certainly be shorter, clearer and easier to influence individually. But, are there any other ways of managing informed consent in a time of information overload? We need to figure this one out.

Consumer power also requires insight. In 2015, for example, the Norwegian Data Inspectorate (Datatilsynet) conducted a study of all the parties present when a user visited the front pages of six Norwegian newspapers. It found that between 100 and 200 cookies were downloaded to the user’s computer, that the user’s IP address was forwarded to 356 servers, and that on average 46 third parties were present at each of the automated ad trades taking place on the newspapers’ home pages. None of the six newspapers provided public information about the presence of such a large number of third-party companies. How can consumers be expected to choose safe services if they are unaware of which parties are present, what information is collected or what it is used for?

Knowledge, transparency and balance

Addressing the challenges consumers face in a digital economy requires knowledge, insight and balance. We need to improve our knowledge of these relatively new developments and their implications. This means research, preferably across several disciplines that specialise in data, society, law, culture and the economy, and these fields need to communicate much better than they generally do. The academic ways of organising, funding and publishing research tend not to help in this regard. HIIG is a good example of such an interdisciplinary venture, but we need more. Much more. It should be the model, not the exception. At the same time, this poses a challenge to supervisory and organisational methods when it comes to recognising the downsides of automated processes. And finally, there is a constant need to maintain an articulated balance between the market and consumer protection, which is both a political and a legal matter. Among other things, there is a risk that consumer trust in digital services will be weakened if the use of personal information is perceived as illegitimate. And with weaker trust, the potential benefits of the digital economy will very likely be weakened as well.


Stefan Larsson is an Associate Professor in Technology and Social Change at Lund University Internet Institute (LUii) in Sweden, and a member of the Swedish Consumer Agency’s scientific council. From August to October 2016 he was a visiting researcher at the Alexander von Humboldt Institute for Internet and Society.

Photo: flickr.com  CC BY-NC 2.0

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.
