15 September 2022

Caring about privacy but accepting cookies? Questioning the privacy paradox.

Have you ever asked yourself why you agree to privacy terms such as cookie notices much faster, and weigh them less carefully, online than you would offline? This phenomenon is called the privacy paradox. But is it really so paradoxical, or might there be reasonable explanations for our online behaviour?

What is the privacy paradox?

The privacy paradox is understood as a concept attempting to reconcile consumer behaviour with privacy concerns (Martin, 2019, p. 67). In short, it can be defined as follows: “People’s concerns toward privacy are unrelated to their privacy behaviours. Even though users have substantial concerns with regard to their online privacy, they engage in self-disclosing behaviours that do not adequately reflect their concerns” (Dienlin & Trepte, 2014).

Living in an information society

Information and Communication Technologies (ICTs) “are radically transforming devices because they engineer environments that the user is enabled to enter through gateways, experiencing a form of initiation” (Floridi, 2010, p. 5, Ch. 1). ICTs are changing our world as much as they are creating new realities. The distinction between the analogue offline and the digital online is quickly becoming blurred. “This recent phenomenon is variously known as ‘Ubiquitous Computing’, ‘Ambient Intelligence’, ‘The Internet of Things’, or ‘Web augmented things’” (Floridi, 2010, p. 8, Ch. 1). Floridi (2010) calls this the information society, in which we depend on ICTs because they have become a part of our reality.

How cognitive barriers influence our privacy behaviour

In the information society, we have to make privacy decisions in a limited amount of time, which companies use to their advantage. Cranor (2012), for example, estimates “that it would take a user an average of 244 hours per year to read the privacy policies of every website she visits, or 54 billion hours per year for every United States consumer to read every privacy policy she encountered (McDonald & Cranor, 2008)” (Waldman, 2020).

Research has identified plenty of cognitive and behavioural barriers to rational privacy and disclosure decision-making (Acquisti, Brandimarte, & Loewenstein, 2015; Camerer, 1998). One of the most pervasive cognitive biases, for example, is hyperbolic discounting (Waldman, 2020): the tendency to overweight the immediate consequences of a decision and to underweight those that will only occur in the future, which makes rational disclosure decisions difficult for the consumer. On top of that, disclosure often carries immediate benefits, such as access. “But the risks of disclosure are usually felt much later. As such, our tendency to overvalue current rewards while inadequately discounting the cost of future risks makes us more willing to share now” (Waldman, 2020). Decision-making in general, and privacy decisions in particular, are thus affected by incomplete information and bounded rationality (Acquisti & Grossklags, 2005). Additionally, we do not have access to all the information necessary to make a fully informed judgement about the trade-offs involved (Kokolakis, 2017, p. 130).
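To make hyperbolic discounting more concrete, a common way of modelling it (a standard textbook formulation, not taken from the sources cited above) is the hyperbolic discount function, in which the subjective value V of a benefit A received after a delay D is

V = A / (1 + kD)

where k is an individual discount rate. Because the perceived value drops steeply for short delays and only slowly thereafter, an immediate benefit such as instant access to a website looms disproportionately large compared with a privacy risk that may only materialise months or years later.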

An alternative approach

Kristen Martin found in her study that “consumers may be consistent in stating privacy concerns and expectations in surveys, while also retaining those concerns and expectations after engaging with a website” (2019, p. 72). She and other scholars push to rethink the privacy paradox’s underlying assumption that all privacy claims disappear after disclosure (Dienlin & Trepte, 2014; Martin, 2019; Solove, 2021). This assumption is problematic because it leaves no clear responsibility for the consumer’s data.

Privacy as a core value

When we say that privacy is a core value, it means that privacy needs to be protected at all times and that firms should be held accountable for what happens with consumers’ data after disclosure. Martin writes that “scholars who make normative claims about privacy have been arguing for privacy as a core value, which is necessary for individual autonomy and development, to foster intimacy and relationships, and for societies to flourish” (2019). Core values are not negotiable; they are positive goals we seek to attain and require in our communities (Donaldson & Walsh, 2015; Martin, 2019). As Martin points out, a positive obligation to identify and respect consumers’ privacy expectations is essential to business ethics (Martin, 2018; Shue, 2020).

Where to go from here?

Evidence suggests that individuals care about their privacy even after disclosing information. Human cognitive biases in the online environment, together with the information missing for a fully informed decision, make decision-making difficult and explain the gap between users’ privacy preferences and their disclosure behaviour. This raises the question of whether the privacy paradox really is a paradox at all. The privacy paradox furthermore has problematic implications for responsibility over consumers’ data: it places the primary responsibility for personal data on consumers, which means that companies bear little to no responsibility. An alternative approach would be to see privacy as a core value, which would give firms a positive obligation to identify and respect privacy expectations.

Johanna Klix

Former student assistant: Knowledge & Society
