26 April 2018

Data protection and designing technology

In his dissertation Jörg Pohle uncovers the history of ideas and the historical construction of the data protection problem and data protection as an abstract solution – including the architecture of its legal implementation. The aim of his work is to critically evaluate this construction and to draw conclusions for the design of ICT systems. For our dossier on GDPR, we asked him a few questions:

What inspired you to write your dissertation?

At the beginning of my work on the dissertation I planned to investigate how legal requirements – for example from the Federal Data Protection Act – could be translated into technical requirements. A one-to-one implementation is of course not possible. During my research, however, I found that for data protection law – and much more generally for all privacy and surveillance theories – it has never been clarified what the (legal) good to be protected actually is. There is no consensus on what assumptions are made about information processing and use, and on what exactly gives rise to the problem that is to be solved. I therefore changed my research question and examined how the problem was historically constructed in the discourse; which legal, organisational and technical means or measures were proposed for its solution; and whether and to what extent this construction is still tenable from an IT and information science point of view.

…and what answer did you find while writing it?

The result of my work is that data protection – as the solution to the problem of information power created by the industrialization of social information processing – must be re-derived. The aim is to explain why and how information-processing organisations threaten fundamental rights and freedoms, but also social values such as the rule of law and democracy. To this end, I have presented a data protection attacker model that reflects the state of the sociological, legal and computer science debate, and an analytical framework for a threat analysis based on it. Finally, I have shown how information technology systems can be developed on this basis to avert these threats and to secure individual and societal freedom.

For which target group are the outcomes particularly interesting?

Firstly, for all those interested in the history of ideas in the fields of privacy, surveillance and data protection; secondly, for lawyers, who thus receive a scientifically sound justification for data protection that is not based on traditional 19th century ideas of privacy; thirdly, for computer scientists who want to examine socio-technical systems for their individual and social effects or design information technology systems that protect existing spaces of freedom and at the same time create new freedoms.

What was your best and worst finding during the research?

The most exciting realization was that almost all of the discussions we are having today about the computerization of society – both in science and in the wider public – had already been held between the late 1960s and the early 1980s. The “worst” is that the work done back then was orders of magnitude better than what is currently being produced under labels such as “digitization”, “algorithms” and “AI”…

How is your publication related to GDPR?

The work shows at which points the GDPR simply misses the problem. For example, because it makes assumptions that are not correct, such as the fixation on “personal data”, which assumes that fundamental rights and civil liberties cannot also be violated with anonymous data. Or because it conjures up unrealistic or merely incidental dangers while not even looking at essential problems. On the other hand, my work can be used to give substance to the analysis and implementation processes laid down in the GDPR – from the data protection impact assessment to data protection by design and data protection by default – so that in the end individuals, groups and society as a whole can actually be protected from the information power of states and large private organisations.

What new perspectives does this open up for you on the subject of data protection?

On the one hand, it opens up the continuation of earlier discussions: for example on the objectives of data protection, as formulated by Adalbert Podlech, according to which data protection is the solution to the “technologically mediated social problem” of “determining and enforcing the conditions under which a society’s information conduct can be acceptable to the members of that society”. On the other hand, I have found various, especially technical, approaches in the literature that are worth following up and translating into IT systems – and I am currently working on that.

In your opinion, is GDPR an effective tool?

No, the GDPR is not an effective instrument, because there are only two things the Regulation has effectively expanded: the documentation requirements and the rights of processors…

What can I do to protect my data effectively?

Encrypt it and don’t give it to anyone! But the question is wrongly posed: data protection is no more about protecting data than sun protection is about protecting the sun or disaster protection is about protecting disasters…

Jörg Pohle’s thesis uncovers the history of ideas and the historical construction of the data protection problem and of data protection as its (abstract) solution, including the architecture of its legal implementation, in order to critically assess this construction and to draw conclusions for the design of ICT systems. The thesis reveals the manifold aspects which underlie the analysis of the data protection problem – from concepts of humankind and society, organisations, information technology, and information processing, to concepts, schools of thought, and theories within informatics, information science, sociology, and law, to scientific and pre-scientific assumptions and premises and how they have influenced the specific solution to this problem.

Data protection and consequences for designing ICT systems: read dissertation (PDF)

History and theory of data protection from a computer science perspective and consequences for designing ICT systems

Based on a critical assessment of this historical construction, the thesis concludes that data protection must be rederived as a solution to the information power problem, which has arisen through the industrialisation of social information processing. To this end, the thesis presents an abstract, state-of-the-art data protection attacker model, an analytical framework for a data protection impact assessment, and a procedural operationalisation approach illustrating both the sequence and the substantive issues to be examined and addressed in this process.
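To make the procedural idea more concrete, the following minimal sketch shows how an explicit attacker model could feed a structured threat analysis in code. It is purely illustrative and not taken from the thesis; all class names, fields and example values are hypothetical assumptions.

    from dataclasses import dataclass

    # Purely illustrative sketch (not from the thesis): an explicit attacker
    # model driving a structured data protection threat analysis. All names,
    # fields and categories are hypothetical assumptions.

    @dataclass
    class Attacker:
        name: str                    # e.g. a state agency or a large platform operator
        capabilities: list[str]      # e.g. "links datasets", "builds behavioural profiles"

    @dataclass
    class ProcessingOperation:
        description: str             # what information is processed, and how
        affected_parties: list[str]  # individuals, groups, or society as a whole

    @dataclass
    class Threat:
        operation: ProcessingOperation
        attacker: Attacker
        endangered_good: str         # e.g. "freedom of association", "rule of law"

    def threat_analysis(operations: list[ProcessingOperation],
                        attackers: list[Attacker]) -> list[Threat]:
        """Enumerate every operation/attacker pair so that each combination is
        examined for the individual and societal goods it endangers."""
        threats = []
        for operation in operations:
            for attacker in attackers:
                # Placeholder: in a real assessment the endangered good would be
                # determined case by case, not filled with a constant.
                threats.append(Threat(operation, attacker, "to be determined per case"))
        return threats

Such a structure would only be a starting point: the substantive questions – which goods are endangered, and which safeguards follow – remain analytical work of the kind the thesis describes.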

The thesis then draws conclusions for the design of data-protection-friendly – and not just legally compliant – ICT systems. Further, the thesis clarifies the ways in which many concepts referred to in the privacy, surveillance, and data protection debate are invalid, outdated, or oversimplified. This includes the fixation on personally identifiable information, both in terms of the limitation of the scope of application and as a reference point for lawmaking and ICT design, the patently false but widespread assertion that sensitivity is a property of information, the naive public–private dichotomy, and the so-called privacy paradox.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Jörg Pohle, Dr.

Head of Research Program: Actors, Data and Infrastructures

