The principle of purpose limitation in data protection laws
In his dissertation, Maximilian von Grafenstein examines the principle of purpose limitation in data protection law, which not only protects the autonomy of those affected but also gives data-processing companies sufficient scope to implement data protection optimally as part of their innovation processes. For our dossier on the GDPR, we asked him a few questions:
What inspired you to write your dissertation?
As a lawyer and later in the Startup Clinic at HIIG, I advised and supported startups in implementing legal requirements. It struck me that the legal requirements often stood in contrast to innovation processes in startups. This became particularly clear with the so-called principle of purpose limitation in data protection. This principle requires the processors of personal data to indicate the purposes of their later data processing before collecting any data, and the processors are then bound to these purposes. In the beginning, however, it is often not even 100% clear in what direction a startup is heading, so the purposes of future data processing often cannot be specified at all. My question was therefore how the principle of purpose limitation should actually be interpreted and whether it can be brought into line with innovation processes.
…and what answer did you find?
The surprising result of my work is that the principle of purpose limitation is not an obstacle to innovation processes. On the contrary, as a legal principle it is a very suitable regulatory instrument for innovation processes. The reason for this is that it gives innovative data processors sufficient scope for implementation. This enables them to adapt the legal requirements specifically to their innovation processes and, above all, to their risks.
The principle of purpose limitation is in itself a regulatory instrument open to innovation.
Combined with instruments for co-regulation, the principle of purpose limitation is not only in line with innovation but can even promote innovation under certain circumstances. This is because processors of personal data can reduce the high level of legal uncertainty through co-regulatory instruments. Scientific studies clearly show that increasing legal certainty has a positive effect on innovation processes. In data protection law, this legal certainty is provided by so-called codes of conduct and certificates.
For whom are the outcomes particularly interesting?
The results are particularly interesting for companies that process personal data. My thesis shows ways in which these companies can actually use their compliance with data protection regulations as a competitive advantage. It also helps regulators — especially data protection authorities — to interpret the principle of purpose limitation in such a way that it not only effectively protects against the risks of data processing, but also brings about the competitive advantages of the GDPR, which are often invoked by politicians.
What were your best and worst experiences during the research?
In my opinion, the debate on data protection law suffers from strong polarisation, which makes it difficult to find concrete solutions to problems. The best experiences were definitely the moments when I told actors from science, business or politics about my approach and was able to convey its great potential. Accordingly, the most difficult moments were when I realised that others concerned with data protection were not taking the time to at least understand this approach.
What’s the relevance of your publication for the GDPR?
The dissertation focuses on the principle of purpose limitation as it is implemented in, and should be interpreted under, the GDPR. The principle of purpose limitation also exists in other legal systems and constitutions, for example as a core component of Germany’s fundamental right to informational self-determination and as a (more implicit) component of the right to privacy under the European Convention on Human Rights. In my dissertation I work out the similarities and differences, but the focus is always on the European Charter of Fundamental Rights as well as the GDPR. I was therefore all the more surprised when an expert of the German Research Foundation came to the conclusion that my work does not deal with the principle of purpose limitation of the GDPR – in my opinion, a classic example of the case mentioned above, where people do not really engage with the approach.
What new perspectives on data protection has this opened up for you?
Before my research, I saw data protection as a pure obstacle to innovation processes. In the course of my work, however, I have grown to understand that the GDPR in particular can be viewed as a relatively modern regulatory approach, which is particularly suitable for regulating the often unforeseeable risks of data-driven innovation processes.
In your opinion, is the GDPR an effective tool?
The GDPR can be a very effective instrument. However, it must be implemented accordingly. If it is interpreted according to the classical approach, one runs the risk not only of unnecessarily hindering innovation processes, but also of not providing effective protection. The reason for this is that the GDPR mainly focuses on the time of data collection. In principle, it is designed in such a way that most of the obligations of the processors or rights of the data subjects take effect at the time of data collection. In order to provide effective protection and not unnecessarily hinder innovation processes, however, rights and obligations must at least apply equally during the subsequent use of the data.
What can I do to protect my data effectively?
We as consumers need to be aware of the risks we take when we disclose information about ourselves. For this to be possible, however, the processors of personal data must describe these risks in the information notices or declarations of consent they provide. So far, in most cases, personal data processors have not pointed out these risks sufficiently. For example, it is not enough for a data processor to write that it passes the data on to third parties. Such a statement says nothing about whether the data will be used, for example, to offer the persons concerned personalised advertising, to evaluate their creditworthiness, or to attempt to influence them when they cast a vote. These are very different risks to the autonomous exercise of the fundamental rights of those affected. Data may only be used for such purposes if these risks have been clearly indicated, the data subject has specifically consented to them, or legal regulations permit processing for these purposes. As soon as the person concerned is specifically informed about such risks, it is also reasonable to expect them to decide whether they accept this risk. Of course, they can only make such a decision for themselves, not for others.
Abstract: The principle of purpose limitation in data protection laws
In his dissertation, Maximilian von Grafenstein examines the principle of purpose limitation provided for by data protection law from the perspective of innovation regulation. This approach examines both the risks caused by innovation and whether risk protection instruments are appropriate with respect to their effects on innovation processes.
In light of this approach, the thesis examines, first, the function of the principle of purpose limitation under Article 8 of the ECFR and, second, which regulation instruments best serve to balance the colliding fundamental rights of the data controller and the individual concerned when implementing the principle of purpose limitation in the private sector. According to this analysis, the principle of purpose limitation is a regulation instrument that seeks to protect the individual’s autonomy against the risks caused by the processing of data related to him or her. In this regard, the first component of the principle of purpose limitation requires the controller to specify the purpose of the data processing. This requirement is a precautionary protection instrument obliging the data controller to discover the specific risks that the data processing poses to the individual’s fundamental rights to privacy, freedom, and non-discrimination.
My book is finally available! 670 pages about the concept of data protection law and its principle of purpose limitation – and the result is: this principle is not a barrier to innovation but can even enhance (under certain conditions) innovation processes!
— Max von Grafenstein (@MaxGrafenstein), 12 April 2018
In contrast, the second component, i.e., the requirement to limit the data processing to the preceding purpose, aims to limit the risk caused by the later data processing to the risks previously discovered. Whether a risk caused by the later processing of personal data is compatible or incompatible with the risk previously discovered depends, again, on the individual’s fundamental rights to privacy, freedom, and non-discrimination. However, as a legal principle, the principle of purpose limitation not only protects an individual’s autonomy but simultaneously leaves sufficient room for data controllers to find the best solution for protection with respect to the particularities of the specific case. Combined with co-regulation instruments, this scope of action enables data controllers to turn the principle of purpose limitation into an innovation-enhancing mechanism.