01 April 2019 | doi: 10.5281/zenodo.3086157

Translating law into code – why computer scientists and lawyers must join forces

The GDPR has shown that law and computer science have more in common than one might initially assume. It is therefore all the more important to find a common language for the two disciplines. In this comment, former associate researcher Sibylle Schupp stresses the importance of collaboration between law and computer science. The article was first published in HIIG’s encore 2018.

As academic disciplines, computer science and law are far apart: one is a mathematical or perhaps engineering discipline, the other a social science. From this perspective, there is not much common ground. In practical terms, however, legal provisions already tie the two disciplines together, simply by referring to IT technologies in the legal text. A good example is the GDPR, which refers to “the state of the (technological) art” and even directly calls for “appropriate technical” measures. In fields like privacy protection, then, lawmakers seem to see computer science as an aid. At the same time it is obvious that both disciplines need to find a common language – laws have to be translated into formulas so that technology can implement them.

For privacy regulations, cryptography and the whole toolbox of privacy-enhancing technologies provide software solutions for individual subtasks – but what about an entire app or a legacy system? On this note, I propose a different kind of support: code checking. By code we mean software code and by checking we mean automated compliance checks. In computer science, automated code checks are very common and have a wide range of applications. Algorithms are used to check whether software can really be trusted to do what it sets out to do, but they can also provide proof that a particular piece of software is sticking to its energy or time budget. Algorithms can also check how well software has been tested or how much it has changed.
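To make the idea of an automated compliance check a little more concrete, here is a deliberately simplified sketch in Python. It assumes a hypothetical, hand-made log of processing records and an assumed retention limit, and merely flags entries that lack recorded consent or were kept too long; real checkers operate on program code or execution traces rather than on such a toy list.

```python
from datetime import datetime, timedelta

# Assumed retention policy for this sketch; not a constant taken from the GDPR.
RETENTION_LIMIT = timedelta(days=30)

# Hypothetical processing log: what personal data was touched, whether consent
# was recorded, and when the data was collected and deleted.
records = [
    {"field": "email", "consent": True,
     "collected": datetime(2019, 1, 2), "deleted": datetime(2019, 1, 20)},
    {"field": "location", "consent": False,
     "collected": datetime(2019, 1, 5), "deleted": None},
]

def check_compliance(records):
    """Flag records that lack recorded consent or exceed the retention limit."""
    violations = []
    for r in records:
        if not r["consent"]:
            violations.append((r["field"], "no recorded consent"))
        if r["deleted"] is None or r["deleted"] - r["collected"] > RETENTION_LIMIT:
            violations.append((r["field"], "retention limit exceeded"))
    return violations

print(check_compliance(records))
# [('location', 'no recorded consent'), ('location', 'retention limit exceeded')]
```

Even this toy example shows where the legal questions hide: someone has to decide what counts as consent, and how long data may legitimately be kept.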

So, surely it should be possible to apply these compliance checks to privacy properties and to check (software) code for properties required by (legal) code? Well, it depends. Legal provisions obviously come in English, German or other natural languages, and that wording needs to be made more precise before it can be processed further.

Take again the example of the GDPR, which talks about risks of varying degrees, the time of processing and minimal data. All of this leaves open what is measured by risk, how time is defined and on what scale data sets ought to be minimal. From a computer-science point of view, all these terms need to be formally specified – and that requires legal expertise. But this is not the only thing that is needed.

A lawyer will certainly agree that legal interpretations are needed in all these cases, but the crux is that the algorithmic checker needs to have them in a certain format – as a formula of some kind. For example, they may be needed in the form of temporal, deontic or probabilistic logics, which look at truth over time, or at truth with respect to ethical or statistical laws (a small example is sketched below). Providing such formulas requires knowledge from computer science. So here we go: legal terms must be cast in a language an automated checker can understand, but the computer scientist lacks the legal knowledge of what to cast, and the legal expert lacks the knowledge of what to cast it in. What if the two joined forces? What if each key concept of a privacy regulation – time, risk, data minimisation, but also purpose, consent, impact – were scrutinised from both angles, the legal angle as well as the technological angle of computer science? Then the lawyer would help the computer scientist build a better compliance checker, and the computer scientist would help disambiguate a privacy regulation. It is hard to see any disadvantage here.
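As a rough illustration (my own sketch, not a formula taken from the regulation or the literature), a requirement such as “whenever a data subject withdraws consent, the related data must eventually be erased” could be written in linear temporal logic, with hypothetical predicates withdrawn and erased:

```latex
% A sketch in linear temporal logic (LTL); the predicates are hypothetical.
% G reads "at every point in time", F reads "at some later point in time".
\[
  \mathbf{G}\,\bigl(\mathit{withdrawn}(s) \rightarrow \mathbf{F}\,\mathit{erased}(\mathit{data}(s))\bigr)
\]
```

A checker can then ask whether every possible run of the software satisfies this formula; what counts as “erased”, and within which time frame, remains a legal question that has to be answered before the predicates mean anything.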

Sibylle Schupp, professor of computer science at Hamburg University of Technology, was a visiting researcher at the Alexander von Humboldt Institute for Internet and Society in 2018. In line with her focus on methods for software quality assurance, she works on legal issues and on the missing interdisciplinary exchange between law and computer science.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Sibylle Schupp, Prof. Dr.

Former Associate researcher: Data, actors, infrastructures
