03 May 2021 | doi: 10.5281/zenodo.4733685

Siri’s evil sister. When the Dutch public service steals your data

“System Risk Indication” (SyRI) was deployed by the Dutch government to automatically detect social benefit fraud. The program was shut down due to a severe lack of transparency and a disproportionate collection of data. It demonstrates how the automation of public services fails when not properly implemented.


In 2014, the Dutch Ministry of Social Affairs and Employment initiated a project for detecting social benefit fraud. The program SyRI (“System Risk Indication”) was supposed to serve this goal by means of automation and data collection. Early in 2020, however, the district court of The Hague ordered an immediate stop of the program, and the Ministry followed the judgement a couple of months later. While nobody questioned SyRI’s purpose, its actual implementation and practice were highly criticized: first by NGOs, civil rights organizations, and representatives of the UN, and later by the court. The case of SyRI is representative of how automated decision making in the public sector, and therefore presumably in the public interest, can fail if it lacks the appropriate transparency and attention to privacy.

All they need is all your data

In 2014, SyRI’s legal basis was passed in the form of an amendment to an act from 2001. This amendment regulated which data could be used and how a SyRI project proceeded. It listed 17 different categories of data that were allowed to be used for proactive risk evaluation. SyRI was allowed to cross-reference data about work, fines, penalties, taxes, properties, housing, education, retirement, debts, benefits, allowances, subsidies, permits and exemptions, and more. This data could be taken from a wide range of public authorities, including the Dutch Tax and Customs Administration, the Labour Inspectorate, the Public Employment Service, and others. Even though the Dutch Data Protection Authority (DPA) raised concerns in 2012 and again in 2014, the amendment was passed by the government. One key concern of the DPA was a possible conflict with Article 8 of the European Convention on Human Rights (ECHR), according to which everyone has the right to respect for their private and family life, home and correspondence. This concern already foreshadowed the court’s decision some years later.

SyRI in action

It is difficult to say how often SyRI was actually applied. One difficulty is that there were SyRI-like projects before 2014, such as project Waterproof. While there is information on some of the projects, the government withholds many details. According to the court’s investigation, SyRI or its precursors were used in 22 projects between 2008 and 2014, and in five more SyRI projects from 2015 onwards.

A typical SyRI project started with a request by two or more administrative bodies to the Ministry. The request had to specify the purpose, the required data, and the relevant risk indicators. Once the request had been accepted, the data was collected and personal information was replaced by pseudonyms. After the data had been checked against the risk profile, the decrypted information about the flagged persons was sent to the Ministry for a second analysis. However, the SyRI legislation did not oblige anyone to notify the data subjects individually that a risk report had been submitted. There was only an obligation to announce the start of a SyRI project beforehand by way of publication in the Government Gazette. Only if a data subject explicitly asked for access would the Ministry grant it.
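To make this two-stage procedure more concrete, here is a minimal Python sketch of what a pseudonymize–flag–re-identify pipeline of this kind could look like. It is purely illustrative: the HMAC-based pseudonymization, the record format, and the risk_profile function are assumptions for the example, not the actual SyRI implementation, which was never disclosed.

```python
import hashlib
import hmac

# Hypothetical secret key held by the trusted analysis unit.
# Real key management would be far more involved.
SECRET_KEY = b"illustrative-key-only"

def pseudonymize(citizen_id: str) -> str:
    """Replace a personal identifier with a keyed pseudonym (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, citizen_id.encode(), hashlib.sha256).hexdigest()

def run_project(records: list[dict], risk_profile) -> list[dict]:
    """Illustrative two-stage flow: pseudonymize all records, check them
    against a risk profile, and re-identify only the flagged ones for the
    second analysis at the Ministry."""
    lookup = {}  # pseudonym -> original identifier, kept by the trusted party
    pseudonymized = []
    for record in records:
        pseudonym = pseudonymize(record["citizen_id"])
        lookup[pseudonym] = record["citizen_id"]
        pseudonymized.append({**record, "citizen_id": pseudonym})

    flagged = [r for r in pseudonymized if risk_profile(r)]
    # Only flagged records are re-identified and passed on.
    return [{**r, "citizen_id": lookup[r["citizen_id"]]} for r in flagged]
```

Even in this toy version, the structural point is visible: everyone in the data set is screened, and only the re-identification step is limited to flagged persons.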

Better watch your water use

On what basis were people flagged? What counted as a risk indicator for SyRI? Again, this question is difficult to answer due to a lack of transparency; even the court noted that the government provided almost no information. Allegedly, knowledge of the indicators could be used to game the system. Nevertheless, some indicators can be identified. One SyRI application was the detection of cohabitation fraud: people receiving benefits for singles while in fact living together, which, if registered, would have resulted in less money. SyRI took several types of information as indicators for this kind of fraud. One was the registration of multiple cars under one name within a short period of time. Another was when the waste disposal taxes paid by a single individual seemed more typical of multiple persons. Furthermore, we know that in SyRI’s precursor projects, such as project Waterproof, low water use was taken to indicate that the person in question lived with a partner in another flat. It is possible, even likely, that such indicators were in place again for SyRI.
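As an illustration of what a rule-based risk profile built on such indicators might have looked like, consider the following Python sketch. All field names and thresholds are invented for the example; the real indicators and their weighting were never made public.

```python
def cohabitation_risk(record: dict) -> bool:
    """Toy risk profile echoing the indicators reported for SyRI-style
    cohabitation-fraud checks. Field names and thresholds are invented."""
    indicators = 0

    # Multiple cars registered under one name within a short period.
    if record.get("cars_registered_last_90_days", 0) >= 2:
        indicators += 1

    # Waste disposal taxes more typical of a multi-person household.
    if record.get("waste_tax_eur_per_year", 0) > 300:  # invented threshold
        indicators += 1

    # Unusually low water use, as in project Waterproof, suggesting the
    # person mainly lives in a partner's flat.
    if record.get("water_use_m3_per_year", 50) < 15:  # invented threshold
        indicators += 1

    # Flag when at least two indicators fire.
    return indicators >= 2
```

A function like this could serve as the risk_profile argument in the pipeline sketched above.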

It is worth mentioning that SyRI might not merely have checked for discrepancies in the data, as the government claims. According to the plaintiffs, there are hints that SyRI might have deployed artificial intelligence, more precisely machine learning, to automate the analysis of data points and detect “suspicious” behavior. This, however, could not be proven, since the government did not allow a sufficiently thorough investigation of the software itself.

Automate Public Service – but not like this!

The lack of transparency and data protection led the court to its judgment on SyRI, which was mostly based on Article 8 of the ECHR and the GDPR. Furthermore, the court shared a concern that had already been made public by the UN Special Rapporteur: SyRI was predominantly used in the poorer neighborhoods of cities such as Capelle aan den IJssel, Eindhoven, Haarlem, and Rotterdam.

However, the judgment was not negative in all its aspects:  

[The] court shares the position of the State that those new technological possibilities to prevent and combat fraud should be used. The court is of the opinion that the SyRI legislation is in the interest of economic wellbeing and thereby serves a legitimate purpose as adequate verification as regards the accuracy and completeness of data based on which citizens are awarded entitlements is vitally important.

The judgment casts light on a topic more general than the case of SyRI. Digital welfare states are emerging around the world: states increasingly use new technologies to deliver public services. Done correctly, this can help prevent fraud or render public services more accessible and effective. However, as the case of SyRI shows, when the actual implementation has severe flaws, whether on the legal or the technical level, public services can quickly shift to surveillance and biased targeting.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Sami Nenno

Associated Researcher: AI & Society Lab
