13 October 2017

Serious games as a privacy-by-design instrument

In our digitised society, where interpersonal communication is increasingly based on the processing of personal data, we are confronted more and more with the complexity of data protection concepts. How can such concepts be conveyed? This question arises not only with regard to citizens who are affected by data processing, but also with regard to company employees who process such data. Games can, in principle, convey highly complex matters in an intuitive way! That is why we at the Alexander von Humboldt Institute for Internet and Society (HIIG) organised the “Game Jam: Unveil the Privacy Threat”, at which we developed concepts for serious games with exactly this goal. Maximilian von Grafenstein, head of the research programme “Governance of Data-Driven Innovation”, reports which game ideas were developed for which learning objectives in the field of data protection.

Privacy in a digitised society

How can we explain the complexity of privacy to citizens? How can data-driven companies sensitise their employees to the privacy threats that may be caused by their behaviour? Addressing these questions is a difficult task because privacy does not mean not disclosing any personal information at all. Strict non-disclosure of personal information would be an effective solution, but in a digitised society, where individual communication is increasingly based on the processing of personal data, it is hardly feasible. Privacy thus means controlling the potential risks to individuals caused by a disclosure of personal information. These risks depend on the context in which the information will be used and on the implementation of legal, technical and organisational measures that control such use (i.e. privacy and security by design). This is the inherent reason why attempts to explain the meaning of privacy often fail: the task seems too complex (and also counter-intuitive in light of our everyday behaviour in a digitised world).

The Game Jam Idea

That’s why we at the Alexander von Humboldt Institute for Internet and Society decided to develop games! Games can make very complex matters understandable in a playful and highly intuitive way. However, many games developed for educational purposes in the privacy field are rather boring. Their game mechanics are often so simplistic that most people would only play them if they were forced to do so: play the game or you’re fired! Games like this don’t achieve their actual learning goal of explaining privacy in a genuinely playful and intuitive manner.

In order to create games that intuitively explain the complexity of privacy to their players, we organised, as part of the research project Privacy by Design in Smart Cities, the Game Jam: Unveil the Privacy Threat, which took place last weekend in Berlin. 26 game designers, developers, privacy experts and artists, who came from all over Europe, the Middle East and even South America, worked on six different game concepts during a two-day development marathon. All game concepts were based on three use cases, each of which was presented by a keynote speaker active in this field. In her keynote “I’ve nothing to hide!”, the first speaker, Jillian York, Director of International Freedom of Expression at the Electronic Frontier Foundation, gave five reasons why this statement is wrong. In her keynote “Oops… wrong recipient?”, the second speaker, Michelle Dennedy, Vice President and Chief Privacy Officer at Cisco, illustrated how challenging it is, in globally operating data-driven companies, to sensitise employees to privacy when setting up a properly working privacy-by-design methodology. In his keynote “Unravel the anonymization paradox!”, the third speaker, Jonathan Fox, Director Strategy and Planning, Chief Privacy Office, Security & Trust Organization at Cisco (“the Fox” for short), demonstrated the challenges of data anonymisation. All three use cases were intended to give the game jam participants an initial understanding of the many facets of privacy before they started to develop game concepts addressing one or even all of them. Some participants even worked through the night, supported and guided by several privacy mentors (such as Michelle Dennedy herself)!

Who’s Sherlock

The showdown took place on Sunday evening at 6 pm, when the participants pitched their game concepts to an interdisciplinary expert jury: Lars Vormann from Gamescom, Nico Nowarra from Experimental Game, Meike Kamp from the Berlin Data Protection Authority, Lies van Roessel from the Hans Bredow Institute, Thomas Schildhauer from our institute, Michelle Dennedy, Jillian York, and Eva Schulz-Kamm from Siemens, which sponsored the prize. The prize was a weekend trip to London for the whole winning team, including a visit to The Crystal, the world’s largest exhibition on the future of (smart) cities. So, which game concepts did the participants develop to explain the complexity of privacy? And who was the winner?

The first game concept presented, “Who’s Sherlock”, was a questions-and-answers game that lets its players experience how easy it is to find information about oneself and others on the internet, and how sensitive this kind of information can be in different contexts.

The second game concept, “Privacy Rush”, focused on the moment when an individual wants to connect to a public wifi network. Before the connection is established, this browser game is loaded and appears on the screen, aiming to inform the player (i.e. the wifi user) about the privacy risks caused by an insecure connection, as well as how to better protect him or herself.

The third game concept, “Data Trade”, addressed the complexity of today’s data economy. Taking the perspective of a data trader, this card game demonstrates how different data sets can be combined, fetching a higher price on data markets because they generate more information about individuals. However, this additionally generated information also leads to higher privacy risks, so the players constantly have to implement and adapt protection measures in order to mitigate these risks, because a risky data set is worth less than a safer one (a minimal sketch of this value/risk trade-off follows below).

The fourth game concept, “Nothing to hide”, lets the players experience that the “nothing to hide” argument is wrong.

The fifth game concept, “Smart City”, was a simulation game targeting political decision makers who are responsible for urban development. Taking the perspective of a city mayor, the players can make their cities smarter and smarter by digitising different public areas (the health sector becomes an e-health sector, policing becomes predictive, etc.). At the end of their legislative term, the players’ political decisions are measured against the future scenarios simulated in the game.

The sixth game concept, “Pieces of Data”, seeks to teach its players how to manage third-party access rights to their smartphones, e.g. by app providers. Taking the perspective of an activist in a state that is becoming more and more repressive, the players learn how data collected by their smartphones can harm them, and even other parties, and how to protect themselves against such threats. Because this game was so advanced in its development, it won!
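To make the “Data Trade” trade-off concrete, here is a minimal sketch in Python. The function, names and numbers are hypothetical illustrations of the mechanic described above, not part of the actual game concept: combining data sets increases their information value and thus their market price, but also their privacy risk, while protection measures mitigate that risk.

```python
# Hypothetical scoring rule for a "Data Trade"-style card game (illustration only,
# not taken from the jam): combined data sets yield more information and a higher
# price, but also more privacy risk; a riskier data set is worth less than a safer one.

def combined_value(info_values, risk_scores, mitigation=0.0):
    """Return the market value of a combined data set.

    info_values: information value of each individual data set
    risk_scores: privacy risk contributed by each data set (0..1)
    mitigation:  effect of the protection measures the player has implemented (0..1)
    """
    # Combining data sets generates more information than the sum of its parts ...
    info = sum(info_values) * (1 + 0.2 * (len(info_values) - 1))
    # ... but the combined set also carries more privacy risk, unless mitigated.
    risk = min(1.0, sum(risk_scores)) * (1 - mitigation)
    # A risky data set fetches a lower price than a safer one.
    return info * (1 - risk)


# Two data sets traded without protection measures vs. with strong measures in place:
unprotected = combined_value([10, 15], [0.3, 0.4])                 # -> 9.0
protected = combined_value([10, 15], [0.3, 0.4], mitigation=0.8)   # -> 25.8
print(unprotected, protected)
```

In this sketch, the same combination of data sets is worth considerably more once protection measures are in place, mirroring the incentive the card game builds on: players profit from combining data only if they also invest in protecting it.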

However, all of the game concepts impressively demonstrated that games can serve as a highly effective tool in a privacy-by-design framework: they are not just useful for explaining the complexity of privacy to citizens. In fact, games can even help companies meet data protection requirements, for instance when it comes to educating employees in data-driven companies. It is even possible that games will ultimately support, or even substitute for, the lengthy legal texts that individuals purport to agree with by clicking on a box as part of the informed consent. Based on these experiences, we are eager to get to work on making these scenarios a reality, because this truly is effective privacy by design!

This post reflects the opinion of the authors and neither necessarily nor exclusively the opinion of the institute. For more information on the content of these posts and the associated research projects, please contact info@hiig.de

Maximilian von Grafenstein, Prof. Dr.

Associated Researcher, Co-Head of Research Programme
