Nothing to hide?
In our networked and digitalised society, personal data is the new gold. More and more, we have to grapple with complex topics such as privacy and data protection, especially in the world of work. How can these concepts best be explained? At the “Game Jam: Unveil the Privacy Threat”, developers, designers, researchers and creatives came together to convey these issues intuitively through serious games. For the first part of our review, HIIG researcher Maximilian von Grafenstein has summarised which game ideas were actually developed. In the second part, we show the best #PrivacyJam moments:
https://twitter.com/jilliancyork/status/916591963956277248
Before the teams started developing their game concepts, three use cases helped all participants recall the many facets of privacy.
@jilliancyork #PrivacyJam #Berlin @hiig_berlin “I’ve got nothing to hide” think again on the #4points + bonus! pic.twitter.com/GkAsTPEGhQ
— Lorena Marciano (@MarcianoLorena) October 7, 2017
Use case 1: I’ve got nothing to hide!
In her keynote, Jillian York (Director of International Freedom of Expression at the Electronic Frontier Foundation) gave five reasons why the statement “I’ve got nothing to hide!” is wrong.
Many people think they have nothing to hide and therefore do not need to protect their privacy. The reasoning behind this is that only people who have done something illegal would want to conceal their behavior; and, so the thinking goes, there is no social need to protect illegal behavior!
However, this only covers a tiny part of privacy protection. In fact, privacy law doesn’t just protect those (allegedly) engaged in “illegal” behavior. Privacy also protects a person against the loss of reputation that can occur when information is disclosed and/or used in the wrong context. As early as the 17th century, the French cardinal and statesman Richelieu stated: “Give me a letter of six sentences written by the most honorable man, and I will find something sufficient to hang him.” Of course, we don’t hang people anymore. However, what this kind of reasoning indicates is that there can always be somebody who wants to use personal information against someone. It is this misuse of personal information that privacy seeks to protect against. Some people even say that it doesn’t actually matter what you may have to hide; what matters is your ability to decide whether to hide something or not. This ability is guaranteed by privacy. Privacy is hence an essential precondition for the development of an autonomous personality.
Use case 2: Oops… wrong recipient!
The second speaker, Michelle Dennedy (Cisco), illustrated how challenging it is to sensitise employees to privacy.
The most common privacy threat in companies arises when an employee accidentally sends personal information about somebody else to the wrong recipient. This may sound trivial, but in fact it is one of the biggest challenges in implementing effective privacy protection policies within companies.
There are two typical scenarios that give rise to slightly different challenges for companies trying to mitigate this threat: in the first, an employee uses an email client; in the second, he or she grants access to a file repository. In both cases, the employee typically sends an email, or grants access to the repository, to the wrong recipient because the client incorrectly autocompletes the address based on the first few letters. The employee forgets to double-check the name and… oops, the information goes to the wrong recipient. The cases differ, however, in how the employee can react. If access to a file repository was granted, he or she can, in principle, still restrict it retrospectively. If an email was sent, the information is definitely “gone”, and the employee can only ask the recipient not to open or read the content. In both cases, however, employees often do not react at all, or not appropriately, because they fear negative consequences if their colleagues or superiors find out about the mistake.
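To make this asymmetry tangible, here is a minimal Python sketch. The data structures, file names and addresses are hypothetical and stand in for whatever email or repository system a company actually uses; the point is only that a wrongly granted repository permission can still be withdrawn, while a sent email has left the sender’s control.

```python
# Minimal sketch (hypothetical data structures, not a real email or repository API)
# contrasting the two scenarios: repository access can still be revoked,
# a sent email cannot be recalled.

repository_acl = {"salary_report.xlsx": {"anna@example.com", "wrong@example.com"}}

def revoke_access(document: str, recipient: str) -> None:
    """Retrospectively withdraw a permission that was granted by mistake."""
    repository_acl[document].discard(recipient)

sent_emails = []  # once appended, a message has left the sender's control

def send_email(recipient: str, body: str) -> None:
    sent_emails.append({"to": recipient, "body": body})
    # No undo here: the sender can only ask the recipient not to open it.

send_email("wrong@example.com", "personal data about a colleague ...")  # already gone
revoke_access("salary_report.xlsx", "wrong@example.com")  # still possible afterwards
```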
Use case 3: Unraveling the anonymity paradox!
Jonathan Fox (Cisco) demonstrated the challenges of data anonymisation in his keynote “Unravel the anonymization paradox!”.
If personal information were anonymized, all our privacy concerns would be gone! But what does “anonymized” mean? This question is one of the hardest to resolve in the privacy debate. At present, privacy experts are grappling with a paradox: in the big data era, there is no anonymous data anymore, because all data can always be related to an individual by means of data analysis technologies. Yet data only counts as “anonymized” if it cannot be related to an identified or even an identifiable individual.
In order to understand this paradox, imagine that more than three million Berlin citizens – and another million tourists – carry their personal devices around every single day. Imagine a Berlin-wide wifi system that is publicly available to everyone whose device has wifi switched on. This wifi system collects the movement data of all these devices over a longer period of time. Can you imagine how useful this data would be for urban traffic management and many other innovations? But wouldn’t it be creepy if this data could also be misused against an individual later on? Imagine that the data is therefore anonymized in order to mitigate these risks: all personal identifiers captured by the wifi system (i.e. the MAC address and IMEI), which could in principle lead to an identification of the owner or even the carrier of a device, are “hashed”, i.e. substituted by a specific hash value for each identifier. This hash value does not, per se, contain information referring to the owner or carrier of the device. However, it is still possible to capture the device’s movement pattern by referring to this hash, and this pattern becomes more and more precise over time.

Now imagine a person who gets access to that movement pattern (e.g. an employee of the provider of the wifi system or of another data-driven company) and suddenly discovers that a particular device must be owned by somebody he knows very well: it “leaves” the building where he lives at the same time every morning and “moves” to an address where only lawyers work. In terms of probability, that person must be his wife!
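To make the hashing step concrete, here is a minimal Python sketch with hypothetical device identifiers, times and locations (not the actual system described above): the MAC address is replaced by a hash value, yet all records of one device still share that value, so its movement pattern remains linkable.

```python
import hashlib

# Hypothetical excerpt of the wifi system's raw log: (identifier, time, location)
raw_log = [
    ("AA:BB:CC:DD:EE:01", "08:02", "residential building, Example Street 1"),
    ("AA:BB:CC:DD:EE:01", "08:47", "office building used only by law firms"),
    ("AA:BB:CC:DD:EE:02", "09:15", "central station"),
]

def pseudonymise(identifier: str) -> str:
    """Substitute a MAC address or IMEI with a hash value.

    Without a secret key or salt, anyone who knows a MAC address can
    recompute this hash and re-identify the device.
    """
    return hashlib.sha256(identifier.encode()).hexdigest()[:12]

hashed_log = [(pseudonymise(dev), time, place) for dev, time, place in raw_log]
for record in hashed_log:
    print(record)

# The identifiers are gone, but every record of one device carries the same
# hash value, so its daily movement pattern is still fully linkable; combined
# with background knowledge it can point to a specific person.
```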
This risk of re-identifying “anonymized” data exists wherever the data can be combined with further information. It is hard to say which information will be added, what the consequences of an identification would be, and under which conditions this risk is low enough to be socially acceptable.
Brainstorming time! The teams are set up and already working on their ideas for the @hiig_berlin #privacyjam #GameJam pic.twitter.com/a5FDh03vkV
— Booster Space (@Booster_Space) October 7, 2017
#PrivacyJam #GameJam It takes a village, and beer, to build a data centric game. pic.twitter.com/CNJ1J65x1J
— Michelle Finneran Dennedy, JD (@mdennedy) October 7, 2017
It's getting dark, but there's no time to waste and our Jamers are eager to go on! #privacyjam #gamejam @hiig_berlin pic.twitter.com/Yw2wetlkHy
— Booster Space (@Booster_Space) October 7, 2017
Full house #privacyjam with @mdennedy en vogue for a 48h ride. #gamejam #privacy #nothingtohide #gaming #love #sweatergoals pic.twitter.com/EIIvXQjiAz
— Katharina Beitz (@katharina_beitz) October 8, 2017
The winners are … ‘the activists’ @hiig_berlin @Cisco_Germany #privacyjam pic.twitter.com/Hg323bYZ4R
— Klaus Lenssen (@klaus_lenssen) October 8, 2017
The #end of #privacyjam thanks to @hiig pic.twitter.com/62aqIWSNxH
— Booster Space (@Booster_Space) October 8, 2017
And the winners are…
Taking the perspective of an activist in a repressive state, the players of “Pieces of Data” learn how data collected by their smartphones can harm them, and even other parties – and how to protect themselves against such threats. “Because this game was so advanced in its development, it won!”, Maximilian von Grafenstein states.
This Game Jam was part of our research project Privacy by Design in Smart Cities.
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.