20 December 2016

Stop counting, start thinking!

This summer, the United Nations (UN) published its 9th E-Government Survey, which since 2001 has produced a ranking of all 193 member states in the field of electronic government. This follows a trend of trying to measure political topics such as freedom of expression, freedom on the net or the rule of law objectively and to sort them into global rankings. Using the UN's E-Government Survey as an example, this post shows why that approach rests on assumptions that are hard to reconcile with reality.

Once upon a (recent) time, there were indices that gave birth to rankings…

In August of this year, the UN published its 9th E-Government Survey since 2001, which – based on an eGovernment index – is supposed to assess and consequently rank (on an ordinal scale) the status and development of eGovernment in all 193 UN member states. And herein lies the problem, because this survey is not really assessing data. What it is actually doing is compiling and packaging data (which is a valuable endeavour in its own right). Yet this survey does not assess data in the sense of assigning meaning to it or critically reflecting upon it.

And in fact, this survey is in good company with other indices and their rankings that have sprung up in recent years to measure, on a global level, the status of democracy, freedom on the net, or the rule of law – to name just a few. The UN's e-government survey is therefore an exemplary case for demonstrating how all these global ranking exercises come with severe limitations concerning the insight we can actually draw from them.

UN’s E-government survey: temptations and myths

So let’s have a closer look at this survey. Its assessments, or better yet, data compilations are based on the results of the E-Government Development Index (EGDI). The EGDI itself is a composite of three sub-indices: the Online Service Index (OSI), which looks at the scope and quality of online services; the Telecommunication Infrastructure Index (TII), which is supposed to capture the current state of ICT infrastructure; and the Human Capital Index (HCI), which is intended to provide data about educational aspects such as adult literacy or mean years of schooling. The weighted average of the normalised scores of these three sub-indices establishes the EGDI, which ultimately results in an eGovernment ranking of all 193 UN member states.
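To make the construction concrete, here is a minimal sketch of how such a composite index can be computed as a weighted average of normalised sub-index scores and then turned into an ordinal ranking. The equal weights, country names and all values are illustrative assumptions, not the survey's published methodology.

```python
# Minimal sketch of a composite index in the style of the EGDI:
# a weighted average of normalised sub-index scores, then an ordinal ranking.
# Weights and values are illustrative assumptions, not the UN methodology.

def composite_index(osi, tii, hci, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Weighted average of three sub-indices, each assumed normalised to [0, 1]."""
    return weights[0] * osi + weights[1] * tii + weights[2] * hci

# Hypothetical normalised sub-index scores for two fictional countries
countries = {
    "Country A": {"osi": 0.85, "tii": 0.70, "hci": 0.90},
    "Country B": {"osi": 0.60, "tii": 0.40, "hci": 0.75},
}

# The resulting ordinal ranking: higher composite score = better rank
ranking = sorted(countries, key=lambda c: composite_index(**countries[c]), reverse=True)
for rank, name in enumerate(ranking, start=1):
    print(rank, name, round(composite_index(**countries[name]), 3))
```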

The tempting thing with indices and rankings like the EGDI is that they come with a twofold promise: first, to represent objectively measured facts resulting in objective truth, free of any element of arbitrary, subjective interpretation; and second, to provide information about development over time (in this case the eGovernment status of a particular country) and to allow for comparison with other units of analysis (in this case, other countries). Yet, as is often the case with temptations, at second glance there appear to be some serious downsides. For rankings like the EGDI, those downsides come in the form of two myths about measuring and counting that are often attributed to quantitative analysis.

The myth of measuring and counting

First, there is the myth that the objective truth is measurable and countable. True, in theory I can measure anything, but in reality, physical objects are much more quantifiable – able to be counted and measured – than sentiments, perceptions or values. It is easier to count the number of online services, broadband subscriptions or mean years of schooling than to measure the willingness for, openness to or acceptance of eGovernment applications. Sure, I can assign a numerical value to anything. But are my measurements credible if, for example, I try to measure the impact of eGovernment on democracy by the number of queries submitted online, as does the eGovernment Economics Project (eGEP), another eGovernment ranking conducted, this time, by the European Union (EU)?

The ugly truth is that I cannot directly measure social phenomena the way I can measure global warming. Unfortunately, eGovernment is not a physical object, readily quantifiable, but an analytical concept that describes the impact of ICTs on social coordination processes, such as the distribution of collective goods or the implementation of collectively binding rules in accordance with (democratic) values such as participation, effectiveness or efficiency. That, of course, always comes with the analytic territory of subjective interpretation and judgment. I can count the number of services provided online, but I surely cannot count the social or political relevance associated with or resulting from this. Of course this is unsettling, as it strips us of the prospect that social phenomena can be measured as precisely and objectively as natural scientific phenomena and that the results of such measurement research, like (eGovernment) rankings, can claim to represent objective truth.

If we see, however, critical reflection and (subjective) interpretation not as outdated, but as vital analytical instruments for understanding the world – or as in this case, for capturing the implications of ICTs on government – we might get closer to an understanding of reality. Yet, this has to come with the acceptance that data is not knowledge by itself and that analytical reflection cannot be restricted to methodological questions of data collection.

The myth of universal metrics

The second myth concerns the universal metric we apply when measuring individual indicators. Admittedly, this sounds very abstract, so let’s get straight to an example. As mentioned above, one of the constitutive sub-indices of the EGDI is the Online Service Index (OSI). While the EGDI does not specify which services the OSI covers, we can assume that it is based on a numerical counting of online services. The score is most likely (the information on this aspect is also very scarce) derived by measuring some qualitative aspects of these services, such as user friendliness, which are then translated into a specific score for the respective service on the OSI. So, for example, in the case of a digital land registry, the question would be whether this service is accessible online and whether it is provided in a way that citizens can easily use. This individual score is then aggregated with the scores of the OSI’s other indicators, leading to the overall OSI score and the resulting ranking a country achieves.
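Since the survey publishes little detail on this step, the following sketch is purely hypothetical: it shows one way per-service scores (availability plus a usability rating) could be aggregated into an OSI-style value. The services, criteria and numbers are invented for illustration.

```python
# A hypothetical sketch of how per-service scores might feed into an
# OSI-style sub-index. Criteria, weights and services are assumptions made
# for illustration; the survey does not publish this level of detail.

from statistics import mean

def service_score(available_online: bool, usability: float) -> float:
    """Score a single public service: 0 if offline, otherwise its usability
    rating (0.0 to 1.0, e.g. derived from an expert assessment)."""
    return usability if available_online else 0.0

# Fictional set of services for one country
services = {
    "land registry": service_score(available_online=True, usability=0.8),
    "tax filing": service_score(available_online=True, usability=0.6),
    "business registration": service_score(available_online=False, usability=0.0),
}

# Aggregate the per-service scores into a single OSI-style value
osi_like_score = mean(services.values())
print(round(osi_like_score, 3))  # 0.467 for these invented numbers
```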

Let’s take a step back to consider what this implies. Basically, it assumes that there exists something like a universal metric for, or relevance of, digital land record systems for eGovernment. From this perspective, the metric of the indicator “digital land record system” for determining a country’s eGovernment status is always the same, regardless of whether that country is agrarian or industrialized. The actual scoring of countries on this indicator can of course differ, depending on whether they provide online services for their land record systems and whether those services are user friendly or not. But the weighting, or significance, assigned to the indicator “digital land record system” in relation to the other indicators that make up the OSI is always the same.

The problem is that there is no such thing as a universal or uniform metric or significance of digital land record systems for the eGovernment status of a country. Put simply: in an OECD context it is surely positive to have online services related to land records. Yet it is more a matter of convenience than of crucial importance, as functioning (analog) land registries are the norm and the number of conflicts over land is relatively low. In developing countries, however, where major parts of the economy are still based on agriculture and thereby tied to land ownership, contestation over property as well as the common practice of “land grabbing“ constitute major fault lines within these societies (in India, for example, 70% of court cases involve disputes over land). Against such a backdrop, a functioning and trusted digital land records system constitutes (or would constitute) a vital component of (e)government.

This list of the context-specific relevance of digital public services and systems can easily be extended. Digital financial and taxation systems have a much higher significance in developing countries, which often suffer from chronically low rates of tax collection. Digitally enabled and secured public procurement systems gain much more relevance in contexts with a long history of fraudulent interference and self-enrichment. Global rankings and indices of eGovernment like the EGDI cannot accurately convey the relative significance of these indicators. Either they conceptualize these indicators in view of their relevance in and for government systems in the OECD world, leading to a distortion in assessing the importance of these online services and systems in developing contexts, or they take the context conditions of non-OECD countries as their conceptual starting point for defining the metric of eGovernment indicators, which inevitably leads to distorting effects in measuring the eGovernment status of OECD countries.
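A toy calculation makes this dilemma tangible: with the same raw indicator scores, the ranking of two fictional countries flips depending on whether the weights are modelled on OECD priorities or on an agrarian context. All indicators, weights and scores below are invented for the example.

```python
# Toy illustration of the weighting problem: identical indicator scores yield
# different rankings under different context-specific weights. All numbers,
# indicators and weights are invented for illustration.

indicators = ("land_records", "open_data_portal", "e_participation")

# Hypothetical indicator scores (0-1) for two fictional countries
scores = {
    "Industrialised country": {"land_records": 0.3, "open_data_portal": 0.9, "e_participation": 0.8},
    "Agrarian country": {"land_records": 0.9, "open_data_portal": 0.4, "e_participation": 0.3},
}

def weighted_score(country_scores, weights):
    """Sum of indicator scores weighted by their assumed contextual relevance."""
    return sum(country_scores[i] * weights[i] for i in indicators)

# One metric modelled on OECD priorities, one on an agrarian context
oecd_weights = {"land_records": 0.2, "open_data_portal": 0.4, "e_participation": 0.4}
agrarian_weights = {"land_records": 0.6, "open_data_portal": 0.2, "e_participation": 0.2}

for label, weights in (("OECD-style metric", oecd_weights), ("Agrarian-context metric", agrarian_weights)):
    ranking = sorted(scores, key=lambda c: weighted_score(scores[c], weights), reverse=True)
    print(label, ranking)  # the order of the two countries flips between the metrics
```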

Much ado about nothing?

So where does this leave us? It might help to remember that the greatest thinkers of the past innovated without having any global indices and rankings at their disposal. Better still, today we have more data and statistics available than former generations of academics and thinkers could have imagined. And that is exactly the real value of all these indices and derived rankings: to provide data that spawns rational inquiry. It is through the EGDI that we have become aware of the increase in open data portals or online one-stop-shop approaches on a global level. But they can only serve as a starting point or raw material for further analysis and reflection. Indices and rankings provide no knowledge per se, regardless of how methodologically refined they have become. At best, they are a starting point, not the end point of, or a functional equivalent for, meaningful research.

Rüdiger Schwarz is an associated researcher at HIIG and recently completed a research stay at @iLabAfrica at Strathmore University in Kenya. His thematic interest focuses on the varying impacts of the internet on public institutions in countries beyond the OSCE area, with a special focus on Africa. He is currently working on his dissertation on e-government in Kenya.

Photo: flickr.com, CC BY-SA 2.0

This post reflects the opinion of the authors and neither necessarily nor exclusively the opinion of the institute. For more information about the content of these posts and the associated research projects, please contact info@hiig.de

Rüdiger Schwarz

Former Associated Researcher: Global Constitutionalism and the Internet
