
Information Technology and Law Series IT&LAW 32

Regulating
New Technologies
in Uncertain Times

Leonie Reins Editor


Information Technology and Law Series

Volume 32

Editor-in-chief
Simone van der Hof, eLaw (Center for Law and Digital Technologies),
Leiden University, Leiden, The Netherlands

Series editors
Bibi van den Berg, Institute for Security and Global Affairs (ISGA),
Leiden University, The Hague, The Netherlands
Gloria González Fuster, Law, Science, Technology & Society Studies (LSTS),
Vrije Universiteit Brussel (VUB), Brussels, Belgium
Eleni Kosta, Tilburg Institute for Law, Technology, and Society (TILT),
Tilburg University, Tilburg, The Netherlands
Eva Lievens, Faculty of Law, Law & Technology, Ghent University,
Ghent, Belgium
Bendert Zevenbergen, Center for Information Technology Policy,
Princeton University, Princeton, USA
More information about this series at http://www.springer.com/series/8857
Leonie Reins
Editor

Regulating New
Technologies in Uncertain
Times

Editor
Leonie Reins
Tilburg Institute for Law, Technology,
and Society (TILT)
Tilburg University
Tilburg, The Netherlands

ISSN 1570-2782 ISSN 2215-1966 (electronic)


Information Technology and Law Series
ISBN 978-94-6265-278-1 ISBN 978-94-6265-279-8 (eBook)
https://doi.org/10.1007/978-94-6265-279-8
Library of Congress Control Number: 2018965892

Published by T.M.C. ASSER PRESS, The Hague, The Netherlands www.asserpress.nl


Produced and distributed for T.M.C. ASSER PRESS by Springer-Verlag Berlin Heidelberg

© T.M.C. ASSER PRESS and the authors 2019


No part of this work may be reproduced, stored in a retrieval system, or transmitted in any form or by any
means, electronic, mechanical, photocopying, microfilming, recording or otherwise, without written
permission from the Publisher, with the exception of any material supplied specifically for the purpose of
being entered and executed on a computer system, for exclusive use by the purchaser of the work.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.

This T.M.C. ASSER PRESS imprint is published by the registered company Springer-Verlag GmbH, DE
part of Springer Nature
The registered company address is: Heidelberger Platz 3, 14197 Berlin, Germany
Series Information

The Information Technology & Law Series was an initiative of ITeR, the national
programme for Information Technology and Law, which was a research programme
set up by the Dutch government and The Netherlands Organisation for Scientific
Research (NWO) in The Hague. Since 1995 ITeR has published all of its research
results in its own book series. In 2002 ITeR launched the present internationally
orientated and English language Information Technology & Law Series. This
well-established series deals with the implications of information technology for
legal systems and institutions. Manuscripts and related correspondence can be sent
to the Series’ Editorial Office, which will also gladly provide more information
concerning editorial standards and procedures.

Editorial Office

T.M.C. Asser Press


P.O. Box 30461
2500 GL The Hague
The Netherlands
Tel.: +31-70-3420310
e-mail: press@asser.nl
Simone van der Hof, Editor-in-Chief
Leiden University, eLaw (Center for Law and Digital Technologies)
The Netherlands
Bibi van den Berg
Leiden University, Institute for Security and Global Affairs (ISGA)
The Netherlands
Gloria González Fuster
Vrije Universiteit Brussel (VUB), Law, Science,
Technology & Society Studies (LSTS)
Belgium
Eleni Kosta
Tilburg University, Tilburg Institute for Law, Technology, and Society (TILT)
The Netherlands
Eva Lievens
Ghent University, Faculty of Law, Law & Technology
Belgium
Bendert Zevenbergen
Princeton University, Center for Information Technology Policy
USA
Contents

Part I Introduction
1 Regulating New Technologies in Times of Change . . . . . . . . . . . . . 3
Ronald Leenes
2 Regulating New Technologies in Uncertain Times—Challenges
and Opportunities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Leonie Reins

Part II New Technologies and Impacts on Democratic Governance


3 Between Freedom and Regulation: Investigating Community
Standards for Enhancing Scientific Robustness of Citizen
Science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Anna Berti Suman
4 Human Rights in the Smart City: Regulating Emerging
Technologies in City Places . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Tenille E. Brown
5 Automated Driving and the Future of Traffic Law . . . . . . . . . . . . . 67
Nynke E. Vellinga
6 Coercive Neuroimaging Technologies in Criminal
Law in Europe . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Sjors L. T. J. Ligthart

Part III New Technologies and Market Regulation


7 Planting the Seeds of Market Power: Digital Agriculture,
Farmers’ Autonomy, and the Role of Competition Policy . . . . . . . . 105
Tom Verdonk


8 Sharing Data and Privacy in the Platform Economy:
The Right to Data Portability and “Porting Rights” . . . . . . . . . . 133
Silvia Martinelli
9 Regulating Smart Distributed Generation Electricity
Systems in the European Union . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
Theodoros G. Iliopoulos

Part IV The Data in New Technologies—The Utilization
of Data and the Protection of Personal Data
10 A Public Database as a Way Towards More Effective
Algorithm Regulation and Transparency? . . . . . . . . . . . . . . . . . . . 175
Florian Wittner
11 Access to and Re-use of Government Data and the Use
of Big Data in Healthcare . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
Miet Caes
12 The Challenges of Risk Profiling Used by Law Enforcement:
Examining the Cases of COMPAS and SyRI . . . . . . . . . . . . . . . . . 225
Sascha van Schendel
13 Regulating Data Re-use for Research: The Challenges
of Innovation and Incipient Social Norms . . . . . . . . . . . . . . . . . . . . 241
Hannah Smith
14 European Cloud Service Data Protection Certification . . . . . . . . . . 261
Ayşe Necibe Batman
15 Data Privacy Laws Response to Ransomware Attacks:
A Multi-Jurisdictional Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
Magda Brewczyńska, Suzanne Dunn and Avihai Elijahu

Part V Conclusion
16 Concluding Observations: The Regulation of Technology—What
Lies Ahead—And Where Do We Want to End Up? . . . . . . . . . . . . 309
Leonie Reins
Editor and Contributors

About the Editor

Leonie Reins is an Assistant Professor at the Tilburg Institute for Law, Technology,
and Society (“TILT”) at Tilburg University in the Netherlands. Previously, she was a
Postdoctoral Researcher at KU Leuven, Belgium, where she also wrote her Ph.D.
thesis on the coherent regulation of energy and the environment in the EU. Leonie
completed an LL.M. in Energy and Environmental Law at KU Leuven, and
subsequently worked for a Brussels-based environmental law consultancy, providing
legal and policy services for primarily public sector clients. Leonie’s research
focuses on the intersections of international and European energy, climate and
environmental law.

Contributors

Ayşe Necibe Batman Frankfurt am Main, Germany


Magda Brewczyńska Tilburg Institute for Law, Technology, and Society (TILT),
Tilburg University, Tilburg, The Netherlands
Tenille E. Brown Faculty of Law, University of Ottawa, Ottawa, Canada
Miet Caes Leuven Institute for Healthcare Policy, Leuven, Belgium
Suzanne Dunn Faculty of Law, University of Ottawa, Ottawa, Canada
Avihai Elijahu Faculty of Law, University of Haifa, Kiryat Shmona, Israel
Theodoros G. Iliopoulos Hasselt University, Hasselt, Belgium
Ronald Leenes Tilburg Institute for Law, Technology, and Society (TILT),
Tilburg University, Tilburg, The Netherlands


Sjors L. T. J. Ligthart Department of Criminal Law, Tilburg Law School,
Tilburg University, Tilburg, The Netherlands
Silvia Martinelli University of Turin, Turin, Italy
Leonie Reins Tilburg Institute for Law, Technology, and Society (TILT), Tilburg
University, Tilburg, The Netherlands
Hannah Smith Centre for Health, Law, and Emerging Technologies, University
of Oxford, Oxford, UK
Anna Berti Suman Tilburg Institute for Law, Technology, and Society (TILT),
Tilburg University, Tilburg, The Netherlands
Sascha van Schendel Tilburg Institute for Law, Technology, and Society (TILT),
Tilburg University, Tilburg, The Netherlands
Nynke E. Vellinga Faculty of Law, University of Groningen, Groningen, The
Netherlands
Tom Verdonk Institute for Consumer, Competition & Market, University of
Leuven (KU Leuven), Leuven, Belgium
Florian Wittner Department of Law, Hans-Bredow Institute for Media Research
at the University of Hamburg, Hamburg, Germany
Part I
Introduction
Chapter 1
Regulating New Technologies in Times
of Change

Ronald Leenes

Contents

1.1 Introduction........................................................................................................................ 3
1.2 Back to the Future............................................................................................................. 5
1.3 Regulating Technology ..................................................................................................... 7
1.4 Connecting the Dots.......................................................................................................... 11
1.5 Solutions ............................................................................................................................ 13
1.6 Conclusion ......................................................................................................................... 15
References .................................................................................................................................. 16

Abstract This chapter provides an introduction to the overarching topic and
question of this volume on how and whether to regulate new technologies in times
of change. It introduces the regulating technology (development) model.

Keywords regulation · technology · innovation · Law of the Horse

This is an extended and adapted version of the keynote presented at the Ph.D. symposium at
Tilburg University on 14 June 2018.

R. Leenes (✉)
Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University, Tilburg,
The Netherlands
e-mail: r.leenes@uvt.nl

1.1 Introduction

Let me start by looking back at an earlier point in my career. We had just survived
the Millennium Bug and “Internet” was still written with a capital I. In fact, the
internet as we now know it was less than five years old. I was teaching in the
department of Public Administration at Twente University. My courses dealt with
law and ICT, and with IT and public governance. My students were really excited by the
new opportunities offered by email and the World Wide Web. Social media did not
yet exist, and online music and video were of low quality. Yet, my students saw the
prospects of the emerging technologies and were eager to produce course papers
about e-commerce and e-government. They had to focus on the legal aspects of
these developments, and many seemed to follow similar arguments: a new product
or service is emerging, such as online shopping; this (type of) service is not
mentioned in the law; hence we need new rules, new law. Law has to adapt to this
new reality.1 Oftentimes, this conclusion that the law needs to be updated as a result
of new technologies was presented as obvious.2 The argument, or rather the claim,
put forward by my students was as follows: “We face a new technology, in this case
the internet, or a service on the internet, such as e-commerce. The law is silent on
these topics, which makes total sense because it is likely outdated and lawyers are
old-fashioned anyway. Why? Well, let’s face it, the law is paper-based. Besides, it
was developed for other circumstances and other phenomena and is created by
people who don’t understand modern times. Hence, we almost certainly need new
law, new rules.”
As I said, I was still young, knew little of technology law, and was a bit prone to
following this line of reasoning. However, I was also sufficiently versed in
traditional law to suspect flaws in their lines of reasoning.3 Maybe (surely) the legal
system is flexible enough to cope with these so-called novelties. After all, how
different is e-commerce from distance selling such as the kind based on catalogues?
Why would ordering goods by telephone (or heaven forbid by fax) from a store be
different to ordering stuff online? And yes, even in the old days, one could order
goods from stores in other countries. Why would civil law, in particular contract
law, not be applicable or be outdated? Why would the regulation pertaining to
distance selling, which had been around for a while, not suffice? Why would
concepts such as agreement, contract, default, tort, etc. not do? Should we not first
explore whether they do, before jumping to the conclusion that we need new law?
With that harsh message and homework, my students went to the library and the
drawing board in order to think-tank on the issues at hand and the adequacy of
existing concepts and mechanisms.

1 Not only students struggled with the fit of the normative framework to changing reality; legislators around the globe also pondered whether the online world (urgently) requires new law. For The Netherlands, see Ministerie van Justitie 1997–1998. See also Koops et al. 2006.
2 This may be a result of the engineering mindset of my students, who had enrolled in a technical university.
3 I had read Frank Easterbrook’s lecture at the 1996 Cyberlaw conference entitled ‘Cyberspace and the Law of the Horse’, 1996 U Chi Legal F 207, which conveys the message that “the best way to learn the law applicable to specialized endeavors is to study general rules”.

1.2 Back to the Future

After my move to Tilburg University, I became more and more intrigued by the
relation between technology and regulation. It will probably not surprise you that
the patterns I observed in Twente also surfaced in Tilburg.4 Anytime a new
technology materialises, or when innovators and entrepreneurs come up with a novel
way of doing business, calls for regulatory changes can be heard. These voices do
not only come from students and Ph.D. students, who by definition still have a lot to
learn, but also from developers, engineers, policymakers, and the odd scientist, who
may quickly arrive at the conclusion that there is a regulatory disconnect5 in need of
fixing.
Many people seem to suffer from the ‘Flawed Law Syndrome’: the urge to call
law or regulation outdated or flawed (disconnected) and the desire to fix the
problems by addressing the law, rather than using other ways to mend the assumed
gaps (‘Legal Solutionism’).
Of course, industry will also complain that the law needs to be changed.6
Industry typically brings forward two claims regarding the regulatory framework in
their domain: one, that they are unduly constrained and two, that the rules are
unclear. This seems to be the knee-jerk reaction every time a new technology
emerges, rather than exploring the actual state of the art with respect to the
technology and the law.7
We clearly see this ‘call-to-regulate’ reflex in the field of self-driving vehicles,
where Google (currently Waymo), and the car industry more generally, call for
regulation.8 A similar response can be seen with regard to “newish” ‘taxi-like’
services with human drivers, such as Uber, where the new services themselves
strongly urge regulation of the field.9 Of course, by regulating the field, they
mean “in a manner that is to their advantage”. Uber’s concerns are different from
Waymo’s, but the source of the issues is the same in both cases: there is a regulatory

4 So much for the hypothesis that the engineering mindset of students at a technical university was the cause of their legal solutionism. The term Solutionism was introduced by Morozov 2013a.
5 Brownsword 2008.
6 See, for instance, http://www.drivingsales.com/news/google-and-auto-executives-urge-congress-to-develop-national-self-driving-car-regulations/; http://nhv.us/content/16024540-uber-urges-nh-lawmakers-introduce-statewide-regulations-ride. Last accessed 23 October 2018.
7 See Leenes et al. 2017 for an exploration of this phenomenon in the field of robotics.
8 See for calls in the US, for instance, http://www.drivingsales.com/news/google-and-auto-executives-urge-congress-to-develop-national-self-driving-car-regulations/. Last accessed 23 October 2018; https://collisionweek.com/2018/09/10/vehicle-manufacturers-suppliers-call-senate-passage-autonomous-vehicle-legislation/. Last accessed 23 October 2018.
9 See, for instance, http://nhv.us/content/16024540-uber-urges-nh-lawmakers-introduce-statewide-regulations-ride. Last accessed 23 October 2018.

disconnect. To be fair, scientists are also unhappy and complain, for instance, that
killer drones should be banned.10
There is a steady pattern of calls for new regulation whenever new technologies
enter the stage. However, if and when new regulation is introduced, the complaints
often remain. To regulate means to weigh interests and the outcome of this process
can hardly ever satisfy all. A prime example is data protection regulation. After four
to five years of tough negotiations, comprising the various legislative stages, the
European Parliament processed over 4000 amendments11 to the original
Commission proposal for a General Data Protection Regulation (“GDPR”) and new
regulation was finally agreed on.12 The internet giants have lobbied tremendously,
but apparently did not get what they bargained for. Their dissatisfaction13 is not
entirely surprising, as Google and Facebook stand to lose a lot and have already
been in legal battles with the Data Protection Authorities based on the former Data
Protection Directive 95/46/EC.14
Let me return to the story and get back to the behaviour of industry in response
to regulation later on. There seems to be a strong call for new regulation when a
new technology presents itself. Some suggest a leap forward and claim that in a
dynamic environment we need dynamic regulation, or flexible regulation. Certainly
in the Netherlands, the road proposed is that of experimental regulation, such as
sunset provisions, as a means of coping with uncertainty and offering flexibility.15
I am not particularly happy with this direction and will throw my head in the
wind. Before doing so, I want to return to a story of old. Do we really need new
regulation to cope with issues associated with new technologies, or are the classical
instruments sufficient? I have mentioned Judge Frank Easterbrook’s take on this
question already in a footnote, but will turn to his idea explicitly. Easterbrook’s
target was the proposal by Lessig and others to create a new area of law,
Cyberlaw.16 This idea of creating cyberlaw as a response to novelty (like
cyberspace) is nonsense in Easterbrook’s opinion. He illustrates his point by explaining
why there is no Law of the Horse and why we should not strive to create one. The

10 See, for instance, http://www.oxfordmartin.ox.ac.uk/news/201412_Robo-Wars. Last accessed 23 October 2018.
11 For an overview of the 3132 amendments, see https://lobbyplag.eu/map/amendments/libe/. Last accessed 23 October 2018.
12 The first COM proposal of the GDPR (2016/679) was published on 25 January 2012; the Regulation entered into force on 24 May 2016 and became directly applicable in all EU Member States on 25 May 2018.
13 See, for instance, https://edri.org/edrigramnumber10-22facebook-doesnt-like-eu-regulation/. Last accessed 23 October 2018.
14 Consider the cases launched by Max Schrems, see https://en.wikipedia.org/wiki/Max_Schrems. Last accessed 23 October 2018. See also http://www.cnbc.com/2016/02/01/eu-privacy-rules-may-hit-internet-giants-hard.html. Last accessed 23 October 2018.
15 E.g., Ranchordás 2014.
16 With this, Easterbrook started a long line of debate about Cyberlaw. One should in this line at least mention Lessig’s response, Lessig 1999; and Andrew Murray’s wonderful 2013 Bileta keynote: Murray 2013.

law of the horse is a metaphor for comprehensive regulation around all things
horses. Whenever there is an issue involving a horse, the law of the horse is where
to look for answers.
From a practical perspective there is a significant challenge in creating such a law;
after all, what are the potential topics to be addressed by this law? However, there
may be merit in such an effort. On the positive side, having everything in a single
act is convenient. At least as long as we can determine that we are dealing with a
horse issue. That might be simple, you think, but what about the new invention of
the Mule? Are mules covered by the law of the horse? What about unicorns? Most
certainly these are science fiction, but a quick glance at the Wikipedia entry on
horse17 leads us to the realm of the Equids, with members such as the donkey,
mountain zebra, plains zebra and crossbreeds such as the mule, hinny, jenny and
zorse.
Of course all of this deals with the classification of events, facts, observations
into legal categories, similar to the earlier e-commerce example. E-commerce might
be a species of the genus contracting, just like a donkey is a species of the genus
equus. Qualification and classification are tasks any legal scholar is trained in.
Having said that, in Easterbrook’s view, the general legal concepts and
mechanisms are flexible and can cope satisfactorily with new phenomena. The criminal
provisions regarding manslaughter do not distinguish between knives, guns and
poison; they are simply means to accomplish death (in certain cases).
Before Easterbrook, legal philosopher Lon Fuller had a similar message when he
wrote that good law is the law which is possible to obey.18 Legal compliance is
probably easier to achieve with a limited set of general rules, rather than with a
large set of specific rules for every new topic.
To stay with the law of the horse: supposing that the law of the horse existed,
having a single set of general rules applicable to all horse-likes would be preferable
to a statute with specific rules for each kind of horse-like.

1.3 Regulating Technology

From the foregoing it is clear that we should be careful with interventions in
technological development. Not so much because of phenomena such as
Collingridge’s dilemma—“When change is easy, the need for it cannot be foreseen;
when the need for change is apparent, change has become expensive, difficult, and
time-consuming.”19—but simply because of the old saying “if it ain’t broke, don’t
fix it”. All too easily we hear claims that the law is inadequate, without it being

17 https://en.wikipedia.org/wiki/Horse. Last accessed 23 October 2018.
18 Fuller 1969.
19 David Collingridge quoted in Morozov 2013b, p. 255.

clear what the actual regulation is or requires with respect to the technology in
question.20
We can observe this in domains like big data, where entrepreneurs and developers
literally ask for the establishment of “Pirate Islands” with few or no rules, where they
can experiment without fear of fines. In robotics and many other domains the
sirens of ‘Pirate Island’ and ‘Experimental zones’ can also be heard. These were, or
are to be, created to limit the scope or effects of rules that supposedly hamper
innovation. When asked which rules actually hamper innovation,21 the silence is
often deafening. The call for lessening the burden of rules seems related to the
knee-jerk reaction that new law is required to cope with technological innovation.
The fact that scientists do not know the rules that define their playing field while
maintaining that they are constrained by them is problematic. For starters,
ignorantia juris non excusat (“ignorance of the law excuses not”), and second, the law
has normative force: the rules are supposed to be adhered to. To take an example
from a different, highly regulated domain: every professional cook is aware of the
significant number of rules applying to their business. Enforcement, including
non-legal enforcement by TV shows like GourmetPolice, has helped raise awareness,
and likely compliance.
Developers consider the law and legal and ethical requirements to be not for them
but for others, and they happily muddle along as if they are unconstrained. That is,
until corporate legal, some supervisory authority, or the media come into play. A recent
example in this space is the Cambridge Analytica affair.22
Of course it is not always easy to determine the applicable rules, because the
norms are abstract: they talk about products and services, and not so much about
household social robots. We always have to qualify everyday phenomena into the
appropriate legal terms and interpret legal concepts and rules. And of course, there
are also real tensions, because existing regulation may have undesired effects or
lacunae, or different legal regimes may lead to conflicting results. Nor should actors
always blindly follow the law. Sometimes the law really is outdated and
requires change. But we need to look at things from the proper perspective and we
have to keep in mind that different interests need to be balanced.
In my understanding,23 there is an interaction between innovation/technological
development, regulation, and normative outlooks such as privacy and autonomy. If
one of the edges changes, then the other two do as well. Regulation could be

20 See Leenes et al. 2017, p. 7.
21 One extreme example of a claim that rules are in the way of innovation is Consumer Technology Association President Gary Shapiro’s statement at a House Oversight Committee hearing on artificial intelligence that the GDPR is “going to kill people, because if you can’t transfer, for example, medical information from one hospital to another in the same region, that has life consequences.” https://www.axios.com/gary-shapiro-gdpr-kill-people-1524083132-e3d317c0-7952-4a55-9c2d-c84d82dc03e7.html. Last accessed 16 October 2018.
22 See the excellent Guardian dossier “the Cambridge Analytica Files”, https://www.theguardian.com/news/series/cambridge-analytica-files. Last accessed 16 October 2018.
23 This is one of the models that inspires our work at TILT.

adapted on the basis of technological development, but our (perception of) values
may equally change. For instance, Facebook’s defining of social interactions online
seems to have affected how we appreciate privacy. The mutual-shaping perspective
that is implied in this model starts from the assumption that there is a fundamental
interdependence between social, technological, and normative transformations.
This interdependence exists in an ongoing process of socio-technological
change that is dynamic and open-ended, and that occurs in the context of a specific
time and place (Fig. 1.1).24
Yet, as discussed earlier, regulation is commonly seen as an impediment to
innovation. In the context of the GDPR, someone stated “… it’s also going to kill
people”.25 Now of course, this person had a particular stake in the debate (did I
mention he is a lobbyist?), but the fear that regulation hampers technological
development is prominent. What is interesting in this respect is that people rarely
complain about gravity impeding innovation. Gravity is simply taken as a design
constraint. Why the opposition to regulation, which in many cases can simply
be taken as yet another constraint?26
Whether regulation impedes innovation or is a necessary constraint that should
be taken as it is depends, amongst other things, on the context.
There is a difference between testing an autonomous vehicle (that is a big word
for a car that is less than 10 cm long) that should be racing on a slot car race track
and the kind of testing required to get vehicles like Tesla’s self-driving cars on
public roads. Teslas will have to be tested under realistic circumstances, and hence
will have to drive on public roads. It simply does not matter that it performs well on the
test track. What does matter is that it will not hit unexpected obstacles, such as
trucks.27 A legal obstacle in this realm has been the Vienna Convention on Road
Traffic. Article 8(1) of this convention requires that “[e]very moving vehicle or
combination of vehicles shall have a driver.”28 Arguably, driver means human
driver in this provision.29 In the Tesla case, this legal obstacle is manageable.
A Tesla has a driver behind the steering wheel and hence the car satisfies the
conditions of Article 8 of the Vienna Convention. However, the future of
self-driving vehicles will likely be one without steering wheels. The existing rules
then have consequences. Either we ban self-driving vehicles from public roads
through the enforcement of the national road traffic laws based on Article 8 of the
Vienna Convention, or we change the regulation removing the requirement for a

24 Boczkowski 2004, pp. 255–267.
25 See n. 21.
26 Of course I know that regulation can be changed and gravity cannot, but still.
27 See https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk for an account of the first time it became clear that the Tesla Autopilot was not ready in this respect, yet. Last accessed 17 October 2018.
28 For more on regulating self-driving vehicles, see Leenes and Lucivero 2014.
29 Article 1(v) of the Vienna Convention defines “driver” as: “‘Driver’ means any person who drives a motor vehicle or other vehicle (including a cycle), or who guides cattle, singly or in herds, or flocks, or draught, pack or saddle animals on a road”.
Fig. 1.1 The interplay between regulation, technology development, and normative notions in the context of society. [Source: The author]

driver to be physically present. At least if we want these cars on the roads soon.
This question is further addressed in Chap. 5 of this volume.
This is where multiple interests come into play. Not everyone is convinced that
we should cast aside all limitations in road traffic regulation to pave the way for
driverless cars and some feel that we must resist the pressure from industry and
developers. One of the reasons to be careful is that industry and large corporate
players mobilise a strong lobby to get the rules they want (regulatory capture).30
Maybe prudence should prevail over speed in terms of adaptation of the regulatory
framework for self-driving vehicles produced by Waymo, Tesla, etc.
Legal action is required for other phenomena in the space of autonomous
vehicles, however. Not only is the car industry (and, interestingly enough, search
engine giant Google (Waymo)) racing to get a firm position in the market for
autonomous vehicles; there is also a do-it-yourself scene. Renowned iPhone and
PlayStation hacker George Hotz is eager to ship a green box costing USD 999,
called Comma One,31 which turns certain types of Hondas into cars with the same
functionality as the Tesla S. Not quite a self-driving car, but it can drive
quasi-autonomously. I do not know how adventurous you are, but I am certainly not
going to hand over control of my car to a small green box with significantly less
computing power than my iPhone. In cases like these, I feel we need authorities that
enforce the existing rules. Fortunately, the US National Highway Traffic Safety
Administration agrees with me and has informed George Hotz that he will have to
comply with the safety requirements for motor vehicles.32 Is this hampering
innovation or a necessary reminder of his responsibilities? I think the reminder that
the norms are there to be observed was essential. George Hotz did not agree and, to
circumvent his liability and responsibility under the Motor Vehicle Safety Act, he
posted the software and schematics of the green box on GitHub, facilitating the

30 See Stigler 1971, pp. 3–21.
31 See https://techcrunch.com/2016/09/13/comma-ai-will-ship-a-999-autonomous-driving-add-on-by-the-end-of-this-year/. Last accessed 17 October 2018.
32 See https://www.scribd.com/document/329218929/2016-10-27-Special-Order-Directed-to-Comma-ai. Last accessed 17 October 2018.

daring among us to try it out.33 If you want to keep one lesson from this talk, then
this is it: do not try this at home.

1.4 Connecting the Dots

Establishing that there is regulatory disconnect, such as in the case of the driver
requirement for self-driving cars, or that the regulation contains lacunae, is actually
difficult. Nevertheless, it is a necessary step in the field of technology regulation.
We cannot simply jump to the conclusion that we need new law, new rules.
In my teaching, I have used this work-in-progress model to illustrate the steps
and some of the questions that need to be asked (Fig. 1.2).
Let me illustrate this by means of my simple Law, Technology and Society
(LTS) model, moving through the model from left to right, as I think we should be
doing. We start with (1) the technology. This step seems easy, but actually is not.
There seem to be two conflicting approaches here. The start of the conversation
might be an instance of a particular type of technology; let us take the well-known
Google self-driving vehicles as an example. Then the discussion either focusses on
this very specific instance of the technology, or moves to the broad super-category of
‘self-driving vehicles’.34 Neither approach seems desirable. In the first case we might
focus on potentially coincidental features of the technology that then determine
how to proceed towards regulation;35 in the latter case, the discussion runs the risk
of becoming abstract and unhelpful because of the generalisation.
Lyria Bennett Moses36 rightly addresses the problem of treating ‘technology’
as a regulatory target and instead calls attention to the
socio-technical landscape, which resembles my earlier call for a mutual-shaping
perspective. In this phase, taking a socio-technical lens, we should determine what
the technology of focus actually is, what its relevant characteristics are and which
interests are at stake or are being promoted.
In the next stage (stage 2), the issues raised by the technological development are
addressed. Here all sorts of distinctions can be made with respect to the issues. Are
we talking about potential risks (autonomous vehicles may have to make decisions
about whether to hit the child chasing a ball on our side of the road, or the elderly

33 See https://www.slashgear.com/comma-ai-self-driving-agent-open-sourced-after-comma-one-fail-01465892/ and https://github.com/commaai/openpilot. Last accessed 17 October 2018.
34 Or take the other grand technologies of fame, such as nanotechnology, biotechnology, neurotechnology, etc.
35 For instance, in the US the focus in developing self-driving vehicles seems to be on the autonomy of the car based on sensors in the car. In Europe there is much more attention to collaboration between the vehicle and its environment to establish the intended autonomy. See Leenes and Lucivero 2014 for more information on these differences in approach.
36 Bennett Moses 2013.
Fig. 1.2 The regulating technology (development) model v1. [Source: The author]

couple crossing the street from the other side),37 or are there manifest problems
already (such as autonomous vehicles causing accidents on public roads). Again,
the socio-technical context as well as the various stakeholders come into play. Who
defines the problems or places topics on the agenda, who are the stakeholders
anyway, etc.? At this stage, the question that I have beaten to death so far, namely
what the current law has to say about this problem/technology, also comes into view.
Then if there is a regulatory gap, we might consider intervening (stage 3). Here,
regulation comes into play. There appear to be three broadly accepted
understandings of what ‘regulation’ is.38 In the first, regulation is the promulgation of
rules by government accompanied by mechanisms for monitoring and enforcement,
usually assumed to be performed through a specialist public agency. The second
assumes regulation to be any form of direct state intervention in the economy,
whatever form that intervention might take. The third one takes regulation to be all
mechanisms of social control or influence affecting all aspects of behaviour from
whatever source, whether they are intentional or not. I subscribe to Julia Black’s
decentred conceptualisation of regulation, which moves beyond the state as the sole
regulator and which includes other modalities of regulation. Regulation, then, is
‘the sustained and focused attempt to alter the behaviour of others according to

37 Think of the Applied Trolley Problem here.
38 Black 2002.

standards or goals with the intention of producing a broadly identified outcome or
outcomes, which may involve mechanisms of standard-setting,
information-gathering and behaviour-modification’.39 In this stage, questions need
to be raised as to who is to intervene, whom (or what) to address, and through which
(combination of) means (e.g. law, norms, architecture, markets).
At all places in the model we need to ask critical questions. There are many tools
that can help us in this respect.
Many of us (legal scholars) take for granted that we need regulation to cope
with undesirable results of technology and innovation. But increasingly, I do not
take that for granted, and I have become more sensitive to the position taken by
economists and many American legal scholars that regulation is only permissible to
address market failures, like unfair competition, windfalls, etc. In Europe, we
acknowledge that protecting human rights (for instance privacy and data
protection) and even furthering social goals such as solidarity are equally
appropriate goals.40 Yet, regulation should not be our first reflex. Ideally, we should not
regulate ‘just because we can’. Lawyers too may suffer from hammer syndrome
(nails everywhere)! Let the market handle things.
The regulator needs to justify that a problem fits within one of the three
categories (market failure, human rights protection, conflict resolution) to warrant
intervention. Interestingly, the box ticked then also provides some guidance as to
how to regulate. For instance, in the case of Uber, one could argue that all sorts of
costs (like insurance) are not incorporated into the price of the service and that Uber
can therefore charge lower prices than traditional taxi services. To create a level
playing field, Uber could be obliged to insure their drivers just like any (other) taxi
service does. On the other hand, maybe the traditional taxi services are at the root of
the market failure here. Maybe the compulsory licensing system present in many
cities is preventing newcomers from entering the market, and this issue should be addressed.
I close this part with the claim that determining regulatory disconnect/failure is
difficult. I refer to our work on the cookie wars for a case study on what we consider
to be an example of regulatory failure.41

1.5 Solutions

We will now briefly look at solutions. A suitable case to explore a little is
self-driving vehicles. We do not know exactly yet what these will look like or what
their requirements are with respect to the (road) infrastructure. Hence regulating
these vehicles is not straightforward. We need flexibility. Does this mean

39 Black 2002, p. 26; Black 2005.
40 Prosser 2010, pp. 11–20.
41 Leenes and Kosta 2015; Leenes 2015.

experimental regulation or sunset provisions?42 I am not going into the details of
what these mean exactly, because I think their names are self-explanatory.43
The crux of both is that they are temporary measures, implying that they can be
changed and thus provide for flexibility. This type of regulation provides legal
certainty because there are rules. But this certainty is also (time) limited. We know
that the rules may change in the case of experimental rules and we know that they
may change or terminate at time T+1.
This is one way of coping with the flexibility required by innovation. There is
another way of achieving flexibility. We can try to regulate behaviour by clear rules
or by more abstract principles.44 This distinction is not orthogonal to that of
experimental versus fixed regulation, but merely addresses the form of the norms.
Principles and rules are encountered all over the law. In (continental) civil law we
find principles and concepts such as “reasonable” or “equitable”, and in data protection
we find calls for “appropriate” technical and organisational measures to be taken to
ensure a level of security appropriate to the risk (Article 32 General Data Protection
Regulation (GDPR)).45 These vague and/or open-textured concepts are further
developed in case law and handbooks. Yet, they are incredibly flexible and allow
for new phenomena and risks to be incorporated or excluded over time.
On the other hand we have clear rules. The Dutch constitution contains a very
clear provision in Article 13, which states that telegraph messages are secret
(protected),46 which means something like: communication by telegraph is protected
communication. The telegraph is out of fashion now, but it referred to a clearly
defined technology. The rule makes very clear what is protected, but in a way it
turned out not to be future-proof. The scope of communication secrecy was clearly
defined in Article 13 of the Dutch Constitution: telegraph, telephone, letters. But then
we got new communication technologies: fax, email, SMS. What about their protection?
A strict/literal interpretation rules them out; a teleological interpretation potentially
does not (rule them all out). Moving towards technology-neutral provisions47 is a
common solution to this kind of problem. Instead of mentioning the specific technologies
(letter, telephone, telegraph), regulate communication. Instead of requiring a driver
to be present to keep the vehicle under control, regulate that the vehicle must be
safe for passengers and bystanders at all times. The notion of technology-neutral
regulation is of course not new, but it does change the discourse about the regulatory
approach.

42 Of course any regulation can be adapted, but if the regulation itself contains conditions for its review and change, actual adaptation is much easier because the review is automatically triggered, rather than requiring some actor to initiate it.
43 See Ranchordás 2014 for an extensive account of the various concepts in this space.
44 These are also called standards, which is kind of confusing because standards in the context of certification are actually quite precise. On regulation by rules and principles see Braithwaite 2002.
45 Regulation (EU) 2016/679.
46 In Dutch: “Het telegraafgeheim is onschendbaar.”
47 See Koops 2006.

By the way, in practice, we also see counterexamples. The previous Dutch
minister of Traffic thought she could limit the number of lethal bicycle accidents by
prohibiting people from using their smartphone while riding a bike.48 The abstract
provision that you have to be vigilant in traffic would thus be transformed into a
very specific prohibition.

1.6 Conclusion

In this final section, I want to briefly touch on a few issues of technology regulation.
There is, as you will have noticed, a clear issue with principle-based regulation and
also with all types of experimental regulation. They imply legal uncertainty. We do
not know yet what appropriate measures are under the GDPR. Time and courts will
tell. Uncertainty is unavoidable in a highly dynamic environment. The law has
(successfully) coped with this for thousands of years. But there are also other issues
we need to keep in mind.
An important one is regulatory capture. I mentioned this already in passing.
Interested parties, industry at the forefront, will invest significantly in getting their
way, in getting the regulation they want. See the net neutrality battle in the US.49
Some are fully aware of this, for instance in the case where a consumer watchdog
in the US called on the California DMV to withstand Google’s pressure for swift
regulation of autonomous vehicles.50
Regulation is also a means to prevent newcomers from entering the market. Uber,
for instance, claims that the traditional taxi companies have fostered a licensing
system as a barrier to entry. Whether they are right is hard to tell, as I already mentioned.
Another issue is that we should be aware that not all norm addressees are equal.
Why do some people comply with the rules, while others do not? Kagan and
Scholtz provide a useful distinction that we need to keep in mind.51 Amoral
calculators make cost-benefit assessments and then determine whether they comply
with the rules or not. The content of the rules does not matter; the fines do.
A different group is that of the political citizens, who do not follow certain rules as a
matter of civil disobedience. And then there are the organisationally incompetent.
These are the ignorati: they do not know or understand the rules. We need to be
aware that all three types operate in the same space, and we should not assume too
easily that the rules are inadequate.

48 https://www.rtlnieuws.nl/editienl/bellen-of-appen-op-de-fiets-het-zou-verboden-moeten-worden. Last accessed 17 October 2018.
49 See, for instance, https://www.politico.com/story/2015/02/net-neutrality-a-lobbying-bonanza-115385. Last accessed 17 October 2018.
50 See http://www.bodyshopbusiness.com/consumer-watchdog-group-urges-california-dmv-to-ignore-pressure-from-google/. Last accessed 17 October 2018.
51 Kagan and Scholtz 1984, p. 494.

The world of technological development and innovation is full of pioneers, but
there are also pirates. We will have to cope with both.
And then there is one more thing. We live in the sharing economy. Maybe that is
indeed the next big thing, but let us not forget that we are in uncharted territory with
lots of promises that may not hold. The website The Drive had an interesting article
in December 201652 about the mobility bubble, with the compelling caption “When
the Mobility Bubble Bursts, Which Companies Go ‘Pop’?” over an image of the
burning Zeppelin Hindenburg. At that point in 2016, Uber had lost USD 1.2 billion
per six months without a clear business model. Do we really
take experiments like these as guiding lights for new regulation?

References

Bennett Moses L (2013) How to Think about Law, Regulation and Technology: Problems with
‘Technology’ as a Regulatory Target. Law Innovation and Technology 5:1
Black J (2002) Critical reflections on regulation. Australian Journal of Legal Philosophy 27:1–35
Black J (2005) What is Regulatory Innovation? In: Black J, Lodge M, Thatcher M
(eds) Regulatory Innovation. Edward Elgar, Cheltenham
Boczkowski PJ (2004) The mutual shaping of technology and society in Videotex newspapers:
Beyond the diffusion and social shaping perspectives. The Information Society 20:255–267
Braithwaite J (2002) Rules and Principles: A Theory of Legal Certainty. Australian Journal of
Legal Philosophy 27:47–82
Brownsword R (2008) Rights, Regulation and the Technological Revolution. Oxford University
Press
Fuller L (1969) The Morality of Law. Yale University Press
Kagan R, Scholtz J (1984) The criminology of the corporation and regulatory enforcement
strategies. In: Hawkins J, Thomas J (eds) Enforcing Regulation. Kluwer, Alphen aan den Rijn,
pp 67–95
Koops B-J (2006) Should ICT Regulation be Technology-Neutral? In: Koops B-J, Lips M,
Prins C, Schellekens M (eds) Starting Points for ICT Regulation - Deconstructing Prevalent
Policy One-Liners. T.M.C. Asser Press, The Hague, pp 77–108
Koops B-J, Lips M, Prins C, Schellekens M (eds) (2006) Starting Points for ICT Regulation -
Deconstructing Prevalent Policy One-Liners. T.M.C. Asser Press, The Hague
Leenes R (2015) The Cookiewars – From regulatory failure to user empowerment? In: van
Lieshout M, Hoepman J-H (eds) The Privacy & Identity Lab; 4 years later. Privacy & Identity
Lab, Nijmegen, pp 31–49
Leenes R, Kosta E (2015) Taming the Cookie Monster with Dutch Law – A Tale of Regulatory
Failure. Computer Law & Security Review 31:317–335
Leenes R, Lucivero F (2014) Laws on Robots, Laws by Robots, Laws in Robots: Regulating
Robot Behaviour by Design. Law, Innovation, and Technology 6:194–222
Leenes R, Palmerini E, Koops B-J, Bertolini A, Salvini P, Lucivero F (2017) Regulatory
Challenges of Robotics: Some Guidelines for Addressing Legal and Ethical Issues. Law,
Innovation and Technology 9:1, 1–44
Lessig L (1999) The Law of the Horse: What Cyberlaw Might Teach. Harvard Law Review
113:501–549

52 See http://www.thedrive.com/tech/6491/when-the-mobility-bubble-bursts-which-companies-go-pop. Last accessed 14 November 2018.

Ministerie van Justitie (1997–1998) Nota Wetgeving voor de elektronische snelweg, Kamerstukken II 25 880, 1997–1998
Morozov E (2013a) ‘To Save Everything, Click Here’ - Technology, Solutionism and the Urge to
Fix Problems That Don’t Exist. Allen Lane, London
Morozov E (2013b) The Collingridge Dilemma. In: Brockman J (ed) This explains everything.
Harper Perennial, New York, p 255
Murray A (2013) Looking Back at the Law of the Horse: Why Cyberlaw and the Rule of Law are
Important. SCRIPTed 10:310
Prosser T (2010) The Regulatory Enterprise: Government Regulation and Legitimacy. Oxford
University Press
Ranchordás S (2014) Constitutional Sunsets and Experimental Legislation: A Comparative
Perspective. Edward Elgar Publishing, Cheltenham
Stigler G (1971) The Theory of Economic Regulation. Bell Journal of Economics and
Management Science 2:3–21

Ronald Leenes is full professor in regulation by technology at the Tilburg Institute for Law,
Technology, and Society (Tilburg University). His primary research interests are regulation by
(and of) technology, specifically related to privacy and identity management. He is also motivated
by, and trying to understand, the effects of profiling, function creep and privacy infringements in
general. A growing area of interest in his portfolio is the potential and negative effects of Big Data
Analytics.
Chapter 2
Regulating New Technologies
in Uncertain Times—Challenges
and Opportunities

Leonie Reins

Contents

2.1 Introduction........................................................................................................................ 20
2.2 The Scope: Democratic Governance, Market Regulation, and Data ............................... 21
2.3 The Contributions.............................................................................................................. 22
2.3.1 Part I: New Technologies and Impacts on Democratic Governance .................... 22
2.3.2 Part II: The Economic Perspective—Market Regulation of New Technologies ........ 25
2.3.3 Part III: The Data in New Technologies ............................................................... 26
2.4 The Way Forward ............................................................................................................. 28
References .................................................................................................................................. 28

Abstract This chapter provides an introduction to the volume on Regulating New
Technologies in Uncertain Times—Challenges and Opportunities. The volume is
structured along three main themes that capture the broader topic of “Regulating
New Technologies in Uncertain Times”. These three themes are: 1. The relationship
between new technologies and democratic governance; 2. Market regulation and
new technologies; and 3. The data in new technologies. It is considered that these
three themes encapsulate some of the most pressing regulatory challenges in respect
of new technologies and are therefore worth assessing in more detail. In this
introductory chapter, the three main themes that feature in this volume are
discussed, before providing a brief introduction to all fourteen individual
contributions.

Keywords regulation · new technologies · democratic governance · markets · data

L. Reins (✉)
Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University, Tilburg, The
Netherlands
e-mail: l.s.reins@uvt.nl


2.1 Introduction

Technology has the ability to serve humans and to make our lives easier. Yet, in
doing so, technology disrupts. It changes the status quo by enabling new forms of
interaction, new types of medical treatment or new forms of energy generation.
These new applications of technologies are often accompanied by uncertainty as
to their long-term (un)intended impacts. That is why regulators across the globe
seek to strike a balance between the appropriate protection of societies against these
risks and the need not to stifle the development of these new technologies.
However, societies, and the citizens that live in them, have different
collective and individual preferences in terms of the amount of uncertainty and the
type of risk that they are willing to accept. The way in which regulation can address
these differing, and sometimes conflicting, societal objectives is therefore a crucial
question of legal research. New technologies also raise questions about the
boundaries of the law as the line between harmful and beneficial effects often
becomes difficult to draw. Societal acceptance of new technologies is essential to
making them a success. Yet, societal acceptance is increasingly difficult to achieve
in times that can easily be described as “uncertain”.
With nearly one fifth of the 21st century behind us, it is safe to conclude that
mankind finds itself confronted with several significant challenges. These relate, for
instance, to the need to adapt to rising temperatures, the need to distribute resources
among an ever-increasing global population, and the need to ensure that
digitalization and artificial intelligence do not exceed the limits of human control. At the
same time, global institutions are under pressure and multilateral collaboration
seems to have had to cede ground to unilateralism by sovereign nations.
Against this background, the question that arises is how technologies that are
developed through human ingenuity and which can contribute to solving the
problems humanity currently faces, can be regulated in a manner that safeguards
basic principles and human rights, without simultaneously stifling the development,
implementation and application of these technologies in practice. Considering the
intrinsic linkage with innovation, and the corresponding concept of novelty, it is
submitted that—notwithstanding the benefits of wisdom and experience shared by
older generations—young legal scholars can provide valuable insights in this
regard. To that end, this volume presents fourteen high-quality contributions by
participants in the first Ph.D. Colloquium on “Regulating New Technologies in
Uncertain Times”, organized by the Tilburg Institute for Law, Technology, and
Society (“TILT”) at Tilburg University (The Netherlands) in June 2018.
The Ph.D. Colloquium brought together 19 young researchers/Ph.D. candidates
from over 12 universities in eight countries. The Colloquium saw presentations
organized along several themes, such as patents and innovation, energy law and
new technologies, new technologies and human rights, automation and artificial
intelligence, new technologies and algorithms, new technologies and competition
law, data protection, and humans and health.

In this introductory chapter, the three main themes that feature in this volume are
discussed, before providing a brief introduction to all fourteen individual
contributions.

2.2 The Scope: Democratic Governance, Market Regulation, and Data

This volume is structured along three main themes that capture the broader topic of
“Regulating New Technologies in Uncertain Times”. These three themes are:
• The relationship between new technologies and democratic governance;
• Market regulation and new technologies; and
• The data in new technologies.
It is considered that these three themes encapsulate some of the most pressing
regulatory challenges in respect of new technologies and are therefore worth
assessing in more detail. In this regard, both the Colloquium and this edited volume
have adopted an approach that seeks to identify commonalities between the regu-
lation of wholly different types of technologies. It is hoped that the results of the
research endeavors by the researchers featured in this volume will contribute to a
better understanding of the challenges of regulating new technologies and balancing
different societal values in the process.
The idea of this volume is therefore to bring together legal researchers who study
the regulation of new technologies from different legal backgrounds in order to
identify common problems and also some common solutions in terms of the reg-
ulation of these technologies. The aim is to learn from different legal disciplines and
to cross the boundaries that often exist between these disciplines. Consequently,
notwithstanding the three main themes identified, some of the contributions
examine the regulation of technology from a more theoretical perspective; i.e.
projects that deal with the broader underlying aspects of regulation such as legiti-
macy, trust, democracy, uncertainty, risk, precaution, competition and innovation.
Other contributions examine the regulation of a specific new technology in a
specific field, such as (public) health, data protection, cybersecurity, intellectual
property, freedom of expression and autonomous driving.
Traditionally, risk regulation has been conceptualized as being either
technology-based, performance-based or management-based.1 Each of these three
forms of regulation has its own characteristics, merits and demerits, and may be
applied in different situations depending on the regulatory preferences at any given
point in time. Technology-based regulation is generally considered to reduce
uncertainty in respect of the operation of a specific technology and its (un)intended

1 Bonnín Roca et al. 2017, p. 1215; as well as Coglianese and Lazer 2003.
externalities.2 Performance-based risk regulation is typically considered to leave
more room for innovation, as it regulates only on the basis of a defined and desired
outcome, rather than on the manner or means of technology by which this particular
outcome must be achieved.3 The downside of performance-based risk regulation is
that particular outcomes in terms of performance are generally not straightforward
to demonstrate and measure.
Management-based regulation can be characterized as a form of regulation whereby
the decision-making power is shifted to the actor possessing the greatest amount of
information or knowledge, thereby reflecting an information asymmetry between
stakeholders.4 One of the underlying objectives of this edited volume is to examine
whether, in the face of the current pace of technological developments, the tradi-
tional conceptualization of risk regulation is still sufficient. In particular, there
appears to be a broadly felt need for regulation that is adaptive and anticipatory.
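By way of a purely illustrative sketch, the contrast between the three forms can be
expressed in a few lines of Python. Everything below (the operation attributes, the
certified-device requirement and the output limit) is invented for the example and
is not drawn from the literature cited:

from dataclasses import dataclass

@dataclass
class Operation:
    uses_certified_device: bool   # the prescribed technology is in place
    measured_output: float        # an observed outcome, e.g. emissions in g/km
    plan_approved: bool           # the operator's own risk-management plan

OUTPUT_LIMIT = 1.0  # invented outcome threshold for the example

def technology_based_ok(op: Operation) -> bool:
    # Prescribes the means: compliance follows from using the required device.
    return op.uses_certified_device

def performance_based_ok(op: Operation) -> bool:
    # Prescribes only the outcome: any means is acceptable if the limit is met.
    return op.measured_output <= OUTPUT_LIMIT

def management_based_ok(op: Operation) -> bool:
    # Shifts decision-making to the best-informed actor: the regulator reviews
    # the operator's plan rather than prescribing means or outcomes.
    return op.plan_approved

op = Operation(uses_certified_device=False, measured_output=0.8, plan_approved=True)
print(technology_based_ok(op), performance_based_ok(op), management_based_ok(op))
# -> False True True

The same operation can thus pass under one approach and fail under another, which
is precisely why the choice between the three reflects the regulatory preferences at
any given point in time.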

2.3 The Contributions

2.3.1 Part I: New Technologies and Impacts on Democratic Governance

The first of the three main themes covered by this edited volume is entitled “New
Technologies and Impacts on Democratic Governance” (Part I). This section
addresses questions of democracy and governance relating to new technologies.
The objective of the contributions presented in this section is to provide an analysis
of the overarching and cross-cutting concepts that have a bearing on the regulation
of new technologies in a diverse society.
New technologies are not developed in a vacuum. Typically they are created or
developed in response to a perceived problem in society. Although the exact
objective will obviously depend on the specific technology in question, generally
speaking, they are aimed at making human life easier and more comfortable, as
already mentioned above. In other words, they seek to address a societal problem.
In doing so, however, they compete with other societal values that are expressed,
typically, by democratic means. By means of an example, a new technology to
extract gas from the earth may provide a solution to the problem of energy scarcity,
or energy supply security. Yet, at the same time, the deployment of that new
technology may go hand in hand with a risk of environmental pollution. The
regulation of new technologies therefore has a crucial role to play in finding an
equilibrium between these competing societal objectives. Similarly, the ability
provided by new technologies to have computers and algorithms take decisions

2 Bonnín Roca et al. 2017, p. 1217.
3 Ibid.
4 Ibid.
previously taken by humans raises questions in terms of the need for democratic
accountability in respect of the deployment of these types of technologies.
These are the types of questions that are addressed by the four contributions
presented in Part I of this volume. They discuss issues that relate to the
democratic control over new technologies and the need to ensure that a diverse
range of societal values and objectives is adequately balanced with the need to
implement one specific technology that will contribute, in turn, to one or more other
societal values and objectives. In this regard, questions also arise as to whether
traditional forms of representative democracy at a broad level are apt for regulating
technologies deployed at a local level.
In Chap. 3, Anna Berti Suman analyses the demarcation between freedom and
regulation by investigating the concept of community-standards for enhancing the
scientific robustness of Citizen Science initiatives. Citizen Science is defined as “the
engagement of non-expert users with technology applications to collect data about
their environment or their own physical condition” and finds its justification in a
number of rights, such as the citizens’ right to live in a healthy environment and to
have access to environmental information. Community standards are norms that are
developed by and within the community itself and which should ensure the validity
of the data that has been produced by the community-led initiative. Berti Suman
examines an approach to regulation that can be described as rights-based and rooted
in general regulatory principles. She highlights that the legitimacy of these initia-
tives is often questioned, because they may not necessarily follow established
scientific standards and expert opinion. In her contribution, she explores the
alternative between applying scientific standards to Citizen Science or rather relying
on community standards as a means to make Citizen Science practices accepted as
data collection methods. She suggests that community standards could be a way to
regulate Citizen Science, in order to ultimately improve its scientific robustness and
its potential citizen participation.
Tenille E. Brown, in Chap. 4, entitled “Human Rights in the Smart City:
Regulating Emerging Technologies in City Places” addresses the emergence of
human rights standards that are truly human focused and equality driven. In her
contribution she observes that emerging technologies such as augmented reality, the
internet of things, and the now ubiquitous private car hailing applications, signal an
entirely new facet to digital processes. This new facet consists of the fact that these
processes rely on, and are designed to interact with, the non-digital built environ-
ment. Brown puts forward that the new physicality of digital technologies in the
smart city raises questions about the desired approach towards understanding and
categorizing technologies in law and policy. Although regulatory concerns relating
to digital activities and technologies remain relevant, notably in terms of privacy,
intellectual property and licensing concerns, the geospatially-connected city raises
novel legal challenges that are often not considered by technology experts. Issues of
human rights, legal obligations in relation to equality and promoting access to
services, are traditionally associated with city law, but have not yet been substan-
tively engaged with by smart city experts. A focus on legal frameworks that are
rooted in human rights should underscore that the “smart” in the smart city
refers to more than advanced technology, and instead signals the development of
human rights legal standards that are truly human focused, and equality driven.
In Chap. 5, entitled “Automated Driving and the Future of Traffic Law”, Nynke
Vellinga assesses the regulatory consequences of the technological developments
that enable vehicles to move in an independent and automated manner. The existing
legal framework as contained in domestic and international traffic laws is based on
the notion that a human driver is behind the steering wheel and is in control of the
vehicle. Vellinga’s contribution assesses the legal consequences of the fact that
vehicles are capable of “driving themselves” and the legal challenges that auto-
mated driving poses for traffic law. In this regard, technical regulations could reflect
the outcome of the debate on ethical issues concerning automated driving. Vellinga
addresses the highly debated “trolley problem”—if a fatal accident is unavoidable,
who should the vehicle protect or kill?—and the issue of whether and how it can be
translated into technical requirements. A vehicle can be programmed in a manner so
that it, in case of an unavoidable accident, does not favor one road-user over
another, but tries to limit the number of victims and the severity of the injuries. Vellinga’s
technology-specific contribution aims to provide insights into the bottlenecks
regarding the international conventions on road traffic and their implementation in
domestic law, by taking inspiration from maritime and aviation traffic laws in the
quest for a solution to this ethical conundrum that arises as a result of technological
developments.
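To give a flavour of how such a requirement might be rendered in software,
consider the following deliberately simplified Python sketch. It is not drawn from
Vellinga’s contribution or from any real automated-driving system; the option
structure, field names and figures are invented for the purpose of illustration:

def least_harm_option(options):
    # Select the manoeuvre minimising expected casualties first and injury
    # severity second. The function deliberately receives no attributes of the
    # road users themselves (age, identity, etc.), so it cannot favor one
    # road-user over another.
    return min(options, key=lambda o: (o["expected_casualties"],
                                       o["expected_severity"]))

options = [
    {"manoeuvre": "swerve_left",    "expected_casualties": 2, "expected_severity": 0.9},
    {"manoeuvre": "brake_straight", "expected_casualties": 1, "expected_severity": 0.7},
    {"manoeuvre": "swerve_right",   "expected_casualties": 1, "expected_severity": 0.9},
]
print(least_harm_option(options)["manoeuvre"])  # -> brake_straight

Because the selection key contains only aggregate harm estimates, such a function
is structurally incapable of discriminating between road users; whether that is the
right design choice is exactly the ethical question that technical regulations would
have to settle.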
In Chap. 6, Sjors Ligthart delves into the legal perspective of coercive neu-
rotechnologies and forensic evaluations. Neuroscience is developing constantly,
steadily improving neuroimaging technologies that can acquire brain-related
information. These technologies could be very useful for answering crucial legal
questions in a criminal law context. However, not all defendants and convicted
persons can be expected to cooperate with these technologies on a voluntary basis.
Therefore the possibility of coercive use of these technologies becomes an
important issue. The use of coercive neuroimaging technologies in criminal law,
however, raises serious legal questions under European human rights law. By
means of an example, how does such coercive use relate to the prohibition of
torture, inhuman and degrading treatment (‘ill-treatment’), as contained in Article 3
of the European Convention on Human Rights (“ECHR”)? In his chapter, Ligthart
describes four neuroimaging applications and explains how they could contribute to
materializing the aims of criminal law. Furthermore, he conceptualizes two types of
coercion with which neuroimaging can be applied and explains why that distinction
is relevant in this context. Finally, the chapter explores the legal implications of
coercive neuroimaging in the context of the prohibition of ill-treatment.
The four chapters therefore each assess different new technologies in terms of
their impact on systems of democratic governance and the rule of law.

2.3.2 Part II: The Economic Perspective—Market Regulation of New Technologies

The contributions contained in Part II adopt an economic perspective towards the
regulation of new technologies. They reflect on the question of how to ensure that
new technologies find their place in the market, whilst at the same time ensuring
that other public policy objectives, such as safety or privacy, are also protected
through adequate regulation.
First, in Chap. 7, entitled “Planting Seeds of Market Power: Digital Agriculture,
Farmers’ Autonomy, and the Role of Competition Policy”, Tom Verdonk provides
a competition law perspective on technological and data-driven developments in the
agricultural sector in the EU. The contribution departs from the idea that digital
agriculture could exacerbate existing power imbalances, dependencies and barriers
to entry in the already highly concentrated agricultural markets of seeds and
agrochemicals. This risk exists since the few remaining conglomerate suppliers
could misuse their market power and platforms for digital agriculture services could
benefit from network effects. Verdonk’s chapter explains how some digital
agriculture-related practices may lead to distortions of competition and deteriora-
tions of farmers’ autonomy, but nonetheless do not necessarily violate EU com-
petition rules. In response to these market power concerns, however, authorities
may seek regulatory solutions beyond EU competition law. In that regard, Verdonk
suggests that laws on unfair trading practices, sector-specific legislation and
self-regulatory mechanisms are worth exploring, not least on the basis of the
EU’s Common Agricultural Policy (“CAP”).
Staying within the realm of competition law, in Chap. 8, entitled “Sharing data
and privacy in the platform economy: the right to data portability and ‘porting
rights’”, Silvia Martinelli analyses the questions on how to free consumers and
suppliers from the dictatorship of platforms and how to increase competition in the
EU’s Digital Single Market. She analyses the right to data portability and its
peculiarities in the platform economy, where this right is fundamental for compe-
tition law, users’ protection and privacy, because of the presence of strong direct
and indirect network effects and consequent high switching costs. The contribution
underlines six critical issues related to the right to data portability: (1) a privacy
issue, due to the extensive sharing of data of other individuals; (2) the need to
establish the portability of non-personal data as well; (3) the need to establish
portability also for professional users that are not natural persons; (4) the need
to protect the rights of the controller and his investment when the data are not
merely collected but also reworked; (5) the risk of decreased competition under a
strong and non-scalable regulation; (6) the necessity to pay attention to the
technical solutions available in order to assure practicable application methods, in particular consid-
ering the needs of smaller operators.
Theodoros Iliopoulos looks into the new regulatory challenges for distributed
generation electricity systems in the EU in Chap. 9. He investigates whether they
can be regarded as a disruptive innovation which introduces a new business model
that affects the dynamics of the market and makes it expand to new customers.
Iliopoulos observes that customers will have a key role to play in decentralised and
smart distributed generation electricity systems. In this regard, the existing mis-
match between legislation and innovation can best be described by the terms
‘regulatory disconnection’ or ‘regulatory uncertainty’. The ‘regulatory disconnec-
tion’ debate also applies to the field of electricity. Iliopoulos notes that the shift
from centralised electricity systems towards a decentralised distributed generation
paradigm has been rendered possible. Furthermore, digital evolution has enhanced
the empowerment of electricity consumers who can now avail themselves of a
number of sophisticated features in order to actively interact with the grid operators
and to turn into active market actors. EU law and the legislative proposals regu-
lating distributed generation in electricity systems are examined.

2.3.3 Part III: The Data in New Technologies

Part III is dedicated to the data in new technologies. In these uncertain times, it is
safe to say that many things already are, or are rapidly becoming, digital.
Consequently, questions arise as to (a) the utilization of data for all different types
of purposes, not only to the benefit of consumers, but also for the attainment of
public policy objectives (safety and law enforcement) and health care; and (b) issues
regarding privacy and data protection at the personal level.
In Chap. 10, “A public database as a way towards more effective algorithm
regulation and transparency”, Florian Wittner assesses whether such a public
database could contribute to more effective algorithm regulation and transparency.
Wittner discusses the notion of a public database that gives graduated access to
information concerning algorithmic decision-making (“ADM”) systems used by
companies. Such a database would enable the analysis of algorithms’ consequences
and help individuals make more informed decisions. Permitting access to a public
database would require the consideration of affected companies’ justified interests,
but could further overall societal trust and acceptance, by increasing control. The
contribution tries to analyze how some of the EU’s GDPR provisions (such as
Articles 20 and 35) can provide a legal basis for this endeavor. Wittner also draws
comparisons to similar regulatory approaches in other areas (such as Environmental
Law) and makes specific recommendations for action.
Miet Caes, in Chap. 11, entitled “The impact of legislation concerning access to
government data and re-use of public sector information on the use of big data in
healthcare”, turns to the use of big data in healthcare and the regulatory challenges
it poses. She addresses the challenge of data availability for the use of big data in
healthcare. Currently, big data is strictly regulated due to the diverse interests which
are at stake, such as the protection of private life, confidentiality of patient infor-
mation, intellectual property, and the financial interests of governments. Although
the government is in possession of valuable data for the big data applications of
private healthcare actors, a lot of these data remain out of their reach. Caes aims to
demonstrate that some of the aforementioned legal limitations unnecessarily or
disproportionally hinder the use of big data by private healthcare actors.
Chapter 12, by Sascha van Schendel, is entitled “The Fundamental Challenges of
Risk Profiling used by Law Enforcement: examining the cases of COMPAS and
SyRI”. Van Schendel delves into the use of big data by national Law Enforcement
Agencies. Risk profiling in the age of Big Data in the law enforcement sector turns
the traditional practices of searching for suspects or determining the threat level of a
suspect into a data-driven process. Van Schendel observes that the increased use of
new technological developments, such as algorithms, automated decision-making,
and predictive analytics, creates fundamental challenges. Risk profiling is fre-
quently used in the United States and is becoming more prominent in national law
enforcement practices in Member States of the European Union. Van Schendel
examines the fundamental challenges that this development brings using risk pro-
filing examples, namely SyRI (from the Netherlands) and COMPAS (from the
United States). In particular, she examines several issues arising from these practices.
In the second part of Part III, the focus turns towards the protection of personal
data.
In Chap. 13, Hannah Smith addresses the regulation of data re-use for research
and the associated challenges of innovation and incipient social norms. Her chapter
explores the presence and potential drivers of divergences between the law and
individuals’ constructions of appropriate data processing. Smith’s analysis draws
upon the EU’s GDPR and data collected from two focus groups convened for this
study. She proposes that whilst the legal approach to data processing is unaltered by
innovations in data processing, this novelty caused participants to modify their
views. The uncertainty resulting from innovative data processing and disillusion-
ment with its supposed benefits prompted desires for greater control over personal
data and a questioning of the ‘public interest’ in research.
Ayşe Necibe Batman analyses the European Cloud Service Data Protection
Certification in Chap. 14. She observes that cloud computing is both an econom-
ically promising and an inevitable technology. Nevertheless, some deployment
models can be a source of risk in terms of the protection of personal data. The risks
of data loss and data breach can hold private entities back from using cloud ser-
vices. Batman assesses Articles 42 and 43 of the GDPR, which provide a new
auspicious framework for certification mechanisms to minimize these risks. She
observes further that these articles do not specify any criteria for certification
mechanisms and are also technology-neutral. To be implementable, the certification
criteria ought to be defined and a transparent procedure needs to be established. An
effective data protection certification mechanism can serve to build trust and resolve
the existing uncertainties limiting the broader usage of cloud services: certification
implies a presumption of conformity with regulatory standards, and may be seen as
an indicator of quality, which can lead to a distinction on the market.
In Chap. 15, entitled “Data Privacy Laws’ Response to Ransomware Attacks: A
Multi-Jurisdictional Analysis”, Magda Brewczyńska, Suzanne Dunn and Avihai
Elijahu carry out a multi-jurisdictional analysis on ransomware and data privacy
protection. Considering that security has always been central to the protection of
personal data, this chapter proposes an analysis of ransomware attacks through the
lens of the well-established information security model, i.e. the CIA (confidentiality,
integrity, and availability) triad. Using these three basic security principles, the
chapter examines whether ransomware will be considered a data breach under data
privacy laws and what the legal implications of such breaches are. In order to illustrate
these points, Brewczyńska, Dunn and Elijahu focus on ransomware attacks that target
organisations that process personal data and highlight current data privacy laws from
three different jurisdictions, namely the European Union (EU), Israel and Canada.
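The triad lends itself to a schematic rendering. The Python sketch below is a
simplification made for this introduction rather than the authors’ method: it merely
maps the observable effects of a hypothetical ransomware incident onto the three
principles, while the question of whether an affected principle amounts to a
notifiable data breach remains one for each jurisdiction’s law:

def cia_impact(encrypts_data: bool, exfiltrates_data: bool, alters_data: bool) -> dict:
    # Map a (hypothetical) ransomware incident onto the CIA triad.
    return {
        # data viewed or taken by the attacker
        "confidentiality": exfiltrates_data,
        # unauthorised modification, here including encryption of the records
        "integrity": alters_data or encrypts_data,
        # data made inaccessible to the organisation itself
        "availability": encrypts_data,
    }

# classic encryption-only ransomware: availability (and arguably integrity) are
# impacted even though nothing was copied out of the organisation
print(cia_impact(encrypts_data=True, exfiltrates_data=False, alters_data=False))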

2.4 The Way Forward

It is my hope that this edited volume will be the start of a series in which young
scholars of technology regulation are provided the opportunity to present their
research. As humanity evolves, so will technology. Whether technology will ever
overtake human ingenuity is an open question beyond the scope of this contri-
bution. However, the fact that it can be raised should provide sufficient inspiration
for young scholars to continue their research endeavors in this exciting field.
At this stage, the editor would like to express her gratitude to the contributors to
this volume, as well as to the reviewers and participants in the conference, whose
participation in this project was instrumental in the creation of this edited volume.
In particular, the editor and the contributors would like to thank all discussants/
reviewers that contributed to this volume: Prof. Ronald Leenes, Prof. Eleni Kosta,
Dr. Merel Noorman, Dr. Inge Graef, Dr. Martin Husovec, Dr. Bo Zhao, Dr. Sabrina
Röttger-Wirtz, Dr. Colette Cuijpers and Dr. Aaron Martin.

References

Bonnín Roca J, Vaishnav P, Morgan MG, Mendonça J, Fuchs E (2017) When risks cannot be
seen: Regulating uncertainty in emerging technologies. Research Policy 46:1215–1233
Coglianese C, Lazer D (2003) Management‐Based Regulation: Prescribing Private Management to
Achieve Public Goals. Law & Society Review 37:691–730

Leonie Reins is an Assistant Professor at the Tilburg Institute for Law, Technology, and Society
(“TILT”) at Tilburg University in the Netherlands. Previously she was a Post Doctoral Researcher
at KU Leuven, Belgium where she also wrote her Ph.D. thesis on the coherent regulation of energy
and the environment in the EU. Leonie completed an LL.M. in Energy and Environmental Law at
KU Leuven, and subsequently worked for a Brussels-based environmental law consultancy,
providing legal and policy services for primarily public sector clients. Leonie’s research focuses on
the intersections of international and European energy, climate and environmental law.
Part I
New Technologies and Impacts on
Democratic Governance
Chapter 3
Between Freedom and Regulation:
Investigating Community Standards
for Enhancing Scientific Robustness
of Citizen Science

Anna Berti Suman

Contents

3.1 Introduction: Citizen Science at the Intersection Between Freedom and Regulation
3.2 Defining Citizen Science and Community Standards
3.3 Theoretical Justification of Citizen Science as a Legitimate Method and Practice
3.4 Tensions Between Expert Science and Citizen Science
3.4.1 Insights from the Literature
3.4.2 Empirical Insights
3.5 A Critical Analysis of Community Standards
3.6 Conclusion
References

Abstract Increasingly, non-expert users engage with technology applications to
collect data about their external environment or their own physical conditions. The
practice is labelled ‘Citizen Science’, meaning participatory initiatives aimed at
including laymen in knowledge production regarding complex issues, such as
environmental health risks (e.g. radiation and air pollution, as illustrated by the
cases presented). Citizen Science finds its justification in a number of rights, such as
the citizens’ right to live in a healthy environment and the right to environmental
information. Yet the legitimacy of these initiatives is often challenged as they may

A. Berti Suman
Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University,
Cobbenhagenlaan 221, DE 5037 Tilburg, The Netherlands
e-mail: a.bertisuman@uvt.nl

not follow scientific standards and expert opinion. Despite the fact that the potential
of Citizen Science to provide new evidence to risk governance has been demon-
strated in a number of cases, e.g. remarkably in the Safecast radiation monitoring
case, the validity of Citizen Science-produced data is still questioned. One solution
for enhancing the scientific robustness of such grassroots initiatives would be to
have them regulated by scientific standards. However, regulating Citizen Science
may deprive it of its ‘grassroots’ nature. An alternative would be the application of
‘community standards’, namely norms ensuring the validity of the data produced by
the citizen scientists but developed ‘from below’, from within the community itself.
The chapter will explore the two alternatives and verify whether community
standards could be accepted as a way of regulating Citizen Science, in order to
ultimately improve its scientific robustness.


Keywords Citizen Science · community standards · validity · citizen-produced data · citizen participation · regulation

3.1 Introduction: Citizen Science at the Intersection Between Freedom and Regulation

Increasingly, non-expert users engage with technology applications to collect data
about their external environment or their own physical conditions. The practice is
labelled ‘Citizen Science’, meaning participatory initiatives aimed at including
laymen in knowledge production regarding complex issues, such as environmental
health risks (e.g. radiation and air pollution, as illustrated by the cases presented).
Citizen Science finds its justification in a number of rights, such as the citizens’
right to live in a healthy environment and the right to environmental information.
Yet the legitimacy of these initiatives is often challenged as they may not follow
scientific standards and expert opinion. Despite the fact that the potential of Citizen
Science to provide new evidence to risk governance has been demonstrated in a
number of cases, e.g. remarkably in the Safecast radiation monitoring case,1 the
validity of Citizen Science-produced data is still questioned. One solution for
enhancing the scientific robustness of such grassroots initiatives would be to have
them regulated by scientific standards. However, regulating Citizen Science may
deprive it of its ‘grassroots’ nature. An alternative would be the application of
‘community standards’, namely norms ensuring the validity of the data produced by
the citizen scientists but developed ‘from below’, from within the community itself.

1 Safecast is a global volunteer-centered Citizen Science project launched after the 2011
Fukushima Daiichi Nuclear Power Plant disaster and aimed at making freely available data about
radiation in Japan and beyond. It is based on data collected by lay people using the Safecast
bGeigie Nano, a portable radiation detector. For the Safecast platform, see
https://blog.safecast.org/about/. Last accessed 6 August 2018. Hemmi and Graham 2014.
The chapter will explore the two alternatives and verify whether community
standards could be accepted as a way of regulating Citizen Science, in order to
ultimately improve its scientific robustness.
The methodology for compiling the present chapter has been shaped by a combi-
nation of (a) a review of secondary data, including a literature review of scientific
publications; a secondary analysis of data files of earlier social research; a content
analysis of mass communication messages (such as blog posts and newspaper
articles), of Citizen Science websites, of email discussions within Citizen Science
groups (such as the European and US Citizen Science Associations’ mailing lists),
and of documents produced by organizations (such as white papers and toolkits
issued by the European Commission Joint Research Center); and (b) the collection
of elicited or primary data, including in-depth semi-structured interviews with key
persons in the Citizen Science field and in the broader participatory science domain;
in addition, a web survey with participants of Citizen Science initiatives and project
leaders and follow-up interviews have been performed.2
The choice of complementing the available secondary data sources with primary
data sources has been motivated by the novelty of the practices and technologies
here under study. Since Citizen Science, and in particular the discussion of its
scientific rigour, is a relatively new topic in the academic debate, the studies already
available were not sufficient to ground the arguments contained in this chapter. As
an ever evolving participatory method, Citizen Science and its relation to expert
science is better understood by complementing the available literature with
empirical sources.

3.2 Defining Citizen Science and Community Standards

Citizen Science in a sentence can be defined as “the active participation of lay
people in scientific research”.3 Although the hype of Citizen Science is a recent
trend,4 the practice is relatively old. It can indeed be associated with the experience
of the Cornell Lab of Ornithology,5 where the first citizen scientists have been
engaged in bird monitoring since 1997. A few years before, the term was mentioned for the
first time when Alan Irwin published the book entitled “Citizen Science as a study

2 The interviews and survey have been performed as part of the ongoing Ph.D. research of the
author (start date September 2017, end date September 2020). Ethical clearance for the data
collection has been granted by Tilburg Law School (TLS-ERB #2018/01 issued on 12 June 2018).
3 Den Broeder et al. 2017, p. 1; for ‘What Isn’t Citizen Science?’, see Eitzel et al. 2017, p. 11.
4 Citizen Science is increasingly investigated by academic scholarship and by organizations both
as a practice for contributing to science and as a phenomenon impacting on data collection
practices, on citizen behaviour, on project design and management and, eventually, also on
policy-making. See Hallow et al. 2015; Berti Suman and Van Geenhuizen (forthcoming).
5 http://www.birds.cornell.edu/page.aspx?pid=1664#. Last accessed 26 August 2018.
of people, expertise and sustainable development”.6 From the words of Irwin,
Citizen Science emerges as a method and a practice characterized by the interplay
between (lay) people and (scientific) expertise. The range of application is broader
than sustainable development, as Citizen Science encompasses the engagement of
lay people in the domains of biology, conservation and ecology; geographic
information research; epidemiology and the monitoring of health and the envi-
ronment etc.7 Despite the majority of Citizen Science projects being aimed at
amateur monitoring or being framed as a learning experience for the citizens, an
increasing number of initiatives are recognized as actually contributing to science
and even influencing policies.8
As affirmed in a recent report from the non-profit BSCS Science Learning,9
“despite initial scepticism by scientific traditionalists, citizen science has proven
itself as a method for conducting rigorous science.”10 In view of the widely rec-
ognized potential of Citizen Science it seems worth investigating what causes the
practice to be regarded as a valid source of knowledge, both for scientific and for
policy purposes. It is under this lens of analysis that this chapter approaches Citizen
Science, thus inserting itself into the flourishing scholarship that recently discussed
Citizen Science’s challenges in terms of representativeness, validity of the data in
terms of quality,11 accuracy and reliability,12 but also in terms of achieving “deep
citizen engagement and policy influence”.13 This chapter aims to shed light on the
rather scarcely researched scientific potential of Citizen Science and the conditions
under which to fully realize it, going beyond those discussions that have mainly
focused on the educational benefits for participants and which have presented
educational goals as contrasting with scientific goals.14 As recently demonstrated
by a completed project from BSCS Science Learning, the two aims would not
actually be conflicting.15 In fact, it seems possible to design guidelines to harmo-
nize both the scientific and educational benefits of Citizen Science, and to ensure
that it can actually contribute both to science and society.

6
Irwin 1995.
7
Kullenberg and Kasperowski 2016, p. 1; Berti Suman and Van Geenhuizen (forthcoming).
8
Van Brussel and Huyse 2018; Berti Suman and Van Geenhuizen (forthcoming).
9
BSCS Science Learning is an independent non-profit dedicated to transforming science
education.
10
Edelson et al. 2018. Report available at https://bscs.org/tech-report/2018-1. Last accessed 26
August 2018.
11
Van Brussel and Huyse 2018.
12
Foody et al. 2016.
13
Van Brussel and Huyse 2018, p. 1.
14
Zoellick et al. 2012; BSCS 2018, p. 1; Edelson et al. 2018.
15
BSCS Science Learning (2018) Press release: New BSCS Report Presents Guidelines for
Designing Citizen Science Projects that Merge Science and Education. Available at https://media.
bscs.org/tech-report/2018-1/bscs_citscireport_release.pdf. Last accessed 12 August 2018, p. 1;
Edelson et al. 2018.
The recognition of the potential of Citizen Science not only for social purposes,
but also for scientific aims has recently raised a lively debate on the need to adopt
quality standards in such initiatives. Since the ‘scientists’ in Citizen Science projects
are mostly laymen engaged in scientific measurements, it seems particularly
pressing that the methods followed and the tools used are scientifically sound.
The mentioned standards would consequently represent a blueprint and reference
for planning and performing Citizen Science initiatives that make sense for the
scientific community. In other words, they are nothing more than guidelines that
citizen scientists and project coordinators have to follow to ensure scientific rigour
in conducting their observations or monitoring. The reliance on such rules brings
the promise of ensuring the quality, reliability, credibility and verifiability of such
participatory projects. Obtaining this outcome is indispensable as, ultimately, only
the initiatives presenting these characteristics will be listened to by scientists and
maybe policy-makers.
Yet there are two options which entail two opposite choices. One solution to
ensure scientific rigour is that of adopting ‘expert standards’, namely standards that
have been produced by the scientific community and that are followed by experts
involved in performing similar measurements. For example, a Citizen Science
project aimed at mapping alien species should, viewed from this perspective, be
guided by the practices currently followed by scientists performing such mapping.
This approach can be labelled as more ‘institutionally driven’ compared to the other
solution proposed next. It does not overturn hierarchies of knowledge creation,
leaving the ultimate power of control over Citizen Science initiatives in the hands of
institutionally recognized experts and scientists. However, the need to hold to
expert practices may be criticized as it would entail that the laymen’s contribution
to scientific knowledge production is limited to data collection. By contrast, Citizen
Science also aims at innovating the methods for data collection. In addition, the rise
of Citizen Science can be viewed as an answer to political decision-making pri-
marily dominated by the reliance on expert opinion. Forcing Citizen Science ini-
tiatives to strictly follow expert standards may risk undermining the potential itself
of the practice to respond to a legitimacy crisis of and loss of trust in science.16
Another option is represented by the so-called ‘community standards’, which can
be defined as local norms bounding and determining acceptable practices within the
community of reference. Such standards have recently proliferated within the field of
Citizen Science projects. Examples come from the US debate, such as the DataONE
Data Management Guide for Public Participation in Scientific Research,17 the
Guidance for Quality Assurance Project Plans,18 the Volunteer Monitor’s Guide to

16 Bijker et al. 2009.
17 Wiggins et al. 2013. Available at http://www.birds.cornell.edu/citscitoolkit/toolkit/steps/accept/DataONE-PPSR-DataManagementGuide.pdf. Last accessed 27 August 2018.
18 https://www.epa.gov/sites/production/files/2015-06/documents/g5-final.pdf. Last accessed
27 August 2018.
Quality Assurance Project Plans,19 and the EPA Requirements for Quality
Management Plans20 by the United States Environmental Protection Agency (EPA).
This option has been recently welcomed by the Citizen Science community
worldwide and could be labelled as more ‘grassroots-driven’ if confronted with the
first option presented above. Sector-specific work on defining community standards
has accompanied the general discussion. Within the domain of Citizen Science aimed at
mapping biodiversity, the Chapman Principles of Data Quality21 are considered the
reference point for developing any project falling under this area. Recently, dis-
cussions on quality assurance mechanisms have multiplied in workshops organized
by the European Commission Joint Research Center together with competent
authorities and the Citizen Science community, for example in the field of detecting
Invasive Alien Species in Europe.22 The Principles, released in 2009, are expected to
be updated in the coming months. In the domain of forestry, the US Forest Service
Citizen Science unit recently issued the new Forest Service Citizen Science Project
Planning Guide and Project Plan Template,23 two tools with step-by-step guidance
for planning a citizen science project applied to forest protection. Also in the field of
participatory water monitoring, a particularly flourishing activity of community
standards-setting has emerged. In California, the Clean Water Team24 recently
published the Surface Water Ambient Monitoring Program’s (SWAMP)
Bioassessment Quality Assurance Project Plan25 containing guidelines and a toolkit
to ensure the quality of water monitoring by citizen scientists.
These community standards appear as a half-way solution: on the one hand, they
would ensure that Citizen Science practices contribute to scientific knowledge
production on the basis of valid data; on the other, they would not be ‘imposed’ by
external experts but rather developed within the Citizen Science community. In
addition, as will be shown throughout the chapter, this community is not only
formed by laymen, but often citizens engage in similar projects because they have a
form of expertise to provide. They are not acting in their professional role of

19 https://www.epa.gov/sites/production/files/2015-06/documents/vol_qapp.pdf. Last accessed
27 August 2018.
20 https://www.epa.gov/sites/production/files/2016-06/documents/r2-final.pdf. Last accessed
27 August 2018.
21 Chapman 2005.
22 https://easin.jrc.ec.europa.eu/easin/NewsAndEvents/DetailEvents/5f26e136-d914-413b-a851-393c26b25f89. Last accessed 3 January 2019. The author participated in the workshop.
23 https://www.fs.fed.us/working-with-us/citizen-science/resources. Last accessed 27 August
2018.
24 The Clean Water Team (CWT) is the citizen monitoring program of the State Water Resources
Control Board of California. The CWT is a part of the Surface Water Ambient Monitoring
Program (SWAMP).
25 https://www.waterboards.ca.gov/water_issues/programs/swamp/docs/qapp/bioassessment_qapp.pdf. Last accessed 27 August 2018.
experts, but they may well be experts either by experience (the notion of ‘experi-
ential experts’)26 or because of their amateur interest. In addition, the sample of
citizen scientists interviewed in the aforementioned empirical research on Citizen
Science for environmental policies has shown that often participants and project
coordinators are trained in the specific field to which the monitoring is applied and
thus hold a considerable expertise on the topic. What makes citizen scientists dif-
ferent from professional scientists is that they are not running the Citizen Science
project as part of the fulfilment of their professional role but out of other motiva-
tions (e.g. the desire to protect the environment, concern, curiosity, as a hobby…).
The appropriateness of reliance on community standards to ensure the scientific
rigour of Citizen Science will be assessed. The pros will be discussed, as well as
two important sources of concern related to the external uptake of such standards
and to the trade-off between quality and participation. The theoretical and empirical
analysis will inspire the formulation of conclusive reflections that may contribute to
strengthening the potential and use of community standards, in order to cope with
the challenges the adoption of these standards brings about.

3.3 Theoretical Justification of Citizen Science as a Legitimate Method and Practice

Citizen Science is not just amateur and recreational monitoring. In this chapter,
Citizen Science is regarded both as a method of production of scientific knowledge
and as a practice by which laymen enter the scientific debate. I argue that the
legitimacy of Citizen Science as a method and as a practice should be acknowledged.
Such legitimacy would derive in the first place from what Becker et al. defined as the
“cross-fertilization process” entailed in practices of participatory science.27 The
authors analysed practices of grassroots noise monitoring and underlined that the data
provided by the citizens on noise gave precious insights into the objective status of
noise pollution and also into its social perception. These insights would arguably
strengthen the societal response to shared problems such as environmental pollution
as they could complement expert knowledge with perceptions, desires and claims
from the citizens.28 However, the inclusion of non-expert knowledge in current
decision-making remains weak, as the fundamental question of whether political and
social issues are better resolved only through technical expertise or rather also
through democratic deliberation is still controversial.29
A deeper theoretical inspection of the issue would be advisable. Indeed, as
Bäckstrand rightly pointed out, the “theoretical foundations for coupling democratic

26 Berti Suman 2018a.
27 Becker et al. 2013, p. 1.
28 Berti Suman 2018b.
29 Bäckstrand 2004, p. 24.
citizen participation with scientific assessment” are still weak.30 The author opens the
way for such inspection by presenting three main arguments.31 First, she frames
Citizen Science as a tool to restore public trust in science by bridging the gap between
scientists and citizens. Secondly, Citizen Science is discussed as a way to strengthen
scientists’ ability to cope with post-normal science issues the complexity of which
demands a cooperation among the affected stakeholders. Thirdly, Citizen Science
could contribute to making science more transparent, accountable and ultimately more
democratic, thus again boosting the quality of scientific knowledge production.
Lastly, from a legal perspective, I argue that Citizen Science should be viewed as
a form of rights in action concretizing claims grounded in fundamental rights, such
as the right to participate in the democratic debate, the right to (environmental)
information, the right to health and the right to live in a healthy environment. In
addition, also to the very recent ‘right to science’32—framed as the right to enjoy
the benefits of scientific progress and its applications—and the 'right to contribute'
to the production of (environmental) data,33 which may legitimize citizens’
participation in scientific knowledge production.34

3.4 Tensions Between Expert Science and Citizen Science

3.4.1 Insights from the Literature

Citizen Science has recently been scrutinized in terms of its scientific quality. In
particular, both the Citizen Science community and interested researchers from
academia have engaged in methodological discussions to inspect to what extent
such forms of participatory science can produce sound results. These reflections are
primarily of interest for the citizen scientists themselves as the quality, reliability
and validity of the Citizen Science data is crucial for ensuring that these initiatives
are listened to at a higher institutional level, composed of scientists and
policy-makers.35
Recently, the aforementioned Citizen Science initiative Safecast36 aimed at
monitoring post-Fukushima radiation had to deal with the criticism coming from
‘institutional science’. Indeed, as an example of the confrontation between Citizen
Science and expert-based science, Safecast was criticized in an academic paper

30 Ibid., p. 27.
31 Ibid., pp. 30–35.
32 The first mention of the right to science in an official document was in 2012, when Farida
Shaheed, Special Rapporteur for the United Nations, submitted a report to the UN Human Rights
Council on the scope and application of the right to science. See Shaheed 2012.
33 Balestrini 2018.
34 Berti Suman and Pierce (forthcoming).
35 Schade et al. 2017.
36 See footnote 1 above.
which claimed that its measurements were inaccurate or even totally wrong. In
particular, the Safecast bGeigie (the Geiger counter used in the project for radiation
monitoring) was alleged to systematically overestimate the true ambient dose of
radiation. The paper was published by Cervone and Hultquist under the title
“Calibration of Safecast dose rate measurements” in the Journal of Environmental
Radioactivity.37 The Safecast group released a rebuttal38 in which they affirmed that
third-party analysis and criticism of the Safecast methods, system and results are
very welcome, either by academia or by other interested groups. However, the
Safecast scientists challenged the claims and conclusions of Cervone and Hultquist,
alleging that such claims were erroneous, stemming from misunderstandings of the
Safecast bGeigie system, from improper statistical analysis and from wrong
assumptions regarding the purpose and utility of the Safecast radiation surveys. In
defending their methods and choices, the Safecast community used a scientific
terminology and grounded their justification in scientific literature. The con-
frontation, although it may be detrimental to the Citizen Science community, can
also be a way for the practice to (re)affirm its validity, vis-à-vis possible criticisms
and provocations from the scientific community.
Furthermore, the practice of participatory science projects run by academic
institutions has shown that, despite the quality of data gathered by the citizen
scientists often being lower than that produced by scientists, the potential of Citizen
Science is in the ‘big number’ of data collection points, which would compensate
for the lower quality of the tools used by the grassroots. A recent report by Parrish
et al.,39 aptly entitled “Exposing the Science in Citizen Science”, designed a
science-based typology of Citizen Science focused “on the degree to which projects
deliver the type(s) and quality of data/work needed to produce valid scientific
outcomes directly useful in science and natural resource management”.40 In the
report, the authors illustrate the requirements for making Citizen Science a source of
“rigorous information”,41 measuring it against traditional science.
The report inspects the effectiveness of both quality assurance methods, aimed at
increasing data quality in the design and implementation phases of a Citizen Science
project, and quality control methods, aimed at checking the quality of the scientific
outputs. The authors conclude that “high quality science can be produced with
massive, largely one–off, participation if data collection is simple and quality control
includes algorithm voting, statistical pruning, and/or computational modelling”.42 As
indicated above, the great amount of data in large-scale Citizen Science projects
would compensate for the possible quality failures of non-expert measurements.
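The intuition that sheer numbers can offset noisy instruments, provided faulty
devices are screened out, can be made concrete with a small simulation. The Python
sketch below is purely illustrative: the ‘true’ dose rate, the noise level and the fault
rate are invented values, not figures taken from Safecast or from Parrish et al.:

import random
import statistics

TRUE_DOSE = 0.08   # assumed ambient dose rate (µSv/h); invented for the example

def simulate_reading(noise_sd=0.03, fault_rate=0.05):
    # One volunteer reading: noisy, and occasionally from a faulty device.
    value = random.gauss(TRUE_DOSE, noise_sd)
    if random.random() < fault_rate:
        value += random.uniform(0.5, 1.0)   # a miscalibrated or misused sensor
    return value

def prune_outliers(readings, k=3.5):
    # Drop readings far from the median (median-absolute-deviation rule), a
    # simple stand-in for the ‘statistical pruning’ the report mentions.
    med = statistics.median(readings)
    mad = statistics.median(abs(r - med) for r in readings) or 1e-9
    return [r for r in readings if abs(r - med) / mad <= k]

readings = [simulate_reading() for _ in range(10000)]
print(round(statistics.mean(readings), 3))                  # biased by faulty devices
print(round(statistics.mean(prune_outliers(readings)), 3))  # close to the assumed 0.08

The pruned mean converges on the assumed true value even though each individual
reading is far less accurate than an expert-grade instrument, which illustrates why
quality control of this kind can matter more than the precision of any single device.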

37 Cervone and Hultquist 2018.
38 See https://blog.safecast.org/2018/08/rebuttal-of-calibration-of-safecast-dose-rate-measurements-by-cervone-and-hultquist/. Last accessed 27 August 2018.
39 Parrish et al. 2018.
40 Ibid., p. 1.
41 Ibid.
42 Ibid.
A different conclusion is instead reached with regard to smaller-scale projects
where participants are engaged “in repeated, often complex, sampling”.43 In such
instances the amount of data produced is not enough to outweigh the quality
shortcomings. For those cases, the authors suggest that quality can still be ensured
by means of expert intervention both ex-ante and ex-post. Parrish et al. identify the
need for “expert-led training and well-designed materials” to guide the participants
in the initiative, and for “independent verification” of the results afterwards, per-
formed by experts.44 Although the rationale of these expert checks is clear, I see
here a possible conflict between the aims of Citizen Science to democratize science
production and the inevitable need to rely on expert opinion to validate Citizen
Science. Whereas freeing science from the exclusive reliance on expert knowledge
is a noble goal, the Citizen Science community should creatively think of ways to
ensure the production of valuable and reliable results. This may ultimately result in
a loop: Citizen Science tries to open up science to the people but, to be listened to
at the higher political and scientific levels, still has to refer back to expert science.
Under this aspect, one of the founders of the AiREAS project, Close, argued that
there is a contrast between current scientific expertise, which he situates in the field
of ‘control’, and civic science, which would be located in the field of ‘awareness’.
The two sciences may have very different goals, one being the collecting of data for
shaping scientific discourses, the other mainly aimed at making the citizens aware
of scientific matters. In both cases, data collection is just a means, not a goal.
However, such data collection has to be valid, and the citizens may need to rely on
scientific expertise to achieve this goal. For ‘traditional scientists’, this civic
impingement into classic science may be confusing, and they may feel they are
losing control of scientific production. However, this lack of control may be an
opportunity to innovate regarding scientific methods and stimulate a social progress
based on civic awareness. Yet the two sides, the experts and the citizens, need to
find a shared language to engage in dialogue and understand each other.45
Another possibility to ensure that Citizen Science is trusted by scientists and
policy-makers would be having it regulated, as other formalized data collection
practices are. Currently, Citizen Science is not subjected to uniform regulations
worldwide or nationally, nor does it follow specific pre-set standards. Yet experts of
Citizen Science (for example the European Commission Joint Research Center hub)46
have recently been advocating for the adoption of shared protocols, key-terms and
language to enhance exchangeability and verifiability of results (which would con-
figure the afore-presented ‘community standards’). In addition, from the Citizen
Science community there has recently been a flourishing production of ‘toolkits’ to
guide citizen scientists, especially the less technology-literate ones. As affirmed in the

43 Ibid.
44 Ibid.
45 Virtual discussion with Jean Paul Close, co-founder of AiREAS, on 15 October 2018. For
more information, see Sect. 3.1.
46 In-person discussion with an expert on Citizen Science at the EC JRC, Ispra, Italy, on 14 May
2018. More information in Sect. 3.1.
Citizen Sensing47 Toolkit, “[t]echnology can be a daunting aspect of citizen sensing
for many participants, especially those with little technical or scientific expertise”.48
Such toolkits aim to guide future citizen scientists in the process of choosing and
eventually building the right technology for the information they want to collect.
There are indeed two dominant approaches in the design of a Citizen Science
project: either the participants will adopt ready-made tools or they will build the
tools themselves. In the first case the citizen scientists will use technologies, e.g.
smartphones, air sensors and Geiger counters, that are already regulated and
compliant with standards. In the second case, they will assemble pieces to create a
device, often making use of open hardware tools based on open source
components-platforms such as Arduino. In the first case, there is the conceivable
need to regulate how these technologies are used when laymen operate them to
produce scientific output. In the second case, the regulatory need would concern not
only the methods used but also the design of the Citizen Science technologies.
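For the second, do-it-yourself route, the gap that design and method standards
would fill can be glimpsed from what a minimal data-collection script looks like.
The Python sketch below is hypothetical: the serial port, baud rate, line format and
plausibility range are assumptions rather than features of any real project, and a
community standard would be precisely the place where such choices are fixed:

import serial  # pyserial; assumes the home-built device prints CSV lines
               # such as "2018-08-27T12:00:00,0.08"

PORT, BAUD = "/dev/ttyUSB0", 9600   # assumed connection settings
PLAUSIBLE = (0.0, 10.0)             # assumed plausibility range for a reading

def read_measurements(n=100):
    # Collect n plausible readings from the device, skipping malformed lines
    # and values outside the agreed range.
    readings = []
    with serial.Serial(PORT, BAUD, timeout=5) as link:
        while len(readings) < n:
            line = link.readline().decode("ascii", errors="ignore").strip()
            try:
                timestamp, value = line.split(",")
                value = float(value)
            except ValueError:
                continue
            if PLAUSIBLE[0] <= value <= PLAUSIBLE[1]:
                readings.append((timestamp, value))
    return readings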

3.4.2 Empirical Insights

As detailed in the introduction, I find it appropriate to complement the available lit-
erature on the topic discussed with some of the results of an ongoing empirical study49
I am conducting on participants and project coordinators of Citizen Science projects for
environmental policies. Performing a discourse analysis of the responses collected was
especially illuminating for some of the discussions developed in this chapter.
First, an actual 'expertise' emerged in the respondents' group, supporting the
argument developed in Sect. 3.2 that many citizen scientists are scientifically
literate and in some cases even experts. I indicated there that, frequently,
participants in Citizen Science initiatives and project coordinators have
considerable expertise in the topic of the initiative, gained either through experience
or through previous training. A respondent from the Safecast case (grassroots radiation
monitoring post-Fukushima) affirmed that he50 "created an iOS app that interfaced with
analogue Geiger counters (Geiger Bot) and added mapping." He was thus able to create an
app and connect it to Geiger counters. A respondent from the AiREAS case51 said:
"My first contact with AiREAS was because of my interest in Open Data for use in my

47 Citizen Sensing can be considered a sub-set of the broader notion of Citizen Science, focused on lay people monitoring external factors through sensors and spreading the information on networks/visual representations of the collected data. More information in Berti Suman 2018a.
48 Making Sense Project 2018, p. 80.
49 See Sect. 3.1 for more details.
50 In general, the sample includes people over the age of 20 and below the age of 70, mostly male participants, of diverse nationality and language, but primarily from the Netherlands, Japan and the US.
51 AiREAS is an initiative launched in the city of Eindhoven aimed at creating an intelligent, real-time measurement system through which anyone can check the status of the quality of the local air at any time in their direct vicinity. For the AiREAS website and platform, see http://www.aireas.com/welcome-to-aireas/. Last accessed 27 August 2018.
IT development projects." Again, a form of expertise emerges. Another Safecast participant adds that "most core members do that in their spare time and are not originally nuclear scientists", but he admits: "although there is a large proportion of people with a technical background of one kind or another" (emphasis added). The
breadth of expertise is wide, ranging from nuclear sciences to health sciences. An
AiREAS respondent said he had joined the project because of "a mutual interest in the effects of environmental factors of impact on human wellbeing, in particular application of my expertise on air quality and cardiovascular health." A Safecast participant
adds: “I classify Safecast as citizen-driven, because that’s about the only thing the
participants have in common. We are artists, engineers, homemakers, teachers, and
writers. No one is excluded, and no qualifications are required” (emphasis added).
The value of the specific Citizen Science projects for improving the quality and
openness of science is stressed by the participants. One respondent aptly stated:
"Before Safecast it was normal for averages to be published and without any info
on how the data was collected, now only specifics with details down to the device
level are demanded by everyone. The entire scope of radiation monitoring shifted
because of our efforts" (emphasis added). The promise and the achieved results of the
project rank high in the view of the participants. An AiREAS participant argued
that what motivated him to join the initiative was the opportunity of "data
processing and visualization targeted on making the invisible visible and creating
awareness" (emphasis added).
Discourses on the reliability and quality of the data collected are recurrent.
A Safecast participant, for example, stated that he "got involved for reasons that
became the core purpose of Safecast: to spread reliable information" (emphasis
added). Another Safecast respondent argued: "I feel we have had
several significant impacts: (1) That "amateurs", lightly trained and loosely orga-
nized, can gather high quality data. (2) That simple devices, with moderate cost,
can be deployed rapidly in response to unexpected radiological emergencies.
(3) Sensational, purposeful misinformation CAN be countered in the Internet Age"
(emphasis added, capitalization by the respondent). The importance of the validity of the
data produced for individual and collective choices is stressed: “Safecast has made a
meaningful contribution to individuals and to society. There are families who made
important decisions about their lives. Those decisions were fraught with doubt and
uncertainty. Safecast data assuaged some of that. Governments too, may not have
been comforted by Safecast looking over their shoulder, but I believe their work
was ‘adjusted’ because Safecast set an example of openness and objectivity”
(emphasis added). The individual level and the collective are highly intertwined in
the words of the respondents, as emerges clearly from this response: “Being a
resident of Japan I’m always concerned how the events in Fukushima might affect
me here. Being able to track levels closely is a great relief. I look forward to being
able to provide similar relief to others facing and managing environmental risks.”
The potential of Citizen Science is evident in the words of some respondents. Safecast
has been "a game changer", it is argued: "just watching the online map mature and the
way all data can be downloaded and used by anyone but still retaining data history had
never happened on the same scale before." Scientific terms are not frequently used by
respondents, except in some cases, such as the opinion of an AiREAS citizen scientist
describing the project as "a good and statistically sound evaluation of collected data" (emphasis
added). Again from the AiREAS case, a participant argued that, despite "scientifically
[there being] no novel viewpoints, however [the initiative provides] very interesting
insight in the City of Eindhoven and the difficulties involved in developing 'citizen
science'" (emphasis added). Explicit recognition by respondents that they were taking
part in a Citizen Science project as such was also infrequent.

3.5 A Critical Analysis of Community Standards

The brief discourse analysis that preceded this section has shown how, within
the Citizen Science community, initiatives are perceived as legitimate and the quality
of the data produced is acknowledged. However, this is not always the case beyond
the borders of the community, where traditional knowledge and power structures
dominate. Such structures are strongly challenged by forms of scientific production that
do not fall within standard and formalized categories. However, the likelihood that
Citizen Science-produced data will 'win' this challenge and therefore be considered
valid is highly contextual. As Jasanoff wisely argued, the
legitimacy of the decision taken by the 'decision-makers' in a specific community
ultimately depends upon their ability to reconstruct a 'plausible scientific rationale for
the proposed action'.52 The plausibility of Citizen Science evidence is also heavily
interconnected with the trust that external actors have in the citizen scientists.
Trust is partially irrational, but it can also be stimulated through specific measures.
Showing the external world that a Citizen Science project was run on the basis of
previously agreed standards (the so-called 'community standards') seems to contribute
significantly to the building of trust in a grassroots initiative. However, as such
standards still spring from the citizen level, it is not certain that external actors
will consider these community standards valid in the first instance, which may make
reliance on such standards rather pointless. On the other hand, the possibility of
relying on expert standards also attracts criticism, as discussed above.
The second concern posed by community standards relates to what Parsons,
Lukyanenko and Wiersma described as the "trade-off between quality and partic-
ipation".53 In their article "Easier citizen science is better", the authors acknowl-
edge that the participation rate of laymen in research is unprecedented. They
distinguish between passive participation, namely the simple act of allowing one's
data to be collected for a scientific purpose, and active participation, where the citizen
scientists have to actively perform some form of data collection. Parsons,
Lukyanenko and Wiersma argue that, for active projects, the request to follow strict
categories of what is observed may inhibit the participation of non-experts.
For example, in the case of bird monitoring, asking the participants to classify an
observation under a certain species may either exclude those who lack the relevant

52 Jasanoff 1987.
53 Parsons et al. 2011, p. 47.
knowledge necessary for the identification or lead to misidentification. The result, according to the authors, would be this trade-off between participation and data
quality. However, the study suggests that this conflict may be reconciled by giving
participants the possibility to report “a sighting in terms of observed attributes,
eliminating the need to force a (possibly incorrect) classification.”54 This reflection
suggests that community standards ensuring data quality should be designed in a
way that encourages the participation of a broader public, by requesting observa-
tions to be captured by flexible attributes rather than strict categories. However, as
anticipated in the previous sections, there is still the risk that the scientific com-
munity will not appreciate this flexibility in standard-setting, and this again may
undermine the credibility of Citizen Science projects.

3.6 Conclusion

Throughout the chapter, it emerged from both theoretical and empirical
insights that Citizen Science can be regarded as a legitimate method and practice
with the potential to produce sound scientific knowledge. The empirical research
showed that Citizen Science groups often possess forms of expertise that they can
bring to the table. In addition, it emerged that participants genuinely recognize the
value of Citizen Science for improving the quality and openness of science. They
are also aware that the data they collect are reliable and valid, although they rarely
use scientific terminology in their discourses.
Despite the recognized potential of Citizen Science to ultimately improve sci-
ence, this outcome is achieved only if the initiative follows specific requirements,
identified here as the quality, reliability, credibility and verifiability of
the Citizen Science project. Reliance on pre-set standards appears
indispensable, considering that the 'scientists' in Citizen Science projects are often
non-expert people. This reliance seems necessary to ensure that Citizen Science
initiatives are not dismissed as amateurish practice, but are actually taken into account
and valued by scientists and, possibly, policy-makers.
Whereas imposing external standards derived from the scientific community
risks depriving Citizen Science of its own nature, the possibility of resorting to
community standards has been welcomed. It has been argued that such standards
may still ensure that Citizen Science is bound by rules in producing scientific
knowledge. In this way, Citizen Science's aims to democratize science and free it
from exclusive reliance on expert opinion are preserved.
The need for standards regulating Citizen Science practices has been framed
differently for citizen scientists using ready-made technologies, in which
case only the methods for using these technologies should be regulated, and for
participants who also build the tools, in which case regulation is required both at
the design stage and at the measurement stage.

54 Parsons et al. 2011, p. 47.
While the legitimacy of Citizen Science within the community of citizen scientists is not
in question, its external legitimacy still requires some effort and creative thinking. In order to
start this process, dominant knowledge and power structures based on expert
opinion should first be 'opened up' to welcome contributions 'from below'. The cross-
fertilization of Citizen Science practices with traditional scientific practices is com-
plementary, not alternative. Therefore, scientists should regard Citizen Science not as
an 'enemy' but as an opportunity to improve scientific knowledge production.
Consequently, the scientific community should engage in constructive discussions on
which standards guiding Citizen Science projects are acceptable and appropriate. This
mutual process of learning and understanding each other's methods, the scientists towards
the citizen scientists and vice versa, could ultimately strengthen both traditional science
and Citizen Science. Future research is needed on the strategies and mechanisms that
could guide this process of mutual learning and collective standard-setting, both from a
theoretical point of view and from an empirical perspective.

Acknowledgements I would like to thank the organizers of the Ph.D. Colloquium that created a
highly stimulating venue for discussion. A special acknowledgment goes to the Citizen Science
community which enthusiastically responded to my research, especially the Safecast and AiREAS
project coordinators and volunteers. Lastly, I sincerely thank the Brocher Foundation, Geneva,
which hosted the finalization of this piece.

References

Bäckstrand K (2004) Citizen Science for Sustainability: Reframing the Role of Experts,
Policy-Makers and Citizens in Environmental Governance. Global Environmental Politics.
https://doi.org/10.1162/152638003322757916
Balestrini M (2018) Beyond the transparency portal: Citizen data and the right to contribute.
ICT4D Blog. http://ictlogy.net/20181004-mara-balestrini-beyond-the-transparency-portal-
citizen-data-and-the-right-to-contribute/. Last accessed 23 December 2018
Becker M et al (2013) Awareness and Learning in Participatory Noise Sensing. PloS ONE. https://
doi.org/10.1371/journal.pone.0081638
Berti Suman A (2018a) Challenging risk governance patterns through citizen sensing: the Schiphol
Airport case. International Review of Law, Computers & Technology. https://doi.org/10.1080/
13600869.2018.1429186
Berti Suman A (2018b) The smart transition: an opportunity for a sensor-based public-health risk
governance? International Review of Law, Computers & Technology. https://doi.org/10.1080/
13600869.2018.1463961
Berti Suman A, Pierce R (forthcoming) Challenges for Citizen Science and the EU Open Science
agenda under the GDPR. European Data Protection Law Review
Berti Suman A, Van Geenhuizen M (forthcoming) Not just monitoring: rethinking Citizen Sensing
for risk-related problem-solving. Environmental Planning and Management
Bijker WE, Bal R, Hendriks R (2009) The Paradox of Scientific Authority: The Role of Scientific
Advice in Democracies. The MIT Press, Cambridge, Mass; London, England
Cervone G, Hultquist C (2018) Calibration of Safecast dose rate measurements. Journal of
Environmental Radioactivity (190–191):51–65
Chapman AD (2005) Principles of Data Quality, version 1.0. Report for the Global Biodiversity Information Facility, Copenhagen. http://www.gbif.org/document/80509. Last accessed 12 August 2018
Den Broeder L et al (2017) Public Health Citizen Science; Perceived Impacts on Citizen Scientists:
A Case Study in a Low-Income Neighbourhood in the Netherlands. Citizen Science: Theory
and Practice. https://doi.org/10.5334/cstp.89
Edelson DC, Kirn SL, Workshop Participants (2018) Designing citizen science for both science and education: A workshop report. Technical Report No. 2018-01. Colorado Springs, BSCS Science Learning. https://bscs.org/tech-report/2018-1. Last accessed 26 August 2018
Eitzel M et al (2017) Citizen Science Terminology Matters: Exploring Key Terms. Citizen Science: Theory and Practice:1–20. ISSN 2057-4991
Foody G et al (2016) Strategies Employed by Citizen Science Programs to Increase the Credibility
of Their Data. Citizen Science: Theory and Practice. https://doi.org/10.5334/cstp.6
Hallow B, Roetman PEJ, Walter M, Daniels CB (2015) Citizen Science for policy development:
The case of koala management in South Australia. Environmental Science & Policy. https://
doi.org/10.1016/j.envsci.2014.10.007
Hemmi A, Graham I (2014) Hacker science versus closed science: building environmental
monitoring infrastructure. Information, Communication & Society. https://doi.org/10.1080/
1369118x.2013.848918
Irwin A (1995) Citizen Science: A Study of People, Expertise and Sustainable Development.
Routledge, London
Jasanoff SS (1987) Contested Boundaries in Policy-relevant Science. Social Studies of Science 17
(2):195–230
Kullenberg C, Kasperowski D (2016) What Is Citizen Science? – A Scientometric Meta-Analysis.
PLoS ONE. https://doi.org/10.1371/journal.pone.0147152
Making Sense Project (2018) Citizen Sensing. A toolkit. ISBN/EAN: 978-90-828215-0-5. http://making-sense.eu/publication_categories/toolkit/. Last accessed 4 August 2018
Parrish JK, Burgess H, Weltzin JF, Fortson L, Wiggins A, Simmons B (2018) Exposing the
Science in Citizen Science: Fitness to Purpose and Intentional Design. Integrative and
Comparative Biology. https://doi.org/10.1093/icb/icy032
Parsons J, Lukyanenko R, Wiersma Y (2011) Easier citizen science is better. Nature 471
Schade S, Manzoni-Brusati M, Tsinaraki C, Kotsev A, Fullerton K, Sgnaolin R, Spinelli F, Mitton I (2017) Using new data sources for policymaking. EUR 28940 EN. https://doi.org/10.2760/739266, JRC109472
Shaheed F (2012) The right to enjoy the benefits of scientific progress and its applications. A/HRC/20/26, HRC, Geneva
Van Brussel S, Huyse H (2018) Citizen science on speed? Realising the triple objective of
scientific rigour, policy influence and deep citizen engagement in a large-scale citizen science
project on ambient air quality in Antwerp. Journal of Environmental Planning and
Management. https://doi.org/10.1080/09640568.2018.1428183
Wiggins A et al (2013) Data Management Guide for Public Participation in Scientific Research.
Albuquerque, DataONE
Zoellick B, Nelson SJ, Schauffler M (2012) Participatory Science and Education: Bringing Both
Views into Focus, Frontiers in Ecology and the Environment 10(6):310–313

Anna Berti Suman is a Ph.D. researcher at the Tilburg Institute for Law, Technology, and
Society. Her Ph.D. project aims at investigating how ‘Citizen Sensing’ affects the governance of
environmental risk to public health and how it can be integrated within the current frameworks for
risk governance. Her specializations are Health and Environmental Law and Technology, and
Citizen Science. She has work and research experience in the health sector (Chelsea &
Westminster Hospitals, London), Extractive Industries (Unión de Afectados y Afectadas por
Texaco, Ecuador) and Water Law (Comisión Económica para América Latina y el Caribe, and
Fundación Chile, Chile).
Chapter 4
Human Rights in the Smart City:
Regulating Emerging Technologies
in City Places

Tenille E. Brown

Contents

4.1 Introduction
4.2 Visions of the Smart City
4.3 Human Rights in the Smart City
4.3.1 Human Rights Cities
4.3.2 Human Rights as the Foundation of the Smart City
4.4 The Smart City Competition
4.4.1 Smart City Strategies in Canadian Cities
4.4.2 Law and Policy in Smart City Proposals
4.5 Implementing Rights in the Smart City
4.5.1 Strategies
4.6 Conclusion
References

Abstract The emergence of technology processes to be used and accessed in city
spaces has come swiftly, and a world of geospatially driven technology in the
emerging smart city is now upon us. The ubiquity of digital technologies in the built
environment of the smart city raises questions about how we approach, understand
and categorize technologies for law and policy purposes. Areas traditionally looked
at in relation to digital activities remain relevant; in addition, however, the smart
city raises legal concerns that are often not considered by technology experts. Issues
of human rights, and legal obligations in relation to equality and promoting access to

T. E. Brown
Faculté de droit | Faculty of Law, Université d’Ottawa | University of Ottawa, Ottawa, Canada
e-mail: tbrow030@uottawa.ca

© T.M.C. ASSER PRESS and the authors 2019
L. Reins (ed.), Regulating New Technologies in Uncertain Times, Information Technology and Law Series 32, https://doi.org/10.1007/978-94-6265-279-8_4
services, have not yet been substantively engaged with in the creation of the smart city.
This chapter argues that existing legal frameworks pertaining to human rights laws
and norms provide guidance for the developers of smart cities and must be adopted as
guiding legal frameworks in the creation of the smart city. Early as it is in smart city
processes, there is an opportunity to identify and develop the appropriate legal frame-
works to ensure the smart city protects and promotes human rights standards. Focusing
on human rights driven legal frameworks will underscore that the "smart" in the smart
city refers to more than advanced technology, and instead signals the development of
human rights legal standards that are truly human focused and equality driven.

Keywords smart cities · emerging technologies · human rights · new technologies · smart city challenge · equality · law and policy

4.1 Introduction

By 2020, it is estimated that $400 billion a year will be spent building smart cities.1
Smart cities are touted to be an efficient, advanced and innovative way to solve
challenges unique to the urban context. Adopting technologies in the city context
affords new possibilities for creating a connected and citizen-driven city for an
increasingly city-based population.2 Adoption of smart city strategies is, we are
told, smart. Technology can be used to address traffic congestion and climate
change; technology can drive economic growth, entrepreneurship and innovation;
technology can enhance the lives of residents and businesses alike. Smart cities
utilize technology to create high-speed digital infrastructure, which results in a
network of data points, sensors and possibilities for citizens to be connected.
The smart city signals a new facet of technological development. The smart city
utilizes location-driven technologies that interact with the built environment. This
new form of digital technology, which is physically situated in the city context,
raises questions about how we approach, understand and categorize technologies in
law and policy. Areas traditionally looked at in relation to digital activities remain
relevant, including laws primarily in the areas of privacy and intellectual property.
In addition, however, the geospatially-connected city raises legal concerns that are
often not considered by technology experts. Areas of law traditionally engaged with
at the city level include issues of equality, promoting access to services, promoting
ease of movement, and ensuring that city governments attain required levels of
human rights protections in the delivery of their services. These considerations
1 Bernard Marr, "How Big Data and The Internet Of Things Create Smarter Cities" (19 May 2015) Forbes. Online at https://www.forbes.com/sites/bernardmarr/2015/05/19/how-big-data-and-the-internet-of-things-create-smarter-cities/#d6658e117677. Last accessed 15 November 2018.
2 Over 80% of Canadians live in urban areas. See Future Cities Forum 2018, p. 2.
occur within the parameters of overarching legal standards contained in domestic human rights law, such as the Canadian Charter of Rights and Freedoms or the UK Human Rights Act, or, increasingly, as enshrined in regional and international human rights instruments.
As the idea of the smart city enters mainstream consciousness through activities
such as government sponsored competitions that encourage cities to compete for
monetary prizes, and also simply with the mainstreaming of the emerging tech-
nologies that are the backbone of smart cities (the internet of things, and smart
transportation systems are amongst the most widespread), there is a push to con-
sider how best to serve the needs of citizens. Engaged commentators have ques-
tioned the definition of the “smart” in the smart city, suggesting that smartness is
not simply measured by technological innovation, but also by the ability to meet the
needs and concerns of citizens.3 As citizens are increasingly less able to opt out of
technology systems, assessing how well the smart city respects its citizens' rights
will be integral to the creation of sustainable smart cities. As we move into a world
of technology driven cities, with reliance on technological processes for the
delivery of all manner of services, we must also ensure that human rights standards
and norms continue to be respected as the cornerstone of modern city development.
This chapter asks how we can embed human rights protections into the creation
of smart cities. It is suggested that existing norms, standards and laws pertaining to
human rights writ large provide a benchmark for municipal governments and
interest groups as they develop smart city processes. The current effort to broaden
our approach to innovation to focus on equity and access for citizens is correct in
intent, and in order to implement it fully we must turn to substantive sources of
rights. Existing human rights oriented law and regulation offers a detailed analysis
of the rights afforded to citizens, and outlines the accompanying obligations of govern-
ments. More than simply aspirational, reviewing human rights obligations reveals
the minimum standards required for the development of city processes, and pro-
vides guidance for the experts who are shaping the future smart city.
In Sect. 4.2 competing definitions and views of the smart city are introduced.
Section 4.3 introduces the concept of human rights standards as potential standards
for guiding the development of smart cities. The growing movement of “human
rights cities” is demonstrative of efforts to utilise human rights laws and standards at
the city level. Section 4.4 examines a case study of the smart city competition,
highlighting recent efforts to introduce smart cities through competition processes,
and reviews the law and policy information contained in select proposals and
source documents. Section 4.5 details strategies for incorporating human rights
laws into smart cities as part of a robust law and regulatory framework. Section 4.6,
the conclusion, finishes this chapter with some final points on the importance of
addressing challenges in smart cities through a human rights lens.

3 For an early examination of this, see Hollands 2008.
50 T. E. Brown

4.2 Visions of the Smart City

A singular definition of the smart city is still evolving. Broadly speaking, it incor-
porates information and communication technologies into the delivery of city ser-
vices.4 Smart cities in their current manifestations are understood as
"[technologically] instrumented and networked [cities], [with] systems [that are]
interlinked and integrated, and [where] vast troves of big urban data are being
generated [by sensors] and used to manage and control urban life in real-time."5
Thus the smart city adopts a series of sensors, networks, and internet-driven
infrastructure to measure efficiency. Smart cities are heavily dependent on technology,
but the focus is on utilizing technology to enhance service delivery. With this comes
an emphasis on the smart city understood as a process rather than a static
outcome. The United Kingdom Department for Business, Innovation and Skills
defines smart city processes as occurring when there is "increased citizen
engagement, hard infrastructure, social capital and digital technologies" which
together "make cities more livable, resilient and better able to respond to chal-
lenges."6 For Mark Deakin, the smart city is a city that utilizes ICT to meet the
demands of its citizens, a sentiment mirrored by IBM, which defines a smart city as
"one that makes optimal use of all the interconnected information available today to
better understand and control its operations and optimize the use of limited
resources."7
Increasingly, commentators have highlighted the need to ensure that we take a
human-focused approach to the smart city, and have urged us not to get lost in devel-
oping smart cities that are technologically determined. Instead, the goal for the smart
city is to utilize these technologies to improve the community.8 A definition of the
smart city from Manchester, United Kingdom, summarizes much of the interest in
smart cities from the perspective of citizen engagement. The Manchester Digital
Development Agency explains that a smart city means "'smart citizens' where
citizens have all the information they need to make informed choices about their
lifestyle, work and travel options."9 Thus smart cities are generally understood to
occur when governments harness technology to improve service production or
delivery, enhance local economies and provide residents improved access to city
governance, resulting in a better quality of life.10 An even more human-centered,
social-justice approach was offered in an early piece by Robert Hollands, who
argued that "progressive" smart cities "must seriously start with people and the

4 Komninos 2002.
5 Kitchin 2015a.
6 Department for Business Innovation and Skills 2013.
7 Cosgrove 2011.
8 Lehr 2018, p. 3.
9 See online Manchester Smarter City Programme, https://www.manchester.gov.uk/smartercity. Last accessed 31 August 2018.
10 Kitchin 2015b.
human capital side of the equation, rather than blindly believing that IT itself can
automatically transform and improve cities."11 The progressive smart city thus does
not exist apart from technological innovation, but instead utilizes technology for
purposes of "enhancing democratic debates about the kind of city it wants to be and
kind of city people want to live in." For Hollands, and others, this is particularly
important for the simple fact that technology can itself cause harm, by
deepening social inequality and entrenching the digital divide.

4.3 Human Rights in the Smart City

Given that the definition of the smart city is not fixed, it is self-evident that
multiple normative frameworks underpin smart city initiatives. The
configuration of the smart city is incredibly complex and ultimately "one finds that
[it involves] quite a diverse range of things—information technology, business
innovation, governance, communities and sustainability."12 However, the creation
of smart cities must be done with a view to looking beyond the concerns typically
associated with technology; instead we must think broadly about how rights-based
approaches can be incorporated into city places.
There is a distinction between human rights cities and smart cities. Human rights
cities are concerned with people and with ensuring minimum rights for all.
Whilst smart cities can, potentially, be people-centric in this way, they are equally,
if not more, concerned with the place of the city. One commentator observes that
the infrastructure required for smart cities is the most important consideration: "An
urban street is the most scarce, expensive piece of land and resource. Everybody
wants to be on it and they don't want to share with anyone. There's only so much of
it, so the more you can co-ordinate it, the more benefits you can get."13 Smart city
advocates are clear that this scarce resource will have to be adapted to meet the
needs of the automated vehicle. But how can we adapt the road to meet the needs of
underserviced communities to access the city? It should be a small step to con-
nect the digital technologies that address city challenges to the people who need
government support.

11 Hollands 2008, p. 315.
12 Hollands 2008, p. 306.
13 Luke Dormehl, "The Road to Tomorrow: Streets Need to be as Smart as the Cars Driving on Them" (7 November 2016). Online at http://senseable.mit.edu/news/pdfs/20161107_Wired.pdf. Last accessed 15 November 2018.
4.3.1 Human Rights Cities

How have human rights traditionally been addressed at the city level? One of the
focus areas for smart cities is creating "inclusive, sustainable and resilient
cities."14 Some human rights issues are directly concerned with and
motivated by the digital sphere. For example, the problem of the digital divide was
recognized early on as a concern undermining the assumption that technology affords
an opportunity for all people to engage with the Internet.15 The problem of the
digital divide includes both the recognition that large parts of the world do not have
access to the internet, and also that there is inequality in who is afforded the
ability to learn how to use digital tools. This concern culminated in 2016 with the
United Nations Human Rights Council adopting a resolution affirming that the rights
people hold offline must also be protected online, widely described as recognizing a
human right to the Internet.16 Non-digital rights concerns are equally appli-
cable to digital activities. Discussions on freedom of speech, gender-based violence,
and regulation in Internet spaces are likewise deeply connected to rights-based
concerns in the digital world. Beyond this specifically digital focus, there are
multiple human rights that exist and are applicable to digital processes, and to
people in city and urban contexts. Claims to gender equality, housing, the right to life,
liberty and security, the right to democratic processes, and the rights of the child, amongst
others, all have relevance to modern digital applications. They are all rights-
based, can be traced to various international and regional human rights instruments,
and they exist as principles, norms, and legal rules that apply to all citizens,
including those living in (future) smart cities, or even simply amongst digital
processes.
The applicability of human rights instruments and norms to the activities of
municipal governments has long been a facet of human rights discussion, particu-
larly in relation to the processes required for the implementation of rights standards.
Where initially human rights instruments were adopted to bind at the state level, in
recent decades there has been an increase in municipal governments and local
authorities adopting international instruments. This has resulted in the phenomenon
of "human rights cities."17 A human rights city is defined as an "urban entity or
local government that explicitly bases its policies, or some of them, on human rights
as laid down in international treaties, thus distinguishing itself from other local
authorities."18 Thus human rights cities directly incorporate regional and interna-
tional rights standards into their governance structure, for example San Francisco,

14 Impact Canada, "Creating a Platform for Bold Ideas" (undated). Online at https://impact.canada.ca/en/challenges/smart-cities. Last accessed 15 November 2018.
15 See Warschauer 2002.
16 General Assembly, Oral Revisions of 30 June, HRC, 2016, thirty-second session, A/HRC/32/L.20.
17 Oomen et al. 2016; Oomen and Baumgartel 2014, p. 726; and Marks and Modrowski 2008. On human rights in cities generally, see Davis 2007, pp. 258–286; Lozner 2008; Merry et al. 2010.
18 Oomen and Baumgartel 2014, p. 710.
which adopted CEDAW into its local city ordinances.19 Cities throughout the world
have adopted human rights as part of their government policies, including
Barcelona in Spain; Utrecht, The Hague, Middelburg, Nijmegen and Amsterdam in
the Netherlands;20 Rosario in Argentina; Washington in the United States of America;
and Gwangju in South Korea,21 amongst others.
Efforts to implement human rights standards in city spaces have resulted in the
creation of the "Global Charter-Agenda for Human Rights in the City" [the
Charter].22 The Charter was created in 2011 by United Cities and Local
Governments,23 the global platform that represents and defends the
interests of local governments before the international community. The Charter is
intended "as a tool for local governments to build more inclusive, democratic and
solidarity-based societies in dialogue with urban dwellers." Rights included
in the Charter are the right to the city, the right to participatory democracy, the right of
women and men to equality, the rights of children, the right to housing, and cultural rights,
amongst others. Other examples of efforts to incorporate the right to the city as part
of a broader agenda for human rights include the 2000 European Charter for Human
Rights in the City, the 2001 World Charter for the Right to the City, the 2006
Charter of Rights and Responsibilities of Montreal, the 2010 Montreal Charter of
Rights and Responsibilities, and the 2012 Gwangju Human Rights Charter.24
There is debate about how concrete, or how justiciable, claims to a right to the
city are. The rights listed in the Charter are mostly established and justiciable rights
enshrined in other international legal documents. Thus we see that the rights of the child
reflect provisions in the Convention on the Rights of the Child,25 while the rights to
participatory democracy and civic peace reflect the International Covenant on Civil
and Political Rights.26 Human rights cities have followed this
practice by adopting international and regional instruments into their by-laws, city
governance structures and guiding city documents. The exact role of human rights
efforts, including the right to the city, in the development of smart cities is
uncertain. Technology provides an opportunity for the citizen to engage in demo-
cratic processes with self-determination in the smart city context. But this does not
necessarily indicate an engagement with international human rights laws, or even
with the normative values associated with human rights dialogue.

19 Lozner 2008.
20 van den Berg 2016.
21 Oomen 2016, pp. 1–19.
22 UCLG Committee on Social Inclusion, Participatory Democracy and Human Rights 2012.
23 Formally adopted by the Florence UCLG World Council in 2011.
24 Purcell 2013; UCLG Committee on Social Inclusion, Participatory Democracy and Human Rights 2012.
25 Convention on the Rights of the Child, GA Res 44/25, 20 November 1989.
26 International Covenant on Civil and Political Rights, GA Res 2200A (XXI), 16 December 1966.
4.3.2 Human Rights as the Foundation of the Smart City

Government bodies must incorporate human rights considerations into the development
of smart city processes. This can be done iteratively, in that implementing human rights
norms is an ongoing, goal-based process. Taking a human rights approach ensures that
people and human rights norms are placed at the centre of technology initiatives in
urban contexts.27 Taking a human-centred approach ensures the best
possible safeguards against undesirable and unintended consequences of technology.
As well, it highlights the most laudable potential of technology: to address the
greatest human rights challenges of our day, such as utilising
data-metrics to address housing shortages, or calculating and moving food bank
supplies to people in need. Human rights must also be achieved with reference to at
least the minimum content of local, regional and international law. Adopting human
rights laws as one of the key reference points in the development of smart cities
ensures that legal obligations are fulfilled, which ought to be a central concern
for city lawyers and governments. In addition, it elevates the independence and
self-determination of the city: "Reference to human rights can form a common
language, thereby rallying different people, activities, and interests and strength-
ening social cohesion within the city; underlining a particular identity, but also
strengthening its autonomy vis-à-vis the national government."28
Technology-driven norms can be utilized to create human rights norms.29 The
focus for technologists is on ensuring access to the technology itself, assuming
egalitarian results. Adopting human rights norms as the framework for smart cities
would mark a shift from assumptions about the utility of technology to a
focus on measurable standards of human rights. Possibly one of the worst out-
comes of a smart city initiative would be for "smart" language to be adopted
without attention paid to the underlying concept. In this scenario we would
see "smart" adopted on the assumption that rights standards are automatically
met. Groups that want to harness the goodwill attached to the label could
do this with intent, but it could also happen through ignorance, where the implications of a
technological change are not fully examined. Such "smart-washing" would pre-
vent the opportunity to restructure city places as human rights driven; taking a
shallow approach would essentially forestall a real opportunity for change. Introducing
a human rights centred approach to defining the "smart" in the smart city would
provide guidance as to the existing legal standards and obligations on cities, as well
as help identify aspirational goals based on long-identified human rights needs.
Human rights will ultimately have to be claimed: "Rights are always the outcome of
political struggle. They are the manifestation, the end result of collective claims

27 Ken Greenberg, "Future Cities: A Focus on Human-Centred Urbanism Enabled by Technology" (23 February 2018). Online at http://ncc-ccn.gc.ca/posts/future-cities-a-focus-on-human-centred-urbanism. Last accessed 15 November 2018.
28 Oomen and Baumgartel 2014, p. 726.
29 de Vries et al. 2013; Postman 1993.
made by mobilized citizens."30 By any account, the smart city should rightly be
the city that adopts human rights norms as the motivating factor for city
governance.

4.4 The Smart City Competition

Worldwide there has been interest in smart cities, often with substantial government
financial support.31 In 2016, Columbus, Ohio won the United
States smart city competition, receiving 50 million dollars for a proposal
focused on infrastructure and the creation of an integrated data platform. The
European Union has prioritized the development of smart city programs within its
Horizon 2020 programme, and there are smart city competitions hosted by the
Fédération Internationale de l'Automobile (FIA) and by the Royal Institution of
Chartered Surveyors (RICS).
The Canadian Smart Cities Challenge, launched in January 2018, is demonstrative of
such competitions. The Federal call for proposals explains that "a smart cities
approach means achieving meaningful outcomes for resi-
dents through the use of data and connected technology."32 The competition was
open to all manner of governance bodies, including municipalities, local or regional
governments, and Indigenous communities. The call resulted in over 200 proposals
from municipalities and First Nations communities. A definition of what a smart
city may be is not provided by the Federal Government as part of the challenge;
instead, step one of the challenge requires participants to put together a challenge
statement which defines what that community seeks to achieve in its smart city.
Examples of challenge statements include a focus on employment: "After years of
decline, our community will transform a former industrial neighbourhood into one
of the top locations in Canada for economic growth"; on safety and security: "The
neighbourhood in our community with the highest crime rate will become safer than
the national average"; and on the empowerment of seniors: "Our community will
ensure that every senior who is able to live independently at home is empowered to
do so." Technology is at the centre of the smart cities challenge. The guidelines specify
that "there is no limit to the number of technologies" incorporated into the smart city,
which could include artificial intelligence, assistive technology, augmented reality
or virtual reality, autonomous and connected vehicles, big data analytics, cloud
computing, open data platforms, the internet of things and mobile applications.33

30 Purcell 2013, p. 146.
31 McClellan 2018.
32 Supra note 14, Impact Canada, "Creating a Platform for Bold Ideas".
33 Supra note 14, Impact Canada, "Creating a Platform for Bold Ideas".
4.4.1 Smart City Strategies in Canadian Cities

There is very little evidence of the adoption of human rights norms in the sub-
missions to the challenge, although the competition is still at its formative stages.34
In these early stages, however, it seems that the cities do incorporate notions
of equality into their proposals, though not equality vis-à-vis human rights stan-
dards, but rather assumptions that equality will be achieved through access to and
engagement with technologies.
In the City of Edmonton's submission, officials highlight the need for city-level
health strategies and develop a proposal for a "healthy city ecosystem." The
challenge statement adopted by Edmonton focuses on the "transformation of Canadian
healthcare using an unprecedented municipal approach by focusing on leveraging
relationships, health data and innovative technologies to provide a personalized
health connection and experience as unique as the health of every Edmontonian."35
A smart city proposal focusing on health is driven by the recognition that city life
increasingly involves social isolation, segregation and sedentary lifestyles. One
aspect of promoting connectivity, in order to combat the social ills of loneliness,
depression and mental health challenges, is to create a smart city that tackles the
digital divide.36 The proposal observes that the biggest challenge is sharing health
data between service providers, due to existing legal and privacy considerations.37
The proposal also implicitly acknowledges that the delivery of health services falls
constitutionally within provincial jurisdiction: "[r]ecognizing urbanization and the
increasing role residents' health affects and is affected by City services, the City of
Edmonton proposes that municipal-level intervention is necessary."38
The submission from the City of Ottawa focuses on youth. Identifying that youth
make up 22% of the population in Ottawa, the submission identifies a need to ensure
that young people are equipped with the skills and resilience to respond to economic,
technological and social changes. The challenge statement states: "Ottawa will
invest in our youth to create a smarter national capital where young people are safe,
active, healthy, and empowered to innovate and co-create our future; we will
increase youth employment by 25% and provide them with the skills, experience
and confidence needed to succeed in today's digital economy."39 The goal
of increasing employment opportunities for youth means that the focus is on social
and civic engagement, coupled with promoting health and safety. The aspects of health
34 The first stage of the smart city challenge closed in April 2018. The final stage, the selection of the final winner, had not occurred as of writing.
35 City of Edmonton 2018, p. 4.
36 City of Edmonton 2018, p. 5.
37 City of Edmonton 2018, p. 20.
38 City of Edmonton 2018, p. 3.
39 City of Ottawa 2018, section 2.1.
and safety focused on include mental health and food security.40 A key part of the
smart city for youth is creating programs that enable youth to develop technical
capabilities, including coding, promoting interest in STEM career options, and
incorporating processes of data-fication into learning and measuring activities. The
smart cities submission builds on Ottawa's plans outlined in its initiative "Smart
City 2.0."41 The focus areas for Ottawa are explained as ensuring that Ottawa is a
connected city, has a "smart" economy, which refers to economic growth and the
development of local entrepreneurs, and has innovative government.
A notable submission came from Montreal. The guiding principles for Montreal
indicate a strong willingness to engage with human rights oriented strategies. The
indicated areas of interest are housing, food security, the creation of local services
that focus on neighborhood and community life, environment and climate change, and
finally security issues. Notably, the City of Montreal refers to the international stan-
dards for food security outlined in the Rome Declaration on World Food
Security and the World Food Summit Plan of Action.
There is a heavy emphasis on technology as the driver of the smart city. Edmonton
incorporates plans to introduce artificial intelligence, assistive technologies, mobile
applications, health or medical technology, sensors, cloud computing, and open
data platforms. It also proposes to adopt a technology called a "data lake", which
refers to a storage repository that holds raw data in its native format (structured,
semi-structured, and unstructured data) using a flat architecture (rather than data
organized into files or folders). This means that they intend to have a repository of
data collected and retrievable for unknown future uses (a simplified sketch follows
below). This management of data, particularly of health data as proposed, creates
legal challenges, especially from the privacy perspective. In Ottawa, the technical
tools to be incorporated include open government data, the Internet of things,
artificial intelligence and machine learning, and a network of physical infrastructure
to support digital processes in the form of a network of innovation offices dispersed
throughout the city.42
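
For readers unfamiliar with the term, the following minimal sketch suggests what 'flat', schema-free ingestion into a data lake might look like. All names and fields are hypothetical and the sketch is not based on Edmonton's actual system; the legally salient point is that the payload is stored exactly as received, for uses that are not yet defined.

```python
# Illustrative sketch of 'data lake' ingestion: payloads are kept raw, in their
# native format, in one flat store, tagged only with minimal metadata.
# Directory and field names are hypothetical, not Edmonton's actual design.
import json
import time
import uuid
from pathlib import Path

LAKE = Path("data_lake")  # one flat directory rather than a schema-bound database


def ingest(raw_payload: bytes, source: str, content_type: str) -> str:
    """Store a payload exactly as received; interpretation is deferred to future uses."""
    LAKE.mkdir(exist_ok=True)
    object_id = uuid.uuid4().hex
    (LAKE / f"{object_id}.bin").write_bytes(raw_payload)
    metadata = {
        "id": object_id,
        "source": source,
        "content_type": content_type,
        "ingested_at": time.time(),
    }
    (LAKE / f"{object_id}.meta.json").write_text(json.dumps(metadata))
    return object_id


# Example: a health reading is retained verbatim for unknown future uses, which
# is what makes purpose limitation under privacy law difficult to apply.
if __name__ == "__main__":
    ingest(b'{"heart_rate": 72}', source="clinic-app", content_type="application/json")
```

Because nothing in such a store commits the data to a defined purpose, principles such as purpose limitation and data minimization become hard to operationalize, which is the privacy concern noted above.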
The Canadian cities that were active in the smart cities challenge are largely tech-
nologically engaged cities, already recognized as forerunners in the smart city
movement.43 These three cities have substantive open data portals and open gov-
ernment processes; indeed, Montreal was the first city in Canada to create an open
data portal. Edmonton established its Smart City Program in 201644 and recently
won an award for its open city initiative.45 The "Canadian Open Data Summit", an
important conference, was held in Ottawa in 2015 and in Edmonton in 2017.

40 City of Ottawa 2018, section 2.2, program goal 3.
41 City of Ottawa 2017.
42 City of Ottawa 2018, section 2.2.
43 Hollands 2008; Deakin and Al Waer 2011.
44 City of Edmonton 2018, p. 12.
45 See City of Edmonton, "Smart City Awards." Online at https://www.edmonton.ca/city_government/initiatives_innovation/smart-city-awards-recognition.aspx. Last accessed 15 November 2018.
All three have their own strategies and processes for their development as smart
cities that exist apart from their submissions to the federal competition. The role of
existing city norms in relation to their engagement with technology is unknown at
this point, and whether it will translate into successful applications to the Smart
Cities Challenge remains to be seen.

4.4.2 Law and Policy in Smart City Proposals

Direct engagement with legal principles, rules, and standards broadly (even
those not related to human rights standards) is minimal in these smart city proposals.
All three proposals incorporate references to privacy principles and make a
commitment to comply with key privacy legislation in Canada.46 This is not sur-
prising. Privacy commissioners from all provinces and the federal government
submitted a joint letter to the Minister of Infrastructure and Communities, whose
department administers the smart cities challenge. In the letter, the
privacy commissioners urge groups competing in the challenge to ensure that the
privacy and security of personal information are incorporated into proposals, and
ask that the selection committee consider this factor in the selection of winning
proposals.47
Within these proposals, the most overtly human rights oriented approach in the
Canadian example is the example challenge statement contained in the Federal
government guidelines under the subject heading "Be empowered and included in
society." The guidelines suggest that an appropriate challenge statement may be:
"Our community will ensure that every person without a home has access to nightly
shelter, and will connect 100 percent of vulnerable residents with the services,
activities, and programs that are known to reduce the risk of homelessness."48 If this
example were to be adopted as the motivation for a smart city, we would be close to
the creation of a smart city for human rights purposes. Other than the consideration of
privacy, there is no reference to applicable rules and regulations that might
guide smart city development. It can be assumed, however, that existing manage-
ment of data infrastructure will continue to be relevant in smart cities, and that
the principles of licensing and data ownership already utilized by these cities will grow
in importance in their smart city projects.
The focus on the development of technology-driven processes indicates an
engagement with technology on technology's terms, but not with technology as a
means to engage principles of human rights or to meet human rights standards. This
does not mean that human rights are not important for these select Canadian cities,
but rather is evidence of a choice not to put these justiciable standards at the center

46 Specifically, the Personal Information Protection and Electronic Documents Act (S.C. 2000, c. 5).
47 Office of the Privacy Commissioner of Canada 2018.
48 Supra note 14, Impact Canada, "Creating a Platform for Bold Ideas".
of proposals. This may not indicate an unwillingness to incorporate human rights
standards, but perhaps indicates an inability to manage smart city processes through
non-technological standards and norms.

4.5 Implementing Rights in the Smart City

Identifying appropriate legal frameworks to regulate smart cities can only be done
with an understanding that the technologies adopted in the smart city are geospatially
driven and locate people and places across all manner of services. This means there
is a more intimate connection with people's personal context and their daily lives
and behaviours than would ordinarily be the case for technologies in cyberspace.
The Internet has spilled over into the built environment; this is not just an example
of yet another new technology, but the creation of entirely new ways of thinking
about technology. Laura Forlano explains:
The digital and material are no longer considered to be separate and discrete entities but
rather, they have been integrated into hybrid forms such as the digitally material…There is
a need for emergent notions of place that specify the ways in which people, place and
technology are interdependent, relational and mutually constituted.49

For the smart city this means that the built environment is as much a consid-
eration in the creation of technology, as the architecture of the technology itself; and
vice-versa. There is a new co-dependency or co-creation between technologies and
places, particularly in urban contexts. Of course, these new places are regulated
through a variety of laws that concern citizens. Managing collective and individual
interests in city places requires there be thought and care given to protecting rights
of individuals and collectives alike. Within this body of laws, human rights norms
and standards could serve to provide guidance for smart city developers who are
searching for the definitional standard of “smart” in the smart city.

4.5.1 Strategies

Greater attention must be paid to developing a robust legal and regulatory framework: Broadly speaking, law and policy frameworks are essential for guiding
the development of smart cities and they need to be adopted throughout as mini-
mum guiding frameworks. The guidelines provided to Canadian cities show little
reference to law, policy and applicable standards. Instead the focus is on adopting
technologies that will in some way benefit citizens. The temptation of relying on
technology—or rather the coders, software engineers, and data architects who
create these technologies—to develop regulatory standards is not unique to smart cities. Cardullo and Kitchin argue that the empowerment of citizens can be achieved through the adoption of domain-level experts:

Using domain-level experts – bureaucrats, technocrats, specialist workers – creates efficiencies and utilises accreted knowledge to tackle issues that citizens may have little experience or knowledge of.50

49 See Forlano 2013.

This perspective is from two of the leading thinkers on smart cities and it
certainly speaks to how specialized aspects of technology can be, and to the need to develop processes for bridging knowledge gaps. Navigating emerging
technologies of the smart city requires technical and sophisticated knowledge. This
is increasingly true in the context of algorithmic governance and open data pro-
cesses. However, transparency in government and legal processes, and the sharing of information about decision-making, are central components of functioning liberal
democracies. Leaving the development of smart city processes to technologists
(coders and developers) ignores other domains of knowledge, including laws and
policies on human rights standards.
Focusing solely on laws directly related to technology no longer provides a
cohesive regulatory response: Laws typically associated with technological pro-
cesses no longer meet all of the regulatory needs of the smart city environment.
Legal regulation has traditionally occurred in a reactive fashion, responding to informational-type issues that typically arise in cyberspace, including issues
of free speech, defamation, ownership of internet spaces, and evolving demands on
concepts of privacy. Management and ownership of data rely on rules of intellectual
property (IP). The advent of big data, open data, public sector information and public-private partnerships that generate mass amounts of data, to name a few, entails elements of IP management.51 Licensing, a matter of contract law, has been adopted to manage the complex network of IP rights that exist in the software created to manage and utilize digital processes. Open licensing schemes in the form of Creative Commons licenses have been adopted as a countermeasure to privatized systems.
Licensing schemes have also been adopted to indemnify providers of information,
such as the inclusion of no-liability clauses contained in government open data
licenses. Finally, protection of privacy interests is an ongoing consideration as
technologies emerge to challenge individual privacy.52 Principles of privacy have been utilized to ensure the anonymization of data, while concerns remain about the possibility of de-anonymizing data.53 Most recently, privacy has become a prime concern with the emergence of dataveillance.54 These legal and policy approaches can be limited: they do not take into account just how startlingly different the new and emerging technologies are when compared to Internet-based technologies. In particular, these law and policy responses still largely conceive of digital activities as informational in nature.55

50 Cardullo and Kitchin 2017, p. 3.
51 Coletta et al. 2017.
52 Kitchin 2016, p. 82.
53 Sweeney 2015.
54 Sadowski 2017, pp. 6–11.
Law and regulatory responses need to incorporate human rights oriented areas
of law: Connected to this new hybrid reality of city places, regulators of smart city
technology are now working in areas that are regulated through non-technology
related laws and policies. Human rights codes, anti-discrimination laws and
equity-driven policies as adopted by government bodies are all important regula-
tions that apply to the delivery of government services in city places. At the state
level human rights laws place requirements on governments as they deliver services
and interact with citizens. For example, the Ontario Human Rights Code protects
citizens’ “right to equal treatment with respect to services, goods and facilities,
without discrimination.”56 Protected social areas include accommodation (housing),
employment, contracts, and delivery of goods, services and facilities. For each of these protected social areas there are informational guidelines and a body of
rights-interpretation carried out through jurisprudence at the human rights com-
mission. The same level of detail about human rights standards exists across all
manner of international and regional laws. Where national standards allow, inter-
national human rights norms can be directly incorporated into national laws. The
Ontario Human Rights Code, for example, affirms in its preamble principles of freedom, justice and peace in accordance with the Universal Declaration of Human Rights. Otherwise, applicable international human rights treaties can be identi-
fied based on whether states have become a signatory to a given instrument, or
regional human rights standards may be given direct effect in a state. Not only do
these legal standards provide guidance as to the minimum level of protection that
must be met in the creation of smart cities, but they also provide guidelines for how to
develop a more robust human rights driven smart city.
Bodies focused on human rights processes need to be incorporated and main-
streamed into the smart city conversation: There are many laws and regulations
applicable to city places, and these will focus on different aspects of concern or
need. As has been highlighted, these include areas of data ownership and management, licensing issues and privacy, but also all manner of rights-based concerns. There are government departments, ombudsmen, and administra-
tive bodies across all levels of government that are tasked with addressing spe-
cialized concerns and monitoring activities. The important role of these bodies was
underscored in the Canada Smart Cities challenge, where Federal and Provincial
privacy commissioners recognized that privacy laws and principles were not
highlighted in the original guidelines for competition participants. They wrote a
public letter reminding participants and the government department overseeing the
competition that there exist privacy laws applicable to the activities of government,
and in turn privacy was incorporated as a measuring standard when assessing
applicants’ proposals. Bodies that could assist with similar regulatory oversight on issues of human rights include municipal ombudsmen and human rights tribunals, and, inevitably, litigation that raises issues of constitutional interpretation. In Ontario, Canada, the Ontario Human Rights Commission has released informational guidelines for municipalities in which it highlights concerns of racial discrimination, direct and indirect, that occur at the city level.57 Just how these standards will intersect with difficult questions of data management and the use of algorithms in the smart city context is currently unknown.58 This is precisely why interest groups and oversight bodies need to be present in order to highlight areas of concern about human rights in smart cities.

55 Forlano 2013.
56 Human Rights Code 1990, section 1.
Human rights laws are not optional: This last point is a reminder as much as it is
a strategy. Examining how the smart city ought to protect and promote the rights of
its citizens is not simply a laudable goal, but will be a regulatory requirement. The
strength of a human rights claim is commensurate with existing human rights
frameworks. There may be articulations of equity and rights such as those contained in a national constitution, a provincial/state human rights code, or a regional legal body, reinforced, as in these examples, through the justice system. Or there may be more robust, aspirational human rights discussions, as exemplified by human rights cities which have embraced norms found in international law. Whatever the level of
engagement with human rights, regulatory bodies must consider these laws as they
relate to the built environment and citizens in city places. Rules and regulations
pertaining to human rights are not optional.

4.6 Conclusion

The importance of defining the content of “smart” has been recognised in the early
response to the emergence of smart city processes. Whether “smart” indicates that smart cities ought to focus on service delivery or the optimization of resources, or ought to be citizen-oriented, is not settled. This chapter has argued that existing legal
frameworks pertaining to human rights laws and norms provide guidance for
developers of smart cities and must be adopted as guiding legal frameworks in the
creation of the smart city. These obligations, laws, and standards serve to provide
guidance for government bodies and regulators as smart cities develop. Although
smart city processes are in relatively early stages, the creation of the smart city has
thus far included minimal efforts to ensure that the applicable legal and regulatory
framework is created with reference to existing legal frameworks for the realisation
of human rights at the domestic, regional and international level.
Smart city advocates must focus on the target population and beneficiaries of a
smart city project. Smart cities will not be created in a vacuum, but instead will
bump up against challenges, new and old. New challenges relate to the development of technology, software and hardware, to ensuring that city infrastructure can be adapted to meet the needs of emerging technologies, and to developing the capabilities of a workforce to engage with smart technologies. “Old” challenges relate to access to housing, discrimination, movement around the city and equality in government spending throughout the city. Through early activities in smart city processes, participants have recognised the need to promote “rights”, and it is apparent that rights in this context refer to the “new” set of challenges in developing technology processes. It is not clear how participants intend to address “old” challenges, which are correctly understood as ongoing. Human rights cities offer an understanding of how cities can develop by focusing on resilient, sustainable, and inclusive processes as components of the smart city.

57 Ontario 2010.
58 Pasquale 2015.
The human rights perspective would drive us towards a smart city that diligently and specifically meets the legal standards that promote the dignity of the person. Human rights open up possibilities for accountability and measurability and
also create positive obligations that government bodies must meet. Knowing that
human rights standards already exist worldwide, and that they place non-delegable legal obligations on governments, the question driving smart cities as we move forward must be: how are government bodies going to utilise human rights as the
motivating and organizing factor of the smart city?

References

Cardullo P, Kitchin R (2017) Being a ‘Citizen’ in the Smart City: Up and Down the Scaffold of
Smart Citizen Participation. https://osf.io/preprints/socarxiv/v24jn. Accessed 31 August 2018
City of Edmonton (2018) Smart Cities Challenge Submission. https://smartcities.edmonton.ca. Accessed 31 August 2018
City of Ottawa (2017) Smart City 2.0. https://documents.ottawa.ca/sites/documents.ottawa.ca/files/smart_city_strategy_en.pdf. Accessed 31 August 2018
City of Ottawa (2018) Future Ready Youth: City of Ottawa Submission to Infrastructure Canada’s Smart Cities Challenge. https://ottawa.ca/en/city-hall/public-engagement/projects/smart-city-20-ottawas-smart-city-strategy. Accessed 31 August 2018
Coletta C, Heaphy L, Perng SY, Waller L (2017) Data-driven Cities? Digital Urbanism and its
Proxies. Tecnoscienza 8:5–18
Cosgrove M et al (2011) Smart Cities Series: Introducing the IBM City Operations and
Management Solutions. IBM
Davis M (2007) Thinking Globally, Acting Locally: States, Municipalities, and International
Human Rights. In: Soohoo C et al (eds) Bringing Human Rights Home: A History of Human
Rights in the United States. University of Pennsylvania Press, Pennsylvania, pp 258–286
Deakin M, Al Waer H (2011) From Intelligent to Smart Cities. Intelligent Buildings International
3:140–152
Department for Business Innovation & Skills (2013) Smart Cities Background Paper. https://
assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/
246019/bis-13-1209-smart-cities-background-paper-digital.pdf. Accessed 31 August 2018
de Vries MJ, Hansson SO, Meijers AWM (eds) (2013) Norms in Technologies. Springer, The
Netherlands
Dormehl L (2016) The Road to Tomorrow: Streets Need to be as Smart as the Cars Driving on
Them. http://senseable.mit.edu/news/pdfs/20161107_Wired.pdf. Accessed 31 August 2018

Forlano L (2013) Making Waves: Urban Technology and the Coproduction of Place. First Monday
18(1). http://firstmonday.org/ojs/index.php/fm/article/view/4968/3797
Future Cities Forum (2018) Smart Leaders for Smart Cities: Summary Report. https://ottawa.
impacthub.net/2018/05/07/what-do-we-mean-when-we-talk-about-smart-cities/?mc_cid=
598935aa48&mc_eid=f78159c2e1. Accessed 31 August 2018
Greenberg K (2018) Future Cities: A Focus on Human-Centred Urbanism Enabled by Technology.
http://ncc-ccn.gc.ca/posts/future-cities-a-focus-on-human-centred-urbanism. Accessed 31
August 2018
Hollands RG (2008) Will the Real Smart City Please Stand Up? Intelligent, Progressive or Entrepreneurial. City 12:303–320
Human Rights Code, R.S.O. 1990, c. H.19. (2018) Current March 31, 2018. https://www.ontario.ca/laws/statute/90h19. Accessed 23 September 2018
Kitchin R (2015a) Data-driven, Networked Urbanism. http://www.spatialcomplexity.info/files/
2015/08/SSRN-id2641802.pdf. Accessed 31 August 2018
Kitchin R (2015b) The Promise and Perils of Smart Cities. https://www.scl.org/articles/3385-the-promise-and-perils-of-smart-cities. Accessed 31 August 2018
Kitchin R (2016) Getting Smarter About Smart Cities: Improving Data Privacy and Data Security.
Data Protection Unit, Department of the Taoiseach, Dublin
Komninos N (2002) Intelligent Cities: Innovation, Knowledge Systems and Digital Spaces. Spon
Press, London/New York
Lehr T (2018) Smart Cities: Vision on-the-Ground. In: McClellan S et al (eds) Smart Cities:
Applications, Technologies, Standards and Driving Factors. Springer, Cham, Switzerland,
pp 3–15
Lozner SL (2008) Diffusion of Local Regulatory Innovations: The San Francisco CEDAW
Ordinance and the New York City Human Rights Initiative. Columbia Law Review 104:768–
800
Marks SP, Modrowski KA (2008) Human Rights Cities: Civic Engagement for Societal
Development. UN Habitat, New York
Marr B (2015) How Big Data and The Internet Of Things Create Smarter Cities. https://www.forbes.com/sites/bernardmarr/2015/05/19/how-big-data-and-the-internet-of-things-create-smarter-cities/#d6658e117677. Accessed 31 August 2018
McClellan S, Jimenez JA, Koutitas G (eds) (2018) Smart Cities: Applications, Technologies,
Standards, and Driving Factors. Springer, Cham, Switzerland
Merry SE, Levitt MS, Yoon D (2010) Law From Below: Women’s Human Rights and Social
Movements in New York City. Law & Society Review 44:101–128
Office of the Privacy Commissioner of Canada (2018) Joint letter from privacy commissioners to
Amarjeet Sohi, Minister of Infrastructure and Communities. https://www.priv.gc.ca/en/opc-
news/news-and-announcements/2018/let_sc_180424/. Accessed 31 August 2018
Ontario Human Rights Commission / Commission ontarienne des droits de la personne (2010)
Anti-Racism and Discrimination for Municipalities. http://www.ohrc.on.ca/sites/default/files/
attachments/Anti-racism_and_anti-discrimination_for_municipalities%3A_Introductory_
manual.pdf. Accessed 23 September 2018
Oomen B (2016) Introduction: The Promise and Challenges of Human Rights Cities. In:
Oomen BM et al (eds) Global Urban Justice: The Rise of Human Rights Cities. Cambridge
University Press, Cambridge, pp 1–19
Oomen B, Baumgartel M (2014) Human Rights Cities. In: Mihr A, Gibney M (eds) The Sage
Handbook of Human Rights. Sage, Los Angeles, pp 709–730
Oomen B, Davis MF, Grigolo M (eds) (2016) Global Urban Justice: The Rise of Human Rights
Cities. Cambridge University Press, Cambridge
Pasquale F (2015) The Black Box Society: The Secret Algorithms That Control Money and
Information. Harvard University Press, Cambridge
Postman N (1993) Technopoly: The Surrender of Culture to Technology. Knopf, New York
Purcell M (2013) Possible Worlds: Henri Lefebvre and the Right to the City. Journal of Urban
Affairs 36:141–154

Sadowski J (2017) Access Denied: Snapshots of Exclusion and Enforcement in the Smart City. In:
Shaw J, Graham M (eds) Our Digital Rights to the City, pp 6–11. https://meatspacepress.org/
our-digital-rights-to-the-city/. Accessed 31 August 2018
Sweeney L (2015) Only You, Your Doctor, and Many Others May Know. https://techscience.org/
a/2015092903. Accessed 31 August 2018
UCLG Committee on Social Inclusion, Participatory Democracy and Human Rights (2012) Global
Charter Agenda for Human Rights in the City. https://www.uclg-cisdp.org/en/right-to-the-city/
world-charter-agenda. Accessed 31 August 2018
van den Berg E (2016) Making Human Rights the Talk of the Town: Civil Society and Human
Rights Cities, a Case Study of The Netherlands. In: Oomen B et al (eds) Global Urban Justice:
The Rise of Human Rights Cities. Cambridge University Press, Cambridge, pp 44–63
Warschauer M (2002) Reconceptualizing the Digital Divide. First Monday 7(2). http://
firstmonday.org/ojs/index.php/fm/article/view/967/888. Accessed 31 August 2018

Tenille E. Brown is a Ph.D. candidate and part-time professor in the Faculty of Law at the
University of Ottawa. Her research examines the intersection between technology, law, and the
application of geographical insights to property and technology law. Tenille is a member of the
Human Rights Research and Education Centre at the University of Ottawa, and a barrister and
solicitor at the Bar of Upper Canada.
Chapter 5
Automated Driving and the Future of Traffic Law

Nynke E. Vellinga

Contents

5.1 Introduction 68
5.2 Current Developments 69
5.3 Technical Regulations 70
5.4 Civil Liability 71
5.5 Traffic Laws 73
5.5.1 Rules of the Road 74
5.5.2 The Notion of ‘Driver’ 74
5.5.3 The Driver of an Automated Vehicle 76
5.5.4 Automation and Traffic Laws 77
5.5.5 Adopting a New Approach 78
5.6 Final Remarks 80
References 80
Case Law 81

Abstract Fully automated vehicles that can operate without human interference
are getting closer to reality. Fully automated vehicles are expected to offer many
benefits, from limiting the need for parking space to increased road traffic safety.
Besides all the technical challenges, this development also gives rise to several legal
questions. This is not surprising given that most national and international traffic
laws are based on the notion that a human driver is behind the wheel. What is the
legal consequence of letting the vehicle drive itself? This contribution will focus on
the legal challenges automated driving poses for traffic law. Other legal questions
will also be touched upon, for instance questions regarding liability and insurance,
but the emphasis will lie on questions regarding traffic laws as the answers to those

N. E. Vellinga (&)
Faculty of Law, University of Groningen, PO Box 716, 9700 AS Groningen, The Netherlands
e-mail: n.e.vellinga@rug.nl

© T.M.C. ASSER PRESS and the authors 2019 67


L. Reins (ed.), Regulating New Technologies in Uncertain Times,
Information Technology and Law Series 32,
https://doi.org/10.1007/978-94-6265-279-8_5
68 N. E. Vellinga

questions can influence the answers to the other legal questions. This contribution
will discuss if an automated vehicle still has a driver within the meaning of the
Geneva Convention on road traffic 1949 and the Vienna Convention on road traffic
1968, that form the base of many national traffic laws across the globe. It will be
explored how, if necessary, these Conventions can be revised in order to accom-
modate automated driving. Inspiration will be drawn from national and interna-
tional maritime traffic laws and from international aviation traffic law.

   
Keywords automated driving traffic law driver liability automated vehicle 
law

5.1 Introduction

In March 2018, a woman was killed in a road accident in Tempe, Arizona, United
States. She was hit by a vehicle when she crossed the street, walking her bike. In
2017, over 40,000 people were killed in motor vehicle accidents on the roads of the
United States.1 That is over 100 fatalities per day. Yet it was the death of the woman in Tempe that made the headlines. Why? The vehicle that killed her was an automated test vehicle from taxi service Uber, and the accident was reported across the globe as the first fatal accident with a self-driving car. The vehicle was equipped with cameras,
radars, software, etc. A human (or ‘safety driver’)2 behind the steering wheel was supposed to take over in case the hardware or software failed. However, apparently neither the
driver nor the vehicle itself braked for the woman crossing the road in Tempe.3 This
terrible event caused the discussion on the safety of automated vehicles to flare
up. It brought various legal questions concerning automated driving to the attention
of the general public. Not only did the question of who was liable in case of an accident caused by an automated vehicle4 receive attention, but also the conditions under which automated vehicles are allowed on public roads5 and the technical requirements a vehicle needs to meet. This contribution aims to give an analysis of some of these legal challenges on the road ahead. After discussing technical regulations and liability issues, the challenges around traffic laws will be explored in more depth. In discussing a new approach to road traffic laws, inspiration will be drawn from international maritime and aviation traffic law.

1 National Safety Council (2017) Estimates Show Vehicle Fatalities Topped 40,000 for Second Straight Year. www.nsc.org/road-safety/safety-topics/fatality-estimates. Last accessed 14 August 2018.
2 Hull et al. (2018) Hyperdrive. Uber Crash Highlights Odd Job: Autonomous Vehicle Safety Driver. www.bloomberg.com/news/articles/2018-03-23/uber-crash-highlights-odd-job-autonomous-vehicle-safety-driver. Last accessed 14 August 2018.
3 Hawkins AJ (2018) Uber’s self-driving car showed no signs of slowing before fatal crash, police say. www.theverge.com/2018/3/19/17140936/uber-self-driving-crash-death-homeless-arizona. Last accessed 14 August 2018.
4 The relatives of the victim in the accident in Tempe settled with Uber: Reuters (2018) Uber settles with family of woman killed by self-driving car. www.theguardian.com/technology/2018/mar/29/uber-settles-with-family-of-woman-killed-by-self-driving-car?CMP=Share_iOSApp_Other. Last accessed 14 August 2018.
5 The State of Arizona regulated barely anything at the time of the accident.

5.2 Current Developments

Although technical developments are moving along fast, there is not yet a fully automated vehicle that drives itself from A to B without human interference on the public roads today. Most vehicles on public roads today will not be scored higher
than the so-called Level 2. Level 2 means that the vehicle is under specific cir-
cumstances able to control the longitudinal and lateral motions of the vehicle, but it
needs a human driver to supervise the vehicle and to act when necessary.6 This
Level 2, and other levels of automation, are formulated by SAE International.7 The Levels range from Level 0 (no driving automation), through Level 1 (driver assistance), Level 2 (partial driving automation), Level 3 (conditional driving automation) and Level 4 (high driving automation), to Level 5 (full driving automation). This contribution will focus on Level 5 vehicles, which are vehicles able to operate
without human supervision under all circumstances, and Level 4 vehicles during the
period of the trip in which they drive completely independently of humans.8 The moments when Level 4 vehicles need a human to drive, for instance in complex situations, will not be discussed given the brevity of this contribution. Terms like
‘automated vehicle’, ‘autonomous vehicle’ and ‘self-driving car’ are used by the media to describe a vehicle that can drive itself without human interference for an entire
trip. In this contribution, only the term ‘automated vehicle’ is used to describe Level
5 and Level 4 vehicles where the vehicle is operating on its own, without human
interference.
At the moment, tests with automated vehicles are taking place on public roads across
the globe. The test vehicles of Waymo, a Google company, have driven over 8 million
miles on US roads,9 whilst Volvo is engaging the public in a self-driving vehicle project
on the streets of Gothenburg.10 In several U.S. States, regulation concerning the testing of automated vehicles is already in place.11 The German legislator clarified the Straßenverkehrsgesetz,12 and the Dutch parliament is currently reviewing a proposal to allow the testing of automated vehicles on public roads without a driver inside the vehicle.13 Many governments are stimulating the development of automated driving technologies, as it is expected that automated vehicles will contribute significantly to road safety by eliminating human error. Currently, human error is the cause of around 90% of motor vehicle accidents.14 Automated vehicles could help battle congestion, improve the mobility of people currently unable to drive, etc.15 Before automated vehicles become available to the general public, several legal challenges will have to be overcome.

6 SAE 2018.
7 SAE 2018.
8 SAE 2018.
9 On the road. www.waymo.com/ontheroad/. Last accessed 14 August 2018.
10 Volvo Drive me. www.volvocars.com/intl/buy/explore/intellisafe/autonomous-driving/drive-me. Last accessed 14 August 2018.
11 See for instance National Conference of State Legislatures: Autonomous Vehicles, Self-driving vehicles enacted legislation. www.ncsl.org/research/transportation/autonomous-vehicles-self-driving-vehicles-enacted-legislation.aspx (last accessed 14 August 2018) for an overview.

5.3 Technical Regulations

Automated vehicles will not just pop up on the public roads without any sort of
safety checks. These vehicles will have to be approved by a vehicle authority, just
like a conventional vehicle. Within the European Union, the approval given by the approval authority of one Member State is recognized in the other Member States.
This approval can be issued to a single vehicle, but a type of vehicle can also be
approved.16 The approval authority has the task of approving vehicles before the
vehicles are allowed on public roads. A vehicle is approved when it satisfies several
technical regulations, among them regulations issued by the United Nations
Economic Commission for Europe.17 These regulatory acts include rules on sound
levels, emissions, seat strength, lamps, braking, steering equipment, and so on.18
See for example UNECE Regulation No. 79, 5.4.1.1.:
“Any fault which impairs the steering function and is not mechanical in nature must be
signalled clearly to the driver of the vehicle.(…)”

12 It is now allowed, under specific conditions, to use highly or fully automated driving functions (“hoch- und vollautomatisierte Fahrfunktion”), see § 1a, § 1b Straßenverkehrsgesetz. See also (2017) Straßenverkehrsgesetz für automatisiertes Fahren geändert. www.bundestag.de/dokumente/textarchiv/2017/kw13-de-automatisiertes-fahren/499928. Last accessed 14 August 2018.
13 Kamerstukken II 2017/18, 34838. See http://www.zoek.officielebekendmakingen.nl/dossier/34838 (Accessed 14 August 2018) for the latest developments.
14 NHTSA, Traffic Safety Facts, February 2015. See also Smith 2013.
15 See for instance (2018) Self-driving cars offer huge benefits—but have a dark side. Policymakers must apply the lessons of the horseless carriage to the driverless car. www.economist.com/news/leaders/21737501-policymakers-must-apply-lessons-horseless-carriage-driverless-car-self-driving. Last accessed 14 August 2018.
16 Article 6ff, Directive 2007/46/EC.
17 See Articles 4, 34–35 and Annex IV of Directive 2007/46/EC. The EU is a Contracting Party to the UNECE Revised 1958 Geneva Agreement by virtue of Council Decision 97/836/EC or subsequent Council decisions.
18 UNECE World Forum for Harmonization of Vehicle Regulations (WP.29) is responsible for developing and updating these regulations. www.unece.org/trans/main/wp29/introduction.html. Last accessed 14 August 2018.

As the example illustrates, in technical provisions reference can be made to the notion of ‘driver’. At the moment, the driver within the meaning of these regula-
tions is the conventional driver, with his hands on the steering wheel and his feet on
the pedals. However, an automated vehicle does not have a conventional driver.
How can these technical provisions nevertheless accommodate automated driving?
A novel interpretation of the notion of ‘driver’ in this context could provide a
solution. This approach is taken by the United States National Highway Traffic
Safety Administration (NHTSA) regarding some Federal Motor Vehicle Safety
Standards (FMVSS).19 For instance, for the purpose of FMVSS no.
101 Section 5.1.1, the NHTSA interprets the self-driving system of the vehicle as
the driver.20 FMVSS no. 101 Section 5.1.1 states:
“The controls listed in Table 1 and in Table 2 must be located so they are operable by the
driver under the conditions of S5.6.2”

Tables 1 and 2 list different controls and telltales, such as turn signals, brake system
malfunction and the horn. NHTSA states that “If no human occupant of the vehicle
can actually drive the vehicle, it is more reasonable to identify the “driver” as
whatever (as opposed to whoever) is doing the driving.”21 This approach can also
be used regarding the UN regulations and other technical provisions. However,
some technical regulations might need to be rewritten as they might not be suitable
for the described interpretation. Besides the need to rewrite provisions, the rise of
automated vehicles will also give rise to new technical regulations; for instance,
regarding the sensors and cameras an automated vehicle needs to be equipped with.
Technical regulations could also reflect the outcome of the debate on ethical issues
concerning automated driving.22 The outcome of the highly debated ‘trolley problem’ (if
a fatal accident is unavoidable, who should the vehicle protect/kill?) can be translated
into technical requirements. For instance, it can be required that a vehicle is programmed
in such a way that it, in an unavoidable accident, does not favor one road-user over the
other, but tries to limit the number of victims and severity of the injuries.

5.4 Civil Liability

A question that is often raised by the media is one concerning civil liability: if an automated vehicle causes an accident, who will pay the damages? This question needs answering, just like the questions regarding technical regulations discussed in the last section, before automated vehicles become available to the general public. After all, the outcome of this liability question will be the deciding factor regarding who needs to be insured against damage caused by the automated vehicle. The liability question is in turn influenced by traffic laws, by whether or not a traffic rule has been violated, as will be discussed in the next section.

19 The United States does not use a system of type-approval. Instead, manufacturers have to self-certify the compliance of their vehicles with the FMVSS.
20 Letter from NHTSA to Chris Urmson 2016.
21 Letter from NHTSA to Chris Urmson 2016.
22 See on ethical dilemmas concerning automated driving: Ethik-Kommission automatisiertes und vernetztes Fahren 2017.
Automated vehicles come with new risks, such as the risk of hacking or the risk of a
software failure. An in-depth study of the liability issues concerning automated driving
goes beyond the scope of this contribution. In this chapter the position of some, not all,
stakeholders that could face liability claims is discussed only briefly, with the aim of providing a short insight into the questions surrounding liability for damage caused by automated vehicles. It is important to note that the answer to the question of who is liable
for damage caused by an automated vehicle very much depends on national law.
The user of the vehicle—the person dispatching the vehicle and determining its
destination—will likely not be liable when the automated vehicle is involved in an
accident. After all, the user has barely any influence on the behavior of the vehicle.
Nevertheless, the user might still be held liable in case a risk-based liability regime
applies or when the user was at fault because, for instance, he ignored certain
warnings given by the vehicle.23
There might also be circumstances under which the owner or keeper of the
vehicle is liable for the damage caused by the automated vehicle.24 Depending on
national legislation, the owner can be liable for damage caused by the automated
vehicle if he omitted to install an update that could have prevented the accident.
However, it is likely that there will be a shift away from traffic liability to
product liability once automated vehicles are a normal sight on the public
roads.25 It is likely that most accidents with automated vehicles are not caused by
human error but by the hardware and software of the vehicle, for instance a mal-
functioning sensor or a fault in the software. The liability of the manufacturer for a
defective product is, within the European Union, governed by the Product Liability
Directive of 1985 (Directive 85/374/EEC).26 The manufacturer of the automated
vehicle, or parts of the vehicle (see Article 5 Product Liability Directive) can be
liable for the damage caused by the automated vehicle if the automated vehicle is
considered to be a defective product (Article 6 Product Liability Directive). If the
damage was caused by the software, the question arises whether software is a ‘product’ within the meaning of the Directive. In the literature, there seems to be a tendency to
answer this question with ‘yes’.27 The manufacturer can avert liability if he suc-
cessfully invokes one of the defenses of Article 7 Product Liability Directive. In the
context of developing such a novel product as automated vehicles, the most
important defense is likely to be the so-called development risk defense: the
manufacturer is not liable if “the state of scientific and technical knowledge at the time when he put the product into circulation was not such as to enable the existence of the defect to be discovered” (Article 7(e) Product Liability Directive). Under specific circumstances, other stakeholders might be liable (for instance the road authority for a fault in the road infrastructure).28 Besides the discussion on liability, there is also discussion on the role insurance plays in an automated future.29

23 Marchant and Lindor 2012.
24 Schrader 2016; Engelhard 2017; Schellekens 2015.
25 Engelhard and De Bruin 2017; Marchant and Lindor 2012.
26 Van Dam 2013; Tjong Tjin Tai and Boesten 2016.
27 Reese 1994; De Schrijver and Maes 2010; Engelhard 2017.

5.5 Traffic Laws

A topic that perhaps does not get as much attention in the media as the liability questions, but is nevertheless very important, is that of traffic laws and automated vehicles. At first glance this might seem a matter of national law, as road traffic
regulation is often national law. However, a large number of national traffic laws are
based on two international conventions: the Geneva Convention on road traffic of
1949 (hereinafter: Geneva Convention) and the Vienna Convention on road traffic
of 1968 (hereinafter: Vienna Convention).30 The aim of the Conventions is to
facilitate international traffic and to increase road safety.31 As automated vehicles
are expected to have a positive effect on road safety, the new technology fits within
the aim of the Conventions to increase road safety. The Conventions are of great
importance worldwide: the Geneva Convention has 96 parties, the Vienna
Convention has 74 Contracting Parties (some of which are also party to the Geneva
Convention). These countries range from Australia (party to the Geneva
Convention), Chile (party to both Conventions), Germany (Vienna Convention),
India (Geneva Convention), Kazakhstan (Vienna Convention), the Netherlands
(Geneva Convention and Vienna Convention), Ukraine (Vienna Convention),
United States (Geneva Convention), to Zimbabwe (Geneva Convention and Vienna
Convention). All these Contracting Parties need to bring their national laws into conformity with the Convention to which they are party (Article 1 Geneva
Convention, Article 3 Vienna Convention). The Global Forum for Road Safety of
the UNECE, or Working Party 1 (hereinafter WP.1), is responsible for, among other things, keeping both the Geneva Convention and the Vienna Convention up to date.
Currently, WP.1 is debating how to revise the Conventions in order to accommodate automated driving.32 This contribution aims to provide insights into the bottlenecks regarding the Conventions and automated driving, and will offer a novel solution by taking inspiration from maritime and aviation traffic laws.

28 Van Dam 2013.
29 Vellinga 2017; Schellekens 2015; Van Wees 2016; Verbond van Verzekeraars Toekomstvisie Automotive. Onderweg naar morgen. www.verzekeraars.nl/media/4684/onderweg-naar-morgen.pdf. Accessed 14 August 2018.
30 The Vienna Convention was the answer to a growing demand for greater uniformity of national regulations than under the Geneva Convention (Attachment email Robert Nowak, United Nations, to author, 1 June 2017).
31 Preamble Geneva Convention, preamble Vienna Convention.
32 The testing of automated vehicles on public roads is allowed under the Geneva Convention and the Vienna Convention.

5.5.1 Rules of the Road

Both Conventions have a chapter on rules of the road (Chapter II of both Conventions). These traffic rules cover, among other things, overtaking (Article 11
Geneva Convention, Article 11 Vienna Convention), giving way (Article 12
Geneva Convention, Article 18 Vienna Convention), and distracted driving (Article
8 para 6 Vienna Convention). The notion of ‘driver’ plays an important part in these
rules of the road, as the traffic rules are often directed at the ‘driver’. The notion of
‘driver’ is such a central concept in the Conventions that they even state that every (moving) vehicle should have a driver (Article 8 para 1 Geneva
Convention, Article 8 para 1 Vienna Convention). An automated vehicle is a
vehicle within the meaning of the Conventions (Article 4 para 1 Geneva
Convention, Article 1(p) Vienna Convention), so an automated vehicle also needs
to have a driver to comply with the Conventions. But does it actually have a driver
within the meaning of the Conventions?

5.5.2 The Notion of ‘Driver’

The Geneva Convention and the Vienna Convention provide similar definitions of
the notion of ‘driver’:
“Article 4 para 1 Geneva Convention: ““Driver” means any person who drives a vehicle,
including cycles, or guides draught, pack or saddle animals or herds or flocks on a road, or
who is in actual physical control of the same,(…).””
“Article 1(v) Vienna Convention: ““Driver” means any person who drives a motor vehicle
or other vehicle (including a cycle), or who guides cattle, singly or in herds, or flocks, or
draught, pack or saddle animals on a road;(…).””

So, a driver is a person who drives a vehicle. In this context, and given the current state of the discussion, this ‘person’ is a human, not a legal person.33 This is no surprise given that the Conventions were drafted in a time when vehicles without a human behind the wheel seemed almost impossible. It would have been self-evident that a vehicle had a human driver.

33 United Nations Conference on Road and Motor Transport, Committee III on Road Traffic, Summary Record of the Seventeenth Meeting, held at the Palais des Nations, Geneva, on Tuesday, 6 September 1949 at 3 p.m., E/CONF.8/C.III/SR.17/Rev.1, 21 November 1949, p. 2; Vellinga et al. 2016; Lutz 2014; Smith 2014.

The Geneva Convention and the Vienna Convention do not provide further insights into what it entails to ‘drive’. The case law of the Contracting Parties, however, does discuss the meaning of ‘driving’ and can therefore shine some light on what it entails to be ‘driving’ a vehicle.
In this contribution, the case law of Germany (party to the Vienna Convention)
and the Netherlands (party to both Conventions) will be discussed.34 In the case law
of both countries, what it entails to ‘drive’ has been explained. The interpretation of
‘driving’ is often discussed in relation to the interpretation of ‘driver’ (German:
Fahrzeugführer, Dutch: bestuurder).
According to German case law, the driver of a vehicle is the person “wer das Fahrzeug
in Bewegung zu setzen beginnt, es in Bewegung hält oder allgemein mit dem
Betrieb des Fahrzeugs oder mit der Bewältigung von Verkehrsvorgängen
beschäftigt ist. Bringt ein Kraftfahrer sein Fahrzeug nicht verkehrsbedingt zum
Stehen, bleibt er solange Führer des Kraftfahrzeugs, wie er sich noch im Fahrzeug
aufhält und mit dessen Betrieb oder mit der Bewältigung von Verkehrsvorgängen
beschäftigt ist. Dies ist regelmäßig erst dann nicht mehr der Fall, wenn er sein
Fahrzeug zum Halten gebracht und den Motor ausgestellt hat”.35 This means that
the person who sits behind the wheel while a car is being pushed, unconditionally
following the orders given by the person who pushes the car, is not a driver within
the meaning of German law.36
A different interpretation of the notion of ‘driver’ has developed in the
Netherlands. According to Dutch case law, a person is regarded as the driver if he influences the direction and/or speed of the vehicle by operating the controls.37 The person pulling the hand brake whilst sitting in the passenger seat,38 the person walking next to the car while using the steering wheel,39 the person steering the vehicle while it is being towed40; they are all drivers of the vehicle.

34 See also N.E. Vellinga (forthcoming) Self-driving vehicles: preparing road traffic law for a driverless future, Conference Proceedings ITS World Congress Copenhagen, 17–21 September 2018.
35 Freely translated: (…) who starts to set the vehicle in motion, who keeps the vehicle moving or who is generally occupied with the operation of the vehicle or with the handling of traffic operations. If a driver does not bring his vehicle to a halt due to traffic conditions, he remains the driver of the motor vehicle so long as he is still in the vehicle and occupied with the operation of the vehicle or with the handling of traffic operations. This is usually no longer the case if he has stopped the vehicle and turned off the engine. Bundesgerichtshof (BGH) 4 StR 592/16, 27 April 2017, ECLI:DE:BGH:2017:270417U4STR592.16.0.
36 BGH 22.03.1977, VI ZR 80/75.
37 Hoge Raad (HR) 13 August 2005, ECLI:NL:HR:2005:AT7292, NJ 2005/542.
38 HR 13 August 2005, ECLI:NL:HR:2005:AT7292, NJ 2005/542.
39 HR 12 June 1990, ECLI:NL:HR:1990:ZC8550, NJ 1991/29, VR 1990/158; HR 23 February 1999, ECLI:NL:HR:1999:ZD348, VR 2000/81.
40 HR 2 February 1965, ECLI:NL:HR:1965:AB3467, NJ 1965/281; HR 26 January 1971, ECLI:NL:HR:AB5997, NJ 1971/208; HR 1 December 1987, ECLI:NL:HR:1987:AB7814, NJ 1988/689; HR 2 October 1990, ECLI:NL:HR:1990:ZC8593, NJ 1991/380.

Even though the definitions of ‘driver’ and ‘driving’ given in the case law of the
discussed Contracting Parties differ, they do have some elements in common. It can
be said that the driver decides on the direction and speed (lateral and longitudinal
movements) of the vehicle by operating at least some controls. The decisions of the
driver have an immediate effect: if he decides to brake and pushes the brake pedal
down, the vehicle will immediately slow down.
Summarizing, the driver, within the meaning of the Conventions, is a human
who decides on the speed and direction of the vehicle by operating at least some of
the vehicle’s controls. What does this mean for automated driving? Does an
automated vehicle (Level 5, or Level 4 during the period in which the vehicle operates
without human interference) have a driver within the meaning of the Geneva
Convention and the Vienna Convention?

5.5.3 The Driver of an Automated Vehicle

If one thinks of automated vehicles, a couple of potential ‘drivers’ can spring to mind: the user (the human who provides the destination and who dispatches the vehicle), the manufacturer, and the self-driving system of the vehicle. After all, the self-driving system41—the combination of hardware and software—makes decisions
regarding longitudinal and lateral movements. The manufacturer has great influence
on this via the hardware and software it equips the vehicle with. Given the inter-
pretation of the notion of ‘driver’, neither the manufacturer nor the self-driving system is the driver of the automated vehicle, as neither is human.
The user of the vehicle, however, is human. Is s/he perhaps the ‘driver’ of the
vehicle, within the meaning of the Conventions? On the one hand, the user does
have some influence on the driving behaviour by setting its destination, dispatching
the vehicle, and perhaps by adjusting certain settings (for example, there might be a
setting for how much distance the automated vehicle should keep from the vehicle
driving in front of it). On the other hand, the user does not decide on swerving,
overtaking, accelerating, braking, etc., and he does not operate the controls during
the trip. Therefore, the actions of the user do not amount to ‘driving’.42 So neither
the user, the manufacturer, nor the self-driving system is the ‘driver’ of an automated
vehicle within the meaning of the Geneva Convention and the Vienna Convention.
From a legal perspective, an automated vehicle is truly driverless. Given the
importance of the notion of ‘driver’ in the Geneva Convention and the Vienna Convention, this means that automated driving is incompatible with the Conventions. This gives rise to the question of how the Conventions can be revised in order to accommodate automated driving.

41 See also UNECE Global Forum for Road Traffic Safety 2017. See on the discussion on legal personhood for robots: European Parliament resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)); (2018) Open letter to the European Commission, Artificial intelligence and robotics. https://g8fip1kplyr33r3krz5b97d1-wpengine.netdna-ssl.com/wp-content/uploads/2018/04/RoboticsOpenLetter.pdf. Last accessed 14 August 2018; Hartlief 2018.
42 See also Von Bodungen and Hoffmann 2016.

5.5.4 Automation and Traffic Laws

In order to accommodate automated driving, the Geneva Convention and the Vienna Convention need revision. Road traffic is not the only mode of transport facing automation. Maritime traffic and aviation have been familiar with automation for longer: both modes make use of an autopilot. Although these
autopilots are perhaps not as advanced as Level 4 and Level 5 vehicles,43 the laws
on maritime and air traffic could nevertheless provide insights into how to
accommodate new levels of automation in traffic law.
Both maritime traffic and air traffic are, to some extent, regulated at an inter-
national level. Take for instance the United Nations International Regulations for
Preventing Collisions at Sea of 1972 (COLREGS 1972)—a time in which some
form of automation of the steering task was not unfamiliar in sailing—and Annex 2
of the Convention on International Civil Aviation (Chicago Convention), which entail
traffic rules for traffic at sea (COLREGS 1972) and in the air (Annex 2 Chicago
Convention).
The COLREGS 1972 and Annex 2 Chicago Convention both provide rules on
how to deal with a head-on situation:
“Rule 14 COLREGS 1972: “(…) When two power-driven vessels are meeting on reciprocal
or nearly reciprocal courses so as to involve risk of collision, each shall alter her course to
starboard so that each shall pass on the port side of the other.(…)””
“3.2.2.2 Annex 2 Chicago Convention: “(…) When two aircraft are approaching head-on or
approximately so and there is danger of collision, each shall alter its heading to the right.””

These traffic rules are directed at the vehicle, not at a human who might have access
to the controls. Who or what alters the course of the vessel or aircraft—the master
of the ship, the helmsman, the pilot-in-command, the autopilot—is irrelevant.44 As
long as the ship or aircraft alters its course in the right direction, it is obeying the
law.
The COLREGS 1972 put the responsibility for the operation of the vessel with,
among others, the master of the ship and the crew:
“Rule 2 a COLREGS 1972: “Nothing in these Rules shall exonerate any vessel, or the owner, master or crew thereof, from the consequences of any neglect to comply with these Rules or of the neglect of any precaution which may be required by the ordinary practice of seamen, or by the special circumstances of the case.””

43 For instance, the autopilot on board of a ship needs to be regularly checked by a crew member, see the International Convention on Standards of Training, Certification and Watchkeeping for Seafarers.
44 Cockcroft and Lameijer 2012.

The responsibility for the operation of an aircraft lies with the pilot-in-command:
“2.3.1 Annex 2 Chicago Convention: “The pilot-in-command of an aircraft shall, whether
manipulating the controls or not, be responsible for the operation of the aircraft in accor-
dance with the rules of the air, except that the pilot-in-command may depart from these
rules in circumstances that render such departure absolutely necessary in the interests of
safety.””

So, the owner of the ship, the master of the ship and the crew, or the pilot-in-command can be held responsible if a vessel or an aircraft does not alter its course in a head-on situation. As is clearly stated in 2.3.1 Annex 2 Chicago Convention, it
is not relevant if this person was operating the controls.
This approach—directing behavioural rules at the vehicle and assigning responsibility to a person—establishes a distinction between who or what operates the
vehicle and who is responsible for that operation. The approach can be used as a
blueprint for revising the Geneva Convention and the Vienna Convention. It would
open up the possibility of automated driving, where the self-driving system ‘drives’
the vehicle while a human remains responsible for the operation of the vehicle.
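The distinction between operation and responsibility can also be illustrated schematically. The sketch below (in Python) is purely conceptual and hypothetical, not a proposal for legislative drafting: a behavioural rule is checked against the vehicle’s behaviour, whoever or whatever produced it, while responsibility attaches separately to a designated person.

# Conceptual illustration only (not drawn from the Conventions): a behavioural
# rule is evaluated against the vehicle's behaviour, regardless of whether a
# human or a self-driving system produced that behaviour, while responsibility
# for the operation is assigned separately to a person.
from dataclasses import dataclass

@dataclass
class Vehicle:
    operated_by: str           # e.g. "human driver" or "self-driving system"
    responsible_person: str    # e.g. the owner, keeper, or user: a policy choice

def rule_complied_with(speed_near_crossing: float, safe_speed: float) -> bool:
    # Directed at the vehicle, in the spirit of Article 11 para 9 Vienna
    # Convention: only the vehicle's behaviour matters, not who produced it.
    return speed_near_crossing <= safe_speed

vehicle = Vehicle(operated_by="self-driving system", responsible_person="owner")
if not rule_complied_with(speed_near_crossing=30.0, safe_speed=10.0):
    print("Rule violated; responsibility lies with the " + vehicle.responsible_person + ".")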

5.5.5 Adopting a New Approach

As discussed above, the Geneva Convention and the Vienna Convention will need
revision in order to accommodate automated driving. The approach from maritime
and aviation traffic law can serve as a blueprint for this revision. In order to achieve
the same system and distinction between who or what operates the vehicle and who
is responsible for the operation, three steps will need to be taken.
First, the provisions stating that every vehicle should have a driver (Article 8
para 1 Geneva Convention, Article 8 para 1 Vienna Convention) need to be deleted
from the Conventions. This is needed in order to accommodate vehicles with a high
degree of automation, the vehicles that do not have a driver within the meaning of
the Conventions.
The next step is to take the notion of ‘driver’ out of the rules of the road.
Currently, the notion of ‘driver’ is omnipresent in the Conventions. This needs to
change to accommodate automated driving. If traffic rules are no longer directed at
the driver but at the vehicle, following the example from the COLREGS 1972 and
Annex 2 Chicago Convention, it becomes irrelevant whether a vehicle is driven by
a system, a human, or something else. Even though the notion of ‘driver’ is, as
pointed out above, omnipresent in the Conventions, this approach does not come
out of the blue: the Vienna Convention already contains a traffic rule directed at the vehicle instead of the driver. Article 11 para 9 Vienna Convention states:
“A vehicle shall not overtake another vehicle which is approaching a pedestrian crossing
marked on the carriageway or sign-posted as such, or which is stopped immediately before
the crossing, otherwise than at a speed low enough to enable it to stop immediately if a
pedestrian is on the crossing. (…)”

This provision shows that the new approach discussed in this contribution is not
entirely unfamiliar to the Vienna Convention.
A significant number of provisions from the Geneva Convention and the Vienna
Convention will need revision, but some provisions—like Article 11 para 9 Vienna
Convention—can stay as they are. Most of the provisions requiring attention are provisions specifically written with a human in mind, although some of these, too, can be left unchanged. Take for example Article 8
para 3 Vienna Convention:
“Every driver shall possess the necessary physical and mental ability and be in a fit physical
and mental condition to drive.”

This provision is clearly written for a human, and it serves no purpose to change it. Another example of a provision that does not need revision is Article 7
para 5 Vienna Convention on the wearing of safety belts. Apart from the actual
feasibility, it does not serve any purpose to require a vehicle to wear a safety belt. It
is, however, important that conventional drivers and passengers remain required to wear safety belts, as this increases their safety.45 This provision can therefore be left
unchanged.
The final step in adjusting the Conventions according to the new approach is to
assign responsibility for the operation of the vehicle to a (legal) person. This
establishes the distinction between who or what operates the vehicle and who is
responsible for the operation. If the vehicle is driven by a human—which is the case
with conventional vehicles and automated vehicles with the self-driving system
switched off—the person operating the vehicle and the person responsible for the
operation are one and the same person: the human driver. When there is no (human)
driver, as is the case with an automated vehicle, the vehicle might be operated by
the self-driving system. The system decides on making a turn, overtaking another
vehicle, stopping for the red traffic light, etc. The responsibility for the operation,
however, has to rest with a person. A reason for assigning responsibility to a person
could be the influence he can exercise over the operation of the vehicle, just like the
pilot-in-command and the master of a ship have a high degree of influence over the
operation of their aircraft or vessel. Another option is to assign responsibility to
multiple persons, following the example of Rule 2(a) COLREGS 1972, which assigns
responsibility to the owner, master or crew of the vessel. The assigning of
responsibility can have consequences beyond traffic law. Depending on national
law, violation of traffic laws can expose the person responsible for the vehicle’s
operation to criminal liability. This person might also face a claim for damages if
the vehicle violated a traffic rule and, as a consequence, caused damage to another
party. These consequences should not be disregarded. With whom this responsi-
bility lies—the user of the vehicle, the manufacturer, the software programmer, the
owner, etc.—is ultimately a policy matter.

45 World Health Organization 2004.

5.6 Final Remarks

To turn the discussed approach into reality, a substantial overhaul of the Geneva
Convention and the Vienna Convention is necessary. As discussed, the traffic rules
should be directed at the vehicle instead of the driver. This requires Article 8 para 1
Geneva Convention and Article 8 para 1 Vienna Convention to be deleted and it
requires amendments to a significant number of traffic rules. It will also be nec-
essary to adopt a new provision on the responsibility for the operation of the
vehicle. Amending the Conventions, especially the Geneva Convention, can be a
time-consuming and challenging process (Article 31 Geneva Convention, Article
49 Vienna Convention). If amending the Conventions is deemed too complex or
politically not feasible, a new convention can be drafted using the discussed
approach.
Either way, the discussed approach would accommodate traffic of all levels of
automation. Whether your vehicle is a conventional vehicle, a vehicle with several
advanced driver assistance systems, a classic car, or a fully automated vehicle, all
vehicles must obey the same traffic rules from national laws based on the revised
Conventions. This avoids discrepancies and provides legal certainty: an automated
vehicle does not need a software update with new traffic rules before crossing the
border into another country, and the same person remains responsible for the
operation of the vehicle. The approach offers a novel solution for the absence of a
‘driver’ in an automated vehicle whilst still accommodating conventional driving in
conventional vehicles.
Although automated vehicles are already being tested on public roads across the
globe, it will take a number of years, if not decades, before automated vehicles
become available to consumers. Until then, multiple legal questions need to be
answered. This contribution has provided a novel solution for one of the main legal
issues that needs to be solved. By taking a new approach inspired by maritime and
aviation traffic laws, the Geneva Convention on road traffic and the Vienna
Convention on road traffic can be revised in such a way that both Conventions
accommodate traffic of all levels of automation. This is a first step in removing legal
barriers for automated driving. More discussion on the legal issues concerning
automated driving is paramount to provide solid solutions for the driverless future.

References

Cockcroft AN, Lameijer JNF (2012) A Guide to the Collision Avoidance Rules. International
Regulations for Preventing Collisions at Sea, 7th edn. Butterworth-Heinemann, Amsterdam
De Schrijver S, Maes M (2010) Aansprakelijkheid in een ambient intelligent-omgeving: Wie heeft
het gedaan? Computerrecht 6/174
Engelhard EFD (2017) Wetgever pas op! De (vrijwel) autonome auto komt eraan. Ars Aequi
03:230–236

Engelhard EFD, de Bruin RW (2017) EU Common Approach on the Liability Rules and Insurance
Related to Connected and Autonomous Vehicles. In: Evas T (ed) The European Added Value
of a Common EU Approach to Liability Rules and Insurance for Connected and Autonomous
Vehicles. Study by the European Added Value Unit within the European Parliamentary
Research Service (EPRS), European Union 2018
Ethik-Kommission automatisiertes und vernetztes Fahren (2017) Bericht Juni 2017. www.bmvi.de/SharedDocs/DE/Publikationen/DG/bericht-der-ethik-kommission.pdf?__blob=publicationFile. Accessed 14 August 2018
Hartlief T (2018) Van knappe koppen en hun uitvindingen. NJB 2018/878, 1265
Letter from NHTSA (2016) Letter from NHTSA to Chris Urmson, Director, Self-Driving Car Project Google, Inc. https://www.isearch.nhtsa.gov/files/Google-compiled-response-to-2012-Nov-2015-interp-request-4-Feb-2016-final.htm. Accessed 14 August 2018
Lutz LS (2014) Anforderungen an Fahrerassistenzsysteme nach dem Wiener Übereinkommen über den Straßenverkehr. Neue Zeitschrift für Verkehrsrecht 27:67–72
Marchant GE, Lindor RA (2012) The Coming Collision Between Autonomous Vehicles and the
Liability System. Santa Clara L. Rev. 52:1321–1340
Reese J (1994) Produkthaftung und Produzentenhaftung für Hard- und Software. Deutsches
Steuerrecht 1121
SAE International (2018) Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles, J3016, June 2018
Schellekens M (2015) Self-driving cars and the chilling effect of liability. Computer Law &
Security Review 31:506–517
Schrader PT (2016) Haftungsfragen für Schäden beim Einsatz automatisierter Fahrzeuge im
Straßenverkehr. Deutsches Autorecht 5:242–246
Smith BW (2013) Human error as a cause of vehicle crashes. www.cyberlaw.stanford.edu/blog/2013/12/human-error-cause-vehicle-crashes. Accessed 14 August 2018
Smith BW (2014) Automated Vehicles Are Probably Legal in the United States. Tex. A&M L.
Rev. 1:411–521
Tjong Tjin Tai E, Boesten S (2016) Aansprakelijkheid, zelfrijdende auto’s en andere zelfbesturende objecten. NJB 2016/496, 656–664
United Nations Economic Commission for Europe, Inland Transport Committee, Global Forum
for Road Traffic Safety (2017) Seventy-fifth session, Informal document No. 8. Automated
Vehicles: Policy and Principles Discussion Document, submitted by Germany, Japan, Spain,
the Netherlands and the United Kingdom, 4 September 2017
Van Dam C (2013) European Tort Law, 2nd edn. Oxford University Press
Van Wees KAPC (2016) Zelfrijdende auto’s, aansprakelijkheid en verzekering; over nieuwe
technologie en oude discussies. Tijdschrift voor Vergoeding Personenschade 2:29–34
Vellinga NE (2017) From the testing to the deployment of self-driving cars: Legal challenges to
policymakers on the road ahead. Computer Law & Security Review 33:847–863
Vellinga NE, Vellinga WH, Van Wees KAPC (2016) Testen van autonome of zelfrijdende auto’s
op de openbare weg. Verkeersrecht 64:218–227
Von Bodungen B, Hoffmann M (2016) Das Wiener Übereinkommen über den Straßenverkehr und
die Fahrzeugautomatisierung (Teil 2). Wege aus dem Zulassungsdilemma.
Straßenverkehrsrecht 16:93–97
World Health Organization (2004) World Report on Road Traffic Injury Prevention. World Health
Organization, Geneva

Case Law

Bundesgerichtshof
BGH 22.03.1977, VI ZR 80/75
BGH 27.04.2017, 4 StR 592/16, ECLI:DE:BGH:2017:270417U4STR592.16.0
Hoge Raad
HR 2 February 1965, ECLI:NL:HR:1965:AB3467, NJ 1965/281
HR 26 January 1971, ECLI:NL:HR:1971:AB5997, NJ 1971/208
HR 1 December 1987, ECLI:NL:HR:1987:AB7814, NJ 1988/689
HR 12 June 1990, ECLI:NL:HR:1990:ZC8550, NJ 1991/29, VR 1990/158
HR 2 October 1990, ECLI:NL:HR:1990:ZC8593, NJ 1991/380
HR 23 February 1999, ECLI:NL:HR:1999:ZD348, VR 2000/81
HR 13 August 2005, ECLI:NL:HR:2005:AT7292, NJ 2005/542

Nynke E. Vellinga is a Ph.D. researcher at the Faculty of Law of the University of Groningen in
the Netherlands. Her research concerns several legal aspects of automated driving. Before starting
her Ph.D. research, Nynke was already involved in legal research on automated driving.
Chapter 6
Coercive Neuroimaging Technologies in Criminal Law in Europe
Exploring the Legal Implications for the Prohibition of Ill-Treatment (Article 3 ECHR)

Sjors L. T. J. Ligthart

Contents

6.1 Introduction
6.2 Coercive Forensic Neuroimaging
6.2.1 Introduction
6.2.2 Forensic Neuroimaging
6.2.3 Two Types of Coercion
6.2.4 Conclusion
6.3 Article 3 ECHR and Coercive Neuroimaging
6.4 Analogy and Deduction
6.4.1 Introduction
6.4.2 Analogy
6.4.3 Conclusion: From Analogy to Deduction
6.5 Synthesis: Exploring Legal Implications of Coercive Neuroimaging in Light of Article 3 ECHR
6.6 Conclusion
References

The findings that are presented in this chapter are derived from the author’s current Ph.D. project and are partly based on Ligthart 2018.

S. L. T. J. Ligthart
Department of Criminal Law, Tilburg Law School, Tilburg University, PO Box 90153, 5000 LE Tilburg, The Netherlands
e-mail: s.l.t.j.ligthart@uvt.nl

Abstract Neuroscience is developing constantly and improves neuroimaging
technologies which can acquire brain-related information, such as (f)MRI, EEG and
PET. These technologies could be very useful for answering crucial legal questions
in a criminal law context. However, not all defendants and convicted persons are
likely to cooperate with these technologies, and as a consequence the possibility of
coercive use of these technologies is an important issue. The use of coercive
neuroimaging technologies in criminal law, however, raises serious legal questions
regarding European human rights. For instance, how does such coercive use relate
to the prohibition of torture, inhuman and degrading treatment (‘ill-treatment’,
Article 3 European Convention on Human Rights)? This chapter describes four
neuroimaging applications and explains how they could contribute to materializing
the aims of criminal law. Furthermore, it conceptualizes two types of coercion with
which neuroimaging can be applied and explains why that distinction is relevant in
this context. Finally, it explores the legal implications of coercive neuroimaging in
the context of the prohibition of ill-treatment.

  
Keywords Neurolaw · Neuroimaging · Neurotests · European Convention on Human Rights · Prohibition of ill-treatment · Coercion

6.1 Introduction

The Grand Chamber of the European Court of Human Rights (ECtHR) finds it
beyond dispute that the fight against crime depends to a great extent on the use of
modern scientific technologies of investigation and identification.1 In 2015 four
empirical studies were published in the Journal of Law and the Biosciences that
show the ways in which neuroscientific technologies and neurobiological data
already play a role in the criminal justice systems in the Netherlands, England and
Wales, the United States and Canada.2 Currently, public prosecutors as well as
criminal defence lawyers are introducing brain evidence in criminal cases.3 The
outcomes of these studies underline the relevance of neuroscience for criminal
(procedural) law and the importance of the discussion about the legal implications
of neuroscience for the law: neurolaw.4
The relevance of neuroscience for the law has been discussed for more than a
decade.5 Many interesting questions have already been addressed; however, many

1 ECtHR (GC) 4 December 2008, appl.nos. 30562/04 and 30566/04 (S. & Marper/UK), § 105.
2 Respectively De Kogel and Westgeest 2015; Catley and Claydon 2015; Farahany 2015; Chandler 2015.
3 Shen 2016a, p. 1.
4 Neurolaw is a relatively new research domain that studies the possible and actual impact of neuroscience on the law and legal practice: Meynen 2016a, p. 115.
5 Greene and Cohen 2004; Shen 2016b.

(new) research topics still have to be examined.6 An important matter regarding the
US jurisdiction, which has received considerable attention, concerns the constitutional
implications of coercive neuroimaging in criminal cases—more specifically in the
context of (mental) privacy and the self-incrimination clause.7 Despite the fact that
the neurolegal discussion is getting increased attention among European scholars,8
this topic has not been fully examined in the European legal context. Nevertheless,
fundamental questions—to a considerable extent similar to those in the US—arise
in light of the European Convention on Human Rights (ECHR). More specifically,
coercive neuroimaging in criminal cases raises legal questions regarding the pro-
hibition of torture, inhuman and degrading treatment (‘ill-treatment’, Article 3
ECHR), the right to respect for bodily integrity and privacy (Article 8 ECHR) and
the principle against compulsory self-incrimination (Article 6 ECHR).9
Yet, no case law of the ECtHR exists on these specific topics. So, in order to
examine these legal questions, a research method should be developed to overcome
the current lack of case law. For instance, we could examine the general content,
scope and underlying ideas of these fundamental rights and principles, and apply
them to the use of coercive neuroimaging in forensic settings.10 Another possibility
is to create an analogy between the use of coercive neuroimaging in criminal law
(‘forensic neuroimaging’) on the one hand and, on the other hand, treatments about
which case law does already exist.11 This enables us to examine whether and under
which circumstances comparable treatments could be justified. From the general
conditions for those comparable treatments, a specific legal framework for coercive
neuroimaging could be deduced.
This chapter aims to explore the latter method in the context of the prohibition of
torture, inhuman and degrading treatment (Article 3 ECHR).12 In order to do so,
different types of neuroimaging and their forensic relevance will be described
briefly in Sect. 6.2. Furthermore, Sect. 6.2 also conceptualizes two types of coer-
cion and explains their (legal) relevance. In Sect. 6.3, the meaning and scope of the
prohibition of ill-treatment will be briefly explicated. In Sect. 6.4 an analogy
between coercive forensic neuroimaging and other treatments will be created in the
context of the prohibition of ill-treatment, which leads to a starting point for the
deduction of a legal framework for coercive neuroimaging in the context of Article
3 ECHR. Based on case law of the ECtHR, Sect. 6.5 explores where this starting
point leads. Finally, some conclusions will be drawn in Sect. 6.6.

6 Shen 2016b.
7 See i.a. Shen 2013; Farahany 2012a, b; Pardo and Patterson 2015, pp. 148–178.
8 See i.a. Moratti and Patterson 2016; De Kogel and Westgeest 2015; Catley and Claydon 2015.
9 See i.a. Encinas de Muñagorri and Saas 2016, p. 104; Meynen 2016b, p. 162; Rusconi and Mitchener-Nissen 2013, pp. 7–10; Farrell 2009, pp. 93–95; Van Toor 2017; Ligthart et al. 2017, pp. 579–603.
10 See e.g. Van Toor 2017.
11 See e.g. Farahany 2012b.
12 In a European context, this method, as well as the legal implications for Article 3 ECHR, has not received much attention. Therefore, this chapter addresses both topics.

6.2 Coercive Forensic Neuroimaging

6.2.1 Introduction

This section outlines some neuroimaging technologies and their
(potential) relevance for criminal law. Most of these technologies and applications
are not ready for practical, forensic use yet. Nevertheless, because neuroscience is
developing constantly, it is important to look over today’s horizon and consider the
potential legal implications of the use of these technologies in criminal law before
they actually arrive in court.13 Furthermore, this section conceptualizes two dif-
ferent types of coercion which can be used in order to apply forensic neuroimaging
technologies. The practical and legal relevance of this distinction will be explained.

6.2.2 Forensic Neuroimaging

Neuroimaging is a collective term for different technologies that are, to some extent,
able to represent our brains and reflect their activity. Generally, two types of
neuroimaging can be distinguished: structural and functional neuroimaging.14
Structural imaging technologies, such as MRI and CT,15 are able to represent our
brain anatomy, i.e. the biological structures of our brains.16 With functional neu-
roimaging, like fMRI, PET and EEG,17 brain activity can be reflected.18 These
technologies can be applied in order to acquire information from a subject’s brain.
This information could, to some extent, contribute to answering legal questions in
criminal cases.19 Because of their potential relevance for criminal justice, four
neuroimaging applications will be briefly discussed below:
A. brain-based lie detection;
B. brain-based memory detection;
C. diagnostic neuroimaging and
D. neuroimaging to predict future dangerousness (‘neuroprediction’).20

13 Nadelhoffer and Sinnott-Armstrong 2012, p. 634.
14 Richmond 2012, p. 3.
15 Magnetic resonance imaging and computerized tomography.
16 Roskies 2013, p. 37.
17 Functional magnetic resonance imaging, positron emission tomography and electroencephalography.
18 Roskies 2013, p. 66.
19 Greely and Wagner 2011, pp. 796–799; De Kogel and Westgeest 2015, pp. 580–605; Farahany 2015, pp. 485–509.
20 See for these applications i.a. Farah et al. 2014; Meijer et al. 2016; Vincent 2011; Glenn and Raine 2014.

Ad (A) According to the assumption of brain-based lie detection, some parts of
our brains are more involved with lying than with truth telling. This means that
when we are telling a lie, those specific brain areas will be activated more than
others, which can be measured with functional neuroimaging like fMRI.21 Yet,
there is no single brain region that always becomes active when a subject tells a lie
during a laboratory examination.22 However, there is considerable agreement across
studies on which brain areas are more active during an instructed deceptive
response than during telling the truth.23
Brain-based lie detection can be very helpful to find the truth during a police
investigation or at trial, which is highly relevant for criminal justice.24 Presumably,
lie detection could also, to some extent, be relevant regarding forensic diagnostics
in the context of the insanity defence or the unfitness to stand trial.25 For instance, it
could be used in order to verify the statement of a defendant that he heard com-
manding voices at the time of the offence, which ordered him to kill the victim.26
Ad (B) One of the most promising brain-based memory detection tests is the
concealed information test.27 This test enables researchers to determine with high
accuracy whether a subject recognizes certain stimuli, for instance by measuring a
specific brain wave—P300—with EEG.28 Such a concealed information test may
contain the following. Imagine that a suspect of an armed robbery will be subjected
to a concealed information test with P300 measurements. In this context, he has to
observe five pictures of different pistols. One of the pistols has actually been used
during the robbery. While the suspect is observing the pictures attentively, his P300
amplitudes are being measured with EEG. When the suspect’s P300 wave is sig-
nificantly higher when he is observing the pistol that has been used during the
robbery, this implies that he recognises that pistol. If the same effect occurs when
the suspect observes a picture of the sports bag that has been used, the escape car
and the victim, the suspicion that this suspect has been involved in the robbery will
increase.29 In that way, brain-based memory detection can contribute to the
assessment of crime-related knowledge of an individual, and could therefore be
highly relevant for criminal law.30 Just like lie detection, brain-based memory
detection could also presumably be relevant to verify statements of the subject in
the context of forensic diagnostics.31
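To make the quantitative logic of such a P300-based comparison concrete, the following minimal sketch is offered. It is purely illustrative and not drawn from the tests cited here: the sampling rate, the analysis window and the permutation-test decision rule are all assumptions chosen for the example. The sketch computes the mean amplitude in a typical P300 window for each trial and asks whether responses to the probe item (the pistol actually used) are reliably larger than responses to the irrelevant items.

```python
import numpy as np

def p300_amplitude(epoch, sfreq=250, window=(0.3, 0.6)):
    # Mean amplitude in an assumed P300 window (300-600 ms post-stimulus);
    # 'epoch' is a 1-D array of EEG samples for one trial, sampled at sfreq Hz.
    start, stop = (int(t * sfreq) for t in window)
    return epoch[start:stop].mean()

def cit_recognition_test(probe_epochs, irrelevant_epochs, n_perm=10000, seed=0):
    # One-sided permutation test: is the mean P300 amplitude for the probe
    # stimulus larger than for the irrelevant stimuli? Returns the observed
    # difference and a p-value; a small p-value suggests recognition.
    rng = np.random.default_rng(seed)
    probe = np.array([p300_amplitude(e) for e in probe_epochs])
    irrel = np.array([p300_amplitude(e) for e in irrelevant_epochs])
    observed = probe.mean() - irrel.mean()
    pooled = np.concatenate([probe, irrel])
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[:len(probe)].mean() - pooled[len(probe):].mean()
        if diff >= observed:
            exceed += 1
    p_value = (exceed + 1) / (n_perm + 1)
    return observed, p_value
```

Validated concealed information tests rely on more elaborate statistical protocols (for instance bootstrapped amplitude differences across multiple probe and irrelevant items); the sketch merely illustrates why a "significantly higher" P300 to the probe can be read as evidence of recognition.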

21 Farah et al. 2014.
22 Gamer 2014, p. 176.
23 Farah et al. 2014, p. 124.
24 Greely 2013, pp. 124–127.
25 Meynen 2018, pp. 2–3.
26 Meynen 2017.
27 Rosenfeld 2018; Verschuere et al. 2011.
28 Meijer et al. 2014, pp. 881 and 899; Rosenfeld et al. 2013b.
29 Ligthart et al. 2017, pp. 583–584.
30 Greely 2013, pp. 127–130.
31 Meynen 2017; Verschuere and Meijer 2014, p. 166.

Although brain-based lie detection and the concealed information test can both be
very valuable for criminal justice, these tests are still in an experimental stage32 and
more (field) research has to be done before these tests will be ready for practical
(forensic) use.33 An important current limitation of these tests, for instance, is that
they are sensitive to countermeasures.34 This means that the subject can frustrate or
manipulate the scan results with the use of mental or physical actions, like moving
his tongue or recalling emotional memories.35 However, should we develop effective
lie and memory detection, we will have to decide how, and under what circumstances,
we want it to be usable.36
Ad (C) Diagnostic neuroimaging can contribute to assessing whether or not a
person suffers from specific brain abnormalities, like a brain tumour, traumatic
brain injury or some types of dementia.37 This can be relevant in the context of the
insanity defence, for instance to exclude or confirm somatic causes of mental
complaints. Through the detection (or exclusion) of symptoms of psychiatric or
neurological disorders, neuroimaging could add valuable information to forensic
diagnostics. Currently, diagnostic evaluations depend to a large extent on the
subjective statements of the individual who is being evaluated.38 If the individual
refuses to cooperate with a forensic evaluation, or fakes certain symptoms of a
mental disease, an accurate evaluation can be hindered. With the use of neu-
roimaging, forensic evaluations could become less dependent on the subjective
cooperation of the individual, and can therefore contribute to overcoming classical
problems of psychiatry by circumventing an individual’s silence or reluctance to
cooperate with forensic evaluations.39
Ad (D) Finally, in the context of neuroprediction, neuroimaging can contribute
to the assessment of personalized risk of future dangerousness. Some brain features
appear to correlate robustly with aggressive behaviour. These features can be
detected through neuroimaging.40 In this way, neuroimaging can add important
personalized information about the brains of individuals to current risk assessment
tools, in order to improve the prediction of future dangerousness,41 which is highly
relevant in the context of criminal justice.42 Such a neuroscientific contribution is

32 Nevertheless, fMRI-based lie detection is already being offered on a commercial basis: http://www.noliemri.com/. Last accessed 15 November 2018.
33 Meixner Jr 2018, pp. 420–422; Wagner et al. 2016; Meijer et al. 2016, p. 600.
34 Rosenfeld et al. 2004; Wagner et al. 2016.
35 E.g. Rosenfeld et al. 2004, pp. 205–219.
36 Greely 2009, p. 53.
37 Simpson 2012; a case in the Netherlands: Court Midden-Nederland 2 June 2015, ECLI:NL:RBMNE:2015:4866.
38 Meynen 2017, p. 313.
39 Meynen 2017, pp. 313–314.
40 Glenn and Raine 2014.
41 Nadelhoffer et al. 2012, p. 86; Aharoni et al. 2013.
42 Nadelhoffer and Sinnott-Armstrong 2012, p. 634.

especially valuable since current risk assessment tools are far from perfect.43
Furthermore, through the detection of neurobiological predispositions for aggres-
sive behaviour, neuroprediction can make forensic risk assessments less dependent
on the cooperation of the subject and could therefore contribute to circumventing
the subject’s silence or reluctance to cooperate.
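By way of illustration only, the following sketch shows what "adding a brain-derived feature to a risk assessment tool" could look like in statistical terms. All data are synthetic and the variable names are hypothetical assumptions; the sketch does not reproduce any instrument discussed in the cited literature. It merely compares the cross-validated discrimination of a baseline actuarial model with the same model augmented by one neuroimaging-derived predictor.

```python
# Illustrative sketch: synthetic data, hypothetical features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 400

# Hypothetical actuarial items (e.g. age at first offence, prior convictions).
actuarial = rng.normal(size=(n, 2))
# Hypothetical brain-derived feature (e.g. task-related activity in a region
# reported to correlate with rearrest; c.f. Aharoni et al. 2013).
brain = rng.normal(size=(n, 1))

# Synthetic outcome: rearrest within the follow-up period, generated so that
# both the actuarial items and the brain feature carry some signal.
logit = 0.8 * actuarial[:, 0] + 0.4 * actuarial[:, 1] + 0.6 * brain[:, 0]
rearrest = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

baseline = cross_val_score(LogisticRegression(), actuarial, rearrest,
                           scoring="roc_auc", cv=5).mean()
augmented = cross_val_score(LogisticRegression(),
                            np.hstack([actuarial, brain]), rearrest,
                            scoring="roc_auc", cv=5).mean()
print(f"AUC, actuarial only: {baseline:.2f}; with brain feature: {augmented:.2f}")
```

Whether such a gain in predictive accuracy materializes with real data is precisely the empirical question the cited literature addresses; the sketch only makes explicit what "improving the prediction of future dangerousness" means in this context.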
In summary, this section distinguished four neuroimaging applications that can
obtain information from someone’s brain. These applications can, in their own way,
be very helpful in answering crucial legal questions in a criminal law context regarding
guilt, legal responsibility and the risk of future dangerousness. Yet, most of these
applications are not ready for practical forensic use. Therefore, more research has to
be done. However, looking over today’s horizon we should already think about
whether and under which legal conditions we want (coercive) neuroimaging to be
usable in a criminal law context. The following section conceptualizes two types of
coercion, with which forensic neuroimaging could be applied. Furthermore, it
explains the practical and legal relevance of this distinction.

6.2.3 Two Types of Coercion

In cases where the subject refuses cooperation with a neuroimaging assessment, the
assessment could be performed coercively. In this context, two types of coercion
should be distinguished: physical coercion and legal coercion.44
In the context of physical coercion, physical force is being used to accomplish a
primary goal. For instance, if a defendant refuses to cooperate with a DNA
investigation, or even resists, a police officer could physically overpower the
defendant while another policeman pulls out a hair. Because of this physical force,
the required cell material is acquired.
In the context of legal coercion, no physical force is used. Instead, the primary
goal is achieved through a decision of the subject himself. However, this decision
is influenced by a legal threat directed at the subject who refuses cooperation. Such a
threat could be direct or indirect. For instance, if a convicted person in the
Netherlands does not pay his fine, the public prosecutor could, based on a legal
provision, send him to jail as an instrument of pressure (direct legal threat). Most
people do not want to pay their fines. However, going to jail is even worse.
Therefore, most people decide to pay their fines by themselves. They do not
cooperate voluntarily with the execution of their punishment, but show willingness
to fulfil their duties in order to avoid worse consequences.45 Another example
regarding the Dutch legal system relates to the possibility of parole in the context of
the detention under an entrustment order (TBS: terbeschikkingstellingsmaatregel).

43 Douglas et al. 2017.
44 Ligthart 2017. C.f. Pardo and Patterson 2015, p. 153.
45 Chandler 2014 speaks about ‘legally-coerced consent’.

According to the law,46 a request for parole has to contain the results of a recent risk
assessment and the outcomes of a forensic diagnosis. So, if the detainee under an
entrustment order wants to request parole successfully, he has to cooperate with a
risk assessment tool and a forensic diagnosis. Non-cooperation is thus indirectly
threatened with the rejection of any parole request.
Regarding coercive neuroimaging, an unwilling (or even resistant) subject could
be physically coerced to undergo a neuroimaging assessment through the use of a
measure of restraint like a security bed or chemical fixation. Cooperation with
neuroimaging could also be effectuated through legal coercion. For instance: if the
law considers non-cooperation a criminal offence, or the law states that a request for
bail or parole can only be approved if the subject does cooperate with, e.g.,
brain-based lie detection or a neuroprediction test. In this latter scenario, an
unwilling detainee who really wants to get out of prison receives in a certain way an
offer that he cannot refuse.47
The differentiation between physical and legal coercion is relevant in two ways:
practically and legally.48 First of all, not every neuroimaging technology can be
performed effectively with the use of physical coercion. If the subject has to per-
form a task, for instance pressing a yes-or-no button in the context of
brain-based lie detection, physical coercion will be useless. However, in such a
scenario it may still be possible to coerce the subject’s cooperation by making a
severe legal threat to non-cooperation.49 The same applies mutatis mutandis to
neuroimaging assessments where the subject has to observe presented stimuli at-
tentively, such as the concealed information test.50 After all, compelling someone
with physical force to be attentive seems impossible. Moreover, as stated in the
former section, brain-based lie detection and concealed information tests are vul-
nerable to physical and mental countermeasures, like moving your tongue or
recalling emotional memories. Preventing such countermeasures with physical
force is hard to imagine. However, if the use of countermeasures can be detected,51
the law could threaten subjects with negative consequences if they use some kind of
countermeasures.52 Such a legal threat could potentially make the subject decide
not to manipulate the neuroimaging assessment.
Secondly, the distinction between physical and legal coercion could also be
relevant for the legal appreciation of coercive neuroimaging in light of European
human rights. If the authorities only use legal coercion, no (additional) physical
intervention takes place. The presence or absence of physical interventions, such as
the use of a security bed, could, for instance, be relevant in the context of the

46 Verlofregeling TBS.
47 Meynen 2017, pp. 326–327.
48 Ligthart 2017.
49 Meynen 2017, p. 326.
50 Rosenfeld et al. 2008, p. 906; Picton 1992, p. 456.
51 Which is already possible to some extent: Rosenfeld et al. 2013a, pp. 120–121.
52 C.f. Rosenfeld et al. 2013a, p. 121.

prohibition of ill-treatment.53 Furthermore, if a subject agrees with a neuroimaging
assessment and participates willingly—with consent—the question arises whether
European human rights still offer protection to the subject’s interests. For instance,
in some cases, informed consent for a specific treatment can avert state liability
under Article 3 ECHR.54 The follow-up question will be whether consent given
under certain legal pressure, such as a possible rejection of parole, can be considered
valid informed consent in this context.55

6.2.4 Conclusion

This section described different neuroimaging technologies which can represent
brain anatomy and activity. These technologies can be used in a criminal law
context in order to detect lies and memories. Furthermore, they can contribute to
forensic diagnostics and to the prediction of future dangerousness. Thinking about
coercive neuroimaging, it is important to distinguish physical and legal coercion.
This distinction is relevant in a practical sense, because some neuroimaging tech-
nologies, like task-based functional neuroimaging, can hardly be performed with
physical force. Besides, preventing countermeasures with physical force is also hard
to imagine. The distinction between these two types of coercion is also relevant in a
legal sense. Legal coercion does not require (additional) interferences with the sub-
ject’s physical integrity and therefore will be less intrusive, for instance in the
context of the prohibition of ill-treatment. Furthermore, in the context of legal
coercion the subject gives, to some extent, consent for the neuroimaging applica-
tion. Given (informed) consent can, under certain circumstances, avert state liability
for a specific treatment. The next section gives a brief overview of the meaning and
scope of the prohibition of ill-treatment and illustrates how coercive forensic
neuroimaging raises questions in this context.

53 See e.g. ECtHR 29 May 2012, appl.nos. 16563/08, 40841/08, 8192/10 and 18656/10 (Julin/Estonia); ECtHR 18 October 2012, appl.no. 37679/08 (Bureš/Czech Republic). C.f. Thompson 2005, p. 1625.
54 E.g. ECtHR 3 October 2008, appl.no. 35228/03 (Bogumil/Portugal), § 71: “If there was indeed informed consent, as the Government alleges, no issue arises under Article 3 of the Convention.” See on this topic also Buelens et al. 2016; Jacobs 2012, pp. 45–57.
55 C.f. Buelens et al. 2016, pp. 484–485.

6.3 Article 3 ECHR and Coercive Neuroimaging

According to Article 3 ECHR, “No one shall be subjected to torture or to inhuman
or degrading treatment or punishment.” This prohibition is absolute.56 No dero-
gation from it shall be made in times of war or other public emergencies (Article 15
(2) ECHR). Furthermore, the Court notes that “the philosophical basis underpinning
the absolute nature of the right under Article 3 does not allow for any exceptions or
justifying factors or balancing of interests, irrespective of the conduct of the person
concerned and the nature of the offence at issue.”57 Even the need to fight terrorism
and organized crime or to save an individual’s life, cannot justify state conduct that
is prohibited by Article 3 ECHR.58
In order to fall within the scope of Article 3 ECHR a treatment must attain ‘a
minimum level of severity’. The suffering and humiliation involved must in any
event go beyond that inevitable element of suffering or humiliation connected with
a given form of legitimate treatment or punishment.59 The assessment of this
minimum threshold is relative and depends on all circumstances of the case, such as
the duration of the treatment, its physical and mental effects and, in some cases, the
sex, age and state of health of the victim. Further relevant factors include the
purpose for which the treatment was inflicted together with the intention or moti-
vation behind it.60 For the question of whether ill-treatment falls within the scope of
Article 3 ECHR, it is also of importance that the Court approaches the ECHR as a
living instrument, which must be interpreted in the light of present-day condi-
tions.61 Consequently, the threshold for the minimum level of severity has been
lowered by the Court in recent years.62
If a specific treatment reaches the threshold of Article 3 ECHR, and therefore
falls within its scope and breaches the prohibition, it could subsequently be qual-
ified as torture, inhuman or degrading treatment. Where torture can be seen as the
highest grade of ill-treatment, degrading treatment is the lowest.63 According to the
Court’s Grand Chamber, torture contains a “deliberate inhuman treatment causing
very serious and cruel suffering.”64 Treatment is considered to be inhuman when it
was “premeditated, was applied for hours at a stretch and caused either actual

56 See for a nuance: Vorhaus 2002.
57 ECtHR (GC) 1 June 2010, appl.no. 22978/05 (Gäfgen/Germany), § 107.
58 ECtHR (GC) 25 September 1997, appl.no. 23178/94 (Aydin/Turkey), § 81; ECtHR (GC) 28 July 1999, appl.no. 25803/94 (Selmouni/France), § 95; ECtHR (GC) 1 June 2010, appl.no. 22978/05 (Gäfgen/Germany), § 107.
59 ECtHR 18 January 1978, appl.no. 5310/71 (Ireland/UK), § 163.
60 ECtHR (GC) 1 June 2010, appl.no. 22978/05 (Gäfgen/Germany), § 88.
61 ECtHR 25 April 1978, appl.no. 5856/72 (Tyrer/UK), § 31.
62 Harris et al. 2014, p. 237.
63 Vorhaus 2002, p. 375.
64 ECtHR (GC) 25 September 1997, appl.no. 23178/94 (Aydin/Turkey), § 82.

bodily injury or intense physical or mental suffering.”65 A treatment can be held as
degrading “when it humiliates or debases an individual, showing a lack of respect
for, or diminishing, his or her human dignity, or when it arouses feelings of fear,
anguish or inferiority capable of breaking an individual’s moral and physical
resistance.”66 For a treatment to be qualified as degrading, it may be sufficient that
the victim is humiliated in his own eyes, even if not in the eyes of others—as long
as a minimum level of severity has been reached.67 According to Duffy, the position
of the Court is probably that the experience of the victim constitutes an important
consideration, but that a state cannot be convicted for actions which the victim finds
degrading merely because of his own unreasonable attitudes or exceptionally
sensitive nature.68
At first sight it may not immediately be clear how the use of neuroimaging
technologies that are widely used in day-to-day medical practice, like fMRI,
could be problematic regarding the prohibition of ill-treatment. However, there is an
important distinction between neuroimaging in a medical setting and in the context
of criminal law. In a medical setting, persons normally agree with neuroimaging
because it is in their own best interest, for instance to diagnose and subsequently
treat a brain tumour. In the context of criminal law, however, neuroimaging will not
always be in the interest of the subject. Instead, neuroimaging could be applied
coercively by the authorities, for instance in order to detect lies during trial, or to
assess the risk of recidivism of a detainee who has requested probation. Regarding
governmental use of neuroimaging technologies, Meegan warns:
Although one would like to think that free societies could be trusted to use such techniques
appropriately, recent events (e.g., the use of torture in interrogations and the increased
invasiveness of domestic surveillance by the United States since 9/11) make it clear that
such thinking would be naïve.69

The use of coercive neuroimaging in criminal justice, i.e. against a person’s will,
raises questions regarding the prohibition of ill-treatment.70 Imagine, for instance,
that the prison authorities would like to subject a detainee to a CT brain scan, in
order to examine whether or not he suffers from a specific form of brain injury and
therefore poses a risk of future dangerousness.71 A CT scan exposes the subject to
some extent to radiation72 and may require the intrusive injection of contrast liquid into the
bloodstream of the subject. If the detainee refuses cooperation, the authorities could
compel him to undergo the brain scan by using physical coercion, such as the use of

65 ECtHR (GC) 21 January 2011, appl.no. 30696/09 (M.S.S./Belgium and Greece), § 220.
66 ECtHR (GC) 17 July 2014, appl.nos. 32541/08 and 43441/08 (Svinarenko and Slyadnev/Russia), § 115.
67 ECtHR 25 April 1978, appl.no. 5856/72 (Tyrer/UK), § 30–32.
68 Duffy 1983, p. 319.
69 Meegan 2008, p. 14.
70 Thompson 2005.
71 Aharoni et al. 2013 were able to predict the future rearrest of prisoners with neuroimaging.
72 Rushing et al. 2012, p. 8.

instruments of restraint like handcuffs, a head fixation strap and/or a security bed.
Especially if the detainee resists such a treatment, this could lead to physical harm.
Could such a treatment cause great distress and physical suffering and therefore
reach a minimum level of severity?73 And (to what extent) would it be relevant if no
physical but legal coercion were used, for instance by telling the detainee (based
on a legal provision) that if he keeps refusing cooperation, his request for probation
will be denied?74 Fundamental questions like these arise regarding coercive neu-
roimaging in criminal law. The following section explores a research method to
address questions like these.

6.4 Analogy and Deduction

6.4.1 Introduction

Since, at present, no ECtHR case law exists regarding coercive forensic neu-
roimaging in light of Article 3 ECHR, it could be helpful to compare ‘coercive
forensic neuroimaging’ with other treatments, about which case law does already
exist. Doing so, it is important to be aware of the context in which such an analogy
is being created, and which aspects of a treatment are relevant in that specific
context. In this chapter, the analogy takes place in the context of Article 3 ECHR.
For the ultimate judgment of a treatment in this context, the Court specifically pays
attention to four typical characteristics of a treatment itself: the method, nature and
purpose of the treatment75 and the manner in which it was carried out.76 Because
these four characteristics are relevant for the ultimate appreciation of a treatment in
light of Article 3 ECHR, these characteristics play a central role in the analogy in
the following section.

73 C.f. ECtHR 19 February 2015, appl.no. 75450/12 (M.S./Croatia (No. 2)); ECtHR 18 October 2012, appl.no. 37679/08 (Bureš/Czech Republic); ECtHR 13 February 2014, appl.no. 66393/10 (Tali/Estonia).
74 Ligthart 2017.
75 ECtHR (GC) 26 October 2000, appl.no. 30210/96 (Kudła/Poland), § 91; ECtHR 10 October 2001, appl.no. 33394/96 (Price/UK), § 21–30; ECtHR 18 January 1978, appl.no. 5310/71 (Ireland/UK), § 163; ECtHR (GC) 1 June 2010, appl.no. 22978/05 (Gäfgen/Germany), § 88; ECtHR 27 November 2003, appl.no. 65436/01 (Hénaf/France), § 47–60; ECtHR 24 July 2014, appl.no. 28761/11 (Al Nashiri/Poland), § 508.
76 ECtHR (GC) 11 July 2006, appl.no. 54810/00 (Jalloh/Germany), § 82; ECtHR 5 April 2005, appl.no. 54825/00 (Nevmerzhitsky/Ukraine), § 94 and 97.

6.4.2 Analogy

In this section, an analogy will be created between coercive forensic neuroimaging
and other treatments about which case law already exists. As stated in
Sect. 6.4.1, for this we have to focus on the four characteristics that are important in
light of Article 3 ECHR, namely the nature, method and purpose of a treatment and
the manner in which it was carried out. Firstly, this section compares the nature,
method and purpose of coercive forensic neuroimaging with the nature, method and
purpose of compulsory medical treatments to obtain evidence of a crime. Secondly,
the way in which coercive forensic neuroimaging can be carried out—with physical
and legal coercion—will be compared with respectively the use of measures of
restraint and punishment as an instrument of pressure.
The nature, method and purpose of coercive forensic neuroimaging are, to a
relevant extent, comparable with those of a compulsory medical treatment to obtain
evidence of a crime. In the context of Article 3 ECHR, the Court describes such a
treatment as follows:
a medical procedure in defiance of the will of a suspect in order to obtain from him or her
evidence of his or her involvement in the commission of a criminal offence.77

According to this definition, the method of such a treatment is the use of a
medical procedure. The nature of it is that the medical procedure is being used
against the individual’s will. The purpose of this treatment is to obtain evidence
from the suspect, regarding a criminal offence. Keeping this in mind, let us consider
coercive neuroimaging in criminal law.
The method of coercive neuroimaging concerns the use of neuroimaging tech-
nologies, such as (f)MRI, PET and EEG. These technologies are, basically, also
medical technologies and procedures,78 which are normally used in order to
localize a (brain) tumour, or to verify whether someone’s leg is broken. The nature
of coercive neuroimaging is that the medical procedure is being applied against the
subject’s will. The purpose of coercive forensic neuroimaging could be described as
to obtain evidence from (the brain of) the subject, regarding a (future) criminal
offence. For instance, as described in Sect. 6.2, memory detection with EEG can be
used in order to acquire information from the subject’s brain regarding the possible
recognition of crucial aspects of a criminal offence. The use of MRI in the context
of forensic diagnostics, aims to obtain information from the subject’s brain in order
to assess or exclude a mental disorder, for instance in the context of the insanity
defence. In the context of neuroprediction, neuroimaging technologies are being
used to acquire information from the individual’s brain, in order to predict the risk
of a future criminal offence.

77 ECtHR (GC) 11 July 2006, appl.no. 54810/00 (Jalloh/Germany), § 70. C.f. ECtHR 13 May 2008, appl.no. 52515/99 (Juhnke/Turkey), § 72; ECtHR 3 May 2012, appl.no. 23880/05 (Salikhov/Russia), § 73.
78 Greely and Wagner 2011, pp. 763–765, 768 and 772.

Ultimately, as this analysis shows, the nature, method and purpose of coercive
neuroimaging in criminal law are, in a relevant way, comparable with the nature,
method and purpose of a compulsory medical treatment in order to obtain evidence
regarding a criminal offence. Of course, there are also clear differences.79 However,
regarding these three relevant characteristics of a treatment, there are important
similarities. Therefore, case law regarding compulsory medical treatments in order
to obtain evidence of a crime, could be relevant for the judgment of coercive
forensic neuroimaging in the context of the prohibition of ill-treatment.
Regarding the fourth relevant aspect of a treatment—the manner in which it was
carried out—coercive forensic neuroimaging can be applied in two different ways:
with physical and legal coercion. As described in Sect. 6.2.3, physical coercion
consists of two typical components: (1) physical force, (2) which accomplishes an
intended result. The use of measures of restraint also contains these two aspects.
Case law of the ECtHR regarding measures of restraint concerns the use of
handcuffs, ‘fixation beds’ and chemical fixation (physical force), in order to prevent
a person from suppressing evidence, absconding, causing injury or damage, or to
realize a medical intervention (accomplishing an intended result).80 So basically, the
use of physical coercion is in a relevant way comparable with the use of instruments
of restraint.
Legal coercion, on the other hand, consists of (1) a legal threat to the subject
whose cooperation for a specific action is required, (2) which could enforce a
specific result. The use of punishment as an instrument of pressure also consists of
these two aspects. In such a context the law threatens with criminal liability in order
to influence the behaviour of the legal subject concerned. For instance, in the case
of Ülke/Turkey the Turkish authorities tried to compel Ülke to perform his military
service, by successively prosecuting, convicting and punishing him for refusing to
do so.81 Another, more forensic example is given by Peters/The Netherlands. In
this case a prisoner was ordered to undergo a urine test and, when he refused to
comply, a disciplinary sanction was imposed.82 By threatening disciplinary
sanctions, the prison authorities tried to compel Peters to supply a urine sample.
Ultimately, the manner in which coercive neuroimaging is carried out depends
on the type of coercion that is used. On the one hand, physical coercion is, in a
relevant way, comparable with the use of measures of restraint. Legal coercion on

79 For instance, in ECtHR (GC) 11 July 2006, appl.no. 54810/00 (Jalloh/Germany) the medical procedure consisted of forcibly administering emetics via a tube through the applicant’s nose into his stomach, in order to obtain a swallowed package of cocaine. Such a treatment is far more (physically) intrusive than coercive neuroimaging. Nevertheless, both procedures are medical by their nature, imposed against the will of the subject in order to obtain evidence regarding a criminal offence.
80 E.g. ECtHR 29 September 2005, appl.no. 24919/03 (Mathew/The Netherlands), § 180; ECtHR 24 September 1992, appl.no. 10533/83 (Herczegfalvy/Austria), § 81; ECtHR 27 November 2003, appl.no. 65436/01 (Hénaf/France), § 48–52.
81 ECtHR 24 January 2006, appl.no. 39437/98 (Ülke/Turkey).
82 ECHR 6 April 1994, appl.no. 21132/93 (Peters/The Netherlands).

the other hand is, to a relevant extent, comparable with the use of punishment as an
instrument of pressure. Consequently, case law of the ECtHR regarding measures
of restraint and punishment as an instrument of pressure could be relevant for the
examination of coercive neuroimaging in the context of the prohibition of
ill-treatment.

6.4.3 Conclusion: From Analogy to Deduction

In this section an analogy has been created between coercive neuroimaging on the
one hand and, on the other, treatments about which case law of the ECtHR already
exists. This analogy focused on four typical characteristics of a treatment,
which are important for the ultimate judgement in the context of Article 3 ECHR,
namely the nature, method and purpose of the treatment and the manner in which it
was carried out. Ultimately, I argue that coercive neuroimaging is, in a relevant way,
comparable with compulsory medical treatments in order to obtain evidence of a
crime and, depending on the type of coercion that will be used, with the use of
measures of restraint and punishment as an instrument of pressure. In order to
examine the legal implications and boundaries of coercive neuroimaging in the
context of the prohibition of ill-treatment, case law of the ECtHR regarding these
comparable treatments can be analysed. Based on the general conditions that the
Court sets for those comparable treatments, a specific legal framework for coercive
forensic neuroimaging in the context of Article 3 ECHR can be deduced. The
following section explores this kind of deductive reasoning.

6.5 Synthesis: Exploring Legal Implications of Coercive Neuroimaging in Light of Article 3 ECHR

The previous sections described (the forensic relevance of) different neuroimaging
technologies and explored the legal questions that arise regarding a coercive,
forensic use of such technologies in the context of Article 3 ECHR. Furthermore,
coercive forensic neuroimaging has been compared with other treatments, which
enables us to examine whether and, if so, under which conditions comparable
treatments are permitted in light of Article 3 ECHR and, ultimately, to deduce a
legal framework for coercive forensic neuroimaging. This section explores the last
step: the deduction of a legal framework.
As argued in Sect. 6.4, in the context of Article 3 ECHR, coercive forensic
neuroimaging is, to a relevant extent, comparable with the use of a compulsory
medical procedure in order to obtain evidence of a crime. This latter type of
treatment is not as such prohibited under Article 3 of the Convention, provided that

it can be convincingly justified on the facts of a particular case.83 According to the
Court, “[t]his is especially true where the procedure is intended to retrieve from
inside the individual’s body real evidence of the very crime of which he is sus-
pected. The particularly intrusive nature of such an act requires a strict scrutiny of
all the surrounding circumstances.”84 In connection with the intrusive nature of the
act, due regard must be had to the following five factors:
1. the extent to which the forcible medical intervention is necessary to obtain the
evidence;
2. the health risks for the subject;
3. the manner in which the procedure was carried out and the physical pain and
mental suffering it caused;
4. the degree of medical supervision available; and
5. the actual effects on the suspect’s health.85
In light of all the circumstances of the individual case, the intervention must not
attain the minimum level of severity.
So, ultimately, if coercive forensic neuroimaging is, in the context of Article 3
ECHR, in a relevant way comparable with the use of a compulsory medical pro-
cedure in order to obtain evidence of a crime, the same reasoning would apply for
coercive forensic neuroimaging. This means that, if a member state of the Council
of Europe wants to implement coercive forensic neuroimaging in its legal system, it
must have due regard to the five factors as outlined above.86
One could, however, object to this reasoning, because neuroimaging technolo-
gies do not retrieve ‘real’ evidence or material from inside the individual’s body.87
Instead, they only acquire information about brain anatomy or activity.
Furthermore, it can be argued that, basically, neuroimaging is not a particularly
intrusive method, because it does not actually enter the subject’s body, head or
brain.88 It only observes the subject’s brain from outside. Therefore, according to
the court’s argumentation, there is no reason to have due regard to the specified five
factors. After all, those factors are (only) of special importance in connection with
the particularly intrusive nature of a compulsory medical procedure in order to

83 ECtHR (GC) 11 July 2006, appl.no. 54810/00 (Jalloh/Germany), § 70–71; ECtHR 3 May 2012, appl.no. 23880/05 (Salikhov/Russia), § 73–74; ECtHR 13 May 2008, appl.no. 52515/99 (Juhnke/Turkey), § 72. See also ECtHR 5 January 2006, appl.no. 32352/02 (Schmidt/Germany); ECHR 4 December 1978, appl.no. 8239/78 (X./The Netherlands).
84 ECtHR (GC) 11 July 2006, appl.no. 54810/00 (Jalloh/Germany), § 71; ECtHR 3 May 2012, appl.no. 23880/05 (Salikhov/Russia), § 73 and 75.
85 ECtHR (GC) 11 July 2006, appl.no. 54810/00 (Jalloh/Germany), § 71–74; ECtHR 3 May 2012, appl.no. 23880/05 (Salikhov/Russia), § 75.
86 An in-depth analysis regarding the precise meaning of these factors and their implications for coercive forensic neuroimaging is beyond the scope of this chapter.
87 C.f. Pardo and Patterson 2015, p. 156.
88 Except when contrast liquid is required.

obtain evidence of a crime.89 On the other hand, it can also be argued that, while
neuroimaging is not intrusive in a physical sense, it is intrusive in the sense that it
enables the authorities to intrude into someone’s mental life. As Brownsword
states:
With the development of powerful new brain-imaging technologies (…) researchers have a
window into the brains and, possibly, into a deeper understanding of the mental lives, of
their participants.90

These objections and argumentations show that the deduction of a legal
framework for coercive forensic neuroimaging requires an in-depth analysis of the
Court’s case law, as well as of the different neuroimaging technologies. Such an
analysis is, however, beyond the scope of this chapter.

6.6 Conclusion

This chapter opened with the vision of the Grand Chamber of the European Court
of Human Rights that it is beyond dispute that the fight against crime depends to a
great extent on the use of modern scientific technologies. The chapter explained
how different neuroimaging technologies can be very helpful to answer crucial legal
questions and therefore contribute to materializing the aims of criminal law. The
use of such technologies in a criminal law context does, however, also raise fun-
damental legal questions. This chapter explored the implications of coercive neu-
roimaging technologies in light of the prohibition of ill-treatment. It explained that
these technologies can be applied with physical and legal coercion and argued that
this distinction is relevant in two ways: practically and legally. This chapter fur-
thermore explored a research method to examine the legal implications of coercive
neuroimaging in light of the European Convention on Human Rights. In the context
of Article 3 ECHR, I argued that coercive forensic neuroimaging is, in a relevant
way, comparable with the use of a compulsory medical procedure in order to obtain
evidence of a crime. Based on this analogy, some possible legal implications of
coercive forensic neuroimaging in light of the prohibition of ill-treatment have been
explored. Note that this chapter is neither exhaustive nor conclusive, but rather explorative. In order to establish a legal framework for coercive forensic neuroimaging, much more research has to be done. This chapter does, however, provide a useful research method and takes a first step in applying it. These findings
can contribute to the further debate about the legal regulation of neuroimaging
technologies in a criminal law context.

89 ECtHR (GC) 11 July 2006, appl.no. 54810/00 (Jalloh/Germany), § 71; Jacobs 2012, p. 124.
90 Brownsword 2012, p. 223.

References

Aharoni E, Vincent GM, Harenski CL, Calhoun VD, Sinnott-Armstrong W, Gazzaniga MS,
Kiehl KA (2013) Neuroprediction of future rearrest. PNAS 110:6223–6228
Brownsword R (2012) Regulating brain imaging: Questions of privacy, informed consent, and
human dignity. In: Richmond S, Rees S, Edwards SJL (eds) I know what you’re thinking:
Brain imaging and mental privacy. Oxford University Press, London, pp 223–244
Buelens W, Herijgers C, Illegems S (2016) The View of the European Court of Human Rights on
Competent Patients’ Right of Informed Consent. Research in the Light of Article 3 and 8 of the
European Convention on Human Rights. European Journal of Health Law 23:481–509
Catley P, Claydon L (2015) The use of neuroscientific evidence in the courtroom by those accused
of criminal offenses in England and Wales. Journal of Law and the Biosciences 2:510–549
Chandler J (2014) Legally-coerced Consent to Treatment in the Criminal Justice System. In:
Holmes D, Jacob JD, Perron A (eds), Power and the Psychiatric Apparatus. Ashgate Publishing
Limited, Surrey, pp 199–216
Chandler J (2015) The use of neuroscientific evidence in Canadian criminal proceedings. Journal
of Law and the Biosciences 2:550–557
De Kogel CH, Westgeest EJMC (2015) Neuroscientific and behavioral genetic information in
criminal cases in the Netherlands. Journal of Law and the Biosciences 2:580–605
Douglas T, Pugh J, Singh I, Savulescu J, Fazel S (2017) Risk assessment tools in criminal justice
and forensic psychiatry: The need for better data. European Psychiatry 42:134–137
Duffy PJ (1983) Article 3 of the European Convention on Human Rights. International and
Comparative Law Quarterly 32:316–346
Encinas de Muñagorri R, Saas C (2016) France. Is the Evidence Too Cerebral to Be Cartesian? In:
Moratti S, Patterson D (eds) Legal Insanity and the Brain: Science, Law and European Courts.
Hart Publishing, Portland, pp 77–110
Farah MJ, Hutchinson JB, Phelps EA, Wagner AD (2014) Functional MRI-based lie detection:
scientific and societal challenges. Nature Reviews Neuroscience, 15:123–131
Farahany NA (2012a) Incriminating Thoughts. Stanford Law Review 36:351–408
Farahany NA (2012b) Searching Secrets. University of Pennsylvania Law Review 160:1239–1307
Farahany NA (2015) Neuroscience and behavioral genetics in US criminal law: an empirical
analysis. Journal of Law and the Biosciences 2:485–509
Farrell BR (2009) Can’t get you out of my head: The Human Rights Implications of Using Brain
Scans as Criminal Evidence. Interdisciplinary Journal of Human Rights Law 4:89–96
Gamer M (2014) Mind Reading Using Neuroimaging. Is This the Future of Deception Detection?
European Psychologist 19:172–183
Glenn AL, Raine A (2014) Neurocriminology: Implications for the Punishment, Prediction and
Prevention of Criminal Behaviour. Nature Review Neuroscience 15:54–63
Greely HT (2009) Neuroscience-Based Lie Detection: The Need for Regulation. In: Bizzi E,
Hyman SE, Raichle ME, Kanwisher N, Phelps EA, Morse SJ, Sinnott-Armstrong W,
Rakoff JS, Greely HT (authors) Using Imaging to Identify Deceit. Scientific and Ethical
Questions. American Academy of Arts and Sciences, Cambridge, pp 45–55
Greely HT (2013) Mind Reading, Neuroscience, and the Law. In: Morse SJ, Roskies AL (eds) A
Primer on Criminal Law and Neuroscience. Oxford University Press, New York, pp 120–149
Greely HT, Wagner AD (2011) Reference Guide on Neuroscience. National Academies Press/
Federal Judicial Center, Washington D.C.
Greene J, Cohen J (2004) For the law, neuroscience changes nothing and everything. Philos
Trans R Soc Lond B Biol Sci. https://doi.org/10.1098/rstb.2004.1546
Harris DJ, O’Boyle M, Bates E, Buckley C (2014) Harris O’Boyle & Warbrick: Law of the
European Convention on Human Rights. Oxford University Press, New York
Jacobs P (2012) Force-feeding of prisoners and detainees on hunger strike (diss. Tilburg).
Intersentia, Antwerp

Ligthart SLTJ (2017) Gedwongen neurotests in de strafrechtspleging: Dwangvormen en hun (juridische) relevantie. Strafblad 6:507–512
Ligthart SLTJ (2018) Gedwongen brain imaging in de strafrechtspleging en artikel 3 EVRM: van
analogie naar deductie. In: Bosma AK, Buisman SS (eds) Methoden van onderzoek in het
strafrecht, de criminologie en de victimologie. Wolters Kluwer, Deventer, pp 51–66
Ligthart SLTJ, Kooijmans T, Meynen G (2017) Neurotests in de Nederlandse strafrechtspleging:
een verkenning van juridische mogelijkheden en uitdagingen. Delikt en Delinkwent 8:579–603
Meegan DV (2008) Neuroimaging Techniques for Memory Detection: Scientific, Ethical, and
Legal Issues. AJOB 8:9–20
Meijer EH, klein Selle N, Elbert L, Ben-Shakhar G (2014) Memory detection with the Concealed
Information Test: A meta-analysis of skin conductance, respiration, heart rate, and P300 data.
Psychophysiology, 51:879–904
Meijer EH, Verschuere B, Merckelbach H, Ben-Shakhar G (2016) Deception detection with
behavioral, autonomic, and neural measures: Conceptual and methodological considerations
that warrant modesty. Psychophysiology 53:593–604
Meixner Jr JB (2018) Admissibility and Constitutional Issues of the Concealed Information Test in
American Courts: An Update. In: Rosenfeld JP (ed) Detecting Concealed Information and
Deception: Recent Developments. Academic Press, London, pp 405–430
Meynen G (2016a) Legal Insanity: Explorations in Psychiatry, Law and Ethics. Springer, Cham
Meynen G (2016b) Legal Insanity and Neurolaw in the Netherlands. In: Moratti S, Patterson D
(eds) Legal Insanity and the Brain: Science, Law and European Courts. Hart Publishing,
Portland, pp 137–168
Meynen G (2017) Brain-based mind reading in forensic psychiatry: exploring possibilities and
perils. Journal of Law and the Biosciences 4:311–329
Meynen G (2018) Forensic psychiatry and neurolaw: Description, developments and debates.
International Journal of Law and Psychiatry. https://doi.org/10.1016/j.ijlp.2018.04.005
Moratti S, Patterson D (eds) (2016) Legal Insanity and the Brain: Science, Law and European
Courts. Hart Publishing, Portland
Nadelhoffer T, Bibas S, Grafton S, Kiehl KA, Mansfield A, Sinnott-Armstrong W, Gazzaniga M
(2012) Neuroprediction, Violence, and the Law: Setting the Stage. Neuroethics 5:67–99
Nadelhoffer T, Sinnott-Armstrong W (2012) Neurolaw and Neuroprediction: Potential Promises
and Perils. Philosophy Compass 7:631–642
Pardo MS, Patterson D (2015) Minds, Brains, and Law. The Conceptual Foundations of Law and
Neuroscience. Oxford University Press, New York
Picton TW (1992) The P300 wave of the human event-related potential. J Clin Neurophysiol
9:456–479
Richmond S (2012) Introduction. In: Richmond S, Rees S, Edwards SJL (eds) I know what you’re
thinking: Brain imaging and mental privacy. Oxford University Press, London, pp 1–10
Rosenfeld JP (ed) (2018) Detecting Concealed Information and Deception: Recent Developments.
Academic Press, London
Rosenfeld JP, Soskins M, Bosh G, Ryan A (2004) Simple, effective countermeasures to
P300-based tests of detection of concealed information. Psychophysiology 41:205–219
Rosenfeld JP, Labkovsky E, Winograd M, Lui MA, Vandenboom C, Chedid E (2008) The
Complex Trial Protocol (CTP): a new, countermeasure-resistant, accurate, P300-based method
for detection of concealed information. Psychophysiology 45:906–919
Rosenfeld JP, Hu X, Labkovsky E, Meixner J, Winograd MR (2013a) Review of recent studies
and issues regarding the P300-based complex trial protocol for detection of concealed
information. International Journal of Psychophysiology 90:118–134
Rosenfeld JP, Labkovsky E, Winograd M, Lui MA, Vandeboom C, Chedid E (2013b) The
Complex Trial Protocol (CTP): A new, countermeasure-resistant, accurate, P300-based method
for detection of concealed information. Psychophysiology 45:906–919
Roskies AL (2013) Brain Imaging Techniques. In: Morse SL, Roskies AL (eds) A primer on
criminal law and neuroscience. Oxford University Press, New York, pp 37–74

Rusconi E, Mitchener-Nissen T (2013) Prospects of functional magnetic resonance imaging as lie detector. Frontiers in Human Neuroscience. https://doi.org/10.3389/fnhum.2013.00594
Rushing SE, Pryma DA, Langleben DD (2012) PET and SPECT. In: Simpson JR
(ed) Neuroimaging in Forensic Psychiatry: From the Clinic to the Courtroom.
Wiley-Blackwell, Chichester, pp 3–26
Shen FX (2013) Neuroscience, Mental Privacy, and the Law. Harvard Journal of Law & Public
Policy 36:653–713
Shen FX (2016a) Neuroscientific evidence as instant replay. Journal of Law and the Biosciences.
https://doi.org/10.1093/jlb/lsw029
Shen FX (2016b) Law and Neuroscience 2.0. Ariz. St. L.J. 48:1043–1086
Simpson JR (ed) (2012) Neuroimaging in Forensic Psychiatry: From the Clinic to the Courtroom.
Wiley-Blackwell, Chichester
Thompson SK (2005) The legality of the use of psychiatric neuroimaging in intelligence
interrogation. Cornell Law Review 90:1601–1638
Van Toor DAG (2017) Het schuldige geheugen? Een onderzoek naar het gebruik van
hersenonderzoek als opsporingsmethode in het licht van de eisen van instrumentaliteit en
rechtsbescherming (diss.), Nijmegen. Wolters Kluwer, Deventer
Verschuere B, Ben-Shakhar G, Meijer EH (eds) (2011) Memory Detection: Theory and
Application of the Concealed Information Test. Cambridge University Press, New York
Verschuere B, Meijer EH (2014) What’s on Your Mind? Recent Advances in Memory Detection Using the Concealed Information Test. European Psychologist 19:162–171
Vincent NA (2011) Neuroimaging and Responsibility Assessments. Neuroethics 4:35–49
Vorhaus J (2002) On Degradation – Part One: Article 3 of the European Convention on Human Rights. Common Law World Review 31:374–399
Wagner AD, Bonnie RJ, Casey BJ, Davis A, Faigman DL, Hoffman MB, Jones OD, Montague R,
Morse S, Raichle ME, Richeson J, Scott ES, Steinberg L, Taylor-Thompson KA, Yaffe G
(2016) fMRI and Lie Detection. The MacArthur Foundation Research Network on Law and
Neuroscience

Sjors L.T.J. Ligthart is a Ph.D. candidate at Tilburg University, Department of Criminal Law,
supervised by Prof. Dr. T. Kooijmans and Prof. Dr. G. Meynen.
Part III
New Technologies and Market Regulation
Chapter 7
Planting the Seeds of Market Power: Digital Agriculture, Farmers’ Autonomy, and the Role of Competition Policy

Tom Verdonk

Contents

7.1 Introduction
7.2 Digital Agriculture in the European Union and Its Benefits
    7.2.1 Technological Background: What is Digital Agriculture?
    7.2.2 The Benefits of Digital Agriculture
7.3 Adverse Effects on Competition and Farmers’ Autonomy
    7.3.1 Competitive Analysis
    7.3.2 Superior Bargaining Position Facilitates Exclusionary and Exploitative Conduct
7.4 Regulatory Responses: EU Competition Law Fit-for-Purpose?
    7.4.1 Assessment Under EU Competition Law
    7.4.2 Beyond Traditional Competition Law: Other Rules to the Rescue?
7.5 How to Proceed: Policy Recommendations
References

Abstract Digital technologies are expected to radically transform agriculture, especially as the use of collected data could influence farmers’ decision-making and
facilitate precision farming. In the face of major food security challenges,
production-enhancing and resource-efficient innovations in digital agriculture are
generally considered beneficial. However, digital agriculture could also exacerbate
existing power imbalances, dependencies and barriers to entry in the already highly
concentrated agricultural inputs markets of seeds and agrochemicals, particularly
since the few remaining conglomerate suppliers could misuse their market power and platforms for digital agriculture services could benefit from network effects. As
data-driven digital agriculture grows rapidly as a valuable asset and an important
input factor for agricultural production in the European Union, it is important to
assess its role from a competition policy perspective. This chapter explains how
some digital agriculture-related practices may lead to distortions of competition and
deteriorations of farmers’ autonomy, but nonetheless do not necessarily violate EU
competition rules. In response to these market power concerns, however, authorities
may seek regulatory solutions beyond EU competition law. In that regard, (pro-
posals for) laws on unfair trading practices, sector-specific legislation and
self-regulatory mechanisms are worth exploring, not least on the basis of the
EU’s Common Agricultural Policy.

Keywords digital agriculture · agricultural inputs · competition law · farmers’ autonomy · Common Agricultural Policy · unfair trading practices · sector-specific legislation · self-regulation · European Union

T. Verdonk
Institute for Consumer, Competition & Market, University of Leuven (KU Leuven), 3000 Leuven, Belgium
e-mail: tom.verdonk@kuleuven.be

7.1 Introduction

At its popularity peak in March 2010, the social network game FarmVille had 35
million daily active users and over 80 million monthly active users. Within two
years after its launch in 2009, the game, which allowed players to run virtual farms,
had become the most popular game on Facebook, a position it held for over two
years.1 Needless to say, the concept of cultivating a farm by plowing, planting and
harvesting crops and trees and taking care of farm animals brought joy to many, and its popularity was perhaps pushed to the point of absurdity by the game’s following within the agricultural community. Indeed, stories of farmers switching on
their computers after a day of hard work to repeat their activities, only this time in a
virtual world, appeared in the media.2
Almost a decade after FarmVille’s popularity peak, most players have taken off
their virtual farming boots for good, but the idea of running a farm behind a
computer is still very much alive. Only this time, the concept has found its way into
the real world. In recent years, many digital technologies have been introduced into
the agricultural sector. Although the digitisation and automation of farming

1 Frum L (2014) Five years on, millions still dig ‘FarmVille’. www.edition.cnn.com/2014/07/31/tech/gaming-gadgets/farmville-fifth-anniversary/index.html (accessed July 2018); Helft M (2010) Will Zynga Become the Google of Games? www.nytimes.com/2010/07/25/business/25zynga.html (accessed July 2018); Liszkiewicz A (2010) The Real Reason You Are Addicted To Farmville (And Zynga Is A $5 Billion Company). www.businessinsider.com/the-cleverest-explanation-as-to-why-zynga-is-a-multi-billion-company-you-will-ever-read-2010-4?IR=T (accessed July 2018).
2 Quenqua D (2009) To Harvest Squash, www.nytimes.com/2009/10/29/fashion/29farmville.html (accessed July 2018); CBC News (2010) Even real farmers play Facebook game Farmville. www.cbc.ca/news/even-real-farmers-play-facebook-game-farmville-1.897140 (accessed July 2018).

activities and the respective technological trends seem to have gained a firmer
foothold in the United States so far, European farmers are gradually discovering
them as well, not least due to generous support from the European Union (EU).3
Despite the clear benefits of these innovations, digital agriculture raises a host of
legal issues. Until now, most public attention has focused on the ownership of the data that these technologies collect and analyse, driven by security and privacy concerns.4 Recent mergers and acquisitions activity in the agricultural inputs industries has, however, also raised questions related to market power among competition authorities. Merger control assessments revealed that their main
concern is that digital agriculture services could exacerbate existing power imbal-
ances, dependencies, and barriers to entry in the already highly concentrated
agricultural inputs supply markets, at the expense of farmers and consumers.
As (data-driven) digital agriculture grows rapidly as a valuable asset and an
important input factor for agricultural production, it is important to assess its role
from a competition policy perspective. In particular, the argument has been put
forward that digital agriculture may facilitate consolidation and conglomerate
market power, leading to distortions of competition. Moreover, certain digital
agriculture-related practices may lead to deteriorations of farmers’ autonomy.
Though protection of farmers’ autonomy is not necessarily an issue for EU com-
petition law, regulatory intervention to protect farmers against certain (possibly)
harmful trading practices has been justified on the basis of other fields of law and
policy. In the EU, the Common Agricultural Policy (CAP) has provided deroga-
tions from competition policy since the Treaty of Rome,5 many EU Member States
have introduced specific laws on so-called unfair trading practices (UTPs), often
specifically addressing the food supply chain,6 and, most recently, the European
Commission has proposed a directive to tackle UTPs at the expense of farmers and
other small and medium-sized enterprises (SMEs) in the food supply chain.7
Against this background, this chapter seeks to explore the (potential) competi-
tion concerns of digital agriculture in the EU and to examine appropriate regulatory
responses. The second section describes the development, current and future role
and benefits of digital agriculture services, particularly those based on data col-
lection and usage. The third section then discusses some of the risks of digital
agriculture, especially for farmers and competitors. Building on that, the fourth
section analyses the implications of digital agriculture for competition policy,
elaborates on some of the challenges for competition law enforcement of digital
agriculture-related competition concerns and incorporates an assessment of the

3 Lamborelle and Fernández Álvarez 2016.
4 See e.g. Ferris 2017; Sykuta 2016; Wolfert et al. 2017.
5 See Regulation (EU) No 1308/2013 establishing a common organisation of the markets in agricultural products (CMO Regulation) [2013] OJ L347/672.
6 For an overview, see Renda et al. 2014, pp. 385–389.
7 European Commission 2018a.

(potential) relevance of other areas of law and policy in developing appropriate regulatory responses. The fifth and final section shares a few recommendations for policy and regulatory action.

7.2 Digital Agriculture in the European Union and Its Benefits

7.2.1 Technological Background: What is Digital Agriculture?

The agricultural sector, like many others, has seen the emergence of information
technology-based innovations such as remote sensing, cloud computing and
‘Internet of Things’ that (could) disrupt the traditional methods of production and
distribution.8 The concept of digital agriculture, also often colloquially dubbed ‘smart farming’ or ‘farming 4.0’, refers to “the collection, analysis and use of data
from a multiplicity of sources with the goal of optimizing productivity, profitability
and sustainability of farming operations.”9 To that end, a wide range of tools is
used, including self-driving tractors, GPS (global positioning systems), robot
milking machines, automated egg production, drones, satellite data and social
media.10 Whereas the application of digital agriculture is not limited to a single
agricultural sub-sector (e.g. crop production or livestock), it is closely related to the
older concept of precision agriculture, which could be defined as “the practice of
using a segmented management approach in which the various aspects of crop
production are tailored to meet the unique needs of each individual segment of
land.”11 In other words, this practice revolves around measuring and responding to
field variability for crops, and aims to optimise agricultural processes to ensure
maximum productivity and increase resource efficiency.12
A key element of these developments in digital agriculture is that many of these
new technologies are data-driven.13 Data collection and analysis in agriculture is
certainly not new; it has been around since the introduction of precision agriculture
a few decades ago.14 New data technologies or ‘Big Data’, however, can improve
these processes significantly, particularly in terms of speed. Unlike the older pre-
cision agriculture technologies, many of the new technologies instantly (real-time)
send data to the suppliers. Prior to this, farmers would have to manually transfer

8 Van Es and Woodard 2017.
9 The Hale Group and LSC International 2014.
10 Poppe et al. 2015, p. 11.
11 Ferris 2017, p. 310; Magnin 2016.
12 Tzounis et al. 2017, p. 32.
13 Wolfert et al. 2017, p. 70; Lianos and Katalevsky 2017, p. 5.
14 Russo 2013.

data to their computers or hand USB sticks to agronomists to analyse.15 Big Data is
defined by several characteristics: volume, referring to the size of the data; velocity,
measuring the flow of data; variety, reflecting the frequent lack of structure or
design to the data; and veracity, reflecting the accuracy and credibility of the data.16
For agricultural production, a classification of three relevant data categories can be
made: agronomic data, which refers to information regarding the yields of crops and
the number of input products (such as seeds, pesticides, water, etc.) applied;
machine data, which refers to information about equipment; and weather data.17
The mere collection of agricultural Big Data has no real value in itself; rather, it is an adequate analysis of the three data categories, with the four V’s in mind, that helps farmers to create context, situation and location awareness and can thus be used to enhance management and decision-making tasks.18 In order to increase the user-friendliness of digital agriculture services, their suppliers have developed integrated platforms, which can be used on smartphones and tablets. Examples of such platforms
available in Europe are AGCO’s AgCommand, John Deere’s FarmSight, BASF’s
Maglis, Bayer’s Xarvio Field Manager, and Monsanto’s VitalFields. As one will
notice, these platforms belong to producers and suppliers of agricultural inputs or
farming equipment. Although these platforms are currently predominantly com-
mercialised as decision-support systems, a gradual shift occurs causing these
platforms to serve as intermediaries to farmers for purchasing inputs like seeds and
pesticides.19
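
By way of illustration only, the following minimal sketch (in Python) shows how the three data categories might feed into a decision-support recommendation of the kind such platforms offer. All field names, threshold values and the decision rule are invented assumptions for this example and do not describe any actual platform:

```python
from dataclasses import dataclass

# Purely illustrative: names, thresholds and the rule below are
# assumptions for this sketch, not features of any real platform.

@dataclass
class AgronomicData:            # yields and applied inputs
    soil_moisture: float        # fraction of field capacity (0..1)

@dataclass
class MachineData:              # information about equipment
    irrigator_available: bool

@dataclass
class WeatherData:
    rain_forecast_mm: float     # expected rainfall over the next 24 hours

def irrigation_advice(agro: AgronomicData,
                      machine: MachineData,
                      weather: WeatherData) -> str:
    """Toy decision-support rule combining all three data categories."""
    if not machine.irrigator_available:
        return "no action possible: irrigation equipment unavailable"
    if agro.soil_moisture < 0.4 and weather.rain_forecast_mm < 5.0:
        return "recommendation: irrigate today"
    return "recommendation: no irrigation needed"

print(irrigation_advice(AgronomicData(soil_moisture=0.3),
                        MachineData(irrigator_available=True),
                        WeatherData(rain_forecast_mm=1.2)))
```

Even this crude rule illustrates why the platform, rather than the farmer, increasingly sits at the point where the data converge into a decision.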

7.2.2 The Benefits of Digital Agriculture

Today’s situation in which food production exceeds the world population’s demand
is historically unique, yet has also become increasingly fragile due to a growing
world population, climate change and urbanisation.20 In the face of growing food
security concerns,21 innovations in agricultural production are—among other public
and private initiatives—key in safeguarding food security for future generations.
Not surprisingly, digital agriculture is widely regarded as an appropriate method to

15 Khan 2013.
16 Coble et al. 2018, p. 80.
17 Dowell 2015 via Ferris 2017, p. 313.
18 Wolfert et al. 2014, pp. 267–268.
19 Lianos and Katalevsky 2017, pp. 5–6.
20 Keulemans 2015.
21 According to the Rome Declaration on World Food Security and the World Food Summit Plan of Action, food security exists “when all people at all times have physical, social and economic access to sufficient, safe and nutritious food that meets their dietary needs and food preferences for an active and healthy life.”

that end.22 Digital agriculture creates many benefits for farmers as well as other
public and private stakeholders, such as increasing productivity, lowering pro-
duction and transaction costs, improving transparency, creating better regulation,
and more effectively pursuing various public interests.
The most basic advantage of digital agriculture is that digital innovations can
help farmers produce higher yields, lower crop damage and use fewer inputs such
as water, fuel and fertiliser.23 Data analytics helps to optimise decision-making,
thereby reducing production and transaction costs. These costs can also be reduced
due to economies of scope as many of these platforms are held by conglomerate
suppliers, which also supply agricultural inputs like seeds and pesticides. Their
platforms could, therefore, serve as one-stop shop solutions for farmers, which
eliminates the necessity for farmers to invest time and energy in searching for,
establishing and maintaining relationships with different suppliers.24 All of these
resultant efficiencies may be passed on to consumers, lowering their food prices and
(by using fewer chemicals like pesticides and fertilisers) benefiting their health.25
These benefits will be even greater due to the network economy characteristics
of data-driven technologies. By collecting data, providers are able to improve their
services. This may create so-called ‘positive feedback loops’: as the collection of
data can lead to improvements, these services may attract more farmers, which in
turn enables providers to collect even more data, which once again can be used to
improve their services.26 In other words, the network effects of digital agriculture
platforms could create ever more benefits for farmers in the future, if these inno-
vations continue to grow in popularity. In addition, some economic theories argue
that economies of scale and scope in general could benefit innovation (see e.g. the
‘Schumpeterian Hypothesis’).27
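
The compounding nature of such a feedback loop can be made concrete with a toy numerical model. The sketch below (in Python) uses invented starting values and growth rates, chosen purely for illustration rather than estimated from any market data:

```python
import math

# Toy model of a data-driven positive feedback loop.
# All starting values and growth rates are invented for illustration.
farmers = 1_000      # platform users
quality = 1.0        # service quality index (1.0 = baseline)

for season in range(1, 6):
    # More users -> more data -> a better service ...
    quality *= 1 + 0.001 * math.sqrt(farmers)
    # ... and a better service attracts more users.
    farmers = int(farmers * (1 + 0.5 * (quality - 1.0)))
    print(f"season {season}: {farmers:5d} farmers, quality {quality:.3f}")
```

Even in this crude sketch the growth accelerates from season to season, which is precisely the self-reinforcing dynamic that can entrench an early lead.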
Moreover, the gathering of information could empower farmers because it creates visibility of the performance of agricultural inputs, provided that farmers get access to the data. Similar to ‘normal’ consumers, farmers and other organisations may suffer from biases that lead to non-optimal decisions. Access to the data could give farmers insights into the earnings-to-costs ratio of (brand-name) products.28 At the same time, increased transparency of
agrochemical usage could be valuable to public authorities for environmental

22 Gebbers and Adamchuk 2010; Perez 2002; Poppe et al. 2013; last two sources via EIP-AGRI Seminar 2016, p. 7.
23 Tzounis et al. 2017, p. 32.
24 Lianos and Katalevsky 2017, p. 10.
25 Lamborelle and Fernández Álvarez 2016.
26 OECD 2017, p. 58; Schepp and Wambach 2016, p. 121.
27 Schumpeter 1942.
28 Lianos and Katalevsky 2017, p. 8.

protection and food safety purposes.29 For example, the technology could generate
reliable information that can be used to design better regulation, and effective
enforcement may benefit from early detection of violations of the relevant regulations. This is also in the interest of law-abiding farmers, as the ‘rotten apples’ or ‘bad seeds’ in the market are dealt with.30
Given these benefits, it is hardly surprising that the EU grants financial support
to the development and utilisation of digital agriculture. The EU made 100 million
euros available under the Horizon 2020 work programme 2018–2020 to finance the
development and uptake of digital agriculture.31 Moreover, the European
Commission established the European Innovation Partnership for agriculture
(EIP-AGRI), which funds (digital) agricultural innovation and has a network that
shares relevant knowledge with different public and private stakeholders.32 The
Commission also supports digital agriculture through subsidies to farmers and
researchers, mainly through funds under the CAP and a guarantee scheme in close
cooperation with the European Investment Bank.33
In light of the objectives of the CAP, support from the EU for digital agriculture
is fully understandable. Amongst other objectives, the CAP seeks—on the basis of
Article 39 of the Treaty on the Functioning of the European Union (TFEU)—to
increase agricultural productivity, to ensure a fair standard of living for the agri-
cultural community, to stabilise markets, to assure the availability of supplies, and
to ensure that supplies reach consumers at reasonable prices.34 Digital technologies
could contribute to all of these objectives. Indeed, these technologies are able to
increase agricultural productivity and to optimise utilisation of production—after
all, that is their main objective. Moreover, optimisation of production implies a
higher turnover against relatively lower costs, which would increase the individual
earnings of farmers. Furthermore, data-driven technologies could help to stabilise
markets, as data collection could create market transparency and obtain insights on
the ratio of supply to demand. Market stabilisation as an objective is closely con-
nected with assuring the availability of supplies, as mechanisms must be designed

29 Carbonell 2016. Environmental protection could benefit from the increased transparency, as the technologies would gather information that may be used for research purposes. In that regard, one could, for example, think of research on the Bee Colony Collapse Disorder. Related to this, the European Commission adopted new rules in May 2018, which allow satellite data to be used as evidence when checking farmers’ fulfilment of requirements under the CAP for area-based payments as well as cross-compliance requirements. See https://ec.europa.eu/info/news/modernising-cap-satellite-data-authorised-replace-farm-checks-2018-may-25_en (accessed July 2018).
30 Ferris 2017, p. 317; American Farm Bureau Federation 2016, which refers to a US survey in which a majority of farmers (77%) actually expressed their concerns that data would be used for regulatory enforcement purposes.
31 European Commission 2017.
32 Lamborelle and Fernández Álvarez 2016. Visit EIP-AGRI’s website for more information: www.ec.europa.eu/eip/agriculture/en/focus-groups/mainstreaming-precision-farming (accessed July 2018).
33 Lamborelle and Fernández Álvarez 2016.
34 For an extensive discussion of EU agriculture law and the CAP, see McMahon and Cardwell 2015.

to smooth out (short-term) fluctuations in prices, demand and supply. Finally, production-enhancing and resource-efficient innovations lower prices for the final consumers, if intermediary suppliers pass on the generated economic efficiencies to them.

7.3 Adverse Effects on Competition and Farmers’ Autonomy

Since digital agriculture produces a host of beneficial effects, the general support for
and the rapid emergence of its technologies are hardly surprising. From a business
perspective, farmers are likely to adopt these technologies, as long as the gains
outweigh the costs. In particular, digital agriculture’s promise of a high return on
investment is very appealing to farmers.35 However, digital technologies may also
generate certain risks—including from a competition policy perspective—that may
not be adequately appraised through a business perspective, which predominantly
takes into account the (short-term) interests of an individual firm. The rapid
emergence of digital agriculture and its future significance for competition policy
can be observed in the European Commission’s decisional practice
with regard to the most recent merger wave in the agricultural input industry.36 This
wave was initiated in 2014 by agricultural input producer Monsanto’s attempt to
acquire competitor Syngenta.37 Although this bid stranded in August 2015, it
triggered a number of other mergers and acquisitions transactions involving agro-
chemical firms. Most notably, the merger between Dow and DuPont, the acquisition
of Syngenta by ChemChina and the acquisition of Monsanto by Bayer attracted

35 Johnson 2012.
36 For an elaborate overview of recent consolidation processes in the agricultural input industry, see IPES-Food 2017.
37 Kirchfeld A, Noel A and Winters P (2014) Monsanto Said to Have Weighed $40 Billion Syngenta Deal. www.bloomberg.com/news/articles/2014-06-23/monsanto-said-to-have-weighed-40-billion-syngenta-deal (accessed July 2018); Gillam C (2015) Monsanto drops pursuit of Swiss agribusiness rival Syngenta. www.reuters.com/article/us-syngenta-ag-m-a-monsanto/monsanto-drops-pursuit-of-swiss-agribusiness-rival-syngenta-idUSKCN0QV1SL20150826 (accessed July 2018).

scrutiny from authorities and stakeholders alike.38 In its review decision of the Dow/DuPont merger, finalised in March 2017, the Commission still described precision agriculture and data-driven agriculture as “a growing trend”, but remained cryptic on their future development and ultimately refrained from drawing any conclusions with regard to their implications for competition.39 Less than a year later, when reviewing Bayer’s acquisition of Monsanto, EU Commissioner for Competition
Vestager noted that “digitalisation is radically changing farming” and explicitly
warned “that through the merger, competition in the area of digital farming and
research is not impaired”.40 Nevertheless, the Commission approved the merger
about a month later, albeit under strict conditions.41
This turn of events justifies a closer examination of the implications of digital agriculture for competition assessments, especially in the long term and with a view to the overall functioning of agricultural markets. This section examines the current state of competition in digital agriculture and agricultural inputs, and then explains how digital agriculture may impact competition and farmers’ autonomy.
Subsequently, this section discusses a number of practices that could arise from
market power imbalances.

38 In line with its case practice, the Commission assesses parallel transactions according to the so-called ‘priority rule’ (first come, first served): European Commission (2017) Dow/DuPont (Case M.7932) [no final non-confidential version published yet]. www.ec.europa.eu/competition/mergers/cases/decisions/m7932_13668_3.pdf; European Commission (2017) ChemChina/Syngenta (Case M.7962) [no final non-confidential version published yet]. www.ec.europa.eu/competition/mergers/cases/decisions/m7962_4097_3.pdf (accessed July 2018); European Commission (2018a, b, c) Bayer/Monsanto (Case M.8084) [no non-confidential version published yet]. In May 2018, the U.S. Department of Justice approved the acquisition as well. Similar to commitments made in the European Commission’s merger review, Bayer made commitments to divest some assets to secure approval from the antitrust authority. See U.S. V. Bayer AG and Monsanto Company (2018) U.S. V. Bayer AG and Monsanto Company, Case 1:18-cv-0124. www.justice.gov/atr/case/us-v-bayer-ag-and-monsanto-company (accessed July 2018).
39 In contrast, most players from the crop protection industry felt more able to predict the future and stated that “precision agriculture will play a major role in the [European Economic Area] in the next five to 10 years.” See European Commission (2017) Dow/DuPont (Case M.7932) [no final non-confidential version published yet]. www.ec.europa.eu/competition/mergers/cases/decisions/m7932_13668_3.pdf (accessed July 2018), para 246.
40 Original text: “Die Digitalisierung verändert auch die Landwirtschaft radikal. Man kann für jeden Quadratmeter genau ermitteln, was die ideal dosierte Saat oder Pestizidmenge ist. Das ist faszinierend. Gerade deshalb müssen wir aufpassen, dass durch die Fusion der Wettbewerb beim Digital Farming und bei der Forschung hierzu nicht eingeschränkt wird.” [In English: “Digitalisation is also radically changing agriculture. For every square metre, one can determine precisely what the ideally dosed quantity of seed or pesticide is. That is fascinating. Precisely for this reason we must take care that the merger does not restrict competition in digital farming and in the related research.”] Höning A and Beermann M (2018) “Wir wollen Fusionen nicht verhindern”. www.rp-online.de/wirtschaft/unternehmen/margrethe-vestager-wir-wollen-fusionen-nicht-verhindern_aid-20606393 (accessed July 2018); translation from Taylor E (2018) EU says Bayer Monsanto must not hurt competition in digital farming. www.reuters.com/article/us-monsanto-m-a-bayer-eu/eu-says-bayer-monsanto-must-not-hurt-competition-in-digital-farming-paper-idUSKBN1FU0IJ (accessed July 2018).
41 European Commission—DG COMP (2018) Mergers: Commission clears Bayer’s acquisition of Monsanto, subject to conditions. Press Release IP/18/2282. http://europa.eu/rapid/press-release_IP-18-2282_en.htm (accessed July 2018).

7.3.1 Competitive Analysis

To provide insights into the overall industry competition in digital technology and to determine where the market power lies in the network of buyers, suppliers, new
entrants, competitors and substitutes in the market of digital agriculture, a so-called
‘five forces’ analysis can be used. This tool, developed by Michael E. Porter, offers
a framework to evaluate the competitive intensity of an industry. It includes three
forces from horizontal competition—(1) the threat of established rivals, (2) the
threat of new entrants, and (3) the threat of substitute products or services—and two
others from vertical competition—(4) the bargaining power of suppliers and (5) the
bargaining power of customers.42 Please note that some industry developments
impact more than one force. Either way, in an attempt to present a concise analysis,
the following observations can be made with respect to the competitive forces for
the digital agriculture technology industry in the EU.
(1) The threat of established rivals
When opening a second phase investigation with regard to the Bayer/Monsanto
merger, the Commission expressed in an accompanying press release its concerns
that digital agriculture could exacerbate conglomerate market power, since the
merged entity would hold both the largest portfolio of pesticides products and the
strongest global market positions in seeds and traits, making it the largest integrated
company in the industry. Notably, with the advent of digital agriculture, the concern was that this could give the merged entity the ability to exclude competitors
from the market through bundling of seeds and pesticides products at distributor
level or at grower level.43
Although the subsequent in-depth investigation did not confirm these competition concerns, the Commission nonetheless deemed it necessary for Bayer to commit to licensing a copy of its current worldwide offering and pipeline on digital agriculture to BASF, a German chemical company also active in the field of agriculture. In order
to address the loss of potential competition in Europe between Bayer’s recently
launched Xarvio offering and Monsanto’s FieldView platform (the leading platform
worldwide, but only launched in Europe shortly after the Commission announced
its decision), Bayer’s commitment to BASF would allow the German competitor to
replicate Bayer’s position in digital agriculture in the European Economic Area,
according to the Commission, and would also ensure “that the race to become a
leading supplier in Europe in [digital agriculture] remains open”.44

42 Porter 1979.
43 DG COMP (2017) Mergers: Commission opens in-depth investigation into proposed acquisition of Monsanto by Bayer. Press Release IP/17/2762. www.europa.eu/rapid/press-release_IP-17-2762_en.htm (accessed July 2018); Bayer/Monsanto (Case M.8084), Initiation of proceedings, OJ 2017/C 286/01.
44 DG COMP (2018) Mergers: Commission clears Bayer’s acquisition of Monsanto, subject to conditions. Press Release IP/18/2282. http://europa.eu/rapid/press-release_IP-18-2282_en.htm (accessed July 2018); European Commission (2018a, b, c) Bayer/Monsanto (Case M.8084) [no non-confidential version published yet].

However, as in other agricultural input markets, a process of consolidation is occurring in the market for digital agriculture. Obviously, acquisitions of competitors
may lower competition in the market. In addition to the three larger mergers and
acquisitions, agrochemical firms have acquired many smaller firms. In 2013, for
example, Monsanto bought the Climate Corporation for almost one billion U.S.
dollars.45 Among other things, this company developed a platform that heavily
relies on weather data analysis to provide yield feedback and offer insurance
products.46 Until recently, the Climate Corporation had its main focus on the U.S.,
but this will change soon. In 2016 Monsanto acquired Estonia-based VitalFields,
integrated the firm into the Climate Corporation team, and launched Monsanto’s
Climate FieldView digital agriculture platform into regions of the EU for the 2018
growing season.47 Since the global market size for these services is expected to
reach $4.55 billion by 2020,48 it is not surprising that Monsanto also launched a
venture capital arm to fund tech start-ups a few years ago.49 Other large agro-
chemical firms have engaged in similar mergers and acquisitions activity.50
While critics often point to consolidation processes as a warning sign for the state
of competition, it seems that competition in digital agriculture in the EU is, at least
according to the Commission, still existent. So far, no (dominant) market leader has
emerged. The threat of established rivals is, however, declining. The overall con-
solidation in the agricultural input industry also affects the digital agriculture market,
most notably due to acquisitions of large and small competitors by a few large firms,
the rise of conglomerate market power due to strong market presence in

45 Vance A (2013) Monsanto’s Billion-Dollar Bet Brings Big Data to the Farm. www.bloomberg.com/news/articles/2013-10-02/monsanto-buys-climate-corporation-for-930-million-bringing-big-data-to-the-farm (accessed July 2018); press release Monsanto (2013) Monsanto to Acquire The Climate Corporation, Combination to Provide Farmers with Broad Suite of Tools Offering Greater On-Farm Insights. www.monsanto.com/news-releases/monsanto-to-acquire-the-climate-corporation-combination-to-provide-farmers-with-broad-suite-of-tools-offering-greater-on-farm-insights/ (accessed July 2018).
46 Press release Monsanto (2013) Monsanto Completes Acquisition of The Climate Corporation. www.monsanto.com/news-releases/monsanto-completes-acquisition-of-the-climate-corporation/ (accessed July 2018).
47 Press release Monsanto (2016) The Climate Corporation Acquires VitalFields to Expand Digital Agriculture Innovation for European Farmers. www.monsanto.com/news-releases/the-climate-corporation-acquires-vitalfields-to-expand-digital-agriculture-innovation-for-european-farmers/ (accessed July 2018).
48 Accenture 2017.
49 Visit Monsanto’s website for more information: www.monsanto.com/company/monsanto-growth-ventures/ (accessed July 2018).
50 Lianos and Katalevsky 2017, p. 10.

‘neighbouring markets’ and the formation of alliances, again between these same few
large firms.51 These factors will reduce competitive constraints exercised by rivals.
(2) The threat of new entrants
Similar to online network platforms, digital agriculture platforms could benefit from
data-driven direct and indirect network effects, thus increasing barriers to entry.
Other major sources of barriers to entry could also lower the threat of new entrants:
economies of scale and scope, making it difficult for newcomers to produce as cost-efficiently; product differentiation, since some suppliers of agricultural inputs could benefit from their well-established brands; capital requirements, which could refer to the necessary financial resources to develop digital agriculture products and
services as well as the necessary volume, variety, velocity and veracity of data to
launch successful products and services; and cost disadvantages independent of
size, such as the first-mover advantage (again, also due to data collection) and the
gained learning curve. The fact that Bayer’s commitment to license a copy of its
worldwide current offering and pipeline on digital agriculture to BASF was nec-
essary to obtain the Commission’s approval could suggest that barriers to entry are
already relatively high. After all, even though BASF is one of the major agrochemical firms in the EU, its position in the market of digital agriculture was considered endangered, or at least highly fragile, without the committed license.
Moreover, digital agriculture may facilitate consolidation in the already con-
centrated agricultural input markets. Strong centralised platforms from conglom-
erate market players may invite other suppliers to offer their services and products
on the platform as well, and some have in fact already expressed their intention to
do so.52 Although potentially beneficial for those competing suppliers—since they
will get the opportunity to reach many farmers without having to build their own
platform(s) or establish relationships otherwise—this could also harm them, particularly non-conglomerate and small(er) firms, in the long term. It leads to an
increased dependency of these businesses as well as newcomers on digital plat-
forms as quasi ‘gatekeepers’ to markets and consumers.53

51 Gillam C (2013) DuPont, with Deere & Co, to roll out precision farming program. https://www.reuters.com/article/us-usa-farming-data/dupont-with-deere-co-to-roll-out-precision-farming-program-idUSBRE9A709920131108 (accessed July 2018); Lianos and Katalevsky 2017, p. 18: “BASF and Monsanto have collaborated since 2007 on R&D partnerships worth $2.5 billion in breeding, biotech, pesticides, ag microbials, ag biologicals, and precision agriculture.”
52 For example, Monsanto has publicly expressed its desire to build a “centralized and open data platform”. See Burwood-Taylor L (2016) What do Monsanto’s Plans to Open Up its Digital Platform Mean for the Agriculture Industry? www.agfundernews.com/what-do-monsanto-plans-to-open-up-its-digital-platform-mean-for-the-agriculture-industry.html (accessed July 2018).
53 European Commission 2018b.

Table 7.1 Concentration ratios in the global seed industry (1985–2012) (Source: European Commission)

       1985 (%)   1996 (%)   2012 (%)
CR1       4.1        5        21.8
CR2       5.7        8        37.3
CR3       6.8       10.2      44.4
CR4       7.9       11.7      48.2
CR5       8.9       13        48.2
CR6       9.9       14.1      54.6
CR7      10.9       15.1      57.5
CR8      11.7       16        59.7
CR9      12.5       16.8      60.7
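
For readers unfamiliar with the notation, CRn is the standard n-firm concentration ratio: the combined market share of the n largest firms in the market. The 2012 value CR4 = 48.2%, for instance, means that the four largest seed firms jointly held 48.2% of the global seed market. Formally:

```latex
% n-firm concentration ratio: the sum of the market shares s_i
% of the n largest firms in the market
\mathrm{CR}_n = \sum_{i=1}^{n} s_i
```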

(3) The threat of substitute products or services


As in all agricultural input markets, it is innovation rather than prices that shapes the
competitive dynamics of the market for digital agriculture. This means that firms are
mainly competing by improving their product and to a lesser extent by lowering
their prices and costs. In innovation markets the threat of substitutes is real; in
agricultural input markets it is even leading to a gradual convergence of two pre-
viously fairly independent value chains—i.e. seeds and crop protection products.54
By integrating their supply of these products, conglomerate suppliers active in
digital agriculture could equally blur the distinction between the value chain of
digital agriculture products and services and these other value chains.
(4) The bargaining power of suppliers
Identifying the suppliers of digital technology providers is easier said than done, as
it depends on the notion of suppliers. If one considers the digital technology service
as an inherent (non-separate) product offered by an agricultural input supplier in
addition to many other inputs, then suppliers are active in a wide range of products
and services, ranging from energy to chemical inputs, and their economic power is
widely dispersed. A more restrictive approach to the concept of digital technology
service, as taken earlier in this chapter, would regard the service as an intermediary
service for inputs supply. Under that notion, suppliers of digital agriculture tech-
nology are the suppliers of agricultural inputs.
Even given the traditionally high concentration at most levels of the food supply chain, the level of concentration in the agricultural input markets is remarkable. During the past decades global concentration in agricultural input markets rose significantly, particularly in the markets for seeds, agrochemicals, fertilisers, animal pharmaceuticals and farm machinery.55 With regard to the first two products, this

54 Olson et al. 2010, p. 10.
55 ETC Group 2015; Lianos and Katalevsky 2017, p. 19; IPES-Food 2017; Wesseler et al. 2015. Other agricultural input markets with high concentration levels are fertilisers, animal pharmaceuticals, and farming equipment.

led to six large firms—BASF, Bayer, Dow Chemical, DuPont, Monsanto and
Syngenta (known as the ‘Big Six’)—achieving a combined market share in seeds of
almost 55% globally and almost 50% in the EU and a combined market share in
agrochemicals of more than 75% globally and more than 80% in the EU in 2014.56 The evolution of the consolidation process in the global seed industry during the past three decades is clearly shown in Table 7.1, designed by the European Commission.57
Although more recent market shares from credible sources are not publicly
available, consolidation did not end in 2012 or 2014, as the three major mergers
involving Big Six firms since mid-2016 show.58
Whether a provider of digital agriculture encounters strong bargaining power
from suppliers will vary from provider to provider. Many digital agriculture ser-
vices are already offered by agricultural input suppliers. For those vertically inte-
grated businesses, bargaining power of suppliers is irrelevant, provided that the
supplier offers a full range of inputs. Competing providers of digital agriculture,
which are not owned by agricultural input suppliers, may face a different situation.
Providers that want to expand their services and include intermediation services for
the acquisition of agricultural inputs will have to establish contractual relations with
agricultural input suppliers. Those firms may be confronted with these suppliers’ reluctance to establish such relations. After all, why would an agricultural input supplier
supply inputs to a downstream competitor?
(5) The bargaining power of customers
Bargaining power of farmers vis-à-vis their contract partners is generally considered
to be weak. As in their other upstream and downstream markets, bargaining
power among farmers is widely dispersed, as many farmers are faced with far fewer
providers of digital agriculture. Moreover, the nature of digital agriculture is
expected to further undermine their bargaining power. A study, commissioned by
the European Parliament, concluded that the most profound impact of digital
agriculture lies in its potential effects upon the autonomy of the farmer, in addition
to its impact on social values and the sustainability of local farming structures.
According to this study, digital agriculture could impact the autonomy of the
farmer, as its technical complexity, scale and infrastructural requirements may lead
to informational asymmetries and dependence on suppliers.59 Farmers’ autonomy is
also jeopardised because it is no longer the farmer that takes the sole responsibility

56 Wesseler et al. 2015, pp. 75–76.
57 Ibid., 20 (see figure 6).
58 European Commission (2017) Dow/DuPont (Case M.7932) [no final non-confidential version published yet]. www.ec.europa.eu/competition/mergers/cases/decisions/m7932_13668_3.pdf (accessed July 2018); European Commission (2017) ChemChina/Syngenta (Case M.7962) [no final non-confidential version published yet]. www.ec.europa.eu/competition/mergers/cases/decisions/m7962_4097_3.pdf (accessed July 2018); European Commission (2018a, b, c) Bayer/Monsanto (Case M.8084) [no non-confidential version published yet].
59 Kritikos 2017, p. 39.

for the decision-making process of agricultural production. This is hardly surprising, as feedback for decision-making constitutes the core of the digital services offered. But one must realise that the dividing line between decision support and
decision-making is thin. By granting access to data about the core of his business
activities to his supplier, should the farmer not fear welcoming a Trojan horse onto
his farm?60
After all, reduced autonomy entails a simultaneous increase in dependence on
the supplier. Notwithstanding the aforementioned benefits of data-generated posi-
tive feedback loops, data-based technologies may ultimately create a customer
lock-in.61 Switching to another supplier can become difficult because farmers can
be(come) technologically dependent on their suppliers. Furthermore, practices like
data ownership clauses, exclusivity clauses and lack of interoperability may prevent
the transfer of data from one firm to another. To what extent, under what conditions
and during what period is a farmer still effectively free to switch to another supplier or platform?
(6) Other adverse effects
Leaving aside a more extensive discussion of other concerns that are not directly
related to market power, attention should be drawn to concerns over ownership
(who owns the data?), privacy (who has access to the data?) and security (is the data
safe?).62 As with ‘normal’ consumers, farmers are vulnerable to biases and errors in
(data-driven) technologies. Exposed to such gaps, farmers’ businesses risk that data,
including personal and competitively sensitive information, is intentionally (as in manipulation) or inadvertently misused by others at their expense. Another major challenge will be to prevent the emergence of a two-speed agricultural sector in the EU. If only financially stronger farmers have access to
these technologies, their poorer competitors may not be able to keep up with the rat
race. As the financial strength of farmers and the percentage of the population
working in agriculture vary strongly from region to region, the described tech-
nologies can accelerate economic inequality between regions of the EU.63

60 Khan 2013: ““If you inadvertently teach Monsanto what it is that makes you a better farmer than your neighbor, it can sell that information to your neighbor,” said John McGuire, an agriculture technology consultant who runs Simplified Technology Services and developed geospatial tools for Monsanto in the late-1990s. And if the corporation gathers enough information, “it opens the door for Monsanto to say, ‘We know how to farm in your area better than you do,’” he said.”
61 Kritikos 2017, p. 39.
62 This is especially relevant as agricultural data can constitute a hybrid of business, personal and intellectual property (trade secrets). Different regulatory frameworks apply to the different categories of data. Particularly, in the area of personal data recent developments can have far-reaching consequences—see, for example, the General Data Protection Regulation (GDPR). See also Sykuta 2016, p. 57.
63 Kritikos 2017, p. 40. See also DG Agriculture and Rural Development (2013) How many people work in agriculture in the European Union? www.ec.europa.eu/agriculture/sites/agriculture/files/rural-area-economics/briefs/pdf/08_en.pdf (accessed July 2018).

7.3.2 Superior Bargaining Position Facilitates Exclusionary and Exploitative Conduct

Given the imbalances of economic power identified above between farmers and some digital agriculture providers, suppliers could utilise their strong market positions vis-à-vis farmers to impose restrictive contract terms or engage in other restrictive trading practices.
aimed at competitors (exclusionary conduct) and customers (exploitative conduct),
and (2) other UTPs.
Without claiming to present an exhaustive list of these practices, one could think of:
(1) Abusive practices:64
• Exclusive dealing (e.g. exclusive purchasing and conditional rebates);
• Tying and bundling;
• Predation;
• Refusal to supply and margin squeeze.
(2) UTPs:65
• Ambiguous contract terms;
• Lack of written contracts;
• Retroactive contract changes;
• Unfair transfer of commercial risk (e.g. liability disclaimers);
• Unfair use of (confidential) information;
• Unfair termination of a commercial relationship;
• Territorial supply constraints.
Moreover, digital agriculture may facilitate collusive practices between com-
petitors at various levels of the upper segment of the food supply chain.66 The
collection and analytics of Big Data regarding agricultural production increase the
transparency of the market and reduce uncertainty about its development. The
exchange of this competitively sensitive information with competitors allows
market parties to coordinate their future conduct, even in the absence of an explicit
cartel agreement, and thus reduces competition in the market. Firstly,
similar to the potentially exclusionary effects of patent pools and other collective
intellectual property licensing arrangements, collective data-exchange arrangements
allow bigger suppliers of agricultural inputs to limit access to data for
non-conglomerate and smaller suppliers. Secondly, manipulation of commodity
markets, where real-time data is highly valuable to traders, is a real risk.

64 Examples are taken from the Communication from the Commission—Guidance on the Commission’s enforcement priorities in applying Article 82 of the EC Treaty to abusive exclusionary conduct by dominant undertakings.
65 European Commission 2013.
66 On competition and algorithms in general, see OECD 2017.
Commodity traders may exchange data acquired by digital agriculture to their
advantage.67 Finally, even farmers may coordinate their market behaviour by
exchanging data directly or indirectly through a common supplier (so-called ‘hub
and spoke’ arrangements) and engaging in practices like (product) market sharing
and production limitation.

7.4 Regulatory Responses: EU Competition Law Fit-for-Purpose?

The previous section identified a few major risks that (data-driven) digital agriculture poses from a competition policy perspective. Due to network effects and conglomerate market power, the market for digital agriculture could face consolidation. Stronger positions of economic strength may facilitate abusive and other unfair trading practices vis-à-vis competing technology providers and customer-farmers, while also reducing the autonomy of farmers in general. This raises the question of what role regulation could and should play in preventing and resolving these issues.
With regard to the role of economic regulation in the agricultural input industries, the recent merger wave revealed a highly polarised debate. Two extreme positions can be identified in the ‘to regulate or not to regulate’ debate. On the one hand, there is an increasingly loud call, especially from non-governmental organisations, to bring the expansion of the major agricultural input suppliers—the sooner, the better—to a halt. They regard the emergence of digital agriculture as an important catalyst for consolidation.68 On the other hand, in addition to the ‘usual suspects’ of merger benefits (economies of scale, synergies, etc.), one could point to the infancy of the market and argue for refraining from any regulatory response—apart from the deterrent effect of (ex post) EU competition rules—and for allowing the market to develop for a while, before assessing the exact problems and challenges in view of adequate regulatory responses.
Neither of these positions deserves full support because the truth, as so often, lies somewhere in the middle. The first position relies perhaps too heavily on doom scenarios, in which hypothetical abuses are invoked to justify far-reaching regulatory intervention. The second position ignores other legal domains that have been or are being developed, sometimes very recently, to address some perceived gaps in EU competition law enforcement, often specifically for the food supply chain. This section evaluates what role EU competition law could play in resolving the identified risks that could possibly be problematic for farmers, explores other areas of law and policy that have been developed to govern imbalances of economic power, and shares some first critical observations on the (potential) role of those domains.

67 Noyes K (2014) Cropping up on every farm: Big data technology. www.fortune.com/2014/05/30/cropping-up-on-every-farm-big-data-technology/ (accessed July 2018); Kritikos 2017, pp. 15–16. John Deere has stated on its website (see www.deere.com/privacy_and_data/policies_statements/en_US/data_principles/frequently_asked_questions.page?) that it “will NOT use internally or share anonymized data to external parties who John Deere believes intend to use it to influence markets, provide an advantage to commodity traders or support supply hedging by food companies.”
68 NGOs like Friends of the Earth even dubbed the acquisition of Monsanto by Bayer the ‘Merger from Hell’: www.foeeurope.org/sites/default/files/agriculture/2017/foee-ceo-baysanto-190517.pdf (accessed July 2018); www.greeneuropeanjournal.eu/baysanto-the-campaign-against-the-merger-from-hell (accessed July 2018).

7.4.1 Assessment Under EU Competition Law

To address the market failure of market power, a selection of legal rules, collec-
tively known as EU competition law, has been introduced. This includes the (ex
ante) merger control regime of the EU Merger Regulation69 and the (ex post)
competition rules of Articles 101 and 102 TFEU.70 While Article 101 TFEU
prohibits anticompetitive agreements between undertakings (i.e. the ‘cartel prohi-
bition’), Article 102 TFEU prohibits abuses of dominant market positions (i.e. the
‘abuse of dominance prohibition’). In response to the question whether a compe-
tition authority or alleged victim (e.g. a farmer) could argue that certain practices by
an agricultural technology supplier violate Article 101 or Article 102 TFEU, the
answer would depend on the definition of the relevant market and the subsequent
assessment of market power with regard to the undertaking(s) involved on the
defined market. If a market is defined in a very broad manner, it could be that none
of the digital agriculture providers has a significant market position. The opposite
may also be true in very narrowly defined markets. Interestingly, competition law assessments often show that the definition of the relevant market in a given case is open to debate; after all, it is just a policy tool. Either way, this chapter is intended to explore the boundaries of regulatory intervention beyond EU competition law, and for the sake of that argument it is assumed that none of the digital agriculture providers currently occupies such a strong market position. This assumption is supported by the Commission’s recent Bayer/Monsanto merger assessment, which concluded that no (dominant) market leader in the area of digital farming had emerged.
Article 101 TFEU may apply to (horizontal) collective research and development arrangements and other joint ventures between digital agriculture providers, which may facilitate the exchange of agricultural data, as well as to (vertical) agreements between the providers of digital services and farmers, which may also restrict competition.71 With regard to the former, it must be noted that the transfer of know-how is generally considered to be consumer welfare enhancing. With regard to the latter, farmers could argue that restrictive practices may harm their welfare. However, unlike horizontal agreements, vertical agreements are considered fundamentally pro-competitive under EU competition law because they may provide substantial scope for efficiencies. Each firm in a vertical relationship wants the other firm to reduce its prices. Generally, this is considered to be positive: lower prices mean a better deal for the final consumers. A vertical restraint imposed by one firm on the other may well be pro-competitive, as it is likely to be designed to elicit a lower price from the other firm in the vertical relationship.72 Vertical restraints can, however, be anticompetitive by reducing competition at the horizontal level, for example if they foreclose the market to competitors.73 With the exception of so-called ‘hardcore restrictions’, vertical agreements may nonetheless often benefit from the Vertical Agreements Block Exemption Regulation (VABER). If none of the agricultural input suppliers holds a market share of 30 per cent or higher of the market for digital agriculture in the EU, most agreements between suppliers and farmers would satisfy the criteria of the VABER, thus avoiding the applicability of Article 101 TFEU.74

69 Council Regulation (EC) 139/2004 on the control of concentrations between undertakings (EU Merger Regulation) [2004] OJ L24/1.
70 The third main pillar of EU competition policy, state aid control, is excluded from the scope of this chapter.
In order to violate Article 102 TFEU, a firm must have a dominant position on a
certain market and exploit this position to eliminate competition. The definition of a
‘dominant position’ was established by the ECJ in United Brands, describing it as
“a position of economic strength enjoyed by an undertaking which enables it to
prevent effective competition being maintained on the relevant market by giving it
the power to behave to an appreciable extent independently of its competitors,
customers and ultimately of its consumers.”75 Market shares provide a useful first
indication, but other factors are relevant as well in order to assess dominance.
Although digital agriculture may facilitate the emergence of a dominant under-
taking due to network effects and conglomerate market power, no dominant sup-
plier of digital agriculture has emerged yet. Any practices, no matter how harmful to
the concerned farmer, will thus not violate Article 102 TFEU.
Even if a supplier were to become dominant, one should bear in mind that Article 102 TFEU does not prohibit a firm from holding a dominant position, but only prohibits a firm from abusing that position. Abusive behaviour can take many different forms and shapes, but Article 102 TFEU is primarily concerned with exclusionary abuses, as opposed to exploitative abuses.76 Practices that harm the autonomy or profitability of farmers are thus less likely to violate Article 102 TFEU. Nonetheless, the case law of the Court of Justice of the European Union and the decisional practice of the European Commission have held many of the exploitative and exclusionary practices identified in the previous section to constitute abuse in the sense of Article 102 TFEU, provided that those practices were pursued by a dominant undertaking.

71 Perhaps needless to say, the fact that farmers do not support or do not intend for anticompetitive effects to take place is of no relevance for the applicability of the cartel prohibition.
72 Commission’s Guidelines on Vertical Restraints (2010/C 130/01), paras 6 and 100; Bishop and Walker 2010, p. 191.
73 Bishop and Walker 2010, p. 195.
74 Commission Regulation (EU) No 330/2010 on the application of Article 101(3) of the Treaty on the Functioning of the European Union to categories of vertical agreements and concerted practices (Vertical Agreements Block Exemption Regulation), L 102/1, Article 3.
75 Case 27/76 United Brands v Commission (1978) ECR 207, para 65.

7.4.2 Beyond Traditional Competition Law: Other Rules to the Rescue?

During the past decades, legislators and authorities in the EU have often looked
beyond EU competition law to deal with abusive practices arising from imbalances
of economic power, sometimes specifically aimed at the food supply chain. Often
due to the absence of dominance, certain practices between businesses in a vertical
relationship did not necessarily infringe EU competition law but were nonetheless
considered to be (potentially) harmful. Broadly speaking, two types of regulation
can be distinguished that have been proposed, implemented and/or enforced, and
are relevant here: the first being general laws on UTPs; and the second being
sector-specific legislation and self-regulatory mechanisms for the food supply
chain. With regard to both categories, however, there has generally been reluctance
among legislators to intervene in economic transactions between businesses, due
to their respect for the freedom of contract.77
Even when imposed by a firm that is not dominant in the EU competition law
sense, certain practices may still have a profound adverse impact on the market.
Smaller operators often perceive UTPs as jeopardising their profitability and their
ability to compete fairly, and as affecting their capacity to invest, because UTPs
decrease the share of the generated added value that these operators would
otherwise be able to appropriate.78 Moreover, UTPs, particularly unilateral and retroactive changes to
contracts, create unpredictability for the affected party, which increases transaction
costs. At the aggregate level, this will increase market uncertainty and influence the
competitiveness of the sector as a whole. Therefore, UTPs can result in the
misallocation of resources and (disproportional) exclusionary conduct, at the
expense of smaller operators. In addition, smaller operators are reluctant to enforce general (contract) law provisions due to the so-called ‘fear factor’: a weaker party (typically an SME) in a commercial relationship fears that initiating litigation or filing a complaint may lead the stronger party to terminate the commercial relationship or impose other reprisals.79

76 This is illustrated by the Communication from the Commission—Guidance on the Commission’s enforcement priorities in applying Article 82 of the EC Treaty to abusive exclusionary conduct by dominant undertakings (emphasis added).
77 Bakhoum 2018.
78 European Commission Staff Working Document Impact Assessment 2018, p. 16.
Apart from the Late Payments Directive80 and the Directive on misleading and
comparative advertising,81 which applies to B2C and B2B relations, there are no
common EU rules on UTPs between businesses. This is remarkable in itself, as EU
law has a prominent, if not leading, role in many adjacent areas of law and policy,
such as consumer law, competition law, and agricultural policy. To address UTPs,
most EU Member States have implemented specific laws on UTPs;82 others only
have legislation with a limited scope83 or no UTP legislation.84 Due to their generally weak bargaining power in comparison to the significant bargaining power wielded by large operators at other levels of the chain, small operators in the food supply chain, especially farmers, have attracted specific attention from authorities.
Twelve Member States have adopted legislative instruments specifically applicable
to the food supply chain, whereas in eight Member States the UTP legislation is
applicable to all sectors, though sometimes including specific provisions on prac-
tices in food and groceries trade.85
Within various legislative initiatives, two main approaches can be distinguished.
In some Member States, UTPs are addressed by stretching the scope of competition
law beyond the boundaries of Article 102 TFEU, and applying the concept of abuse
to situations of economic dependence or superior bargaining power.86 In most other
Member States that have legislative instruments on UTPs, legislation has been
adopted outside of the scope of national competition law. This legislation tends to
focus on contractual relations between parties. Similar to EU consumer law, these
laws focus on general (contractual) principles like good faith, reciprocity, trans-
parency, and proportionality, with regard to the rights and obligations of contract
parties.87 Small parties like farmers can rely on these provisions to protect themselves against UTPs, if those practices do not violate traditional competition law provisions. In that regard, laws on UTPs can play an important role in protecting farmers’ autonomy.

79 Falkowski et al. 2017, p. 23.
80 Directive 2011/7/EU on combating late payment in commercial transactions.
81 Directive 2006/114/EC concerning misleading and comparative advertising.
82 Austria, Bulgaria, Croatia, Cyprus, Czech Republic, France, Germany, Greece, Hungary, Ireland, Italy, Latvia, Lithuania, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, United Kingdom. See European Commission Staff Working Document 2018, p. 14.
83 Belgium, Denmark, Finland, Sweden.
84 Estonia, Luxembourg, Malta, Netherlands.
85 See European Commission Staff Working Document 2018, p. 151.
86 Recital 8 of Regulation 1/2003 states: “Member States should not under this Regulation be precluded from adopting and applying on their territory stricter national competition laws which prohibit or impose sanctions on unilateral conduct engaged in by undertakings. These stricter national laws may include provisions which prohibit or impose sanctions on abusive behaviour toward economically dependent undertakings.” For example, Germany’s §20(2) of the Gesetz gegen Wettbewerbsbeschränkungen and France’s Art L420(2), 2nd para of the Code de Commerce.
87 European Commission Staff Working Document 2018, pp. 150–155.
Besides legislation, many self-regulatory mechanisms have been set up in the European food supply chain. These, however, have not always worked as effectively as hoped. Arguably the most prominent example is the Supply Chain Initiative (SCI), established by inter alia the High Level Forum for a Better Functioning Food Supply Chain of the European Commission in 2013. Launched in response to UTPs in the lower segment of the food supply chain, this initiative encompasses a set of so-called Principles of Good Practice. In its five-year existence, the SCI has received very few complaints,88 despite many stakeholders claiming the widespread
occurrence of UTPs. According to the European Commission, this is due to the
aforementioned fear factor, lack of involvement of some operators in the food
supply chain, and the inability of the SCI to impose sanctions or to publish
decisions.89
In that respect, it is worth mentioning that in the United States a coalition of
industry stakeholders, including major agricultural data technology providers and
the American Farm Bureau Federation, agreed on a number of core principles in
2014.90 These so-called ‘Privacy and Security Principles for Farm Data’ outline
expectations with regard to inter alia ownership, collection, transparency, porta-
bility, disclosure, use, sale, and retention of data as well as contracting practices.
Moreover, one of these principles aims to tackle anticompetitive behaviour,
explicitly stating that agricultural technology providers “should not use the data for
unlawful or anti-competitive activities, such as a prohibition on the use of farm data
by the [agricultural technology provider] to speculate in commodity markets.”91
Although almost forty agricultural technology providers and farm industry organisations committed to upholding these principles, fewer than half had actually done so as of February 2018.92
It is therefore interesting that in April 2018 a coalition of associations from the agri-food chain in the European Union launched the ‘EU Code of Conduct on agricultural data sharing by contractual agreement’ (Code of Conduct). This Code aims to promote data sharing under fair and transparent conditions between different stakeholders of the agri-food sector, specifically with regard to privacy, data protection, intellectual property, data attribution, relationships of trust or power, storage, conservation, usability, and security. Although EU Commissioner for Agriculture and Rural Development Phil Hogan displayed optimism at the launch event,93 scepticism seems justified. Firstly, the principles of the above-mentioned initiative in the US, with many of the same signatories, have at the moment not been implemented by many. Secondly, self-regulatory mechanisms in the food supply chain in general, like the SCI, have not been successful. Thirdly, compliance with the Code is voluntary, leaving a jeopardised party with few tools in case of a breach.

88 Supply Chain Initiative (2018) Five years of Supply Chain Initiative and future prospects. www.supplychaininitiative.eu/news/press-release-five-years-supply-chain-initiative-and-future-prospects (accessed July 2018).
89 European Commission 2018a, pp. 2–3.
90 See www.agdatatransparent.com/the-privacy-and-security-principles-for-farm-data/ (accessed July 2018).
91 See www.agdatatransparent.com/principles/ (accessed July 2018).
92 Hettinger J (2018) Few Big Ag companies have yet to follow through on data transparency pledge. www.newfoodeconomy.org/big-ag-follow-through-farm-bureau-data-transparency-pledge/ (accessed July 2018): “Among those who have yet to do so are some of the industry’s biggest players – Monsanto, Dow DuPont, John Deere and CNH Industrial, the equipment manufacturing company that makes both the Case IH and New Holland brands of combines, tractors and other equipment.” Also Ferris 2017 (p. 310) questioned the effectiveness of the principles, noting that “these are simply voluntary standards which do not hold the force of law.”
In response to the wide divergence of Member States’ regulatory approaches to
UTPs and the insufficient effectiveness of self-regulatory mechanisms, the
Commission has recently proposed a directive on “unfair trading practices in
business-to-business relationships in the food supply chain”.94 Despite this ambi-
tious title, the proposal only deals with a small selection of UTPs, which occur
specifically in relation to the sale of food products by a supplier that is a small and
medium-sized enterprise (SME) to a buyer that is not an SME.95 In other words,
SMEs in the food supply chain that face an imbalance in market power when
buying products or services, e.g. a farmer using digital technology or purchasing
inputs from a large supplier, are not covered by the proposal.
The Commission’s proposal blends the CAP and the laws on UTPs. Although
the directive aims to protect all SMEs active in the food supply chain when selling,
its legal basis reveals it is specifically aimed at farmers. With reference to Article 39
TFEU, the Commission states that the common rules on UTPs are necessary to
ensure a fair standard of living for the agricultural community since UTPs threaten
the profitability of farmers. In line with a minimum harmonisation approach, the
directive proposes to prohibit four types of practices, without any exception, and
four other types of practices, unless they were agreed in clear and unambiguous
terms at the conclusion of the supply agreement. If adopted, each Member State will have to designate a public authority that enforces these rules and has the power to impose “effective, proportionate and dissuasive” fines on firms that have acted in an abusive way. All in all, these provisions should help preserve the autonomy of small and medium-sized farmers (and other SMEs in the food supply chain) on the basis of the CAP and protect them against their larger purchasers.

93 European Commission 2018c.
94 European Commission 2018a. The proposed directive is the result of a long process, in which the Commission involved stakeholders and academia. Already in 2010, it set up a High Level Forum for a Better Functioning Food Supply Chain, which helped to launch a voluntary code of conduct, the SCI’s Principles of Good Practice, in 2013. Other subsequent initiatives included the establishment of the Agricultural Markets Task Force in 2016, and the launch of an inception impact assessment and a public consultation on the improvement of the food supply chain in 2017. The Commission’s co-legislators, the European Parliament and the Council, have also not been silent on the matter of UTPs. Both institutions called on the Commission—through a resolution (2015/2065(INI)) and conclusions (press release 769/16), respectively—to submit a proposal.
95 European Commission 2018a, Article 1(2).
While digital agriculture can help to achieve the CAP’s objectives, digital
technology can also jeopardise the profitability of farmers in the long
term, i.e. prevent them from obtaining a fair standard of living. As explained, digital
technology could exacerbate existing imbalances of market power and create
relationships of economic dependence between inputs suppliers and farmers. These
positions of economic strength could facilitate abusive and unfair trading practices
by the former vis-à-vis the latter. Since those practices are highly unlikely to be
tackled by ex post EU competition rules until a dominant provider of digital
agriculture has emerged, should not the Commission’s proposal also address the
enforcement gap, which may soon arise, or at least assess the desirability of doing
so?
Another recent proposal from the Commission that aims to strengthen the autonomy of businesses vis-à-vis stronger contracting partners is the proposed regulation on promoting fairness and transparency for business users of online intermediation services.
tain platforms, the proposal aims to protect the former against (potentially) harmful
trading practices from the latter.96 As explained, non-conglomerate and smaller
suppliers of inputs could face similar concerns due to tendencies of platforms to
become so-called ‘bottlenecks’. The Commission’s proposal could prevent the
occurrence of abusive practices through digital agriculture platforms held by con-
glomerate inputs suppliers vis-à-vis the suppliers of agricultural inputs, thus pro-
moting the open competition in digital agriculture the Commission seeks.
However, the current proposal only applies to platforms that offer goods or
services to, at least, consumers.97 Unlike many online platforms such as Amazon
and Ebay that offer goods or services to consumers and businesses (thus consti-
tuting a business-to-business/business-to-consumer, or hybrid, platform), agricul-
tural technology platforms in principle only offer goods or services to businesses.
Therefore, even if digital agriculture platforms were to experience a similar
development as e-commerce platforms in terms of economic strength vis-à-vis their
upstream contracting partners, they would escape the applicability of the proposed
stricter regulation. Perhaps, at this point, it is too early to compare digital agriculture platforms to e-commerce platforms, but given the speed with which digital agriculture has taken off and the competitive implications of recent developments in the agricultural input industry, the question whether the proposed regulation should also apply to platforms that only have business-to-business relations downstream may be worth considering.

96 European Commission 2018b.
97 European Commission 2018b, Article 1(2).
7.5 How to Proceed: Policy Recommendations

With the generous financial support for digital agriculture from the EU and the
recent introduction of the first major digital agriculture platforms into regions of the
EU, European agriculture is on the eve of a potential revolution. On the other side of the Atlantic, innovations adopted by U.S. farmers have proven to be (very) promising in the search for production-enhancing and resource-efficient methods in agriculture. Seen in a wider context, it is hardly surprising that digital agriculture is welcomed as a solution for global challenges related to food security and sustainability.
Nevertheless, digital agriculture is a double-edged sword for farmers. Besides its
many benefits, it has generated many concerns inter alia over data ownership,
privacy and security. Moreover, this chapter has explained how it can exacerbate
the existing imbalances of economic power in the upper segment of the food supply
chain and facilitate abusive or unfair trading practices vis-à-vis customers and
competitors. While the European Commission considered the Bayer/Monsanto
merger review to be too early for far-reaching intervention in competition in digital
agriculture—a commitment from Bayer to licence a copy of its worldwide digital
agriculture products to BASF was considered sufficient to obtain approval—it is
clear to all those involved that digital agriculture is able to generate important shifts
in the organisation and management of agricultural production. Therefore, the
implications of digital agriculture warrant scrutiny and supervision from a com-
petition policy perspective.
Having regard to the rapid development of digital agriculture, the technologies’ economic properties and their effects on competition and farmers’ autonomy (particularly those of digital agriculture platforms), and the current state of competition in the agricultural input markets, the following suggestions are worth considering in view of effective competition policy:
(1) Market studies and sector inquiries may help to monitor the development of market dynamics and competitive forces in the market for agricultural digital technologies and the agricultural input industry at large. The ex ante merger control assessment of Bayer’s acquisition of Monsanto by the European Commission raised red flags regarding the current state of competition in digital agriculture, but above all illustrated the rapidity with which digital agriculture technology can impact competitive dynamics. Moreover, scrutiny from authorities is justified
because of the specific nature of digital agriculture platforms and the presence
of conglomerate market power. Strengthened positions of economic power may
facilitate abusive practices, such as violations of ex post competition laws and
other provisions that govern imbalances of economic power. Finally, deterio-
rations of farmers’ autonomy due to digital technologies may jeopardise their
profitability, which may be at odds with the objectives pursued by the EU’s
CAP. A big question for the near future will be to what extent the recently
launched Code of Conduct will be successful in resolving some data-related
issues in the sector, given the limited success of self-regulatory mechanisms in
the European food supply chain and the worse-than-expected uptake of agri-
cultural data principles in the United States.
(2) The international and dynamic nature of innovations in digital agriculture and other agricultural input markets increases the need for cross-border cooperation between national and European authorities, including the competition authorities. Instead of approaching the different issues of digital agriculture as completely separate policy issues, such as merely a competition issue or a privacy
issue, effective legislation and enforcement could benefit from a (partly)
coordinated approach. In that way, the different regulatory authorities can build
on each other’s findings. To facilitate the exchange of knowledge and expertise,
a network could be established or the existing EIP-AGRI network could take
the lead. This creates a central point of contact, which simplifies the collection
of authorities’ findings and stakeholders’ perspectives, and allows all involved
to contribute in an effective and efficient manner.
(3) Current legislative proposals from the Commission that primarily address a
number of perceived gaps in the current EU competition law framework do not
apply to relationships between businesses active in digital agriculture, even
though these proposals aim to tackle unfair trading practices arising from weak
bargaining positions vis-à-vis relatively stronger contracting parties in the food
supply chain or platforms. Since these practices could likewise occur in the digital agriculture market in the near future, this raises the question whether the subject matter of the proposed legislation should be expanded to digital agriculture. Research and consultations could help provide insights on the necessity
of broadening the proposals’ applicability.
Among other solutions, these suggestions may allow European societies to reap the benefits of innovations in digital agriculture, while mitigating potential economic and social adverse effects related to market power.

References

Accenture (2017) Digital Agriculture: Improving Profitability. Available at www.accenture.com/_acnmedia/Accenture/Conversion-Assets/DotCom/Documents/Global/PDF/Digital_3/Accenture-Digital-Agriculture-Point-of-View.pdf. Accessed July 2018
American Farm Bureau Federation (2016) Farm Bureau Survey: Farmers Want to Control Their
Own Data. Available at www.fb.org/newsroom/farm-bureau-survey-farmers-want-to-control-
their-own-data. Accessed July 2018
Bakhoum M (2018) Abuse Without Dominance in Competition Law: Abuse of Economic
Dependence and its Interface with Abuse of Dominance. In: Di Porto F, Podszun R
(eds) Abusive Practices in Competition Law. Edward Elgar (forthcoming)
Bishop S, Walker M (2010) The Economics of EC Competition Law: Concepts, Applications and
Measurement, 3rd edn. Sweet & Maxwell, London
Carbonell I (2016) The ethics of big data in agriculture. Internet Policy Review 5:1–13
Coble KH, Mishra AK, Ferrell S, Griffin T (2018) Big Data in Agriculture: A Challenge for the
Future. Applied Economic Perspectives and Policy 40:79–96
Dowell T (2015) Big Data on the Farm (Part I): What Is It? Available at www.agrilife.org/
texasaglaw/2015/09/01/big-data-on-the-farm-part-i-what-is-it/. Accessed July 2018
EIP-AGRI Seminar (2016) Data revolution: emerging new data-driven business models in the
agri-food sector. Available at www.ec.europa.eu/eip/agriculture/sites/agri-eip/files/eip-agri_
seminar_data_revolution_final_report_2016_en.pdf. Accessed July 2018
ETC Group (2015) Breaking Bad: Big Ag Mega-Mergers in Play Dow + DuPont in the Pocket?
Next: Demonsanto? Available at www.etcgroup.org/sites/www.etcgroup.org/files/files/etc_
breakbad_23dec15.pdf. Accessed July 2018
European Commission (2013) Green Paper on Unfair Trading Practices in the
Business-to-Business Food and Non-Food Supply Chain in Europe, COM(2013) 37 final
European Commission (2017) European Union funds digital research and innovation for
agriculture to tackle societal challenges. Available at www.ec.europa.eu/info/news/european-
union-funds-digital-research-and-innovation-agriculture-tackle-societal-challenges_en.
Accessed July 2018
European Commission (2018a) Proposal for a directive on unfair trading practices in
business-to-business relationships in the food supply chain, COM(2018) 173 final
European Commission (2018b) Proposal for a regulation promoting fairness and transparency for
business users of online intermediation services, COM(2018) 238 final.
European Commission (2018c) Remarks by Commissioner Hogan at launch of an EU Code of
Conduct on Agricultural Data, Brussels. Available at https://ec.europa.eu/commission/
commissioners/2014-2019/hogan/announcements/remarks-commissioner-hogan-launch-eu-
code-conduct-agricultural-data-brussels_en. Accessed July 2018
European Commission Staff Working Document Impact Assessment (2018) Initiative to improve
the food supply chain (unfair trading practices). SWD(2018) 92 final. Available at https://eur-
lex.europa.eu/legal-content/EN/TXT/?uri=SWD:2018:092:FIN. Accessed July 2018
Falkowski J, Ménard C, Sexton R, Swinnen J, Vandevelde S, Di Marcantonio F, Ciaian P (2017)
Unfair trading practices in the food supply chain: A literature review on methodologies,
impacts and regulatory aspects (study commissioned by the European Commission’s Joint
Research Centre). Available at www.ec.europa.eu/jrc/en/publication/unfair-trading-practices-
food-supply-chain-literature-review-methodologies-impacts-and-regulatory. Accessed July
2018
Ferris JL (2017) Data Privacy and Protection in the Agriculture Industry: Is Federal Regulation
Necessary? Minnesota Journal of Law Science & Technology 18:309–342
Gebbers R, Adamchuk V (2010) Precision Agriculture and Food Security. Science 327:828–831
IPES-Food (2017) Too big to feed: Exploring the impacts of mega-mergers, consolidation and
concentration of power in the agri-food sector. Available at www.ipes-food.org/images/
Reports/Concentration_FullReport.pdf. Accessed July 2018
Johnson J (2012) Precision Agriculture: Higher Profit, Lower Cost. Available at www.precisionag.
com/institute/precision-agriculture-higher-profit-lower-cost/. Accessed July 2018
Keulemans W (2015) Food Production and Food Security: The Incomplete Truth. Available at
www.kuleuven.be/metaforum/docs/pdf/wg_33_e.pdf. Accessed July 2018
Khan L (2013) Monsanto’s scary new scheme: Why does it really want all this data? Available at
www.salon.com/2013/12/29/monsantos_scary_new_scheme_why_does_it_really_want_all_
this_data/. Accessed July 2018
Kritikos M (2017) Precision agriculture in Europe. Legal, social and ethical considerations (study
commissioned by the European Parliamentary Research Service’s Scientific Foresight Unit).
Available at www.europarl.europa.eu/RegData/etudes/STUD/2017/603207/EPRS_STU(2017)
603207_EN.pdf. Accessed July 2018
Lamborelle A, Fernández Álvarez L (2016) Farming 4.0: The future of agriculture? Available at
www.euractiv.com/section/agriculture-food/infographic/farming-4-0-the-future-of-agriculture/.
Accessed July 2018
Lianos I, Katalevsky D (2017) Merger Activity in the Factors of Production Segments of the Food
Value Chain: A Critical Assessment of the Bayer/Monsanto merger. Available at www.
discovery.ucl.ac.uk/10045082/1/Lianos_cles-policy-paper-1-2017.pdf. Accessed July 2018
Magnin C (2016) How big data will revolutionize the global food chain. Available at www.
mckinsey.com/business-functions/digital-mckinsey/our-insights/how-big-data-will-
revolutionize-the-global-food-chain. Accessed July 2018
McMahon J, Cardwell M (eds) (2015) Research Handbook on EU Agriculture Law. Edward Elgar
OECD (2017) Algorithms and Collusion: Competition Policy in the Digital Age. Available at
www.oecd.org/competition/algorithms-collusion-competition-policy-in-the-digital-age.htm.
Accessed July 2018
Olson K, Rahm M, Swanson M (2010) Market Forces and Changes in the Plant Input Supply Industry. Choices 25:6–11
Perez C (2002) Technological Revolutions and Financial Capital: The Dynamics of Bubbles and
Golden Ages. Edward Elgar, Cheltenham
Poppe K, Wolfert S, Verdouw C, Renwick A (2015) A European Perspective on the Economics of
Big Data. Farm Policy Journal 12:11–19
Poppe K, Wolfert S, Verdouw C, Verwaart T (2013) Information and Communication Technology
as a Driver for Change in Agri-food Chains. EuroChoices 12:60–65
Porter ME (1979) How Competitive Forces Shape Strategy. Harvard Business Review 57:137–145
Renda A, Cafaggi F, Pelkmans J, Iamiceli P, Correia de Brito A, Mustilli F, Bebber L (2014)
Study on the legal framework covering business-to-business unfair trading practices in the
retail supply chain. Final report (prepared for the European Commission, DG Internal Market),
DG MARKT/2012/049/E. Available at http://ec.europa.eu/internal_market/retail/docs/140711-
study-utp-legal-framework_en.pdf. Accessed July 2018
Russo J (2013) Big Data & Precision Agriculture. Available at www.precisionag.com/systems-
management/data/big-data-precision-agriculture. Accessed July 2018
Schepp N, Wambach A (2016) On Big Data and Its Relevance for Market Power Assessment.
Journal of European Competition Law & Practice 7:120–124
Schumpeter J (1942) Capitalism, Socialism and Democracy. Harper & Brothers, New York
Sykuta ME (2016) Big Data in Agriculture: Property Rights, Privacy and Competition in Ag Data
Services. International Food and Agribusiness Management Review 19:57–74
The Hale Group and LSC International (2014) The digital transformation of row crop agriculture.
Report to the Iowa AgState Group. Available at www.cals.iastate.edu/sites/default/files/misc/
172832/agstate-executive-summary-15-dec-docx.pdf. Accessed July 2018
Tzounis A, Katsoulas N, Bartzanas T, Kittas C (2017) Internet of Things in agriculture, recent
advances and future challenges. Biosystems Engineering 164:31–48
Van Es H, Woodard J (2017) Innovation in Agriculture and Food Systems in the Digital Age. In:
Dutta S, Lanvin B, Wunsch-Vincent S (eds) Global Innovation Index 2017. Innovation
Feeding the World, 10th edn. Cornell University, INSEAD/the World Intellectual Property
Organization, Ithaca/Fontainebleau/Geneva, pp 97–104
Wesseler J, Bonanno A, Drabik D, Materia V, Malaguti L, Meyer M, Venus T (2015) Overview of
the agricultural inputs sector in the EU (study requested by European Parliament
Directorate-General for Internal Policies). Available at www.europarl.europa.eu/RegData/
etudes/STUD/2015/563385/IPOL_STU(2015)563385_EN.pdf. Accessed July 2018
Wolfert S, Ge L, Verdouw C, Bogaardt M (2017) Big Data in Smart Farming – A review.
Agricultural Systems 153:69–80
Wolfert S, Sørensen C, Goense D (2014) A Future Internet Collaboration Platform for Safe and
Healthy Food from Farm to Fork. 2014 Annual SRII Global Conference. IEEE, San Jose
(United States), pp 266–273

Tom Verdonk is a Ph.D. candidate at the KU Leuven’s Institute for Consumer, Competition &
Market. He obtained his LL.B. (Utrecht Law College honours programme) and his LL.M. (Law
and Economics, cum laude) at Utrecht University. During his studies he worked as a paralegal for
an Amsterdam-based competition law firm. Prior to joining KU Leuven, he completed internships
at law firms in Amsterdam and Brussels and with a Dutch Senate delegation.
Chapter 8
Sharing Data and Privacy in the Platform Economy: The Right to Data Portability and “Porting Rights”

Silvia Martinelli, University of Turin, Turin, Italy. E-mail: silviamartinelli89@gmail.com

Contents

8.1 Introduction to the “Platform Economy”: Network Effects and Switching Cost 134
8.2 The Right to Data Portability as a Milestone for Competition Law, User’s Protection and Privacy: An Introduction 137
8.3 The Right to Data Portability in the General Data Protection Regulation and in the Guidelines of the Article 29 Working Group: The Privacy Concern 140
8.4 Non-personal Data and Professional Users: The Proposals of the EU Commission 145
8.5 Provided, Observed and Inferred Data: Regulating New Technology in Uncertain Times 148
References 151

Abstract This chapter analyses the right to data portability and its peculiarities in
the platform economy, where this right is fundamental for competition law, users’
protection and privacy, because of the presence of strong direct and indirect net-
work effects and consequent high switching costs. In particular, it analyses the right
to data portability as set out in the GDPR, together with the interpretation given by
the Article 29 Working Group, and the other “porting rights” in the Digital Single
Market strategy and in the European Commission Proposals “for a Regulation on a
framework for the free flow of non-personal data in the European Union”, “for a
Regulation on promoting fairness and transparency for business users of online
intermediation services” and in the proposed “Directive on certain aspects con-
cerning contracts for the supply of digital content”. It underlines six critical issues
related to the right to data portability: (1) a privacy issue, due to the huge sharing of
data of other individuals; (2) the need to establish the portability of non-personal
data; (3) the need to establish the portability for professional users that are not
natural persons; (4) the need to protect the rights of the controller and his invest-
ment when data is not merely collected but also reworked; (5) the risk of decreased
competition with a strong and non-scalable regulation; (6) the necessity to pay
attention to the technical solutions available in order to assure practicable appli-
cation methods, in particular considering the needs of smaller operators.

Keywords Data portability · Social network · Platform economy · Consumer · Competition · Privacy

8.1 Introduction to the “Platform Economy”: Network Effects and Switching Cost

In the so-called “networked information economy”,1 where “data is at the centre of the future knowledge economy and society”,2 platform users continuously generate
huge amounts of information and content, often without commercial goals.
However, new business models are able to exploit the contents created or the
analysis of the data generated for commercial purposes.
Social networks like Facebook enable new forms of communication and con-
nection between users, who meet and communicate through the platform providing
rich and detailed information about themselves. They can be defined as “web-based
services that allow individuals to (1) construct a public or semi-public profile within
a bounded system, (2) articulate a list of other users with whom they share a
connection, and (3) view and traverse their list of connections and those made by
others within the system”.3
The vast majority of social networking services are provided free of monetary
charges; however, they can be monetised through other means, such as advertising
or charges for premium services.4 The companies foster the perception that the
social media services are provided for free, but they have a precise business model
based on the collection and analysis of data to offer targeted advertising services.
“Personal information operates as a currency”5 and the value of the data is
extracted in a four-step “personal data value chain” consisting of (1) collection and
access, (2) storage and aggregation, (3) analysis and distribution and (4) usage of

personal datasets.6 The results of the data analysis, crystallized in new data, are possible thanks to sophisticated algorithms that are able to provide different kinds of user-advertising services.

1 Benkler 2006.
2 European Commission 2014a.
3 Boyd and Ellison 2007.
4 European Commission 2014b.
5 European Commission 2015, Article 3.1; EDPS 2014, 2016a; Resta 2018; Colangelo and Maggiolino 2017.
Platforms such as social networks can also be defined as “multi-sided platforms”. The
platforms “serve distinct groups of customers who need each other in some way,
and the core business of the two-sided platform is to provide a common (real or
virtual) meeting place and to facilitate interactions between members of the two
distinct customer groups”.7 There are two or more groups of users and the matching
between all of them is made possible by the platform itself.
Social networks are a particular multi-sided platform where users usually provide data in order to receive the social network’s services; the platform provides the service to the first group of users, analyses the data and processes these data to offer advertising services to another group of users.8
It is possible to identify a second type of multi-sided platform which serves
distinct groups of customers but uses a different business model, not based on
advertisements. We can use the term “intermediary platform” (or “exchange plat-
form”) to define the multi-sided market platforms which enable the meeting
between sellers and buyers of goods and services: for example Booking, Airbnb,
BlaBlaCar, but also Amazon (when the company is not the seller directly).
In these cases the platform, through the use of data analysis and algorithms,
makes the meeting between two or more groups of users possible while offering
other facilities which allow for the reduction of transaction costs.9 The fundamental
role of these platforms is to “enable parties to realize gains from trade or other
interactions by reducing the transactions costs of finding each other and interact-
ing”.10 Different platforms engage in these activities to different degrees, with no
profit or commercial purpose. It is also not uncommon that the platforms devise
rules and regulations in order to reduce externalities and to increase the trust in the
platform as a whole.
The two described types of platforms, “social network platform” and “inter-
mediary platform”,11 are now spreading across the web and they are becoming the
“new square” and the “new market” where people meet and interact, because of the chance they offer to reach a selected audience. For example, Airbnb allows non-professional individuals to offer rooms or apartments and to find interested individuals. This became possible only thanks to the platform and because of the use of Big Data and algorithms, and it is likely to increase in the upcoming years.

6 EDPS 2014; European Commission 2017a.
7 Evans et al. 2011; Frank 2014.
8 Stucker and Grunes 2016; Graef 2015; EDPS 2016a.
9 It is estimated that around 60% of private consumption and 30% of public consumption of goods and services related to the total digital economy are transacted via online intermediaries. European Commission 2018a.
10 Evans et al. 2011.
11 The present analysis is limited to the two types of platforms described here and does not include search engines because, in the opinion of the writer, there are substantial differences in the latter case. In particular, the content listed by a search engine is not created on the “search engine platform” but is only a second representation and organisation of content published on another website. Furthermore, in the case of search engines the user’s profile has a different and lower importance, based on the creation of the filter bubble rather than on the public representation of the user.
The major problem of these new “squares” and “markets” is market dominance by a few actors versus a variety of suppliers and traders. The large size of a few platforms widely used around the world is a concern, because they are private regulators of their communities of users and they acquire more and more power. A few platforms emerged due to network effects and switching costs, which reduce competition in the market. These effects are moreover amplified by the network effects caused by the use of Big Data, which are fundamental for the success of this type of platform; as a result, only a limited number of successful platforms assert themselves in the global market.12
To better understand these effects, it is necessary to distinguish between “direct”
and “indirect network effect”: in the first case the value of joining the platform for
the individual user increases with the number of users (“if all my friends are there, I
want to be there”); in the second case, more users on one side of the platform attract
more users on the other side of the platform (“if my consumer/target is there, I want
to sell/promote my products there”).
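To see the two effects side by side, one can borrow a stylised formalisation from the economics literature; the following display is an illustration added here for clarity (the linear form of the indirect effect and the parameters α and β are assumptions, not the chapter’s own model), with direct effects following Metcalfe’s law and indirect effects sketched as the utility of joining side A rising with the size of side B:

\[
V_{\text{direct}}(n) \;\propto\; \frac{n(n-1)}{2} \;\approx\; n^{2},
\qquad
u_{A}(n_{B}) \;=\; \alpha + \beta\, n_{B}, \quad \beta > 0,
\]

where n is the number of users on the platform and n_B the number of users on the other side of the market.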
The existence of strong direct and indirect network effects in the platform
economy13 creates and increases the current dominant positions and in both cases
the large use of Big Data14 profiling is a factor which multiplies these effects:
“volume and quality of data are positively correlated with the variety and quality of
the offered products and services, since companies can offer better products by
analysing ‘more’ consumer behaviour”.15 Traditional network effects, as evidenced
by social networks like Facebook, are now multiplied by network effects involving
the scale of data, network effects involving the scope of data, and network effects
where the scale and scope of data on one side of the market affect the other side of

the market (i.e. as advertising).16 In fact, there is a strong tendency towards market concentration in the data industry: “simply put, the more data is fed into the analysis, the better and more efficient the service becomes”.17 This is also called the “big data advantage”.18

12 European Commission 2018a.
13 The term “platform economy” is used here to refer to social media platforms and exchange platforms, as mentioned and described above.
14 “Big Data” are commonly defined by the use of the three “V”s (or sometimes four or five): volume, variety (which refers to mostly unstructured data sets from sources as diverse as web logs, social media, mobile communications, sensors and financial transactions) and velocity (or the speed at which data is generated, accessed, processed and analysed). The definition is still vague and “the problem still with the 3Vs and similar definitions is that they are in continuous flux, as they describe technical properties which depend on the evolving state of the art in data storage and processing”. See also OECD 2014. More simply, in the words of Viktor Mayer-Schönberger and Kenneth Cukier, “big data refers to things one can do at a large scale that cannot be done at a smaller one, to extract new insights or create new forms of value, in ways that change markets, organizations, the relationship between citizens and governments, and more”. See also Mayer-Schönberger and Cukier 2013.
15 Engels 2016.
“Switching costs” are the barrier costs that users may face when trying to switch to another platform. They can increase due to network effects. As these costs rise, it becomes more difficult for users to move to a different platform. In fact, we are witnessing a consolidation of platform lock-ins, due not only to strong network effects and the consequent high switching costs (“I don’t want to change the platform because my friends/consumers/sellers are there” and “If I decide to change platform I will lose all my friends/customers/connections”), but also to the difficulty of transferring reputation and relevant data: a user planning to move to a different platform will lose his “history”, meaning the interactions and reputation built day by day on the platform.
Because of the joint presence of the effects described, it is particularly difficult for the user to move to a new platform and, as a consequence, it is difficult for a new platform to be competitive with the major platform operators. Furthermore, a limited number of platforms can manage all the data and relationships between the users. In the light of the above considerations, it is fundamental to increase competition in the “platform market”, and this could be done through the widespread use of “the right to data portability” and “portability rights”, meaning the rights which can favour the sharing and transfer of data between platforms. In multi-sided platform markets, more than in other areas, competition, and hence portability, becomes an imperative. However, at the same time, it is fundamental to analyse and understand the problems associated with the right to data portability, in order to identify legal and technical solutions that mitigate its negative effects and to make sure that its implementation effectively increases competition rather than limiting it.

8.2 The Right to Data Portability as a Milestone for Competition Law, User’s Protection and Privacy: An Introduction

“Data portability” means the ability to move data between applications or platforms
and may be a milestone for boosting competition in the data economy and, in
particular, in the platform economy, because of the strong network effects
described.
At present, the right to data portability is set out in the new Regulation 679/2016
of the European Parliament and of the Council of 27 April 2016 “on the protection

16
Stucker and Grunes 2016.
17
European Commission 2017a.
18
Stucker and Grunes 2016.
of natural persons with regard to the processing of personal data and on the free
movement of such data, and repealing Directive 95/46/EC” (General Data
Protection Regulation), as a right of the data subject. The right, provided by Article
20 of the GDPR, is the right of the data subject to “receive the personal data
concerning him or her” and it is set out in the GDPR with regard to “personal data”
of a “natural person”.
If strictly interpreted, the right to data portability, as affirmed in Article 20, does not extend to “non-personal data” or to data referring to a “professional user”. Nevertheless, the European Union has argued in favour of a general portability or transferability of raw personal and non-personal data19 and the European Commission has already put forward some proposals to extend such a form of portability to “non-personal data” and professional users.
Data portability is fundamental not only for privacy, but also for the growth of
the Digital Single Market and it involves competition law, user protection20 and
privacy as a fundamental personal right. The European Data Protection Supervisor
underlines these connections in the “Preliminary Opinion” on “Privacy and com-
petitiveness in the age of big data: The interplay between data protection, com-
petition law and consumer protection in the Digital Economy”, where it is affirmed
that implementing the right to data portability, as set out in the GDPR for the
protection of personal data, by giving the user options to withdraw their personal
information and to port it to another service provider “would potentially empower
individuals while also promoting competitive market structures”. In particular, the right to data portability goes further than the principle of transparency, under which the data subject has the right to know how his or her data are processed, and further than the right of “data access”, which allows the data subject to know exactly what data are processed.
Data portability is the right to download data in a “structured, commonly used and
machine-readable format” and transmit these data to a different data controller. It
would allow users to transfer data between online services and to give them to third
parties.
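By way of illustration only, the following minimal Python sketch shows what such a portable export might look like in practice. Article 20 does not prescribe any particular format; JSON is used here purely as one example of a “structured, commonly used and machine-readable format”, and all field names and values are invented.

import json

# Hypothetical example of a portable export under Article 20 GDPR.
# All field names and values are invented for illustration.
portable_export = {
    "profile": {                        # data actively provided
        "display_name": "Hat Dealer",
        "registered": "2015-03-02",
    },
    "listings": [                       # data actively provided
        {"title": "Felt hat", "description": "Hand-made felt hat"},
    ],
    "search_history": [                 # observed data (generated by use)
        {"query": "shipping labels", "timestamp": "2018-05-01T10:15:00Z"},
    ],
}

# Serialising to JSON yields a file that a receiving controller can parse
# without knowing anything about the exporting platform's internal systems.
with open("portability_export.json", "w") as f:
    json.dump(portable_export, f, indent=2)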
Concerning competition law in the platform economy, the advantages are manifold:21 in a market characterised by dominant positions and strong network effects, where traditional network effects are further amplified by the network effects caused by the use of large amounts of data, data portability and the sharing of these data are essential. Only with a plain right to data portability can new platforms and business models emerge and some form of competition develop. Users regain the power to switch to another platform without losing the time invested in the previous one. If the user can take a copy of his data from a

19 European Commission 2016.
20 The term “user protection” is used instead of “consumer protection” because in the case of the users of these platforms there is a lack of negotiating power not only in the contract between consumers and the platform, but also in contracts between the platform and professional users.
21 Vanberg and Ünver 2017; Graef et al. 2014; Engels 2016; Graef 2015; Lynskey 2017; Graef 2016; Colangelo and Maggiolino 2017.
platform and transfer all the data to a new one, this will also reduce the network effects directly linked to data access.22 For example, a hat dealer who uses the platform Amazon to sell his products can decide to leave for the new platform “BuyBuyBuy” without losing the descriptions of the products created on the first platform and maybe, if the current obstacles to a plain right to data portability were removed, the buyers’ comments in the reputational feedback system could be ported too.
Regarding users’ protection (both consumer and professional), the right to data portability can strengthen the power of the data subject over his data, in particular if the right is used in connection with the right to erasure. If a user can take a copy of his data and ask for and obtain the deletion of all his data on the platform, he has more contractual power in the platform relationship. In the previous example, the dealer can decide to move to a new platform, also because of unsatisfactory contractual conditions, and delete all his information from the previous one. If many users act in the same way, the platform may decide to amend some clauses. As a second example, on a social media platform such as Facebook, if a user loses trust in the transparency of the platform, he can take a copy of his data, history and relationships, move to a new platform and delete all the information he uploaded on Facebook.
It is not as easy as it sounds, because of the network effects described: the user will only move to a new platform where he can find his buyers or his friends. However, with the possibility to exercise a full right to data portability, he will not lose his “history” and the time spent uploading all the information to the platform. Furthermore, the transfer of the data will reduce the additional network effect caused by data: the new competitor platform will easily receive large amounts of data, which will enable it to improve the services offered.
As a consequence, if the variety of platforms on offer is wider and the cost of transition is not excessive, users will have more contractual power and the risk of abuse of dominance will be reduced.
The possibility for the user to port, share and also delete data is therefore a milestone for the digital economy and the EU Digital Single Market. “Building a European data economy”23 is part of the European Union’s “Digital Single Market strategy”. It aims at “enabling the best possible use of the potential of digital data” and at “unlock[ing] the re-use potential of different types of data and its free flow across borders”.24

22 The OECD also underlined that “The monetary, economic and social value of personal data is likely to be governed by non-linear, increasing returns to scale. The value of an individual record, alone, may be very low but the value and usability of the record increases as the number of records to compare it with increases. These network effects have implications for policy because the value of the same record in a large database could be much more efficiently leveraged than the same record in a much smaller data set. This could have implications for competition and for other key policy items such as the portability of data”. See OECD 2013.
23 European Commission 2018b.
24 Ibid. Data sharing and re-use can be generally understood as making data available to or accessing data from other companies for business purposes; European Commission 2018c.
This contribution will first analyse the right to data portability as set out in the GDPR and as interpreted by the Article 29 Working Party.25 It will then analyse the other “porting rights” in the Digital Single Market strategy and in the European Commission proposals “for a Regulation on a framework for the free flow of non-personal data in the European Union” and “for a Regulation on promoting fairness and transparency for business users of online intermediation services”, as well as in the proposed “Directive on certain aspects concerning contracts for the supply of digital content”.
A broad interpretation and application of the right to data portability raises important concerns about privacy and data protection. Data portability increases the circulation of personal data, but it constitutes the best way to diminish or slow down the concentration of power and monopolisation. In a context where the “platformisation of our economy and, more generally, our society”26 is actually taking place, it is important to improve competition and enable new platforms to compete. In addition, the right offers a good vantage point from which to underline some further problems concerning data protection law and its relationship with other European legislation and proposals.

8.3 The Right to Data Portability in the General Data Protection Regulation and in the Guidelines of the Article 29 Working Party: The Privacy Concern

The General Data Protection Regulation aims to protect natural persons in relation to the processing of personal data, as a fundamental right set out in Article 8 of the Charter of Fundamental Rights of the European Union, an “integral part of human dignity, and a prerequisite for many other social goods such as free expression and innovation”.27 The Regulation applies whenever there is processing of “personal data”, that is, any information relating to an identified or identifiable natural person, who is called the “data subject”.28

25 Article 29 Working Party 2017.
26 Belli and Zingales 2017.
27 Buttarelli 2017. See also Floridi 2016; Lynskey 2015; UNESCO 2016.
28 As established in Article 4 of the GDPR, “personal data” means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person; “processing” means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction.
The right of data portability is set out in Article 20 of the Regulation as the right
of the data subject to “receive the personal data concerning him or her, which he or
she has provided to a controller, in a structured, commonly used and
machine-readable format and have the right to transmit those data to another
controller without hindrance from the controller to which the personal data have
been provided”. The data subject may also ask for direct transmission from one controller to another, where technically feasible (Article 20.2).
Under the previous Directive 95/46/EC, the data subject could exercise a right of access to know all the data related to him, but he was constrained by the format chosen by the data controller to provide the requested information; by contrast, “the new right to data portability aims to empower data subjects regarding their own personal data, as it facilitates their ability to move, copy or transmit personal data easily from one IT environment to another”.29
The right to data portability means only a right to move, copy or transmit the
data. The exercise of the right of portability and the right to be forgotten (Article 17)
are independent: data portability “does not automatically trigger the erasure of the
data from the systems of the data controller, and does not affect the original
retention period applying to the data which have been transmitted”.30
The Article 29 Working Party released the “Guidelines on the right to data
portability”31 providing guidance on the way to interpret and implement the right.
The most important part of this document concerns the conditions under which this
new right applies.
The right to data portability as regulated by Article 20 of the GDPR applies where
the processing is based on consent32 or on a contract (“where processing is necessary
for the performance of a contract to which the data subject is party or in order to take
steps at the request of the data subject prior to entering into a contract”).33
The Article 29 Working Party specifies that the right to data portability covers not only data provided knowingly and actively by the data subject, but also the personal data generated by his or her activity. In particular, it includes: (a) personal data concerning the data subject; (b) the data which the data subject has provided to a data controller. With regard to letter (a), it is necessary to specify that the Article 29 Working Party includes “pseudonymous data that can be clearly linked to a data subject”, but not anonymous data. With regard to letter (b), the Working Party distinguishes three categories of data: (1) data actively and knowingly provided by the data subject, (2) observed data, and (3) inferred and derived data. In the platform economy, the first category includes all the data uploaded to the platform by the data subject, for example the information on the profile, photos, descriptions of the products, etc. In the second category, there are the data generated

29 Article 29 Working Party 2017.
30 Ibid.
31 Ibid.
32 Article 6.1, letter a or Article 9.2, letter a of the GDPR.
33 Article 6.1, letter b of the GDPR.
on the platform by the user’s activities, for example traffic data and search history. In the third group, there are the data created by the data controller by analysing the first two categories. For the Article 29 Working Party, the right to data portability must be interpreted broadly: the first two categories of data fall within the scope of data portability and only the third must be excluded.
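As a purely illustrative sketch, this three-way distinction can be thought of as a simple filter applied by the controller when assembling a portability export; the category labels below follow the Guidelines, while the data model and function names are hypothetical assumptions.

from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    PROVIDED = "actively and knowingly provided"   # e.g. profile data
    OBSERVED = "generated by use of the service"   # e.g. search history
    INFERRED = "created by the controller"         # e.g. profiling scores

@dataclass
class Record:
    payload: str
    category: Category

def portable_subset(records):
    # Following the Guidelines' broad reading: provided and observed data
    # are in scope of Article 20, while inferred/derived data are excluded.
    return [r for r in records if r.category is not Category.INFERRED]

records = [
    Record("profile photo", Category.PROVIDED),
    Record("search history", Category.OBSERVED),
    Record("credit-risk score", Category.INFERRED),
]
print([r.payload for r in portable_subset(records)])
# prints: ['profile photo', 'search history']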
The distinction into three categories stems from the need to solve two of the main problems related to data portability: a privacy issue, due to the large-scale sharing of data relating to other individuals, and the need to protect the rights of the controller and his investment when the data are not merely collected but also reworked. In this contribution, the first problem will be analysed.
In a traditional data process, the data controller collects and analyses data provided by the data subject and sometimes extracts new data from the provided data. In the platform economy the set-up is more complex, because the data subjects can interact with each other and generate new data using the platform. For example, on a social network it is possible to publish a picture of a group of friends and “tag” all of them, or to publish a post about a friend in a group. On exchange platforms, the connections between seller and buyer always concern both parties, because the data regarding the exchange contain personal data of both subjects. In addition, it is sometimes possible to inquire about a seller or a product through a previous buyer. The data will then also involve personal information about other users. All these data, generated on the platform by the users’ activity, concern more than one data subject, and this becomes a limit to the right to data portability, because porting them would require the permission of all the data subjects involved.
A broad interpretation of the right to data portability could easily lead to a wide sharing of data which also relate to other data subjects. Considering how social networks and intermediary platforms work, the data “connected to a data subject” contain a lot of information relating to all his contacts. The exercise of data portability by one data subject could therefore have implications (“privacy invasions”) for many different individuals.
How is it possible to balance the right to “share data” with the right to privacy of other individuals? What is the right balance between privacy and competition/consumer protection?
The GDPR does not solve the question, but it underlines that the right to data portability “shall not adversely affect the rights and freedoms of others”.34 The Article 29 Working Party tries to extend the application of the right to data portability also to data which involve more than one data subject. In particular, it states that when a data controller processes “information that contains the personal data of several data subjects”, he “should not take an overly restrictive interpretation of the sentence ‘personal data concerning the data subject’”. The example the Working Party gives is the case of a telephone account containing numbers and messages from other individuals, and thus personal data of multiple people; in this case the data controller should respond to the data portability request because the data also concern

34 Article 20.4 of the GDPR.
the data subject, but if such data are then transmitted to a new data controller, the new data controller “should not process them for any purpose which would adversely affect the rights and freedom of the third-parties”.
In the opinion of the writer, this can be a first compromise but not a solution, because it seems to enable the portability of data without the knowledge of all the data subjects involved. It does, however, properly shift the responsibility for protecting the third parties involved from the first data controller to the new one (the next platform, or the data subject himself if he processes the data for more than purely personal or household needs), making it easier for the first data controller to answer the data portability request without much concern. The problem then falls on the next data controller, who must find another ground for the lawfulness of processing the third parties’ data involved.35
The Article 29 Working Party suggests that where personal data of third parties
are included in the data set, another ground for the lawfulness of processing must be
identified.
Given the difficulty of distinguishing between different scopes and grounds for lawfulness in the Big Data age, where data are collected without knowing their future use and without distinguishing between different categories of data and processing, it is reasonable to doubt whether such distinctions and controls could ever be implemented.
Obviously, the implementation of consent mechanisms for the other data subjects involved could be a straightforward way to respect the third parties involved. For example, when a data subject decides to exercise his right to data portability, the platform can send a request to all the other data subjects involved, asking for their consent to the transmission of the data referring to them. Through this mechanism, the third data subjects would know about the portability request and could consent or object. However, a system based on the consent of all the data subjects involved requires the implementation of a mechanism enabling the exclusion of data in the case of objection. Furthermore, the implementation of tools to “enable data subjects to select the relevant data and exclude (where relevant) other data subjects’ data”36 might help.
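A minimal sketch of such a consent-gated export is given below, assuming (purely hypothetically) that each record lists the other data subjects it refers to, and that the platform has already collected the third parties’ answers to its notification; every name and data structure is invented for illustration, not drawn from any existing system.

# Simulated consent registry: third parties who answered "yes" to the
# platform's notification. In practice this would be an asynchronous flow.
CONSENTING = {"friend_anna"}

def has_consented(third_party_id: str) -> bool:
    # An objection, or simply no answer, counts as a refusal.
    return third_party_id in CONSENTING

def export_with_third_party_consent(records, requester_id):
    exported = []
    for record in records:
        third_parties = [s for s in record["subjects"] if s != requester_id]
        # Keep the record only if every affected third party has consented.
        # Records concerning the requester alone always pass this test.
        if all(has_consented(tp) for tp in third_parties):
            exported.append(record)
    return exported

records = [
    {"payload": "group photo", "subjects": ["me", "friend_anna"]},
    {"payload": "private chat", "subjects": ["me", "friend_bob"]},
    {"payload": "my profile", "subjects": ["me"]},
]
print([r["payload"] for r in export_with_third_party_consent(records, "me")])
# prints: ['group photo', 'my profile']

The design choice embodied here, excluding data by default whenever an affected third party has not consented, is only one of several conceivable balances between portability and third-party privacy.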
These aspects have a direct bearing on the practical and technical application of the right: the “structured, commonly used and machine-readable format” which supports
35 As an example, consider the case in which “a data subject exercises his or her right to data portability on his or her bank account, since it can contain personal data relating to the purchases and transactions of the account holder but also information relating to transactions, which have been ‘provided by’ other individuals who have transferred money to the account holder. In this context, the rights and freedoms of the third parties are unlikely to be adversely affected in the webmail transmission or the bank account history transmission, if their data are used for the same purpose in each processing, i.e. as a contact address only used by the data subject, or as a history of one of the data subject’s bank account. Conversely, their rights and freedoms will not be respected if the new data controller uses the contact directory for marketing purposes”.
36 Article 29 Working Party 2017. The new version is lighter for data controllers: “Additionally, the data controllers should implement consent mechanisms for other data subjects involved, to ease data transmission for those cases where such parties are willing to consent, e.g. if they also want to move their data to some other data controller. Such a situation might arise, for example, with social networks, but it is up to data controllers to decide on the leading practice to follow”.
re-use must also take account of this problem. In addition, it could be an opportunity to consider implementing new ways and tools to provide further utility for the end-user.37
There has never been a moment in history with so many reports of personal data exposure as the present one.38 It may seem that the right to data portability has potential adverse effects on privacy39 and could lead to a huge and uncontrollable use of data, acting as an open door for companies.40 Nevertheless, the right to data portability is crucial not only for competition among platforms, but also for data protection itself. In fact, the right to data portability was included in the General Data Protection Regulation as a right of the data subject, precisely in order to enable him to control his data.
The Article 29 Working Party said that it “represents an opportunity to
“re-balance” the relationship between data subjects and data controllers”41 and there
are those who believe that the right to data portability “is a stimulus for the IT
design community to reflect on how to do […] privacy in a different way” and it
“offers an opportunity to make the case for new privacy preserving business
models”.42
With regard to this, it is first important to underline that the exercise of data portability together with the right to erasure43 enables users to port their data to a new platform and delete the data on the previous one. It reduces consumers’ switching costs, improves their control over their data, can increase the demand for data agents and also helps individuals appreciate the value of personal data. If consumers are free to switch between platforms and understand the value of their data, they can demand more from the collectors.44 It is “about empowering users to exercise control and choice over how their data is handled” in order to obtain utility from accessing their data, “disrupt the established business models of platforms locking users in and importantly, to prompt creation of alternative commercial approaches to personal data in the market”.45
The implementation and the use of “personal information management systems”
(PIMS) could be a solution. PIMS are systems that allow individuals to manage
their personal data in secure, local or online storage systems; users can permit certain service providers to access data from, or analyse data in, their PIMS. PIMS can therefore be used as a clear point of control over access to the

37 Urquhart et al. 2018.
38 ENISA 2018.
39 Van der Auwermeulen 2017.
40 Ibid.
41 Article 29 Working Party 2017.
42 Urquhart et al. 2018.
43 Article 17 of the GDPR. Allow me to refer to Martinelli 2017.
44 Stucke and Grunes 2016, p. 322.
45 Urquhart et al. 2018.
data.46 These systems are at an early stage of development, and the way they are designed and their underlying business models differ widely, but the objective is to put users in control of their personal information and to serve as an effective and user-friendly mechanism to provide or withdraw consent.47 In addition, PIMS might be an instrument to facilitate the exercise of the users’ rights of access, rectification and erasure, and of the right to data portability.
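As a purely illustrative sketch of the PIMS idea just described, the following toy Python model keeps the individual’s data in one store and lets the individual grant, and withdraw, access for named providers; the class and method names are hypothetical and do not describe any existing PIMS product.

class PersonalDataStore:
    """Toy model of a PIMS: the user's data stays in one place and
    providers only see the categories they have been explicitly granted."""

    def __init__(self):
        self._data = {}      # category -> data
        self._grants = {}    # provider -> set of permitted categories

    def put(self, category, value):
        self._data[category] = value

    def grant(self, provider, categories):
        self._grants.setdefault(provider, set()).update(categories)

    def withdraw(self, provider):
        # Withdrawing consent is as simple as deleting the grant.
        self._grants.pop(provider, None)

    def read(self, provider, category):
        if category not in self._grants.get(provider, set()):
            raise PermissionError(f"{provider} has no grant for {category}")
        return self._data[category]

pims = PersonalDataStore()
pims.put("energy_use", [3.2, 2.9, 4.1])
pims.grant("tariff-advisor.example", {"energy_use"})
print(pims.read("tariff-advisor.example", "energy_use"))
pims.withdraw("tariff-advisor.example")  # consent withdrawn, access revoked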
Currently, however, there are many obstacles to overcome: the highly technical nature of the subject and of the solutions involved, the need to demonstrate the value of such technologies in order to ensure user participation, the lack of consistency in data formats, the different policies of the platforms involved, the relational nature of the data and the management of third parties’ personal data, and the capability of data to be copied, reused and propagated endlessly.48

8.4 Non-personal Data and Professional Users: The Proposals of the EU Commission

This section analyses the need to establish the portability of non-personal data and to grant data portability also to professional users that are not natural persons. These problems derive from the definition of personal data and the scope of the GDPR, and they also give rise to some thoughts on the relationship between the General Data Protection Regulation and other European legislation and proposals.
The distinction between personal and non-personal data is crucial. If the data are non-personal, the problems related to sharing them on a large scale through some portability right are significantly different from those raised by a huge sharing of personal data. In fact, if data are not personal, there are fewer privacy issues. There remains some concern regarding “group privacy”,49 but this probably needs a different solution, beyond traditional data protection. Moreover, if the data are non-personal, it is doubtful whether the GDPR applies at all, and with it the right to data portability as set out in Article 20.
The Article 29 Working Party, in its Guidelines, establishes that the right to data portability applies not only to the data “actively and knowingly provided” by the data subject, but also to the “observed data” provided by the data subject by virtue of the use of the service or device. In any event, it seems that only personal data belonging to these two categories can be the object of the right to data portability as set out in the GDPR. This is probably one of the reasons for the new proposals by the European Commission, in particular the proposed Regulation on the free flow of non-personal data, which provides for the porting of non-personal data.

46 Ibid.
47 EDPS 2016b.
48 Urquhart et al. 2018.
49 Taylor et al. 2017; Mantelero 2016.
The need for a “free flow of data” is clearly outlined in the European Commission “Proposal for a Regulation on a framework for the free flow of non-personal data”,50 with the objective of unlocking the potential of the data economy. The proposal applies to “non-personal data” and aims to address three fundamental
issues: “1) Improving the mobility of non-personal data across borders in the single
market, which is limited today in many Member States by localisation restrictions
or legal uncertainty in the market; 2) Ensuring that the powers of competent
authorities to request and receive access to data for regulatory control purposes,
such as for inspection and audit, remain unaffected; and 3) Making it easier for
professional users of data storage or other processing services to switch service
providers and to port data, while not creating an excessive burden on service
providers or distorting the market”.
The third point aims to make it possible for users to switch service providers and to port data when the user is a professional and the data are “electronic data other than personal data”. Hence the proposal does not affect the Union data protection legal framework, and in particular the GDPR, but complements it. Nevertheless, some concerns have been expressed about the possibility of effectively implementing such a distinction and, consequently, about the opportunity of introducing new rules on the circulation of data outside the GDPR.51
Article 6 of the Proposal, “Porting Data”, invites the European Commission to
“encourage and facilitate the development of self-regulatory codes of conduct at
Union level, in order to define guidelines on best practices in facilitating the
switching of providers” and to ensure sufficiently detailed, clear and transparent
information before a contract for data storage is entered into. In particular it
establishes that the professional users have a right to port the data provided under
the contract and that the technical implementation of this right, which must ensure a
structured, commonly used and machine-readable format and allow sufficient time
for the users to switch or port the data, should be “defined by market players
through self-regulation, encouraged and facilitated by the Commission, in the form
of Union codes of conduct which may entail model contract terms”.52
The aim of the Proposal is to protect professional operators in their use of providers, platforms and cloud services, to avoid the abovementioned lock-in, and to enable the “free flow of data”; nevertheless, its application is limited to “non-personal data” and the instrument chosen is the code of conduct, encouraged by the European Commission.
A similar aim inspired the new “Proposal for a Regulation on promoting fairness
and transparency for business users of online intermediation services”.53 This
Regulation would apply to “online intermediation services and online search
engines provided, or offered to be provided, to business users and corporate website

50 European Commission 2017b.
51 EDPS 2018.
52 Recital 22 of the Proposal for a Regulation.
53 European Commission 2018a.
users, respectively, that have their place of establishment or residence in the Union
and that, through online intermediation services or online search engines, offer
goods or services to consumers located in the Union, irrespective of the place of
establishment or residence of the providers of those services”. The aim is to protect
business users from providers:
“The growing intermediation of transactions through online intermediation
services, fuelled by strong data-driven indirect network effects, lead to an increased
dependence of such business users, including micro, small and medium-sized
enterprises, on those services in order for them to reach consumers. Given that
increasing dependence, the providers of those services often have superior bar-
gaining power, which enables them to effectively behave unilaterally in a way that
can be unfair and that can be harmful to the legitimate interests of their business users and, indirectly, also of consumers in the Union”.54
The core of the proposal is the introduction of a notice period for the modification of terms and conditions, a statement of reasons based on objective grounds for suspension and termination, transparency regarding ranking and differentiated treatment, access to data, and internal complaint-handling systems.
In particular, according to the proposed Article 6, regarding “Access to data”,
providers of online intermediation services “shall include in their terms and con-
ditions a description of the technical and contractual access, or absence thereof, of
business users to any personal data or other data, or both, which business users or
consumers provide for the use of the online intermediation services concerned or
which are generated through the provision of those services”. The description
should include scope, nature and conditions of the access and “might refer to
general access conditions, rather than an exhaustive identification of actual data, or
categories of data, in order to enable business users to understand whether they can
use the data to enhance value creation, including by possibly retaining third-party
data services”.55 The aim is both to promote transparency and fairness in the use of
data and to enable business users to obtain or bargain about the use of data.
Here again, new rules concerning data and professional users would sit outside the GDPR, but in this case they concern transparency and access more than data circulation. This is not yet a data portability right, but only a right to know exactly the type of data and processing involved, through a description in the contractual terms. In the opinion of the author, it is nevertheless relevant for data portability, because it is a prerequisite for enabling professional users to negotiate their rights over the data.
Furthermore, the proposed “Directive on certain aspects concerning contracts for the supply of digital content”,56 even if it does not use the term “data portability”, establishes that in case of termination of the contract, concluded

54 Recital 2 of the proposed Regulation.
55 Recital 20 of the proposed Regulation.
56 European Commission 2015. The Directive shall apply “to any contract where the supplier supplies digital content to the consumer or undertakes to do so and, in exchange, a price is to be paid or the consumer actively provides counter-performance other than money in the form of personal data or any other data”.
between supplier and consumer, “the supplier shall take all measures which could
be expected in order to refrain from the use of the counter-performance other than
money which the consumer has provided in exchange for the digital content and
any other data collected by the supplier in relation to the supply of the digital
content including any content provided by the consumer with the exception of the
content which has been generated jointly by the consumer and others who continue
to make use of the content” (Article 13.2, letter b). Article 16.4, letter b (“Right to terminate long term contracts”) also specifies that “the consumer shall be entitled to
retrieve the content without significant inconvenience, in a reasonable time and in a
commonly used data format”.57
If this proposal is approved, the right to data portability will undergo a new expansion, and it will become easier to access, share and re-use data. Nonetheless, the question remains whether placing the new rules outside the GDPR is the best solution. It is obvious that there are fundamental differences between personal and non-personal data, and between data referring to a natural person and data referring to a professional user who is not also a natural person. When the data are not “anonymous data” and it is possible to link them to an identified or identifiable natural person, the level of protection required by the law is higher. However, as described in the previous section, the privacy problem cannot simply be related to the rights of the person who uploads the data to the platform. The dataset uploaded to the platform by a user, professional or non-professional, often contains personal data of third parties, to which the GDPR applies.
It is true that the GDPR seems to impact and absorb more and more areas of law and knowledge, but it is also true that it is the place in which the whole processing of data is regulated and made subject to accountability. It would probably be better to integrate the new rules into the GDPR, in order to coordinate them better with the existing rules and to avoid the risk of difficult interpretation and application, which can lead to legal uncertainty.

8.5 Provided, Observed and Inferred Data: Regulating New Technology in Uncertain Times

With regard to the scope of the right to data portability, and in particular to the need
to protect the rights of the controller, the distinction made by the Article 29
Working Party between provided, observed and inferred data becomes relevant.
Provided data are the data “actively and knowingly provided by the data subject”.

57 Article 16.4, letter b, “Right to terminate long term contracts”: “the supplier shall provide the consumer with technical means to retrieve all any content provided by the consumer and any other data produced or generated through the consumer’s use of the digital content to the extent this data has been retained by the supplier. The consumer shall be entitled to retrieve the content without significant inconvenience, in reasonable time and in a commonly used data format”. See also European Parliamentary Research Service 2016.
Observed data are “provided” by the data subject by virtue of the use of the service or the device. Inferred data and derived data are created by the data controller on the basis of the data “provided by the data subject”. The first category does not present particular difficulties; these arise, however, with the second and third categories.
The Working Party specifies that “the term ‘provided by the data subject’ should be interpreted broadly, and only to exclude ‘inferred data’ and ‘derived data’, which include personal data that are created by a service provider (for example, algorithmic results)”. Thus, the term includes “personal data that relate to the data subject activity or result from the observation of an individual’s behaviour but does not include data resulting from subsequent analysis of that behaviour”. All the data “created by the data controller as part of the data processing, e.g. by a personalisation or recommendation process, by user categorisation or profiling are data which are derived or inferred from the personal data provided by the data subject, and are not covered by the right to data portability” and shall therefore be excluded. In the writer’s opinion, the scope of data portability should nevertheless include systems of reputation and feedback scores, because “the information referring to a person’s reputation or feedback score is related to the data subject, even though this data was not given by the individual, and should therefore fall under the scope of data portability as personal data”.58
In other words, nearly all data obtained (provided or produced) from data sub-
jects will be “observed” data, while inferred or predicted personal data are “pro-
duced” by companies (e.g., through data mining).59
Issues arise with regard to distinguishing these data from “inferred data” and to determining which data must remain at the sole disposal of the data controller in order to safeguard his intellectual property, “particularly avoiding that the intellectual work of a digital service provider (data inferred about consumers, using complex algorithms) could be lawfully disclosed to competitive businesses for free”.60
This is a limitation of the right to data portability aimed at protecting intellectual property rights, which seems not only difficult to apply, but also inadequate to protect the interests and needs to which it intends to respond. “Inferred data” are in fact data generated by the data controller on the basis of data already in its possession, but these data can also be personal data. Take, for example, the case of inferred data generated by the algorithmic analysis of DNA data, which can describe the probability of incurring a disease. It seems that the data subject has the right of access concerning these inferred data, but not the right to data portability. This would probably slow down the migration of data to another data controller, but it would not solve the problem of protecting the invention behind the inferred data.

58 Van der Auwermeulen 2017.
59 De Hert et al. 2018.
60 Ibid.
The solution, however, can only be found in a new system for the protection of intellectual property concerning algorithms and Big Data, still in the early stages of theorisation, which should allow a wide circulation of data without compromising the investments, the work and the ingenuity of those who created them.
In conclusion, in a changing world where data portability rights will be essential, it seems necessary to underline three elements which should always be taken into account when developing new rules in the field of data portability.
First, too much regulation might reinforce and confirm existing dominant positions. It is therefore essential to calibrate the obligations and the diligence required according to the size and the concrete technical possibilities of the platform/data controller. Within the scope of the GDPR, this might be possible through a flexible interpretation of the “appropriate measures in terms of available technology and costs of implementation”. It will be fundamental to take this problem into account when devising new rules.
Second, privacy, competition and contract/consumer law are closely linked, and their analysis, as well as any normative instrument, can only be joint and well correlated. In particular, the right to data portability can be a milestone for competition, helping to avoid the risk of a platform market dominated by a few actors which control both the interactions and relationships between users and the data and algorithms that govern them. Even if data portability, and a wide interpretation of this right, can lead to a huge sharing of data, the effects of the absence of such a right could be even worse.
However, it is necessary to find new solutions, such as a new consent mechanism for the third parties involved, in order to apply this new right without totally compromising privacy. Furthermore, the user’s possibility to take all his data out of a platform and give it to another one, combined with the right to erasure, could represent a new power for the user. If this way-out mechanism proves effective, users and individuals may discover the importance of their data for the platform, and may try to obtain better terms and performance.
Third, technology evolves fast. This inevitably means that legislators must always keep in mind the need for flexibility, in order to allow the legislation to be applied even in a new technological context, but also that technology itself can be helpful in achieving their goals. The implementation of the right to data portability in order to empower data subjects, users and consumers will largely rest in the hands of technicians. Some of these problems require new technical solutions, such as new consent mechanisms and PIMS, and the whole data process and data security should be reinforced by technical control measures which foster users’ trust and enable control over the procedures for sharing data.
References

Article 29 Working Party (2017) Guidelines on the right to data portability. Retrieved from http://
www.ec.europa.eu/newsroom/document.cfm?doc_id=44099
Belli L, Zingales N (2017) Platform Regulations: How Platforms are Regulated and How They Regulate Us. Official Outcome of the UN IGF Dynamic Coalition on Platform Responsibility (United Nations Internet Governance Forum). Retrieved from http://bibliotecadigital.fgv.br/dspace/handle/10438/19402
Benkler Y (2006) The Wealth of Networks. How Social Production Transforms Markets and
Freedom. Yale University Press, New Haven/London
Boyd DM, Ellison NB (2007) Social network sites: Definition, history, and scholarship. J Comput
Commun 13:210–230
Buttarelli G (2017) Privacy matters: updating human rights for the digital society. Priv Secur Med
Inf
Colangelo G, Maggiolino M (2017) Big Data, Data Protection and Antitrust in the Wake of the Bundeskartellamt Case Against Facebook. New Front Innov Compet Big Data Case Law 1:104–112
De Hert P, Papakonstantinou V, Malgieri G, et al. (2018) The right to data portability in the
GDPR: Towards user-centric interoperability of digital services. Comput Law Secur Rev
34:193–203
EDPS (2014) Privacy and competitiveness in the age of big data: The interplay between data
protection, competition law and consumer protection in the Digital Economy. Preliminary
Opinion of the European Data Protection Supervisor. Retrieved from https://edps.europa.eu/
data-protection/our-work/publications/opinions/privacy-and-competitiveness-age-big-data_en
EDPS (2016a) The coherent enforcement of fundamental rights in the age of big data. Retrieved
from https://edps.europa.eu/data-protection/our-work/publications/opinions/big-data_en
EDPS (2016b) Opinion on Personal Information Management Systems. Towards more user
empowerment in managing and processing personal data. Opinion 9/2016. Retrieved from
https://edps.europa.eu/data-protection/our-work/publications/opinions/personal-information-
management-systems_en
EDPS (2018) Comments of the EDPS on a Proposal for a Regulation of the European Parliament
and of the Council on a framework for the free-flow of non-personal data in the European
Union. Retrieved from https://edps.europa.eu/data-protection/our-work/publications/
comments/edps-comments-framework-free-flow-non-personal-data_en
Engels B (2016) Data portability among online platforms. Internet Policy Rev J internet Regul
5:1–17
ENISA (2018) How Data is Under Siege like Never Before. Retrieved from https://www.enisa.
europa.eu/publications/info-notes/how-data-is-under-siege-like-never-before
European Commission (2014a) Towards a thriving data-driven economy. COM(422/2014)
European Commission (2014b) Case M.7217 – Facebook/WhatsApp Commission decision
pursuant to Article 6(1)(b) of Council Regulation No 139/2004
European Commission (2015) Proposal for a Directive on certain aspects concerning contracts for the supply of digital content
European Commission (2016) Communication on Online Platforms and the Digital Single Market
Opportunities and Challenges for Europe - COM(2016) 288
European Commission (2017a) Enter the Data Economy. EPSC Strateg Notes 1–16
European Commission (2017b) Proposal for a Regulation on a framework for the free flow of non-personal data in the European Union
European Commission (2018a) Proposal for a Regulation of the European Parliament and of the
Council on promoting fairness and transparency for business users of online. COM(2018) 238
European Commission (2018b) Building a European data economy
European Commission (2018c) Study on data sharing between companies in Europe
European Parliamentary Research Service (2016) Contracts for supply of digital content. A legal
analysis of the Commission’s proposal for a new directive. Retrieved from http://www.
europarl.europa.eu/thinktank/en/document.html?reference=EPRS_IDA(2016)582048
Evans DS, Schmalensee R, Noel MD, et al. (2011) Platform economics: Essays on multi-sided
businesses. Compet Policy Int 459
Floridi L (2016) On Human Dignity as a Foundation for the Right to Privacy. Philos Technol
29:307–312
Frank JS (2014) Competition Concerns in Multi-Sided Markets in Mobile Communication. In: Drexl J et al. (eds) Competition on the Internet
Graef I (2015) Mandating portability and interoperability in online social networks: Regulatory
and competition law issues in the European Union. Telecomm Policy 39:502–514
Graef I (2016) Blurring boundaries of consumer welfare: How to create synergies between
competition, consumer and data protection law. In: Personal Data in Competition, Consumer
Protection and IP Law: Towards a Holistic Approach? Retrieved from https://ssrn.com/
abstract=2881969
Graef I, Verschakelen J, Valcke P (2014) Putting the right to data portability into a competition
law perspective. Retrieved from https://ssrn.com/abstract=2416537
Lynskey O (2015) The Foundations of EU Data Protection Law. Oxford University Press
Lynskey O (2017) Aligning data protection rights with competition law remedies? The GDPR
right to data portability. Eur Law Rev 42:793–814
Mantelero A (2016) Personal data for decisional purposes in the age of analytics: From an
individual to a collective dimension of data protection. Comput Law Secur Rev 32:238–255
Martinelli S (2017) Diritto all’oblio e motori di ricerca. Memoria e privacy nell’era digitale.
Giuffrè
Mayer-Schönberger V, Cukier K (2013) Big Data: A Revolution That Will Transform How We
Live, Work, and Think. Houghton Mifflin Harcourt
OECD (2013) Exploring the economics of personal data: A survey of methodologies for
measuring monetary value. OECD Digit Econ Pap 40
OECD (2014) Data-driven Innovation for Growth and Well-being: Interim Synthesis Report, 86
Resta G (2018) Digital platforms and the law: contested issues. Medialaws 231–248. Retrieved
from www.medialaws.eu/wp-content/uploads/2018/02/Resta.pdf
Stucke ME, Grunes AP (2016) Big Data and Competition Policy. Oxford University Press
Taylor L, Floridi L, Sloot B van der (2017) Group Privacy. New Challenges of Data Technologies.
Springer
UNESCO (2016) Privacy, free expression and transparency. Redefining their new boundaries in
the digital age. Retrieved from www.unesdoc.unesco.org/images/0024/002466/246610e.pdf
Urquhart L, Sailaja N, McAuley D (2018) Realising the right to data portability for the domestic
Internet of things. Pers Ubiquitous Comput 22:317–332
Van der Auwermeulen B (2017) How to attribute the right to data portability in Europe: A
comparative analysis of legislations. Comput Law Secur Rev 33:57–72
Vanberg AD, Ünver MB (2017) The right to data portability in the GDPR and EU competition
law: odd couple or dynamic duo? Eur J Law Technol 8:1

Silvia Martinelli graduated in Law at the University of Milan, is a lawyer and a member of the
Milan Bar Association and a Ph.D. Candidate from the University of Turin. She is the author of
scientific articles and of a book on the right to be forgotten: “Diritto all’oblio e motori di ricerca.
Memoria e privacy nell’era digitale”, Giuffrè, 2017. She is also Affiliate Scholar at the Information
Society Law Center of the University of Milan, a Teaching Assistant (“Cultore della materia”) in
both Private Law and Legal Informatics, a Member of the Editorial Committee of the Law
Reviews “Ciberspazio e Diritto” and “Diritto, Mercato e Tecnologia”, a Fellow of the European
Law Institute and of the Italian Academy of Internet Code, and a Member of the European Law &
Tech Network.
Chapter 9
Regulating Smart Distributed Generation Electricity Systems in the European Union

Theodoros G. Iliopoulos

Contents

9.1 Introduction........................................................................................................................ 154
9.2 The Disruptive Nature of Distributed Generation ............................................................ 155
9.3 Promoting Distributed Generation in the EU: Merits and Risks ..................................... 157
9.4 The EU Legal Order: Law in Force and the Proposal for a Directive on the Internal
Electricity Market .............................................................................................................. 159
9.5 The Special Issue of Net Metering ................................................................................... 164
9.6 Conclusion ......................................................................................................................... 167
References .................................................................................................................................. 168

Abstract Technological advancements facilitate the transition to a decentralised
and smart distributed generation electricity system where active customers will have
a key role. Such a transition can contribute to making the electricity systems
cleaner, more secure, more efficient and less expensive. Nevertheless, the promo-
tion of distributed generation requires reforms to the applicable legislation, so that it
fits the new reality. Accordingly, the Commission has put forward a proposal for a
new Directive on the common rules for the internal electricity market. This proposal
shows the Commission’s support for distributed generation and focuses on the promotion of self-consumption, instead of net metering, and on the empowerment of electricity customers through smart meter technologies and secure data management and data protection regimes. This proposal might initiate the development of a supranational legislative framework fitting technological innovation in the
field of electricity, but it is rather a basic starting point. It remains to be seen how

T. G. Iliopoulos (✉)
Hasselt University, Martelarenlaan 42, Office FR 3.07, 3500 Hasselt, Belgium
e-mail: theodoros.iliopoulos@uhasselt.be

© T.M.C. ASSER PRESS and the authors 2019
L. Reins (ed.), Regulating New Technologies in Uncertain Times,
Information Technology and Law Series 32,
https://doi.org/10.1007/978-94-6265-279-8_9
Member States will respond to these initiatives and how they will implement the
relevant Directive, when—and if—enacted.

Keywords Electricity Law and Regulation · Internal Electricity Market · Distributed Generation · Active Customers · Prosumers · Smart Grid · Smart Meters · Net Metering

9.1 Introduction

Technological progress significantly affects human life. Machines are becoming ‘smarter’ and are acquiring more, and more advanced, functions. Because of technology and
digitalisation, a number of sectors, like communication, agricultural production, transportation etc., are experiencing radical changes. New terms, like the sharing economy,1 are entering everyday life and dominating the scientific debate. At the same time, novel concerns are becoming pressing, like the issue of data protection. But while technology moves forward in leaps and bounds, legislative frameworks cannot keep pace. In point of fact, technological progress generally precedes laws, which often end up not fitting what they aim to address. Consequently, policymakers, legislators and regulators often endeavour to catch up with a newly created reality and to respond to new challenges. This mismatch between legislation and innovation is described by the term ‘regulatory disconnection’2 or ‘regulatory uncertainty’.3
The ‘regulatory disconnection’ debate also applies to the field of electricity;
indeed, technological innovations have set the stage for important changes in the
electricity systems. The shift from centralised electricity systems towards a decen-
tralised distributed generation paradigm has been rendered possible. Distributed
generation can be defined as a ‘technology that allows power to be used and managed
in a decentralised and small-scale manner, thereby siting generation close to load’.4
Furthermore, digital evolution has enhanced the empowerment of electricity con-
sumers who can now avail themselves of a number of sophisticated features in order
to actively interact with the grid operators and to turn into active market actors.
Within this context, this chapter discusses the promotion of smart distributed
generation electricity systems, with the attention focused on the European Union
(hereinafter ‘EU’) and on EU law. It discusses the merits of promoting such a
decentralised and digitalised electricity system and examines the relevant legislative

1 The model of the sharing economy is founded on internet markets that allow ‘peer-to-peer’ transactions. Such digital markets facilitate the communication and transaction between owners that wish to rent out the durable goods they are not using and consumers who wish to make use of these goods. See Horton and Zeckhauser 2016; Einav et al. 2016.
2 Butenko 2016; Armeni 2015; Brownsword and Somsen 2009.
3 Stein 2014.
4 Cutler and Morris 2006, p. 120.
framework. In terms of the chapter’s structure, Sect. 9.1 contains the Introduction.
Section 9.2 presents the disruptive features of a distributed generation electricity
system. Section 9.3 examines whether such a system should be promoted.
Section 9.4 assesses the relevant EU law and the recently submitted proposal for a
Directive on the internal market in electricity.5 Section 9.5 deals with the special
issue of net metering schemes in order to examine why they do not have but only a
minor role in the promotion of distributed generation in the EU. Lastly, Sect. 9.6
contains the chapter’s conclusions.

9.2 The Disruptive Nature of Distributed Generation

This section presents the main features of a distributed generation electricity system
and investigates whether it can be regarded as a disruptive innovation, to wit
whether it introduces a new business model that affects the dynamics of the market
and makes it expand to new customers.6
But first, and before turning to the heart of the foregoing issues, it is helpful to
briefly examine the features of the conventional electricity system that has been
dominant since the late nineteenth century. Such a conventional system is rather
simple; it is characterised by massive power plants that generate electricity, mostly
from fossil fuels. This electricity is fed into the central grid and, from there, it is
transmitted to large industrial users and it is further distributed to smaller consumers
through a low-voltage distribution network. Traditionally, this scheme was wholly
controlled by a vertically-integrated state-owned company. This is because electricity was considered a natural monopoly, meaning that competition between
firms was regarded as by definition detrimental to efficiency.7 Nevertheless, this
traditional framework is not omnipresent anymore. As already mentioned, tech-
nological developments make possible the transition of electricity systems towards
a substantially different paradigm of distributed generation, which is founded on
small-scale generation that comes from different producers. Indeed, small-scale
installations for electricity production are becoming more and more affordable,
which facilitates decentralisation. The micro-production from these small-scale
installations can be consumed by the producers themselves or can be traded, either to be fed into the central distribution network or to be supplied directly to other small-scale generators that happen to need more electricity at a given time.8 Within this context, the term prosumers is gaining ground in the scientific
debate. The term is a combination of the words producer and consumer and

5 European Commission 2016a.
6 Rasool et al. 2018, p. 253; Hopp et al. 2018, p. 446.
7 Lavrijssen 2017, ch. 2; Grossman 2003.
8 Lavrijssen 2017; Olkkonen et al. 2017; Tomain 2012.

emphasises the fact that prosumers are active market participants who not only
consume, but also produce and sell their own generated electricity.9
Distributed generation and the position of prosumers are strengthened by the
evolution of the ‘smart grid’. The smart grid is ‘the integration of an electric grid, a
communications network, software, and hardware to monitor, control, and manage
the creation, distribution, storage and consumption of energy’.10 In short, a smart
grid is an upgraded and digitalised network that introduces a two-way electricity
flow.11 Thus, it permits those who have invested in small-scale electricity instal-
lations to feed their surplus into the grid. Furthermore, a smart grid also involves an
exchange of information between the interconnected parties. This communication is enabled by smart meter technology. With smart meters, consumers can have round-the-clock, real-time information about the price of electricity and their consumption, which allows them to better control their expenditure and to reduce their bills.12 The aforementioned developments are to be complemented by electricity storage. Nevertheless, storage technologies remain underdeveloped and too expensive to gain traction on a massive scale.13
Given the above, distributed generation is a disruptive innovation, in the sense
that it brings fundamental changes in the way electricity systems work; it has a
disruptive impact on the structure of the electricity market and impairs the position
of traditional market players. A decentralised electricity model coexists with and
puts pressure on the conventional centralised model. Further, well informed and
active prosumers are in a position to compete and discipline the incumbent elec-
tricity companies. ‘The energy consumer thus is viewed as a driver of
competition’.14
Faced with such a situation, the legislative framework needs to fit a smart distributed generation electricity system. Of course, the first question that has to be answered is whether and to what extent such a model should be promoted. In a next step, policymakers and regulators have to decide on the content of the legislative reform. These are the issues the next sections scrutinise, with the emphasis placed on the EU legal order.

9 The term prosumers was introduced by the futurist Alvin Toffler. See Toffler 1971.
10 Carvallo and Cooper 2006, p. 1.
11 Tomain 2012.
12 v. Schönfeld and Wehkamp 2018; Zame et al. 2018; Wunderlich 2013, ch. 1; Hamilton 2011, pp. 397–399.
13 Stein 2014.
14 Lavrijssen 2017, p. 174.

9.3 Promoting Distributed Generation in the EU: Merits and Risks

EU policymakers have explored the merits of a distributed generation electricity system and have found that it is linked with certain positive effects.
More specifically, distributed generation and smart grids can contribute to making
the electricity systems cleaner, more secure, more efficient and less expensive.
To begin with, a distributed generation system is based on renewable energy
sources. Indeed, the small-scale installations involve photovoltaic panels, micro
wind turbines, biomass and geothermal power.15 Therefore, the promotion of dis-
tributed generation can also boost the promotion of renewables and assist in the attainment of the EU target of a 32% share of energy from renewable sources in the gross final consumption of energy by 2030.16 Besides, the fact that consumers interested in self-production of electricity will turn to renewable energy sources on their own initiative entails an overall increase in the 'understanding and the acceptance of renewable energy'.17 Put in the common wording of
EU legal and policy documents, a smart distributed generation system enables
citizens to ‘take ownership of the energy transition’.18 The positive impact of
distributed generation on the deployment of renewable energy sources will also
contribute to combatting climate change, which is one of the proclaimed EU policy
priorities.
Furthermore, a distributed generation system is expected to lead to the empowerment of electricity consumers. This is where EU policymakers and legislators have placed the emphasis.19 Accordingly, consumers who invest in
self-production of electricity are autonomous and self-sufficient, in the sense that
they can satisfy at least part of their electricity needs on their own, with no need to
resort to the central grid. Thus, they gain a significant degree of independence and a
‘stronger sense of ownership and control over their [electricity] use’.20 And they
can see their electricity bills being reduced: firstly, they can benefit from grid parity,
i.e. the situation where self-generating a certain amount of electricity is
cost-competitive compared to acquiring the same amount of electricity from the
grid.21 Secondly, they can benefit from having an overall better control of their
consumption. Thanks to the sophisticated features of the smart grid and the smart

15 European Commission 2012a, p. 9. Distributed generation can also be based on combined heat and power systems, which do not belong to the renewable energy sources. Nevertheless, they are more efficient and, thus, less harmful for the environment than a traditional fossil fuel-based system.
16 Council of the European Union 2018.
17 European Commission 2012a, p. 10.
18 European Commission 2015a, p. 2.
19 European Commission 2012a, b.
20 European Commission 2015c, p. 6; European Commission 2012a, p. 9.
21 Choi 2015; European Commission 2015b.

meters, electricity consumers are in a position to receive real-time price signals and
to adjust their consumption patterns accordingly. They will reduce their con-
sumption during peak hours, when the price is high, and they will prefer to move
consumption to off-peak hours. Besides, prosumers can have an extra source of
revenue by selling the electricity they generate to the central grid; and this can be a
valuable input for the grid during high-demand hours, when the matching of
demand and supply is becoming a challenging task.
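The grid parity comparison mentioned earlier in this paragraph reduces to simple arithmetic. The following minimal Python sketch illustrates it; every price, cost and output figure is an assumption chosen for illustration, not empirical data.

```python
# Illustrative grid parity check (all figures are assumptions, not data).
# Self-generation reaches "grid parity" when its per-kWh cost does not
# exceed the retail price of the same amount of electricity from the grid.

def levelized_cost_per_kwh(investment_eur, lifetime_years, annual_output_kwh):
    """Spread the upfront investment over the system's total lifetime output."""
    return investment_eur / (lifetime_years * annual_output_kwh)

RETAIL_PRICE = 0.22   # EUR/kWh paid to the supplier (assumed)
investment = 6000     # EUR for a small rooftop PV system (assumed)
lifetime = 25         # years of operation (assumed)
annual_output = 2500  # kWh generated per year (assumed)

own_cost = levelized_cost_per_kwh(investment, lifetime, annual_output)
print(f"Self-generation cost: {own_cost:.3f} EUR/kWh")
print(f"Grid parity reached: {own_cost <= RETAIL_PRICE}")
```

Under these assumed figures, self-generation costs roughly 0.10 EUR/kWh against a retail price of 0.22 EUR/kWh, which is precisely the cost-competitiveness that the grid parity concept captures.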
Therefore, self-generation of electricity and the foregoing prosumers’ responses
to price fluctuations also serve efficiency and the network’s good condition. Indeed,
the relationship between demand and supply of electricity is more balanced,
transmission is decreased, the risk of congestion is diminished and utilities are
relieved of the costly burden of upgrading the network capacity.22 Given the above,
distributed generation can be seen as a ‘demand side management’ strategy,23 to wit
as part of a comprehensive set of policies that encourages a change in the patterns of
energy consumption.24 Moreover, the smart grid provides advanced monitoring
possibilities to the grid operators. Thus, they can have a constant and complete
picture of the load and of the electricity demand, which also results in efficiency
gains.25
In short, the facilitation of the transition to a low-carbon economy, the
empowerment of the electricity consumers and the efficiency gains have been
identified as the main positive effects of smart distributed generation systems.
Nevertheless, a shift towards distributed generation is also linked with problematic
situations. For instance, consumers who do not engage in distributed generation and in smart technologies (because they cannot afford such an investment, are not well informed or are sceptical about it) could see their electricity bills significantly rise. In this regard, utilities are expected to increase the price charged to these customers in order to offset the revenue losses incurred from the fall in electricity sales. In this sense, traditional electricity customers end up subsidising self-consumption and prosumerism.26 Similarly, the position of the vulnerable
consumers that are threatened with energy poverty could worsen. This is because in
a liberalised, decentralised electricity system states would have less discretion to
ensure the protection of vulnerable consumers through price regulation or through
prohibiting the disconnection of electricity. Besides, it is simpler for a state to
regulate one entity that provides a certain good, than a number of private actors who
simultaneously are producers and consumers of that good.
In addition, it is anticipated that data protection issues will come emphatically to the fore. As grids are becoming smarter and smart meters are becoming more

22 Umberger 2012.
23 Kakran and Chanana 2018.
24 Cutler and Morris 2006, p. 112.
25 Galera Rodrigo 2016, pp. 67–68.
26 See also Raskin 2014.

widespread, a great amount of data is collected by the electricity suppliers. This data
can reveal very sensitive personal information about customers’ routines and ways
of life, e.g. the time they go to bed or whether they leave for the weekend. Of
course, one could bring forward the counterargument that any access to personal
data requires the consent of the consumer. Still, empirical research shows that
concerns about privacy and about a possible abuse of sensitive information are
among the top reasons that impede the rapid development of smart grids.27
In spite of these disadvantages, EU policymakers seem convinced by the strong
points of a smart distributed generation electricity system.28 And they are willing to
enact legislation that removes barriers to and promotes its development. The next
section delves into the relevant EU legislation that will put this objective into
practice.

9.4 The EU Legal Order: Law in Force and the Proposal for a Directive on the Internal Electricity Market

The Union legal order has not yet adopted a straightforward legislative framework
regulating distributed generation. Nevertheless, Directive 2009/72/EC concerning common rules for the internal market in electricity, which is currently in force, does acknowledge the distributed generation trend and briefly refers to it. Accordingly, Article 2(31) of the Directive defines distributed generation as 'generation plants connected to the distribution system'. Next, Article 7 requires that national legal orders provide for special authorisation procedures for small distributed generation projects. Last, Article 25(7) states that distribution system operators shall consider distributed generation issues when planning the development of the network, and Article 36(d) requires regulators to take measures to promote distributed generation. This shows that the EU had a positive stance on distributed generation and was willing to facilitate relevant initiatives as early as the late 2000s. Nonetheless, the above provisions do not seem to suffice to create a
clear, stable and supportive legislative framework. Besides, they do not deal with
important issues that lie at the heart of a well-functioning smart distributed gen-
eration electricity system, like the rights and obligations of prosumers, the issue of
data protection etc. Still, in the course of time, the Commission has become
interested in formulating a more precise legal framework fitting the technological
advancements.

27 Hordeski 2011, pp. 260–263.
28 European Commission 2015c.

Accordingly, in 2016, and within the context of its Energy Union strategy,29 the
European Commission put forward the Winter Package, which consists of eight
legislative proposals. One of them was the proposal for a new Directive on common
rules for the internal market in electricity.30 The envisaged Directive introduces the
term ‘active customer’. More specifically, Article 2(6), as lastly, until now,
amended by the Committee on Industry, Research and Energy of the European
Parliament,31 defines the foregoing term as meaning
a final customer or a group of jointly acting final customers who consume, store or sell
electricity generated within their premises, including through aggregators or suppliers or
traders, or participate in demand response or energy efficiency schemes provided that these
activities do not constitute their primary commercial or professional activity.

Evidently, this definition reflects the already analysed concept of a prosumer. The neologism 'prosumers' is not used in the proposal for the Directive; yet, it can be found in EU soft law, as a term comprising consumers, businesses and households who are engaged in 'self-consumption', thus producing and consuming renewable energy.32 The abovementioned Article 2(6) also refers to the case of collective action and sets down the activities in which a customer shall be involved in order to be regarded as an 'active customer'. It also sets down a requirement which, rational as it is, could raise interpretative issues in the future: the activities mentioned in the definition shall not be the customer's primary commercial or professional activity. Yet, the Directive does not mention any criteria or interpretative guidelines for determining whether an activity is the primary commercial or professional activity of the customer. It can be expected that, in many disputes relating to the identification of one or more persons as active customers, the competent national courts will opt to turn to the Court of Justice of the EU and to submit a preliminary question so that the concept can be clarified.
Interestingly enough, a term similar to 'active customers' can be traced in another proposal for a Directive. More specifically, in the proposal for a new
Renewable Energy Directive,33 the Commission has put forward the concept of the
‘renewable self-consumer’. According to Article 2(aa) of the Proposal, as amended
by the Council,34
‘renewable self-consumer’ means a final customer operating within its premises located
within confined boundaries or where allowed by Member States, on other premises, who
generates renewable electricity for its own consumption, and may store and sell self-generated
renewable electricity, provided that, for non-household renewable self-consumers, those
activities do not constitute their primary commercial or professional activity.

29 European Commission 2015a.
30 European Commission 2016a.
31 European Parliament 2018.
32 European Commission 2015b, p. 2.
33 Commission, 'Proposal for a Directive of the European Parliament and of the Council on the Promotion of the Use of Energy from Renewable Sources (recast)' COM(2016) 767 final.
34 Council of the European Union 2018.

But the terms ‘active customer and ‘renewable self-consumer’ should be treated
as distinct. Indeed, a literal interpretation shows that a renewable self-consumer’s
main feature is the consumption of the self-generated electricity; and they may also
store and sell part of it. On the other hand, active customers are involved in all the
above activities, without distinguishing one of them; they may, for instance, opt to
sell all the electricity generated. The two concepts are not mutually exclusive. It
appears that a renewable self-consumer is by definition an active customer too. On
the contrary, an active customer also belongs to the category of renewable
self-consumers, if he/she generates renewable electricity and consumes part of it.
After having provided the legal definition, the next step is to adopt a new set of rules regulating the activities of this sui generis kind of active customer. This is a challenging step because of their dual nature. As argued in the analysis of prosumers, which also applies to active customers, they can be classified into the category of consumers as well as into the category of producers. The boundary between these two supposedly distinct groups is eroded, and the question of how they should be treated by the law arises.35
Since the Union legislator’s intention is to promote prosumerism, it might well
be expected that active customers have the special protection granted to consumers.
Still, the envisaged Directive for the internal electricity market does not clearly
require so. More specifically, according to Article 15(1)(a) of the proposal, as
amended by the European Parliament in 2018, Member States are required to
ensure that final customers are entitled to engage with the activities an active
customer performs, ‘without being subject to discriminatory or disproportionately
burdensome procedures and charges that are not cost reflective’.
As important as this provision might prove for the rise of more active customers, it introduces relatively broad requirements that are open to different interpretations, and it does not clarify what the legal position of active customers should be. Accordingly, one could wonder, for instance, whether a national law treating active customers as producers and not as consumers would be discriminatory and, hence, not in conformity with the Directive and EU law. Interestingly, Article 15(2)(c) of the proposal explicitly requires that Member States ensure that active customers owning a storage facility 'are distinguished from generators and not subject to related licensing requirements and fees'. Therefore, active customers owning a storage facility cannot be subject to requirements that are applicable to regular producers (e.g. permit requirements, monitoring procedures etc.). However, this special mention seemingly entails that Member States are allowed to treat other active customers as producers and to impose strict requirements on them. It is not evident what differentia justifies a dissimilar treatment of the two groups of active customers. Accordingly, it is not evident why the Union legislator has decided to introduce such a provision as Article 15(2)(c) of the proposal without expanding it to apply equally to all active customers. Within the legislative

35 Jacobs 2017; Lavrijssen 2017.

framework put forward, the legal position of an active customer will possibly be
significantly different from Member State to Member State.
Next, according to Article 15(1)(b), Member States are required to ensure that the
network charges imposed on final customers are cost-reflective, non-discriminatory
and transparent, accounting separately for the electricity fed into the grid and the
electricity consumed. It should be noted that EU soft law had already emphasised the
interrelation between network charges and distributed generation. In this respect, in
the ‘Best practices on Renewable Energy Self-consumption’ Staff Working
Document, the Commission reviewed the effects that network charges might have
over distributed generation. It was stated that a volumetric pricing, which is depen-
dent upon the actual usage of electricity, is expected to encourage distributed gen-
eration, but it can prove disproportionally costly for traditional consumers. On the
other hand, a capacity-based, non-volumetric pricing is also disproportionate because
it equally burdens heavy and light users; and it does not encourage efficiency. The
Commission seems to support ‘hybrid models of combining both capacity and vol-
umetric tariffs’.36 Of course, the decision about the specific design of such models
belongs to Member States. This is because the EU has only provided these specific
guidelines for network pricing through soft law instruments that are not legally
binding for Member States.
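The trade-off between volumetric, capacity-based and hybrid network tariffs can be made concrete with a minimal sketch. All rates, connection sizes and consumption figures below are assumptions chosen purely for illustration and do not reflect any actual tariff design.

```python
# Illustrative comparison of network tariff structures (all figures assumed).
# A prosumer buys fewer kWh from the grid, so a purely volumetric charge
# shifts network costs onto traditional consumers, while a purely
# capacity-based charge ignores actual usage and discourages efficiency.

def volumetric(kwh_from_grid, rate=0.06):    # EUR per kWh (assumed)
    return kwh_from_grid * rate

def capacity(connection_kw, rate=40.0):      # EUR per kW per year (assumed)
    return connection_kw * rate

def hybrid(kwh_from_grid, connection_kw, vol_rate=0.03, cap_rate=20.0):
    # A 'hybrid model combining both capacity and volumetric tariffs'.
    return volumetric(kwh_from_grid, vol_rate) + capacity(connection_kw, cap_rate)

for label, kwh in [("traditional consumer", 4000), ("prosumer", 1000)]:
    print(f"{label}: volumetric {volumetric(kwh):.0f} EUR, "
          f"capacity {capacity(10):.0f} EUR, hybrid {hybrid(kwh, 10):.0f} EUR")
```

Under these assumptions, both households have the same 10 kW grid connection, yet the prosumer contributes far less under a purely volumetric tariff; the hybrid variant narrows that gap while still rewarding lower consumption.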
Furthermore, Article 15(2) states that 'the energy installation required for the activities of the active customer may be managed by a third party for installation, operation, including metering and maintenance provided that the economic risk connected to the operation of the installation remains with the active customer'.
The Proposal under scrutiny also refers to the issue of local energy communities.
This concept is defined in Article 2(7):
‘local energy community’ means an association, a cooperative, a partnership, a non-profit
organisation, SME or other legal entity which is based on voluntary and open participation
and is effectively controlled by local shareholders or members, the predominant aim of
which is to provide local environmental, economic or social community benefits for its
members or the local area or areas where it operates rather than where it generates profits,
and which is involved in activities such as distributed generation, storage, supply, provision
of energy efficiency services, aggregation, electro-mobility and distribution system opera-
tion, including across borders.

Next, Article 16 of the Proposal aims to ensure that local energy communities
will be entitled to be established and to access markets. It also requires that they are
subject to a non-discriminatory treatment with regard to their activities, rights and
obligations and to fair, proportionate and transparent procedures and cost reflective
charges. Still, contrary to Article 15, Article 16(1) and (2) contain a relatively detailed list of the rights and obligations of local energy communities. Consequently, the discretion of Member States in this area is comparatively more limited.

36 European Commission 2015b, p. 8.

Apart from the above issues, the new Directive also intends to ensure that Member
States will implement smart metering systems. This is because the further evolution
of a distributed generation electricity system is regarded as closely linked with the
empowerment of consumers. And this empowerment is in turn linked with their being well informed.37
States implement smart metering systems ‘where smart metering is positively
assessed as a result of cost-benefit assessment […] or systematically rolled out’. It
also sets down certain principles: consumers shall be provided with accurate infor-
mation about their consumption and with appropriate advice and information when
installing a smart meter; the smart metering systems and the data communication
shall be secure on the basis of the best available techniques; the consumers’ privacy
and data shall be protected. Further, according to Article 21, even if the smart
metering cost-benefit assessment is negative or smart metering is not systematically
rolled out, Member States shall ensure that final customers are still entitled to have
smart meters installed on request, and under fair and reasonable conditions.
The Proposal also refers to the data protection issue that accompanies smart grids
and smart metering. In this respect, Article 23 states that Member States shall
specify the eligible parties which may have access to data of the final customers
with their explicit consent in accordance with the General Data Protection
Regulation 2016/679. Moreover, it is clarified that this group of eligible parties
shall include at least ‘customers, suppliers, transmission and distribution system
operators, aggregators, energy service companies, and other parties which provide
energy or other services to customers’. According to Article 23(4), ‘no additional
costs shall be charged to final customers for access to their data or for a request to
transfer their data’. Member States are also required by Article 23(2) to organise the
secure management of data with the aim to attain efficient data access and
exchange, data protection, data security, transparency, neutrality and data integrity.
Article 23(3) requires that Member States authorise, certify and monitor the parties
involved in data management. Within this legislative framework, it is suggested that distribution system operators, whose main task is to ensure the safety and reliability of the electricity system, will see their role upgraded and will take up key duties in data management and data protection.38
In conclusion, the Proposal for a new Directive for the internal electricity market initiates the development of a Union legislative framework fitting technological innovation in the field of electricity. The Proposal sets down relatively clear requirements on certain issues, like the implementation of smart meters or data protection. Nevertheless, it is rather vague regarding the legal position of active customers, which is a crucial issue. The use of broad legal concepts in Article 15(1)(a), to wit 'discriminatory or disproportionately burdensome procedures', leaves a large degree of discretion to Member States when formulating a national legal framework for active customers. The enactment of more detailed provisions, similar to those

37 European Parliament 2018, recital 8.
38 Lavrijssen and Carrillo 2017, ch. 4.2.

regulating local energy communities or even active customers owning a storage facility, would reduce the chance of important discrepancies in the implementation of the Directive into national legal orders. Therefore, the new Directive will provide an important starting point, but it remains to be seen how the Member States will implement EU law and adapt to the new reality.

9.5 The Special Issue of Net Metering

Net metering is the most commonly employed model for the promotion of dis-
tributed generation. It involves compensating prosumers for feeding the electricity
they produce into the grid instead of consuming it. The USA is a pioneer in net metering,39 but quite a few EU Member States have established their own national
programs too.40 Nevertheless, while the EU policymakers have recognised the
advantages of distributed generation, the relevant legislation and legislative pro-
posals do not contain any provisions regarding net metering. This section examines
the characteristics of net metering schemes and the reasons lying behind the
decision of the EU policymakers and the EU legislator not to rely on net metering in
order to promote distributed generation.
In net metering schemes, prosumers can feed excess electricity they produce into
the grid and, in exchange, they receive a credit on their bill for the quantity
delivered.41 In other words, prosumers ‘can offset their electricity purchases from
the grid with energy generated behind the retail meter, such as from rooftop solar
panels’.42 In a bright wording, ‘net metering essentially allows the prosumers to use
the electricity network as a type of virtual temporary storage’.43
In net metering, the credit offered is equal to the retail rate. Nevertheless, were it not for net metering, a utility would purchase electricity at the wholesale price, which is significantly lower. Besides, distributed generation electricity is not worth purchasing at a price as high as the retail rate, because it is 'variable and unavailable for substantial periods of time'.44 Consequently, utilities pay a lot for what they could have purchased for less money, while investors in distributed generation receive compensation that is higher than the real value of what they offer. Obviously, such schemes provide a strong incentive for natural and legal persons to become involved in distributed generation.
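The overcompensation at the heart of this incentive reduces to simple arithmetic. The following is a minimal illustrative sketch; all rates and volumes are assumptions chosen for illustration only.

```python
# Illustrative net metering arithmetic (all figures are assumptions).
# The prosumer is credited at the retail rate for electricity fed into
# the grid, although the utility could have bought the same amount at
# the (much lower) wholesale price.

RETAIL_RATE = 0.22     # EUR/kWh charged to end customers (assumed)
WHOLESALE_RATE = 0.05  # EUR/kWh on the wholesale market (assumed)
fed_in_kwh = 1200      # surplus fed into the grid over a year (assumed)

credit_to_prosumer = fed_in_kwh * RETAIL_RATE    # 264.00 EUR
avoided_purchase = fed_in_kwh * WHOLESALE_RATE   # 60.00 EUR

# The gap is borne by the utility and, ultimately, passed on to
# traditional customers.
implicit_subsidy = credit_to_prosumer - avoided_purchase
print(f"Credit granted:   {credit_to_prosumer:.2f} EUR")
print(f"Wholesale value:  {avoided_purchase:.2f} EUR")
print(f"Implicit subsidy: {implicit_subsidy:.2f} EUR")
```

On these assumed figures, every kilowatt-hour fed in is compensated at more than four times its wholesale value, which is the overcompensation the criticism discussed below targets.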

39 Rossi 2016; Powers 2012.
40 European Commission 2016b, pp. 140–142.
41 Jacobs 2017.
42 Raskin 2013.
43 Butenko 2016, p. 710.
44 Raskin 2014, p. 270.

On the other hand, the opportunity for windfall profits can result in overinvestment, which can produce negative results. The 'death spiral' scenario is a characteristic example.45
The ‘death spiral’ theory suggests that the more distributed generation and net
metering expand, the less electricity utilities sell, hence, the less profit they earn.
Moreover, the total amount of credit granted to net metering consumers increases.
Consequently, the total revenue of utilities diminishes, which means that they end up with fewer resources to cover fixed costs, like costs for the maintenance of the infrastructure, payroll payments etc.46 It is forecast that, in response, utilities will increase prices, to the detriment of the customers who are not engaged in distributed generation. This increase in electricity bills will, in turn, trigger a new wave of investments in distributed generation. In short, a vicious cycle is created.47 Utilities will be constantly raising prices and, finally, only those who cannot afford to invest in distributed generation will remain loyal customers. And of course they will not be in a position to afford the utilities' bills either. As a result, utilities will not be able to recover their fixed costs and they will founder. Such a collapse would be calamitous for the electricity system as a whole, because, in spite of the tremendous rise of distributed generation, utilities would still have a role in ensuring the network's reliability by supplying temporary backup electricity.48 Of course, the 'death spiral' scenario is rather a dystopian prediction, which is unlikely to occur soon and unexpectedly. The massive prevalence of an innovation like distributed generation cannot but take time, if it ever reaches the absolute climax the 'death spiral' theory predicts.49
Nevertheless, ‘death spiral’ is not the only point of criticism that net metering
schemes have experienced. Perhaps more importantly, efficiency issues are raised,
which put the overall net metering schemes’ viability at risk. Offering a bountiful
compensation for the electricity excess that is fed into the grid can motivate scores
of investors to partake in distributed generation. Such an overinvestment means that
utilities are going to need more and more resources in order to be able to keep
offering net metering services. But resources are not abundant; after a certain point,
it is forecasted that they will be depleted and it will be harder and harder to grant net
metering credit.
Such a regulatory failure is analogous to what has already occurred with feed-in
tariffs in a good few Member States, such as Germany, Greece, Portugal and
Spain.50 Feed-in tariffs are long-term contracts that oblige the grid operators to
purchase the renewable electricity generated at a guaranteed fixed price, which is

45 Graffy and Kihm 2014.
46 Jacobs 2017.
47 Rule 2014–15.
48 Rule 2014–15.
49 Raskin 2014.
50 Iliopoulos 2016.

higher than the market price.51 Feed-in tariffs have offered a particularly lucrative
business opportunity that attracted many investors and, hence, they have resulted in
a rapid boost of renewables in the EU. Nevertheless, overcompensation has resulted
in overinvestment, because of which regulators finally could not abide by the
contracts; they had to make unilateral adjustments to contractual terms, like
imposing de-escalating tariffs, reducing the contracts’ duration, etc.52 For instance,
Germany, Greece and Spain decided to reduce the fixed prices and imposed an
upper limit for renewable energy projects that would receive support in the form of
feed-in tariffs.53 At the same time, electricity consumers were seeing an increase in their electricity bills, because grid operators and electricity suppliers were passing the extra costs incurred on to them. What was 'politically attractive in its early stage quickly became a regulatory and political quagmire'.54 In response, since the early 2010s many Member States have abandoned the feed-in tariff schemes and have turned to more market-based support schemes.55 For instance, Cyprus, Germany, Greece, Spain etc. have implemented premium tariff support schemes that adjust to market price fluctuations and prevent overcompensation.56
The foregoing negative experience with the feed-in tariff schemes explains to a large extent the sceptical stance of the EU towards net metering. Feed-in tariffs can be compared to net metering in the sense that both systems involve a guarantee that the electricity generated will be purchased, at prices exceeding its market value. Because of these characteristics, it is argued that the economic effects of the two schemes 'are not very different'.57 They both entail overcompensation, which leads to overinvestment and, in the end, to inflated prices for the electricity consumers who are not engaged in the support scheme.
Accordingly, net metering is regarded as a useful starting point in the course of distributed generation development, but as a non-viable long-term support policy. The Commission has acknowledged that net metering is 'effective to jump-start' distributed generation, because it is 'attractive and easy to apply and to understand'.58 But it has also been highlighted that net metering is not an appropriate instrument once distributed generation development rises. Given the above, it comes as no surprise that EU soft law advises Member States to 'prefer self-consumption schemes over net metering' and to limit net metering 'to phase-in periods'.59 And, of course, the

51 Sáenz de Miera and Muñoz Rodríguez 2015; Sioshansi 2016; Atmaca and Lojodice 2014; Schaffer and Bernauer 2014.
52 Johnston and Block 2012; Prest 2012.
53 Iliopoulos 2016.
54 Raskin 2013, p. 51.
55 European Commission 2013, pp. 12–13.
56 Iliopoulos 2016.
57 Raskin 2014.
58 European Commission 2015b, p. 10.
59 European Commission 2015b, p. 12.

Commission decided not to include any reference to net metering in the proposal for a
new Directive for the internal electricity market.
Nonetheless, the European Parliament in its amendments introduced the concept
of ‘virtual net metering’. Virtual net metering schemes function similarly to regular
net metering schemes, but they relate to energy communities, since they involve the
distribution of the bill credits from the electricity generated by an energy com-
munity among its members.60 In this regard, Article 16a of the Proposal allows local energy communities to apply 'virtual net metering schemes' in order to share the electricity generated among their members. Article 16a creates an interesting law and policy situation. While soft law states that Member States should avoid net metering schemes, the envisaged hard law instrument refers to virtual net metering as an exemplary means of sharing electricity within a local energy community. If the new Directive is enacted and Article 16a remains as it is, soft law should be updated and the Commission should either reconsider its negative stance towards regular net metering or clarify the reasons why only virtual, and not regular, net metering schemes are appropriate for distributed electricity systems.

9.6 Conclusion

'Life moves quickly, regulation less so'.61 And technological progress has brought about pivotal developments in the field of electricity. The rise of distributed generation and decentralisation, the emergence of the special group of prosumers and the coming of smart grids and smart meters all set the scene for a new electricity system.
Focusing on the EU legal order, it is important that EU law has started taking notice of the newly emerging phenomena. Accordingly, soft law and the Winter Package set rules specifically applicable to prosumers and to 'active customers'. The EU legislator has thus put forward a Proposal for a new internal electricity market Directive and intends to set down certain fundamental rules that will guide Member States in the formulation of national distributed generation-related legislation. However, it is noted that the Proposal's requirements determining active customers' legal position are rather broadly formulated. This leaves a wide margin of discretion to national legislators when determining the rights and obligations of active customers. Interestingly, the proposal contains more detailed requirements when it comes to active customers owning storage facilities or to local energy communities.
Furthermore, this chapter has also noted a hesitancy towards net metering, which is regarded as inefficient. Besides, the Commission has explicitly revealed in soft law its preference for a development of distributed generation based on

60 Augustine and McGavisk 2016.
61 Jacobs 2017, p. 577.

self-consumption. However, since the reservations towards net metering are hardly traced in hard law or in the proposal for the internal electricity market Directive, it is unknown to what extent they will exert persuasive power over Member States. Besides, the European Parliament's amendments explicitly state that local energy communities can apply virtual net metering schemes.
In conclusion, it seems that the merits of promoting distributed generation are becoming widely accepted. Accordingly, we are facing a new internal electricity market Directive that will set down certain rules but will also leave wide discretion to the national legal orders. It is therefore hard to speculate how distributed generation will develop and what the future electricity model will look like. It will depend on the content of the relevant laws, but mostly on consumers' reactions to the new possibilities offered. If consumers prove reluctant to engage in distributed generation, it is probable that the future electricity model will take after its conventional ancestor, in the sense that it will simply add several prosumer islands to the central grid. But if consumers make use, on a large scale, of the technological developments and of the rights that legislation grants them, electricity systems can radically transform into systems where prosumers prevail, while central grids have only a marginal role.62 For the moment, it looks a lot like a 'learning-by-doing' process for the EU, for Member States and for consumers.

References

Armeni C (2015) Global Experimentalist Governance, International Law and Climate Change
Technologies. International & Comparative Law Quarterly 64:875–904
Atmaca N, Lojodice I (2014) The Impact of Support Schemes on RES Installations and Retail
Electricity Prices. Renewable Energy Law and Policy 5:67–78
Augustine P, McGavisk E (2016) The next big thing in renewable energy: Shared solar. The
Electricity Journal 29:36–42
Brownsword R, Somsen H (2009) Law, Innovation and Technology: Before We Fast Forward - A
Forum for Debate. Law, Innovation and Technology 1:1–73
Butenko A (2016) Sharing Energy: Dealing with Regulatory Disconnection in Dutch Energy Law.
European Journal of Risk Regulation 7:701–716
Carvallo A, Cooper J (2006) The Advanced Smart Grid: Edge Power Driving Sustainability.
Artech House, Boston
Choi DG et al (2015) Is the concept of ‘grid parity’ defined appropriately to evaluate the
cost-competitiveness of renewable energy technologies? Energy Policy 86:718–728
Council of the European Union (2018) Proposal for a Directive of the European Parliament and of
the Council on the promotion of the use of energy from renewable sources – Analysis of the
final compromise text with a view to agreement. 2016/0382 (COD), 21 June 2018. https://eur-
lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_10308_2018_INIT&qid=
1530116593165&from=EN. Last accessed 22 August 2018
Cutler CJ, Morris CG (2006) Dictionary of Energy. Elsevier, London
Einav L et al (2016) Peer-to-Peer Markets. Annual Review of Economics 8:615–635

62 Elliott 2017; Jacobs 2017; Parag and Sovacool 2016; Miller et al. 2013.

Elliott D (2017) Energy futures: New approaches to energy choices. In: Leal-Arcas R, Wouters J
(eds) Research Handbook on EU Energy Law and Policy. Edward Elgar, Cheltenham, pp 486–
500
European Commission (2012a) Communication from the Commission to the European Parliament,
the Council, the European Economic and Social Committee and the Committee of the Regions.
Renewable Energy: a major player in the European energy market. COM(2012) 271 final.
https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52012DC0271&from=EN.
Last accessed 22 August 2018
European Commission (2012b) Communication from the Commission to the European Parliament,
the Council, the European Economic and Social Committee and the Committee of the Regions.
Making the internal energy market work. COM(2012) 663 final. https://eur-lex.europa.eu/legal-
content/EN/TXT/PDF/?uri=CELEX:52012DC0663&from=EN. Last accessed 24 August 2018
European Commission (2013) Commission Staff Working Document. European Commission
guidance for the design of renewables support schemes Accompanying the document
Communication from the Commission Delivering the internal market in electricity and making
the most of public intervention. SWD(2013) 439 final. https://ec.europa.eu/energy/sites/ener/
files/documents/com_2013_public_intervention_swd04_en.pdf. Last accessed 10 September
2018
European Commission (2015a) Communication from the Commission to the European Parliament,
the Council, the European Economic and Social Committee, the Committee of the Regions and
the European Investment Bank. A Framework Strategy for a Resilient Energy Union with a
Forward-Looking Climate. COM(2015) 80 final. https://eur-lex.europa.eu/resource.html?uri=
cellar:1bd46c90-bdd4-11e4-bbe1-01aa75ed71a1.0001.03/DOC_1&format=PDF. Last acces-
sed 22 August 2018
European Commission (2015b) Commission Staff Working Document. Best practices on
Renewable Energy Self-consumption. SWD(2015) 141 final. https://eur-lex.europa.eu/legal-
content/EN/TXT/PDF/?uri=CELEX:52015SC0141&qid=1535102482843&from=EN. Last
accessed August 2018
European Commission (2015c) Communication from the Commission to the European Parliament,
the Council, the European Economic and Social Committee and the Committee of the Regions.
Delivering a New Deal for Energy Consumers. COM(2015) 339 final. https://eur-lex.europa.
eu/legal-content/EN/TXT/PDF/?uri=CELEX:52015DC0339&qid=1535129210449&from=EN
. Last accessed 22 August 2018
European Commission (2016a) Proposal for a Directive of the European Parliament and of the
Council on Common Rules for the Internal Market in Electricity (recast). COM(2016) 864
final. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=COM:2016:0864:FIN. Last acces-
sed 22 August 2018
European Commission (2016b) Impact Assessment accompanying the document Proposal for a
Directive of the European Parliament and of the Council on the promotion of the use of energy
from renewable sources (recast). SWD(2016) 418 final. https://eur-lex.europa.eu/resource.
html?uri=cellar:1bdc63bd-b7e9-11e6-9e3c-01aa75ed71a1.0001.02/DOC_2&format=PDF.
Last accessed 27 August 2018
European Parliament (2018) Report on the proposal for a directive of the European Parliament and
of the Council on common rules for the internal market in electricity (recast). A8-0044/2018.
http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&mode=XML&reference=A8-
2018-0044&language=EN. Last accessed 22 August 2018
Galera Rodrigo S (2016) Changing the Energy Model: Step Back on the Europe 2050 Strategy?
European Energy and Environmental Law Review 2:65–72
Graffy E, Kihm S (2014) Does Distributive Competition Mean a Death Spiral for Electric Utilities?
Energy Law Journal 35:1–44
Grossman PZ (2003) Is Anything Naturally a Monopoly? In: Cole DH, Grossman PZ (eds) The
End of a Natural Monopoly: Deregulation and Competition in the Electric Power Industry.
Taylor & Francis, London, pp 9–38

Hamilton B et al (2011) The Customer Side of the Meter. In: Sioshansi FP (ed) Smart Grid:
Integrating Renewable, Distributed and Efficient Energy. Elsevier, Oxford, pp 397–418
Hopp C et al (2018) Disruptive Innovation: Conceptual Foundations, Empirical Evidence, and
Research Opportunities in the Digital Age. Journal of Product Innovation and Management
35:446–457
Hordeski MF (2011) Megatrends for Energy Efficiency and Renewable Energy. The Fairmont
Press, Lilburn
Horton JJ, Zeckhauser RJ (2016) Owning, Using and Renting: Some Simple Economics of the
“Sharing Economy”. The National Bureau of Economic Research Working Paper No. 22029.
http://www.nber.org.ezproxy.ub.unimaas.nl/papers/w22029.pdf. Last accessed 22 August 2018
Iliopoulos T (2016) Renewable Energy Regulation: Feed-in Tariffs Schemes Under Recession
Conditions? European Networks Law and Regulation Quarterly 4:110–117
Jacobs SB (2017) The Energy Prosumer. Ecology Law Quarterly 43:519–579
Johnston A, Block G (2012) EU Energy Law. Oxford University Press, Oxford
Kakran S, Chanana S (2018) Smart operations of smart grids integrated with distributed
generation: A review. Renewable and Sustainable Energy Reviews 81:524–535
Lavrijssen S (2017) Power to the Energy Consumers. European Energy and Environmental Law
Review 6:172–187
Lavrijssen S, Carrillo A (2017) Radical Innovation in the Energy Sector and the Impact on
Regulation. TILEC Discussion Paper No. 2017-017. https://papers.ssrn.com/sol3/papers.cfm?
abstract_id=2979206. Last accessed 28 August 2018
Miller CA et al (2013) The Social Dimensions of Energy Transitions. Science as Culture 2:135–
148
Olkkonen L et al (2017) Redefining a Stakeholder Relation: Finnish Energy “Prosumers” as
Co-producers. Environmental Innovation and Societal Transitions 24:57–66
Parag Y, Sovacool BK (2016) Electricity Market Design for the Prosumer Era. Sussex Research
Online. http://sro.sussex.ac.uk/60088/. Last accessed 28 August 2018
Powers M (2012) Small Is (Still) Beautiful: Designing U.S. Energy Policies to Increase Localized
Renewable Energy Generation. Wisconsin International Law Journal 30:595–667
Prest J (2012) The Future of Feed-In Tariffs: Capacity Caps, Scheme Closures and Looming Grid
Parity. Renewable Energy Law and Policy Review 1:25–41
Raskin D (2014) Getting Distributed Generation Right: A Response to “Does Distributive
Competition Mean a Death Spiral for Electric Utilities?” Energy Law Journal 35:263–282
Raskin DB (2013) The Regulatory Challenge of Distributed Generation. Harvard Business Law
Review Online 4:38–51
Rasool F et al (2018) A framework for disruptive innovation. Foresight 20:252–270
Rossi J (2016) Federalism and the Net Metering Alternative. The Electricity Journal 1:13–18
Rule TA (2014–15) Solar Energy, Utilities, and Fairness. San Diego Journal of Climate &
Energy Law 6:115–148
Sáenz de Miera G, Muñoz Rodríguez MA (2015) EU Policies and Regulation on CO2,
Renewables and Energy Efficiency: A Critical Assessment of Recent Experiences. In:
Ansuategi A, Delgado J, Galarraga I (eds) Green Energy and Efficiency. Springer International
Publishing, Switzerland, pp 17–59
Schaffer L, Bernauer T (2014) Explaining Government Choices for Promoting Renewable Energy.
Energy Policy 68:15–27
Sioshansi R (2016) Retail Electricity Tariff and Mechanism Design to Incentivize Distributed
Renewable Generation. Energy Policy 95:498–508
Stein AL (2014) Reconsidering Regulatory Uncertainty: Making a Case for Energy Storage.
Florida State University Law Review 41:697–766
Toffler A (1971) Future Shock. Bantam Books, New York
Tomain JP (2012) Smart Grid, Clean Energy and US Policy. Competition and Regulation in
Network Industries 13:187–212

Umberger A (2012) Distributed Generation: How Localized Energy Production Reduces Vulnerability to Outages and Environmental Damage in the Wake of Climate Change. Golden Gate University Environmental Law Journal 6:183–213
v. Schönfeld M, Wehkamp N (2018) Big Data and Smart Grid. In: Hoeren T, Kolany-Raiser B
(eds) Big Data in Context. Springer Open, Cham, pp 93–106
Wunderlich P (2013) Green Information Systems in the Residential Sector. Springer-Verlag, Berlin/Heidelberg
Zame KK et al (2018) Smart Grid and Energy Storage: Policy Recommendations. Renewable and
Sustainable Energy Reviews 82:1646–1654

Theodoros G. Iliopoulos is a Doctoral Researcher in Energy and Environmental Law at Hasselt University.
Part IV
The Data in New Technologies—The Utilization of Data and the Protection of Personal Data
Chapter 10
A Public Database as a Way Towards
More Effective Algorithm Regulation
and Transparency?

Florian Wittner

Contents

10.1 Introduction...................................................................................................................... 176
10.2 Transparency in the GDPR ............................................................................................. 177
10.2.1 Fundamental Importance and Elements ............................................................. 177
10.2.2 Algorithmic Transparency .................................................................................. 178
10.3 The Need for Public Transparency ................................................................................. 181
10.3.1 For Data Protection in General .......................................................................... 181
10.3.2 For Algorithms in Particular .............................................................................. 182
10.4 A Public Database of Algorithms ................................................................................... 183
10.4.1 Role Models ....................................................................................................... 184
10.4.2 Details ................................................................................................................. 187
10.5 Conclusion ....................................................................................................................... 190
References .................................................................................................................................. 191

Abstract The increasing usage of algorithmic decision-making (ADM) systems
has led to new and partially urgent challenges for the law, specifically in the field of
data protection. Decisions made by (classic and “intelligent”) algorithms can make
people feel powerless and the underlying opaqueness makes it hard to understand
the reasons for a specific decision. This also increases the danger of discriminatory results, as reconstructing whether decisions were (indirectly) based on forbidden characteristics becomes increasingly hard. Especially on the private market, consequences
for individuals and society as a whole can be problematic. Much discussion has
revolved around the question of how to achieve more transparency to increase
regulation and allow accountability for those using ADM systems. These discus-
sions mostly focus on transparency-enhancing instruments the General Data

F. Wittner (✉)
Department of Law, Hans-Bredow Institute for Media Research at the University of
Hamburg, Rothenbaumchaussee 36, 20148 Hamburg, Germany
e-mail: f.wittner@hans-bredow-institut.de


Protection Regulation (GDPR) offers. While the GDPR offers a promising array of
such instruments for data subjects and public authorities, specific instruments for
public transparency are missing. The chapter discusses the notion of a public
database that gives graduated access to information concerning ADM systems used
by companies, allowing analyzing algorithms’ consequences and enabling indi-
viduals to make more informed decisions. Allowing such access would make it
necessary to consider affected companies’ justified interests but could further
overall societal trust and acceptance while increasing control. The contribution tries
to analyze how some of the GDPR’s provisions (such as Articles 20 and 35) can
help with this endeavor, draw comparisons to similar regulatory approaches in other
areas (such as Environmental Law) and make specific recommendations for action.


Keywords Algorithmic decision-making · Transparency of Algorithms · Data Protection · Public Scrutiny · GDPR

10.1 Introduction

Transparency has always been an important tool for Data Protection laws, which traditionally follow and protect the individual's autonomy and awareness when it comes to his or her personal data.1
intended or already performed acts of data processing as well as the parties involved
is necessary to give consent or object to a specific act. For Data Protection Agencies
(DPAs), tasked with both enforcing regulations and helping data controllers act in
compliance with them,2 transparency is important as well; because one goal of Data
Protection is to prevent harmful actions and decisions resulting from findings and
insights gained through data processing,3 particularly risky acts of processing need
to be identified and singled out pre-emptively in order to take the necessary mea-
sures and prevent damages for data subjects.4
This general need for transparency is only amplified when it comes to acts of algorithmic decision-making (ADM). Here, algorithms use personal data to assess
creditworthiness or make predictions about the future behaviour of affected data
subjects and take corresponding decisions. Because the act of processing and the
decision made on its basis fall closer together and because the nature and workings
of these opaque algorithms oftentimes remain unclear for everyone except their

1 Article 29 Data Protection Working Party 2016, pp. 4, 5.
2 Article 57 GDPR.
3 Bull 2015, pp. 24, 25; cf. also Marsch 2018, pp. 203 et seq. for a methodical outline of the instrumental and accessory nature of the fundamental law of Data Protection in Article 8 EU Charter of Fundamental Rights (CFR).
4 Natural persons whose data are being processed, Article 4(1) GDPR.

developers and end users (and sometimes even for the latter), regulating the usage
of such algorithms necessarily means putting an increased focus on transparency.
While the GDPR,5 freshly implemented on May 25 2018 after a transitional
period of almost two years, puts its focus on transparency towards data subjects and
DPAs, I want to make the case for more transparency towards the public. In order to
do that, I will summarize the importance of transparency for data protection, ana-
lysing why public transparency is important and to what extent the GDPR already
offers instruments to achieve it. Subsequently I want to discuss the notion of a
public database of commercially used algorithms as a possible way of achieving
this kind of transparency and show how some of the GDPR’s provisions might help
with this endeavour.

10.2 Transparency in the GDPR

As described above, transparency, when it comes to the processing of personal data, is a fundamental cornerstone of Data Protection laws and of the GDPR. Its value spans different principles and applies in general to all acts of processing, with differing manifestations for specific processing types.

10.2.1 Fundamental Importance and Elements

First and foremost, the general principles relating to processing of personal data in
Article 5(1)6 explicitly include the principle of transparency, claiming that personal
data shall be processed “in a transparent manner in relation to the data subject”.
Recital 397 further defines this principle as a guarantee for data subjects to always
know that data concerning them are being collected and used, by whom and for
which purpose. It also states that information and communication given in relation
to the processing of data is to be “easily accessible”, “easy to understand” and
conveyed using “clear and plain language”.8

5 Regulation on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation); all subsequent articles cited without further description are the GDPR's.
6 Basic benchmarks for the lawfulness of processing acts; even though they are being concretized and operationalized through other, more specific provisions, violating them can already render a processing unlawful, cf. Paal and Pauly 2018, Article 5 para 11; Rossnagel 2018, 340.
7 Recitals are non-binding provisions that serve the purpose of giving guidance for the interpretation of the GDPR's provisions.
8 See also Article 12(1).

Apart from this explicit principle, transparency towards data subjects also serves
as one of the underlying factors of other principles such as purpose limitation and
fairness and as a fundamental prerequisite for informed consent, arguably the most
important ground for lawfulness of processing acts according to Article 6(1)(a).
Allowing the data subject to understand the fundamental characteristics of a pro-
cessing act, to foresee the intended purposes and thus to anticipate possible personal
detriments following from the act is crucial. In addition, the subject rights in
Articles 15–21 can only fulfil their purpose if data subjects have sufficient
knowledge about their case. As Recital 39 states: “natural persons should be made
aware of risks, rules, safeguards and rights in relation to the processing of personal
data”.
The GDPR puts further, though not explicit, emphasis on transparency towards
DPAs. These agencies, tasked with monitoring and enforcing the provisions of the
regulation9 and therefore with ensuring the lawfulness of processing acts, necessarily
rely on insights into controllers’ internal procedures and workflows and the
way they handle personal data. This results in investigative powers10 that DPAs can
utilize when investigating concrete cases, but also in pre-emptive controller obligations
such as the performance of Data Protection Impact Assessments11 or notifications in
cases of data breaches.12 Through the encouragement of regulated self-regulation in
the form of codes of conduct and certification schemes,13 the GDPR also tries to
develop more efficient control mechanisms while simultaneously allowing con-
trollers more freedom and giving them guidance when it comes to possible ways of
becoming and staying compliant even with evolving technologies.

10.2.2 Algorithmic Transparency

While the GDPR does not include any provisions that explicitly address the
transparency of algorithms and their usage, some of them were indeed drafted with
this scenario in mind while others are at least indirectly able to promote algorithmic
transparency.14

9
See Article 57(1)(a).
10
Laid down in Article 58.
11
Laid down in Article 35.
12
Article 33.
13
Laid down in Articles 40 et seq.
14
The risk-based approach of the GDPR, tying the scope of obligations and rights to the risk
connected to a specific act of processing, allows—at least in theory—for handling new problems
and technical scenarios like the usage of ADM systems by acknowledging their respective risks,
therefore avoiding the constant need for adapting existing laws or creating new ones.

10.2.2.1 Towards Data Subjects

The provisions primarily concretizing the aforementioned principle of transparency
towards data subjects can be found in Articles 13 et seq. Here, Articles 13 and 14
determine the general controller obligation to inform data subjects about the most
important details of an act of processing.15 The provisions give an extended list of
such details, including the purpose and legal basis for the processing, the identity
and contact details of the controller and possible third party recipients. The norms
also explicitly declare that information must be given out at the time when the data
are obtained or as soon as possible afterwards—in the cases of direct or indirect
collection, respectively. Article 15 offers data subjects a corresponding right of
access, allowing them to ask for the disclosure of their data and all relevant
information pertaining to them at any given time.
The scope of application for these provisions is very wide and includes every
possible act of data processing. On the other hand, they do not on their own offer
detailed information about the usage of algorithms and the way they work, were
developed and trained.16 More specific obligations do arise when the scenario
described in Article 22 is concerned. According to this provision, automated
individual decision-making based on personal data and at the expense of data
subjects is generally prohibited.17 Examples of such decisions are the automatic
refusal of online credit applications or e-recruiting practices.18 While Article 22(2)
offers wide-reaching exemptions from this prohibition,19 the existence of such an
automated system still leads to the extra obligations laid down in Article 22(3).
More importantly, it opens up the applicability of Articles 13(2)(f), 14(2)(g), 15(1)
(h), extending the scope of the information obligation to the existence of an
automated decision-making system and to “meaningful information about the logic
involved, as well as the significance and the envisaged consequences” for the data
subject. Opinions on how far this “right to an explanation”20 of algorithmic
decisions reaches, and whether the provisions grant one at all, range from one extreme21 to
the other.22 A sentiment that all parties involved share is the existence of a certain

15
While Article 13 relates to the collection of data directly from the subject, Article 14 covers
instances in which data were collected from third parties.
16
The list of information in Article 13 is exhaustive.
17
Cf. Mendoza and Bygrave 2017, pp. 9 et seq. on interpreting Article 22(1) as a prohibition and
not a subject right to object.
18
Recital 71.
19
Most notably in the case of an explicit consent by the data subject.
20
Wachter et al. 2017, p. 17.
21
Paal and Pauly 2018, Article 35 para 29, arguing for a basic right to an explanation. Cf. also
Malgieri and Comandé 2017, pp. 246 et seq. and Selbst and Powles 2017, emphasising a func-
tional understanding of such a right as giving at least enough information for a data subject to
effectively execute his or her rights.
22
Wachter et al. 2017, p. 17, arguing against such a right.

information paradox: because the workings of algorithms are complex and not
readily understandable for laymen, giving any kind of comprehension-enhancing
information is necessarily difficult, and even more so when Article 12 demands it to
be concise, intelligible and transmitted through clear and plain language.
This dilemma seemingly cannot be solved without trade-offs: either give meaningful
information that might not be completely understandable to the average data
subject, or give information that the data subject can grasp and comprehend but that
is in turn less deep and meaningful.

10.2.2.2 Towards DPAs

As has been described above, DPAs have a variety of instruments at their disposal to
gain insight into companies’ practices when handling personal data.
According to the general principle of accountability stated in Article 5(2) every
controller is not only obliged to be in compliance with the GDPR’s data protection
provisions but also to always be able to demonstrate said compliance. This obliges
controllers to create effective internal procedures that allow DPAs to check for
compliance in an easy way. This principle is concretized through provisions like
Article 30, obliging controllers to keep records of their processing activities, and
Article 33, obliging them to notify the authorities about every data breach.
Special emphasis should be put onto the Data Protection Impact Assessment laid
down in Article 35. When applicable, it obliges controllers to assess pre-emptively,
before deployment, how high the risks of a specific type of processing
or a set of processing acts are to the rights and freedoms of natural persons. It must
also contain an assessment of the necessity of and concrete ideas for safeguards,
security measures and mechanisms to tackle the detected risks. Finally, Article 36
lays down the obligation to consult the responsible DPA and adhere to its written
advice about possibly insufficient security measures or falsely detected risks. While
the necessity of such an assessment is typically contingent on the assumption of a
high-risk type of processing in every particular case, Article 35(3)(a) makes it
obligatory in all cases that fall under the abovementioned Article 22. This means
that, at least in cases of automated processing that leads to decisions producing legal
effects for or affecting a natural person in a similarly significant way,23 DPAs need to
be consulted and their approval of the intended processing acts acquired.

23
The wording of Article 35(3) is not completely identical to the one in Article 22, leaving room
for the interpretation that its scope is wider, making impact assessments obligatory where a
decision is not made exclusively by the automated system, for example.

10.2.2.3 Towards the Public

The GDPR does not contain any explicit provisions aimed at offering transparency
towards the general public,24 neither concerning the usage of algorithms nor other-
wise. Some of its provisions do, however, promise to offer some transparency as a
positive side effect.
As described above, Article 13 obliges controllers to provide information about
the data and its processing purpose as well as “meaningful information” about the
logic of an ADM system at the time when those data are obtained. This is generally
—and correctly—understood to mean that a future data subject has to be informed
before any transmission and processing of his or her data has occurred.25 This
becomes especially clear and important where said processing is to be based on the
subject’s consent according to Article 6(1)(a): to make an informed decision on giving
consent or not can only be possible when all the important details are disclosed and
the subject knows what exactly he or she is consenting to. Following this inter-
pretation, interested members of the general public could, for example, start a credit
evaluation process in order to gain information on which data the respective bank or
credit company would need and if and (at least to some extent, see above) through
which logic an algorithm would decide on the application, without having to enter
any data of their own. The resulting transparency is, however, very limited.
Especially for NGOs or journalists trying to do research on the usage and possible
effects of specific algorithms used by specific companies, knowing just the input
data being used and some hints about the inner workings of the system does not
really help when, amongst others, corresponding output data and their potential
effect on concerned individuals and society or the training data that were originally
used are still unknown.
A similar judgement can be passed with regard to Articles 40 and 42. If widely
utilized, codes of conduct and certification schemes can help the public get an
overview of what safeguards and security measures companies put into effect
when using ADM systems.

10.3 The Need for Public Transparency

10.3.1 For Data Protection in General

Building on the findings above, public transparency can be of service in helping to
overcome some of the obstacles related to the roles that controllers, data subjects
and DPAs play in the larger goal of data protection. Public scrutiny adds another

24
Here understood as any interested person that is neither a data subject nor part of a DPA,
notably NGOs, journalists or any interested individuals.
25
Cf. Paal and Pauly 2018, Article 13 para 31.

dimension of control that can focus not only on the single affected individual (the
way data protection as self-protection does) or the question of lawfulness of a
processing act or its effects at a single given time (as the typical DPA instruments
do), but also on the long-term effects and the impacts on society as a whole. It can
also be beneficial in promoting societal acceptance for new technologies and pro-
cessing activities by allowing the public to inform themselves about possible
advantages and downsides. Besides laying open possible and so far unknown risks
and uncertainties that can even help the companies using the respective technolo-
gies identify their shortcomings and find solutions, positive feedback from public
actors where no such risks and shortcomings could be found might in contrast
further general trust and increase companies’ sensibilities for safe and responsible
usage of new technologies as a competitive advantage. In addition, public actors
like NGOs or journalists are often in a much better position to represent the
interests of data subjects and negotiate corresponding solutions than the
single subject that is unhappy with the treatment of the data pertaining to him or
her.
Last but not least, Article 57(1)(b) tasks DPAs with promoting public awareness
and understanding of, inter alia, the risks connected to data processing, making the
realisation of some kind of public transparency an explicit goal of the GDPR.

10.3.2 For Algorithms in Particular

Complementing these general challenges to transparency as an effective prerequisite
for accountability and lawfulness are those specifically connected to the usage of
algorithms and ADM systems. Focusing on allowing transparency for the general
public can be a promising way of overcoming some of these obstacles.
Mentioned above was the paradox of providing meaningful information to a data
subject about the logic of how a specific ADM system decides while keeping said
information easy to understand and in clear and plain language for the recipient—
who is most likely a complete layman when it comes to the workings of algorithms
or even machine learning systems. While some are focusing on developing tech-
niques that achieve both of these goals,26 allowing certain public actors additional
insight into the way that companies use ADM systems could tackle this problem
more efficiently by creating a kind of intermediary or “buffer zone” (inhabited by
NGOs, research institutes or other actors) between the controller and the data
subject. Here, affected data could be thoroughly analysed and interpreted by
interested scholars and other experts before disclosing the results in a readable
manner. This would of course not exempt controllers from fulfilling their

26
See Wachter et al. 2018, p. 6.

obligations to inform affected data subjects and comply with their right of access,
but it does raise the question whether extending the scope of those obligations is really
the ideal way of getting individuals to understand how decisions regarding them were
or will be made.
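
One family of the techniques referred to above are the counterfactual explanations proposed by Wachter et al. 2018: instead of opening the black box, the data subject is told what minimal change to his or her input would have led to a different decision. The following minimal sketch illustrates the idea; the scoring function, threshold and feature names are purely illustrative assumptions of mine and are not taken from the cited work.

def score(income: float, debts: float) -> float:
    # Stand-in for an arbitrary, opaque credit-scoring model (assumption).
    return 0.5 * income - 0.8 * debts

def counterfactual_income(income: float, debts: float,
                          threshold: float = 50.0, step: float = 1.0,
                          max_steps: int = 10_000):
    """Return the smallest income at which the application would be approved."""
    candidate = income
    for _ in range(max_steps):
        if score(candidate, debts) >= threshold:
            return candidate
        candidate += step
    return None  # no counterfactual found within the search range

# Supports statements of the form: "Your application was refused;
# with an income of X it would have been approved."
needed_income = counterfactual_income(income=60.0, debts=20.0)

A statement of this form conveys actionable information without requiring the data subject to understand the model itself, which is precisely its appeal for the dilemma described above.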
Closely connected is the notion that such an assessment of ADM systems and
the data they use might help controllers themselves. Because the production and
development of such systems, as well as the decision on their use cases, is often
very explorative and volatile, controllers oftentimes might not be able to comprehend
or reproduce why exactly a system decides one way in one case and differently in the
next.27 This is especially true when it comes to machine learning systems. After all,
controllers do not necessarily need to understand the “why”, as long as the results
are correct for whichever purpose they pursue.28 Forcing them to give explanations,
and thus really understand their own systems, therefore reverses the priorities that
led to their usage in the first place: from efficiency back towards traceability and
reproducibility. While this should again by no means exempt controllers from their
obligations, it does show that expecting too much from them might be overex-
tending and thus counterproductive. By allowing interested scholars and NGOs
access to the relevant data, the results and findings of their research could help
controllers identify dangers they would not necessarily have found on their own,
e.g. when it comes to discriminatory effects or to the cleanliness and comprehen-
siveness of the data they used.29

10.4 A Public Database of Algorithms

The ideal of public transparency as described above can, in my opinion, best be
realized through a public database of commercially used decision-making
algorithms. Here, different types of information and data from and about the algorithms
used by companies could be aggregated and shared through a graduated system
with interested and trusted actors. This section will trace some of the role models
for such a database or similar ways of achieving public transparency, analyse the
details of how it could be organized and designed, examine legal limits as well as
provisions in the GDPR that can (or could with some minor modifications) be
utilized in a beneficial way, and review the regulatory impact on the GDPR’s
systematic structure.

27
Cf. Malgieri and Comandé 2017, p. 246.
28
Burrell 2016, p. 7.
29
See Goodman and Flaxman 2016, p. 3, describing the problem of “uncertainty bias” in con-
nection with incomplete training data.

10.4.1 Role Models

10.4.1.1 Environmental Law

The idea of informing and involving the public in the process
of examining potentially high-risk projects is traditionally embedded in the field
of Environmental Law. Since the 1980s, several EU directives30 have made the
performance of an Environmental Impact Assessment an obligation for certain
public and private projects as a part of their planning approval procedure. An
integral part of such assessments is the participation of the public (both affected and
general) at different phases of the process. This enables interested actors to gain
insight into the respective plans, voice their concerns and therefore shape the scope
of topics the responsible authority has to examine in its approval procedure. The
public is typically informed by displaying the relevant documents and
interim conclusions at a specific time and location for interested actors to
inspect.
Different goals are served through this participation during the ongoing approval
procedure: opinions from a variety of actors31 expand the scope of information that
the authority can and should take into account for its decision and might not have
had otherwise—therefore (at least in theory) improving the quality of decisions;
increasing the plurality of participating actors increases the consideration of public
welfare concerns and serves the ideal of greatest possible democratic legitimacy;
and informing the public so early in the process leads to increased societal
acceptance after a project has been approved.32
This case and the usage of ADM systems have several similarities: in both cases
private projects bring with them certain individual and societal risks that need to be
identified and addressed through suitable safeguards, then subsequently presented to
and audited by a public authority. In both cases, the subject matter is or can be highly
complicated, making it difficult for both project managers and authorities to fully
grasp the scope of the risks as well as the suitability of the envisaged safeguards and
therefore creating a need for input from experts and immediately affected actors such
as neighbours. The cases do, however, differ in some key aspects.
While environmentally relevant building projects need to be formally permitted
and the associated risks are generally static (meaning they typically won’t change or
evolve over time), the usage of ADM systems—especially of the learning type—
can bring with it risks that might emerge over time as the specific algorithm adapts
and changes. In both cases, projects follow a pre-determined purpose, but only the
algorithm in a (learning) ADM system is free to find new ways to fulfil that
purpose, e.g. change the way it weighs subject traits to calculate the eligibility of a
job applicant, thereby allowing for the possible emergence of discriminatory results.

30
The current one being Directive 2014/52/EU, amending Directive 2011/92/EU.
31
Especially environmental associations and NGOs.
32
See Landmann and Rohmer 2018, § 9.

Letting the public participate once, during the execution of an impact
assessment, by voicing their opinions and concerns would therefore give the DPA
auditing the suitability of a controller’s risk assessments and safeguards the
advantage of getting more perspectives, but would not help with tackling future
risks.
On the other hand, a complete and permanent disclosure of Data Protection
Impact Assessments concerning a specific controller and its ADM system faces the
problem of conflicting with the controller’s justified interests regarding trade secrets and
intellectual property in the algorithm.33

10.4.1.2 OpenSchufa

OpenSchufa34 is an initiative started by the German NGOs AlgorithmWatch and Open
Knowledge Foundation Deutschland, which asks affected individuals to transmit the
data that Schufa holds about them. Schufa is a private company that calculates the
“credit scores” of consumers35 and transmits them to companies, which subsequently
decide whether or not to offer their services to the person in question. Individuals are
encouraged to use their access right under Article 15 to make Schufa transfer their
respective credit score as well as the data used to calculate it.
By gathering as much data from affected individuals as possible, OpenSchufa
hopes to analyse the way Schufa calculates its scores and to find out whether this leads to
any structural problems such as the (unintended) discrimination of certain population
groups. As a relatively new initiative, OpenSchufa is still in an early
phase: no results have been published yet, and it remains to be seen whether enough
people can be mobilised to transmit their data and whether said data can be used for the
aspired goals.
Expanding this approach to all commercially used ADM systems would of
course only increase these concerns: getting enough data for each individual
company using such a system is arguably harder to achieve than for a central
intermediary like Schufa that is affecting every consumer on a regular basis. In
addition, the possibility of learning ADM systems (which Schufa is—at least so far
—not using) would further decrease the potential informative value of past user
data, as even in the case of a successful analysis of a system’s inner workings, the
results would most likely no longer be current, the system having already evolved
further.

33
This acknowledgment of limitation made in Recital 63 about data subjects’ access rights a
fortiori also applies to disclosures to members of the public that are not themselves affected by the
processing.
34
See https://www.startnext.com/openschufa and https://www.openschufa.de. Last accessed 25
August 2018.
35
Predicting how likely it is that the respective individual settles his or her bills, pays back loans
etc.

Another barrier for this approach stems from the fact that transferring individ-
uals’ data to the new database as well as to potential researchers trying to analyse
them would lead those recipients to become data controllers themselves with regard
to the affected individuals. This would, however, effectively only obligate them to
be sufficiently transparent about their intentions and secure in their handling of the
data, as the voluntary transmission by the individuals and the consent for the usage
of the data would—if done properly—certainly fall under the legal basis of Article
6(1)(a).36

10.4.1.3 Lebensmittelklarheit.de

Lebensmittelklarheit.de37 is an online portal operated by German consumer
protection associations that tries to bring transparency into the way ingredients,
production conditions and origins are displayed and labelled on food products.
Consumers can inform themselves, voice their concerns, share their experiences with
specific products and make suggestions for improvements. Companies can see
those comments and establish a dialogue, react to suggestions and thereby foster
customer acceptance. The portal also reports on recent developments such as
successful consumer complaints and changes in product design. In addition, frequent
surveys and consumer research allow the consumer protection associations to
identify structural problems regarding e.g. misleading product descriptions and
to estimate the probability of success when going to court over them.
A similar approach seems beneficial for commercially used ADM systems. Just
as consumers have an interest in understanding the labels of food products and
avoiding confusing information about ingredients and their origins, potential cus-
tomers of a company using an ADM system have an interest in being informed
about the system itself and the way it works. While the information obligation in
Article 13 forces companies to inform about the existence and usage of an ADM
system and its basic logic and consequences38 before any data processing has
occurred and therefore before any contract has been signed, individuals might still
be confused and, even worse, skim over the information without consciously
considering it—just as is common practice with most privacy statements and terms
of service nowadays.39 Having a platform that lists, for example, all the banks using
ADM systems for their credit extensions as well as comments by individuals
reporting about their experiences with the decision and the way it was explained to
them, would be a valuable basis to decide on one bank and against another. Several

36
Just as it is already being done by OpenSchufa.
37
See http://www.lebensmittelklarheit.de/. Last accessed 25 August 2018.
38
See 2.2.1. above.
39
See, for example, Berreby D (2017) Click to agree with what? No one reads terms of service,
studies confirm https://www.theguardian.com/technology/2017/mar/03/terms-of-service-online-
contracts-fine-print. Last accessed 25 August 2018.

similar complaints about a company’s confusing or insufficient information
regarding its system, or about dissatisfaction with the respective decision, might
help the company understand where it needs to improve or, if it does not, give
DPAs an indication to investigate further.

10.4.2 Details

Having covered the role models, how could a public database of ADM systems be
designed, and of what data and information could it realistically be composed?
This is heavily contingent on two criteria: the practical benefits of a certain solution
over another and the legal limitations, especially when it comes to the rights of the
affected companies and their algorithms.

10.4.2.1 Responsible Body

First, the question of the responsible body running the database is an important one.
Here two options come into consideration: a database run and maintained by a
known and reliable NGO or association, or one run by a state body.
Looking at the role models described above, a database run by an NGO, as with
OpenSchufa, or by a consumer protection association, as with
Lebensmittelklarheit.de, seems like a reasonable idea. As private entities, they are not
directly bound by other actors’ fundamental rights, do not necessarily need a statutory
mandate and thus have more freedom in their actions. In addition, public trust is
traditionally higher—at least in long-standing and well-known
associations—than in a state agency that might potentially repurpose or pass on
the relevant data for other objectives.
On the other hand, DPAs—and by extension the new European Data Protection
Board (EDPB)40—are, as Article 52(1) states, explicitly independent and therefore
not in immediate danger of giving away or being pressured into giving away data to
other state agencies for their purposes. They are also, according to Article 57(1),
tasked with promoting public awareness and understanding as well as the awareness
of controllers and processors regarding obligations and risks connected to
processing. Since the operation of a database would serve both these tasks (by
informing the public and enabling its participation, and by promoting research and
thereby contributing to an overall knowledge of the concrete risks and dangers
connected to a specific algorithm in a specific usage scenario, which also helps
companies themselves to better identify those risks), DPAs would be the natural
responsible bodies.

40
Made up of representatives from the DPAs of each Member State and replacing the Article 29
Data Protection Working Party.

10.4.2.2 Content

Another important aspect is the question of the content of such a database, both
concerning the information itself and its origin. Here, some of the GDPR’s
provisions could be utilized and inspiration be drawn from some of the role models
described above.

Data Protection Impact Assessment Results

As was shown before, the performance of a Data Protection Impact Assessment will
be the standard for most, if not all, usages of ADM systems. Subsequently, DPAs
will have pooled information about controllers’ self-assessments regarding risks and
envisaged safeguards as well as the agency’s own judgement about those assessments.
These could be published in the database, just as the decisions of Environmental
Impact Assessments are displayed for the interested public, thereby giving
researchers as well as interested individuals thinking about contracting with the
respective company a basic overview of the area of application of the system and
the primary risks connected to it.
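
What exactly such a database entry would contain is a design decision that the legislator would have to take. The following sketch of a conceivable record structure is purely illustrative; none of the field names are requirements found in the GDPR or in the role models discussed here.

from dataclasses import dataclass

@dataclass
class AlgorithmRegisterEntry:
    """Illustrative record for one commercially used ADM system."""
    controller: str              # name of the company using the system
    use_case: str                # e.g. "credit scoring for consumer loans"
    input_categories: list[str]  # categories of personal data used
    legal_basis: str             # e.g. "Article 6(1)(a) GDPR (consent)"
    primary_risks: list[str]     # risks identified in the impact assessment
    safeguards: list[str]        # redacted summary of envisaged safeguards
    dpa_assessment: str          # the DPA's (redacted) judgement

entry = AlgorithmRegisterEntry(
    controller="Example Bank",
    use_case="automated refusal of online credit applications",
    input_categories=["income", "existing debts", "payment history"],
    legal_basis="Article 6(1)(a) GDPR (explicit consent)",
    primary_risks=["indirect discrimination of certain population groups"],
    safeguards=["human review on request", "periodic bias audits"],
    dpa_assessment="safeguards deemed adequate (details redacted)",
)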
Justified concerns regarding trade secrets and intellectual property could be met by
obligating the responsible body to “redact” the content to an extent where no risk
for the respective company remains. This kind of practice is already common where
journalists or other actors request information under, for example, the German
Freedom of Information Act,41 where § 6 IFG precludes any claims to requested
information insofar as the protection of intellectual property requires it.
Obligating controllers to tolerate the publication of their assessments would still
infringe upon their fundamental rights and therefore require the acting authority to
have a corresponding statutory mandate. While the aforementioned Article 57(1) does
task DPAs with promoting public awareness, such an abstractly worded provision would
certainly not suffice as the legal basis for such a severe infringement.
results of companies’ Data Protection Impact Assessments would therefore not be
possible without the appropriate legislation, e.g. adding a corresponding clause to
Article 36.

Existing User Data

As we saw in the example of OpenSchufa, gaining access to the input data an
existing user provided to a company and its algorithm, as well as the output data, or
rather decision data, following the algorithm’s decision, can be a helpful basis for
researchers trying to understand the inner workings of a system42 and analyse the

41
Informationsfreiheitsgesetz (IFG).
42
Through reverse engineering or other similar measures.

larger ramifications of the usage when it comes to problems like discriminatory
effects. In addition, potential users could directly benefit from observing how other,
already existing users with certain properties were treated by the specific ADM
system, what kind of decisions and consequences typically relate to which kinds of
properties. By using these data as a guideline, potential users would be able to at
least roughly estimate how they would fare when deciding to, e.g., apply for a loan
at a specific bank using a specific ADM system.
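
Footnote 42 mentions reverse engineering as one such measure; a common variant is to fit a small, interpretable surrogate model to the aggregated input/output pairs and inspect it for patterns such as systematically different refusal rates. A minimal sketch using the scikit-learn library follows; the feature names and toy values are illustrative assumptions, and real research would of course require far larger, properly anonymized datasets.

from sklearn.tree import DecisionTreeClassifier, export_text

# Aggregated, anonymized input/output pairs obtained from the database;
# the features and values below are illustrative assumptions only.
X = [[60000, 20000], [30000, 25000], [80000, 5000], [25000, 30000]]
y = ["refused", "refused", "approved", "refused"]   # the system's decisions

# Fit a small, human-readable surrogate model that approximates the
# opaque ADM system's behaviour from the outside.
surrogate = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(surrogate, feature_names=["income", "debts"]))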
The main problem connected to this idea is that it stands and falls with the
number of users actively transferring their data to the platform after a decision about
them was made by an ADM system. Here, the newly created right to data porta-
bility laid down in Article 20 GDPR might be helpful. It allows a person that
provided his or her personal data to a controller to demand the transmission of these
data directly to another controller of his or her choosing. While this right was
arguably created as a tool to tackle “lock-in” effects and allow users to take their
data with them when switching service providers (e.g. moving from one social
network to another),43 this designation does not limit data subjects to using it only in
these specific cases.
Instead of transmitting the data themselves, data subjects would therefore only
need to send a request for transmission to the respective controller. This would
lower the hurdle significantly, increasing the chance of enough data subjects joining
in. While it is undisputed that Article 20 covers the input data provided by data
subjects, the inclusion of the output data of the ADM system (i.e. the decision
based on the personal data) is still heavily debated. While the prevailing opinion44
leans towards not including them in the data subject’s right, due to the role the controller
played in creating them through its algorithm and due to the idea of protecting
business secrets, the better arguments support the opposing opinion: since the
controller informs the data subjects about the decision anyway, nothing would keep
them from passing it on in person. Likewise, in the example of OpenSchufa, it is
undisputed (and common practice by Schufa) that a data subject’s right of access
(Article 15) covers the calculated credit scores. It is thus only consistent with the
telos of Article 20 and its relation to the information rights in Articles 13–15
to ease this endeavour and allow data subjects to have the decision transmitted
directly by the controller.
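
Article 20(1) requires the data to be provided “in a structured, commonly used and machine-readable format”. What a portability export covering both the input data and the contested output data could look like is sketched below; the structure and all field names are illustrative assumptions, not a format prescribed by the GDPR.

import json

# Sketch of a portability export that includes both the input data the
# subject provided and the contested output data (the decision).
export = {
    "data_subject": "pseudonymous-id-4711",
    "controller": "Example Bank",
    "input_data": {
        "income": 60000,
        "existing_debts": 20000,
        "payment_history": "no defaults",
    },
    "output_data": {
        "decision": "credit application refused",
        "decided_at": "2018-08-25T12:00:00Z",
    },
}

# A structured, commonly used and machine-readable format (Article 20(1)).
print(json.dumps(export, indent=2))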

Discussion Platform

As was elaborated above, the database could be combined with a platform allowing
for sharing of personal experiences and discussion. Drawing from the example of
Lebensmittelklarheit.de this could include the possibility of posing questions and

43
Cf. Gierschmann et al. 2018, Article 20 para 23.
44
Article 29 Data Protection Working Party 2017, p. 10.

proposals to the respective companies and allowing them to answer and state their
point of view, thereby facilitating public discourse and again enhancing acceptance.
Because the role of the responsible body of the database would be limited to
providing the platform itself without disclosing any trade secrets or other damaging
information, the invasiveness for affected companies would be minimal. It is
therefore not unreasonable to assume that this could be done by the DPAs under
their general task to promote public awareness in Article 57(1) and without the need
for an explicit mandate.

10.4.2.3 Access

In order to make sure that the justified interests of the affected companies and their
ADM systems, as well as of the individuals contributing to the data pool, are adequately
protected, certain guidelines concerning graduated access to the database and
anonymization of personal data would need to be established.
Here, the distinction between the general public and the affected public from
environmental law could be picked up. Interested individuals would be able to
access the discussion platform and see a general description of the usage scenario
and the data that the respective company needs for its system to make a decision.
Gaining access to the aggregated input and output data would be reserved for
trusted and accredited NGOs and associations that have proven to use them in a
responsible way. These could be collected in a regularly updated list by the DPAs.
When using aggregated input and output data of real data subjects for further
research, safeguarding the interests of these individuals is paramount. Efforts would
therefore need to be put into making sure that no conclusions can be drawn
regarding, and used against, individual data subjects. Allowing only selected actors
access to these data would already limit these risks immensely. In addition, data
would need to be anonymized in the most effective way achievable. The concept of
differential privacy,45 trying to minimize possible conclusions about individual
subjects in a group of data while still allowing for the greatest amount of research
results, would be a suitable starting point.
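
At its core, differential privacy adds calibrated random noise to aggregate query results, so that the presence or absence of any single data subject barely changes what an analyst can observe. The following minimal sketch of the classic Laplace mechanism for a simple counting query follows Dwork 2008 in spirit; the concrete parameter values are illustrative assumptions.

import random

def dp_count(matches: list, epsilon: float = 0.5) -> float:
    """Differentially private count of records matching a predicate.

    A counting query has sensitivity 1 (adding or removing one data
    subject changes the true count by at most 1), so Laplace noise with
    scale 1/epsilon suffices; a smaller epsilon means stronger privacy.
    """
    true_count = sum(bool(m) for m in matches)
    # The difference of two exponential variables yields Laplace noise.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# e.g. "how many applicants in this dataset were refused?"
refused = [True, False, True, True, False]
print(dp_count(refused))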

10.5 Conclusion

In conclusion, public transparency is a valuable and important ideal for data pro-
tection in general and the regulation of the usage of ADM systems in particular.
While the GDPR does not actively pursue it, some of its provisions, when utilized
in the right way, could help with this endeavour. Developing a public database that
lists companies using ADM systems, gives information about the way they use

45
See Dwork 2008, p. 1.

them as well as about the most relevant risks and the measures taken against them,
allows potential data subjects to inform themselves, and enables scholars and NGOs
to do research on adverse effects would be the best way to achieve such transparency.
The regulatory impact would be multidimensional: informing the public about
the way certain algorithms work and how companies make sure they work in the
right way directly increases public acceptance. Allowing interested individuals to
inform themselves and giving NGOs and associations access to the relevant data in
order to do research and build better explanation models directly feeds into the
GDPR’s ideal of informed data subjects.46 Moreover, the facilitation of research by
trusted actors adds another layer of quality and error control, supporting both
DPAs in their examinations of companies’ compliance and companies themselves
in uncovering errors and adverse effects they might not have found on their own—
similar to how hackers find security gaps in software in order to nudge developers
towards fixing them. However, at least some of these endeavours would mean
infringing on justified interests and fundamental rights of affected companies. While
Article 57(1) gives DPAs a general task to promote awareness for data subjects,
controllers and the public alike, an explicit statutory mandate within the GDPR
would need to be created. So while the creation of such a database is not possible
with the instruments the GDPR offers at this point, establishing a discussion plat-
form similar to Lebensmittelklarheit.de and others would be both feasible and
beneficial.

References

Article 29 Data Protection Working Party (2016) Guidelines on Transparency under Regulation
2016/679 (wp260rev.01). Available at https://ec.europa.eu/newsroom/article29/item-detail.
cfm?item_id=622227. Last accessed 25 August 2018
Article 29 Data Protection Working Party (2017) Guidelines on the right to data portability (WP
242 rev.01). Available at https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=
611233. Last accessed 25 August 2018
Bull HP (2015) Sinn und Unsinn des Datenschutzes. Mohr Siebeck, Tübingen
Burrell J (2016) How the machine ‘thinks’: Understanding opacity in machine learning algorithms.
Big Data & Society, January–June 2016, 1
Dwork C (2008) Differential Privacy: A Survey of Results. In: International Conference on Theory
and Applications of Models of Computation. Springer, Berlin
Gierschmann S, Schlender K, Stentzel R, Veil W (2018) Kommentar Datenschutz-Grundverordnung.
Bundesanzeiger Verlag, Cologne
Goodman B, Flaxman S (2016) EU Regulations on Algorithmic Decision-Making and a “Right to
Explanation”. In: arXiv:1606.08813v3 [stat.ML], 2016. Available at https://arxiv.org/abs/1606.
08813. Last accessed 25 August 2018
Landmann R v, Rohmer G (2018) Kommentar Umweltrecht. C.H. Beck, Munich

46
An ideal that might not be completely reachable by just forcing controllers to inform them, see
above.

Malgieri G, Comandé G (2017) Why a Right to Legibility of Automated Decision-Making Exists
in the General Data Protection Regulation. International Data Privacy Law 7:243–265
Marsch N (2018) Das europäische Datenschutzgrundrecht. Mohr Siebeck, Tübingen
Mendoza I, Bygrave LA (2017) The Right not to be Subject to Automated Decisions based on
Profiling. University of Oslo Faculty of Law Legal Studies Research Paper Series, No. 2017–
20. Available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2964855. Last accessed
25 August 2018
Paal BP, Pauly DA (2018) Beck’sche Kompakt-Kommentare: Datenschutz-Grundverordnung. C.H.
Beck, Munich
Rossnagel A (2018) Datenschutzgrundsätze – unverbindliches Programm oder verbindliches
Recht? Bedeutung der Grundsätze für die datenschutzrechtliche Praxis. Zeitschrift für
Datenschutz, 339–344
Selbst AD, Powles J (2017) Meaningful information and the right to explanation. International
Data Privacy Law 7:233–242
Wachter S, Mittelstadt B, Floridi L (2017) Why a right to explanation of automated
decision-making does not exist in the General Data Protection Regulation. International
Data Privacy Law 7:76–99
Wachter S, Mittelstadt B, Russell C (2018) Counterfactual Explanations without opening the
Black Box: Automated Decisions and the GDPR. Harvard Journal of Law & Technology

Florian Wittner studied law with an emphasis on intellectual property at the University of
Freiburg and the National and Kapodistrian University of Athens. As of October 2017, he is a
research assistant at the Hans-Bredow-Institut, working on the project “Information Governance
Technologies”. His interests lie in private and public media law.
Chapter 11
Access to and Re-use of Government Data and the Use of Big Data in Healthcare

Miet Caes

Contents

11.1 Setting the Scene ............................................................................................................. 194


11.1.1 Data Availability as a Challenge for Big Data in Healthcare........................... 194
11.1.2 The Use of Government Data by Private Healthcare Actors............................ 195
11.1.3 Government Data Access and Re-use Legislation: Necessary
and Proportionate?.............................................................................................. 196
11.2 Access to Government Data............................................................................................ 198
11.2.1 An International and Constitutional Right......................................................... 198
11.2.2 A “Governmental” Obligation to Allow Access ............................................... 200
11.2.3 Access upon Request versus Proactive Publication........................................... 202
11.2.4 Access Limitations ............................................................................................. 203
11.3 Re-use of Public Sector Information .............................................................................. 213
11.3.1 The PSI Directive ............................................................................................... 213
11.3.2 Link with Access to Government Information? ................................................ 214
11.3.3 A “Governmental” Obligation to Allow Re-use................................................ 215
11.3.4 Re-use Limitations.............................................................................................. 216
11.4 Lessons for the Legislator? ............................................................................................. 223
References .................................................................................................................................. 223

Abstract Data availability is a huge challenge for the use of big data in healthcare.
Although the government is in possession of valuable data for big data
applications of private healthcare actors, a lot of these data remain out of their reach.
A major reason lies in the limitations resulting from the legislation regulating access to
and re-use of government information. This legal framework intends to strike a
balance between the interest of the citizen to be informed by being granted access to
data or to be able to re-use data and other interests such as the protection of private
life and personal data, (intellectual) property rights and the confidentiality of

M. Caes (✉)
Leuven Institute for Healthcare Policy, Kapucijnenvoer 35, 3000 Leuven, Belgium
e-mail: miet.caes@kuleuven.be


business information, or the use of data by the government for monitoring and
enforcement tasks. This contribution aims to demonstrate that some of the afore-
mentioned legal limitations unnecessarily or disproportionally hinder the use of big
data by private healthcare actors. To this end, the implications of the relevant
European (e.g. the Council of Europe Convention on Access to Documents and the
PSI Directive), as well as Belgian and Dutch legislation are discussed (e.g. the
Belgian Federal Act of 11 April 1994 relating to access to government information).
In this context, this chapter also analyses the modifications proposed by the
European Commission in its recent revision proposal regarding the PSI Directive.
Furthermore, it considers what measures the legislator could take to improve access
to and re-use of its data by private healthcare actors for big data purposes.

  
Keywords Big data · Healthcare · Government data · Data access · Data re-use ·
Private healthcare actors

11.1 Setting the Scene

11.1.1 Data Availability as a Challenge for Big Data in Healthcare

Big data refers to datasets that are high in volume, variety and velocity, whose
processing (i.e. collection, storage and analysis) requires specific technology and
methods.1 In healthcare, such datasets can be a tool for evidence-based
decision-making by healthcare policy makers, health insurance institutions and
funds, hospitals and other healthcare facilities, healthcare professionals, and
patients. Hence, the insights offered by big data can contribute to a more efficient
and higher-quality healthcare system.
There is a high variety of types of data valuable for healthcare: primary and
secondary care data, health insurance claim data, genetic data, administrative data,
sensor data, social media data, public health data, etc. Moreover, these data are
stored and controlled by diverse actors, for instance healthcare professionals and
facilities, health insurance providers, governmental institutions and agencies, or
private companies. They can be found in a variety of distinct databases, which can be
open to the public, but are mostly only accessible under strict conditions.

1
This is an own definition, which is first of all based on the 3 V’s of big data (i.e. Volume, Variety
and Velocity), which are according to the analysts of Gartner its main characteristics (See Laney D
(2001) 3D Data Management: Controlling Data Volume, Velocity, and Variety. https://blogs.
gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-
Velocity-and-Variety.pdf. Last accessed on 7 August 2018). It is also inspired by extensive
research on the topic of De Mauro, Greco and Grimaldi (See De Mauro et al. 2016, p. 122).

Considering the above, before the healthcare sector can scale up and take the
leap from the use of data to big data, several challenges need to be addressed. One
of the most important is the availability of the data. Valuable data is stored in
different places and controlled by diverse stakeholders with various interests.
Sometimes data controllers are not allowed to share data due to restrictions of
privacy and data protection, intellectual property or administrative law, or con-
tractual obligations. In other cases, data unavailability is not a legal issue.
Stakeholders are simply not willing to share for economic or other strategic reasons,
or are unable to because they lack the necessary technology, experts and/or funding.
This contribution will focus on the legal constraints.

11.1.2 The Use of Government Data by Private Healthcare Actors

Access to and re-use of government data is recognised as an important driver for big
data analytics.2 In this context, government data could indeed be valuable for private
healthcare actors. It concerns amongst others data collected to have a functioning
public health insurance system, which can be found in the Belgian Crossroads Bank
for Social Security. The government also receives data from hospitals regarding their
functioning and organisation, because of their legal reporting duties or as a result of
spontaneous data sharing initiatives. Another example is information in the posses-
sion of the Belgian Fund for Medical Accidents3 or government healthcare expen-
diture data. Furthermore, governmental datasets can be the result of national health
surveys. Some data are linked with scientific research on healthcare, commissioned
by the government. In principle, it does not concern data which can be found in
electronic health records, since access to these records is reserved to healthcare
professionals with the informed consent of the patient.4
However, huge quantities of valuable data remain for the eyes of the government
only. Just a small amount seems to be accessible upon request and even less data are
actively made public. For instance, on the Belgian governmental initiative data.gov.
be only 140 datasets concerning “health” are publicly available.5 The Flemish open

2
Proposal (European Commission) for a revision of the Directive 2003/98/EC on the re-use of
public sector information, 25 April 2018, COM(2018), p. 2.
3
A public institution with legal personality established by the Belgian Federal Act of 31 March
2010 on damage indemnification resulting from healthcare (BS 2 April 2010).
4
In Belgium, access to electronic health records is complex and regulated by diverse acts and
decrees such as the Royal Decree of 3 May 1999 concerning the general medical record (BS 30
July 1999), the Royal Decree of 3 May 1999 concerning the general minimal requirements of the
medical records to be kept by hospitals (BS 30 July 1999), the Federal Act of 21 August 2008 on
the establishment and organisation of the eHealth platform (BS 13 October 2008) and the Flemish
Act of 25 April 2014 concerning the organisation of a network for datasharing between the actors
involved in care (BS 20 August 2014).
5
See Data.gov.be. https://data.gov.be/en. Last accessed 7 August 2018.

data portal opendata.vlaanderen.be offers 125 “health” datasets.6 The Dutch gov-
ernment is doing better, since 618 datasets are linked to the topic “care and health”
on its open data website.7 Moreover, these datasets, from various governmental
sources, are generally modest in size and of limited informative value.8
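
Such dataset counts can also be retrieved programmatically. Many government open data portals are built on the CKAN platform, which exposes a package_search endpoint; the sketch below assumes, without verifying here, that the portal in question runs a standard CKAN API at the given, purely illustrative, address.

import json
import urllib.request

# Illustrative portal address; whether a specific portal exposes the
# standard CKAN API at this path is an assumption, not a verified fact.
API = "https://example-data-portal/api/3/action/package_search"

with urllib.request.urlopen(API + "?q=health&rows=0") as response:
    result = json.load(response)

print(result["result"]["count"])  # number of datasets matching "health"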

11.1.3 Government Data Access and Re-use Legislation: Necessary and Proportionate?

The availability of government data for citizens is regulated by the legislation on access
to government data and the re-use of public sector information. It aims to strike a
balance between the interest of the citizen to be informed by being granted access to data
or to be able to re-use data and other interests such as the protection of private life and
personal data, (intellectual) property rights and the confidentiality of business infor-
mation, or the use of data by the government for monitoring and enforcement tasks.
This contribution aims to study whether the legal restrictions unnecessarily or
disproportionately hinder the use of big data and big data analytics by private healthcare
actors, at least to the extent that this benefits healthcare. Limitations are qualified as
unnecessary or disproportionate if in this context access to and use of government data
is not effective and/or less efficient, while there are other (less far-reaching) means to
protect the legitimate interest at stake, or if the benefit for healthcare clearly outweighs
the protected interest. There is a lack of effectiveness if limitations simply prevent
the data from being used to improve healthcare, for instance because not enough data is
made available, because the quality of the data is low or because the data is already
outdated before it can be used. Inefficiency means that the benefits of using the data do
not outweigh the costs and/or that it takes a lot of time before the data can be used.
In case of unnecessary or disproportionate limitations, this chapter will briefly
discuss which (legal) measures the government could take to improve the access to
and the use of its data by private healthcare actors, whilst respecting the societal
interests and the rights of individuals which the legal framework on access and
re-use of government data aims to protect.
For the purpose of this research, the concept “government” should be understood
as “the public authorities falling under the scope of the legislation regarding access

6
See Opendata.vlaanderen.be. http://opendata.vlaanderen.be/dataset?q=gezondheid. Last acces-
sed 7 August 2018.
7
See Dataportaal van de Nederlandse overheid. https://data.overheid.nl/. Last accessed 7 August
2018.
8
For instance, the Dutch open data website classifies only one dataset as a “high value dataset”.

to and re-use of government data”.9 “Government data” is defined as “data which
the government has legitimately collected in view of the fulfilment of its tasks of
public interest”. Throughout this contribution, special attention is paid to the pos-
sibility of access to and use of “dynamic data”,10 “high value data”11 and “research
data”,12 as far as these data can be considered as “government data” and can be
used for the benefit of healthcare.
To answer the aforementioned research questions, this chapter will examine
European, as well as Belgian and Dutch legislation on access to government
information and the re-use of public sector information.13
In the first part, the right of access to government data is analysed. At a European
level, this right is regulated in the Council of Europe Convention on Access to
Documents of 18 June 200914 (hereafter Access Convention), and is also guaran-
teed—at least to a certain extent—by Article 10 ECHR. At a national level, the
aforementioned right has its legal basis in Article 32 of the Belgian and Article 110
of the Dutch Constitution. This right is further elaborated in more detailed legis-
lation such as the Belgian Federal Act of 11 April 1994 relating to access to
government information15 (hereafter Federal Access Act), the Flemish Act of 26
March 2004 relating to access to government information16 (hereafter Flemish
Access Act), and the Dutch Act of 31 October 1991 containing regulations gov-
erning public access to government information17 (hereafter Dutch Access Act).

9
This scope is discussed in Sects. 11.2.2 and 11.3.3.
10
“Dynamic data” are defined as “documents in an electronic form, subject to frequent or realtime
updates” in Article 2.6 of the PSI Directive Revision Proposal.
11
According to Article 2.6 of the PSI Directive Revision Proposal “high value data” are “doc-
uments the re-use of which is associated with important socio-economic benefits, notably because
of their suitability for the creation of value-added services and applications, and the number of
potential beneficiaries of the value-added services and applications based on these datasets”. It is
the European Commission which can adopt by delegated act a list of such datasets. Data on
healthcare could be qualified as such “high value data”.
12
Article 2.7 of the PSI Directive Revision Proposal defines “research data” as “documents in a
digital form, other than scientific publications, which are collected or produced in the course of
scientific research activities and are used as evidence in the research process, or are commonly
accepted in the research community as necessary to validate research findings and results”.
13
This chapter analyses Belgian legislation, because the author is most familiar with it. It also
discusses Dutch legislation, since its administrative law is well thought out and not as fragmented
as its Belgian counterpart.
14
Council of Europe Convention of 18 June 2009 on Access to Official Documents, CETS no.
205.
15
Federal Act of 11 April 1994 relating to access to government information, BS 30 June 1994.
16
Flemish Act of 26 March 2004 relating to access to government information, BS 1 July 2004.
17
Act of 31 October 1991 containing regulations governing public access to government infor-
mation, Stb. 1991, no. 703.

The chapter continues with an examination of the Directive 2003/98/EC of 17
November 2003 on the re-use of public sector information18 (hereafter PSI19
Directive). If a non-governmental healthcare actor wants to re-use government data,
it also has to comply with the legal requirements on the re-use of public sector
information.20 Belgium transposed the PSI Directive into the Federal Act of 4 May
2016 on the re-use of public sector information21 (hereafter Federal Re-use Act), the
Flemish Act of 27 April 2007 on the re-use of public sector information22 (hereafter
Flemish Re-use Act), and the Flemish Act of 18 July 2008 on the electronic
exchange of administrative information (hereafter the Flemish E-government
Act).23,24 In the Netherlands, there is the Dutch Act of 24 June 2015 laying down
rules on the re-use of public sector information25 (hereafter Dutch Re-use Act).
Special attention is paid to the recent proposal26 of the European Commission of 25
April 2018 to revise the PSI Directive (hereafter PSI Directive Revision Proposal).

11.2 Access to Government Data

11.2.1 An International and Constitutional Right

The right of access to government data guarantees that government information is publicly available, unless other rights or legitimate interests need to be protected. It
constitutes an international right, which is further developed in national legislation.

18. Directive 2003/98/EC of 17 November 2003 on the re-use of public sector information, OJ L 345, 31 December 2003, pp. 90–96, which entered into force on 31 December 2003. It was revised by Directive 2013/37/EU of 26 June 2013 amending Directive 2003/98/EC on the re-use of public sector information, OJ L 175, 27 June 2013, pp. 1–8, which entered into force on 17 July 2013.
19. PSI = Public Sector Information.
20. Rijksoverheid (2016) Handleiding Wet hergebruik van overheidsinformatie, p. 9. https://open-overheid.nl/wp-content/uploads/2016/05/WEB_90943_BZK_Handleiding-Who-versie2.pdf. Last accessed 7 August 2018.
21. Federal Act of 4 May 2016 on the re-use of public sector information, BS 3 June 2016.
22. Flemish Act of 27 April 2007 on the re-use of public sector information, BS 5 November 2007.
23. Flemish Act of 18 July 2008 on the electronic exchange of administrative information, BS 29 October 2008. This act will not be discussed in this contribution, since it regulates the exchange of information between governmental bodies and not between the government and private actors.
24. Both acts were amended by the Flemish Act amending the Act of 27 April 2007 on the re-use of public sector information and the Flemish Act of 18 July 2008 on the electronic exchange of administrative information of 12 June 2015, BS 30 June 2015.
25. Act of 24 June 2015 laying down rules on the re-use of public sector information, Stb. 2015, no. 271.
26. Proposal (European Commission) for a revision of the Directive 2003/98/EC on the re-use of public sector information, 25 April 2018, COM(2018).

The European Court of Human Rights27 developed case law in which it recognised that the right to freedom of expression, guaranteed by Article 10 ECHR,28 implies—at least in certain circumstances—a right of access to State-held information.29 Fifteen European countries, including Belgium, also signed the Access Convention. Its Article 2 states: “Each Party shall guarantee the right of everyone, without discrimination on any ground, to have access, on request, to official documents held by public authorities.”30 The Convention has not yet entered into force though, since a total of ten ratifications is needed.31 It would be the first international legal instrument, binding upon Member States, explicitly recognising a general right of access to official documents held by public authorities.32
In Europe, access to government data is guaranteed by multiple national con-
stitutions. It can be found in Article 32 of the Belgian Constitution33 and Article
110 of the Dutch Constitution.34

27. European Court of Human Rights = ECtHR.
28. Article 10.1 ECHR (European Convention on Human Rights) states: “Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.”
29. ECtHR 8 November 2016, no. 18030/11, Magyar Helsinki Bizottság v. Hungary; ECtHR 25 June 2013, no. 48315/06, Youth Initiative for Human Rights v. Serbia; ECtHR 29 May 2009, no. 31475/05, Kenedi v. Hungary; ECtHR 14 April 2009, no. 37374/05, Társaság a Szabadságjogokért v. Hungary; ECtHR 10 July 2006, no. 19101/03, Sdružení Jihočeské Matky v. Czech Republic. In Magyar Helsinki Bizottság v. Hungary (para 156), the ECtHR considers “that Article 10 does not confer on the individual a right of access to information held by a public authority nor oblige the Government to impart such information to the individual. However, […] such a right or obligation may arise, firstly, where disclosure of the information has been imposed by a judicial order which has gained legal force […] and, secondly, in circumstances where access to the information is instrumental for the individual’s exercise of his or her right to freedom of expression, in particular “the freedom to receive and impart information” and where its denial constitutes an interference with that right.”
30. Own translation.
31. Until now, only nine of the 15 States which signed have ratified the Convention. See Council of Europe (2018) Chart of signatures and ratifications of Treaty 205. www.coe.int/en/web/conventions/full-list/-/conventions/treaty/205/signatures?p_auth=ylsG4jgX. Last accessed 7 August 2018.
32. Explanatory Report to the Council of Europe Convention on Access to Official Documents, CETS no. 205, p. 1.
33. “Everyone has the right to consult any administrative document and to obtain a copy, except in the cases and conditions stipulated by the laws, federate laws or rules referred to in Article 134.” (own translation).
34. “In the exercise of their duties government bodies shall observe the right of public access to information in accordance with rules to be prescribed by Act of Parliament.” (own translation).

Due to the Belgian federal system, Belgium has several relevant acts on the
general35 right of access to government data.36 This contribution will only focus on
the Federal Access Act and the Flemish Access Act. In the Netherlands, the Dutch
Access Act specifies how this right can be exercised.

11.2.2 A “Governmental” Obligation to Allow Access

The duty to open up data lies with the government. Article 1.2, a, (i) of the Access Convention uses the term “public authorities”: “government and administration at national, regional and local level; legislative bodies and judicial authorities insofar as they perform administrative functions according to national law; natural or legal persons insofar as they exercise administrative authority”. The Parties to the Convention can also include other instances: “legislative bodies as regards their other activities; judicial bodies as regards their other activities; natural or legal persons insofar as they perform public functions or operate with public funds, according to national law”.37

35. Besides legislation on the general access right to government data, in certain matters specific legislation exists. In case of conflict, Article 13 of the Federal Access Act states that the legislation granting broader access rights prevails. If the specific legislation limits the access, the situation is more complex and the doctrine does not always agree on which legislation applies (see Schram 2018, pp. 257–261).
36. Federal Act of 12 November 1997 relating to access to government information in the provinces and municipalities, BS 19 December 1997; Federal Act of 11 April 1994 relating to access to government information, BS 30 June 1994; Flemish Act of 26 March 2004 relating to access to government information, BS 1 July 2004; Act of the Walloon Region of 30 March 1995 relating to access to government information, BS 28 June 1995; Act of the French-speaking Community of 22 December 1994 relating to access to government information, BS 31 December 1994; Act of the German-speaking Community of 16 October 1995 relating to access to government information, BS 29 December 1995; Act of the Brussels-Capital Region of 30 March 1995 relating to access to government information, BS 23 June 1995.
37. Article 2, b, (ii) Access Convention.

Although different “government” concepts are used in the Belgian and Dutch legislation,38 in a healthcare context this implies that the Ministries of Health39 and their advising and consultative bodies fall under the scope of the legislation on access to government data. Also, instances such as the Belgian National Institute for Health and Disability Insurance (RIZIV)40 and the Dutch Healthcare Authority (NZa)41 should comply. The same applies to the Belgian Health Care Knowledge Centre (KCE).42
In the Netherlands, hospitals do not fall under the scope of the legislation, except for the eight University Medical Centers (UMCs). In 2001, the Dutch administrative court ruled that (the board of) an academic hospital of a public university has a publication duty as a consequence of the Dutch Access Act.43 However, if citizens wish to receive information from other Dutch hospitals, they can always consult the inspection reports concerning the diverse healthcare providers of the Health Care Inspectorate and Youth Care Inspectorate (IGJ), which are published online or can be consulted upon request.44 Due to legislation containing reporting obligations for care providers, the Dutch government receives, amongst other things, information on their

38. The Belgian Federal Access Act uses the concept “administrative authority” (Article 1, limb 2, 1° Federal Access Act), which is hard to delineate since it is not defined in the law, but has been shaped by case law of the Council of State (Raad van State or abbreviated RvS; the highest administrative court in Belgium) and the Court of Cassation (Hof van Cassatie or abbreviated Cass.) (see De Somer 2011–2012, p. 1638; Opdebeek and De Somer 2017, p. 301; Schram 2018, pp. 92–105). The jurisprudence uses organic as well as functional criteria. Article 3, 1° of the Flemish Access Act introduced the broader notion “administrative body” (see Schram 2018, p. 105). It refers to “a) a legal person which is founded by or by virtue of the Constitution, or of a federal or regional act; b) a natural person, a group of natural persons, a legal person or a group of legal persons determined and controlled in their functioning by an instance in the meaning of a); c) a natural person, a group of natural persons, a legal person or a group of legal persons, when they are entrusted with the execution of a task of public interest by an instance in the meaning of a) or in so far they look after a task of public interest and take decisions that are binding on third parties.” The Dutch Access Act also uses the term “administrative body” (Article 1, a Dutch Access Act).
39. See for example CTB, opinion of 16 April 2012, no. 2012/27 (on an ombuds report of a hospital in the hands of the Belgian Federal Public Service of Health) and CTB, opinion of 13 July 2009, no. 2009/46 (on the reports concerning hand hygiene in hospitals in the possession of the Belgian Federal Public Service of Health). The CTB (Commissie voor Toegang tot Bestuursdocumenten) is an independent advisory federal administrative body in the context of access to government information legislation. See also the Belgian parliamentary documents (known as Parl. St.): Parl. St. Kamer, 1992–1993, no. 1112/1, p. 9.
40. See CTB, opinion of 1 February 2016, no. 2016/18 (on the report to the advising doctor of the health insurance fund). See also Parl. St. Kamer, 1992–1993, no. 1112/1, p. 9.
41. See the collection of decisions as a result of the Dutch Access Act on the website of the NZa (NZa site. https://puc.overheid.nl/nza/. Accessed 7 August 2018).
42. See CTB, opinion of 29 July 2013, no. 2013/30 (on hospital data concerning cardiological interventions).
43. ABRvS 26 February 2001, ECLI:NL:RVS:2001:AN6832. The Raad van State, abbreviated RvS, is the Dutch Council of State.
44. IGJ (2018) Wat maakt IGJ openbaar? www.igj.nl/onderwerpen/openbaarmaking. Accessed 7 August 2018.

quality of care.45 The NZa also publishes information on the treatment programs of
all Dutch hospitals and mental health and addiction care providers which they are
legally obliged to share with the Dutch government.46
In Belgium, only the public psychiatric care centers47 are subject to the afore-
mentioned legislation. Information on other hospitals is indirectly available through
governmental reports on their functioning. Belgian hospitals have a legal obligation
to report certain data, such as “the minimal hospital data” (e.g. administrative
patient data, staff data, nursing data, clinical data) to the government.48 However,
because of voluntary initiatives in the healthcare sector, the government also has other hospital data at its disposal.49 Although hospitals exercise a task of public interest,
i.e. provide healthcare services, they cannot be qualified as “administrative body”
within the meaning of Article 3, 1°, (c) of the Flemish Access Act. It refers to
persons “entrusted” with the execution of a task of public interest, or persons who
look after a task of public interest and who can take decisions that are binding on
third parties. Hospitals voluntarily took up their task of public interest, as a result of which they were recognised by the government. They were not “entrusted” with it by means of a governmental decision.50 Neither are hospitals able to take decisions
that are binding on third parties.

11.2.3 Access upon Request versus Proactive Publication

A distinction is made between the duty of the government to provide information upon request of the citizen and proactive publication of information. The latter means that the government has to inform the citizens on its own initiative. The Belgian and Dutch constitutions only include the first duty, but the implementing acts also

45. See for example Article 66d, 2 Act of 16 June 2005, Stb. 2005, no. 649 (the Dutch Care Insurance Act).
46. This information sharing system is known as DIS (“dbc-informatiesysteem”): see NZa (2018) Over DIS. www.opendisdata.nl/dis/over. Accessed 7 August 2018.
47. See VBOB (2018) Annex to the annual report 2016–2017, p. 3. https://www.vlaanderen.be/nl/publicaties/detail/jaarverslag-beroepsinstantie-inzake-de-openbaarheid-van-bestuur. Accessed 7 August 2018. The VBOB (Vlaamse Beroepsinstantie Afdeling Openbaarheid van Bestuur) is the Flemish appeal body in the context of the Flemish access to government information legislation.
48. See Royal Decree of 10 April 2014, BS 28 May 2014; Royal Decree of 28 March 2013, BS 2 April 2013; Royal Decree of 27 April 2007, BS 10 July 2007.
49. In the context of the project known as “VIP” (“Vlaams Indicatorenproject voor Patiënten en Professionals”), which started in 2010, most of the Flemish hospitals voluntarily measure care quality indicators.
50. Opdebeek and De Somer 2017, p. 319.

regulate the more far-reaching proactive information duty.51 The Access Convention pays attention to this distinction as well.52
In what follows, it will be demonstrated that for big data healthcare applications to
work, the duty of the government to grant access to data upon request is far from ideal.
In this context, it is recommended to focus more on the proactive publication duty.

11.2.4 Access Limitations

11.2.4.1 Which Information Can Be Accessed?

The Access Convention guarantees access to “official information”.53 The Belgian legislation applies to “administrative documents”,54 whereas the Dutch Access Act
regulates access to “documents” regarding “administrative matters”.55
The Belgian Constitutional Court judged that the term “administrative documents” should be interpreted broadly.56 It refers to all information the government “disposes of”.57 This is information which physically lies with the government,58 but also information laid down in an archive59 or information the government has the legal right to claim from a private actor.60 An example of the latter is information hospitals are legally obliged to provide. However, the government is not obliged to address itself to a private actor which possesses the concerned information if this actor has no legal duty to share it.61 In the latter case, due to the private status of the data controller, access will be subject to different legislation.
The content of the information can vary: it concerns not only “unilateral administrative legal acts”, but also statistics, contracts, preparatory documents,

51. Articles 4–12 versus 2–3 Federal Access Act, Articles 7–27 versus 28–34 Flemish Access Act, Articles 3–7 versus 8–9 Dutch Access Act.
52. Articles 2–9 versus 10 Access Convention.
53. Article 1.2, b Access Convention.
54. Article 1, 2° Federal Access Act and Article 3, 4° Flemish Access Act.
55. Article 1, a Dutch Access Act.
56. Arbitragehof 25 March 1997, no. 17/97. The Arbitragehof is the Constitutional Court of Belgium and has been known as Grondwettelijk Hof, abbreviated GwH, since 2007.
57. Article 1, 2° Federal Access Act, Article 3, 4° Flemish Access Act and Article 1, a Dutch Access Act.
58. Parl. St. Kamer, 1992–1993, no. 1112/1, pp. 9–10; Explanatory report to the Dutch Access Act, pp. 22–23. However, it does not apply to information which the government only obtained coincidentally or temporarily (Schram 2008, p. 148).
59. Parl. St. Kamer, 1992–1993, no. 1112/1, p. 22; Schram 2018, p. 307; Schram 2004, pp. 3–34. In this regard, the explanatory report to the Dutch Access Act specifies that the information should also be intended for the government (pp. 22–23).
60. Schram 2018, pp. 165 and 307.
61. RvS 16 January 2015, no. 229.828, cvba Gerhanko; Opdebeek and De Somer 2017, p. 458.

reports and studies, etc.62 Furthermore, the form of the information does not matter.
It can be written text material, pictures, sound or video files, drawings, data
included in automated data processing, statistics, etc.63 This description corre-
sponds with the “variety” characteristic of big data.
The access duty only applies to existing information.64 The information should
be “materialised” on a data carrier.65 This is for instance not the case if data is not
saved and is only temporarily present on a computer.66 If the data is somehow
materialised though, it is not relevant whether it is structured or not. This implies that for
instance the healthcare data of the Belgian Fund for Medical Incidents67 could68 fall
under the scope of the governmental access duty. However, one cannot ask the
government to process or analyse its data.69 Indeed, there is no reason why private
healthcare actors should not do this themselves, especially since they know best
what kind of insights they are looking for.
It does not matter if the government is not the author of the information.70
Administrative documents can be created by a private person, who keeps his/her
copyright.71,72
The proactive governmental publication duty is not as far-reaching as the access upon request duty. Article 10 of the Access Convention states that the government shall, at
its own initiative and where appropriate, take the necessary measures to publish
official documents it possesses to promote the transparency and efficiency of public
administration and to encourage informed participation of the public in matters of

62. VBOB 4 July 2005, no. OVB/2005/22 (preparatory acts); CTB, opinion of 5 May 1998, no. 98/51, 98/63 and 98/65 (internal documents); Opdebeek and De Somer 2017, pp. 457–458.
63. Parl. St. Kamer 1992–1993, no. 1112/1, pp. 11–12; Parl. St. Kamer 1996–1997, no. 871/5, p. 4; Explanatory report to the Flemish Access Act, Parl. St. Vl. Parl. 2002–2003, no. 1732/1, p. 13; Explanatory report to the Dutch Access Act, pp. 22–23; ARRvS 5 March 1982, ECLI:NL:RVS:1982:AM6426 (on audio tapes); RvS 7 April 2011, no. 212.547, cvba Verbruikers Unie Test Aankoop (on statistical data); VBOB 22 November 2004, no. OVB/2004/42 (on an audio recording).
64. Parl. St. Kamer 1996–1997, no. 871/5, p. 4; RvS 16 January 2015, no. 229.828; CTB, opinion of 21 February 2005, no. 2004/103; CTB, opinion of 30 August 2002, no. 2002/79; CTB, opinion of 22 January 1996, no. 96/2 (no governmental duty to draw up new documents); CTB, opinion of 13 October 1995, no. 95/102 (no duty for the government to transform information into statistics).
65. Delvaux 1999, p. 26; Jongen 1995, p. 781.
66. Data merely present in the random access memory (RAM) of a computer is usually volatile.
67. Fonds voor Medische Ongevallen or FMO.
68. Provided legal exceptions or refusal grounds of the legislative framework on access to government information do not apply.
69. Article 20, § 1, limb 1 Flemish Access Act; ARRvS 31 August 1993, ECLI:NL:RVS:1993:AN3429 (the Dutch Access Act entails no translation duty for the Dutch government).
70. Explanatory report to the Dutch Access Act, pp. 22–23; RvS 21 October 2013, no. 225.162, gemeente Schaarbeek; Opdebeek and De Somer 2017, p. 457.
71. Further on in this contribution (Sect. 11.2.4.6), we focus in more detail on the implications of copyright for access to government data.
72. Opdebeek and De Somer 2017, p. 301; Van Eechoud 2008, p. 96.

general interest. An example of such a matter of general interest is the organisation of efficient and high-quality healthcare.
In Belgium, the proactive publication duty concerns general information on the
policy, regulation and services of the government.73 The Dutch Access Act determines that the duty applies to information about the governmental policy, its preparation and execution, if this information is in the interest of democratic governance.74
Clearly, the government has a broad margin of appreciation in determining which information falls under the scope of the active publication duty. Moreover, the limited information which does fall under the scope of this duty is not very valuable for big data purposes in healthcare. Of course, the government can always decide to publish more information than legally required. However, the publicity exceptions that apply to access upon request, which are discussed later in this contribution, should always be respected.75
In view of the foregoing, to encourage the use of big data applications by
non-governmental healthcare actors, it is recommended to broaden the proactive
publication obligation to the extent that documents falling under the scope of the
access upon request duty should be actively published. Evidently, information to
which access could be refused based on the refusal grounds should not be included.
It could also be useful to include a governmental duty to provide an overview of
relevant (healthcare) data it has at its disposal, which would allow private
(healthcare) actors to be informed of the kind of available information and its
location.

11.2.4.2 Which Persons Can (Request) Access (to) Government Data?

In case of proactive publication, obviously, anyone can access the published government data. If a request is needed, every person, natural as well as legal, can draw up such a request. In principle, no proof of interest is needed.76 For public
legal persons though, a proportionate link with their competencies is required.77
In Belgium, if information is qualified as “personal”, an interest is needed as a
condition to accept the request.78 Personal information is not necessarily the same
as information falling under the scope of private life.79 It is information which

73. Article 2 Federal Access Act and Article 29 Flemish Access Act.
74. Article 8.1 Dutch Access Act.
75. Article 28, § 2 Flemish Access Act and Article 10 Dutch Access Act.
76. Articles 2 and 4.1 Access Convention, Article 4 Federal Access Act, Article 17, § 2 Flemish Access Act and Article 3 Dutch Access Act.
77. RvS 21 October 2013, no. 225.162, gemeente Schaerbeek.
78. Article 4, limb 2 Federal Access Act and Article 17, § 2 Flemish Access Act.
79. RvS 3 March 2009, no. 191.067, Asselman.

concerns a value judgment or an appreciation of a natural person80 whose name is mentioned, or who is easy to identify, or a description of his/her behavior that apparently can cause him/her harm.81,82 The Federal Access Act does not specify what this interest exactly is,83 but according to its preparatory documents it is the interest required for the admissibility of an annulment request with the Council of State.84 According to the Flemish Access Act, an interest is present if the applicant can prove he/she is legally directly and personally affected by the information, the decision the information is about or the decision in preparation of which the document was drafted.85 If the information concerns the applicant him-/herself, no such proof is needed.86 The jurisprudence confirmed that the latter also applies at the federal level.87 Consequently, “personal” information cannot be used for big data applications of non-governmental healthcare actors. “A value judgment” or an “appreciation” of a natural person does not seem valuable in this context though.

11.2.4.3 Formalities and Processing of the Access Request

The Access Convention determines that access request formalities should be proportionate: they cannot exceed what is essential to process the request.88 Applicants should have the right to remain anonymous, except if their identity is essential to process the request.89
A Belgian access request should be written, precise (i.e. identifying the subject
or if possible the document(s)) and addressed to the appropriate government
instance.90 The Flemish Access Act also requires an indication of the form91 in
which the applicant prefers to receive the information and his/her name and
address.92 Article 3 of the Dutch Access Act does not require a written request. The
applicant should again specify the administrative subject he/she wishes to receive
information about, or the requested document.93

80. Legal persons are excluded (Boes 1996, p. 19).
81. The “harm” criterion is only mentioned in the Federal Access Act.
82. Article 1, 3° Federal Access Act and Article 3, 6° Flemish Access Act.
83. Article 4, limb 2 Federal Access Act.
84. Parl. St. Kamer 1992–1993, no. 1112/13, p. 12; Parl. St. Kamer 1996–1997, no. 871/5, p. 5. The interest is interpreted in a broad sense (on the required interest: Kaiser and Gourdin 2015, pp. 44–58; Opdebeek and De Somer 2017, pp. 564–568).
85. Article 17, § 2 Flemish Access Act.
86. Article 17, § 2 Flemish Access Act.
87. RvS 20 March 2006, no. 156.628; CTB, opinion of 19 February 2001, no. 2001/11.
88. Article 4.3 Access Convention.
89. Article 4.2 Access Convention.
90. Article 5 Federal Access Act and Article 17, § 1 and § 3 Flemish Access Act.
91. I.e. the consultation of the document, information on the document or the provision of a copy.
92. Article 17, § 1 Flemish Access Act.
93. Article 3.2 Dutch Access Act.

Furthermore, according to the Access Convention, the government shall help identify the requested document, as far as reasonably possible.94 It shall refer the
applicant to the competent public authority if necessary.95
Indeed, the Belgian and Dutch legislation stipulate that, if the applicant did not address his/her request to the correct governmental instance, it should be referred to the competent one if reasonably possible.96 Article 7, limb 1 of the Flemish Access Act adds that the government should be helpful.
However, there is no obligation for the government to provide an overview of all the documents/data in its possession or to make an inventory.97 Of course, this makes it harder for private healthcare actors interested in setting up big data applications to find out where relevant data can be found. Therefore, it would be interesting to consider such an overview or inventory obligation in the context of the abovementioned proactive publication duty of the government.98 The legislation regarding re-use of public sector information partially makes up for this lack, since inventory provisions are present there.99
A decision on the access request shall be made as soon as possible or within a
reasonable time limit specified beforehand.100 The Belgian federal government
should communicate its decision promptly. In case of refusal or delayed access, this should be communicated within 30 days. An extension of the period by 15 days is possible in the latter case.101 The Flemish Access Act foresees a time limit of 15 days,102 whereas Article 6 of the Dutch Access Act foresees one of four weeks, with a renewal option of four weeks.
Finally, the Flemish legislation requires the consent of the person from whom the information originates, if the request concerns private life,103 confidential commercial or industrial information104 or information from third parties which they were not obliged to provide and which was explicitly labelled as confidential.105,106 It is obvious that this requirement hampers big data, but it is necessary and proportionate to the aim of protecting private life and confidential information.

94. Article 5.1 Access Convention.
95. Article 5.2 Access Convention.
96. Article 5 Federal Access Act, Article 17, § 3 Flemish Access Act and Article 4 Dutch Access Act. See RvS 8 February 2000, no. 85.178.
97. RvS 24 June 2014, no. 227.809, Verrycken.
98. To the extent reasonably possible.
99. Cf. infra (Sect. 11.3.4.5).
100. Article 5.4 Access Convention.
101. Article 6, § 5 Federal Access Act.
102. Article 20, § 2 Flemish Access Act.
103. Article 17, § 1, limb 1, 1° Flemish Access Act.
104. Article 17, § 1, limb 1, 2° Flemish Access Act.
105. Article 17, § 1, limb 1, 3° Flemish Access Act.
106. The protection of private life and confidential information can be a ground for access refusal in the Federal, Flemish and Dutch Access Act (see further on in this contribution in Sect. 11.2.4.7).

Clearly, the request procedure constitutes an obstacle for big data analytics in
healthcare. It can reduce efficiency, since it is time-consuming to draw up a request, to
specify what kind of data is needed and in which form, to identify the correct gov-
ernment instance and finally to wait for the access decision. In some cases, the request
procedure can even render big data applications ineffective. This is especially true if one
needs to access dynamic data, which are subject to frequent or even real-time changes.
This kind of data will often already be outdated when access is granted.
The legislator should consider simplifying and/or shortening the request pro-
cedure. It would be useful to foresee a specific request procedure which allows
real-time access to dynamic data. It is also recommended to broaden the duty of the
government to actively share data, to the extent that it concerns data which now fall under the scope of the access upon request duty and to which access cannot be refused based on the access refusal grounds as foreseen in the law. Specifically, for the use of big data in healthcare, the request procedure could be abandoned for well-defined data categories (such as research and high value data) when these are accessed by a limitative list of private healthcare actors.

11.2.4.4 Access Forms

Article 6.1 of the Access Convention determines that “access” means an applicant
can consult the original or a copy, or request a copy in any available form unless
this appears unreasonable. In this context, Article 6.3 allows Parties to refer to more
easily accessible alternatives.
The Belgian legislation107 specifies that access consists of the consultation of the document, information on the document or the provision of a copy. Information should be made available in the requested form and, if this is not possible, in another form.108 In the Netherlands, access implies the delivery of a copy or of the literal content in another form, the notification of the content, the provision of an extract or a summary, or the provision of information about the document.109 Article 7.2 of the Dutch Access Act imposes that the information should be provided in the requested form, unless this is not reasonable or the information is already available in another form which is easily accessible to the public.
Obviously, the fact that it will not always be possible to receive information in the requested form will reduce the efficiency or even the effectiveness of big data tools. Such an access limitation is proportionate though, when the advantage of accessing the information to improve healthcare does not outweigh the burden for the government. One should keep in mind that the volume and variety of big data entail complications for the consultation of the data and/or the delivery of a copy.

107. Article 4, limb 1 Federal Access Act and Article 7, limb 2 Flemish Access Act.
108. See Article 20, § 1 of the Flemish Access Act; RvS 23 November 2010, no. 209.086, Beuls; VBOB 13 January 2010, no. OVB/2009/172.
109. Article 7 Dutch Access Act.

11.2.4.5 Access Charges

The Access Convention states that consultation should be free of charge on the premises of a public authority, with the exception of archives and museums. A fee for a copy is allowed if it is reasonable, does not exceed the costs and the tariffs are published.110
As a matter of fact, the Belgian and Dutch legislation foresee that a reasonable fee can be determined to compensate the costs of a copy.111 However, other charges due to other legislation are still possible.112 For big data purposes, a digital copy normally suffices. In principle, such a copy does not entail any costs worth mentioning for the government. Therefore, it is recommended that the legislator introduce that, by default, digital copies should be provided for free. The legislator should especially take care to keep charges as low as possible, or even abolish them, if the access concerns (healthcare) research data, or data with a high value for healthcare, requested by private healthcare actors who use big data for the benefit of healthcare.

11.2.4.6 Copyright Protection

According to the Belgian Federal Access Act,113 the consultation of copyright pro-
tected documents is allowed, but for a copy the consent of the author is required.114
The conflict with the copyright of a third party is not explicitly regulated in the Dutch
Access Act. However, in the Netherlands the same principle as in Belgium
applies.115,116 Evidently, the consent requirement for a copy can be a problem for big
data applications, but it is legitimate to protect the interests of the author.
Documents created by the Belgian or Dutch government fall within the “public domain”. They are not copyright protected if they concern “official” documents, i.e. legislation and its preparatory documents, jurisprudence and decisions of the administration.117 Other information created by the government does fall under the

110. Article 7 Access Convention.
111. Article 12 Federal Access Act, Article 20, § 3 Flemish Access Act and Article 12 Dutch Access Act. For instance, the Belgian municipality of Lier asks 0,10 EUR for a black and white copy and 0,50 EUR for a colour copy. A digital copy is free (www.lier.be/IK_WIL/Ik_wil_aanvragen/Openbaarheid_van_bestuur_aanvraag. Accessed 26 September 2018).
112. Opdebeek and De Somer 2017, p. 456.
113. Article 9, limb 1 and 2 Federal Access Act.
114. See also Jongen 1994, pp. 303–304.
115. See the Dutch Act of 23 September 1912 with regard to copyright, Stb. 1912, no. 308.
116. Van Eechoud 2008, p. 96.
117. Article XI.172, § 2 Belgian Code of Economic Law (the copyright provisions were inserted in the Belgian Code of Economic Law by the Federal Act of 19 April 2014, BS 12 July 2014) and Article 11 Dutch Copyright Act.

scope of the copyright legislation, which implies that the government’s consent is needed for a copy.118 Article 15b of the Dutch Copyright Act specifies that it is nevertheless possible to publish and copy this information, unless its copyright is specifically reserved.

11.2.4.7 Access Refusal Grounds

In case of conflict with other fundamental interests, information will possibly not
fall under the duty of the government to grant access upon request.119 Article 32 of
the Belgian Constitution and Article 110 of the Dutch Constitution determine that
publicity exceptions need to be enacted in an Act of Parliament. The exceptions are
exhaustive and should be interpreted in a restrictive120 way.121
Article 3 of the Access Convention states that limitations to the access right need to be set down in a law, be necessary in a democratic society and be proportionate to the interests they aim to protect. Paragraph 1 of the article contains an exhaustive list of legitimate interests, such as commercial and other economic interests or privacy and other legitimate private interests. Access to information can be refused if its disclosure would harm, or would be likely to harm, an interest mentioned in the first paragraph, unless there is an overriding public interest in disclosure.122
The Dutch Access Act contains several imperative content-related refusal grounds.123 In Belgium, there are not only imperative content-related refusal grounds,124 but also facultative formal ones.125,126 In the latter case the government has a discretionary power, whereas in the first case refusal is obligatory.127 With regard to the imperative grounds, a distinction is made between relative and absolute grounds.128 Relative grounds require a balancing exercise between the interest of public disclosure and a particular interest mentioned in the law.129 Absolute grounds, on the other hand, should lead to an automatic access refusal in

118. Van Eechoud 2008, pp. 95–96; Vanhees 1998, p. 22.
119. Opdebeek and De Somer 2017, p. 458.
120. See Article 10 Flemish Access Act; RvS 27 June 2001, no. 97.056, Tassin; RvS 16 January 1998, no. 70.844, Duez; RvS 2 October 1997, no. 68.610, Delwart; RvS 18 June 1997, no. 66.860, Matagne.
121. RvS 3 October 2011, no. 215.506, Baumwald; RvS 29 March 2010, no. 202.459, Sevenhans; RvS 22 June 2006, no. 160.433.
122. Article 3, § 3 Access Convention.
123. Article 10 Dutch Access Act.
124. Article 6, § 1–2 Federal Access Act and Articles 13–14 Flemish Access Act.
125. Article 6, § 3 Federal Access Act and Article 11 Flemish Access Act; RvS 16 January 1998, no. 70.844.
126. Tijs 2012, p. 314.
127. Opdebeek and De Somer 2017, p. 466; Tijs 2012, p. 314.
128. Opdebeek and De Somer 2017, pp. 466–467; Tijs 2012, p. 315.
129. Opdebeek and De Somer 2017, pp. 466–467; Tijs 2012, p. 315.

case publicity would damage the concerned interest.130 This automatism is criticised though. Some authors emphasise that there should always be a balancing exercise, even if it is only a limited one.131 The jurisprudence of the European Court of Human Rights with regard to Article 10 ECHR and its relationship with Article 8 ECHR,132 which emphasises that the right to private life does not necessarily prevail over the right to publicity, strengthens this view.133
This chapter does not discuss the complete list of refusal grounds. The ones
which seem most relevant in the context of big data in healthcare are singled out:
incompleteness of the data, unreasonable or (almost) impossible requests, vague
and general requests, the protection of private life, the protection of governmental
economic or financial interests, the protection of the confidentiality of commercial
or industrial information, obligations of secrecy and the protection of fundamental
rights and freedoms of others.
Particularly interesting for big data are the Belgian facultative refusal grounds.
First, the government can refuse access because the requested documents are not
finished or incomplete, which can lead to wrong insights and even decisions based on
these insights.134 Since developers of big data applications have lately put great emphasis on the collection of correct, high-quality data, this seems a legitimate refusal ground.
Another reason the government can invoke is that access is unreasonable or practically impossible to provide.135 An access request can also be too vague or too
general and consequently refused.136 In short, if the advantage of being able to
access certain information does not outweigh the government’s workload, access
can be refused. In a big data context, the last two refusal grounds could certainly
emerge. First, it does not always seem reasonable or practically possible to demand
access to large quantities of data and/or data in diverse locations.137 Also, a big data
request can easily be too general or vague.138 It is important though that, considering the

130. Tijs 2012, p. 315.
131. Schram 2005, pp. 578–586.
132. ECtHR 8 November 2016, no. 18030/11, Magyar Helsinki Bizottság v. Hungary.
133. Voorhoof 2016, p. 5.
134. Article 6, § 1, 1° Federal Access Act and Article 11, 2° Flemish Access Act.
135. Article 6, § 1, 3° Federal Access Act and Article 11, 1° Flemish Access Act.
136. Article 6, § 1, 4° Federal Access Act and Article 11, 1° Flemish Access Act. The Dutch Access Act states in its Article 3.4 that the government can ask for more details if a request is too vague or imprecise.
137. CTB, opinion of 9 June 1997, no. 97/32 (on the request to access a large quantity of documents: if this is materially almost impossible, the access can be denied or limited to a consultation).
138. VBOB 7 January 2011, no. OVB/2010/295 (on a request to deliver all reports of “B-inspections” of a range of bridges between 2008 and 2010); VBOB 24 December 2008, no. OVB/2008/174 (on the request of large quantities of information from the same city for a column in a local magazine, which hampers the normal functioning of this city); VBOB, 28 July 2005, no. OVB/2005/53 (on the request of more than 10,000 pages).

rise of big data and advanced data analytics technology, the government has adequate infrastructure to allow access by third parties to large volumes of information, even if these are stored in different locations. Consequently, the last two
facultative refusal grounds should not be lightly invoked.
A first relevant imperative refusal ground for big data is the protection of private life.139 Personal data cannot be communicated, unless the person concerned consents. It is categorised in Belgium, as well as in the Netherlands, as an absolute refusal ground. So, if there is no consent, it will be hard for big data initiatives to gain access to these data. This is of concern for big data in a healthcare context, given that the vast majority of the relevant data are personal data. However, access can exceptionally be granted even though there is no consent and private life is at stake. In this context, the Belgian Commission on Access to Government Information has allowed consultation and analysis of personal information by scientific researchers when sufficient guarantees are provided for the protection of the data and when the processing is limited to the collection of data for the concerned scientific research.140
The economic or financial interests of the government constitute an absolute141
or a relative142 imperative refusal ground. Since healthcare is a considerable budget
item of the government, financial interests will often play an important role. It is
plausible that for this reason access for big data applications will be refused. If the
financing of the healthcare system is endangered, a refusal can indeed be justified.
The confidential character of commercial and industrial information should also
be mentioned.143 This is a relative144 or an absolute145 imperative refusal ground. In
a healthcare context, the government could receive such information from medical
and pharmaceutical companies. Consequently, big data applications possibly will
not be able to access this data due to the refusal ground. If the government carefully
balances the interest of the companies146 with the interest of access to healthcare
data, this limitation is justified.
Healthcare providers have an obligation of secrecy.147 However, sometimes they
are obliged to share data falling under the scope of this obligation with the gov-
ernment.148 The Belgian legislation, however, foresees an absolute imperative

139. Article 6, § 2, 1° Federal Access Act, Article 13, 2° Flemish Access Act and Article 10.1, d Dutch Access Act; RvS 11 December 2000, no. 85.177; RvS 9 July 1999, no. 81.740.
140. CTB, opinion of 8 July 2002, no. 2002/35.
141. Article 6, § 1, 6° Federal Access Act.
142. Article 14, 1° Flemish Access Act and Article 10.2, (b) Dutch Access Act.
143. RvS 18 June 2002, no. 107.951; RvS 2 October 1997, no. 68.809.
144. Article 6, § 1, 7° Federal Access Act and Article 14, 3° Flemish Access Act.
145. Article 10.1, (c) Dutch Access Act.
146. In particular, their interest in fair competition is at stake.
147. See Article 458 Belgian Criminal Code, BS 9 June 1867.
148. For instance, there is a duty to report certain communicable diseases.

refusal ground for access to these data,149 due to which big data applications rightfully
will not be able to use this information.150 The professional secrecy of healthcare
providers is indeed too important to undermine.
Finally, the protection of the freedoms and rights of others is a relative imperative refusal ground, which can be found in Article 6, § 1 of the Federal Access
Act. A relevant example of such freedoms and rights for big data in healthcare is the
protection of patient rights and the right to health.151
The discussed refusal grounds always require—to a greater or lesser extent—a
balancing exercise. This is one of the reasons why the investigation of access
requests takes time, which reduces the efficiency or even the effectiveness of
healthcare big data applications. It is a legitimate and necessary exercise though, since it permits a careful consideration of the interests at stake. It is, however, reasonably possible to do this balancing when no request has been made (yet), specifically for
certain research and high value data, and to make these data subsequently available
for well-defined private healthcare actors aiming to use these for big data appli-
cations that benefit healthcare. In such cases a request procedure seems an
unnecessary and disproportionate limitation.

11.3 Re-use of Public Sector Information

11.3.1 The PSI Directive

The PSI Directive aims to regulate the re-use of public sector information. The term
“public sector information” can only be found in the title of the Directive, but
should be seen as a synonym of “existing documents held by public sector bodies”,
which is used in the articles and recitals of the Directive, taking into account the
preparatory process of the Directive.152
Article 2.4 of the Directive defines “re-use” as “the use by persons or legal entities
of documents held by public sector bodies, for commercial or non-commercial
purposes other than the initial purpose within the public task for which the documents
were produced. Exchange of documents between public sector bodies purely in
pursuit of their public tasks does not constitute re-use”.153 Since the Directive intends

149. On obligations of secrecy in general: RvS 28 March 2001, no. 94.419; RvS 8 February 2000, no. 85.177.
150. Article 6, § 2 Federal Access Act and Article 13 Flemish Access Act.
151. RvS 10 January 2015, no. 221.961, cvba Verbruikersunie Test Aankoop (on the publicity of data from the annual reports of the ombuds in hospitals).
152. Janssen, K (2009) The EC Legal Framework for the Availability of Public Sector Spatial Data, pp. 108–109. https://lirias2repo.kuleuven.be/bitstream/id/94728/;jsessionid=E986A59479DAE00DAFF2791534E1C4B9. Last accessed 7 August 2018.
153. Its definition can be found in the Belgian and Dutch legislation in Article 2, 4° Federal Re-use Act, Article 2, 3° Flemish Re-use Act and Article 1, b Dutch Re-use Act.

to stimulate persons to turn available public sector information into socio-economic value,154 it is of particular interest in a big data context.
The original PSI Directive of 2003 only aimed to facilitate the re-use of PSI in
the EU by harmonising the basic conditions for re-use and removing major barriers
to re-use in the internal market.155 Its provisions were only applicable to the extent
Member States allowed re-use of public sector information. In 2013, the PSI
Directive was revised and an obligation to allow the re-use of generally accessible
public data was introduced.156
Very recently, on 25 April 2018, the European Commission adopted a proposal
to revise the PSI Directive again. The Commission wishes to address some issues mentioned in the evaluation report on the Directive.157 With regard to big data, an interesting observation of this report is that the provision of real-time access to dynamic data via adequate technical means is still a regulatory challenge.158 Other relevant challenges observed are increasing the supply of high value public data for re-use and limiting the use of exceptions to the principle of charging the marginal cost.159
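To make the technical side of this challenge concrete, a minimal sketch is given below of what re-users currently often have to fall back on in the absence of real-time interfaces: periodically polling an open data endpoint. The endpoint URL, the JSON response format and the field name “records” are hypothetical assumptions made purely for illustration; they do not refer to any actual government portal or to the text of the proposal.

# Illustrative sketch (hypothetical endpoint): polling a government open
# data portal for "dynamic data", i.e. data subject to frequent or
# real-time updates. Standard library only, to keep it self-contained.
import json
import time
import urllib.request

# Hypothetical URL; a real portal documents its own endpoints and formats.
FEED_URL = "https://data.example.gov/api/hospital-capacity/latest"

def fetch_snapshot():
    # One plain HTTP GET; the response is assumed to be a JSON object.
    with urllib.request.urlopen(FEED_URL, timeout=10) as response:
        return json.load(response)

def poll(interval_seconds=60, iterations=3):
    # Frequent polling only approximates real-time access: every request
    # re-fetches the full dataset, which is precisely the inefficiency
    # that push-based or streaming interfaces would avoid.
    for _ in range(iterations):
        snapshot = fetch_snapshot()
        print("received", len(snapshot.get("records", [])), "records")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    poll()

The point of the sketch is the design limitation rather than the code itself: as long as access runs through request procedures or bulk downloads, dynamic data will often be outdated on arrival, which is why adequate technical means (such as application programming interfaces) are flagged as the missing piece for this data category.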
In Belgium and the Netherlands, the Directive is transposed in the Federal Re-use Act, the Flemish Re-use Act, the Flemish E-government Act and the Dutch Re-use Act.

11.3.2 Link with Access to Government Information?

Re-use goes further than access to a document in order to learn its content. After
accessing the data, it is used for another purpose. Re-use will only be allowed if
access is. The legislation on the re-use of public sector information is indeed linked
to the one on access to government information. The PSI Directive specifies that
content that cannot be accessed under national laws on access to government
documents is not re-usable.160 It also clarifies that it builds on and is without

154. Opdebeek and De Somer 2017, p. 480.
155. See recital 9 original PSI Directive.
156. Article 3 PSI Directive.
157. Evaluation report (European Commission) on the Directive 2003/98/EC on the re-use of public sector information, 25 April 2018, SWD (2018).
158. Evaluation report (European Commission) on the Directive 2003/98/EC on the re-use of public sector information, 25 April 2018, SWD (2018), 42; see also PSI Directive Revision Proposal, p. 1.
159. Evaluation report (European Commission) on the Directive 2003/98/EC on the re-use of public sector information, 25 April 2018, SWD (2018), 42; see also PSI Directive Revision Proposal, p. 1.
160. Article 1.2, (c), (ca), (cb) and (cc) PSI Directive.

prejudice to the national access regimes.161 Thus, both legislative frameworks are
relevant for the functioning of big data applications of private healthcare actors.

11.3.3 A “Governmental” Obligation to Allow Re-use

The PSI Directive uses the term “public sector bodies” to refer to the government
bound by its provisions. These are defined as “the State, regional or local author-
ities, bodies governed by public law and associations formed by one or several such
authorities or one or several such bodies governed by public law”.162 “A body
governed by public law” is further defined as “any body: (a) established for the
specific purpose of meeting needs in the general interest, not having an industrial or
commercial character; and (b) having legal personality; and (c) financed, for the
most part by the State, or regional or local authorities, or other bodies governed by
public law; or subject to management supervision by those bodies; or having an
administrative, managerial or supervisory board, more than half of whose members
are appointed by the State, regional or local authorities or by other bodies governed
by public law”.163 The term “public sector bodies” is defined in the same way as the
term “contracting authorities” used in the Public Procurement Directives.164
Therefore, the case law of the European Court of Justice regarding “contracting
authorities” should serve as our reference point for the interpretation of its scope.165
Considering the above, more healthcare entities are to be considered as “government” than under the legislation on access to government information. For instance, all Belgian hospitals, whether private or public, will in principle fall under the scope of the re-use legislation, since all Belgian hospitals are qualified as “contracting authorities”.166 In the Netherlands, fewer hospitals are to be included. Academic hospitals qualify as “contracting authorities”, but the other hospitals do not, except when a contracting authority has control over, or supervises, them.167

161. Article 3 PSI Directive.
162. Article 2.1 PSI Directive.
163. Article 2.2 PSI Directive. The Belgian and Dutch definition of “public sector body” can be found in Article 2, 1° Federal Re-use Act, Article 2, 1° Flemish Re-use Act and Article 1, c Dutch Re-use Act.
164. See Directive 2014/23/EU of 26 February 2014 on the award of concession contracts, OJ L 094, 28 March 2014, p. 1; Directive 2014/24/EU of 26 February 2014 on public procurement, OJ L 094, 28 March 2014, p. 56; Directive 2014/25/EU of 26 February 2014 on procurement by entities operating in the water, energy, transport and postal services sectors, OJ L 094, 28 March 2014, p. 243.
165. Janssen 2009, p. 48.
166. Callens et al. 2015, p. 703.
167. See Hoge Raad, 1 June 2007, ECLI:NL:HR:2007:AZ9872 and Court of first instance of Utrecht, 5 December 2012, ECLI:NL:RBUTR:2012:BY5442. De Hoge Raad is the Dutch Supreme Court in the fields of civil, criminal and tax law.

However, hospitals that can be qualified as “a public sector body” will often not be
obliged to allow re-use of their documents under the PSI Directive, due to its
provision which excludes documents held by educational and research establish-
ments from its scope.168 In the context of the use of big data in healthcare though,
the legislator could consider obliging the concerned bodies to allow the re-use of
certain research data by well-defined private healthcare actors.

11.3.4 Re-use Limitations

11.3.4.1 What Kind of Information Can Be Re-used?

The PSI Directive applies to the re-use of “existing documents held by public sector
bodies”.169 “Document” has a “generic definition”, “in line with the developments of
the information society”.170 It refers to “any content whatever its medium (written
on paper or stored in electronic form or as a sound, visual or audiovisual recording)”
and “any part of such content”.171 The European Commission clarified that “document”
covers “all types of content, varying from audiovisual material to databases, digi-
tised or not”.172 Finally, only “existing” information can be re-used. The government
does not have an obligation to create information. In this regard, the PSI Directive
reminds us of the legislation on access to government information.
An interesting evolution is the introduction, by the Revision Proposal of the European Commission, of specific categories of data for which it foresees specific rules to encourage their re-use: “dynamic data”, “high value datasets” and “research data”.173 With regard to “research data”, it should be mentioned that the proposal only concerns data which are the result of publicly funded research, and only where access to such data is provided through an institutional or subject-based repository.174 The proposal does not contain provisions on how to ensure access to and re-use of all

168. Article 1.2, e PSI Directive. See for Belgium and the Netherlands: Article 3, § 2, 7° Federal Re-use Act, Article 2/1, 5° Flemish Re-use Act and Article 2.1, d Dutch Re-use Act. See more infra (in Sect. 11.3.4.6).
169. Article 1.1 PSI Directive. See for Belgium and the Netherlands: Article 3, § 1 Federal Re-use Act, Article 3, limb 1 Flemish Re-use Act and Article 1, d Dutch Re-use Act.
170. Recital 11 of the original PSI Directive.
171. Article 2.3 PSI Directive. See for Belgium and the Netherlands: Article 2, 2° Federal Re-use Act, Article 2, 2° Flemish Re-use Act and Article 1, d Dutch Re-use Act.
172. Proposal (European Commission) for a directive of the European Parliament and of the Council on the re-use and commercial exploitation of public sector documents, 5 June 2002, COM (2002) 207 final, p. 9.
173. We already referred to the definitions of these data categories included in the PSI Directive Revision Proposal in the beginning of this contribution (see Sect. 11.1.3). The proposed modifications regarding these data categories are discussed in more detail further in this contribution (see Sect. 11.3.4.3 et seq.).
174. Article 10 PSI Directive Revision Proposal.


11.3.4.2 Who Can Re-use and for What Purpose?

In principle, everyone has the right to re-use public sector information.176 The framework intends to stimulate the use of governmental data not only for commercial but also for non-commercial purposes.177 The organisation of an efficient and qualitative healthcare system is an example of such a non-commercial purpose, which can be pursued not only by governmental but also by private healthcare actors.

11.3.4.3 Re-use Request Formalities and Processing

A person wishing to re-use public sector information can make a re-use request.178
Of course, in some cases such a request will not be necessary if the information is
already made publicly available by the government, often on a data portal.179,180
The formalities of a re-use request are similar to those of the legislation on
access to government information. The Federal and the Flemish Re-use Act require
a precise description of the data, the requested form and the aim which the applicant
pursues.181 The Flemish legislation asks for the identification of the applicant as
well.182 Moreover, the request cannot be unreasonable or vague.183 The Dutch legislation likewise requires the request to be specific184 and precise.185 As discussed earlier
in this contribution, such formalities can be an issue for big data applications.
The Directive imposes—if possible and appropriate—the electronic processing of
re-use requests, which encourages big data use. In addition, the processing should
happen within a reasonable time. In this context, the Directive refers to the time frames foreseen in the access to government legislation.

175 PSI Directive Revision Proposal, p. 4.
176 Article 3.1 PSI Directive. See for Belgium and the Netherlands: Article 4 Federal Re-use Act, Article 3 Flemish Re-use Act and Article 3.1 Dutch Re-use Act.
177 Article 3.1 PSI Directive. See for Belgium and the Netherlands: Article 4 Federal Re-use Act, Article 3 Flemish Re-use Act and Article 3.1 Dutch Re-use Act.
178 Article 4 PSI Directive. See for Belgium and the Netherlands: Article 10 Federal Re-use Act, Article 10 Flemish Re-use Act and Article 3 Dutch Re-use Act.
179 See infra for more information on data portals (Sect. 11.3.4.5).
180 It is possible, though, that the government has imposed some conditions for the re-use of this publicly available information.
181 Article 10, § 1 Federal Re-use Act and Article 10, § 2 Flemish Re-use Act.
182 Article 10, § 2 Flemish Re-use Act.
183 Article 11 Flemish Re-use Act.
184 Article 3.2 Dutch Re-use Act.
185 Article 3.4 Dutch Re-use Act.

If there are no specific time limits or
other rules regulating the timely provision of documents, the limit foreseen in the
Directive is twenty working days after receipt of the request, which may be extended
by another twenty working days for extensive or complex requests. In the latter case
the applicant shall be notified that more time is needed to process the request.186
As mentioned earlier in the section on access to government information, the
time it takes to process requests can be an issue for big data applications in
healthcare. It reduces their efficiency or even effectiveness. Therefore, the recent
European Commission’s PSI Revision Proposal, which intends to shorten the time
frame to process re-use requests for specific data categories (“dynamic data” and
“high value datasets”), is good news.
According to the PSI Directive Revision Proposal, “[p]ublic sector bodies […]
shall make dynamic data available for re-use immediately after collection, via
suitable Application Programming Interfaces (APIs).”187 An “API” “describes the
kind of data [that] can be retrieved, how to do this and the format in which the data will be
received. It has different levels of complexity and can mean a simple link to a
database to retrieve specific datasets, a web interface, or more complex set-ups.”188
It concerns a soft obligation: if immediate availability would exceed the financial and technical capacities of the public sector body, the government shall make these data available within a timeframe that does not unduly impair the exploitation of their economic potential.189 With regard to “high value datasets”, though, the government would have to guarantee in any case the possibility of re-use immediately after
collection via API.190 The introduction of the latter would be a major improvement
for the use of big data in healthcare.
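
By way of illustration, the minimal sketch below shows how a re-user could retrieve a dynamic dataset through such an API. The endpoint, dataset identifier and JSON response format are invented for this example; they are not prescribed by the Revision Proposal, nor taken from any existing data portal.

import json
import urllib.request

# Hypothetical endpoint of a public sector data portal; invented for
# illustration, not defined in the PSI Directive or the Revision Proposal.
BASE_URL = "https://data.example.gov/api/v1/datasets"

def fetch_dataset(dataset_id: str) -> dict:
    """Retrieve a dataset in machine-readable (JSON) form via the API."""
    with urllib.request.urlopen(f"{BASE_URL}/{dataset_id}") as response:
        return json.load(response)

# A re-user could poll the endpoint after each collection cycle, which is
# the kind of immediate availability the proposal envisages for dynamic data.
waiting_times = fetch_dataset("hospital-waiting-times")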

11.3.4.4 Re-use Conditions

Re-use can be unconditional or subject to conditions, which can be laid down in a licence.191 Licence conditions shall not unnecessarily restrict possibilities for re-use and shall not be used to restrict competition.192 Furthermore, any other applicable condition should be non-discriminatory for comparable categories of re-use.193

186 Articles 4.1 and 4.2 PSI Directive. For Belgium, at the federal level the time limit can be found in a Royal Decree of 29 October 2007 (BS 6 November 2007). According to its Article 3, a request should be investigated within ten working days. Within 20 working days the applicant should be able to re-use the requested information (Article 5). In Flanders, the time limit to process the request is 15 days and for complex requests 30 days (Article 12, § 2 Flemish Re-use Act). When re-use is allowed, the information is provided within 30 days after the request (Article 12, § 3 Flemish Re-use Act). In the Netherlands, the time frame is four weeks (Article 4.4 Dutch Re-use Act).
187 Article 5.4 PSI Directive Revision Proposal.
188 Recital 28 and Article 5.4 PSI Directive Revision Proposal.
189 Article 5.5 PSI Directive Revision Proposal.
190 Article 13.2 PSI Directive Revision Proposal.
191 Article 8.1 PSI Directive.

Member States shall encourage all public sector bodies to use standard licences,
which facilitates the re-use of data.194 The Belgian legislator indeed took the use of
standard licences as a starting point.195 The PSI Directive Revision Proposal goes further and specifically states that the conditions for re-use of “high value datasets” shall be compatible with open standard licences.196
The grant of exclusive rights to third parties is in principle forbidden: “[t]he
re-use of documents shall be open to all potential actors in the market, even if one
or more market players already exploit added-value products based on these doc-
uments. Contracts or other arrangements between the public sector bodies holding
the documents and third parties shall not grant exclusive rights”.197 Exclusivity is
exceptionally allowed when this is necessary for the provision of a service in the
public interest. Such exclusive arrangements shall be transparent and made public,
and subject to regular review, at least every three years.198 In the context of big data
use in healthcare, exclusivity is therefore an option, albeit an exceptional one.
As a basic principle, public sector bodies shall allow the re-use of information at
charges which only cover the marginal costs for reproducing and disseminating the
documents.199 The PSI Directive requires that “any applicable conditions and the actual
amount of those charges, including the calculation basis for such charges, shall be
pre-established and published, through electronic means where possible and
appropriate”.200
In some cases higher charges can be imposed, which can discourage the use of
big data in healthcare since its efficiency is reduced. According to the PSI Directive
these charges cannot exceed the total costs of collecting, producing, reproducing
and disseminating documents, together with a reasonable return on investment.
They shall be laid down according to objective, transparent and verifiable criteria.201 It shall be indicated at the outset which factors are taken into account in the calculation and, upon request, the way in which such charges have been calculated in relation to the specific request.202

192 Article 8.1 PSI Directive. See for Belgium and the Netherlands: Article 7, § 1–2 Federal Re-use Act, Article 8 Flemish Re-use Act and Article 6.2 Dutch Re-use Act.
193 Article 10.1 PSI Directive.
194 Article 8.2 PSI Directive.
195 Articles 7.2 and 7.3 Federal Re-use Act and Article 8 Flemish Re-use Act.
196 Article 13 PSI Directive Revision Proposal.
197 Article 11.1 PSI Directive. See for Belgium and the Netherlands: Article 20, § 1 Federal Re-use Act, Article 14, § 1 Flemish Re-use Act and Article 7.1 Dutch Re-use Act.
198 Article 11.2 PSI Directive. See for Belgium and the Netherlands: Article 20, § 1 Federal Re-use Act, Article 14, § 2 Flemish Re-use Act and Article 7 Dutch Re-use Act.
199 Article 6.1 PSI Directive. See for Belgium and the Netherlands: Article 8 Federal Re-use Act, Article 7, limb 1 Flemish Re-use Act and Article 9.1 Dutch Re-use Act.
200 Article 7.1 PSI Directive. See for Belgium and the Netherlands: Article 6, § 2 Federal Re-use Act, Article 7/1, limb 1 Flemish Re-use Act and Article 9.4 Dutch Re-use Act.
201 Article 6.3 PSI Directive. See for Belgium and the Netherlands: Article 8, limb 2 and 3 Federal Re-use Act, Article 7, limb 2 and 3 Flemish Re-use Act and Article 9.3 Dutch Re-use Act.

This exception applies when public sector
bodies are required to generate revenue to cover a substantial part of their costs
relating to the performance of their public tasks or when the re-use concerns
documents for which the body is required to generate sufficient revenue to cover a
substantial part of the costs relating to their collection, production, reproduction and
dissemination.203
The European Commission intends to lower the charges in order to encourage
re-use. Therefore, its recent Revision Proposal specifies that documents should be made available for re-use for free. Only where necessary can charges be applied, and these shall in principle be limited to the marginal costs.204 Only in exceptional cases, i.e. when the public sector bodies need to generate revenue in order to ensure the performance of their public tasks, is a higher charge possible.205 In the latter case,
the cost of anonymisation of personal data or of commercially sensitive information
should be included in the cost.206 Finally, the re-use of “high value datasets” and
“research data” should be free of charge.207 Clearly, this would be a positive
evolution for big data applications in healthcare, since a reduction of the re-use
charges implies more efficient big data tools.

11.3.4.5 Information Delivery Modalities

Article 5.1 PSI Directive states “public sector bodies shall make their documents
available in any pre-existing format or language, and, where possible and appro-
priate, in open and machine-readable format together with their metadata. Both the
format and the metadata should, in so far as possible, comply with formal open
standards”.208 This provision facilitates the use of government data for big data
applications, since technical data access issues are taken into account. However,
considering the wording of the article,209 it concerns only a ‘soft’ obligation and the
government seems to have a broad margin of appreciation.

202 Article 7.2 PSI Directive. See for Belgium and the Netherlands: Article 6, § 3 Federal Re-use Act, Article 7/1, limb 2 and 3 Flemish Re-use Act and Article 9.4 Dutch Re-use Act.
203 Article 6.2, a and b PSI Directive. See for Belgium and the Netherlands: Article 8, limb 2 and 3 Federal Re-use Act, Article 7, limb 2 and 3 Flemish Re-use Act and Article 9.3 Dutch Re-use Act.
204 Article 6.1 PSI Directive Revision Proposal; Recital 33 PSI Directive Revision Proposal.
205 Article 6.2 PSI Directive Revision Proposal; Recital 33 PSI Directive Revision Proposal.
206 Article 6.3 PSI Directive Revision Proposal; Recital 33 PSI Directive Revision Proposal.
207 Article 6.5 PSI Directive Revision Proposal.
208 Article 5.1 PSI Directive. See for Belgium and the Netherlands: Article 9, § 1 and 2 Federal Re-use Act, Article 3, limb 3 Flemish Re-use Act and Article 5.1 Dutch Re-use Act.
209 “[W]here possible and appropriate” and “in so far as possible”.

The government has no obligation “to create or adapt documents or provide extracts in order to comply […] where this would involve disproportionate effort,
going beyond a simple operation”.210 Furthermore, “public sector bodies cannot be
required to continue the production and storage of a certain type of documents with
a view to the re-use of such documents by a private or public sector organisation”.211 This implies that the PSI Directive does not offer private healthcare actors a
legal basis to require the government to keep collecting or storing data they have
been using for a big data application. They can of course always try to work out a
mutual arrangement in this regard.
The PSI Directive finally encourages the Member States to make practical
arrangements to facilitate the search for documents available for re-use and their
re-use conditions.212 In this context, it refers to asset lists of main documents with
relevant metadata, accessible where possible and appropriate online and in
machine-readable format, and portal sites linked to the asset lists. Indeed, the
Belgian and the Dutch governments have set up data portals. The Belgian Federal
Re-use Act provides for the creation of a federal data portal, providing access to
public sector information available for re-use, an inventory of the available documents, the standard licence, an overview of possible specific conditions and the
applicable charges.213 Article 9 of the Flemish Re-use Act also refers to the creation
of data portals with links to lists of the principal documents available for re-use.
Contrary to the legislation on access to government information, the Belgian legislator has laid down an inventory obligation for the government—at least to a
certain extent—regarding its data available for re-use.

11.3.4.6 Re-use Refusal Grounds

Certain information is excluded from the scope of the PSI Directive. Below, the most
relevant exceptions for big data use in healthcare are mentioned. First, documents the
supply of which is an activity falling outside the scope of the public task of the public
sector body.214 Information the government produces and charges for exclusively on
a commercial basis and in competition with others in the market does therefore not
fall under the scope of the discussed re-use legislation.215 Second, documents for which third parties hold intellectual property rights:216 consent of the third party concerned will be required for re-use.217

210 Article 5.2 PSI Directive. See for Belgium: Article 9, § 1 Federal Re-use Act and Article 5 Flemish Re-use Act.
211 Article 5.3 PSI Directive. See for Belgium and the Netherlands: Article 9, § 3 Federal Re-use Act, Article 6 Flemish Re-use Act and Article 5.2 Dutch Re-use Act.
212 Article 9 PSI Directive. See also recital 23 of the original PSI Directive.
213 Article 22, § 1 and § 2 Federal Access Act.
214 Article 1.2, a PSI Directive. See for Belgium: Article 3, § 2, 2° Federal Re-use Act and Article 2/1, 1° Flemish Re-use Act.
215 See recital 9 of the original PSI Directive, which refers to this as a typical example of “activities falling outside the public task”.

Third, documents excluded from access by virtue of the legislation on access to government data discussed earlier in this contribution.218 Fourth, documents held by educational and research establishments,
including organisations established for the transfer of research results, schools and
universities, with the exception of university libraries—of course only to the extent
that these establishments can be considered “public sector bodies”.219 Finally, documents containing personal data can only be re-used if this is compatible with data protection legislation.220
The fourth refusal ground is particularly interesting for big data in healthcare,
since it implies that re-use of research data of, for instance, the Belgian Health Care
Knowledge Center, Belgian hospitals or the eight Dutch University Medical
Centers will not be possible under the discussed legal regime.221
It was observed earlier that the PSI Directive Revision Proposal intends to
extend its scope to certain “research data”, “documents produced as part of scientific research, namely results of the scientific fact-finding process (experiments,
surveys and similar) that are at the basis of the scientific process”.222 This would
imply a limitation of the aforementioned exemption of documents held by educational and research establishments223 and obviously favours the use of big data in
healthcare. However, it should be repeated that it only applies to research data insofar
as the research is publicly funded and whenever access to the data is provided
through an institutional or subject-based repository.224

216 Article 1.2, b PSI Directive. See for Belgium and the Netherlands: Article 3, § 2, 3° Federal Re-use Act, Article 2/1, 2° Flemish Re-use Act and Article 2.1, b Dutch Re-use Act.
217 Earlier (see Sect. 11.2.4.6), this contribution mentioned that the access to government information legislation also requires consent in case a copy is requested. Mere access, however, is possible without consent of the third party concerned.
218 Article 1.2, c, ca and cc PSI Directive. See for Belgium and the Netherlands: Article 3, § 2, 4° and 5° Federal Re-use Act, Article 2/1, 3° Flemish Re-use Act and Article 2.1, a Dutch Re-use Act. Access can be restricted or denied on the grounds of the protection of national or public security, defence, statistical or commercial confidentiality or the protection of personal data. In some cases a particular interest will be needed to obtain access.
219 Article 1.2, e PSI Directive. See for Belgium and the Netherlands: Article 3, § 2, 7° Federal Re-use Act, Article 2/1, 5° Flemish Re-use Act and Article 2.1, d Dutch Re-use Act.
220 Article 1.4 PSI Directive. See for Belgium and the Netherlands: Article 3, § 3 Federal Re-use Act and Article 1, g Dutch Re-use Act.
221 The PSI Directive concept of “government” in a healthcare context was discussed earlier in this contribution (see Sect. 11.3.3).
222 PSI Directive Revision Proposal, p. 10.
223 PSI Directive Revision Proposal, p. 10.
224 Article 10 PSI Directive Revision Proposal.

11.4 Lessons for the Legislator?

Access to and re-use of government data by private healthcare actors in the context
of big data applications that benefit healthcare is not self-evident. It is strictly regulated
due to the diverse interests which are at stake, such as the protection of private life,
confidentiality of patient information, intellectual property or the financial interests
of the government.
However, the current legislation concerning access to government data and
re-use of public sector information can be an unnecessary and disproportionate
limitation. Especially the access and re-use request procedure can make big data
applications of private healthcare actors inefficient or even ineffective. Indeed,
although the request procedure serves the protection of legitimate interests, it can be
complex, time consuming and costly.
The European Commission recognises the societal and economic value of the
re-use of government data. Therefore, in its recent PSI Directive Revision Proposal,
it introduced, amongst others, specific re-use regimes for “dynamic data”, “research
data” and “high value data”. For some or all of these data categories, the
Commission proposes to lower the re-use charges, to reduce the request processing
time frame and to use open standard licences. These measures would certainly be a
step forward in the context of the use of big data in healthcare.
However, further action is required. The government should more actively share
its data for access and re-use. The request procedure should be limited to cases
where this is really necessary to protect the different interests at stake. In order not
to disproportionately hinder the use of big data by private healthcare actors—at least to the extent that this benefits healthcare—the government could be obliged to actively and free of charge provide certain data (research data and high value data) to a limited list of private healthcare actors. At the very least, the government should be encouraged or even obliged to communicate more actively what kinds of datasets it possesses, so that private healthcare actors at least know which data they are able to access or re-use.

References

Boes M (1996) Openbaarheid van bestuur. Bevoegdheidsverdeling. De federale openbaarheidsregeling. In: Draye A (ed) Openbaarheid van bestuur in Vlaanderen, België en de Europese instellingen. Instituut voor Milieurecht KU Leuven, Leuven, pp 11–27
Callens S, Coeffe M, Peers J (2015) Mededinging, overheidsopdrachten en gezondheidszorg. In:
Callens S, Peers J (eds) Organisatie van de gezondheidszorg. Intersentia, Antwerp, pp 691–738
Council of Europe (2018) Chart of signatures and ratifications of Treaty 205. www.coe.int/en/web/
conventions/full-list/-/conventions/treaty/205/signatures?p_auth=ylsG4jgX. Last accessed
7 August 2018
De Mauro A, Greco M, Grimaldi M (2016) A Formal Definition of Big Data Based on Its Essential
Features. Library Review 5:122–135
De Somer S (2012) Het begrip administratieve overheid: stand van zaken van a never ending story.
Rechtskundig weekblad 75:1614–1638
224 M. Caes

Delvaux M (1999) La loi du 2 novembre 1997 et la publicité de l’administration dans les communes. Rev.dr.commun. 2–62
IGJ (2018) Wat maakt IGJ openbaar? www.igj.nl/onderwerpen/openbaarmaking. Last accessed
7 August 2018
Janssen K (2009) The EC Legal Framework for the Availability of Public Sector Spatial Data. https://
lirias2repo.kuleuven.be/bitstream/id/94728/;jsessionid=E986A59479DAE00DAFF2791534E1C4B9.
Last accessed 7 August 2018
Jongen F (1994) Droit d’auteur et droit d’accès. Amén. 303–304
Jongen F (1995) La publicité de l’administration. JT 777–788
Kaiser M, Gourdin E (2015) La qualité du requérant et son intérêt au recours et au moyen. In:
Viseur F, Philipparts J (eds) La justice administrative. Larcier, Brussels, pp 31–84
Laney D (2001) 3D Data Management: Controlling Data Volume, Velocity, and Variety. https://
blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-
Volume-Velocity-and-Variety.pdf. Last accessed 7 August 2018
NZa (2018) Over DIS. www.opendisdata.nl/dis/over. Last accessed 7 August 2018
Opdebeek I, De Somer S (2017) Algemeen bestuursrecht. Intersentia, Antwerp
Rijksoverheid (2016) Handleiding Wet hergebruik van overheidsinformatie. https://open-overheid.
nl/wp-content/uploads/2016/05/WEB_90943_BZK_Handleiding-Who-versie2.pdf. Last accessed
7 August 2018
Schram F (2004) Archief en openbaarheid van bestuur: een verkenning. In: Opsommer R et al
(eds) De archivaris, de wet en de rechtbank. Verslagboek van de studiedag Archief tussen
openbaarheid van bestuur en bescherming van de privacy. Die Keure, Bruges, pp 3–34
Schram F (2005) Uitzonderingen op openbaarheid van bestuur. NJW 578–586
Schram F (2008) De interpretatie van het decreet van 26 maart 2004 betreffende de openbaarheid
van bestuur door de beroepsinstantie openbaarheid van bestuur: enkele vaststellingen. In:
Schram F (ed) Openbaarheid van bestuur. Stand van zaken 2007. Instituut voor Administratief
Recht KU Leuven, Leuven, pp 123–194
Schram F (2018) Openbaarheid van bestuur. Politeia, Brussels
Tijs R (2012) Algemeen bestuursrecht in hoofdlijnen. Intersentia, Antwerp
Van Eechoud M (2008) Openbaarheid van bestuur en auteursrecht, never the twain shall meet? In:
Van Eijk N, Hugenholtz P (eds) Dommering-bundel: Opstellen over informatierecht
aangeboden aan prof. Mr. E.J. Dommering. Otto Cramswinkel, Amsterdam, pp 89–100
Vanhees H (1998) Auteursrecht in een notendop. Garant, Leuven
VBOB (2018) Annex to the annual report 2016–2017. https://www.vlaanderen.be/nl/publicaties/
detail/jaarverslag-beroepsinstantie-inzake-de-openbaarheid-van-bestuur. Last accessed 7 August
2018
Voorhoof D (2016) Wobben is een EVRM-grondrecht. Juristenkrant 338:5

Miet Caes obtained a Master’s degree in Literature and Linguistics (2009) and in Law (2013) from
the KU Leuven. During her law studies, she was a tutor in the Peer Assisted Learning program of
the Faculty of Law, and an editor of the student law review Jura Falconis. From 2013 until 2016,
she practiced as a lawyer at the law firm KS4V in Brussels, handling cases in the fields of criminal
law, company law, association law, private law, health law, intellectual property law and migration
law. Since November 2016, she has been affiliated with the Leuven Institute for Healthcare Policy of the KU Leuven, where she is preparing a doctoral thesis on the governmental use of big data in healthcare. She also assists in various courses of medical and health law at the faculties of Law and Medicine.
Chapter 12
The Challenges of Risk Profiling Used
by Law Enforcement: Examining
the Cases of COMPAS and SyRI

Sascha van Schendel

Contents

12.1 Introduction...................................................................................................................... 226
12.2 Risk Profiling................................................................................................................... 227
12.3 Mapping Practice............................................................................................................. 229
12.3.1 Risk Profiling in General Policing ..................................................................... 229
12.3.2 Risk Profiling in an Individual Case.................................................................. 231
12.4 Challenges of Risk Profiling by Law Enforcement........................................................ 233
12.4.1 Challenges of Transparency ............................................................................... 233
12.4.2 Challenging Procedural Safeguards ................................................................... 234
12.4.3 Challenges Pertaining to Discrimination............................................................ 236
12.5 Conclusion ....................................................................................................................... 237
References .................................................................................................................................. 238

Abstract The use of Big Data in the law enforcement sector turns the traditional
practices of profiling to search for suspects or to determine the threat level of a
suspect into a data-driven process. Risk profiling is frequently used in the USA and
is becoming more prominent in national law enforcement practices in Member
States of the European Union. While risk profiling creates challenges that differ per
jurisdiction in which it is used and vary with the purpose for which the profiling is
deployed, this technological development brings fundamental changes that are quite
universal. Risk profiling of suspects, or of large parts of the population to detect
suspects, brings challenges of transparency, discrimination and challenges proce-
dural safeguards. After exploring the concept of risk profiling, this chapter dis-
cusses those fundamental challenges. To illustrate the challenges, the chapter uses
two main examples of risk profiling: COMPAS and SyRI.

S. van Schendel (✉)
Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University,
Warandelaan 2, 5037 AB Tilburg, The Netherlands
e-mail: s.vanschendel@tilburguniversity.edu

© T.M.C. ASSER PRESS and the authors 2019


L. Reins (ed.), Regulating New Technologies in Uncertain Times,
Information Technology and Law Series 32,
https://doi.org/10.1007/978-94-6265-279-8_12

Keywords Big Data · Profiling · Law Enforcement · Predictive Policing · Discrimination · Legal Safeguards · COMPAS · SyRI

12.1 Introduction

With developments such as Big Data, national law enforcement agencies have
access to a large array of data, including publicly accessible data such as from social media networks, access to more computing power and analysis techniques—whether in-house or in collaboration with technology companies—and in some countries increasing competences when it comes to data collection, analysis and
use.1 Big Data provides police with possibilities to find suspects of crimes already
committed, but also assists in detecting or predicting crimes. These developments make for more efficient policing2 but also change the way in which
policing works. Whereas traditionally law enforcement authorities search for evidence about a specific suspect or a specific crime, the power of artificial intelligence
and Big Data allows for an approach in which the system receives a broad query or
task and presents results or patterns which can be investigated further. In that sense
the system already creates a pre-selection of suspects to investigate further and
creates the starting point of the criminal investigation. Police officers have to make
a decision on which individual to focus their attention, which can now be based on
the system’s indication. In addition to being data-driven, the investigation can be
automated, presenting police officers and prosecution authorities with the suspects
and all relevant information. The main element in artificial intelligence and Big
Data,3 and the main enabling factor for data-driven law enforcement practices, is
the use of algorithms. In this context an algorithm can be explained as a sequence of
commands for a computer system to transfer the input into output.4 However, this
sounds simpler than it is in reality. The use of algorithms can make the analysis
more difficult for human actors to understand.5 The way in which input is turned
into output can be extremely opaque.6
Having the tools to analyze huge volumes of data and extract information from
them, possibly completely by automated means, facilitates processes such as the
creation and analysis of risk profiles.7 Risk profiles can make policing, and even
prosecution decisions and sentencing, more efficient, by indicating the likelihood
that suspects will display certain criminal behavior. The USA is a country where the

1 Broeders et al. 2017; Joh 2016, p. 16.
2 Zouave and Marquenie 2017.
3 Fundamental Rights Agency 2018.
4 Fundamental Rights Agency 2018.
5 Mittelstadt et al. 2016.
6 Ferguson 2018; Pasquale 2015.
7 Marks et al. 2017.

use of technology in law enforcement and courts arose early,8 and the
use of algorithms, predictions and profiles is already established practice.9 The USA
also has an interesting judicial system, with well-developed systems for bail and
parole that allow for more instances of algorithmic decision making than in some
European countries. To benefit from these interesting examples present in the USA,
one example of a tool for risk profiling from that jurisdiction, COMPAS, is used
throughout this chapter. Although risk profiling is not yet an established practice in
the law enforcement domain in the Member States of the EU, there are some
examples of risk profiling taking place. In this chapter, a risk profiling system from
the Netherlands is used, showcasing a different level of established practice. The
example from the Netherlands, SyRI, takes a different approach to risk profiling
compared to the USA and offers interesting comparative material. In the
Netherlands in general there is quite some public information concerning law enforcement practices, albeit with most sources in Dutch, allowing for an analysis.
The use of risk profiles, especially to find suspects, has traditionally been an
important tool for national law enforcement.10 The practice of profiling is not new; the novelty lies in the increasingly pre-emptive or proactive shift. Focusing on risk allows for an approach in which the risk can be prevented (the expected crime or re-offending does not take place) or in which the expectation of a crime justifies investigative measures. The new, data-driven form of risk profiling creates new challenges.
Therefore this chapter aims to explore the challenges of risk profiling in the age of
Big Data. Section 12.2 presents and explains a working definition of risk profiling.
Section 12.3 describes the different types of risk profiling, distinguishing between
risk profiling to find a suspect or in general policing and risk profiling in individual
cases. Section 12.4 explains the general challenges of risk profiling along three
categories and uses the cases of SyRI and COMPAS to illustrate the different
challenges. Section 12.5 briefly summarizes the chapter and presents some concluding thoughts.

12.2 Risk Profiling

Risk profiling is a specific type of policing or tool for decision making. Risk
profiling does not have a set definition. It is not really referenced in literature as
such, but covers instances such as risk assessment, predictive policing, pre-emptive
surveillance, and automated decision making, as seen in the working definition
below. There are definitions of profiling in general such as the one by Hildebrandt:

8 For example, predictive policing software was first introduced in the USA before it was used in European countries.
9 Brayne et al. 2015; Christin et al. 2015; AI Now Institute 2018.
10 Hildebrandt 2008, p. 23.

“The process of ‘discovering’ correlations between data in databases that can be used to identify and represent a human or nonhuman subject (individual or group) and/or the application of profiles (sets of correlated data) to individuate and represent a subject or to identify a subject as a member of a group or category”.11 This
definition can be made more specific for risk profiling. For the purpose of this
chapter the following working definition of risk profiling is proposed: Risk profiling
is the categorizing or ranking of individuals or groups, sometimes including
automated decision making, using correlations and probabilities drawn from
combined and/or aggregated data, to determine the level of risk that is posed to the
security of others or to national security by those individuals or groups.
To provide some more guidance, the core elements are briefly explained.
Categorizing or ranking of individuals or groups is a phenomenon that is becoming
common practice in all sectors of current society.12 Examples include credit scores,13 insurance,14 admission rankings for universities,15 insurance policies categorizing on zip code, and rankings of the crime risk of individuals based on social media.16 The possibility to combine various databases in combination with the use
of algorithms to make patterns visible facilitates the comparison of individuals or groups.
Risk profiling in itself can simply constitute the compiling of profiles, it can be
followed by human-made decisions to employ measures, or it can imply a decision
in itself constituting a form of automated decision making. Therefore the level of
human involvement will differ per system, which can have consequences for the
opacity or understandability of the system and its results.17
Risk profiles depend highly on probabilities. A risk is a chance, likelihood, or
probability that certain behavior will take place. Risk profiles often rely on statistics
drawing inferences from data on the behavior that people with similar characteristics displayed, to determine the future behavior of a specific individual or group.
In this chapter, risk profiling is discussed as a practice of law enforcement
agencies. Therefore, the type of risk that profiles are created for is the risk that one
(re)commits a crime. The risk can be that generally framed or can be more narrowly
defined: some systems might only profile the risk of committing tax fraud or
committing a crime of violence. Risk will have a different meaning in other sectors
of society, such as the risk in the financial sector that one does not pay back debt.

11 Hildebrandt 2008, p. 19.
12 Zarsky 2014; Keats Citron and Pasquale 2014.
13 Zarsky 2014; Keats Citron and Pasquale 2014.
14 Swedloff 2014.
15 O’Neil 2016.
16 Van Brakel 2016.
17 Mittelstadt et al. 2016.

12.3 Mapping Practice

Risk profiling can pertain to different practices of law enforcement; therefore, this
section aims to give further insight into different types of risk profiling practices.
A distinction is made between risk profiling in general policing or monitoring and
risk profiling of a specific suspect in a specific case. This distinction allows for a
comparison of the challenges that risk profiling poses along the different stages of
the criminal justice chain that risk profiling is used in: the stage before there is a
suspect, or the stage in which there is already a specific individual targeted. These
two stages are explained below. When referring in this chapter to risk profiling, this
means the practice of risk profiling as such, including collecting the data for the
profile, analyzing the data and creating the profile, and the use or application of the
profile. This approach makes it possible to discuss the challenges stemming from
risk profiling in general, without only exploring the challenges that arise from the
collection of data or merely exploring the challenges arising from the use of
profiles.

12.3.1 Risk Profiling in General Policing

Risk profiling systems where the profiling is the starting point of a criminal
investigation are intended to find individuals or groups that are interesting to
monitor or to find a concrete individual who is a suspect for a case at hand. Rather
than searching for information about a specific person, the system creates a categorization specifying which individuals are high risk—based on the risk model or
query that the algorithm works with—and who could be (further) looked into. This
use of risk profiling falls under general policing or monitoring. Risk profiling is
used for finding interesting patterns in large amounts of data that cannot be searched
manually. This type of profiling does not take place within the boundaries of a
specific criminal investigation but rather leads to the starting point of an investigation. Detecting individuals based on their risk can be called ‘heatlisting’,18
similar to the heatmapping19 where crime hotspots are shown. Intrado Beware is an
example of a service enabling heatlisting. Intrado Beware is a mobile, cloud-based
application used in the USA, which gathers contextual information from social
media, commercial data and criminal data, creating a risk score—green, yellow, or
red—for individuals.20 Intrado Beware has a very specific purpose: providing police officers who respond to a 911 call with information about the person they are about to encounter, and identifying whether this person is high risk in the sense of posing a risk to the security of the police officer. In this way

18 Clavell 2016.
19 Clavell 2016.
20 Van Brakel 2016.

police officers will for example arrive with their firearms at the ready in case a
high-risk individual is detected.
There are types of risk profiling where a location is monitored, such as predictive
policing pertaining to an area. In targeting a location various sources of data will be
used, ranging from non-personal data such as the distance to the highway—for
escape routes—to different forms of personal data pertaining to inhabitants of that
area such as the history of criminal records. Algorithms pinpoint the level of risk for
areas, so that police officers can be deployed accordingly. This type of risk profiling
is very well established in the USA,21 but also exists in Europe. In the Netherlands
for instance, the Crime Anticipation System (CAS) is used for the creation of a grid
that is updated every 14 days; this grid shows what crime is likely to take place and
at which time of day in every square of the targeted area. This system was at first
only introduced in the capital, Amsterdam, but is now being tested all across the
country.
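
As an illustration, the toy sketch below shows the kind of output such a grid system produces: a risk score per square, per crime type and time of day. The grid size, crime types and values are invented for this example and bear no relation to the actual CAS model.

# Toy risk grid for one 14-day cycle; all values are invented and do not
# reflect the actual Crime Anticipation System.
grid = {
    (0, 0): {"crime": "burglary", "time_of_day": "evening", "risk": 0.7},
    (0, 1): {"crime": "burglary", "time_of_day": "night", "risk": 0.2},
    (1, 2): {"crime": "bicycle theft", "time_of_day": "afternoon", "risk": 0.5},
}

# Patrols could then be assigned to the highest-risk squares first.
for square, info in sorted(grid.items(), key=lambda kv: -kv[1]["risk"]):
    print(square, info["crime"], info["time_of_day"], info["risk"])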
Another example of using risk profiling in policing or monitoring is SyRI. The
System Risk Indication (SyRI) is used in the Netherlands to monitor tax fraud,
fraud with social benefits and fraud with labour legislation. SyRI was officially
launched in 2014 and is employed by the Dutch Ministry of Social Welfare &
Employment (hereafter Dutch Ministry). The system can be used by different
parties: several governmental actors together can launch a request with the Dutch
Ministry to make use of SyRI. SyRI contains a large database consisting of sources
such as financial data of citizens, data on social benefits history of citizens, data on
education, or data about integration into Dutch society.22 SyRI works with a predetermined risk model in each instance of collaboration between governmental
actors. The risk model is run through the system and SyRI indicates which individuals are high risk and which are low risk for one or more of the three types of
fraud. The results for low risk are deleted and the citizens that receive a high-risk
label can be further investigated.23 This investigation can be conducted by the
police, special law enforcement officers, supervisory authorities, municipalities,
immigration authorities, and other relevant authorities. Because SyRI is used in
varying collaborations, it scans for a different risk profile each time. Due to the broad scope of the system and the large governmental database involved, the majority of individuals in the Netherlands who
have data in one of the categories are present in the database. The risk indication is
stored in a register which relevant public bodies can access.24 Even though SyRI
has been used for years now, its use has not been without resistance. The program
raises issues of transparency, mainly awareness and contestability. Most citizens are

21 With the use of PredPol software.
22 Besluit SUWI, Staatsblad 2014, 320. Available only in Dutch at: https://zoek.officielebekendmakingen.nl/stb-2014-320.html. Last accessed 30 September 2018.
23 Besluit SUWI, Staatsblad 2014, 320. Available only in Dutch at: https://zoek.officielebekendmakingen.nl/stb-2014-320.html. Last accessed 30 September 2018.
24 Besluit SUWI, Staatsblad 2014, 320. Available only in Dutch at: https://zoek.officielebekendmakingen.nl/stb-2014-320.html. Last accessed 30 September 2018.

confronted with the existence of the system when they receive an administrative
fine, come into contact with the police, or encounter another consequence. In March
2017, several NGOs and two citizens launched a court case challenging the legality
of the system, which is still ongoing, to test whether SyRI is compliant with EU
data protection legislation, the fundamental right to privacy and the right to fair trial
under Article 6 of the European Convention on Human Rights.25 The points debated in the case include the secrecy of the risk models, the lawfulness of the automated decision making and the broadness of the legal mandate to use SyRI.26

12.3.2 Risk Profiling in an Individual Case

Risk profiling can also take place in a specific case. In this scenario there is already
a suspect or convict at whom the risk analysis is targeted. In such cases a risk profile
is applied to that person to determine their threat level.
Risk profiling that targets a location, such as predictive policing, is a type of risk
profiling that allows for general policing and monitoring. However, while such a
system is targeted at locations, it indirectly profiles the residents of that area. A risk
profiling system that targets areas attaches a risk label to a certain area and police
patrols are sent there accordingly. The deployment of police patrols can impact the
perspective that residents and outsiders have on this area: it can be deemed an area with high criminality, or a ‘bad area’. In addition, sending police patrols to a
specific area will lead to an increase in crime detection in that area: the more police
officers are present there, the higher the chance that they will eventually detect crime. Detecting more crime will in turn further increase the number of patrols in
that area and measures taken against residents of this area. In that sense a
self-fulfilling prophecy is created.27 Indirectly the residents are also profiled as high
risk. A type of targeting in which for example data about the income or education
level of an area is analyzed means that there is an assumption that many suspects or
perpetrators would reside in this area, while this assumption does not have to be
reality. However, as the criminal history of residents is often one of the data points
in risk profiling of areas, the system does assume that the residents influence the
crime level directly. Since the areas that the system targets are traditionally seen by
police as problematic areas, inhabitants can easily already be on the radar. But their
risk level will fluctuate according to the risk score of the area.

25 Information about the pending court case in English is available at: https://pilpnjcm.nl/en/dossiers/profiling-and-syri/. Last accessed 30 September 2018.
26 Information about the pending court case in English is available at: https://pilpnjcm.nl/en/dossiers/profiling-and-syri/. Last accessed 30 September 2018.
27 Robinson 2017.

Another type of risk profiling targeted at specific individuals is risk profiling in order to assist the police in making decisions about which police powers to employ.
Brkan gives the example of automated decision making to determine whether to
seize a mobile device or not.28 Besides the use of risk profiling to deploy measures,
risk profiling is also used to determine whether someone is allowed bail or pro-
bation, whether that person is at risk of reoffending, or determining the duration of
incarceration. Such advanced risk profiling is not an established practice yet in the
EU Member States. Predictive policing, in the Netherlands, and in most European
countries, is still very much in the beginning stage and is still mainly targeted at
locations. The most prominent example of risk profiling to determine, parole, bail,
or a prison sentence, is the COMPAS tool. Correctional Offender Management
Profiling for Alternative Sanctions (COMPAS) is used in the USA across all States,
but has different usage in various States. COMPAS is part of a broader development
basing bail requests, charging decisions, and sentencing on predictive analytics.29
While there is a range of risk profiling or assessment systems being used in the
USA,30 COMPAS is interesting as it is widely used in the USA and highly
data-driven: many data points are used and the algorithm fully determines the
outcome—high risk or not-. COMPAS was also the subject of the court case
Loomis v. Wisconsin, in which Loomis, who was sentenced to six years of
imprisonment based on the analysis of COMPAS, petitioned for a reconsideration
of his sentence, as COMPAS would violate his right to due process.31 While the
petition was denied in 2016,32 this case highlights one of the challenges of risk
profiling that will be addressed in the next section of this chapter, pertaining to due
process. COMPAS is software that predicts a defendant’s risk of committing a
misdemeanor or felony within 2 years of assessment based on 137 factors per-
taining to the individual in combination with data about the criminal record of the
individual.33 COMPAS takes place in the trial or post-trial stage. The algorithm for
the risk assessment was developed by the company Northpointe and the logic of the
algorithm is kept secret by the company. COMPAS makes use of 137 factors such
as factors relating to criminal history of the individual; non-compliance in court,

28 Brkan 2017.
29 Ferguson 2016.
30 Angwin et al. 2016.
31 The issues raised in the petition are: (1) whether it is a violation of a defendant’s constitutional right to due process for a trial court to rely on the risk assessment results provided by a proprietary risk assessment instrument such as the Correctional Offender Management Profiling for Alternative Sanctions at sentencing, because the proprietary nature of COMPAS prevents a defendant from challenging the accuracy and scientific validity of the risk assessment; and (2) whether it is a violation of a defendant’s constitutional right to due process for a trial court to rely on such risk assessment results at sentencing because COMPAS assessments take gender and race into account in formulating the risk assessment.
32 Loomis v. Wisconsin, docket no. 16-6387, available at: http://www.scotusblog.com/case-files/cases/loomis-v-wisconsin/. Last accessed 30 September 2018.
33 Dressel and Farid 2018.

bail, or probation procedures; criminality among family members or friends; habits of alcohol and drug use; residence and living environment; education history; work situation; (feelings of) social isolation; and feelings of anger.34 Some of these factors
used in the risk assessment are factors that the individual who is being assessed can
control in some measure, such as their criminal record or substance abuse. However,
other factors cannot be controlled by the individual, such as criminal records of
friends or family. This makes it difficult for individuals to anticipate whether they
would be considered high risk or not. How the algorithm ranks these different
factors is not publicly known, so even if one knows how they will score on factors,
the ranking of the factors is executed by the algorithm.
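
To make the mechanism concrete, the sketch below shows how a generic weighted risk score of this kind could be computed. The factor names, weights and threshold are invented for this example; they do not reflect Northpointe's secret algorithm, only the general idea of combining scored factors into a risk label.

# A hypothetical weighted risk score; factor names, weights and threshold
# are invented for illustration and are NOT the proprietary COMPAS model.
WEIGHTS = {
    "prior_convictions": 2.0,           # within the individual's control
    "substance_abuse": 1.5,             # within the individual's control
    "criminal_record_of_friends": 1.0,  # outside the individual's control
    "age_at_first_arrest": -0.5,
}

def risk_score(factors: dict) -> float:
    """Combine factor values into a single score via a weighted sum."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())

def risk_label(score: float, threshold: float = 5.0) -> str:
    """Map the numeric score onto a binary high/low risk label."""
    return "high risk" if score >= threshold else "low risk"

# Without knowing the weights, the assessed individual cannot predict how
# their factor values combine into the final label.
score = risk_score({
    "prior_convictions": 2,
    "substance_abuse": 1,
    "criminal_record_of_friends": 1,
    "age_at_first_arrest": 3,
})
print(score, risk_label(score))  # 5.0 high risk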

12.4 Challenges of Risk Profiling by Law Enforcement

This section discusses the general challenges raised by the use of risk profiling by
law enforcement. The challenges are split into three general categories: challenges
revolving around transparency of risk profiling, challenges in procedural safeguards, and challenges centering on discrimination. The challenges should not be
seen as strict separate categories, rather they overlap and influence each other. For
example, issues with making an analysis transparent can be caused by a lack of
procedural safeguards that would create that transparency. The aim in using these
three categories is to present an overview of the challenges that is as complete as
possible and is relevant for multiple jurisdictions.

12.4.1 Challenges of Transparency

Most profiles are probabilistic, describing the chance that a certain correlation will
occur.35 The core element of profiling is categorizing people in a certain way.
However, in most cases, the individuals included under the profile do not share all
the attributes or characteristics of the group that they are categorized in.36 It is an
approximation: an individual is placed in the category in which they fit best. This
means that there is always an inherent risk of errors in the use of profiles, since it
might include people erroneously within a profile or might miss certain individuals,
leaving them out of scope. The first error would be false positives, the second error
false negatives.37 In case of false positives, people would be incorrectly classified in

34 Angwin et al. 2016. Together with their report, the researchers of ProPublica made several files publicly available, such as a list with the factors that COMPAS uses in scoring.
35 Hildebrandt 2008, pp. 21–22.
36 Hildebrandt 2008, pp. 21–22.
37 Hildebrandt and Koops 2010.

profile. For example, in the case of COMPAS described above, someone might be erroneously judged to be at high risk of re-offending. In the case of a false negative, the
more traditional problem of law enforcement occurs, namely overlooking someone
who should be a suspect or miscalculating the risk of recidivism. In these cases law
enforcement is confronted with the public’s response after for example a crime has
been committed by an individual who was on the police radar, but was not judged
as high risk.38 Especially in the context of terrorism, risk profiles aim at minimizing
false negatives, as the societal consequences are a lot graver when allowing for a
false negative than a false positive.39 Algorithms become increasingly complex and
autonomous, which makes it harder for law enforcement to be transparent about a
process or about why they receive a certain outcome, and humans have trouble interpreting
which data points lead to the conclusion of the algorithm.40 This added complexity
of the risk profiles in the Big Data era can create a higher chance of errors
somewhere along the process. This opacity and complexity creates transparency
challenges. Transparency—towards the human actors working with the system or
towards supervisory actors—is required to detect false negatives and positives. In
addition, the individuals affected by the risk profiling need to be able to exercise
their rights with regard to the risk profiling: for example, the ability to contest the profile or the decision making that follows from it, to exercise the right to a fair trial, to receive due process, or to create equality of arms. As was
demonstrated in the Loomis vs. Wisconsin case about COMPAS and due process,
these transparency challenges are already arising and being presented to courts.
Transparency challenges also arise in risk profiling systems for general monitoring. In the case of SyRI, data from a lot of individuals is analyzed; potentially every individual in the Netherlands is involved in the risk model. Ultimately, the risk profile of the individuals flagged as low risk will be deleted. But a large group of individuals is being profiled, creating a large chance that there will be a false positive. Whether there are actually opportunities to correct errors is not clear cut. Errors in an early stage, such as a wrongful arrest, could remain undetected before further measures are deployed, such as searching the suspect’s house, posing an infringement on the suspect’s privacy that is irreversible.
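
The trade-off between the two error types can be illustrated with a toy calculation; all figures below are invented solely for this example and do not describe any actual system.

# Toy confusion matrix for a hypothetical profiling system screening
# 100,000 people, of whom 100 would actually (re)offend.
true_positives = 80      # offenders correctly flagged as high risk
false_negatives = 20     # offenders the system misses
false_positives = 4_980  # non-offenders wrongly flagged as high risk
true_negatives = 94_920  # non-offenders correctly flagged as low risk

flagged = true_positives + false_positives
# When the targeted behaviour is rare, even a seemingly accurate system
# flags mostly innocent people: here only about 1.6% of flags are correct.
print(f"{true_positives / flagged:.1%} of flagged individuals are true positives")
print(f"{false_negatives / (true_positives + false_negatives):.0%} of offenders are missed")
# Lowering the flagging threshold reduces false negatives but inflates
# false positives, which is the tension described above.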

12.4.2 Challenging Procedural Safeguards

A second challenge of risk profiling lies in the procedural safeguards (not) accompanying these systems. As was described in the introduction, risk profiling is data-driven and presents a shift from traditional forms of policing. In most jurisdictions, however, the procedural safeguards are attuned to more traditional forms of

38 Rauhofer 2008.
39 Leese 2014.
40 Mittelstadt et al. 2016.

reactive policing.41 For instance, if safeguards are linked to the prosecution phase,
they can leave out the pre-investigation and investigation practices where a lot of
data is already analyzed.42 In the Netherlands, for example, the focus of safeguards
on the correction of errors is in the later stages of the criminal procedure.43 This
ultimately means that if the investigation does not reach a later stage, the correction
might not even take place at all. SyRI, for example, profiles individuals who are not
further investigated if not flagged as high risk, raising questions about ex-post
supervision and safeguards. It might be that an investigatory judge only becomes
involved when there is a concrete investigation, meaning that profiling of individuals who are later marked as low risk could be unsupervised. With regard to
individuals who are flagged as high risk by SyRI, questions arise concerning the
automated system outcomes as the starting point for further investigative measures,
which is a shift from traditional policing as well. Is there supervision on the analysis
that forms the starting point of an investigation or not? It will, however, differ per
criminal justice system what the police or prosecution is allowed to do with the
outcomes of data mining or Big Data analysis. In the Netherlands it has up until
now been very unclear what is enough in terms of an automated result to commence
a criminal investigation.44 In the USA the criterion of reasonable suspicion plays an
important role as a threshold to undertake measures. With the shift to more predictive and proactive policing, it is unclear how this shift in undertaking action
increasingly early influences the criterion of reasonable suspicion.45 The doctrine of
reasonable suspicion, functioning as a threshold in for example stopping a person
on the street, or seizing a suspect, requires some data points or objective obser-
vations to give reason for police action against a suspected individual.46 In literature
arguments are made that this threshold is in practice becoming less usable in the
case of Big Data, including risk profiling, as it is much easier for police officers to
fulfill reasonable suspicion when they have access to Big Data systems showing
them a profile with all the relevant factors.47 This would make the threshold of
reasonable suspicion meaningless. On the other hand it can also be argued that
predictive algorithms allow for more accurate and efficient determinations of rea-
sonable suspicion,48 making sure that people are not unjustifiably searched or
stopped on the street. A similar criterion for deploying measures plays a role in
Dutch procedural law. Article 27 of the Dutch Criminal Procedural Code requires a
sufficient suspicion as well in order to commence investigative measures or for
example stop a person on the street. With risk profiling systems that offer grounds

41 Ferguson 2018; Brinkhoff 2017, p. 68.
42 Koops 2009.
43 Koops 2009.
44 Brinkhoff 2017, p. 68.
45 Ferguson 2015; Broeders et al. 2017.
46 Ferguson 2015.
47 Ferguson 2015; Broeders et al. 2017.
48 Simmons 2016.

for this suspicion, these standard types of procedural safeguards are challenged and
possibly rendered meaningless.
Another challenge pertaining to procedural safeguards is that in the data-driven society decisions are increasingly made based on group profiles.49 In European literature on data protection and privacy there is growing debate on the possibilities for collective procedures to address types of data processing such as Big Data analytics and group profiling.50 The shift in focus from the individual to targeting groups, especially in the context of the EU and its Member States, as well as the shift to ubiquitous analysis and profiling that implicitly targets almost all citizens, raises the question to whom safeguards should be directed. Safeguards and rights in the EU are often linked to individuals, for example in automated decision making. Automated decision making in the law enforcement domain is regulated by Article 11 of the Law Enforcement Directive:51 where it takes place at the individual level and produces an adverse legal effect or a similarly significant effect, it is prohibited unless authorized by national law and accompanied by appropriate safeguards. Article 11 only prohibits decision making at the individual level. This poses difficulties in the case of risk profiling, as profiling is concerned with creating a set of correlations at the aggregate level and subsequently applying it to individuals or groups. It could be argued that only the application of a profile to an individual situation is regulated here. Brkan gives the example of a group being the target of profiling where an automated decision is made to patrol certain areas, affecting the lives of the people who live in such an area.52

12.4.3 Challenges Pertaining to Discrimination

The third type of challenge pertains to discrimination. Data-driven policy can push law enforcement to target specific groups, especially given the pressure53 to make full use of technologies such as algorithms and Big Data analysis. The technology detects the patterns, creates the profiles and finds correlations.54 These technologies can, however, make mistakes, just as humans do, posing a threat of discrimination and stigmatization of certain groups that are designated by the technology as high risk.

49 Hildebrandt and Koops 2010.
50 Taylor et al. 2017; Mantelero 2016.
51 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, L 119/89.
52 Brkan 2017.
53 Rauhofer 2008, p. 192.
54 Koops 2009.

The technology might ‘over-target’ specific groups.55 It has been shown already that risk-based policing targets certain societal groups within EU countries, such as North African youths, soccer supporters, Roma, and Muslims.56 In the case of COMPAS, risk profiling over-targets black defendants.57 COMPAS was deployed mainly to counter racial bias in humans such as prosecution officers, parole officers and judges. However, the first analysis of the system, by ProPublica in 2016, demonstrated that COMPAS underpredicted recidivism rates for white defendants and overpredicted them for black defendants.58 A second study dedicated to this system, in 2018, showed that the same level of accuracy as COMPAS could be achieved with only 2 instead of 137 features.59 This sparked debate in the USA on whether data-driven or automated systems can be fairer than humans.60 While debates centering on racial profiling have existed for years in various countries, the discussion now extends to whether automated profiling may increase racial profiling.61 It is extremely hard to tackle unfair discriminatory profiling if the impacted individuals are not aware that they have been placed in a certain profile, for example on the basis of race.62 Likewise, the actors using the risk profiling system might be unaware of discrimination in their dataset or algorithm, which becomes more difficult to detect and address as systems grow more complex or make use of deep learning.
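
To give a concrete sense of the comparison reported by Dressel and Farid, the sketch below is purely illustrative and is not the study’s code or data: it fits the kind of simple two-feature linear classifier they showed could match the accuracy of the 137-feature COMPAS instrument. The feature names, the synthetic data, and the use of Python with scikit-learn are assumptions made here for demonstration only.

# Illustrative sketch only: a two-feature logistic regression of the
# kind Dressel and Farid (2018) found could match COMPAS's accuracy.
# The synthetic data and feature names below are assumptions for
# demonstration, not the study's dataset or code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
age = rng.integers(18, 70, size=n)       # hypothetical defendant ages
priors = rng.poisson(lam=2.0, size=n)    # hypothetical prior-offence counts

# Synthetic outcome loosely driven by both features (an assumption).
logit = -1.0 + 0.45 * priors - 0.04 * (age - 18)
recidivated = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, priors])
X_train, X_test, y_train, y_test = train_test_split(
    X, recidivated, test_size=0.3, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)
print(f"two-feature accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")

On real data, such a comparison would of course use defendants’ actual records and COMPAS scores rather than synthetic labels; the point is only that the model being compared to COMPAS is this simple.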

12.5 Conclusion

Risk profiling by law enforcement actors can take place in various ways, all of which create challenges. As its prominence in the EU appears to be growing, and as Big Data and artificial intelligence may become further embedded in standard policing practices, prosecution and sentencing, these challenges could require changes in legal safeguards in most jurisdictions. As described in this chapter, the challenges that a lack of transparency poses for due process, equality of arms and the right to a fair trial are important to address. The author is conducting further research on these transparency challenges in the European and Dutch contexts and legal frameworks. The challenges posed by bias, the use of proxies, over-prediction and discrimination in general have been researched for COMPAS in recent years and offer a lesson for future systems developed in the EU, in the light of compliance with

55 Ferguson 2018; Leese 2014.
56 Leese 2014.
57 Angwin et al. 2016.
58 Angwin et al. 2016.
59 Dressel and Farid 2018.
60 Data & Society 2015.
61 Leese 2014.
62 Leese 2014.

the various instances of the principle of non-discrimination in Union law and Member State law. The challenges caused by the shift from more traditional reactive policing and prosecution to pre-emptive, risk-focused, data-driven practices mainly pertain to procedural safeguards and differ per jurisdiction. As with all technological challenges in the current age of Big Data, algorithms and artificial intelligence, this poses interesting questions for legal scholars to research further. The Loomis v. Wisconsin case demonstrated the due process and fair trial questions that can arise from the use of an opaque risk profiling system. Similarly, the European Court of Human Rights63 and national courts64 of EU Member States are increasingly required to deliver judgments on the topic of Big Data and law enforcement. In view of the challenges described above and court judgments on conflicts with human rights, legislators will be required to re-assess their national procedural safeguards in criminal investigations and procedures.

References

AI Now Institute (2018) Litigating algorithms: Challenging government use of algorithmic decision systems. https://ainowinstitute.org/litigatingalgorithms.pdf. Last accessed 30 September 2018
Angwin J, Larson J, Mattu S, Kirchner L (2016) Machine Bias: There’s software used across the
country to predict future criminals. And it’s biased against blacks. https://www.propublica.org/
article/machine-bias-risk-assessments-in-criminal-sentencing. Last accessed 30 September
2018
Brayne S, Rosenblat A, Boyd D (2015) Predictive Policing. https://datacivilrights.org/2015/. Last
accessed 30 September 2018
Brinkhoff S (2017) Big Data Data Mining by the Dutch Police: Criteria for a Future Method of
Investigation. European Journal for Security Research 2:57–69
Brkan M (2017) Do algorithms rule the world? Algorithmic decision-making in the framework of
the GDPR and beyond. https://ssrn.com/abstract=3124901. Last accessed 30 September 2018
Broeders D, Schrijvers E, Hirsch Ballin E (2017) Big Data and Security Policies: Serving Security,
Protecting Freedom. WRR-Policy Brief. https://english.wrr.nl/publications/policy-briefs/2017/
01/31/big-data-and-security-policies-serving-security-protecting-freedom. Last accessed 30
September 2018
Christin A, Rosenblat A, Boyd D (2015) Courts and Predictive Algorithms. https://datacivilrights.
org/2015/. Last accessed 30 September 2018
Clavell GG (2016) Policing, Big Data and the Commodification of Security. In: Van der Sloot B
et al. (eds) Exploring the Boundaries of Big Data. Amsterdam University Press, Amsterdam,
pp 89–116
Data & Society (2015) Data & Civil Rights: A New Era of Policing and Justice. http://www.
datacivilrights.org/pubs/2015-1027/executive_summary.pdf. Last accessed 30 September 2018
Dressel J, Farid H (2018) The accuracy, fairness, and limits of predicting recidivism. Science Advances 4:eaao5580

63 Kosta 2017.
64 Such as the court case pertaining to SyRI in the Netherlands.

Ferguson A (2015) Big Data and predictive reasonable suspicion. University of Pennsylvania Law
Review 163:327–410
Ferguson A (2016) Predictive Prosecution. Wake Forest Law Review 51:705–744
Ferguson A (2018) Illuminating Black Data Policing. Ohio State Journal of Criminal Law 15:503–
525
Fundamental Rights Agency (2018) Big Data: Discrimination in data-supported decision making.
http://fra.europa.eu/en/publication/2018/big-data-discrimination. Last accessed 30 September
2018
Hildebrandt M (2008) Defining Profiling: A New Type of Knowledge? In: Hildebrandt M,
Gutwirth S (eds) Profiling the European Citizen. Springer, Dordrecht, pp 17–45
Hildebrandt M, Koops EJ (2010) The Challenges of Ambient Law and Legal Protection in the
Profiling Era. Modern Law Review 73:428–460
Joh EE (2016) The New Surveillance Discretion: Automated, Suspicion, Big Data, and Policing.
Harvard Law & Policy Review 10:15–42
Keats Citron D, Pasquale F (2014) The Scored Society: Due Process for Automated Predictions.
Washington Law Review 89:1–33
Koops EJ (2009) Technology and the Crime Society: Rethinking Legal Protection. Law
Innovation and Technology 1:93–124
Kosta E (2017) Surveilling Masses and Unveiling Human Rights - Uneasy Choices for the
Strasbourg Court. Tilburg Law School Research Paper No. 2018-10. https://ssrn.com/abstract=
3167723. Last accessed 30 September 2018
Leese M (2014) The new profiling: Algorithms, black boxes, and the failure of anti-discriminatory
safeguards in the European Union. Security Dialogue 45:494–511
Mantelero A (2016) Personal data for decisional purposes in the age of analytics: From an
individual to a collective dimension of data protection. Computer Law & Security Review
32:238–255
Marks A, Bowling B, Keenan C (2017) Automatic justice? Technology, Crime and Social Control.
In: Brownsword R, Scotford E, Yeung K (eds) The Oxford Handbook of the Law and
Regulation of Technology. Oxford University Press, Oxford, pp 705–730
Mittelstadt BD et al (2016) The ethics of algorithms: Mapping the debate. Big Data & Society 3:1–
21
O’Neil C (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens
Democracy. Crown Publishers, New York
Pasquale F (2015) The Black Box Society: The Secret Algorithms That Control Money and
Information. Harvard University Press, Cambridge
Rauhofer J (2008) Privacy is dead, get over it! Information privacy and the dream of a risk-free
society. Information & Communications Technology Law 17:185–197
Robinson D (2017) The Challenges of Prediction: Lessons from Criminal Justice. I/S: A Journal of
Law and Policy for the Information Society. https://ssrn.com/abstract=3054115. Last accessed
30 September 2018
Simmons R (2016) Quantifying Criminal Procedure: How to Unlock the Potential of Big Data in
our Criminal Justice System. Michigan State Law Review 2016:947–1017
Swedloff R (2014) Risk Classification’s Big Data (R)evolution. Connecticut Insurance Law
Journal 21:339–373
Taylor L, Floridi L, Van der Sloot B (eds) (2017) Group Privacy: New Challenges of Data
Technologies. Springer, Dordrecht
Van Brakel R (2016) Pre-Emptive Big Data Surveillance and its (Dis)Empowering Consequences:
The Case of Predictive Policing. In: Van der Sloot B et al. (eds) Exploring the Boundaries of
Big Data. Amsterdam University Press, Amsterdam, pp 117–141

Zarsky T (2014) Understanding Discrimination in the Scored Society. Washington Law Review
89:1375–1412
Zouave ET, Marquenie T (2017) An Inconvenient Truth: Algorithmic Transparency &
Accountability in Criminal Intelligence Profiling. European Intelligence and Security
Informatics Conference. https://ieeexplore.ieee.org/document/8240764. Last accessed 30
September 2018

Sascha van Schendel is a Ph.D. researcher at TILT.


Chapter 13
Regulating Data Re-use for Research:
The Challenges of Innovation
and Incipient Social Norms

Hannah Smith

Contents

13.1 The Rise of Big Data Analytics in Research Utilising Administrative Data ................ 242
13.2 The Study ........................................................................................................................ 244
13.2.1 Methods .............................................................................................................. 244
13.2.2 Data Analysis...................................................................................................... 245
13.3 The Findings.................................................................................................................... 247
13.3.1 Divergent Approaches to the Construction of the Right to Privacy ................. 247
13.3.2 An Account of the EU Directive’s Notion of ‘Privacy’ ................................... 250
13.3.3 Divergent Approaches to the Value of Research as an Activity
in the Public Interest .......................................................................................... 251
13.4 The Potential Drivers of These Divergences .................................................................. 253
13.4.1 Innovations and Uncertainty .............................................................................. 253
13.4.2 The Role of Data Processing Actors ................................................................. 255
13.5 What Does This Mean for the Law? .............................................................................. 256
References .................................................................................................................................. 258

Abstract Keen to capitalise on advancements in data collection, linkage, and analysis, governments are increasingly opening the data they collect through their
interactions with citizens to researchers. This re-use of data is justified as in the
‘public interest’ because it can provide unique insights into socio-economic chal-
lenges, giving decision makers a more robust evidence base for policies. Despite
this reasoning, negative societal responses to certain lawful governmental data
sharing initiatives suggest legal compliance is insufficient to achieve societal

H. Smith
Centre for Health, Law, and Emerging Technologies, University of Oxford, Ewert House,
Oxford OX2 7SG, UK
e-mail: hannah.smith@law.ox.ac.uk


acceptance. Notwithstanding the importance of societal expectations, few empirical studies have explored societal attitudes towards the re-use of administrative data in
social research. This chapter explores the presence and potential drivers of diver-
gences between the law and individuals’ constructions of acceptable data pro-
cessing. Drawing on the EU Data Protection Directive and data collected from two
focus groups convened for this study, it proposes that whilst the legal approach to
data processing is unaltered by innovations in data processing, this novelty had
implications for participants’ views. The uncertainty resulting from innovative data
processing and disillusionment with its supposed benefits prompted desires for
greater control over personal data and a questioning of the ‘public interest’ in
research. Incipient social norms challenge traditional conceptions of the law’s
legitimacy that are rooted in its ability to reflect such norms. They potentially wield
significant power over the implementation of innovative data processing activities, prompting a need to explore mechanisms that can prevent undue restrictions on
individuals’ privacy interests or the public interest in research.

 
Keywords data protection · EU Data Protection Directive · societal expectations · administrative data · social research · privacy · public interest

13.1 The Rise of Big Data Analytics in Research Utilising Administrative Data

Interactions between the citizen and the state during the delivery of public services, record keeping, and other transactions, such as the submission of tax returns or applications for documents, generate increasing amounts of administrative data. Sharing this data with researchers can advance our understanding of social phenomena and thus help better address the socio-economic problems faced by society.1 Traditionally, research utilising these data has been difficult. The datasets of various government departments are often stored separately, and data controllers have been reluctant to acquiesce to researchers’ requests to access, link, and use these data.2 However, the perception that data-driven decisions play a fundamental role in enabling greater efficiency and effectiveness in the delivery of governmental services has driven a push to open up administrative data to researchers in the UK.3 The advances in technology that permit the creation and exploitation of ‘big data’, which can lead to the discovery of previously unknown links, patterns, and associations, have motivated this development. Big data refers to both the volume, variety, and velocity of data sets and the sophisticated algorithms

1 Involve 2015, p. 10.
2 The Law Commission 2014, ch. 7.
3 HM Government 2017, p. 10.

used in the analysis of such data. These developments combined have facilitated rapid advances in today’s data-driven decision-making.4
The re-use of administrative data for research, particularly where this increases the robustness and validity of government policy choices, is considered an activity in the public interest. Nevertheless, the ability to reveal unknown patterns and links has potential negative ramifications. The use of sophisticated algorithms and the ability to link ever greater amounts of data can permit new and deeper intrusions into individuals’ privacy. In response, data protection law seeks to protect an individual’s right to privacy but also to promote the free flow of data.5 The decision of the CJEU in Lindqvist6 confirms that these objectives ‘may of course be inconsistent with one another’7 and that it is for Member States to use the mechanisms in the current regulation, which will be set out in more detail below, to achieve a ‘fair balance’8 between the two interests.
The UK societal reception of certain data sharing initiatives, including care.data,9 which sought to link patients’ medical records and open them up to researchers, has been negative. This suggests that the law’s current approach towards the protection of privacy and the promotion of the public interest in data sharing has, in the eyes of some British citizens, failed to secure this ‘fair balance’, leading such uses of data to be deemed inappropriate and unacceptable.10
This chapter suggests that societal norms and individuals’ expectations are crucial for securing data protection law’s legitimacy and that, without acknowledging the role social norms play, it is impossible to understand how lawful activities may nevertheless provoke a negative societal response.11 When discussing social norms, this chapter draws on the works of Nissenbaum12 and Brownsword,13 who define them as phenomena that embody society’s preferences,
Brownsword13 who define them as phenomena that embody society’s preferences,
desires, and notions of justice which regulate behaviour by being perceived as
actions one ‘ought’ to do. Despite such recognition of the power of social norms,
little empirical work has explored individuals’ attitudes towards the use of
administrative data in social science research.

4 Yeung 2017.
5 See Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data 1995 (Official Journal of the European Communities) 31, s. 1.
6 C-101/01 Bodil Lindqvist Judgment of 6 November 2003.
7 Ibid.
8 Ibid., pp. 82–90.
9 Solon O (2014) A Simple Guide to Care.data. https://www.wired.co.uk/article/a-simple-guide-to-care-data. Last accessed 28 August 2018.
10 Sterckx et al. 2016; Carter et al. 2015.
11 Brownsword 2015, p. 13.
12 Nissenbaum 2009.
13 Brownsword 2015.

The findings align with the work of Nissenbaum,14 who argues that technological
advances can challenge previous commitments to values and principles present in
existing legislation. A failure to recognise the implications of these changes to
previous commitments risks the creation of even greater divergences between legal
and societal approaches towards data re-use. This potentially has significant neg-
ative consequences for research, such as deterring researchers from undertaking
research that may contribute to the public interest due to fears of a public backlash
and making data controllers reluctant to share data with researchers.15 To explore
these divergences and their implications for regulation, this chapter will present the
findings from an exploratory study that sought to compare individuals’ attitudes to
the approach of the EU Data Protection Directive 1995.16

13.2 The Study

13.2.1 Methods

The existence of legally compliant but societally unacceptable uses of government data, such as care.data, informed the approach of this project. It sought to examine
individuals’ attitudes and expectations towards the re-use of administrative data in
social research, to compare these to the approach of the Data Protection Directive,
and to consider the implications of any identified divergences between these
approaches. The author of this chapter is based in the UK and the findings feed into
a project that commenced after the UK government’s decision to leave the EU but
prior to it confirming that it would implement the General Data Protection
Regulation (GDPR), whose final text was approved in April 2016. Therefore, the
decision was made to explore the presence of divergences by reference to the EU
Data Protection Directive. The author’s future work will build upon the findings
presented here by examining the changes introduced by the GDPR.
To identify individuals’ expectations, two focus groups comprising 15 participants in total were undertaken. Participants were recruited through emails disseminated via mailing lists owned by various departments and faculties within
the University of Oxford. To achieve a broader demographic of participants,
attempts were made to recruit from the wider Oxford community, but these proved
unsuccessful. The homogeneity of the focus group participants, in terms of their age
and educational background, is a limitation of this study but the findings still

14 Nissenbaum 2009.
15 Latham A (2018) Cambridge Analytica Scandal: Legitimate Researchers Using Facebook Data Could Be Collateral Damage. https://theconversation.com/cambridge-analytica-scandal-legitimate-researchers-using-facebook-data-could-be-collateral-damage-93600. Last accessed 29 May 2018.
16 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data [1995] Official Journal of the European Communities 31.

provide an insight and a foundation for future empirical research. The project’s
aims informed the decision to use focus groups to collect data on participants’
attitudes and opinions. Focus groups consist of small groups of participants who
draw upon their personal experiences to provide researchers with an insight into a
wide range of accounts and responses. The use of open-ended questions allows
participants to guide researchers towards the aspects of the project they perceive as
important, making it a method eminently suited to the small-scale exploratory
nature of this project. The discussion-based nature of focus groups facilitates col-
laborative and confrontational interactions that enable insights into poorly under-
stood phenomena and where questions of acceptability are important.17 These
interactions and the emphasis on participants using their own experiences and
understandings to clarify and justify their views allow researchers an insight into
what participants think and why they think a certain way. This emphasis on the
‘voice’ of the participants counteracts any power imbalances that may exist between
the researcher and the participants, a point especially pertinent to this project.
Studies18 have indicated that individuals lack awareness of data protection law and research, and so may be hesitant to ‘open up’ to somebody they perceive as an expert in the field for fear of being wrong and judged.

13.2.2 Data Analysis

With the focus group participants’ consent, the discussions were recorded and then
transcribed verbatim. The inclusion of pauses, hesitations, and emphasis, where
participants forcefully made their contribution, assisted the data analysis by indi-
cating areas of uncertainty, where participants changed their opinions upon
reflection, and where they felt particularly strongly about a topic. It was important
to capture these subtle aspects of the discussions, as what people say and how they
present their views indicate their views and attitudes.
Nissenbaum’s theoretical framework of privacy as contextual integrity19 and
theories of reflexive modernisation,20 which explore individuals’ attitudes towards
changes in society caused by technological advances, aided the process of data
analysis. Nissenbaum emphasises the context-dependent nature of individuals’
expectations towards the appropriateness of a flow of information. Her account of a ‘context’ is understood as a structured social setting comprising its internal norms, goals, and purposes, the roles in which people act, the type of information shared, and the ‘transmission principles’ which constrain flows of information. These principles may be derived from the law or from social norms,

17 Powell and Single 1996.
18 Royal Statistical Society 2014.
19 Nissenbaum 2009.
20 Beck et al. 1994.

for example the idea of ‘secrecy’.21 This is useful in the analysis of the focus group discussions as it facilitates a more fine-grained exploration and analysis of the sources
of divergences. It can highlight factors that shape individuals’ sense of appropri-
ateness that may not be present in a legal framework and vice versa. This enables a
comparison between ‘novel’ and ‘entrenched’ flows of information. Novel infor-
mation flows, such as those introduced by technological innovations, may be
perceived as inappropriate where they transgress a context’s norms of appropri-
ateness, governing the type and nature of information shared, or its norms of
distribution, determining the appropriate flow of information between actors
operating in a context. Where this occurs, Nissenbaum posits that the societal
acceptance of a novel information flow depends upon its ability to demonstrate its
superiority over the entrenched information flow. In doing so, her framework is of
value in exploring why some innovation prompts societal discord and, through the
idea of the ‘superiority’ of an information flow, indicates a potentially critical
element of societal acceptance that may not be present or may be constructed
differently in the legal framework. This framework thus supports the aims of this
project to explore factors that may shape individuals’ attitudes towards new forms
of data processing and compare them to the legal framework.
Whilst Nissenbaum’s framework aids the exploration of attitudes towards a
specific example of a technological innovation altering the way information flows
in a given context, theories of reflexive modernisation situate technological inno-
vations in ‘the wider social and political context of which [these] technologies are a
part.’22 This serves to broaden further the range of factors understood to influence
the societal reception of technological advances when analysing the focus group
data. Brownsword’s argument that without acknowledging the ‘full range of norms
in play’23 one cannot meaningfully understand instances where the law fails to
channel societal conduct is useful in analysing the divergences between the
approach of individuals and the law towards certain activities. Reflexive modernisation posits that technological advances have, as undesired side effects of their novelty, created new forms of risk to society that have indeterminable origins and defy accurate measurement. These attributes align with some of the problems associated with big data analytics and how they can permit greater and unexpected intrusions into individuals’ privacy interests.24 Beck’s theory of risk society draws attention to how innovations prompt feelings of uncertainty, which can lead individuals to associate novel advances with a heightened perception of risk and to be disillusioned with their supposed benefits. To this end, Nowotny et al.’s25 work on
how these feelings, prompted by the disturbing challenges resulting from techno-
logical advances, serve to undermine the perceived expertise and authority of

21 Selbst 2013.
22 Kerr and Cunningham-Burley 2000, p. 285.
23 Brownsword 2015, p. 13.
24 Mittelstadt and Floridi 2016.
25 Nowotny et al. 2001.

experts, such as researchers, is insightful in the importance it attributes to societal constructions of risk. Different constructions of risk and attitudes towards the actors
involved in the re-use of data in social research serve as a starting point in
examining the data for divergences and their potential drivers.
Both theories are of value in drawing attention to how rapid technological
innovations may, due to the changes they are perceived to make to existing rela-
tionships and institutional structures,26 provoke a negative societal response yet still
be legally permissible. They aid the analysis by suggesting that the building blocks of societal constructions of the risks and benefits of technological advances may not be the same as those used in the law, and may not respond to the actual practices regulated by the law. However, these theories, as lenses through which to analyse the findings, have limitations that preclude a sole reliance on them in the process of data analysis. Nissenbaum’s framework is more suited to contexts where the norms, values, and end goals are better known and shared within a society, hence the examples of the education and healthcare contexts in her book.27
indicate this to be an unfamiliar context, characterised by incipient social norms and
uncertainty in the face of the rapid changes driven by innovations in big data
analytics. Beck’s theory of risk society has been criticised for a lack of clear
measurable indicators that can be used in empirical studies,29 potentially due to its
level of abstraction.30 Furthermore, the ability of focus group participants to provide researchers with new ways of approaching and understanding the studied phenomena meant that, during the data analysis, the possibility of divergences and sources not present in these theories was still explored.

13.3 The Findings

13.3.1 Divergent Approaches to the Construction of the Right to Privacy

The right to privacy is fundamental to the protection offered to individuals by the Data Protection Directive and is listed as one of the objectives of the legislation, yet a near universal recognition of the importance of privacy has not produced a consensus as to its definition. This disagreement appears to be reflected in the

26 López and Scott 2000, p. 3, which states that they are ‘cultural or normative patterns that define the expectations that agents hold about each other’s behaviour and that organize their enduring relations with each other’.
27 Nissenbaum 2009.
28 Cameron et al. 2014.
29 Laraña 2001.
30 Olofsson and Öhman 2007.

differing approaches to privacy, in the context of the re-use of administrative data in social research, indicated by the analysis of the provisions of the Directive and the
focus group discussions. The relationship between the right to privacy and the
protections offered to individuals by data protection law is a source of contention,
particularly after The Charter of Fundamental Rights of the European Union
introduced a right to the protection of personal data distinct from the right to
privacy.31 Driven by the language used by the focus group participants, their key
concerns, and what appeared to be the drivers of their attitudes, this chapter focuses
on divergences arising from differing approaches to privacy within data protection
law. Some participants’ responses suggest that they drew upon libertarian values in their construction of privacy in this context, prompting an individualistic construction of privacy. Such a construction promotes an individual’s right to
determine what information is disclosed and to whom. The influence of libertarian
ideology on privacy is present in the work of Warren and Brandeis who, in one of
the earliest legal recognitions of the right to privacy, conceived it as the ‘right to be
let alone.’32 Privacy was constructed as a right that could ensure solitude and
emphasised the importance of being able to ‘retreat from the world.’33 Framed in
this way, some participants’ understanding of appropriate uses of data is linked to
their ability to control their personal data as a way of supporting their autonomy and
securing their individual dignity.34 This analysis is supported by the preference
some demonstrated towards the ‘consent or anonymise’ approach towards data
governance. This allows the use of identifiable data only where individual consent
is obtained.35 Some participants were only comfortable with their data being used
where consent had been obtained.
I think that if, for whatever reason, whatever topic or issue needs to be examined
by researchers it should be carried out by directly obtaining consent of the target sample
(FG2, P5)

The collection of consent, enabling individuals to retain a high degree of personal control over their data and its uses, seemed to fulfil several functions that led
to the re-use of administrative data in social research being perceived as an
acceptable use of data. The findings indicate that the act of consent appears to instil
the activity of research with a sense of legitimacy by ensuring that it aligned with
participants’ values and interests.
You know, I…might consent to my use of data in one thing and not to another and its
respecting the fact, and maybe my reasons for that are irrational or, I don’t know, I have
strong religious views that means I don’t want my research data to be used for the gen-
eration of a particular kind of drug or service or whatever it is (FG2, P3)

31 Tzanou 2013.
32 Warren and Brandeis 1890.
33 Ibid., p. 196.
34 Westin 1970.
35 Academy of Medical Sciences 2006.

Others expressed a preference for consent as a means to ensure acceptable data processing by reference to the type of relationship it creates between participants
and researchers.
[Consent] recognises that [researchers] don’t just have the right to this information but they
must seek this information and that shows that the researcher has given thought and ensures
that the questions will be asked are respectful (FG2, P5)

This suggests that consent, as a mechanism for ensuring an individual’s privacy interests, facilitates appropriateness and acceptability by demonstrating a level
of respect that participants required to support this use of data. Examined through
the lens of Nissenbaum’s framework of privacy as contextual integrity, within the
specific context of the re-use of administrative data for research purposes, partici-
pants seemed to require the values of autonomy and dignity to be assured, with
consent operating as a transmission principle to uphold these values and ensure
appropriate data flows. One participant argued that the use of anonymous data
would ‘circumvent the issues of personal dignity’ (FG1, P9). These findings align
with previous studies that highlighted how mechanisms of consent can operate as
an act of courtesy.36 The focus group data suggests that the possibility of using
other mechanisms for determining appropriate uses of data in this context may not
be considered acceptable by individuals and may be at risk of lacking societal
legitimacy. A focus on the values of dignity and respect aligns with deontological justifications for attitudes. Such justifications lead to constructions of appropriate data processing that respect privacy interests through the promotion of ‘right’ outcomes, independently of whether ‘good’ outcomes are promoted.37 Here, individuals valued their privacy based on their understanding of the moral duties they regarded as relevant in this context, rather than the extent to which the protection of
privacy could promote other desirable results, such as maximising social welfare.38
However, other participants disagreed with a sole reliance on consent for
ensuring appropriate uses of data. Some recognised the difficulties that would arise
and thus supported the use of other mechanisms to determine appropriate uses of
data, even where this would translate into less individual control.
there has to be a balance struck between wanting people to consent to their data and people
feeling like they have consented to the use of their data and actually trying to generate any
kind of meaningful research and I think if you rely entirely on what each individual has
consented to you’re gonna get nowhere (FG2, P3)

Such statements demonstrate the range of views held by participants and how
there is no consensus as to the norms, values, and goals of this context. Whilst partic-
ipants were able and willing to participate in the discussions, some were very
hesitant to share their views and many were prone to changing their minds when
discussing points within the group, suggesting the norms in this context are uncertain

36 Trinidad et al. 2012.
37 Lindsay 2005.
38 Lindsay 2005.

and incipient. Where traditional conceptions of research are utilised by participants, including the paramount importance of consent and the creation of a direct rela-
tionship between the researcher and the researched, such features appear critical in
participants’ attitudes towards the appropriateness of the use of data in research. This
is challenging where data are re-used as this often occurs without the knowledge of
individuals, let alone their consent and active participation. Such research runs
counter to such participants’ perceived appropriate uses of data, suggesting a
potential lack of social legitimacy where data are used for these purposes.

13.3.2 An Account of the EU Directive’s Notion of ‘Privacy’

The Data Protection Directive permitted data processing where it fulfilled one of the
grounds listed in Article 7, which concerned the legitimisation of data processing.
Consent39 is listed as one ground but alternatives are given, such as where necessary
for ‘the performance of a task carried out in the public interest’40 and where necessary
for ‘the purposes of the legitimate interests pursued by the data controller’.41 This
suggests that the Directive’s protection of individuals’ privacy interests is not
achieved through individual control alone. The alternative grounds demonstrate that
processing may be lawful irrespective of an individual’s knowledge of that data
processing. Furthermore, many of the regime’s requirements are addressed to the data
controllers and processors, not the data subjects. This supports a less individualistic
construction of appropriate privacy protection in the context of data processing
activities. Framing provisions to emphasise the role of the data controllers and pro-
cessors minimises the extent to which a data subject can determine what happens to
their personal data. Nevertheless, individuals are granted various rights and the ability
to exercise these, even where it could prohibit processing that could create ‘good’
outcomes. These align with individualistically focussed and deontologically sup-
ported constructions of privacy. The right to object42 and the right to information43
support the ability of an individual to control their data. These rights, however, are
subject to limitations that undermine this control, particularly where data are pro-
cessed for research purposes. Where the exercise of the right to information would
involve a ‘disproportionate effort’ on the part of the researcher, it may be limited.44
The right of access could also be restricted where data were processed for research
purposes, so long as adequate legal safeguards were in place.45 This unique approach

39 Article 7(a) Directive 95/46/EC.
40 Article 7(e) Directive 95/46/EC; Wong 2012.
41 Article 7(f) Directive 95/46/EC.
42 See Article 14 Directive 95/46/EC.
43 See Section IV Directive 95/46/EC.
44 Article 11(2) Directive 95/46/EC.
45 Article 13(2) Directive 95/46/EC.

towards research suggests it is constructed by the Directive as a form of processing with a unique ability to contribute something beneficial, warranting limitations to individual rights and individual control. This suggests the Directive’s approach is justified not by deontological reasoning but by reasoning that aligns more with consequentialist
justifications. The approach of the Directive indicates that research is constructed as an
activity with the potential to promote such ‘good’ outcomes that it warrants a different
reconciliation between the right to privacy held by individuals and the benefits arising
from the free flow of data46 than that present in other examples of data processing.
This approach appears justified more by reference to the potential outcomes of data processing for research purposes than by the ‘rightness’ of such research and its
alignment with an individual’s perceived moral duties.
These findings indicate that the Directive and focus group participants used
different values and justifications to determine what protection should be afforded to
individuals where their data are re-used in social research. Participants, influenced
by libertarian values that emphasise the importance of control for securing their
autonomy, constructed a very individualistic form of privacy. This, coupled with evidence of the use of deontological justifications seeking to promote ‘right’ outcomes that aligned with their understanding of what it means to be a moral person, led them to desire a greater level of control than is present in the Directive. This suggests
that the societal understanding of the ‘fair balance’ to be struck between the right to
privacy and use of data is currently not incorporated into the law.

13.3.3 Divergent Approaches to the Value of Research as an Activity in the Public Interest

The Directive regulated the activity of data processing and, as a result, the outcomes
of data processing are not necessarily relevant to the legality of a data processing
practice. Data processors do not have to demonstrate that the processing will be
beneficial for individuals for it to be lawful. In contrast, participants’ understanding
of research’s potential benefits influenced their perceptions of the appro-
priateness of the re-use of administrative data for social research. Many partici-
pants’ support for data re-use in research hinged on its ability to demonstrate a
beneficial contribution.
So long as it’s not just something that doesn’t really have a benefit […] I want it to be
something where they’d use that information to tackle an issue rather than see if there is a
link for some arbitrary reason (FG1, P5)

Some participants did recognise the potential benefits that could arise from this
use of their data with one noting that ‘looking at big data that way has definitely
had a lot of beneficial umm effects’ (FG1, P9). However, many participants, in

46
Listed as the two objectives in Article 1 Directive 95/46/EC.

discussing the potential outcomes from the use of administrative data in social
research, recognised the harms that it could cause to individuals. Some discussed
their fears that the use of data in this way could lead to the ‘profiling’ of individuals
and the ‘pigeonholing’ of certain individuals and communities because of research
findings. Examined through the lens of Nissenbaum’s framework, these responses suggest that the novelty associated with the use of big data analytics on
administrative data does not support the end-goals of the context of research. This is
due to the potential for research to create harmful outcomes for individuals and
communities. Participants offered hypothetical scenarios to support their views as to
the potentially harmful nature of this research, highlighting the uncertainty they felt
as a result.
receipt of benefits…it could be done in a positive way, looking at correlations and causal
factors that might result in somebody….needing to receive state support, welfare support.
Equally, it could fall into the same trap…of building on stereotypes that we already have.
(FG2, P3)

Examined through the lens of theories of reflexive modernisation and risk society, such attitudes indicate the ambivalence that participants felt towards the
advancements in data processing in research. The novelty of big data analytics, in
its ability to provide novel insights into social phenomena, is simultaneously
recognised as a success and a disturbing development, provoking disillusionment
with the benefits associated with research utilising this innovation.
The Directive adopts a broad definition of research, encompassing both medical
and social science research, but focus group participants made distinctions not
reflected in the law. The participants’ responses suggest that social research is
perceived differently to medical research. Whilst previous studies47 have suggested
that individuals view medical research as a valuable activity in the public interest,
the findings of this project suggest this may not be generalisable to other forms of
research. One participant was immediately supportive of the use of their data for
medical research, but their tone suggested they were more cautious about the use of
their data in other types of research. The participant gave only tentative support for
social science research and restricted this research to using aggregate data: ‘yeah, I mean I guess…as long as it’s like, in aggregate form’ (FG1, P1). This distinction
between different types of research, based on attitudes towards their appropriateness
and potential for contributing a benefit, is not present in the Directive. The Directive
does include provisions on scientific and historical research, but does not distin-
guish them based on their benefits.
Research, as a data processing activity, also benefits from an assumption that the
re-use of data in research is compatible with the original purpose for which the data
were collected. Whilst other re-uses must demonstrate a separate Article 7 ground
to legitimise the processing of data collected for a different purpose, researchers are
relieved of this burden. Given this, and given that the outcomes of research are not

47 Hill et al. 2013; Aitken et al. 2016.

a relevant consideration in the lawfulness of data processing, research appears to be constructed by the Directive as an activity so inherently worthy as to justify a novel approach. This is a further potential example of the Directive constructing research as a data processing activity that promotes ‘good’ outcomes, justifying an approach that encourages data processing for this purpose.

13.4 The Potential Drivers of These Divergences

13.4.1 Innovations and Uncertainty

The findings in this chapter support previous empirical studies48 that suggested
individuals lack a clear understanding of how collected data are used and that
incomplete information can cause feelings of uncertainty towards activities that
process data. The advancements in technology associated with big data exacerbate
these issues as they often render the collection and usage of personal data ‘invis-
ible.’49 Robinson50 argues that the increasing opacity of data processing and its uses
is one of the key challenges in regulating modern data processing. This opacity
hinders individuals’ ability to exercise meaningful choices, something the findings of this study indicate was a key concern of the focus group participants. Moreover, the ability to
re-use data without informing individuals leads to an even more complex and
uncertain environment that individuals are required to navigate. The re-use in research of big data datasets collected for other purposes, combined with novel analytical techniques, will only exacerbate these issues by greatly increasing the ways that data can be linked, shared, and analysed. Examined through the lens of Nissenbaum’s framework, this uncertainty prompts a significant diversity in the values and interests that shape the context. This diversity, coupled
with the incipient nature of the social norms that determine the appropriateness of
the use of big data analytics where administrative data are re-used in social research,
makes it difficult for this novel flow of information to demonstrate its superiority.
As noted above, Nissenbaum posits that novel information flows that permit greater intrusions into individuals’ privacy interests may be accepted where they can demonstrate superiority over the entrenched flow of information. Securing societal acceptance and demonstrating appropriateness is challenged where it is unclear exactly which values and norms the novel flow is being judged against. The greater
the diversity and ambiguity as to these values, the more difficult it will be for novel
flows of information to be deemed appropriate by individuals in society, even
where they are lawful uses of data. ‘Superiority’ may be critical for securing
societal acceptance of novel flows of information, but this is not an attribute the law

48 Information Commissioner’s Office 2015; Acquisti et al. 2015.
49 Acquisti et al. 2015, p. 509.
50 Robinson et al. 2009.

requires from a data processing activity to comply with its requirements. This
divergence may thus lead to differing societal and legal responses to innovative
technologies as the obligations imposed by the Directive are not altered by the
novelty of technologies, whereas participants in this study expressed more caution
towards such advancements.
During the focus groups, whilst participants were willing to engage with the
questions and discuss their attitudes, they did raise concerns about their lack of
knowledge and the implications this could have for their privacy expectations.
I don’t think anyone ever thinks, most people don’t have cause to think of all the different
pieces of their data that might be held against them and how they’re being used (FG2, P3)

It is notable that the participant framed their comment in a negative light. They
linked their uncertainty about the potential uses of data to the risk of their data being
held ‘against them’. Such language accords with theories of reflexive modernisation, where innovation is perceived as both a success and a disturbing development, with such changes leading individuals to view innovation as inherently risky. This
societal construction of the risk of innovation may lead to the neglect of the ‘goods’
of modernisation, including the ability to better inform and evaluate policies, and a
focus on the ‘bads’ of this progress, such as the ability to profile individuals.
Participants justified their attitudes by reference to some high-profile negative
media portrayals of governmental uses of data and their regulation. Sunstein51
posits that individuals assess risk by reference to readily available examples of an
outcome and perceive something to be a greater risk where it reminds them of a
similar outcome. The lack of public attention towards the use of administrative data
in social science research and the low understanding of what can be done with such
research52 means it is potentially vulnerable to these types of comparisons. This
creates a further hurdle in that the use of big data analytics on administrative data
for research papers must be overcome for it to demonstrate its superiority over more
traditional, and widely known, forms of research.
Whilst the Directive regulates actual data processing practices, societal con-
structions of the risks associated with novel data processing practices may not be
founded upon actual practices, leading to divergent approaches. The findings
indicate support for Beck’s theory of risk society53 where the dangers arising from
technological advancements dominate and individuals, in response, seek to protect
themselves. As the societal construction of risk may not align with actual practices,
this may lead to individuals requiring greater levels of individual control not present
in the Directive.

51 Sunstein 2002.
52 Information Commissioner’s Office 2015.
53 Beck et al. 1994.

13.4.2 The Role of Data Processing Actors

The participants’ responses highlighted the importance of the actors, and the roles they play, in shaping attitudes towards the appropriateness of re-using data for research. The responses suggested a mistrust of the Government’s willingness to use research in an appropriate way, leading participants to perceive this use of data as unacceptable.
[I]f the Government suddenly see you’re on that much income they might check on you on
more often or check you for benefit fraud more often if you live in a certain area which
leads to very simple but, you know, quite powerful stigmatisation of people (FG2, P4)

Here, the Government, as a key actor in this context, was perceived as playing a
role that aligns with Orwellian notions of a ‘Big Brother’ society, rather than one
concerned with the welfare of individuals. Whilst participants did acknowledge the
expertise that researchers had in this context:
[M]aybe the original report or the original findings weren’t jumping to those conclusions
but the politicians who’ve read it clearly have (FG1, P3)

This did not alleviate all the concerns participants held. The findings thus support Nissenbaum’s framework, which posits that the actors in a context, and the roles they play, help shape individuals’ attitudes towards the appropriateness of a data processing practice. The focus on the authority of researchers, and its potential to be undermined in the context of the re-use of administrative data in social research, accords with theories of reflexive modernisation. These theorise that changes in
society, caused by advancements in technology, affect the authority of science and
researchers. Participants’ acknowledgement of expertise was insufficient to con-
vince them of the superiority of this flow of information. Giddens54 argues that in a
reflexively modern society the attribution of expertise becomes increasingly context
dependent. Nowotny et al.55 contend that societal changes have led categories of
human enterprise, such as science, into more ‘contextualised arenas.’ This con-
textualisation, such as where scientific output is used to justify political choices,
blurs the boundaries between enterprises. This means science is no longer perceived
as a neutral referee and is unable to justify its contributions to society by reference
to facts alone.56 The further its work is utilised in the political sphere to justify
potentially significant changes in society, the more individuals will judge science
and research by reference to the values used by individuals to orient themselves
within society.57

54 Giddens 1990.
55 Nowotny et al. 2001, p. 28.
56 Latour 1998.
57 Nowotny et al. 2001, p. 29.

The findings suggest technological advances have caused fundamental changes to how key structures in society are perceived by citizens. This has made some view these changes as imposing new and unacceptable burdens upon them. The novel technological advances that make the re-use of administrative data in research attractive to both researchers and governments have breached the existing information norms without demonstrating sufficient superiority over
existing information flows to secure societal acceptance. The framework proposed
by Nissenbaum and theories of reflexive modernisation elucidate how technological
advances have prompted a negative societal response to certain data processing
practices, despite their compliance with the law.

13.5 What Does This Mean for the Law?

This leads to the question of how the law could respond to the potential divergences
and their root causes identified in this chapter. This is not a question with an easy
answer or a ‘quick fix.’ Any proposed responses will depend on conception of the
relationship between law and society, an area of significant debate and disagree-
ment. This challenges further the securing of societal acceptance towards the re-use
of administrative data in social research. Traditional conceptions of this relationship
subscribe to what Tamanaha terms the ‘mirror thesis.’58 This premises the societal
acceptance of the law and the legitimacy of legal norms on the extent to which they
reflect the norms of the society they purport to govern. Based on this
conception of the law, the legitimacy of activities governed by data processing law
would require the law to enable greater individual control over data processing and
greater granularity in its approach to regulating data processing for research pur-
poses. However, this conception of the relationship between law and society has
been criticised. Edwards59 argues that divergences are not ipso facto problematic,
introducing scope for their existence to be justified. Here, the findings indicate that
the societal norms in this context appear to be transient and incipient, challenging
the appropriateness of their inclusion in a legal framework. There is a danger that,
in attempting to reflect these incipient social norms, the law would potentially be
responding to unwarranted public fears.60 This would introduce regulation with a
‘whiplash effect’61 as, in response to these fears, the legal regime would over-react
by focussing on the perceived harms, rather than the actual practice. Such regula-
tion would thus stifle innovation, potentially at the expense of the public interest.
Allen62 argues that there are times where it is appropriate to constrain individual

58 Tamanaha 2001, ch. 3.
59 Edwards 2015.
60 Sunstein 2002.
61 Mittelstadt and Floridi 2016, p. 305.
62 Allen 2011.

choice where this would preserve foundational goods. Moreover, the process of
creating and updating legislation is a time-consuming process and attempts to align
with norms that are constantly evolving may not be practically possible.
Such challenges have prompted some to construct a different relationship
between law and society. Cotterrell63 argues that law’s power derives from its
ability to influence those it purports to govern through the fixing of common sense
understandings of society and the relationships within it. Rather than reflecting a set
of unstable norms, the law instead seeks to shape the concepts used by individuals
through defining them in a certain way. This understanding of the relationship
between law and society is present in the law’s construction of privacy and the
ways in which this is reconciled with the public interest in certain forms of data
processing. Cotterrell,64 however, recognises that the law alone cannot fix these
understandings. Instead, they must be derived from the views of society as to the
general nature of law and its appropriate functions. The findings suggest it is here,
where the law attempts to draw upon general societal values, where these diver-
gences occur. The study’s findings indicate the law fails to incorporate fully the
extent to which these values are contextualised by individuals and the range of
factors that influence individuals’ views as to the appropriateness of data
processing.
This has led some to perceive the role of law in this context as necessary but not
sufficient to address all the challenges created by the advances in technology on the
societal acceptance of legally compliant data processing practices. Bennett and
Mulligan65 have instead proposed the use of a policy toolbox that includes both
legislation and self-regulatory instruments, such as sectoral-specific Codes of
Practice, which have been included in the General Data Protection Regulation,66
which came into force after this project had commenced. The use of self-regulatory
instruments would allow for data processing regulation to be more contextualised
and better reflect individuals’ differing expectations in various contexts. This
approach is not without its drawbacks. Self-regulatory instruments are less likely to
consider fully public goals and may instead promote the interests of the actors
operating within a given sector.67 A further issue is that their societal acceptance
would be dependent on the extent to which individuals perceived the creation of
self-regulatory instruments as an appropriate role for actors within a given context.
The findings from this research indicate that the contextualisation of science and the
blurring of boundaries between research and politics decreases the extent to which
researchers are regarded as figures of authority and can claim a sense of expertise.

63 Cotterrell 2004.
64 Cotterrell 2004.
65 Bennett and Mulligan 2012.
66 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC [2016] Official Journal of the European Union 1.
67 Bennett and Raab 2006.

Others have suggested that this use of data could secure social legitimacy by the
creation of a social licence.68 This term refers to the relationship between a certain
profession and society where compliance with certain requirements, usually beyond what is required by the law, can secure the societal acceptance of behaviours otherwise perceived to deviate from what is acceptable. The ability to secure a social licence is challenged in this context, as its construction depends on how societal demands are articulated. This requires individuals to understand the potential harms and to be able to spell out the implications to the profession seeking the licence.69
The findings of this study suggest individuals may lack the knowledge to under-
stand and articulate the harms they perceive data processing to cause. Furthermore,
there is also a lack of fora for this type of discussion to occur.
The lack of societal acceptance has led to arguments70 in favour of greater
engagement between researchers and individuals in society to include the views of
the latter. Brownsword71 promotes the greater inclusion of individuals on the basis
that the common good can only be advanced where decision makers are properly
informed as to what it constitutes. Due to the increasingly pluralistic nature of
modern societies, this requires decision makers to take account of the different
elements of society and their viewpoints. Whilst engagement is viewed as a way of
helping to deliver legitimacy,72 it is not without its challenges. For public
engagement to have a significant impact, it must take place before decisions are
finalised. However, as noted above, the incipient nature of the relevant social norms and the indeterminable nature of big data processing render the relevant social and ethical problems unclear, making useful engagement challenging.73
This chapter has sought to demonstrate that there is no easy answer as to how we
should regulate in uncertain times. It has, however, highlighted the importance of
determining the roots of societal discord with legally compliant practices. This is
what enables a more fine-grained analysis of the issues identified and provides a
more solid foundation for determining the direction of future regulation.

References

Academy of Medical Sciences (2006) Personal Data for Public Good: Using Health Information in
Medical Research
Acquisti A, Brandimarte L, Loewenstein G (2015) Privacy and Human Behavior in the Age of
Information. 347 Science 509

68 Hughes and Everett 1981.
69 Raman and Mohr 2014.
70 Wynne 2006; Harmon et al. 2013.
71 Brownsword 2008.
72 Harmon et al. 2013.
73 Haddow et al. 2011.

Aitken M et al (2016) Public Responses to the Sharing and Linkage of Health Data for Research
Purposes: A Systematic Review and Thematic Synthesis of Qualitative Studies. 17 BMC
Medical Ethics 73
Allen AL (2011) Unpopular Privacy: What Must We Hide? Oxford University Press
Beck U, Giddens A, Lash S (1994) Reflexive modernization: Politics, tradition and aesthetics in
the modern social order. Stanford University Press
Bennett C, Mulligan DK (2012) The Governance of Privacy Through Codes of Conduct:
International Lessons for U.S. Privacy Policy. SSRN Electronic Journal
Bennett C, Raab C (2006) The Governance of Privacy: Policy Instruments in Global Perspective.
MIT Press
Brownsword R (2008) Rights, Regulations and the Technological Revolution. Oxford University
Press
Brownsword R (2015) In the Year 2061: From Law to Technological Management. 7 Law,
Innovation and Technology 1
Cameron D et al (2014) Dialogue on data: Exploring the public’s views on using administrative
data for research purposes. https://esrc.ukri.org/files/public-engagement/public-dialogues/
dialogue-on-data-exploring-the-public-s-views-on-using-linked-administrative-data-for-
research-purposes/. Last accessed 27 January 2019
Carter P, Laurie G, Dixon-Woods M (2015) The Social Licence for Research: Why Care.Data Ran
into Trouble. 41 Journal of Medical Ethics 404
Cotterrell R (2004) Law in Social Theory and Social Theory in the Study of Law. The Blackwell
Companion to Law and Society 2
Edwards M (2015) The Alignment of Law and Norms: Of Mirrors, Bulwarks, and Pressure
Valves. 10 FIU Law Review 19
European Parliament and the European Council (1995) Directive 95/46/EC of the European
Parliament and of the Council of 24 October 1995 on the protection of individuals with regard
to the processing of personal data and on the free movement of such data 1995 (Official Journal
of the European Communities)
Giddens A (1990) The Consequences of Modernity. Stanford University Press
Haddow G et al (2011) Nothing is really safe: A focus group study on the processes of
anonymizing and sharing of health data for research purposes. Journal of Evaluation in Clinical
Practice 17:1140–1146
Harmon S, Laurie G, Haddow G (2013) Governing Risk, Engaging Publics and Engendering
Trust: New Horizons for Law and Social Science? 40 Science and Public Policy 25
Hill E et al (2013) “Let’s Get the Best Quality Research We Can”: Public Awareness and
Acceptance of Consent to Use Existing Data in Health Research: A Systematic Review and
Qualitative Study. 13 BMC Medical Research Methodology 72
HM Government (2017) Government Transformation Strategy. https://assets.publishing.service.gov.uk/
government/uploads/system/uploads/attachment_data/file/590199/Government_Transformation_
Strategy.pdf. Last accessed 7 January 2019
Hughes EC, Everett C (1981) Men and Their Work. Greenwood Press
Information Commissioner’s Office (2015) Data Protection Rights: What the Public Want and
What the Public Want from Data Protection Authorities. European Conference of Data
Protection Authorities 2015
Involve (2015) Conclusions of Civil Society and Public Sector Policy Discussions on Data Use in
Government
Kerr A, Cunningham-Burley S (2000) On Ambivalence and Risk: Reflexive Modernity and the
New Human Genetics. 34 Sociology 283
Laraña E (2001) Reflexivity, Risk and Collective Action Over Waste Management: A Constructive
Proposal. 49 Current Sociology 23
Latham A (2018) Cambridge Analytica Scandal: Legitimate Researchers Using Facebook Data
Could Be Collateral Damage. The Conversation. https://theconversation.com/cambridge-
analytica-scandal-legitimate-researchers-using-facebook-data-could-be-collateral-damage-93600.
Last accessed 29 May 2018

Latour B (1998) From the World of Science to the World of Research. 280 Science 208
Lindsay D (2005) An Exploration of the Conceptual Basis of Privacy and the Implications for the
Future of Australian Privacy Law. 29 Melbourne University Law Review 131
López J, Scott J (2000) Social Structure. Open University Press
Mittelstadt B, Floridi L (2016) The Ethics of Big Data: Current and Foreseeable Issues in
Biomedical Contexts. 22 Science and Engineering Ethics 303
Nissenbaum H (2009) Privacy In Context: Technology Policy And The Integrity Of Social Life.
Stanford Law Books
Nowotny H, Scott P, Gibbons M (2001) Re-Thinking Science: Knowledge and the Public in an
Age of Uncertainty. Polity
Olofsson A, Öhman S (2007) Views of Risk in Sweden: Global Fatalism and Local Control — An
Empirical Investigation of Ulrich Beck’s Theory of New Risks. 10 Journal of Risk Research 177
Powell R, Single H (1996) Focus Groups. 8 International Journal for Quality in Health Care 499
Raman S, Mohr A (2014) A Social Licence for Science: Capturing the Public or Co-Constructing
Research? 28 Social Epistemology 258
Regulation (EU) 2016/679 (2016) Regulation (EU) 2016/679 of the European Parliament and of
the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the
Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive
95/46/EC. Official Journal of the European Union 1
Robinson N et al (2009) Review of the European Data Protection Directive. Rand Europe. https://
ico.org.uk/media/about-the-ico/documents/1042349/review-of-eu-dp-directive.pdf. Last accessed 12 October 2017
Royal Statistical Society (2014) Public Attitudes to the Use and Sharing of Their Data. Research for
the Royal Statistical Society by Ipsos MORI. http://www.statslife.org.uk/news/1672-new-rss-
research-finds-data-trust-deficit-with-lessons-for-policymakers. Last accessed 28 January 2019
Selbst A (2013) Contextual Expectations of Privacy. Cardozo Law Review 35:643–709
Solon O (2014) A Simple Guide to Care.Data. Wired (7 February 2014)
Sterckx S et al (2016) “You Hoped We Would Sleep Walk into Accepting the Collection of Our
Data”: Controversies Surrounding the UK Care.Data Scheme and Their Wider Relevance for
Biomedical Research. 19 Medicine, Health Care and Philosophy 177
Sunstein CR (2002) Probability Neglect: Emotions, Worst Cases, and Law. 112 Yale Law Journal 61
Tamanaha B (2001) A General Jurisprudence of Law and Society. Oxford University Press
The Law Commission (2014) Data Sharing Between Public Bodies: A Scoping Report
Trinidad SB et al (2012) Informed Consent in Genome-Scale Research: What Do Prospective
Participants Think? 3 AJOB Primary Research 3
Tzanou M (2013) Data protection as a fundamental right next to privacy? ‘Reconstructing’ a not so
new right. International Data Privacy Law 3:88–99
Warren S, Brandeis L (1890) The Right to Privacy. 4 Harvard Law Review 193
Westin A (1970) Privacy and Freedom. Bodley Head
Wong R (2012) The Data Protection Directive 95/46/EC: Idealisms and Realisms. 26 International
Review of Law, Computers and Technology 229
Wynne B (2006) Public Engagement as a Means of Restoring Public Trust in Science - Hitting the
Notes, but Missing the Music? 9 Community Genetics 211
Yeung K (2017) Algorithmic Regulation: A Critical Interrogation. Research Paper 2017–27
Case C-101/01 Bodil Lindqvist, Judgment of 6 November 2003

Hannah Smith is a D.Phil. candidate associated with the Centre for Health, Law, and Emerging
Technologies, and the Centre for Socio-Legal Studies, both based at the University of Oxford.
Chapter 14
European Cloud Service Data Protection Certification

Ayşe Necibe Batman

Contents

14.1 Introduction...................................................................................................................... 262
14.2 The Risks and Uncertainties of Cloud Services ............................................................. 263
14.3 Certification Mechanisms ................................................................................................ 264
14.4 Certification Mechanisms According to GDPR.............................................................. 266
14.4.1 The Legislative Framework of Articles 42 and 43 GDPR................................ 266
14.4.2 AUDITOR—Interdisciplinary Research Project from Germany....................... 267
14.4.3 Certification Object: Operating Processes.......................................................... 268
14.4.4 Certification Criteria—The Necessity of Specification...................................... 273
14.4.5 Certification Procedure and Accreditation of Certification Bodies ................... 274
14.4.6 Conclusion .......................................................................................................... 276
References .................................................................................................................................. 277

Abstract Cloud computing is both an economically promising and an inevitable technology. Nevertheless, some deployment models can be a source of risk in terms
of the protection of personal data. The risks of data loss and data breach hold
private entities back from using cloud services. Articles 42 and 43 of the EU
General Data Protection Regulation (GDPR) provide a new auspicious framework
for certification mechanisms to detect and minimize these risks.
However, these articles do not specify any criteria for certification mechanisms and
are also technology-neutral. To be implementable, the certification criteria ought to
be defined and a transparent procedure needs to be established. An effective data
protection certification mechanism can serve to build trust and resolve the existing
uncertainties limiting the broader usage of cloud services: certification implies a
presumption of conformity with regulatory standards, and may be seen as an
indicator of quality, which can lead to a distinction on the market. This chapter will

A. N. Batman (✉)
Frankfurt am Main, Germany
e-mail: ayse.batman@posteo.de


summarize the author’s research during her collaboration in the research project
AUDITOR for the development of a catalogue of criteria according to the GDPR.


Keywords data protection certification · cloud computing · GDPR · certification object and criteria · legal clarity · regulated self-regulation

14.1 Introduction

Cloud computing is both an economically promising1 and an inevitable technology. Cloud computing, often referred to as "the cloud", is the delivery of on-demand
computing resources—everything from applications to data centers—over the
internet on a pay-for-use basis. Cloud services consist of elastic resources, so it
becomes possible for the customer to scale up or down quickly and easily to meet
his or her demand. Cloud services are metered services, so the customer only pays for his or her individual use. Another main trait of cloud services is the
self-service characteristic, so the customer gains self-service access to all
IT-services needed for his or her own purposes.2
Despite this practicality of cloud services, some deployment models can be a
source of risk in terms of the protection of personal data. The risks of data loss and
data breach hold private entities back from using cloud services.3 Also, the limi-
tation of the user’s possibilities of control of the processed personal data causes
uncertainties.4
The EU General Data Protection Regulation (GDPR)5 provides a new auspicious
mechanism: Articles 42 and 43 of the GDPR set a framework for certification
mechanisms and a regulated self-regulation instrument for the economic actors in
data-driven industries.

1 EU Commission, Communication, COM 2012, 529, 27 September 2012, Unleashing the Potential of Cloud Computing in Europe, p. 2; ENISA, Report, 12/2012, Cloud Computing—Benefits, risks and recommendations for information security, p. 9, available via https://resilience.enisa.europa.eu/cloud-security-and-resilience/publications/cloud-computing-benefits-risks-and-recommendations-for-information-security. Last accessed 16 August 2018.
2 See also Sect. 14.4.3.2 for a more detailed technical definition and description of cloud services.
3 Gebauer et al. 2018, p. 59.
4 Article 29 Working Group, Opinion 05/2012 Cloud Computing, WP 126, 1 July 2012, pp. 2, 6 et seq.; Pfarr et al. 2014, p. 5020.
5 EU Regulation 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) has entered into force on 25 May 2016 and will take effect in all Member States as of 25 May 2018.

However, Articles 42 and 43 GDPR do not specify any criteria for certification
mechanisms and are also technology-neutral. To be implementable, the certification
criteria ought to be defined and a transparent procedure needs to be established.
That is why the interdisciplinary research project AUDITOR6 from Germany aims
to specify the legal provisions concerning the object of certification, criteria and
procedure with a special focus on cloud services.
A data protection certification can serve to build trust and resolve the existing
uncertainties for a broader usage of cloud services: a certification implies a pre-
sumption of conformity,7 and is also a quality statement, which can lead to a
distinction on the market.8
This chapter will summarize the author’s research during her collaboration in the
project AUDITOR for the development of the above-mentioned catalogue of
criteria and the certification scheme.

14.2 The Risks and Uncertainties of Cloud Services

Despite the clear increase of cloud usage9 and its economic and social recognition,
data processing in cloud services presents an undeniable risk in terms of data
protection.10 These risks comprise mainly the lack of control of the processed
personal data and the lack of information about the processing itself.11 Many cloud
services prove to be non-transparent for users, which can result in information
asymmetries.12 Additionally, the volatility of the processed data in cloud services,13
whose technology is continuously evolving, makes the regulation of borderless
cloud services a veritable challenge.14 Finally, the storage and processing of data on

6 European Cloud Service Data Protection Certification, funded by the Federal Ministry of Economic Affairs and Energy since 1 November 2017; the Ministry's press release is available via https://www.digitale-technologien.de/DT/Redaktion/DE/Standardartikel/Einzelprojekte/einzelprojekte_auditor.html. Last accessed 16 August 2018.
7 Schwartmann and Weiß 2018, Article 42, para 46.
8 Kinast and Schröder 2012, p. 217.
9 KPMG AG and Bitkom Research GmbH (eds.), Cloud-Monitor 2017, available via https://home.kpmg.com/de/de/home/themen/2017/03/cloud-monitor-2017.html. Last accessed 16 August 2018; Hoffmann, Regulation of Cloud Services under US and EU Antitrust, Competition and Privacy Laws, 2016, p. 67.
10 Mitchell 2015, pp. 3 et seq.; Determann 2015, p. 120, para 6.13; ENISA, Report, 12/2012, Cloud Computing—Benefits, risks and recommendations for information security, pp. 14 et seq.; Article 29 Working Group, Opinion 05/2012 Cloud Computing, WP 126, 1 July 2012, pp. 2, 6 et seq.
11 Article 29 Working Group, Opinion 05/2012 Cloud Computing, WP 126, 1 July 2012, pp. 2, 6 et seq.
12 Schneider et al. 2016, p. 346.
13 Hofmann and Roßnagel 2018, p. 39.
14 Pfarr et al. 2014, p. 5023.

external (cross-border) servers—both within and outside of the EU—can lead to a diminished authority on the part of cloud users.
When accountability can no longer be sustained solely by informal relations of
trust but must be formalised, made visible15 and become subject to independent
validation,16 audits are a tool to make accountability areas transparent and to detect
deviations from the norm.
Data protection certification of cloud services can be an effective solution to this
problem, and a remedy for all actors involved. Given that it is not practicable to
conduct inspections of servers located worldwide, certifications are the only feasible
and effective way to face the challenges and data protection risks of
globally-distributed cloud systems.17 An inspection based on certifications which is
conducted by an independent third party, undertaken once, and results in a con-
formity statement serving several cloud users, can resolve the above-mentioned
uncertainties and eliminate the lack of transparency for cloud users.18 In this sense,
certification provides users with a quick overview about the data protection level of
a cloud service, and gives a presumption of conformity with the GDPR and other
regulations the certification references.19

14.3 Certification Mechanisms

Generally, auditing used to be more process-oriented,20 whereas certification was more goal-oriented. In a general sense, certification has a static and object-related
character due to its relation to a concrete IT product or service with its specific
properties,21 for example, by stating that the certification object or the “target of
evaluation” complies with the GDPR or some other certification regime. Thus, only
IT products and services with a stable and non-volatile character can be considered
here.22
Auditing, in the sense of a process audit, is not limited to evaluating the process
at a point in time. Auditing aims at a continuous review of the data processing
activities.23 Therefore, auditing is non-static, and shall lead to a continuous

15 Roßnagel 2011, pp. 263 et seq.
16 Power 1996, p. 10.
17 Hennrich 2011, p. 551.
18 Borges and Brennscheidt 2012, pp. 67, 68; Pfarr et al. 2014, p. 5024.
19 Article 83(2j) GDPR; Bergt 2016, p. 496; Lachaud 2018, p. 251.
20 Hornung 2017, Article 42, para 10.
21 Roßnagel 2011, p. 267.
22 Hornung and Hartl 2014, p. 220.
23 Hornung and Hartl 2014, p. 220.

improvement of data protection. Auditing does not focus on a singular product, but
takes the data processing of the audited data processor into consideration, hence
contributing to a comprehensive data protection compliance.24
The wording of the GDPR does not differentiate between the auditing process
and the certification. Article 42 of the GDPR refers to data protection-specific
certification procedures aimed at the examination and conformity assessment of
processing operations.
According to the ISO norms on conformity assessment, certification is the overarching term. A certification comprises a number of stages, including the evaluation process. Evaluation can be conducted by review methods such as documentation review, sampling, testing and auditing.
Some approaches classify certification according to the GDPR as purely a
conformity assessment.25 This means that the object of the certification is only a
matter of course.26 ENISA (the European Network and Information Security
Agency) holds the view that the processing operations to be certified concern an activity of data processing that may be part (including an integral part) of a product, system, or service, but that certification has to be granted in relation to the processing activities, and not only to the product, system or service as such.27
Furthermore, the data protection mechanisms of Articles 42 and 43 could be
characterised as goal-oriented certifications, as the focus should not be only on
whether measures are in place, but also to what extent such measures are sufficient
to comply with the data protection provisions. Therefore, the scope of these articles
more likely focuses on qualitative than quantitative elements of testing measures of
the entity to be certified.28 In this sense, certification can also be seen as a holistic
approach of an overall system, whereas audits can cover selective inspections of
individual areas.29
Others take the view that the certification pursuant to Article 42 of the GDPR
can involve more than meeting the minimum requirements of a conformity
assessment.30 Also, management systems are seen as core components of

24 Hornung and Hartl 2014, p. 220.
25 This view was also proposed in the GDPR draft of the European Parliament on 12 March 2014; see also Lachaud 2018, p. 245 with further references.
26 Roßnagel et al. 2015, p. 459.
27 ENISA Recommendations on European Data Protection Certification, Version 1.0, November 2017, p. 15, available via https://www.enisa.europa.eu/publications/recommendations-on-european-data-protection-certification. Last accessed 16 August 2018.
28 ENISA Recommendations on European Data Protection Certification, Version 1.0, November 2017, pp. 17–18.
29 Schultze-Melling 2013, p. 474.
30 Bergt 2016, p. 496; Schäfer and Fox 2016, p. 744; Hofmann 2016, p. 05324; Hofmann and Roßnagel 2018, p. 106; Bergt 2017, Article 42, para 15; Roßnagel 2000.

certifications according to Article 42 of the GDPR, in order to evaluate either the process or its result.31 This conformity statement "plus" would include the com-
pliance of technical and organisational measures of the certified processing oper-
ations in cloud services according to the requirements of other articles of the GDPR,
which will be presented later in this chapter.
The aim of data protection certification mechanisms according to Article 42
GDPR is to help controllers or processors demonstrate how they comply with their
legal obligations stemming from the GDPR. The audit performed in the framework
of a certification procedure, supposing security requirements are within the scope of
certification, might uncover risks in how data are being processed by the applicant for the certification, and other relevant deviations from the data protection rules. Therefore, certification as such does not have the purpose of mitigating risks; rather, by means of the review conducted during a certification procedure, deviations can be detected and the IT services adjusted so that the processing operations are designed in a GDPR-compliant manner.

14.4 Certification Mechanisms According to GDPR

14.4.1 The Legislative Framework of Articles 42 and 43 GDPR

According to Article 42 of the GDPR, the Member States, supervisory authorities, the European Data Protection Board and the EU Commission shall encourage, in
particular at the EU level, the establishment of data protection certification mech-
anisms and of data protection seals and marks, for the purpose of demonstrating
compliance of processing operations with the GDPR by controllers and processors.
Therefore, data protection seals are the result of having successfully completed the
certification procedure.32 Seals are a visual way of representing that an independent third party is convinced, by means of a review, that a certified entity has met a
particular set of standards or requirements of a certification scheme.33

31 Weichert 2018, Article 42, para 11; Fladung 2018, Article 42, para 5; Loomans et al. 2014, pp. 22 et seq.; Lachaud 2018, p. 246.
32 Bergt 2017, Article 42, para 1; Kamara and de Hert 2018, p. 8.
33 Bergt 2017, Article 42, para 1; European Data Protection Board, Guidelines 1/2018, p. 5, no. 1.3.2 (para 11); Rodrigues et al. 2016, p. 1.

However, these provisions are of a highly abstract nature. They are seen as a regulated34 or monitored35 self-regulation mechanism, which has led to 'framework' regulation by the legislator that in turn requires specification by the addressees of the certification at a sub-statutory level.36

14.4.2 AUDITOR—Interdisciplinary Research Project from Germany

The practice-oriented interdisciplinary research project AUDITOR (European Cloud Service Data Protection Certification)37 aims to specify the legal provisions
regarding the data protection certification in accordance with Articles 42 and 43 of
the GDPR and all other relevant legal provisions.38
The objective of AUDITOR is the conception, exemplary implementation and
testing of an enduring EU-wide data protection certification for cloud services. To
conceptualise an enduring data protection certification, the first step was to develop
a catalogue of criteria for the certification of cloud services in accordance with the
GDPR. An initial prototype of the catalogue of criteria has been completed in May
2018. The next step is to conceptualise a certification scheme and implement both
the catalogue and the scheme in pilot certifications during the project period.
Furthermore, a standardisation in the form of a DIN specification (a German
industry norm) will be pursued in the project, which forms the basis for the intended
European norm and the development of a data protection certification mechanism to
be recognised EU-wide. The goal of the AUDITOR project is to improve the
comparability of cloud services which are offered by companies located in different
EU Member States, and to create transparency.
The GDPR has increased the accountability of cloud service providers.39
Certification under the GDPR is one of the key instruments to meet this account-
ability, as it provides a review by independent third parties, and organises cloud
services in compliance with the new legal requirements.

34 Rodrigues et al. 2016, p. 1; Spindler 2016, p. 409; Martini 2016, p. 11.
35 Lachaud 2018, pp. 245, 251.
36 Spindler and Thorun 2015, p. 61; see also Baldwin et al. 2012, pp. 137 et seq., for a general view of the characteristics of self-regulation.
37 AUDITOR is funded by the Federal Ministry of Economic Affairs and Energy and started on 1 November 2017 with a duration of two years until October 2019—funding code: 01MT17003G.
38 Batman et al. 2017; see also www.auditor-cert.de. Last accessed 16 August 2018.
39 Flint 2017a, p. 171; De Hert and Papakonstantinou 2016, p. 184; and as already stated by Jaatun et al. 2014, p. 1005.

14.4.3 Certification Object: Operating Processes

14.4.3.1 Derivation from the Legal Provisions of the GDPR

It is of high importance that the object of certification is clearly defined, since the
scope40 is essential for the certificate’s statement.41 This presents a challenge, as the
provisions in Articles 42 and 43 GDPR are technology-neutral. Recital 15 of the
GDPR provides that the protection of natural persons should be technologically
neutral and should not depend on the techniques used, in order to avoid creating a
serious risk of circumvention. Technological neutrality means to prevent legal
provisions from excluding technological innovation or from becoming obsolete
because of their wording and the evolution of technology.42 As a consequence, the
persons or entities applying the GDPR must interpret it with regard to the concrete
affected technology in use. However, technology in se is always neutral and can
only be judged by its use.43 Therefore, the concrete use of cloud computing,
especially the design, its default and deployment model need to be taken into
account in order to define the object of the certification according to Article 42
GDPR in a cloud-specific manner.
For example, the GDPR does not define "processing operations", but the term
“processing” is defined as “any operation or set of operations which is performed on
personal data or on sets of personal data, whether or not by automated means, such
as collection, recording, organisation, structuring, storage, adaptation or alteration,
retrieval, consultation, use, disclosure by transmission, dissemination or otherwise
making available, alignment or combination, restriction, erasure or destruction”
(Article 4(2) GDPR). Given that the wording of Articles 42 and 43 and Recital 100
of the GDPR is not precise enough, the term "processing operations" needs to be
defined in a legally and technologically precise, and implementable, way. Article 42
(1) of the GDPR mentions “processing operations” by controllers and processors as
certification object, whereas Recital 100 focuses on the regulation-compliance of
“products and services”. In order to overcome the technology-neutral characteristic
of the GDPR in the case of cloud services, the author and the project AUDITOR
define the certification object as a socio-technical system, meaning that both, the
technical, human and organisational interactions need to be taken into account:
Therefore, a processing operation consists of non-technical and non-automated measures, meaning personnel-based and manually conducted measures, as well as of technical and automated procedures.

40 Or the target of evaluation (ToE).
41 Schäfer and Fox 2016, p. 746.
42 Roßnagel et al., Policy Paper, National Implementation of the General Data Protection Regulation, p. 5, available via https://www.forum-privatheit.de/forum-privatheit-de/publikationen-und-downloads/veroeffentlichungen-des-forums/positionspapiere-policy-paper/Policy-Paper-National-Implementation-of-the-GDPR_EN.pdf. Last accessed 16 August 2018.
43 Hildebrandt and Tielemans 2013, p. 512.

This interpretation is also in line with the material scope of the GDPR, which
applies to the processing of personal data wholly or partly by automated means and
to the processing other than by automated means of personal data which form part
of a filing system or are intended to do so.44 The processing is automated if it is
supported by information technology and is not carried out completely manually.45
The scope also comprises the review of the existence and the implementation of data
protection policies and management systems by the processor.46

This view is supported by the material provision in Article 24(2) of the GDPR
which envisages the implementation of data protection policies in order to fulfil the
compliance provisions of the GDPR.47
In this context, it should be noted that according to a commonly shared view,
cloud services are considered to constitute processing on behalf of a controller (commissioned processing) according to Article 28 of the GDPR.48
the cloud user is a controller according to Article 4(7).49 This means that according
to Article 28(1) of the GDPR, the cloud service provider, as the one who aims to be
certified, can use the certification to demonstrate to a potential cloud customer that
it offers sufficient and adequate guarantees as set out in Article 28(1) GDPR.
Depending on the business model of the cloud service, it may also be necessary
to refer to the new legal mechanism of joint controllership under Article 26 GDPR. Given that in most service models the data controller using the cloud service has no effective control over the activities of the cloud service provider, and the cloud service provider mostly pre-determines the means of the processing of personal data (see Article 4(7) GDPR) by configuring his/her service, it needs to be
examined by both cloud service providers and certification bodies during the cer-
tification procedure whether the cloud service provider has to be classified as data
controller.50 If so, the service provider’s certification under Article 42 of the GDPR
would also need to be based on all legal norms concerning controllers.51

44 Article 2 GDPR, para 14.
45 Roßnagel 2018, Article 2 GDPR, para 14.
46 See also Jung 2018, p. 208; Loomans et al. 2014, pp. 22 et seq.; Fladung 2018, Article 42 GDPR, para 5.
47 Martini 2018, Article 24 GDPR, pp. 39 et seq.; Tinnefeld and Hanßen 2018, Article 24 GDPR, para 24.
48 Article 28 GDPR corresponds essentially to the former provision in § 11 Bundesdatenschutzgesetz (Federal Data Protection Act—2009).
49 Article 29 Working Group, Opinion 05/2012 Cloud Computing, WP 126 (adopted on 1 July 2012), p. 10; Kramer 2018, Article 28, para 16; Niemann and Hennrich 2010, p. 687.
50 Flint 2017a, p. 171; Flint 2017b, p. 125; UK Information Commissioner's Office (ICO), Guidance on the use of cloud computing, p. 8: the ICO states that the provider of a public cloud needs to be classified as a data controller according to data protection law; available via https://ico.org.uk/media/for-organisations/documents/1540/cloud_computing_guidance_for_organisations.pdf. Last accessed 16 August 2018.
51 See also Batman 2018, p. 94.

Nevertheless, even on the view represented here that cloud services are treated as commissioned processing only, the GDPR already imposes strong accountability
measures on cloud service providers as processors.
This results from the fact that under Article 28(1) of the GDPR the cloud user
has the obligation to select an appropriate cloud service provider, which means that
the latter must fulfil the obligations in Articles 24 and 25, even though these
provisions primarily concern the cloud user as the controller. However, since the
cloud user orders the cloud service provider to carry out the processing operation on his or her behalf, the user is obligated to choose a cloud service provider which has designed and configured its software or services in line with the principles under Articles 25(1) and (2) of the GDPR, as well as under Article 5.
These provisions then correspond to the above-mentioned automated procedures. The provisions in Article 24 GDPR, however, would form the non-automated
personnel-based measures of the cloud provider which can then also be certified,
as long as they are connected to processing operations in which personal data are
processed.52 Furthermore, the cloud provider must fulfil its “original” obligations
arising under inter alia Articles 28, 32, and 37–39 of the GDPR, which address the
cloud service provider. All obligations under these provisions are certifiable in
order to demonstrate compliance with the GDPR.
Certification mechanisms can serve to demonstrate compliance with the obli-
gations of the controller as required under Articles 24(3), 25(3), 28(5) and 32(3) of
the GDPR. The objects provided for testing are the technical and organisational measures (TOM) adopted for the processing operations, which must ensure, and be able to demonstrate, that processing is performed with sufficient guarantees implemented by appropriate TOM and that the protection of the rights of the data subject is ensured.

14.4.3.2 Cloud-Specific Certification Object

Considering this abstract legal framework for the certification object in accordance
with the relevant provisions of the GDPR, a cloud-specific definition of the certi-
fication object will be presented in the following.
Although an old trend, cloud computing has experienced renewed importance
due to its shift to an IT deployment model.53 Now, computing is accessible via networks from distributed and scalable hardware and software resources,54 an approach which evolved from the time-sharing concepts of the 1960s and 1970s developed by John McCarthy.55
52 This view is also shared by the supervisory authority Unabhängiges Landeszentrum für Datenschutz (ULD) from Schleswig-Holstein, which has also contributed significantly to the criteria of EuroPriSe and is currently an associated partner in the project AUDITOR.
53 Krcmar et al. 2018, § 1, para 24.
54 Pfarr et al. 2014, p. 5018.

John McCarthy.55 Given that the usage of cloud services is still a relatively new
phenomenon and has the potential to change data processing habits of companies
and individuals, regulation of cloud computing is of current relevance in light of the factual and legal uncertainties regarding its use.56
According to the broadly recognised definition of NIST57 from 2011, also used
by ENISA as well as by the Federal Office of Information Security (BSI),58 cloud
computing is:
… a model for enabling ubiquitous, convenient, on-demand network access to a shared
pool of configurable computing resources (e.g., networks, servers, storage, applications,
and services) that can be rapidly provisioned and released with minimal management effort
or service provider interaction. This cloud model is composed of five essential character-
istics, three service models, and four deployment models.

The first of the five main characteristics of cloud computing is its on-demand
self-service. A consumer can unilaterally engage computing capabilities, such as
server time and network storage, as needed, on demand, and automatically without
requiring human interaction with each service provider. Secondly, it has a broad
network access, meaning that the services are available with standard mechanisms
over the network and are not bound to a specific client. The third characteristic is
resource pooling, meaning the resources of the provider are available in a pool from
which many users can draw. This is also known as the multi-tenant model. Users do
not know where the resources are specifically located, but can agree by contract on
the storage location, such as a region, country, or computer centre. Fourthly, it is
characteristic for clouds to be rapidly elastic, meaning capabilities can be elastically
provisioned and released, in some cases automatically, to scale rapidly outward and
inward commensurate with demand. Finally, cloud systems are a measured service,
meaning resource usage can be monitored, controlled, and reported, providing
transparency for both the provider and consumer.
The three primary service models of cloud computing are Software as a Service
(SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS).59
Furthermore, the NIST differentiates between four primary deployment models:
In a private cloud, the cloud infrastructure is only operated for one organisation.
It can be organised and managed by the organisation or a third party and can be
located in the computer centre of the organisation or a third-party organisation. The

55 Krcmar et al. 2018, § 1, paras 24, 25.
56 Pfarr et al. 2014, p. 5023.
57 The National Institute of Standards and Technology (NIST), the standardisation centre of the USA, is part of the U.S. Department of Commerce. The document is available via https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-145.pdf. Last accessed 16 August 2018.
58 Federal Office of Information Security, https://www.bsi.bund.de/EN/Topics/CloudComputing/Basics/Basics_node.html;jsessionid=E93272CC537E50C157BBE795D67E6F4A.1_cid351. Last accessed 16 August 2018.
59 A more comprehensive differentiation between the service models (such as Communication as a Service, Storage as a Service, Network as a Service or E-Mail as a Service etc.) can also be found in DIN ISO/IEC 17788:2016-04, Annex A.

term public cloud is used if cloud services may be used by the general public or a
large group, such as an entire industry, and the services are made available by one
provider. Within a community cloud, the infrastructure is shared by several
organisations with similar interests. Such a cloud may be operated by one of these
organisations or a third party. If several cloud infrastructures, each of which is
independent, are used jointly via standardised interfaces, this is referred to as a
hybrid cloud.
However, these definitions do not cover all possible versions of cloud deployment models. There are many possible secondary definitions, such as "virtual
private cloud”.60
The cloud-specific certification object needs to be defined in a manner that
provides a self-contained and cohesive procedural structure for the processing of
personal data,61 and in which the specific data protection risks of the cloud service
can be captured.62 This includes capturing the interfaces between the controller and the processor, at which a clear distinction of responsibilities and of each party's influence and design possibilities over the data processing becomes possible.63 Data flow analysis can help to
capture the responsibility areas of both the controller and the processor.63 A data
mapping process can help to recognise the parties already involved at each stage in
data processing activities, as well as parties which should be involved.64 When examining this data flow mapping and capturing the interfaces individually, it must be ensured that the certification is not confined to the non-critical parts of the processing operations in the cloud service,65 namely those processing operations in which no personal data, or only anonymised data, are being processed.
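A purely illustrative sketch of such a data flow mapping follows; the structure and names are assumptions of the editor, not the AUDITOR methodology. Each processing stage is recorded together with the responsible party and whether personal data are involved, and a simple check flags a certification scope limited to the non-critical stages.

from dataclasses import dataclass
from typing import List


@dataclass
class ProcessingStage:
    name: str               # e.g. "transmission of personal data into the cloud"
    responsible_party: str  # "controller" (cloud user) or "processor" (provider)
    personal_data: bool     # whether the stage processes personal data


def scope_is_meaningful(selected: List[ProcessingStage]) -> bool:
    # A scope containing no stage that processes personal data would cover
    # only the non-critical parts of the service.
    return any(stage.personal_data for stage in selected)


data_flow = [
    ProcessingStage("customer on-boarding", "processor", True),
    ProcessingStage("transmission into the cloud", "controller", True),
    ProcessingStage("storage and further processing", "processor", True),
    ProcessingStage("aggregate capacity monitoring", "processor", False),
    ProcessingStage("deletion at the end of the contract", "processor", True),
]

print(scope_is_meaningful([data_flow[3]]))  # False: only a non-critical stage selected
print(scope_is_meaningful(data_flow))       # True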
In concrete terms, this means that all cloud-specific TOM during a processing
operation, from the on-boarding of the cloud customers, transmission of personal
data into the cloud, storage and other processing of personal data, to the design of
the access points, data modification modalities during the storage by the cloud
customer as well as by the cloud provider, access security and limitation with regard
to personnel entrusted with data processing, data deletion policies and encryption
techniques in accordance with the state of the art need to be examined during a
certification based on Articles 42 and 43 of the GDPR, and considering the pro-
visions in the core norms of Articles 5, 24, 25, 28 and 32 of the GDPR. Note that

60 Federal Office of Information Security, https://www.bsi.bund.de/EN/Topics/CloudComputing/Basics/Basics_node.html;jsessionid=E93272CC537E50C157BBE795D67E6F4A.1_cid351. Last accessed 16 August 2018.
61 Weichert 2018, Article 42 GDPR, para 47.
62 Roßnagel 2000.
63 IT Governance Privacy Team (ed.), EU General Data Protection Regulation (GDPR): An Implementation and Compliance Guide, 2nd edn (2017), pp. 159–162; see also Roßnagel 2000, pp. 76 et seq., where a data flow analysis for the purpose of the definition of a certifiable element is demonstrated.
64 IT Governance Privacy Team 2017, p. 161.
65 Roßnagel 2000.

the definition of the cloud-specific scope of the certification made here represents
only a general legal guidance. A further and more practical concretisation of the
certification object is undertaken between a cloud provider and an accredited cer-
tification body66 during the individual certification procedure.

14.4.4 Certification Criteria—The Necessity of Specification

The legislator has not defined any concrete certification criteria in the GDPR.
Articles 42(5) and 43(2)(b) of the GDPR presuppose that the competent authorities in each Member State can provide and impose these criteria. Therefore, Articles 42 and 43 of the GDPR set only a framework for the criteria to be defined by the private
sector in the Common Market and by the Member States.67
The research project AUDITOR has developed a catalogue of such criteria on
the basis of the former audit standard “Trusted Cloud Data Protection Profile
(TCDP)” from 2016,68 which was written in accordance with the former legislation
of the Federal Data Protection Act of Germany (Bundesdatenschutzgesetz).
AUDITOR now aims to turn these abstract legal provisions into concrete legal
requirements under new data protection legislation in order to implement the cer-
tification mechanisms pursuant to Articles 42 and 43 of the GDPR. The
above-mentioned provisions in Articles 24, 25, 28, and 32 form the core norms of
the catalogue of criteria and focus in particular on the TOMs of cloud service
providers. Other provisions have also been incorporated into the normative criteria of the AUDITOR catalogue, e.g. the support of the cloud user, as controller, in safeguarding the considerably strengthened rights of data subjects under Chapter III of the GDPR, and compliance with the data protection principles in Article 5 GDPR.
Furthermore, in order to specify cloud-specific criteria, the principles of data
protection by design and by default have been relevant, due to the fact that the cloud service provider needs to design the data processing operations in a manner that enables the cloud user choosing it to fulfil its accountability obligations arising from Article 25 in conjunction with Article 5 GDPR. The catalogue also comprises criteria with regard to the requirements for sub-contractors according to Article 28(2) and (4).
Finally, the catalogue provides measures to implement a data protection

66 Article 43(1) sentence 2 GDPR.
67 Spindler and Thorun 2015, p. 61.
68 Trusted Cloud Data Protection Profile (TCDP), TCDP v1.0, available via https://www.tcdp.de/index.php/dokumente. Last accessed 16 August 2018. The Trusted Cloud audit standard TCDP emerged from the Technology Programme of the Federal Ministry for Economic Affairs and Energy in 2016.

management system, as the catalogue includes criteria for the designation of a data
protection officer, his or her organisational functions and competences,69 and
procedures to manage data security and data breach incidents.70
By defining these normative criteria derived from the legal provisions of the
GDPR in a concrete and auditable way, the inclusion of and the reference to
technical and organisational standards such as the DIN EN ISO/IEC 27001, 27002,
27018, 17789, ISO/IEC 29100-2011 and DIN 66398 (deletion concept) has been
necessary. Furthermore, the Compliance Controls Catalogue (C5) of the Federal
Office for Information Security (BSI)71 has been considered in the definition of the technical, organisational and information-security measures. The catalogue of criteria developed in the AUDITOR project has been publicly available as a draft version since June 2018 and will be finalised within the project duration by October 2019.72

14.4.5 Certification Procedure and Accreditation of Certification Bodies

Certification bodies control the certification procedure and accreditation standards under Articles 42 and 43 of the GDPR. This includes control over the procedures
for the issuing, periodic review and withdrawal of data protection certification, as
well as seals and marks which indicate certification.
Under Article 42(3) of the GDPR, certification shall be voluntary and available
via a transparent process. Certification may be issued for a maximum period of
three years and may be renewed if the relevant requirements continue to be met,73
or withdrawn if the requirements are not or are no longer met.74
According to these provisions, the controller (or in the case of cloud providers,
the processor) shall provide the certification body with all information and access to
its processing activities as necessary to conduct the certification procedure. The
certification body must be accredited75 by a competent supervisory authority or a
national accreditation body in accordance with the EU Regulation on
accreditation.76
69 Article 37–39 GDPR.
70 Article 33, 34 in conjunction with Article 28(3), sentence 2, lit. f GDPR.
71 BSI, C5-Compliance Controls Catalogue (C5), available via https://www.bsi.bund.de/EN/Topics/CloudComputing/Compliance_Controls_Catalogue/Compliance_Controls_Catalogue_node.html. Last accessed 18 September 2018.
72 Roßnagel et al. 2018.
73 Article 42(7), sentence 1 GDPR.
74 Article 42(7), sentence 2 GDPR.
75 Article 43(1), sentence 2 GDPR.
76 Regulation (EC) No 765/2008 of the European Parliament and of the Council (1) in accordance with EN-ISO/IEC 17065/2012.
In order to be accredited, the certification body must demonstrate independence and expertise in relation to the subject matter of the certification to the satisfaction
of the competent supervisory authority (SPA).77 Furthermore, the certification body
must respect existing certification criteria referred to in Article 42(5) that are approved by the SPA.78 The certification body must also establish procedures for
the issuing, periodic review, and withdrawal of data protection certification, seals
and marks,79 as well as to handle complaints about infringements of the certification
or the manner in which the certification has been or is being implemented by the
controller or processor.80 Finally, the certification body must demonstrate, to the
satisfaction of the competent SPA, that its tasks and duties do not result in a conflict
of interest.81
Under Article 43(4), the accreditation shall be issued for a maximum period of
five years and may be renewed under the same conditions, provided that the cer-
tification body still meets the above-mentioned requirements. The procedure also provides for the publication of the certification criteria by the supervisory authority, making them available to the public.82 The supervisory authorities would then
transmit those criteria to the European Data Protection Board, which would collate
all certification mechanisms and data protection seals into a register and make them
publicly available.83 The competent supervisory authority or the national accredi-
tation body can revoke an accreditation of a certification body where the conditions
for the accreditation are not, or are no longer, met, or where actions taken by a
certification body infringe the GDPR.
The purpose of accrediting private certification bodies is to create a network of
certifications that contributes to the improvement of the implementation of data
protection, to create commonly recognised standards, and to relieve the workload of
supervisory authorities.84 The fact that also private entities can be accredited and
conduct certification is a result of the co-regulative approach under the provisions
of Articles 40–43 of the GDPR.85
77 Article 43(2)(a) GDPR.
78 Article 43(2)(b) GDPR.
79 Article 43(2)(c) GDPR.
80 Article 43(2)(d) GDPR.
81 Article 43(2)(e) GDPR.
82 Article 43(6) sentence 1 GDPR.
83 Article 43(6) sentence 2 GDPR.
84 Weichert 2018, Article 43 GDPR, para 2.
85 Spindler and Thorun 2015, p. 61.
14.4.6 Conclusion

This chapter discussed the defining terms for and the effects of data protection certification mechanisms, and how they can contribute to legal clarity and to the enforcement of data protection law.86 However, it should be considered that, according to Article 42(4) of the GDPR, a certification does not reduce the responsibility of the controller or the processor for compliance with the GDPR. Furthermore, this chapter sought to demonstrate that certifications facilitate and standardise examination procedures for GDPR compliance and can replace complex and costly individual examinations.87 A standardised certification mechanism is therefore an appropriate means to review a likewise standardised IT model such as the cloud.
In conclusion, the social and economic impacts of data protection certifications include raising awareness of data protection issues among companies using cloud services by enabling the selection of appropriate and compliant cloud services, not least in order to avoid the severe sanctions under Articles 82 and 83 of the GDPR. Certifications can also help to raise awareness on the cloud market, so that certified commissioned data processing will be accepted as a quality statement88 and a competitive advantage.89 Furthermore, certifications can result in improving trust in and
enabling the broader usage of cloud services by private entities. “Users must trust
the device, company or cloud-based server to use it. (…) Reputation and trust are
intertwined. (…) [Reputation] is the result of an evaluation process.”90 Building
trust is one of the most important factors to sustainably position cloud services in
the market.91 The implicit goal of certifications to raise trust in the use of tech-
nologies corresponds with the overall objective of the GDPR. “Its cornerstone is the
notion of trust: trust in data controllers [and processors] to treat personal infor-
mation responsibly, and trust that the rules will be effectively enforced.”92
A further economic impact of certifications can be economic relief, especially for SMEs using public cloud services, the most economically promising deployment model, which, however, currently carries the main data protection and data security risks. Large companies, by contrast, can more often afford private cloud services, which are costlier. Enhancing the competitive position of SMEs on the market therefore requires enabling a usage of public cloud services that is secure both in terms of data security and in terms of the fundamental rights of potentially affected individuals.
86 Bock 2016, Chapter 15, p. 335.
87 Schäfer and Fox 2016, p. 744.
88 Jentzsch 2012, p. 416; Schäfer and Fox 2016, p. 746.
89 Kinast and Schröder 2012, p. 217; Weichert 2010, pp. 274–279.
90 Hoffmann 2016, p. 190.
91 Buch et al. 2014, p. 68.
92 Buttarelli 2016, p. 77.
References

Baldwin R, Cave M, Lodge M (2012) Understanding Regulation. Theory, Strategy and Practice, 2nd edn. Oxford University Press, USA
Batman AN (2018) Die Datenschutzzertifizierung von Cloud-Diensten nach der EU-DSGVO. In:
Taeger J (ed) Rechtsfragen digitaler Transformationen – Gestaltung digitaler Veränderungsprozesse
durch Recht. Tagungsband Herbstakademie 2018, OlWIR Oldenburger Verlag für Wirtschaft,
Informatik und Recht, Edewecht, pp 87–101
Bergt M (2016) Die Bedeutung von Verhaltensregeln und Zertifizierungen nach der DSGVO. In:
Taeger J (ed) Smart World - Smart Law? Weltweite Netze mit regionaler Regulierung.
Tagungsband Herbstakademie 2016, OlWIR Oldenburger Verlag für Wirtschaft, Informatik
und Recht, Edewecht, pp 483–489
Bergt M (2017) Art. 42. In: Kühling J, Buchner B (eds) DSGVO-Kommentar. C.H. Beck, Munich
Bock K (2016) Data Protection Certification: Decorative or Effective Instrument? Audit and Seals
as a Way to Enforce Privacy. In: Wright D, De Hert P (eds) Enforcing Privacy, Chapter 15,
Springer International Publishing, pp 335–356
Borges G, Brennscheidt K (2012) Rechtsfragen des Cloud Computing – ein Zwischenbericht. In:
Borges G, Schwenk J (eds) Daten- und Identitätsschutz in Cloud Computing, E-Government
und E-Commerce. Springer, Berlin/Heidelberg, pp 43–77
Borges G, Meents J (eds) (2016) Cloud Computing. Rechtshandbuch. C.H. Beck, Munich
Buch MS, Gebauer L, Hoffmann H (2014) Vertrauen in Cloud Computing schaffen - Aber wie?
Wirtschaftsinformatik & Management, 03/2014, pp 67–77
Bundesamt für Sicherheit in der Informationstechnik (BSI) (2018) C5-Compliance Controls
Catalogue (C5). Available via https://www.bsi.bund.de/EN/Topics/CloudComputing/
Compliance_Controls_Catalogue/Compliance_Controls_Catalogue_node.html. Last accessed
16 August 2018
Burri M, Schär R (2016) The Reform of the EU Data Protection Framework: Outlining Key
Changes and Assessing Their Fitness for a Data-Driven Economy. Journal of Information
Policy, Vol. 6 (2016), pp 479–511, available via http://www.jstor.org/stable/10.5325/jinfopoli.
6.2016.0479. Last accessed 16 August 2018
Buttarelli G (2016) The EU GDPR as a clarion call for a new global digital gold standard.
International Data Privacy Law, 2016, Vol. 6, No. 2, pp 77–78
De Hert P, Papakonstantinou V (2016) The new General Data Protection Regulation: Still a sound
system for the protection of individuals? Computer Law & Security Review, 2016, Vol. 32,
pp 179–194
Determann L (2015) Determann’s Field Guide to Data Privacy Law. International Corporate
Compliance, 2nd edn. Edward Elgar Publishing Ltd
ENISA (2012) Cloud Computing - Benefits, risks and recommendations for information security.
Report
ENISA (2017) ENISA Recommendations on European Data Protection Certification Version 1.0,
November 2017, p 15, available via https://www.enisa.europa.eu/publications/recommendations-
on-european-data-protection-certification. Last accessed 16 August 2018
European Commission (2012) Communication from the Commission to the European Parliament,
the Council, the ECOSOC and the Committee of the Regions, COM 2012, 529, 27 September
2012, Unleashing the Potential of Cloud Computing in Europe, available via http://ec.europa.
eu/transparency/regdoc/rep/1/2012/EN/1-2012-529-EN-F1-1.Pdf. Last accessed 16 August
2018
Fladung A (2018) In: Wybitul T (ed) (2018) Handbuch EU-Datenschutz-Grundverordnung.
Fachmedien Recht und Wirtschaft, dfv Mediengruppe Frankfurt am Main, Commentary of Art.
42 GDPR
Flint D (2017a) Sharing the Risk: Processors and the GDPR. Business Law Review, 2017, Vol. 38,
pp 171–172
Flint D (2017b) Storms ahead for Cloud Service Providers. Business Law Review, 2017, Vol. 38,
pp 125–126
Forum Privacy and Self-Determined Life in the Digital World (ed) (2018) Policy Paper, National
Implementation of the General Data Protection Regulation, Challenges - Approaches -
Strategies, available via https://www.forum-privatheit.de/forum-privatheit-de/publikationen-
und-downloads/veroeffentlichungen-des-forums/positionspapiere-policy-paper/Policy-Paper-
National-Implementation-of-the-GDPR_EN.pdf. Last accessed 16 August 2018
Gebauer L, Söllner M, Leimeister JM (2018) Vertrauensproblematiken im
Cloud-Computing-Umfeld. In: Krcmar H, Leimeister JM et al. (eds) Cloud-Services aus der
Geschäftsperspektive. Springer Fachmedien Wiesbaden, pp 59–69
Hennrich T (2011) Compliance in Clouds. Datenschutz und Datensicherheit in Datenwolken.
Computer & Recht (CR), Vol. 8/2011, pp 546–552
Hildebrandt M, Tielemans L (2013) Data protection by design and technology neutral law.
Computer Law & Security Review, Vol. 29 (2013), pp 509–521
Hoffmann SG (2016) Regulation of Cloud Services under US and EU Antitrust, Competition and
Privacy Laws. Peter Lang GmbH, Internationaler Verlag der Wissenschaften, Frankfurt am
Main
Hofmann JM (2016) Zertifizierungen nach der DS-GVO. In: ZD-Aktuell 2016, 05324
(online-resource)
Hofmann JM, Roßnagel A (2018) Rechtliche Anforderungen an Zertifizierungen nach der
DSGVO. In: Krcmar H, Eckert C et al. (eds) Management sicherer Cloud-Services. Springer
Fachmedien Wiesbaden, pp 101–112
Hornung G (2017) Art. 42. In: Auernhammer H (ed) DSGVO BDSG Kommentar, 5th edn. Carl
Heymanns Verlag, Cologne
Hornung G, Hartl K (2014) Datenschutz durch Marktanreize – auch in Europa? - Stand der
Diskussion zu Datenschutzzertifizierung und Datenschutzaudit. Zeitschrift für Datenschutz
(ZD) 2014, pp 219–225
Horwitz J (1982) The History of the Public/Private Distinction. University of Pennsylvania Law
Review, Vol. 130, No. 6 (1982), pp 1423–1428, available via http://www.jstor.org/stable/
3311976. Last accessed 16 August 2018
IT Governance Privacy Team (eds) (2017) EU General Data Protection Regulation (GDPR). An
Implementation and Compliance Guide, 2nd edn.
Jaatun MG, Pearson S, Gittler F, Leenes R (2014) Towards Strong Accountability for Cloud
Service Providers. IEEE 6th International Conference on Cloud Computing Technology and
Science, pp 1001–1006
Jarass H (2018) Art. 8. In: Jarass H, Pieroth B (eds) Grundgesetz für die Bundesrepublik
Deutschland: GG, Kommentar. C.H. Beck, Munich
Jentzsch N (2012) Was können Datenschutz-Gütesiegel leisten? Wirtschaftsdienst 2012, pp 413–
419
Jung A (2018) Datenschutz-(Compliance-)Management-Systeme – Nachweis- und
Rechenschaftspflichten nach der DSGVO. ZD (Zeitschrift für Datenschutz), 2018, pp 208–213
Kamara I, de Hert P (2018) Data Protection Certification in the EU: Possibilities, Actors and
Building Blocks in a Reformed Landscape. In: Rodrigues R, Papakonstantinou V (eds) Privacy
and Data Protection Seals. T.M.C. Asser Press, The Hague
Kinast K, Schröder M (2012) Audit & Rating: Vorsprung durch Selbstregulierung. Datenschutz als
Chance für den Wettbewerb. ZD (Zeitschrift für Datenschutz), 2012, pp 207–210
Koops BJ (2008) Should ICT Regulation be Technology-Neutral? In: Koops BJ, Lips M et al.
(eds) Starting Points for ICT Regulation. Deconstructing Prevalent Policy One-Liners. IT &
Law Series Vol. 9, T.M.C. Asser Press, The Hague, pp 77–108
Krcmar H, Eckart C et al. (eds) (2018) Management sicherer Cloud-Services. Entwicklung und
Evaluation dynamischer Zertifikate. Springer Fachmedien, Wiesbaden
Lachaud E (2018) The General Data Protection Regulation and the rise of certification as a
regulatory instrument. Computer Law & Security Review 34, 2018, pp 244–256
Loomans D, Matz M, Wiedemann M (eds) (2014) Praxisleitfaden zur Implementierung eines
Datenschutzmanagementsystems. Ein risikobasierter Ansatz für alle Unternehmensgrößen.
Springer Fachmedien, Wiesbaden
Martini M (2018) In: Paal BP, Pauly AD (eds) Beck’sche Kompakt-Kommentare.
Datenschutz-Grundverordnung, Article 24 GDPR. C.H. Beck Verlag, Munich
Martini M (2016) Do it yourself im Datenschutzrecht. NVwZ – Extra 6/2016, pp 1–10
Mitchell C (2015) Privacy, Compliance and the Cloud. In: Zhu/Hill/Trovati (eds) Guide to
Security Assurance for Cloud Computing. Springer International Publishing, pp 3–14
Niemann F, Hennrich T (2010) Kontrollen in den Wolken? Auftragsdatenverarbeitung in Zeiten
des Cloud Computing. Computer & Recht (CR) 10/2010, pp 686–692
Pfarr F, Buckel T, Winkelmann A (2014) Cloud Computing Data Protection – A Literature Review
and Analysis. 2014 47th Hawaii International Conference on System Science, pp 5018–5027
Power M (1996) The audit explosion. Demos. White Dove Press, London, available via https://
www.demos.co.uk/files/theauditexplosion.pdf. Last accessed 16 August 2018
Rodrigues R, Barnard-Wills D, De Hert P, Papakonstantinou V (2016) The future of privacy
certification in Europe: An exploration of options under article 42 of the GDPR. International
Review of Law 2016, Computers & Technology, available via http://dx.doi.org/10.1080/
13600869.2016.1189737. Last accessed 16 August 2018
Rodrigues R, Wright D, Wadhwa K (2013) Developing a privacy seal scheme (that works).
International Data Privacy Law, 2013, Vol. 3, No. 2, pp 100–116
Roßnagel A (2000) Datenschutzaudit. Konzeption. Durchführung. Gesetzliche Regelung. Vieweg,
Braunschweig
Roßnagel A (2011) Datenschutzaudit - ein modernes Steuerungsinstrument. In: Hampel L,
Krasmann S, Bröcking U (eds) Sichtbarkeitsregime. Überwachung. Sicherheit und Privatheit
im 21. Jahrhundert, pp 263–280
Roßnagel A (2018) Art. 2. In: Simitis S, Hornung G, Spiecker gen. Döhmann I (eds) Datenschutzrecht – DSGVO mit BDSG, Großkommentar. Nomos, Baden-Baden
Roßnagel A (ed) (2018) Das neue Datenschutzrecht. Europäische Datenschutz-Grundverordnung
und deutsche Datenschutzgesetze. Nomos, Baden-Baden
Roßnagel A, Nebel M, Richter P (2015) Was bleibt vom Europäischen Datenschutzrecht? - Überlegungen zum Ratsentwurf der DS-GVO. ZD 2015, pp 455–460
Roßnagel A, Sunyaev A, Batman A et al. (2017) AUDITOR: Neues Forschungsprojekt zur
Datenschutz-Zertifizierung von Cloud-Diensten nach der DS-GVO. ZD-Aktuell 2017, 05900
(online-resource)
Roßnagel A, Sunyaev A, Batman A, Lins et al. (2018) AUDITOR-Kriterienkatalog, draft version
v.07, research contribution 4 June 2018, available as a technical report via https://
publikationen.bibliothek.kit.edu/1000083222. Last accessed 16 August 2018
Schäfer C, Fox D (2016) Zertifizierte Auftragsdatenverarbeitung. Das Standard-ADV-Model.
Datenschutz und Datensicherheit (DuD), 2016, Vol. 11, pp 744–748
Schneider S, Sunyaev A et al. (2016) Entwicklung eines Kriterienkatalogs zur Zertifizierung von
Cloud Services. In: Krcmar H, Leimeister JM et al (eds) Cloud-Services aus der
Geschäftsperspektive. Springer Fachmedien, Wiesbaden, pp 337–349
Schultze-Melling J (2013) Datenschutz. In: Bräutigam P (ed) IT-Outsourcing und
Cloud-Computing. Eine Darstellung aus rechtlicher, technischer, wirtschaftlicher und
vertraglicher Sicht, 3rd edn. Erich Schmidt Verlag GmbH & Co., Berlin
Schwartmann R, Weiß S (2018) In: Schwartmann R, Jaspers A, Thüsing G, Kugelmann D
(eds) Heidelberger Kommentar. DS-GVO/BDSG. C.F. Müller, Heidelberg
Semmelmann C (2012) Theoretical Reflections on the Public-Private Distinction and their Traces
in European Union Law. Oñati Socio-legal Series (online), 2012, Vol. 2 (4), pp 25–59,
available via http://ssrn.com/abstract=2016077. Last accessed 16 August 2018
Siemen B (2006) Datenschutz als europäisches Grundrecht. Duncker und Humblot, Berlin
Spindler G (2016) Selbstregulierung und Zertifizierungsverfahren nach der DS-GVO. Reichweite und Rechtsfolgen der genehmigten Verhaltensregeln. ZD 2016, p 407
Spindler G, Thorun C (2015) Eckpunkte einer digitalen Ordnungspolitik. Politikempfehlungen zur
Verbesserung der Rahmenbedingungen für eine effektive Ko-Regulierung in der
Informationsgesellschaft, available via https://sriw.de/images/pdf/Spindler_Thorun-Eckpunkte_
digitale_Ordnungspolitik_final.pdf. Accessed 15 August 2018
Stone CD (1982) Corporate Vices and Corporate Virtues: Do Public/Private Distinctions Matter?
University of Pennsylvania Law Review, 1982 Vol. 130, No. 6, pp 1441–1509, available via
http://www.jstor.org/stable/3311978. Last accessed 16 August 2018
Tinnefeld C, Hanßen C (2018) In: Wybitul T (ed) (2018) Handbuch EU-Datenschutz-
Grundverordnung. Fachmedien Recht und Wirtschaft, dfv Mediengruppe Frankfurt am Main,
Commentary of Article 24 GDPR
van der Sloot B, Broeders D, Schrijvers E (2016) Exploring the boundaries of Big Data.
Amsterdam University Press
Weichert T (2010) Datenschutzzertifizierung – Vorteile für Unternehmen. ITK-Kompendium
2010, pp 274–279
Weichert T (2018) Art. 4. In: Däubler et al. (eds) EU-Datenschutz-Grundverordnung und
BDSG-neu. Kompaktkommentar. Bund-Verlag, Frankfurt am Main
Werner F (1959) Verwaltungsrecht als konkretisiertes Verfassungsrecht. DVBl, 1959, pp 527–533
Wybitul T (ed) (2018) Handbuch EU-Datenschutz-Grundverordnung. Fachmedien Recht und
Wirtschaft, dfv Mediengruppe, Frankfurt am Main

Ayşe Necibe Batman is an attorney in Frankfurt am Main, Germany, and advises and represents national and international clients in the areas of IT, data protection and privacy law, telecommunication law and public law. Previously, she worked as a research associate at the University of Kassel, Department of Public Law with a focus on the law of technology, within the interdisciplinary research project AUDITOR (European Cloud Service Data Protection Certification), funded by the Federal Ministry for Economic Affairs and Energy, and she is co-author of the AUDITOR catalogue of criteria for the data protection certification of cloud services under the EU GDPR. Batman studied law at the Goethe University in Frankfurt, Germany and at the Université Lumière Lyon 2 in France, with a focus on international and European public law, and finished her studies with the First State Examination. She completed her legal traineeship at the Berlin Court of Appeal (Kammergericht), with a placement inter alia at the Federal Constitutional Court (Bundesverfassungsgericht) in Karlsruhe, Germany, and concluded it with the Second State Examination. Besides her work as an attorney, she is writing a doctoral thesis on a data protection-related topic.
Chapter 15
Data Privacy Laws Response to Ransomware Attacks: A Multi-Jurisdictional Analysis

Magda Brewczyńska, Suzanne Dunn and Avihai Elijahu

Contents

15.1 Introduction...................................................................................................................... 282
15.2 What is Ransomware?..................................................................................................... 283
15.3 Ransomware Targeting Personal Data ............................................................................ 285
15.4 Ransomware as Information Security Failure................................................................. 286
15.5 Ransomware in the Light of Legal Obligations to Secure Personal Data ..................... 288
15.5.1 InfoSec and Security of Personal Data .............................................................. 288
15.5.2 “Security Safeguards Principle” in the OECD Guidelines................................ 288
15.5.3 European Union.................................................................................................. 289
15.5.4 Canada ................................................................................................................ 291
15.5.5 Israel.................................................................................................................... 293
15.5.6 Analysis .............................................................................................................. 295
15.6 Data Breach Notification Obligations ............................................................................. 298
15.6.1 Rationale ............................................................................................................. 298
15.6.2 European Union.................................................................................................. 299
15.6.3 Canada ................................................................................................................ 299
15.6.4 Israel.................................................................................................................... 300
15.7 Conclusion ....................................................................................................................... 301
References .................................................................................................................................. 302

M. Brewczyńska (&)
Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University,
Warandelaan 2, 5037AB Tilburg, The Netherlands
e-mail: m.m.brewczynska@uvt.nl
S. Dunn
Faculty of Law, University of Ottawa, 57 Louis-Pasteur Private, Ottawa, Canada
e-mail: suzie.dunn@uottawa.ca
A. Elijahu
Faculty of Law, University of Haifa, Sheshet Ha’yamim st. 282/6, Kiryat shemona, Israel
e-mail: avijai323@gmail.com
Abstract In recent years thousands of organisations have fallen victim to ransomware attacks. This malicious software disables access to users’ data and
demands payment of a ransom for its restoration. Cyberattacks like these are usually
thought of in the context of cybercrime, but because the data affected by ran-
somware is often personal data, such attacks also raise pertinent questions that need
to be examined under the light of data privacy laws. Considering that security has
always been central to the protection of personal data, this chapter proposes an
analysis of ransomware attacks through the lens of the well-established information
security model, i.e. the CIA (confidentiality, integrity, and availability) triad. Using
these three basic security principles, we examine whether ransomware will be
considered a data breach under data privacy laws and what the legal implications of
such breaches are. In order to illustrate these points, we will focus on ransomware
attacks that target organisations that process personal data and highlight three
examples of jurisdictions, namely the European Union (EU), Canada and Israel.

Keywords Ransomware · Malware · Information Security (InfoSec) · CIA triad · Data privacy · Data breach notification

15.1 Introduction

What Happened to My Computer?


Your important files are encrypted.
Many of your documents (…) are no longer accessible because they have been
encrypted. Maybe you are busy looking for a way to recover your files, but do not
waste your time. Nobody can recover your files without our decryption services.
Can I Recover My Files?
Sure. We guarantee that you can recover all of your files safely and easily.
(…) But if you want to decrypt all your files, you need to pay.
You only have 3 days to submit the payment. After that the price will be doubled.
Also, if you don’t pay in 7 days, you won’t be able to recover your files forever.1
In May 2017, this notification appeared on hundreds of thousands of screens
across the world, when users found their devices held hostage by the WannaCry
ransomware attack.2 Without access to their data, many organizations were unable
to function and the data stored on their devices were at risk of being lost perma-
nently. WannaCry is only one example of the increasing number of ransomware
attacks that are impacting individuals, industry, governments, law firms, hospitals,
1 This is part of the message sent with the 2017 WannaCry ransomware attack. See e.g. Petit 2017.
2 Furnell and Emm 2017, p. 6.
and other organizations. Ransomware is malicious software that disables access to either specific sets of data or the device’s entire system, until a ransom is paid by
the victim. In other words, it is a “malware which demands a payment in exchange
for a stolen functionality”.3 Oftentimes, such disruption affects personal data, i.e.
“any information relating to an identified or identifiable individual (data subject)”.4
Ransomware attacks are usually thought of in the context of cybercrime, but they
also raise pertinent questions that need to be examined under the light of the data
privacy laws.5 This chapter aims to contribute to the legal literature on security in the context of data privacy protection and to provide one of the first legal analyses focusing specifically on ransomware.
In an effort to better understand when ransomware will be considered a data
privacy breach (data breach), and what the implications of such a legal classification
are, this chapter will use concepts found in the information security (InfoSec)
literature to explore how ransomware attacks interfere with the principle of data
security as defined by data privacy laws. It will further examine potential data
breaches caused by ransomware and scrutinise respective data breach notification
obligations. To that end, first, we will explain what ransomware is and who is
targeted by it. Secondly, we will discuss when ransomware attacks could cause an
InfoSec failure in the light of the “CIA” (confidentiality, integrity and availability)
triad. Thirdly, we will look at the InfoSec and data security interplay and then argue
that ransomware attacks qualify as data breaches and consequently, may trigger
notification obligations under data privacy laws. In order to illustrate these points,
we will focus on attacks that target organisations processing personal data and
highlight three examples of jurisdictions with well-developed data privacy laws,
namely the European Union (EU), Canada and Israel.

15.2 What is Ransomware?

Ransomware is malicious software used to block access to an electronic device or the data stored within it, in order to extract a ransom from the device’s user.6 There are
two primary forms of ransomware: locker-ransomware and crypto-ransomware.7
When a device is infected with either of them, a warning announcement appears on
3 Gazet 2010, p. 77.
4 OECD 1980, Part One General Definitions, 1(b). We adopt this definition in an attempt to capture a universal term across all of the jurisdictions we refer to in this chapter. Due to space constraints, a discussion on the definition of and distinction between the terms: personal data, personally identifiable information (PII), personal information, etc. must be omitted.
5 For consistency purposes, we use the term “data privacy” following Bygrave’s suggestion that it “provides a bridge for synthesizing European and non-European legal discourses” (Bygrave 2014, p. 29).
6 Cabaj et al. 2018, p. 353.
7 Cabaj et al. 2018, p. 353; Gómez-Hernández et al. 2018, p. 389.
the screen, demanding a payment for the user to regain access to their device or data.
Attackers typically request payments in the form of a cryptocurrency, such as
Bitcoin, as it makes it extremely difficult to trace the transfer of cryptocurrency back
to the attackers to identify them.8
Locker-ransomware denies access to an infected device by disabling the display
or keyboard of the device, but it typically leaves the underlying system and files
intact.9 This type of ransomware can easily be dismantled using techniques and
tools that restore the computer system.10 Crypto-ransomware, by contrast, which we focus on in this chapter, is considered to be much more destructive11 because it prevents users from accessing their data by encrypting it.12 Encryption is a process
of disguising plaintext (e.g. a document which contains personal data), by turning it
into ciphertext, so that the original substance is hidden.13 Over time, the sophistication of ransomware attacks has intensified, as illustrated by the shift away from using only symmetric key cryptography, where the encryption and decryption keys are the same.14 Newer types of ransomware apply asymmetric cryptography,15 which uses a public key for encryption and a private key for decryption,16 making it even harder to recover the data through means other than paying the ransom.17
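To make this distinction more concrete, the following minimal Python sketch (our own illustration using the open-source cryptography library, not code drawn from any actual ransomware) contrasts the two approaches. Under the symmetric scheme, a single key both encrypts and decrypts, so recovering that key from the infected device restores the data; under the asymmetric scheme, the decryption key never needs to be present on the device at all.

# Minimal, hypothetical illustration of symmetric vs. asymmetric encryption
# (requires the third-party package: pip install cryptography).
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

data = b"example record containing personal data"

# Symmetric: one secret key encrypts and decrypts. If forensic analysis
# recovers this key from the device, the files can be restored.
secret_key = Fernet.generate_key()
token = Fernet(secret_key).encrypt(data)
assert Fernet(secret_key).decrypt(token) == data

# Asymmetric: the public key encrypts; only the matching private key
# (which an attacker would keep to themselves) can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = private_key.public_key().encrypt(data, oaep)
assert private_key.decrypt(ciphertext, oaep) == data  # only the private key works

Many crypto-ransomware families are in practice reported to combine the two (hybrid encryption): each file is encrypted with a fast symmetric key, which is in turn encrypted with the attacker’s public key, so that no key material left on the device suffices for recovery.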
Ransomware uses several tactics to infect a computer, but the more common
ones are: targeting unpatched software vulnerabilities, drive-by-download (infect-
ing a device by automatically downloading the malware, e.g. when a user visits a
particular website), malvertisement, and social engineering (e.g. sending the mal-
ware in an attachment in a legitimate looking email).18 Attackers also employ
various strategies to bypass anti-virus detection and evade analysis tools in order to
successfully deliver the malware.19
Each “family” of ransomware has its own strategy for infecting a device and
demanding a ransom.20 Nevertheless, many ransomware attacks share a similar
pattern of attack. Generally, once a device is infected, the malicious code contacts
the attacker through the Command and Control server and after establishing a
secure communication channel, it starts looking for further instructions. Instructions
may include identifying specific files that should be targeted for encryption,
8 Kshetri and Voas 2017, p. 2.
9 Cabaj et al. 2018, p. 353.
10 Gómez-Hernández et al. 2018, p. 391.
11 Gómez-Hernández et al. 2018, p. 391.
12 Gómez-Hernández et al. 2018, p. 389.
13 Schneier 2015, p. 1.
14 Schneier 2015, p. 4.
15 Cabaj et al. 2018, p. 354.
16 Schneier 2015, p. 4.
17 Cabaj et al. 2018, p. 354.
18 Palisse 2017, p. 13.
19 Palisse 2017, p. 13; Kharraz and Kirda 2017, p. 101.
20 Gómez-Hernández et al. 2018, p. 389.
reporting certain information back to the attacker, or removing the backup files to
prevent data restoration.21 Lastly, in most cases, ransomware modifies the booting
up process of the operating system so that only the ransom demand is displayed on
the screen.22
Both individual and networked devices are targeted by ransomware attacks, but
organizations that rely on connected networks have reported wider impacts due to
the swift spread of ransomware that is designed to automatically infect all of the
devices connected to the network.23 If the malware is not detected by the organi-
zation’s security system, it can spread to other devices in the network, disabling all
the devices and servers it successfully reaches.24

15.3 Ransomware Targeting Personal Data

Ransomware is evolving not only technically, but also in terms of its targets.25
Large-scale attacks on thousands of individual devices demanding smaller ransom amounts, like the WannaCry attack, which extorted between 300 and 600 US dollars per device,26 are one tactic used by ransomware attackers.27 However, more recently many attackers have realized the vulnerability of organizations that manage particularly valuable data, such as health data.28 According to a report by Osterman Research, attackers have begun to target specific organisations holding personal data, such as hospitals, due to the value resulting from the sensitivity of the data they manage.29 Certain organizations have been recognised as potentially willing to pay a much higher ransom to recover their data in a single attack.30
In addition to targeting specific organizations, some strains of ransomware target
certain types of files, which have been predetermined to be the most valuable for the
system owner, to increase the efficiency of the attack. As we mentioned before, in
some cases ransomware receives the instruction from the attacker to first scan the
contaminated device’s file system and, for instance, search for files with specific
extensions (e.g. pdf, doc, jpg) known to store important information, including
personal data, and prioritize encrypting those files.31
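The selection step just described amounts to little more than a walk over the file system filtered by extension. The short Python sketch below (our own simplified illustration; the extension list is hypothetical, as real ransomware families use their own) shows how trivial such targeting is; run defensively, the same inventory helps an organisation identify which files are likely to hold personal data and should be prioritised for secure backups.

from pathlib import Path

# Illustrative extensions commonly associated with documents that may
# contain personal data; purely an assumption for this example.
TARGETED_EXTENSIONS = {".pdf", ".doc", ".docx", ".xls", ".xlsx", ".jpg"}

def files_by_extension(root: str) -> list:
    """Return all files under `root` whose extension is in the target set."""
    return [p for p in Path(root).rglob("*")
            if p.is_file() and p.suffix.lower() in TARGETED_EXTENSIONS]

if __name__ == "__main__":
    for path in files_by_extension("."):
        print(path)  # e.g. feed this inventory into a backup job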
21 Liska and Gallo 2017, p. 10; Brewer 2016, p. 7.
22 See Gómez-Hernández et al. 2018, p. 391.
23 See e.g. Mansfield-Devine 2017a, p. 17.
24 O’Brien 2017, p. 59.
25 See e.g. Mansfield-Devine 2016, p. 12.
26 Kshetri and Voas 2017, p. 2.
27 O’Brien 2017, pp. 14, 17; McAfee 2018, p. 11.
28 Mansfield-Devine 2017b, p. 15.
29 Osterman 2016, pp. 8, 55.
30 Mansfield-Devine 2016, p. 9.
31 Gómez-Hernández et al. 2018, p. 393.
15.4 Ransomware as Information Security Failure

This section will shed some light on ransomware attacks from the perspective of
InfoSec. InfoSec is commonly defined by three primary notions: confidentiality,
integrity and availability, jointly known as the “CIA triad”. These concepts are
used, among others, by the International Organization for Standardization (ISO) and
can be found e.g. in the ISO/IEC 27000 standard, which defines InfoSec as the
“preservation of confidentiality, integrity and availability of information”.32 The
classic InfoSec canons serve as the assessment criteria to determine the level of
security that is in place in network and information systems,33 and are to some
extent reflected in data privacy laws, which we will examine later.
The first element, confidentiality, requires information to be kept secret and
protected from exposure to any unauthorized parties,34 which includes not only
protection from unauthorized individuals or entities, but also unauthorized pro-
cesses.35 Thus, to maintain confidentiality, organisations need to govern and control access to their systems and the data they process to prevent any unauthorized exposure.
The second element, integrity, requires the preservation of the data’s accuracy
and completeness.36 According to Andress, integrity means that data must not be
altered or deleted in an unauthorized or undesirable way.37 To maintain integrity,
organisations need to implement the means to prevent such changes or deletions
and have the ability to detect and reverse them if any occur.38
The last concept, availability, is understood as a “property of being accessible
and usable on demand by an authorized entity”.39 It can also be described as the
state of “systems, services, and networks being up and running”,40 which implies
the need for ensuring reliable and uninterrupted access to the information.
Whenever a cyberattack occurs, the assurance of the various InfoSec properties
is verified and any shortcomings in the preservation of the CIA triad components
are revealed. The attacks may result in an interruption, modification and intercep-
tion of the data, or a combination of those.41 Each type may interfere with one or
32 ISO/IEC 27000:2018 Information technology—Security techniques—Information security management systems—Overview and vocabulary, clause 3.28.
33 Porcedda 2018, p. 5.
34 Andress 2014, p. 6.
35 ISO/IEC 27000:2018, clause 3.10 defines confidentiality as “property that information is not made available or disclosed to unauthorized individuals, entities, or processes”.
36 ISO/IEC 27000:2018, clause 3.36.
37 Andress 2014, p. 6.
38 Andress 2014, p. 6.
39 ISO/IEC 27000:2009, clause 3.7.
40 Sherman et al. 2017, p. 371.
41 Andress 2014, p. 9.
more of the CIA principles and the strict lines between them cannot always be
drawn.42 For instance, it can be conceptually difficult to distinguish an availability
failure from an integrity failure when an attack results in the loss of information.
Therefore, for the purpose of this chapter, we propose to take account of the key functional characteristics of cyberattacks and pair them as follows: interruption—availability; modification—integrity; interception—confidentiality.
Accordingly, we argue that ransomware seems to fit best under the category of
interruption, since the attack locks up the data, impacting its availability until the
ransom is paid and access is restored, or the data is recovered by other technical
means. An interruption results in rendering data temporarily or permanently
unusable or unavailable43 and, as such, affects the InfoSec, primarily the principle
of availability. The severity of interruption can escalate if, in addition to encrypting
data, the ransomware also deletes volume shadow copies of the attacked files. Such
copies usually enable the restoration of the attacked files (see Sect. 15.2). There is
also the risk of losing the data entirely if the attacker does not release the decryption
key.
Ransomware also has some characteristics of an interception attack, which
typically allows an unauthorized user to access data or applications, but also
environments.44 Even though ransomware is not typically oriented towards breaching information confidentiality (in most cases the data are locked and the content is neither looked at nor revealed by the attacker), unauthorized access to the user’s environment is gained, which implies a confidentiality breach. In addition,
ransomware may also scan the device’s filesystem to determine which files to
encrypt or it may report the information found on the infected device back to the
attacker (see Sect. 15.3). A full understanding of the attack, including whether any
data was actually viewed, or what the attacker did with that data, besides encrypting
it, can be difficult to detect even with a forensic analysis.45
Furthermore, some classify ransomware as a variant of a data integrity attack.46
Indeed, the malware manipulates the processes that run on the device, and hence it should certainly be considered an issue of system integrity. However, the data per se is “modified” only in the sense that the attacked files are converted into an encrypted form. Therefore, provided there are no flaws in the encryption process that result in actual content modification, ransomware does not jeopardise the data as such.
42 Andress 2014, p. 9.
43 Andress 2014, p. 9.
44 Andress 2014, p. 9.
45 Gordon et al. 2016.
46 E.g. Ekstrom et al. 2017.
15.5 Ransomware in the Light of Legal Obligations to Secure Personal Data

15.5.1 InfoSec and Security of Personal Data

InfoSec and its principles are not only desired security characteristics of computer systems and networks but have also been given particular prominence in data privacy laws.47
Although the concepts essential to InfoSec are not mirrored exactly in the understanding of the security of personal data, there is a high degree of convergence between the two. For instance, the European Union Agency for
Network and Information Security (ENISA) considers security “central for the
protection of confidentiality, integrity and availability of personal data”.48 In a
similar vein, ISO/IEC 29100, which provides a privacy framework, specifies that in
the context of the protection of personal data, adhering to the InfoSec principles means,
among others, “to ensure the integrity, confidentiality and availability of [personal
data]”.49

15.5.2 “Security Safeguards Principle” in the OECD Guidelines

Before we illuminate where data privacy laws in the EU, Canada and Israel include
InfoSec concepts, the OECD Guidelines Governing the Protection of Privacy and
Transborder Flows of Personal Data are worth recalling.50 The “Security Safeguards
Principle” contained therein stipulates: “[p]ersonal data should be protected by
reasonable security safeguards against such risks as loss or unauthorised access,
destruction, use, modification or disclosure of data”.51 The risks listed in the Guidelines seem to match the three types of security failures discussed above, and thus might be paired with the CIA goals (see Table 15.1). Accordingly, the unauthorised
47 E.g. the OECD Guidelines (OECD 2013) and APEC Privacy Framework (APEC 2005), which use the term “Security Safeguards”; Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (no. 108) of the Council of Europe (COE 1981), which recognises the principle of “Data Security”.
48 ENISA, Security of personal data, https://www.enisa.europa.eu/topics/data-protection/security-of-personal-data. Accessed 27 August 2018.
49 ISO/IEC 29100 Information technology—Security techniques—Privacy framework, Section 5.11.
50 The reason for this is that both Canada and Israel, as well as almost all of the Member States of the EU (with the exception of Bulgaria, Croatia, Cyprus, Malta, Romania), are members of the OECD.
51 OECD 2013, principle 11.
Table 15.1 Security goal - Security failure - Risks according to OECD Guidelines

Security goal     Security failure   Risks according to OECD Guidelines
Confidentiality   Interception      Unauthorised access, disclosure or use
Integrity         Modification      Unauthorised modification
Availability      Interruption      Unauthorised loss or destruction

[Source: The authors]

access, disclosure, or use of data may lead to a breach of information confidentiality; unauthorised data modification is an example of infringing information integrity; and unauthorised loss or destruction of data affects information availability.

15.5.3 European Union

In the European Union, most of the personal data processing operations fall under
the new General Data Protection Regulation (GDPR)52 regime, which became
fully applicable in May 2018. Under the GDPR, the need for safeguarding security
plays an important role. Some say that, compared to the previous Data Protection Directive,53 the GDPR takes a more comprehensive approach to data security.54 Similarly to the Security Safeguards Principle in the OECD Guidelines,
Article 5(1)(f) of the GDPR provides that personal data should be processed “in a
manner that ensures appropriate security of the personal data, including protection
against unauthorised or unlawful processing and against accidental loss, destruction
or damage, using appropriate technical or organisational measures”. Despite
touching on characteristics of all three CIA concepts in the wording of the cited provision, the GDPR spells out only two of them directly, namely integrity and confidentiality, and refrains from mentioning the third element of the CIA triad—availability. Nevertheless, the recognition of the importance of the availability of data by the EU legislator seems indisputable. In addition to Article 5(1)(f) of the
GDPR, which obliges organizations to prevent “accidental loss, destruction or
damage” of data, the obligation to maintain the availability of data can be inferred
from the principle of transparency55 and provisions which establish the right to
access personal data by a data subject. In principle, the data controller is obliged to
52 European Parliament and Council Regulation 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/59.
53 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995], OJ L 281.
54 Jay 2017, p. 131.
55 GDPR, Article 5(1)(a).
provide information to the data subject, including a copy of the personal data
undergoing processing, without undue delay and in any event within one month of
receipt of the request.56
Furthermore, the GDPR lays down an obligation to implement technical and
organizational measures to ensure a level of security appropriate to potential risks
and lists several examples of such measures.57 Interestingly, this is the only place where the GDPR explicitly refers to the CIA triad, providing that organisations should have “the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services”.58 In this respect, ENISA has contended that, in the GDPR, security “equally covers” all three InfoSec attributes;59 however, it should be noted that the GDPR speaks of the CIA goals with respect to “systems and services”, not specifically with respect to personal data.
This brings us to the question of the consequences of noncompliance with the
security principles, and whether such situations will always amount to a data breach
under the GDPR. A data breach is “a breach of security leading to the accidental or
unlawful destruction, loss, alteration, unauthorised disclosure of, or access to,
personal data transmitted, stored or otherwise processed”.60 The definition of a data
breach and the principles established in Article 5(1)(f) of the GDPR correspond, albeit not in their entirety. Article 5(1)(f) seems to provide a non-exhaustive list of risks against which personal data should be protected, whilst the definition of a data breach seems to call for a narrow interpretation. In other words, not all InfoSec incidents, but only those which “lead to” the specifically enumerated consequences for personal data, can be considered data breaches.
Having argued earlier that the key characteristics of ransomware make this form of attack best categorized as an interruption attack, and that ransomware is thereby best viewed as an availability issue, it appears that, on the basis of the GDPR definition of a data breach, the most relevant element is “the loss of data”. Since the GDPR does not elaborate on whether “the loss” needs to be permanent and irreversible in order to be considered a data breach, one may argue that because the files are only encrypted, and the possibility to restore the data by paying the ransom or with other measures still exists, there is no actual “loss of data”, and
thus no breach. On the other hand, until the access is restored, the data remains
unusable, which in practice does not differ much from its destruction in that
moment. The Article 29 Working Party (WP29)61 endorsed the second interpre-
tation and stated that “a security incident resulting in personal data being made
56 GDPR, Articles 15 and 13(3).
57 GDPR, Article 32.
58 GDPR, Article 32(1)(b) (emphasis added).
59 ENISA 2016, p. 8.
60 GDPR, Article 4(12).
61 Prior to the establishment of the European Data Protection Board (EDPB), WP29 was an advisory body consisting of representatives from the data protection authorities of each EU Member State, the European Data Protection Supervisor and the European Commission.
unavailable for a period of time is also a type of breach, as the lack of access to the
data can have a significant impact on the rights and freedoms of natural persons”.62
Furthermore, WP29 took the position that regardless of the loss of data, ran-
somware would still qualify as a data breach, due to the network intrusion that must
have occurred.63 Regrettably, when speaking of ransomware as a problem of
confidentiality, WP29 did not take account of the phrasing used in the discussed
definition and the issue of differentiating between unauthorised access to data and unauthorised access to a system.

15.5.4 Canada

Like the GDPR, Canada’s federal privacy legislation,64 the Personal Information
Protection and Electronic Documents Act (PIPEDA),65 also reflects the InfoSec
principles of confidentiality, integrity, and availability. Two of PIPEDA’s ten key
principles66 most closely encapsulate the CIA concepts: safeguards and individual
access. The principle of safeguards is outlined in Section 4.7 of Schedule 1 of
PIPEDA. Clause 4.7.1 requires that organizations establish safeguards to “protect
62 Article 29 Working Party 2018, p. 8.
63 Article 29 Working Party 2018, p. 9.
64 It should be noted that Canada has various levels of privacy legislation, but for the purpose of this chapter we will only focus on the federal privacy legislation that protects personal information collected in the course of commercial activities, PIPEDA. For other Canadian privacy legislation see the federal Privacy Act, RSC, 1985, c P-21; provincially based privacy legislation, including Alberta’s Personal Information Protection Act, SA 2003, C P-6.5; British Columbia’s Personal Information Protection Act, SBC 2003, c 63; Quebec’s An Act Respecting the Protection of Personal Information in the Private Sector, SQ C P-39.1; provincial privacy legislation that addresses health privacy, including Alberta’s Health Information Act, RSA 2000, c H-5; British Columbia’s E-Health (Personal Health Information Access and Protection of Privacy) Act SBC 2008, C-38; Manitoba Personal Health Information Act, CCSM, c P33.5; New Brunswick’s Personal Health Information Privacy and Access Act, SNB 2009, c. P-7.05; Newfoundland and Labrador’s Personal Health Act, SNL 2008 C P-7.01; Northwest Territories, Health Information Act, SNWT 2014, c-2; Nova Scotia’s Personal Health Information Act, SNS 2010, c 41; Ontario’s Personal Health Information Protection Act, SO 2004, C 3, Sched A; Quebec’s An Act to amend the Act respecting health services and social services, the Health Insurance Act and the Act respecting the Régie de l’assurance maladie du Québec, SQ 2008, C-8; Saskatchewan’s Health Information Protection Act, SS, C H-0.021; Yukon’s Health Information Privacy and Management Act, SY 2013, c 16; and provincial, territorial, and municipal governments may have applicable public sector privacy legislation that covers institutions such as universities and governments, such as Ontario’s Freedom of Information and Protection of Privacy Act, RSO 1990, c F31.
65 Personal Information Protection and Electronic Documents Act (PIPEDA), SC 2000, c 5, Section 4.7.
66 PIPEDA is based on 10 key principles: Accountability, Identifying Purposes, Consent, Limiting Collection; Limiting Use, Disclosure, and Retention; Accuracy, Safeguards, Openness, Individual Access, and Challenging Compliance.
personal information against loss or theft, as well as unauthorized access, disclosure, copying, use, or modification”.67 Failure to do so will be considered a breach
under PIPEDA. As we noted earlier, ransomware poses a serious threat to the loss
of personal data, and arguably leads to the unauthorized access of data and the
potential modification thereof (see Sect. 15.4). As such, the failure of an organi-
zation to safeguard data from a ransomware attack could be considered a breach
under Canada’s privacy legislation.68
A breach of security safeguards is defined under Section 2(1) of PIPEDA as the
“loss of, unauthorized access to or unauthorized disclosure of personal information
resulting from a breach of an organization’s security safeguards that are referred to in
Clause 4.7 of Schedule 1 or from a failure to establish those safeguards”.
Organizations in Canada are thus required to implement adequate safeguards to
protect electronically stored private data69 from ransomware attacks. Clause 4.7.3 of
PIPEDA offers examples of some safeguards that could protect private data from
privacy data breaches, including technological solutions such as encryption.70
PIPEDA Case Summary #2009-017 held that it can be beneficial to safeguard
personal data by encrypting data that are accessible via the internet.71 Understanding
that it can be difficult to determine the level of access a ransomware attacker had to
the data (see Sect. 15.4),72 encryption could be a beneficial safeguard from ran-
somware attacks as it would make the data undecipherable to the attacker, should
they get access to the system. It must be noted, however, that encrypting data can
only protect against a confidentiality breach by making the data unreadable by the
attacker, but it does not secure the data from being rendered unavailable as a result of
a ransomware attack, which may require additional safeguards to meet the obliga-
tions under PIPEDA, such as having secure back-ups of the data in place or
implementing protective software and employee practices to prevent attacks.73
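By way of illustration of one such additional safeguard, the following Python sketch (our own hypothetical example, not a measure mandated by PIPEDA) records SHA-256 digests of backed-up files in a manifest, so that an organisation can later verify that its fallback copies are complete and unmodified before relying on them for restoration after an attack.

import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks to avoid loading it into memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(backup_dir: str, manifest: str = "manifest.json") -> None:
    """Record a digest for every file in the backup directory."""
    digests = {str(p): sha256_of(p)
               for p in Path(backup_dir).rglob("*") if p.is_file()}
    Path(manifest).write_text(json.dumps(digests, indent=2))

def backup_is_intact(manifest: str = "manifest.json") -> bool:
    """Re-hash each file and confirm nothing was altered or deleted."""
    digests = json.loads(Path(manifest).read_text())
    return all(Path(p).is_file() and sha256_of(Path(p)) == d
               for p, d in digests.items())

A manifest of this kind does not prevent an infection, but it supports the availability and integrity of personal data by detecting whether the backups themselves have been tampered with before they are used.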
For example, while the WannaCry attack relied on a security vulnerability in a
software program that could have been prevented by a technological upgrade, many
ransomware attacks are successful because an employee is unaware of ransomware
tactics and opens up an infected file in an email, which is a non-technical security
vulnerability.74 To safeguard against non-technical breaches, employee training is
67 PIPEDA, c 5, Section 4.7.
68 See PIPEDA Report of Findings #2014-015, “After a significant Adobe data breach, customer questions company’s security safeguards and the response it provided about impacts on his personal information” (3 September 2014), which involved safeguarding from cyberattacks.
69 PIPEDA Report of Findings #2014-015, “After a significant Adobe data breach, customer questions company’s security safeguards and the response it provided about impacts on his personal information” (3 September 2014).
70 PIPEDA, Schedule 1, s 4.7.3.
71 PIPEDA Case Summary #2009-017, Third-Party landlord organization collected, used and disclosed tenants’ personal information without their consent.
72 Gordon et al. 2016.
73 Al-rimy et al. 2018; Office of the Privacy Commissioner (2018).
74 O’Brien 2017, p. 20.
needed, as well as technological safeguards. To meet their obligations under PIPEDA, organizations will need to consider the layers of security safeguards that need to be
in place to prevent breaches related to ransomware attacks.
Interestingly, in its report #2014-004, the Canadian Office of the Privacy
Commissioner noted that “the fact that a breach has occurred is not necessarily
indicative of a contravention of the Act. For example, an organization may have
appropriate safeguards in place and still fall victim to a determined, clever and/or
innovative attacker”.75 This was demonstrated when a company faced a “zero day”
cyberattack, meaning that the organization could not have known about the vul-
nerability at the time of the attack, but had appropriate safeguards in place including
the use of encryption and intrusion detection systems.76 Even though personal
information was accessed during the attack, the Commissioner determined that the
organization had met its obligations under PIPEDA. This illustrates that organi-
zations are not obliged to protect against unimaginable attacks but must have
adequate safeguards in place that are relevant to the sensitivity of the personal
information under their control and the knowledge of cyber threats available at that
time. Considering the significant number of organizations that have been impacted
by ransomware and the increasing awareness of these attacks, organizations will
need to implement safeguards that are appropriate for the data they manage to
protect against these attacks.
The second relevant key principle from PIPEDA, individual access, is found in
Clause 4.9. It establishes the individual’s right to access their personal data. This
principle ties to the concept of availability. Under this clause an organization must
give an individual access to their personal data when requested, within a reasonable
time.77 In order to comply with this obligation, an organization must maintain
reliable access to the personal information it manages, something that is prevented
when the organization is temporarily locked out of its data, or when the data is lost
altogether, due to a ransomware attack. A successful ransomware attack could thus
constitute a breach under PIPEDA by making the data inaccessible.

15.5.5 Israel

The Israeli privacy legislation, in a manner that sets it apart from the EU or
Canadian legislation, is built like a funnel. The Protection of Privacy Act (PPA)78
sits at the top of the funnel and prescribes the general principles and obligations of

75 PIPEDA Report Findings #2014-004, "Online service provider that suffered a breach had appropriate safeguards in place" (23 April 2014), p. 2.
76 PIPEDA Report Findings #2014-004, "Online service provider that suffered a breach had appropriate safeguards in place" (23 April 2014), p. 2.
77 PIPEDA, Sections 4.9 & 4.9.4.
78 Protection of Privacy Act (PPA), 5741–1981 of 11 March 1981.

the owner of the database,79 while the Privacy Protection (Data Security) Regulations
(PPDS),80 which are lower in the hierarchy of norms, specifically guide how to
carry out the duties prescribed in the PPA. The PPA and the PPDS jointly establish
an obligation to ensure that security safeguards are in place to protect personal data.
Chapter 2 of the PPA (sections 7-17I) deals with Protection of Privacy in
Databases. Section 17 of the PPA states that “[a] database owner, possessor or
manager are each liable for the information security in the database”. Information
security “means protection of the integrity of the information, or protection of the
information from being exposed, used or copied, without legal permission”.81 On
the face of it, it seems that Israeli data privacy law covers only two notions of the
CIA triad, namely integrity and confidentiality. However, this formulation must be
read in conjunction with the further explanation of the term “information integrity”,
which requires that “the data in the database is identical to the source from which it
was drawn, not having been changed, delivered or destroyed without lawful per-
mission".82 Interestingly, unlike the GDPR and PIPEDA, the PPA does not mention
data loss. Consequently, only the reference to destruction in this provision may
imply that the Israeli legislation recognises the need to preserve the availability of
data.
Furthermore, the availability principle can be inferred from Article 13(a) of the
PPA, which anchors the "Right to inspect information" and stipulates that "every
person is entitled to inspect (…) any information relating to such person kept in a
database”. It seems that, although it is not explicitly stated in PPA, the notion of
integrity has been given a broad meaning, which in certain circumstances also
covers data availability.
As previously discussed, although ransomware attacks do not typically expose,
use (including disclosure, transfer and delivery)83 or copy the data, they may
potentially lead to a violation of the right of the individual enshrined in Article 13 of
the PPA. Given that, in Israel, protecting information from destruction is part of the
idea of information integrity, the destruction of the data by a ransomware attack
should be considered a violation of the database owner’s obligation to secure the
database, which is considered a civil wrong under the PPA.
Finally, one may claim that ransomware encryption is a form of data change, as
the modified data is no longer "identical" to the original. However, as we argued
before, it is open to interpretation whether the integrity of data should be under-
stood as maintaining the continuity of its content or of its file format.
What is unique about Israeli legislation is that the PPDS differentiates between
three levels of databases, which are categorized as basic/medium/high security.

79 "Owner of a database" is the Israeli equivalent of data controller and "possessor" is the equivalent of data processor (see e.g. Tene 2017).
80 Privacy Protection (Data Security) Regulations (PPDS), 5777–2017 of 8 May 2017.
81 PPA, Section 7.
82 PPA, Section 7.
83 PPA, Section 3.

A database is subject to a medium security level if it contains information about a
person's intimate life, medical information or mental condition, genetic information,
political opinions, religious beliefs, criminal records, telecommunication data,
biometric information, financial situation or consumption habits, or if its main
purpose is to collect data in order to transfer it to a third party. A database is subject
to a high security level if it contains the data mentioned above (at the medium level)
and also contains information about 100,000 people or more, or has more than 100
authorized users.84 A database is subject to a basic security level if it contains
personal data other than that listed above.
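The tiering logic just described lends itself to a schematic restatement. The following Python sketch is an interpretive simplification, not a statement of Israeli law; the category names are our own shorthand, while the thresholds of 100,000 data subjects and 100 authorized users come from the PPDS as described above.

```python
# Interpretive restatement of the PPDS three-tier classification described
# above; a simplification for illustration, not legal advice.
SENSITIVE_CATEGORIES = {
    "intimate_life", "medical", "mental_condition", "genetic",
    "political_opinions", "religious_beliefs", "criminal_records",
    "telecommunication", "biometric", "financial", "consumption_habits",
}

def ppds_security_level(categories: set, main_purpose_is_transfer: bool,
                        data_subjects: int, authorized_users: int) -> str:
    """Classify a database as 'basic', 'medium' or 'high' security."""
    sensitive = bool(categories & SENSITIVE_CATEGORIES) or main_purpose_is_transfer
    if sensitive and (data_subjects >= 100_000 or authorized_users > 100):
        return "high"
    if sensitive:
        return "medium"
    return "basic"

# Example: a medical database with 250,000 data subjects is high security.
print(ppds_security_level({"medical"}, False, 250_000, 40))  # -> high
```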
The obligations set out in the PPDS can be divided into two main types of duties
for the owners of databases: legal obligations, including the duty to document
security incidents and report them to the Registrar of Databases (the "Registrar"),
and technological obligations, including the obligation to ensure secure and updated
management of the database systems. Interestingly, in the context of ransomware,
the PPDS expressis verbis provide that when a database is subject to a medium or
high security level, the controller is required to retain a backup copy of the data to
ensure its integrity and the ability to restore the data in case of loss or destruction.85
The PPDS define two kinds of security incidents. The first category is a "non-severe
security incident", meaning every incident "raising concern regarding a breach of
the data integrity, unauthorized use thereof or deviation from authorization".86 The
second type is a "severe security incident", meaning an incident "involving the use of
data from the database without authorization or in excess of authorization, or damage to
the data integrity in databases subject to a high security level or to a substantial part of the
database in databases subject to medium security level".87 A non-severe security
incident need only be documented; a severe security incident also requires
immediate reporting to the Registrar (see Sect. 15.6.4).
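Read together with the security levels sketched above, the reporting trigger can likewise be restated schematically. The function below builds on the `ppds_security_level` sketch from the previous example and reflects our reading of the definitions quoted above; the boolean inputs are simplifying assumptions, not statutory terms.

```python
# Interpretive sketch of the PPDS "severe security incident" definition
# quoted above; the boolean inputs are simplifying assumptions.
def is_severe_incident(security_level: str, unauthorized_use: bool,
                       integrity_damaged: bool,
                       substantial_part_affected: bool) -> bool:
    incident = unauthorized_use or integrity_damaged
    if security_level == "high":
        return incident
    if security_level == "medium":
        # For medium-level databases, severity requires that a substantial
        # part of the database be affected.
        return incident and substantial_part_affected
    return False  # Basic-level databases carry no reporting duty.

# Ransomware damaging data integrity in a high-security database is a
# severe incident and must be reported to the Registrar immediately.
print(is_severe_incident("high", False, True, False))  # -> True
```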

15.5.6 Analysis

The OECD Guidelines and laws in the EU, Canada and Israel acknowledge the
importance of ensuring security to protect data privacy. The preceding sections used

84 "Authorized user"—a person who has access to one of the following with the permission of the database controller or processor: (1) data from the database; (2) database systems; (3) information or a component which is required for operating or accessing the database. Notwithstanding the above, a processor who is not an individual, or an individual who obtained access on the basis of the processor's permission, will not be considered an authorized user of the database controller.
85 PPDS, §17B: "In a database subject to medium or high security level, the database controller will back up the data retained as per Sub-Regulation (a) in a manner ensuring that the data can be restored to its original form at all times".
86 PPDS, §1.
87 PPDS, §1.

the CIA triad as a common framework to examine how the InfoSec principles have
been reflected in data privacy laws and how these laws respond to the threats posed
by ransomware attacks. Considering that, from the InfoSec point of view,
ransomware mainly impacts availability, we examined whether the analysed laws
provide for an obligation to keep the data available (see Table 15.2).
The result of our analysis is that in the three legal frameworks presented, the
principle of data availability is implicitly recognised, since it can be inferred
from the obligations imposed upon data controllers to enable individuals to access,
request or inspect the data. Furthermore, the laws require organisations to
protect personal data from unauthorised "loss" (GDPR, PIPEDA) or "destruction"
(GDPR, PPA). An event in which the availability of data is compromised due to its
loss or destruction thus seems to fall under the definition of a data breach.
Considering the other two CIA principles, we note that in all three jurisdictions
the obligation to prevent interception attacks (i.e. to protect data from unauthorised
access or disclosure) and thereby safeguard its confidentiality has been given a
prominent place. The understanding of the notion of data integrity across given
jurisdictions has proved more challenging. As we discussed earlier, in the field of
InfoSec, the concept of integrity is primarily associated with maintaining accuracy
and completeness of information. In the data privacy context, and particularly in the
Israeli law, the term integrity seems to have a broader meaning or can even be an
overarching concept, which covers various forms of unauthorised operations on
data. The difficulty with considering ransomware a data breach on the grounds of
compromising data integrity lies in the fact that the data protection laws seem to
adopt a narrow understanding of integrity, which refers to the integrity of the data
and not to system integrity. Furthermore, one may argue that when the laws
speak of "unauthorised alteration" (GDPR), "unauthorised modification" (PIPEDA)
or "change without lawful permission" (PPA), this does not extend to the process of
encryption by ransomware, which, in principle, does not modify the content of the
information but affects only the structure of the file. Nevertheless, as we indicated
before, one may also claim that ransomware encryption can be understood as an
alteration, modification or change.88
In regard to safeguards, none of the laws analysed provides specific guidelines on
how to determine which safeguards would be sufficient to protect against a
ransomware attack under the GDPR, PIPEDA or PPA. All these laws adopt a
risk-based approach with regard to the technical and organisational measures that
may be deemed "adequate" or "appropriate". The risk assessment may depend on
factors such as the relative sensitivity and quantity of the personal data collected.
For instance, organizations that store highly sensitive personal data may be
required to implement more stringent safeguards. Implementing backup

88 E.g. Ekstrom et al. 2017.
Table 15.2 Security goals in the GDPR, PIPEDA and PPA

Security goal | GDPR, Article 5(1)(f) | GDPR, Article 4 ind. 12 | PIPEDA, Clause 4.7 & 4.9 (Schedule 1) | PIPEDA, Section 2(1) | PPA, Section 7 | PPA, Section 16
Confidentiality | Unauthorised or unlawful processing | Unauthorised disclosure or access | Unauthorized access, disclosure, copying, use | Unauthorized access or disclosure | Exposure, use, copying or delivery without lawful permission | Disclose without lawful permission
Integrity | – | Accidental or unlawful alteration | Unauthorized modification | – | Change without lawful permission | –
Availability | Accidental loss, destruction or damage | Accidental or unlawful destruction or loss | Unauthorized loss or theft | Unauthorized loss | Destruction without lawful permission | –

[Source: The authors]

procedures that ensure ongoing access to the data, updating software to patch
vulnerabilities and prevent unauthorized access by malware, training employees on
how to identify and avoid ransomware attacks, creating air-gaps such as network
segregation, and deploying filtration systems that stop malware-ridden emails from
reaching their targets are all useful safeguards for organizations to have in place.
An organization which does not have, at the very least, an adequate backup
procedure that allows it to restore its data following a ransomware encryption is at
risk of losing all of the data captured in the attack.
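To illustrate the minimum backup safeguard argued for above, the following Python sketch creates timestamped copies with recorded content hashes, so that data encrypted in an attack can be restored and the restore verified. It is illustrative only: paths and the retention scheme are assumptions, and a real deployment would keep the copies offline or otherwise out of reach of ransomware running on the compromised system.

```python
# Illustrative versioned backup with integrity verification, as one element
# of the layered safeguards discussed above. Paths and the retention scheme
# are assumptions; real back-ups should also be kept offline or immutable.
import hashlib
import shutil
import time
from pathlib import Path

def back_up(source: Path, backup_dir: Path) -> Path:
    """Copy `source` into `backup_dir` under a timestamped name and record
    its SHA-256 digest so that a later restore can be verified."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%dT%H%M%S")
    target = backup_dir / f"{source.name}.{stamp}"
    shutil.copy2(source, target)
    digest = hashlib.sha256(target.read_bytes()).hexdigest()
    (backup_dir / (target.name + ".sha256")).write_text(digest)
    return target

def verify(backup: Path) -> bool:
    """Check that a backup still matches its recorded digest, e.g. before
    restoring it after a ransomware incident."""
    recorded = (backup.parent / (backup.name + ".sha256")).read_text().strip()
    return hashlib.sha256(backup.read_bytes()).hexdigest() == recorded
```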

15.6 Data Breach Notification Obligations

15.6.1 Rationale

An organization which has experienced a data breach may have an obligation to
notify the competent privacy body about the breach, and sometimes also the persons
whom the data concerns. The low level of reporting becomes apparent when the
number of ransomware attacks reported to data privacy bodies in recent years is
compared to the number of known ransomware attacks, the latter far exceeding the
former.89 It seems that organizations are reluctant to admit to being attacked by
ransomware, in part because a ransomware infection reflects poorly on an
organization's reputation and can undermine its clients' trust.90
Due to the introduction of mandatory reporting, the level of data breach
reporting is expected to change in all three jurisdictions analysed. At the time of the
legislative work on the GDPR, the European Commission explained that one of the
main reasons for adopting such a solution was that “breach notifications provide a
systematic feedback about the actual risk and the actual weaknesses of existing
security measures; they enable authorities and consumers to assess the relative
capabilities of data controllers with respect to data security; they force data con-
trollers to assess and understand their own situation regarding security measures”.91
These reasons seem universal also for Canada and Israel.

89 The Office of the Privacy Commissioner recognized the large number of ransomware attacks in Canada (Parsons 2016), but has not published a single case or report on a reported ransomware attack.
90 Arnold and Oates 2016; OECD 2011.
91 European Commission 2012, p. 99.

15.6.2 European Union

Despite the initial proposal of the European Commission to introduce a notification
obligation for practically all data breaches, in the final text of the GDPR this
obligation was significantly reduced and a so-called layered approach was adopted.92
Accordingly, when a data breach occurs, the data controller must assess whether it
is likely to result in a risk to the rights and freedoms of the individuals whose data
are concerned. In the case of a ransomware attack, if the organisation determines
that there is no risk, because, for example, the attacked files had been encrypted by
the data controller (i.e. there is no confidentiality breach) and the data can be
restored from backup copies, it is unlikely that a notification obligation would
apply. The organisation must still document the incident for potential future
controls.93
Whenever the data controller concludes that there was a data breach, and that it is
likely to result in a risk to natural persons, a notification to the competent
supervisory authority is required. In addition, the controller must assess the level of
risk. Should the risk be assessed as high, the organisation is also required to
communicate the data breach to the data subjects.94
The GDPR provides for some exceptions to the data breach notification rules.
For instance, communication to the individuals is not required if the data controller
“has implemented appropriate technical and organisational protection measures,
and those measures were applied to the personal data affected by the personal data
breach, in particular those that render the personal data unintelligible to any person
who is not authorised to access it, such as encryption”.95
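Schematically, the layered approach described above, including the encryption exception, can be restated as follows. This is an interpretive simplification of Articles 33 and 34 as discussed in this section, not an operational compliance tool; the three risk categories are a simplifying assumption.

```python
# Interpretive sketch of the GDPR's layered breach-notification approach
# (document -> notify the authority -> communicate to data subjects). The
# three risk categories are a simplifying assumption.
def gdpr_breach_duties(risk: str, data_unintelligible_to_attacker: bool):
    duties = ["document the incident internally (Article 33(5))"]
    if risk in ("risk", "high risk"):
        duties.append("notify the supervisory authority (Article 33)")
    if risk == "high risk" and not data_unintelligible_to_attacker:
        # Controller-side encryption can lift the duty to communicate the
        # breach to data subjects (Article 34(3)(a)).
        duties.append("communicate the breach to the data subjects (Article 34)")
    return duties

# A high-risk ransomware breach of unencrypted data triggers all three duties.
print(gdpr_breach_duties("high risk", data_unintelligible_to_attacker=False))
```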

15.6.3 Canada

In Canada, mandatory breach notifications for certain breaches came into force in
November 2018. Notification is now required under PIPEDA where the organi-
zation has failed to adequately safeguard the personal information under its control
and there is a risk of significant harm to the person whose information was
impacted.96 Significant harms include “bodily harm, humiliation, damage to rep-
utation or relationships, loss of employment, business or professional opportunities,

92 de Hert and Papakonstantinou 2016, p. 191.
93 GDPR 2016, Articles 31, 33, 34.
94 GDPR 2016, Article 34.
95 GDPR 2016, Article 34(3)(a).
96 PIPEDA 2000, Section 10.1(7) of the new provisions defines significant harm as: bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.

financial loss, identity theft, negative effects on the credit record and damage to or
loss of property.”97
In determining whether a breach requires notification, each incident will be
assessed independently, examining the context of the breach, factoring in the
sensitivity of the personal information involved and the probability the information
could be misused, along with other prescribed factors.98 Therefore, if a ransomware
breach has occurred where there is a risk of significant harm, the organization will
be obliged to notify the individuals affected as well as the Privacy Commissioner.
Regardless of whether the breach is serious enough to require notification, every
time there has been a breach the organization must make a record of it, and this
record of breaches must be provided to the Privacy Commissioner upon request.99
Under PIPEDA, not all attacks will be sufficiently harmful to require the orga-
nization to report to the Privacy Commissioner or the individual whose data were
affected. For example, in cases where an organization has backups in place to
restore access to the data and the data that fell victim to a ransomware attack was
encrypted so the attackers could not read its content, there may be no harm to the
individual that the data concerns. However, if the data has been breached and it is
unclear whether the attacker has read or copied the data or there is evidence that the
data was accessed or copied, there could be a significant risk of harm and the
organization would be obliged to report. Additionally, if the sole copy of the data is
lost completely, the risk of harm would increase and more likely require reporting.
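In parallel with the GDPR sketch above, the PIPEDA position described in this section can be restated schematically. Again, this is an interpretive simplification; the boolean inputs are assumptions standing in for the contextual assessment that PIPEDA actually requires.

```python
# Interpretive sketch of the PIPEDA duties described above: every breach is
# recorded, while notification turns on the risk of significant harm. The
# boolean inputs are simplifying assumptions.
def pipeda_breach_duties(encrypted_by_controller: bool,
                         restorable_from_backup: bool,
                         possible_access_or_copying: bool,
                         sole_copy_lost: bool):
    duties = ["record the breach internally (Sections 10.3(1)-(2))"]
    harm_risk = (
        possible_access_or_copying
        or sole_copy_lost
        or not (encrypted_by_controller and restorable_from_backup)
    )
    if harm_risk:
        duties.append("notify the Privacy Commissioner and the affected "
                      "individuals (Section 10.1)")
    return duties

# Encrypted data restored from a backup: record-keeping only, no notification.
print(pipeda_breach_duties(True, True, False, False))
```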

15.6.4 Israel

In Israel, under the PPDS, the duty to report to the Registrar applies only in the event
of a severe security incident. Such an incident may occur either in a database subject
to a high security level—when an incident involves the use of data from the database
without authorization or in excess of authorization, or damage to the data integrity;
or in a database subject to a medium security level—when an incident involves the
use of a substantial part of the database without authorization or in excess of
authorization, or damage to the data integrity with respect to a substantial part of the
database.
Therefore, where the data is backed up and there is no fear of harm to the integrity
of the data, there is no obligation to report. Moreover, a ransomware attack on a
database subject to a medium security level, where the data is not backed up but the
attack did not damage a substantial part of the database, would also not trigger a
reporting obligation. Furthermore, there is no duty to

97 PIPEDA 2000, Section 10.1(7).
98 PIPEDA 2000, Section 10.1(8).
99 PIPEDA 2000, Sections 10.3(1)–(2).

report incidents within databases subject to a basic security level. In July 2018, the
Israeli Privacy Protection Authority (RAMOT) published guidelines explaining
which events are considered severe security incidents.100 Interestingly, according to
these guidelines, only an incident in which a ransomware attack has disrupted or
encrypted data from a database subject to a high security level, without the ability to
restore the information, must be reported to the Registrar. In addition, the guidelines
explicitly state that in the event of a ransomware attack disrupting or encrypting data
from the database, there is no obligation to report the attack if the data was
successfully restored and there was no indication of data leakage.

15.7 Conclusion

When considering ransomware attacks through the lens of the InfoSec model
represented by the CIA triad: confidentiality, integrity and availability of infor-
mation, it appears that ransomware constitutes primarily a problem of data avail-
ability. The data privacy laws of the EU, Canada and Israel implicitly recognise the
principle of data availability, as they impose obligations on the data controllers to
enable individuals to access, request or inspect their data. In order to fulfil these
obligations, the organisations need to implement necessary safeguards to mitigate
potential risks of losing access to the data. The data privacy laws adopt the
risk-based approach with regard to the technical and organisational measures,
which means that organisations need to implement security measures, which they
consider “adequate” or “appropriate” in specific circumstances. Such measures
should certainly include regular software updating to patch vulnerabilities and
raising cyberthreat awareness of employees. However, in the context of ran-
somware, we believe that it should be considered a minimum-security standard to
have the backup procedures in place that allow for the restoration of data encrypted
in the attack. If, despite the implementation of the security measures, a
data breach still occurs, it may trigger a notification obligation. The GDPR,
PIPEDA and PPDS have established mandatory reporting of certain data breaches
to competent privacy bodies and sometimes to the persons concerned. This change
may play an important role in enhancing data security, and ultimately better pro-
tecting the rights of individuals. The GDPR and PIPEDA leave quite a lot of room
for the organisations to assess the level of risk posed by the data breach and decide
whether it is necessary to report. In this respect, the Israeli law lays down much
more precise indications of when reporting is obligatory. It remains to be seen
which of the approaches will better respond to the threat of ransomware attacks.
In conclusion, ransomware seems to present some unique data privacy concerns,
which call for an in-depth multidisciplinary analysis. Apart from focusing on

100 RAMOT 2018.

ransomware specificities, this chapter illustrated a more general issue, namely that
there is an urgent need for further research and substantial discussion on security in
the field of data privacy protection. While looking at ransomware through the
lens of basic InfoSec principles—confidentiality, integrity, and availability—and
examining ransomware in the light of the legal definitions of a data breach, we
identified some discrepancies and uncertainties. Many of them may potentially be
mitigated through the use of a language common to lawyers and legislators, but
also to experts in other fields, such as InfoSec and computer forensics.

Acknowledgements The first version of this chapter was drafted as a student assignment in the
context of a legal clinic organised in cooperation with three universities: the University of Haifa
(Israel), the University of Ottawa (Canada) and Tilburg University (the Netherlands). The purpose
of the project was to scrutinise topical cybersecurity issues from the perspective of three different
jurisdictions. The authors would like to express their gratitude to the organisers of that program
and particularly to Prof Tal Zarsky, Prof Michael Geist and Dr Bart van der Sloot. They would also
like to thank Dr Aaron Martin, Dr Jaap-Henk Hoepman and Drs Paulus Meessen for the inspiring
discussions and all anonymous reviewers for their suggestions on how to improve the original
draft.

References

Al-rimy B, Maarof M, Mohd Shaid S (2018) Ransomware threat success factors, taxonomy, and
countermeasures: A survey and research directions. Computers & Security 74:144–166
Andress J (2014) The Basics of Information Security. Understanding the Fundamentals of InfoSec
in Theory and Practice. Elsevier, Amsterdam
APEC (2005) APEC Privacy Framework, Asia-Pacific Economic Cooperation. http://publications.
apec.org/-/media/APEC/Publications/2005/12/APEC-Privacy-Framework/05_ecsg_
privacyframewk.pdf. Accessed 26 August 2018
Arnold J, Oates C (2016) Ransomware Threat to Canadian Business Broadens. Lexology. https://
www.lexology.com/library/detail.aspx?g=954259cf-4c09-4532-a831-76f580783e9f. Accessed
26 August 2018
Article 29 Working Party (2018) Guidelines on Personal data breach notification under Regulation
2016/679. WP250rev.01
Brewer R (2016) Ransomware attacks: Detection, prevention and cure. Network Security 9:5–9
Bygrave LA (2014) Data Privacy Law: An International Perspective. Oxford University Press,
Oxford
Cabaj K, Gregorczyk M, Mazurczyk W (2018) Software-defined networking-based crypto
ransomware detection using HTTP traffic characteristics. Computers and Electrical Engineering
66:353–368.
COE (1981) Convention for the Protection of Individuals with regard to Automatic Processing of
Personal Data (no. 108) of the Council of Europe, Council of Europe
de Hert P, Papakonstantinou V (2016) The new general data protection regulation: Still a sound
system for the protection of individuals? Computer Law & Security Review 32:179–194
Ekstrom M, Lusty L, McBride T, Sexton J, Townsend A (2017) NIST Special Publication 1800-11
Data Integrity Recovering from Ransomware and Other Destructive Events. https://www.
nccoe.nist.gov/sites/default/files/library/sp1800/di-nist-sp1800-11-draft.pdf. Accessed 20
September 2018
ENISA (2016) Guidelines for SMEs on the security of personal data processing. https://doi.org/10.
2824/867415

ENISA (ND) Security of Personal Data, European Union Agency for Network and Information
Security. https://www.enisa.europa.eu/topics/data-protection/security-of-personal-data. Accessed
26 August 2018
European Commission (2012) Commission Staff Working Paper: Impact Assessment.
Accompanying the document Regulation of the European Parliament and of the Council on
the protection of individuals with regard to the processing of personal data and on the free
movement of such data (General Data Protection Regulation) and Directive of the European
Parliament and of the Council on the protection of individuals with regard to the processing of
personal data by competent authorities for the purposes of prevention, investigation, detection
or prosecution of criminal offences or the execution of criminal penalties, and the free
movement of such data. SEC(2012) 72 final
European Parliament and European Council (1995) Directive 95/46/EC of the European Parliament
and of the Council of 24 October 1995 on the protection of individuals with regard to the
processing of personal data and on the free movement of such data [1995], OJ L 281
European Parliament and European Council (2016) European Parliament and Council Regulation
2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of
personal data and on the free movement of such data, and repealing Directive 95/46/EC
(General Data Protection Regulation) [2016] OJ L119/59.
Furnell S, Emm D (2017) The ABC of ransomware protection. Computer Fraud & Security 10:5–11
Gazet A (2010) Comparative analysis of various ransomware virii. Journal of Computer Virology
and Hacking Techniques 6:77–90
Gómez-Hernández JA, Alvarez-Gonzalez L, Garcia-Teodoro P (2018) R-Locker: Thwarting
ransomware action through a honeyfile-based approach. Computers & Security 73:389–398
Gordon AM, Killilea A (2016) Guidance on ransomware attacks under HIPAA and state data
breach notification laws. Lexis Practice Advisor Journal. https://www.lexisnexis.com/lexis-
practice-advisor/the-journal/b/lpa/archive/2017/02/09/guidance-on-ransomware-attacks-under-
hipaa-and-state-data-breach-notification-laws.aspx. Accessed 26 August 2018
ISO/IEC 27000:2018 (2018) Information technology - Security techniques - Information security
management systems - Overview and vocabulary
ISO/IEC 29100:2011 (2011) Information technology - Security techniques - Privacy Framework
Jay R (2017) Guide to the General Data Protection Regulation. Sweet & Maxwell, London
Kharraz A, Kirda E (2017) Redemption: Real-Time Protection Against Ransomware at End-Hosts.
In: Dacier M et al (eds) Research in Attacks, Intrusions, and Defenses 20th International
Symposium, RAID 2017. Springer International Publishing AG, pp 98-119
Kshetri N, Voas J (2017) Do crypto-currencies fuel ransomware? IEEE Computer Society. https://
ieeexplore.ieee.org/document/8057721/. Accessed 26 August 2018
Liska A, Gallo T (2017) Ransomware: Defending Against Digital Extortion. O'Reilly
Mansfield-Devine S (2016) Ransomware: Taking business hostage. Network Security 10:8–17
Mansfield-Devine S (2017a) Ransomware: The most popular form of attack. Computer Fraud &
Security 10:15–20
Mansfield-Devine S (2017b) Leaks and ransoms – the key threats to healthcare organisations.
Network Security 6:14–19
McAfee (2018) Economic Impact of Cybercrime - No Slowing Down. McAfee. https://www.
mcafee.com/enterprise/en-us/assets/executive-summaries/es-economic-impact-cybercrime.pdf.
Accessed 26 August 2018
O’Brien D (2017) Internet security threat report: Ransomware 2017, Symantec https://www.
symantec.com/content/dam/symantec/docs/security-center/white-papers/istr-ransomware-2017-en.
pdf. Accessed 20 September 2018
OECD (1980) Guidelines Governing the Protection of Privacy and Transborder Flows of Personal
Data C(80)58/FINAL
OECD (2011) Reducing Systemic Cybersecurity Risk, Organization for Economic Co-operation
and Development. https://www.oecd.org/internet/46894657.pdf. Accessed 26 August 2018
OECD (2013) OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal
Data. Organization for Economic Co-operation and Development http://www.oecd.org/sti/

ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.html .
Accessed 26 August 2018
Office of the Privacy Commissioner of Canada (2015) The Digital Privacy Act and PIPEDA. Office
of the Privacy Commissioner of Canada. https://www.priv.gc.ca/en/privacy-topics/privacy-laws-
in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/legislation-
related-to-pipeda/02_05_d_63_s4/?wbdisable=true. Accessed 26 August 2016
Osterman (2016) Understanding the depth of the global ransomware problem. Osterman Research
Survey Report. https://www.malwarebytes.com/pdf/white-papers/UnderstandingTheDepthOf
RansomwareIntheUS.pdf. Accessed 26 August 2018
Palisse A, Le Bouder H, Lanet JL, Le Guernic C, Legay A (2017) Ransomware and the Legacy
Crypto API. Springer International Publishing AG. https://doi.org/10.1007/978-3-319-54876-02
Parsons C (2016) Privacy tech-know blog- Pay me to regain access to your personal information!
Ransomware on the rise. Office of the Privacy Commissioner. https://www.priv.gc.ca/en/blog/
20161109/. Accessed 26 August 2018
Personal Information Protection and Electronic Documents Acts (PIPEDA), SC 2000, c 5
Petit H (2017) More chaos on the way? Wanna Cry? Cyber hackers send their victims an ominous
new message http://www.dailymail.co.uk/sciencetech/article-4518538/Cyber-attackers-WannaCry-
send-eerie-new-message.html. Accessed 26 August 2018
PIPEDA Case Summary #2009-017 (2009) Third-Party landlord organization collected, used and
disclosed tenants’ personal information without their consent. Office of the Privacy
Commissioner
PIPEDA Report Findings #2014-004 (2014) Online service provider that suffered a breach had
appropriate safeguards in place. Office of the Privacy Commissioner
PIPEDA Report of Findings #2014-015 (2014) After a significant Adobe data breach, customer
questions company’s security safeguards and the response it provided about impacts on his
personal information. Office of the Privacy Commissioner
Porcedda MG (2018) Patching the patchwork: appraising the EU regulatory framework on cyber
security breaches. Computer Law and Security Review
Privacy Protection (Data Security) Regulations (PPDS), 5777–2017 of 8 May 2017
Protection of Privacy Act (PPA), 5741-1981 of 11 March 1981
RAMOT - Israeli Privacy Protection Authority (2018) Examples of severe security incidents
https://www.gov.il/he/Departments/General/data_security_report_examples. Accessed 26
August 2018 (available only in Hebrew)
Schneier B (2015) Applied Cryptography: Protocols, Algorithms and Source Code in C. Wiley
Sherman A, DeLatte D, Neary M, Oliva L, Phatak D, Scheponik T, Herman J, Thompson J (2017)
Cybersecurity: Exploring core concepts through six scenarios. Cryptologia 42(4):337–377
Tene O (2017) The new Israeli data security regulations: A tutorial. IAPP. https://iapp.org/news/a/
the-new-israeli-data-security-regulations-a-tutorial/. Accessed 26 August 2018
US Department of Health and Human Services (2016) Fact Sheet: Ransomware and HIPAA, US
Department of Health and Human Services. https://www.hhs.gov/sites/default/files/
RansomwareFactSheet.pdf. Accessed 26 August 2018

Magda Brewczyńska is a Ph.D. researcher at Tilburg Institute for Law, Technology, and Society
(TILT). She holds an LL.M. (cum laude) in Law & Technology from Tilburg University in the
Netherlands (2018) and a Master’s degree in Law obtained at Jagiellonian University in Poland
(2015). Prior to joining TILT, she gained professional experience in law firms and at the
European Data Protection Supervisor (EDPS). Her research interests lie primarily in the area of
privacy and data protection. Currently, she focuses mainly on the protection of personal data in the
law enforcement context.

Suzanne Dunn is a Ph.D. student at the University of Ottawa Faculty of Law. Her research
focuses on technology-facilitated violence. Ms. Dunn was the recipient of the Shirley Greenberg
Scholarship for outstanding feminist research and is a research fellow with The eQuality Project, a
multi-year SSHRC funded project that examines the ways in which corporate data collection can
impact the privacy, identity and safety of young people.

Avihai Elijahu holds an LL.B degree and an LL.M degree with a specialization in law and
technology from the University of Haifa, Israel. He has completed his legal internship at Israeli law
firm “E.S. Shimron, I Molho, Persky & Co”. He was a member of the editorial board of “He’arat
Din” law journal, the University of Haifa Law Faculty’s online journal. In addition, he served as a
Training Manager at “The New Path Association”, a legal, academic and multi-disciplinary
organization for youth in multi-cultural communities.
Part V
Conclusion
Chapter 16
Concluding Observations: The
Regulation of Technology—What Lies
Ahead—And Where Do We Want
to End Up?

Leonie Reins

Contents

16.1 Introduction...................................................................................................................... 310
16.2 The Human Aspect and Technology .............................................................................. 310
16.3 Competitive Technology and Technology in Competition ............................................ 311
16.4 Technology, Data and the Role of the Individual .......................................................... 311
16.5 Conclusion ....................................................................................................................... 313

Abstract The volume concludes by stating that as humanity evolves and continues
the search for technological improvements, the question on how to effectively
regulate these improvements will continue to exist. Ultimately, regulation of
technologies is—like all other forms of regulation—the result of an exercise of
weighing and balancing conflicting societal objectives and interests in the frame-
work of differing regulatory preferences. Certain societies will be—on balance—
more risk averse than others. Others will place greater emphasis on the importance
of open and competitive markets. However, all democratic societies have in com-
mon that they—when regulating new technologies—have to find a balance between
the different societal objectives that will exist in any given society. Ensuring sup-
port for regulation of technologies by involving citizens and stakeholders will
therefore remain of crucial importance, regardless of the sometimes high levels of
complexity that may be involved.

Keywords Technology · Regulation · Future challenges


L. Reins (&)
Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University, Tilburg,
The Netherlands
e-mail: l.s.reins@uvt.nl


16.1 Introduction

Innovation is part of human nature. The desire to make human existence on planet
earth more bearable, easier and more comfortable will continue to drive the quest
for technological innovation. Where problems are identified, humanity relies on
technological ingenuity to provide appropriate solutions. Yet, technological solu-
tions to problems are almost never entirely value-free. Wherever a technological
solution to an identified problem is found, the application of that technology
will inevitably bring about (un)expected and/or (un)desired consequences. In this
regard, the regulation of technology takes place within a societal debate in which
different values are in natural competition with one another.
The contributions of the next generation of scholars featured in this volume
show us that, regardless of any specific type of technology, the underlying concerns
relating to the weighing and balancing of regulatory objectives and regulatory
preferences will continue to exist. It is in that regard that scholars who study the
regulation of new technologies should hopefully be able to contribute to a debate on
how to design regulatory processes that are capable of channeling the discussion on
these conflicting objectives and to ensure that a regulatory framework is created that
does not favor one particular objective over another.

16.2 The Human Aspect and Technology

As the contributions by Brown and Martinelli have shown, sometimes a
rights-based approach may be the best way to ensure that the rights of consumers
and citizens are safeguarded in times of smart cities and data portability. Indeed, as
Brown observes “[t]aking a human centred approach ensures that there are the best
possible guards against undesirable and unintended consequences of technology”.
Martinelli finds that regulators should take into account three considerations when
adopting data portability rights. First, that strict data portability regulation may
reinforce existing dominant positions. Second, that data portability regulation
requires a joint and coordinated approach based in privacy, competition and con-
tract law in order to ensure an effective outcome. Third, that novel approaches, such
as a consent mechanism, may be required in order to ensure the effective deploy-
ment of technology, whilst safeguarding privacy.
This approach towards regulation that departs from individual citizens or con-
sumers can also be seen in the contribution by Berti Suman. She aptly describes the
need for Citizen Science, as rooted in the right to live in a healthy environment and
the right to (environmental) information, in order both to produce knowledge and to
enable citizens to engage in scientific debate forming the basis for political
decision-making and regulatory practice. In this regard, Berti Suman focuses on the
potential of Citizen Science to act as a means to restore public trust in science,
which in turn is essential for the creation of science-based regulation. Vellinga
sheds light on the ethical considerations that need to be reflected in the regulation
adopted for automated driving. Again, the impact of technology on the lives of

humans plays a critical role, and will require the reconsideration of traffic rules as
focusing on the vehicle, rather than on the driver. Moreover, there will be a need to
define a regulatory framework aimed at the person responsible for the operation of
the vehicle, rather than the person responsible for the vehicle automatically being
the operator of the vehicle. Vellinga’s approach offers a novel solution for the
absence of a ‘driver’ in an automated vehicle whilst still accommodating conven-
tional driving in conventional vehicles.
Finally, Ligthart examines coercive neuro-technologies and forensic evaluations
and assesses these technologies inter alia in light of Article 3 of the ECHR. He
establishes a research method that should enable further research, the findings of
which could contribute to the further debate about the legal regulation of neu-
roimaging technologies in a criminal law context.

16.3 Competitive Technology and Technology in Competition

Technology produces major changes in human behavior and thereby can have a
significant impact on markets. The widespread deployment of shale gas technology
disrupted the global market for natural gas, with major geopolitical effects. Similarly,
the internet has enabled streaming services such as Netflix, Spotify and Amazon
Prime to almost entirely replace “analogue” services and goods such as the VHS
Rental Shop and Compact Discs. Therefore, whenever a new technology enters the
market, a disruptive effect—or at least a competitive effect—will be visible.
Martinelli’s contribution observed that data portability rights have a significant
impact on competition. Verdonk’s contribution focuses on the phenomenon of “digital
agriculture” and notes that, in the absence of a well-functioning regulatory framework,
it could “exacerbate the existing imbalances of economic power in the upper segment
of the food supply chain and facilitate abusive or unfair trading practices vis-à-vis
customers and competitors". In order to prevent this from happening, Verdonk recom-
mends the deployment of market studies and sector inquiries, as well as cross-border
cooperation between national and EU authorities in order to carefully monitor devel-
opments in this regard and to take appropriate action where necessary. Iliopoulos
focuses on the role of prosumers in the context of distributed generation in electricity
systems and finds that policymakers and legislators are faced with the challenge of
determining a regulatory framework that maximizes social welfare.

16.4 Technology, Data and the Role of the Individual

In terms of the use of data in new technologies, as well as the protection of personal
data in the application of these technologies, several contributions have highlighted
the importance of an approach that reflects the role of individuals, whether as
citizens or as consumers, in regulation.

Smith’s contribution departs from the idea that technological advances can
challenge previous commitments to values and principles present in existing leg-
islation. In order to avoid the creation of even greater divergences between legal
and societal approaches towards data re-use, a recognition of the implications of
these technological changes is required. Therefore, Smith recommends that the
roots of societal discord with legally compliant practices be determined, as this will
enable a more fine-grained analysis of the issues identified and would provide a
more solid foundation for determining the direction of future regulation. Similarly,
Wittner, focusing on the importance of transparency, sheds light on the phe-
nomenon of algorithmic decision-making (ADM). He considers that public trans-
parency is a valuable and important ideal for data protection in general and the
regulation of the usage of ADM systems in particular. Wittner recommends
developing a public database that lists companies using ADM systems and gives
information about the way they use them. This would enable potential data subjects
to inform themselves, and scholars and NGOs to do research on adverse effects,
which would be the best way to achieve such transparency. Caes focuses on the
effective use of big data in the healthcare sector and observes that existing legis-
lation concerning access to government data and the re-use of public sector
information can be an unnecessary and disproportionate limitation. Therefore,
according to Caes, the government could be obliged to actively provide certain data
(research and high value data) to a limited list of private healthcare actors, in order
not to disproportionally hinder the use of big data by these entities.
Whereas the usage of data is crucially important in a number of technologies, so
is the protection of the individuals who are the source of such data.
In this regard, Van Schendel focuses on risk profiling by law enforcement. She
observes that the challenges caused by the shift from more traditional reactive
policing and prosecution to pre-emptive, risk-focused, data-driven practices, mainly
pertain to procedural safeguards and differ per jurisdiction. In light of the fact that
national and European courts are already and increasingly requested to adjudicate in
disputes concerning big data, algorithms, and artificial intelligence, there is an
urgent need for legislators to become more active to ensure that national procedural
safeguards are in compliance with the rights of individuals. Batman focuses on the
effects of trust in new technologies, the new ways of technology usage and the
regulation of cloud services by launching data protection certification mechanisms
in the market of cloud computing. She proposes the thesis that data protection
certifications are raising awareness about the importance and impacts of data pro-
tection in companies using cloud services, by enabling the selection of appropriate
and compliant cloud service providers, not least to prevent severe sanctions
under the EU GDPR. Brewczyńska, Dunn, and Elijahu focus on the problem of
ransomware attacks and the right to access personal data. They find that the existing
data privacy laws that were examined require the implementing of necessary
safeguards to mitigate potential risks to data security during a ransomware attack.

16.5 Conclusion

At the end of this edited volume, the only fitting conclusion is that as humanity
evolves and continues the search for technological improvements, the question on
how to effectively regulate these improvements will continue to exist. Ultimately,
regulation of technologies is—like all other forms of regulation—the result of an
exercise of weighing and balancing conflicting societal objectives and interests in
the framework of differing regulatory preferences. Certain societies will be—on
balance—more risk-averse than others. Others will place greater emphasis on the
importance of open and competitive markets. However, all democratic societies
have in common that they—when regulating new technologies—have to find a
balance between the different societal objectives that will exist in any given society.
Ensuring support for the regulation of technologies by involving citizens and
stakeholders will therefore remain of crucial importance, regardless of the some-
times high levels of complexity that may be involved.
It is hoped that this edited volume has shed light on the manner in which several
new technologies are currently being regulated in societies that are all confronted
with developments such as automation and digitalization. There is no optimum
form of regulation for any type of technology, but it is worth striving for the
optimisation of the regulatory processes that ultimately produce that regulation.
Regardless of whether one’s regulatory preferences lie predominantly within the
realm of privacy, competitive markets or product safety, it is in the interest of
humanity as a whole that the regulatory processes for new technologies are capable
of capturing these—often competing—preferences, without—at the same time—
stifling innovation in the process.
This is all the more crucial in the uncertain times that we are currently in. Rapid
technological developments coincide with geopolitical challenges, challenges to
institutions of multilateral governance, climate change and migration flows. Many,
though not all, of these developments are driven by data. A well-functioning reg-
ulatory framework that captures competing societal objectives is therefore critical in
order to safeguard humanity from technology itself. Ultimately, technology should
always be a servant to humanity, not the other way around. Law and regulation play
a pivotal role in this regard. The scholars featured in this volume have all shown
their commitment to ensuring that this objective is achieved.

Leonie Reins is an Assistant Professor at the Tilburg Institute for Law, Technology, and Society
(“TILT”) at Tilburg University in the Netherlands. Previously she was a Post Doctoral Researcher
at KU Leuven, Belgium where she also wrote her Ph.D. thesis on the coherent regulation of energy
and the environment in the EU. Leonie completed an LL.M. in Energy and Environmental Law at
KU Leuven, and subsequently worked for a Brussels-based environmental law consultancy,
providing legal and policy services for primarily public sector clients. Leonie’s research focuses on
the intersections of international and European energy, climate and environmental law.
