
Ius Gentium: Comparative Perspectives on Law and Justice 96

Marion Albers
Ingo Wolfgang Sarlet
Editors

Personality and Data Protection Rights on the Internet
Brazilian and German Approaches
Ius Gentium: Comparative Perspectives on Law and Justice

Volume 96

Series Editors
Mortimer Sellers, University of Baltimore, Baltimore, MD, USA
James Maxeiner, University of Baltimore, Baltimore, MD, USA

Editorial Board
Myroslava Antonovych, Kyiv-Mohyla Academy, Kyiv, Ukraine
Nadia de Araújo, Pontifical Catholic University of Rio de Janeiro, Rio de Janeiro,
Brazil
Jasna Bakšic-Muftic, University of Sarajevo, Sarajevo, Bosnia and Herzegovina
David L. Carey Miller, University of Aberdeen, Aberdeen, UK
Loussia P. Musse Félix, University of Brasilia, Federal District, Brazil
Emanuel Gross, University of Haifa, Haifa, Israel
James E. Hickey Jr., Hofstra University, South Hempstead, NY, USA
Jan Klabbers, University of Helsinki, Helsinki, Finland
Cláudia Lima Marques, Federal University of Rio Grande do Sul, Porto Alegre,
Brazil
Aniceto Masferrer, University of Valencia, Valencia, Spain
Eric Millard, West Paris University, Nanterre Cedex, France
Gabriël A. Moens, Curtin University, Perth, Australia
Raul C. Pangalangan, University of the Philippines, Quezon City, Philippines
Ricardo Leite Pinto, Lusíada University of Lisbon, Lisboa, Portugal
Mizanur Rahman, University of Dhaka, Dhaka, Bangladesh
Keita Sato, Chuo University, Tokyo, Japan
Poonam Saxena, University of Delhi, New Delhi, India
Gerry Simpson, London School of Economics, London, UK
Eduard Somers, University of Ghent, Gent, Belgium
Xinqiang Sun, Shandong University, Shandong, China
Tadeusz Tomaszewski, Warsaw University, Warsaw, Poland
Jaap de Zwaan, Erasmus University Rotterdam, Rotterdam, The Netherlands
Ius Gentium is a book series which discusses the central questions of law and
justice from a comparative perspective. The books in this series collect the
contrasting and overlapping perspectives of lawyers, judges, philosophers and
scholars of law from the world’s many different jurisdictions for the purposes of
comparison, harmonisation, and the progressive development of law and legal
institutions. Each volume makes a new comparative study of an important area of
law. This book series continues the work of the well-known journal of the same
name and provides the basis for a better understanding of all areas of legal science.
The Ius Gentium series provides a valuable resource for lawyers, judges,
legislators, scholars, and both graduate students and researchers in globalisation,
comparative law, legal theory and legal practice. The series has a special focus on
the development of international legal standards and transnational legal
cooperation.

More information about this series at https://link.springer.com/bookseries/7888


Marion Albers · Ingo Wolfgang Sarlet
Editors

Personality and Data Protection Rights on the Internet

Brazilian and German Approaches
Editors
Marion Albers
Faculty of Law
University of Hamburg
Hamburg, Germany

Ingo Wolfgang Sarlet
Law School
Pontifical Catholic University of Rio Grande do Sul
Porto Alegre, RS, Brazil

ISSN 1534-6781 ISSN 2214-9902 (electronic)


Ius Gentium: Comparative Perspectives on Law and Justice
ISBN 978-3-030-90330-5 ISBN 978-3-030-90331-2 (eBook)
https://doi.org/10.1007/978-3-030-90331-2

© Springer Nature Switzerland AG 2022


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of
the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology
now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Contents

Personality and Data Protection Rights on the Internet:
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Marion Albers and Ingo Wolfgang Sarlet
Privacy Protection in the World Wide Web—Legal Perspectives
on Accomplishing a Mission Impossible . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Markus Kotzur
Personality Rights in Brazilian Data Protection Law: A Historical
Perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Danilo Doneda and Rafael A. F. Zanatta
Realizing the Fundamental Right to Data Protection in a Digitized
Society . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Jörn Reinhardt
Surveillance and Data Protection Rights: Data Retention
and Access to Telecommunications Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Marion Albers
Privacy Protection with Regard to (Tele-)Communications
Surveillance and Data Retention . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
Carlos Alberto Molinaro and Regina Linden Ruaro
The Protection of Personality in the Digital Environment . . . . . . . . . . . . . . 133
Ingo Wolfgang Sarlet
Forgetting as a Social Concept. Contextualizing the Right to Be
Forgotten . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 179
Anna Schimke
Brazilian Internet Bill of Rights: The Five Roles of Freedom
of Expression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
Carlos Affonso Pereira de Souza and Beatriz Laus Marinho Nunes


Civil Rights Framework of the Internet (BCRFI; Marco Civil da
Internet): Advance or Setback? Civil Liability for Damage Derived
from Content Generated by Third Party . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
Anderson Schreiber
Self-regulation in Online Content Platforms and the Protection
of Personality Rights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
Ivar A. Hartmann
Regulating Intermediaries to Protect Personality
Rights Online—The Case of the German NetzDG . . . . . . . . . . . . . . . . . . . . 289
Wolfgang Schulz
Online Anonymity—The Achilles’-Heel of the Brazilian Marco
Civil da Internet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309
Sohail Aftab
The Impact of Jurisdiction and Legislation on Standards
of Anonymity on the Internet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 337
Lothar Michael
Digital Identity and the Problem of Digital Inheritance . . . . . . . . . . . . . . . . 355
Gabrielle Bezerra Sales Sarlet
The Digital Estate in the Conflict Between the Right of Inheritance
and the Protection of Personality Rights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 377
Felix Heidrich
Algorithms and Discrimination: The Case of Credit Scoring
in Brazil . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
Laura Schertel Mendes and Marcela Mattiuzzo
Safeguarding Regional Data Protection Rights on the Global
Internet—The European Approach Under the GDPR . . . . . . . . . . . . . . . . . 445
Raoul-Darius Veit

Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 485
Editors and Contributors

About the Editors

Marion Albers is Full Professor of Public Law, Information and Communication
Law, Health Law and Legal Theory at Hamburg University and Principal Investigator
in the Brazilian/German CAPES/DAAD PROBRAL-Research Project “Internet
Regulation and Internet Rights.” Main areas of research include fundamental rights,
information and Internet law, data protection, health law and biolaw, police law
and law of intelligence services, legal theory and sociology of law. Selected publi-
cations are Recht & Netz: Entwicklungslinien und Problemkomplexe, in: Marion
Albers/Ioannis Katsivelas (eds.), Recht & Netz, Baden-Baden: Nomos, 2018, pp. 9–
35; L’effet horizontal des droits fondamentaux dans le cadre d’une conception
à multi-niveaux, in: Thomas Hochmann/Jörn Reinhardt (dir.), L’effet horizontal
des droits fondamentaux, Editions Pedone, Paris, 2018, pp. 177–216; Biotech-
nologies and Human Dignity, in: Dieter Grimm/Alexandra Kemmerer/Christoph
Möllers (eds.), Human Dignity in Context, München/Oxford/Baden-Baden: C. H.
Beck/Hart/Nomos, 2018, pp. 509–559, also published in Revista Direito Público,
Vol. 15 (2018), pp. 9–49; A Complexidade da Proteção de Dados, Revista Brasileira
de Direitos Fundamentais e Justiça, Belo Horizonte, ano 10, n. 35, 2016, pp. 19–45.

Ingo Wolfgang Sarlet (Dr. Iur., Ludwig-Maximilians-Universität München) is Chair
Professor for Constitutional Law and Current Head of the Graduate Program
in Law (LL.M.-Ph.D.) at the Pontifical Catholic University Rio Grande do Sul,
Brazil—PUCRS and Principal Investigator in the Brazilian/German CAPES/DAAD
PROBRAL-Research Project “Internet Regulation and Internet Rights.” Current
research projects are protection of human dignity and fundamental rights in the
digital domain and social rights, innovation and technology. Selected Publications
are A Eficácia dos Direitos Fundamentais, 13th Ed., Porto Alegre, Livraria do Advo-
gado, 2018; Dignidade da Pessoa Humana na Constituição Federal de 1988, 10th Ed.,
Livraria do Advogado, Porto Alegre, 2015; Direito Constitucional Ecológico, 6th
Ed., Revista dos Tribunais, São Paulo, 2019; Grundrechte und Privatrecht—Einige


Bemerkungen zum Einfluss der deutschen Grundrechtsdogmatik und insbesondere
der Lehre Canaris’ in Brasilien, in: Festschrift für Claus-Wilhelm Canaris
zum 80. Geburtstag, Berlin, De Gruyter, 2017, pp. 1257–80; Menschenwürde und
soziale Grundrechte in der brasilianischen Verfassung, in: Stephan Kirste, Draiton
Gonzaga De Souza and Ingo Wolfgang Sarlet (eds), Menschenwürde im 21. Jahrhun-
dert. Untersuchungen zu den philosophischen, völker- und verfassungsrechtlichen
Grundlagen in Brasilien, Deutschland und Österreich, Nomos, Baden-Baden, 2018.

Contributors

Sohail Aftab Government of Pakistan, Islamabad, Pakistan;
University of Hamburg, Hamburg, Germany
Marion Albers University of Hamburg, Hamburg, Germany
Carlos Affonso Pereira de Souza Rio de Janeiro State University (UERJ), Rio de
Janeiro, Brazil
Danilo Doneda Public Law Institute of Brasília (IDP), Brasília, Brazil
Ivar A. Hartmann Insper Learning Institution, São Paulo, Brazil
Felix Heidrich Leipzig University and Federal Administrative Court, Leipzig,
Germany
Markus Kotzur University of Hamburg, Hamburg, Germany
Marcela Mattiuzzo University of São Paulo, São Paulo, Brazil
Laura Schertel Mendes University of Brasília (UnB), Brasília, Brazil
Lothar Michael Universität Düsseldorf, Düsseldorf, Germany
Carlos Alberto Molinaro formerly: Pontifical Catholic University of Rio Grande
do Sul—Law School (PUCRS), Porto Alegre, Brazil
Beatriz Laus Marinho Nunes Intellectual Property Specialist, Pontifical Catholic
University of Rio de Janeiro (PUC), Rio de Janeiro, Brazil
Jörn Reinhardt University of Bremen, Bremen, Germany
Regina Linden Ruaro Pontifical Catholic University of Rio Grande do Sul—Law
School (PUCRS), Porto Alegre, Brazil
Gabrielle Bezerra Sales Sarlet Pontifical Catholic University of Rio Grande do
Sul—Law School (PUCRS), Porto Alegre, Rio Grande do Sul, Brazil
Ingo Wolfgang Sarlet Pontifical Catholic University of Rio Grande do Sul–Law
School (PUCRS), Porto Alegre, Brazil
Anna Schimke University of Hamburg, Hamburg, Germany

Anderson Schreiber Faculty of Law, Rio de Janeiro State University (UERJ), Rio
de Janeiro, Brazil
Wolfgang Schulz Leibniz-Institute for Media Research | Hans Bredow-Institut,
Hamburg, Germany
Raoul-Darius Veit University of Hamburg, Hamburg, Germany
Rafael A. F. Zanatta University of São Paulo, São Paulo, Brazil
Personality and Data Protection Rights
on the Internet: Introduction

Marion Albers and Ingo Wolfgang Sarlet

Abstract This article gives a brief overview of the development of the Internet and
the rise of the onlife world. As a socio-technical arrangement, the Internet is both a
factor in and a product of modern society and comes together with fundamental social
change. The legal challenges emerging with the Internet involve fundamental issues
up to and including the question of what exactly the law is. They comprise cross-
sectional issues such as the declining relevance of the territorial borders of nation
states for applying and enforcing law. Last but not least, manifold legal questions in
particular areas arise. Legal answers must be Internet-specific to a certain extent, but
must also build on, or at least be coordinated with, established legal solutions. As
communication on the Internet always includes datafication, among the pressing legal
problems is the handling of personal data and how to advance personality and data
protection rights. The contributions to this volume are dedicated to key questions,
ranging from the urgent need for transnational standards or convincing enforce-
ment mechanisms for regionally established data protection rights up to problems of
surveillance, forgetting on the Internet, regulation of intermediaries, anonymity, the
digital estate or algorithmic discrimination.

We are grateful for the support of the DAAD (Deutscher Akademischer Austauschdienst) and
CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior) for the Academic Exchange
and Research PROBRAL Project ’Internet Regulation and Internet Rights’ conducted by the Pontif-
ical Catholic University Porto Alegre, Brazil (PUCRS) and the University of Hamburg (Germany)
and coordinated by us both. This book is one of the outcomes of this project. We would also like
to thank Matthew Harris and Sandra Lustig for their help and constructive comments during the
process of editing this article and several other contributions in this volume.

M. Albers
University of Hamburg, Hamburg, Germany
e-mail: marion.albers@uni-hamburg.de
I. W. Sarlet (B)
Pontifical Catholic University of Rio Grande do Sul–Law School (PUCRS), Porto Alegre, Brazil

© Springer Nature Switzerland AG 2022
M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_1

1 The Emergence of the Onlife World

Whereas in its early years it was described as a separate sphere, “cyberspace”,
or as “virtual” (in contrast to “real”) reality, the Internet and the techniques and
arrangements associated with it are now widely and firmly embedded in society. The
“onlife world” is one of the new catchwords referring to a blurred online/offline
“hyperconnected reality.”1 These developments entail numerous challenges for the
law.
From a technical point of view, “the” Internet can be described as a set of inter-
connected networks and computers that are in principle organized in a decentralized
way, but operate according to certain standards and (meta-)protocol families. It is
not a medium, but rather a multi-layered infrastructure that provides the basis for
various communication and media formats. It is based on technical developments, for
example on digitization, on decentralized and increasingly miniaturized computers,
on data transmission technologies, and on software architectures. To a certain extent,
it is shaped by their construction and their advancements.
The technical foundations, however, are not the only factor that matters. Devel-
opments and applications of technology are always embedded in social contexts
and in social practices in dealing with technology. For this reason, the milestones
of the Internet—from its early stages through the Web 2.02 and the “Internet of
Things”3 to the emerging “Internet of Bodies”4 —cannot be understood in terms of a
linear and seamless evolution. There are overlaps and juxtapositions of technologies,
and the implementation of technical possibilities in social practices sometimes
unfolds rapidly, sometimes slowly and, in part, not at all. From a social science point of
view, the Internet presents itself as a socio-technical arrangement. It is a factor in and
a product of modern society and its characteristics: the global society, the functionally
differentiated society, the knowledge society or the individualized society.
How broadly and how deeply the Internet is anchored in society can be proven
by quantitatively measurable indicators of usage. More crucial, however, are the
radical transformations the ways and means of communication are undergoing and the
numerous qualitative societal changes. For example, communication on the Internet
comes with “datafication.”5 As a result, the possibilities for surveillance are unprece-
dented.6 And even if and precisely because the Internet as a socio-technical arrangement
is far from “not forgetting,” social knowledge, memory and forgetting are being

1 Hildebrandt (2016), p. 1 ff.; see also Floridi (2015).
2 For this term see O’Reilly (2005).
3 The term “Internet of Things” (IoT) is attributed to Kevin Ashton (1999), who used it to describe the vision of a system of networked computers and things that operate relatively autonomously, especially when it comes to data processing.
4 Cf., for example, the contributions in Leenes, van Brakel, Gutwirth and de Hert (2018) and Matwyshyn (2019).
5 For this term see Mayer-Schönberger and Cukier (2013), pp. 23 f., 101.
6 Edward Snowden’s revelations about intelligence surveillance are a vivid example.

subjected to fundamental structural change.7 In the economic system, the Internet is


a product of and a factor in globalization and the international division of labor.8 It
has become the basis for a multitude of new business models, including the “share-
conomy” in which data are conceived as valuable economic goods. Providers, search
engine companies, platforms and social network operators are becoming powerful
new players, boosted by mechanisms such as lock-in effects. Their selection and
mediation services have quickly become so indispensable that they are assuming the
role of gatekeepers.9 More importantly, they are making a considerable contribution
to the construction of social and individual knowledge.10 This applies regardless of
whether the Internet is associated with “filter bubble” 11 mechanisms or the view is
taken that there has never before been such far-reaching access to other opinions.
The challenges the Internet poses are also characterized by the speed and dynamics
of its evolutionary stages. From a technical point of view, the emerging Internet of
Things describes the networking of a wide variety of physical objects, within the
Internet as well as among the objects, with the help of a range of complementary
technologies, such as RFID technology, special sensor technologies, chip technolo-
gies or energy supply technologies. “Ubiquitous computing” aims at technologies
and Internet processes working in the background being omnipresent and at the
same time invisible, and being integrated as discreetly as possible into everyday
life. Wearable computing, smart houses or autonomous cars are no longer fiction.
If we consider the increasing storage, processing and analysis functionalities in the
context of Big Data analytics or artificial intelligence, the Internet of Things once
again has the potential to bring about fundamental societal changes that even extend
into ontology.
This is all the more true with regard to future scenarios of the Internet of Bodies
(IoB). These scenarios suggest that implants ranging from smart lenses to memory
chips and brain-to-brain interfaces will mechanize the body and connect people
“directly” to the Internet. In view of the convergence of biotechnologies and infor-
mation technologies that can already be observed today, even far-reaching futuristic
visions are by no means completely unrealistic.12 The catchword “onlife,” which
already captures the essence of the current situation, would take on completely new
dimensions.

7 See, e.g., Schimke (2016) with further references.
8 Fuchs (2008), pp. 154 ff.
9 See, e.g., Introna and Nissenbaum (2000), pp. 169 ff.; Tavani (2020), Sect. 3.1.
10 Hinman (2008), pp. 69 ff., 75.
11 Popular: Pariser (2011).
12 Cf. for scenarios and challenges Koops (2013b); Andresen (2018), p. 491 ff.; Albers (2016a), p. 63, esp. 72 ff.



2 Internet Regulation

The early days of the Internet gave rise to the idea that it is a law-free space.13 But
legal regulation has always existed in certain respects, for example with regard to the
establishment and operation of telecommunications infrastructures and networks. And
since the development of Web 2.0 and the increasing embeddedness of the Internet
in society, the extent of the need for regulation has become crystal clear. Just as
in “offline” cases, the law must guarantee legal frameworks and conflict resolution
in cases involving the Internet. This is why the Brazilian Marco Civil da Internet14
has attracted international attention.15 Looking at the features of the Internet and
the social arrangements made possible by it, legal regulation is facing numerous
unprecedented questions. On the one hand, novel approaches must be elaborated. On
the other hand, Internet cases do not require completely novel regulations in every
respect. Although the emerging legal issues may be Internet-specific to a certain
extent, they can at the same time build upon established legal solutions; they must
at least be coordinated with them. Consequently, the core of legal considerations
involves the questions whether, where, and to what extent particular cases are shaped
by features of the Internet in a legally relevant way, to what extent novel solutions
tailored to these characteristics are necessary, and what the relevant concepts could
be.
If we examine the challenges emerging with the Internet, we can, firstly, identify
fundamental and cross-sectional issues. From various points of view, the Internet
leads straight to the necessity to revisit the notion of “law.” Detailed fundamental
questions regarding how the law is to be understood present themselves when
decision-making processes are being driven by sophisticated software programs. In
addition, the problem of how to ensure normative requirements are observed in these
programs and decision-making processes must be solved as well. The ethical and
legal discussions on autonomously driven cars are an illustrative example. Turning
to the cross-sectional issues, the rise of the Internet society makes it particularly
relevant that physical national borders are losing importance. Activities on the Internet
cross borders, whether in terms of the routes taken by the data packets transmitted
or in terms of server locations on the one hand and retrieval locations on the other.
Novel answers have to be developed concerning what criteria the applicability of
national or supranational law is based on and how to ensure the enforceability of the
applicable law. It is, among other things, in this context that the European Court of

13 Cf. Barlow (1996): “Governments of the Industrial World, you weary giants of flesh and steel,
I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to
leave us alone. You are not welcome among us. […] We are forming our own Social Contract”.
14 Lei nº 12.965/2014.
15 On the Brazilian Marco Civil da Internet see, among others, Leite and Lemos (2014); Souza, Lemos and Bottino (2018); and Del Masso, Abrusio and Florêncio Filho (2019).

Justice’s key decision Google Spain and Google of 2014 has been discussed interna-
tionally.16 Other high-profile decisions address the question of the prerequisites that
must be met for the transfer of personal data from the European Union to the US; in
the opinion of the European Court of Justice, EU law establishes requirements that
neither the Safe Harbor Agreement nor its successor, the Privacy Shield Agreement,
have fulfilled.17
In addition to fundamental and cross-sectional issues, there are, secondly,
numerous novel legal questions in fields that are closely related to the Internet or
are particularly influenced by its features. At the infrastructure level, questions of
access of competing providers to networks or of net neutrality are discussed as well
as problems of IT security. At the level of Internet services, attention was initially
focused especially on the legal responsibility of providers for their own or user-
generated content. With the further development of the Internet, the list of questions
is becoming endless. They range from the civil law and consumer protection
issues raised by e-commerce or a shareconomy to the regulation of public commu-
nication with regard to hate speech or the use of social bots for the purpose of
manipulating public opinion. How search engine, platform and social network oper-
ators should be regulated appropriately, among other things in the area of conflict
between freedom of expression and protection of personality rights, is the subject of
heated legal controversies.18 Models such as the German Network Enforcement Act
are attracting attention across the globe.19
Across all areas, new legal problems center on the handling of data. This is obvious,
because communication on the Internet always includes “datafication.” Here too, the
list of legal issues is exploding and broadly diversified. Important aspects concern
society’s dependence on the Internet infrastructure and the resulting vulnerability.
Hazards are to be countered by multi-layered protection measures against failures
or attacks. Processing and using data as a basis for information and knowledge must
also be addressed from a variety of perspectives. The constructions of social and indi-
vidual knowledge created by search engines may be influenced by the control of the
general terms and conditions search engine providers are using as well as by antitrust
or competition law requirements aiming for plurality. The catchword “governing

16 The ECJ has ruled that the search engine result lists produced by Google are within the scope of
application of the European Data Protection Directive because the search engine business model is
inseparably linked to the placement of advertising, so that the processing of personal data resulting
from a search is carried out within the framework of the marketing activities of the Spanish branch
of Google Inc. based in the USA, Judgment of the ECJ (Grand Chamber) of 13 May 2014, C-131/12,
available under curia.europa.eu, para 55 ff.
17 As for the Safe Harbor Agreement: Judgment of the ECJ (Grand Chamber) of 6 October 2015, C-362/14; for the Privacy Shield Agreement: Judgment of the ECJ (Grand Chamber) of 16 July 2020,
C-311/18, both available under curia.europa.eu. See also more closely on enforcement mechanisms
with regard to the GDPR Veit (2022), in this volume.
18 Cf. Schreiber (2022), in this volume; Hartmann (2022), in this volume.
19 See more closely Schulz (2022), in this volume.

algorithms,”20 as another example, focuses on problems of how the transparency


of algorithms can be ensured and which standards algorithms should meet so that
algorithm-driven data processing produces reasonable and proper knowledge. In the
context of the Internet of Things, questions about the ownership of the data21 that are
generated, or about liability for faulty data processing, can be added. More fundamentally,
we need to consider how to create a legal framework that ensures the functioning of
data architectures that may be extremely complex.22
In view of the increasing importance of data, information and knowledge, a closer
analysis reveals a wide range of requirements for protecting individuals adequately.
This applies, inter alia, to surveillance or tracking of identifiable persons. Activities
on the Internet leave data traces. Personal or relatively anonymous data are collected,
evaluated and passed on by a wide variety of state authorities or private companies.
These range from e-commerce companies that collect and evaluate their customers’
data in the course of other business transactions to the numerous companies whose
business model is “paying with data,” all the way to data brokers specialized in
tracking and data sales. Comprehensive profiling23 is a reality and calls for protec-
tion.24 As for disadvantages, usually only personalized advertising is mentioned—
effects of profiling that are detrimental enough, if the now very subtle forms of
influence are kept in mind.25 However, personalized advertising is by no means the
only problem that arises. This is shown by the data scandal surrounding the use of
Facebook data by Cambridge Analytica. Also worthy of attention is the far-reaching
potential for discrimination26 or consequences such as “dynamic pricing,” where
people shopping online are shown different prices depending on profiling results.27
However, commercialization-driven activities are not the only problem. One of the new
challenges in the field of surveillance is that state institutions, especially police agen-
cies or intelligence services, may be authorized to get access to the “honeypot” of data
gathered by private companies. Legally, the problems resulting from such “public–
private assemblages” of surveillance are reflected in the controversies and in the
series of court decisions on the lawfulness of telecommunications surveillance and
data retention.28

20 From the extensive discussions see, for example, Gillespie (2014), pp. 167 ff.; Ziewitz (2016),
pp. 5 ff. See also Mendes and Mattiuzzo (2022), in this volume.
21 On this issue differentiated and with critical remarks Determann (2018).
22 Sicari et al. (2018), pp. 60 ff.
23 See also the definition of Ferraris et al. (2013), p. 32.
24 More closely Elmer (2004); the contributions in: Hildebrandt and Gutwirth (2008); Schermer (2013), pp. 137 ff.; Bosco et al. (2015), pp. 3 ff.
25 For an analysis of the mechanisms of personalized advertisement see Katsivelas (2018), pp. 221 ff.
26 Christl (2017).
27 Zuiderveen Borgesius and Poort (2017), pp. 347 ff., 521.
28 See Albers (2022), in this volume; Molinaro and Ruaro (2022), in this volume.

There are numerous other issues to be addressed. The Internet and the social
arrangements it makes possible not only raise the problem of a “right to be forgotten”29
but also the question of how to manage the “digital assets” that persist in social
media accounts. Do we need new legal approaches to protect the rights of deceased
persons, or can we work with existing regulatory patterns?30 Among the difficult and
controversially discussed questions is also whether there must be a highly protected
right to anonymity on the Internet, or whether and under what conditions legislation
could introduce a duty for people to use their own birth names.31 And do data protection principles offer anticipatory or procedural standards for the design of relevant
algorithms that go beyond a control of results?32

3 Advancing Personality and Data Protection Rights

Against this background, the contributions in this volume deal with the key ques-
tion of how personality and data protection rights on the Internet must be further
developed.33 New fundamental problems are addressed as well as cross-sectional
and specific issues.
A first fundamental problem is the urgent need for transnational standards for
personality and data protection rights on the one hand, while on the other it seems
difficult, if not impossible, to reach a consensus on such standards, given the
diverse legal cultures in the regions of the world. In the second contribution of
this volume, Markus Kotzur deals with the key research question of whether or not
certain personality or privacy rights and doctrinal figures such as the “duty to protect”
or the “horizontal dimension of fundamental rights,” which have been developed
at the level of international or European Union law—in particular by court deci-
sions—can be used to develop sufficiently universal standards for effective Internet
governance.34 He emphasizes that the problems of agreeing on overarching standards
reflect the deeper problem that there is no uncontested conceptualization of "privacy"
and that, in addition, the Internet and social media have made it even more difficult
to determine what "privacy" means. However, the traditional dichotomy between the
private and the public sphere, which has characterized the concept to a certain extent,
has always been only an ideal type. It is necessary to assume the existence of a spectrum
“composed of the in-betweens,” which diversifies into different constellations. At
the same time, we must acknowledge several dimensions of protection: above all,
besides the traditionally recognized rights of defense, third-party effects and

29 On this topic Sarlet (2022), in this volume; Schimke (2022), in this volume.
30 See Bezerra Sales Sarlet (2022), in this volume; Heidrich (2022), in this volume.
31 As for the problems of anonymity on the Internet see Aftab (2022), in this volume; Michael (2022), in this volume.
32 Cf. Koops (2013a), pp. 196 ff. Cf. also Schertel Mendes and Mattiuzzo (2022), in this volume.
33 For fundamental considerations on the complexity of data protection, see Albers (2016b).
34 Kotzur (2022), in this volume.

duties to protect. Markus Kotzur analyzes specific features of privacy protection both
within the state, with regard to the German Basic Law [Constitution], and beyond
the state, looking at the European Convention on Human Rights (ECHR), the
Charter of Fundamental Rights of the European Union and Universal Public Law and
Human Rights Law. He highlights that, even though universal minimum standards
of protection can and have to be created, we should not disregard the importance of
the underlying legal fabric that concretizes the individual’s rights and the necessity
of “law-in-context studies” while carrying out comparative work.
Danilo Doneda and Rafael Zanatta approach Brazilian protection of personality
and data protection rights with a view to their historical roots.35 Nowadays, the Marco
Civil da Internet and the recent General Data Protection Law (LGPD) form the backbone
of the Brazilian legal framework for society in the digital era.36 However,
they cover not only new topics such as net neutrality or intermediary liability,
nor are they influenced solely by European models of data protection. In particular, the
provisions of the LGPD also refer to a previously existing fragmented set of Brazilian
legislation which was created over many years. This body of legislation has itself
often been inspired by European legal traditions, among others the Portuguese and
German legal traditions. The establishment of personality and data protection rights
was advanced by, for example, the Brazilian Civil Code, the Brazilian Constitution
of 1988, and consumer protection rules. Danilo Doneda and Rafael Zanatta high-
light that, on the one hand, it is always important to understand the links between
the doctrine and tradition of personality rights as well as of other already established
protective rules and new data protection legislation. On the other hand, it is necessary
to respond to new challenges raised by the Internet and to strive for harmonization
with international and transnational standards on data protection.
Just like Brazil, the European Union is constantly renewing its legal framework
for data protection in the light of an increasingly digitized society. Since the Charter
of Fundamental Rights of the European Union (CFR) came into effect in 2009, the
catalogue of fundamental rights offers, in addition to the right to respect for private
life in Article 7 CFR, a right to the protection of personal data in Article 8 CFR. Jörn
Reinhardt takes a closer look at this right,37 especially because challenges to the
protection of personal data arise not only from state action, but also from the inherent
logic of the data economy. He stresses that Article 8 CFR includes certain conditions
the processing of personal data must fulfill in order to be lawful, for example the
principle of purpose limitation. Nevertheless, the protected interests behind the “right
to the protection of personal data” remain insufficiently defined and must be further
specified, not least with a view to other freedoms. Since data protection cannot be
limited to one specific goal, the protection requirements vary. After having clarified
positive obligations and horizontal effects in the context of Article 8 CFR, Jörn
Reinhardt works out central standards for data processing in the digital economy.
These standards include consent and control, protective legislation built upon an

35 Doneda and Zanatta (2022), in this volume.
36 For an overview of the LGPD see Schreiber (2020), pp. 45 ff.
37 Reinhardt (2022), in this volume.
objectification of the interests involved, transparency as well as information rights
of data subjects, and data security.
After these general and overarching contributions, the following articles of this
volume address major challenges related to life in the digital era that are being
discussed worldwide. How do personality and data protection rights address the
problems of unprecedented surveillance, of the assumption that “the Internet does
not forget,” of the power of intermediaries, of “digital heritage” or of algorithmic
decision making?
The fifth and sixth contributions deal with the topic of personality and data
protection rights confronted with the surveillance possibilities made available by
the Internet. Marion Albers first focuses on understanding surveillance and on the
changes it is undergoing under the conditions of the Internet. Protection needs are
wide-ranging and addressed not only by the right to respect for privacy but by a
broad spectrum of protected interests enshrined in the constitution, in human or
fundamental rights catalogues, or in statutes. Particularly in Europe, “data reten-
tion,” with its characteristics such as being a public–private assemblage as well as
blanket surveillance without cause, has become a major issue in heated debates
about surveillance on the Internet. Marion Albers explains the grounds for the series
of decisions of the German Federal Constitutional Court and of the European Court
of Justice. She stresses where their approaches and criteria resemble each other and
where they differ, what key points the courts have worked out for the legitimacy of
data retention, and how much work still has to be carried out to appropriately assess
and regulate surveillance under Internet conditions.38 Carlos Alberto Molinaro and
Regina Linden Ruaro also highlight the changes that surveillance is undergoing in
this day and age. The Snowden revelations demonstrate how far surveillance by intelligence
services extends, even in the case of highly developed democracies such as
the United States of America and Great Britain. The threats it poses to democracy and
fundamental rights require that intelligence activity be fully subordinated to
constitutional and appropriate legal provisions and subject to various forms of control.
Carlos Alberto Molinaro and Regina Linden Ruaro then introduce the provisions of
the Marco Civil da Internet and other legal rules insofar as these regulations estab-
lish a legal framework for telecommunications surveillance, including data retention
rules as well as limits and safeguards, for example a judge’s proviso. Finally, they
discuss technical tools of protection against surveillance on the Internet.39
Another topic that is being discussed worldwide is the “right to be forgotten.”
Few decisions have been discussed as intensively and controversially in different
countries, especially in Brazil and Germany, as the ECJ's Google
Spain and Google judgment of 2014.40 This is due to the fact that in the Internet
society, remembering and forgetting penetrate societal processes and mechanisms to

38 See Albers (2022), in this volume.
39 Molinaro and Ruaro (2022), in this volume.
40 Judgment of the ECJ (Grand Chamber) of 13 May 2014, C-131/12, available under curia.europa.eu. For critical remarks see, for example, Masing (2017), esp. pp. 442 ff. For the
Brazilian discussion see Branco (2017), Sarlet and Ferreira Neto (2018) and Frajhof (2019).

an exceptional degree. And for this very reason, the problem is not new—it is a fine
example of the finding that legal responses to novel challenges must partly build upon
previous approaches and partly find new solutions. This is also confirmed by recent
judgments issued by the German FCC on the “right to be forgotten.”41 Regarding
Brazilian law, Ingo Sarlet addresses the potential foundations of a “right to be forgot-
ten” in the Brazilian constitutional system and partial legislative expressions as well
as its acknowledgment and protection by the Brazilian superior courts. He explains
that there are "offline" as well as "online" cases. The right to be forgotten already
has a rich history, but the Internet raises questions of its own, not least because inter-
mediaries and algorithms enter the picture. Ingo Sarlet emphasizes that the “right to
be forgotten” covers a range of subjective positions, e.g., a right to data erasure, to
de-referencing, to a digital response, or to the suppression of identity. Differentiating
and balancing conflicting rights are as necessary as careful regulation by the
legislator.42 In her contribution focusing on EU and German law, Anna Schimke aims at
contextualizing the right to be forgotten in terms of the insights of social and cultural
sciences, in order to improve legal approaches. Forgetting on the Internet refers to a
collective dimension or to communication and to information and knowledge rather
than to data: particular knowledge about a person which could previously have been
lawfully acquired should no longer exist in a certain social context. A closer analysis
shows that cases involving such problems had already come up in the European and
German jurisdictions, as well as in Brazil, before the Internet entered the picture.
Anna Schimke emphasizes that they are dealt with in various fields of law, not only in
data protection law but also in press law. She proposes that the areas of application
of these fields of law be determined with a view to their characteristics and strengths
and with recourse to the so-called media privilege, which can be found in European
and German law and also in Brazilian law. This helps to elaborate the right to be
forgotten and the range of subjective positions and claims for European and German
law in as differentiated a manner as Ingo Sarlet has worked out for Brazilian law.
Such an approach also helps to strike a convincingly fair balance with
counter-interests.43
The following articles turn more closely to questions concerning conflicts between
personality and data protection rights on the Internet and communication freedoms,
especially the freedom of expression. Carlos Affonso Pereira de Souza and Beatriz
Laus Marinho Nunes explain the significant and multifarious role freedom of expres-
sion was given in the Brazilian Marco Civil da Internet. The constitutional founda-
tions are already multifaceted and rich in content. Depending on the context in which
reference is made to freedom of expression in the Marco Civil da Internet, particular
facets and impacts come into play. This is substantiated with regard to the founda-
tions of Internet governance in Brazil, to the principles for regulating Internet use, in
particular as regards anonymous discourse, to the conditions for the full exercise of

41 Judgments of the FCC of 6 November 2019, 1 BvR 16/13 and 1 BvR 276/17, both also available
in English under www.bundesverfassungsgericht.de.
42 Sarlet (2022), in this volume.
43 Schimke (2022), in this volume.
the right to Internet access, to Internet intermediaries' liability, and to challenges to
copyright.44 Anderson Schreiber explains that the Internet is not per se a place of free
discourse, but can negatively affect the very essence of freedom of expression. The
“dark side” of social networks comprises new forms of oppression, such as virtual
bullying, verbal aggression, stigmatizing labeling, and online hate speech.
As a result, rules must ensure that freedom of expression is not exercised against
itself. Anderson Schreiber explains case law prior to the Brazilian Marco Civil da
Internet as well as their regulations with a view to the civil-law liability of Internet
providers for damage derived from content generated by third parties. He arrives
at the result that the chosen regulation is extremely restrictive and lacks sufficient
protection of the rights of Internet users, such as honor, image, and privacy.45 Ivar
Hartmann advances the argument that content moderation—separating acceptable
from unacceptable content—should be achieved by new models of self-regulation
and by code rather than by the judiciary. Platform operators should implement decen-
tralized moderation systems where users are engaged as reviewers as much as content
producers and consumers. The rules set in the architecture and community guidelines
must comply with procedural boundaries established by the horizontal effect of fundamental
rights between private parties. Judicial review should play a role not in
evaluating the merits of content posted or shared, but rather in correcting the course
of the procedural rules that ensure self-regulation does not disproportionately restrict
personality rights.46 Wolfgang Schulz introduces the German Network Enforcement
Act, which has attracted attention across the globe. He analyzes its regulatory concept
and the obligations imposed on online platform operators as well as its compatibility
with higher-ranking law and fundamental rights. Based on the findings, he then
discusses human-rights-friendly alternatives to the platform regulation chosen in the
Network Enforcement Act.47
Facing the conflicts between personality and data protection rights and freedom
of expression, the possibilities for users to remain anonymous on the Internet deserve
special attention. Sohail Aftab analyzes protection needs and cases against the back-
ground of the Brazilian constitutional provision that prohibits online anonymity.
Taking a closer look at several court decisions, he shows that the Marco Civil da
Internet has no clear solutions for balancing the apparently conflicting positions
of various constitutional provisions that, on the one hand, provide for the protec-
tion of private life, but on the other, prohibit anonymity. A comparative examination
illustrates that the jurisprudence of the European Court of Human Rights encour-
ages anonymity and protects dignitarian interests diligently by means of a complex
balancing review. The value and the concept of anonymity are fleshed out, in partic-
ular with recourse to empirical evidence from Pakistan. The modern challenges to
anonymity in the online sphere call for broader elaborations. Sohail Aftab concludes

44 Affonso Pereira de Souza and Laus Marinho Nunes (2022), in this volume.
45 Schreiber (2022), in this volume.
46 Hartmann (2022), in this volume.
47 Schulz (2022), in this volume.
that anonymity necessitates complex conceptualizations and sympathetic regulatory
approaches. Lothar Michael explores challenges, protection needs and limits
of anonymity with a view to online evaluation portals.48 While explaining conflicts
in multidimensional cases with diverging interests of the participants, he also looks
at the fundamental rights of the German Constitution and argues that they protect
anonymity (only) to a certain extent. He presents the grounds for prominent decisions
of the German Federal Court of Justice and concludes with remarks on the role of
the judiciary and of the legislator.
As activities on the Internet result in the accumulation of huge amounts of data,
one of the pressing issues in connection with personality and data protection rights is
how to handle the problem of “digital assets.” Just like the right to be forgotten, this
problem is a telling example of discussions about the question whether previous legal
approaches can be transferred to Internet cases or whether novel solutions are neces-
sary. From a Brazilian point of view, Gabrielle Bezerra Sales Sarlet explores more
closely the concept of digital identity and approaches to digital heritage. She explains
cases and conflicts between the guarantee of the right to intimacy and privacy and the
right to inheritance as well as Brazilian regulations. She arrives at the findings that
the simple application of inheritance rights to the digital universe is not convincing,
that legislative gaps can be identified, and that, as a result, the need for further elaboration
becomes apparent.49 After differentiating among groups of data and cases of the
“digital estate,” Felix Heidrich focuses on German legal approaches. The important
leading decisions of German Civil Courts are introduced. In a detailed analysis, Felix
Heidrich then points out that digital estates enjoy protection under the fundamental
right of inheritance and neither the secrecy of telecommunications nor data protec-
tion in favor of the testator’s communication partners create obstacles to the right
of heirs to access a testator’s digital account data. For the German legal situation,
he comes to a different result than Gabrielle Bezerra Sales Sarlet for Brazil. He
concludes that, based on the cases discussed, there is no need for statutory regulation
beyond existing rules.50
Among the problems that, in a cross-sectional manner, extend across many fields
is the use of algorithms and the necessity to protect the individual with regard to
personal and data protection rights as well as against algorithmic discrimination.
Laura Schertel Mendes and Marcela Mattiuzzo explain the concepts of algorithm
and algorithmic discrimination and show how Big Data and algorithms have funda-
mentally altered decision-making processes in everyday life. They then focus on the
case of credit scoring. After exploring pertinent Brazilian regulations and decisions
of Brazilian courts, they turn to an overarching analysis of algorithmic discrimina-
tion and governance. Against this background, they point out which deficiencies in
current Brazilian law still need to be resolved when applied to credit scoring.51

48 Michael (2022), in this volume.
49 See Bezerra Sales Sarlet (2022), in this volume.
50 Heidrich (2022), in this volume.
51 Mendes and Mattiuzzo (2022), in this volume.

All these contributions addressing concrete challenges for personality and data
protection rights on the Internet show that legal solutions are and must be, at least
to a certain extent, embedded in the legal culture and the specific legal system of
a particular country. But what about the cross-border flow of data as an essential
characteristic of the Internet society?
In the last article of this volume, Raoul-Darius Veit analyzes the mechanisms
of the European General Data Protection Regulation which aim at ensuring that
data protection rights guaranteed in the European Union are also safeguarded on the
global Internet. The challenges being faced by data protection law in particular are
obvious. After tracing the case law of the ECJ, which culminated in the
recent decision on the Privacy Shield Agreement,52 Raoul-Darius Veit explains the
external dimension of the EU data protection regime in detail. The effects doctrine
(Marktortprinzip) has direct extraterritorial effects, while the principles regulating
data transfer to third countries result in indirect extraterritorial effects. Their core is
the adequacy regime whose guidelines and flexibilities are worked out. Raoul-Darius
Veit highlights the finding that modern data protection laws need to be designed in
such a way that they are open to conflicting notions of privacy and political and
economic interests, while at the same time maintaining their normative claim.

References

Affonso Pereira de Souza C, Laus Marinho Nunes B (2022) Brazilian Internet Bill of Rights: the
five roles of freedom of expression. In: Albers M, Sarlet IW (eds) Personality and data protection
rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Aftab S (2022) Online anonymity—the Achilles' heel of the Brazilian Marco Civil da Internet.
In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)
Albers M (2016a) Grundrechtsschutz und Innovationserfordernisse angesichts neuartiger Einblicke
und Eingriffe in das Gehirn. In: Lindner J (ed) Die neuronale Selbstbestimmung des Menschen:
Grundlagen und Gefährdungen. Nomos, Baden-Baden, pp 63–97
Albers M (2016b) A Complexidade da Proteção de Dados. Rev Brasiliera Direitos Fundamentais
Justiça 10(35):19–45
Albers M (2022) Surveillance and data protection rights: data retention and access to telecommuni-
cations data. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Andresen M (2018) Von Cyborgs und Brainhacks: Der Schutz des technisierten Geistes. In: Albers
M, Katsivelas I (eds) Recht & Netz, Nomos, Baden-Baden, pp 491–518
Ashton K (1999) That "Internet of Things" Thing. RFID J. http://www.itrco.jp/libraries/RFIDjournal-That%20Internet%20of%20Things%20Thing.pdf. Accessed 11 Jan. 2022
Barlow JP (1996) A declaration of the independence of cyberspace. https://www.eff.org/cyberspace-independence. Accessed 11 Jan. 2022
52 Judgment of the ECJ (Grand Chamber) of 16 July 2020, C-311/18, available under curia.europa.eu.

Bosco F, Creemers N, Ferraris V, Guagnin D, Koops B-J (2015) Profiling technologies and fundamental rights and values: regulatory challenges and perspectives from European data protection authorities. In: Gutwirth S, Leenes R, de Hert P (eds) Reforming European data protection law. Springer, Dordrecht, Heidelberg, New York, London, pp 3–33
Branco S (2017) Memória e Esquecimento na Internet. Arquipélago Editoral Ltda, Porto Alegre
Christl W (2017) How companies use personal data against people. Automated disadvantage, personalized persuasion, and the societal ramifications of the commercial use of personal information. Working paper by Cracked Labs. https://crackedlabs.org/dl/CrackedLabs_Christl_DataAgainstPeople.pdf. Accessed 11 Jan. 2022
Del Masso FD, Abrusio J, Florêncio Filho MA (2019) Marco Civil da Internet: Lei 12.965/2014. Thomson Reuters do Brasil, São Paulo
Determann L (2018) No one owns data. Hastings Law J 70(1):1–44
Doneda D, Zanatta RAF (2022) Personality rights in Brazilian data protection law: a historical
perspective. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Elmer G (2004) Profiling machines: mapping the personal information economy. MIT Press,
Cambridge, MA
Ferraris V, Bosco F, Cafiero G, D'Angelo E, Suloyeva Y (2013) Defining profiling. Working paper in the project Protecting citizens' rights fighting illicit profiling. http://www.unicri.it/special_topics/citizen_profiling/PROFILINGproject_WS1_definition_0208.pdf. Accessed 3 June 2020
Floridi L (2015) The onlife manifesto: being human in a hyperconnected era. Springer, Cham,
Heidelberg, New York, Dordrecht, London
Frajhof IZ (2019) O Direito ao Esquecimento na Internet: Conceito, Aplicação e Controvérsias.
Almedina, Coimbra
Fuchs C (2008) Internet and society: social theory in the information age. Routledge, New York
Gillespie T (2014) The relevance of algorithms. In: Gillespie T, Boczkowski P, Foot K (eds) Media
technologies. MIT Press, Cambridge MA, pp 167–193
Hartmann IA (2022) Self-regulation in online content platforms and the protection of personality
rights. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Heidrich F (2022) The digital estate in the conflict between the right of inheritance and the protection
of personality rights. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the
internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Hildebrandt M, Gutwirth S (eds) (2008) Profiling the European citizen: cross-disciplinary
perspectives. Springer, Dordrecht
Hildebrandt M (2016) Smart technologies and the end(s) of law: novel entanglements of law and
technology. Edward Elgar, Cheltenham and Northampton
Hinman LM (2008) Searching ethics: the role of search engines in the construction and distribution
of knowledge. In: Spink A, Zimmer M (eds) Web search: multidisciplinary perspectives. Springer,
Berlin, Heidelberg, pp 67–76
Introna LD, Nissenbaum H (2006) Shaping the web: why the politics of search engines matters. Inf
Soc 16(3):169–185
Koops B-J (2013a) On decision transparency, or how to enhance data protection after the computa-
tional turn. In: Hildebrandt M, de Vries K (eds) Privacy, due process and the computational turn:
the philosophy of law meets the philosophy of technology. Routledge, New York, pp 196–220
Koops B-J (2013b) Concerning “humans” and “human” rights. Human enhancement from the
perspective of fundamental rights. In: Koops B-J et al (eds) Engineering the human. Human
enhancement between fiction and fascination. Springer, Berlin, Heidelberg, pp 165–182
Katsivelas I (2018) Das Geschäft mit der Werbung: Finanzierungsmechanismen, personalisierte
Werbung und Adblocker. In: Albers M, Katsivelas I (eds) Recht & Netz, Nomos, Baden-Baden,
pp 207–248
Kotzur M (2022) Privacy protection in the world wide web—legal perspectives on accomplishing
a mission impossible. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the
internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Leenes R, van Brakel R, Gutwirth S, de Hert P (eds) (2018) Data protection and privacy: the internet
of bodies. Hart Publishing, Oxford
Leite GS, Lemos R (2014) Marco Civil da Internet. Atlas, São Paulo
Masing J (2017) Assessing the CJEU’s “Google decision”: a tentative first approach. In: Miller
RA (ed) Privacy and power: a transatlantic dialogue in the shadow of the NSA-affair. Cambridge
University Press, Cambridge, New York, pp 435–456
Mayer-Schönberger V, Cukier K (2013) Big Data: Die Revolution, die unser Leben verändern wird,
2nd edn. Redline, München
Matwyshyn AM (2019) The internet of bodies. 61 Wm. & Mary L. Rev. 77. https://scholarship.law.wm.edu/wmlr/vol61/iss1/3. Accessed 11 Jan. 2022
Mendes LS, Mattiuzzo M (2022) Algorithms and discrimination: the case of credit scoring in Brazil.
In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)
Michael L (2022) The impact of jurisdiction and legislation on standards of anonymity on the
internet. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Molinaro CA, Ruaro RL (2022) Privacy protection with regard to (tele-)communications surveil-
lance and data retention. In: Albers M, Sarlet IW (eds) Personality and data protection rights on
the internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
O’Reilly T (2005) What is Web 2.0: design patterns and business models for the next generation of
software. https://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html. Accessed 11 Jan.
2022
Pariser E (2011) The filter bubble: what the internet is hiding from you. Penguin, London, New
York
Reinhardt J (2022) Realizing the fundamental right to data protection in a digitized society. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Sarlet GBS (2022) Digital identity and the problem of digital inheritance: limits of the posthumous
protection of personality on the internet in Brazil. In: Albers M, Sarlet IW (eds) Personality and
data protection rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in
this volume)
Sarlet IW, Ferreira Neto A (2018) O Direito ao Esquecimento na Sociedade da Informação. Livraria
do Advogado Editora, Porto Alegre
Sarlet IW (2022) The protection of personality in the digital environment: an analysis in the light
of the so-called right to be forgotten in Brazil. In: Albers M, Sarlet IW (eds) Personality and data
protection rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this
volume)
Schermer B (2013) Risks of profiling and the limits of data protection law. In: Custers B, Calders T,
Schermer B, Zarsky T (eds) Discrimination and privacy in the information society: data mining
and profiling in large databases. Springer, Berlin, Heidelberg, pp 137–152
Schimke A (2016) Vergessen als neue Kategorie im Recht. In: Autengruber A et al (eds) Zeit im
Recht—Recht in der Zeit. Jan Sramek, Wien, pp 87–104
Schimke A (2022) Forgetting as a social concept. Contextualizing the right to be forgotten. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Schreiber A (2020) Right to privacy and personal data protection in Brazilian law. In: Moura Vicente
D, de Vasconcelos Casimiro S (eds) Data protection in the internet. Springer Nature, Cham, pp
45–54
Schreiber A (2022) Civil rights framework of the internet (BCRFI; Marco Civil da Internet):
advance or setback? Civil liability for damage derived from content generated by third party.
In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)
16 M. Albers and I. W. Sarlet

Schulz W (2022) Regulating intermediaries to protect privacy online—the case of the German
NetzDG. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Sicari S, Rizzardi A, Cappiello C, Miorandi D, Coen-Porisini A (2018) Toward data governance
in the Internet of Things. In: Yager RR, Pascual Espada J (eds) New advances in the Internet of
Things. Springer International Publishing, Cham, pp 59–74
Souza CA, Lemos R, Bottino C (2018) Marco Civil da Internet. Jurisprudência Comentada, Revista
dos Tribunais, São Paulo
Tavani H (2020) Search engines and ethics. Stanford Encycl Philos. https://plato.stanford.edu/entries/ethics-search/. Accessed 11 Jan. 2022
Veit RD (2022) Safeguarding regional data protection rights on the global internet—The European
approach under the GDPR. In: Albers M, Sarlet IW (eds) Personality and data protection rights
on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Ziewitz M (2016) Governing algorithms: myth, mess, and methods. Sci Technol Human Values
41:3–16
Zuiderveen Borgesius F, Poort J (2017) Online price discrimination and EU data privacy law. J
Consum Policy 40:347–366

Marion Albers Dr. iur., Full Professor of Public Law, Information and Communication Law, Health Law and Legal Theory at Hamburg University. Principal Investigator in the Brazilian/German CAPES/DAAD PROBRAL Research Project "Internet Regulation and Internet Rights". Main areas of research: Fundamental Rights, Information and Internet Law, Data Protection, Health Law and Biolaw, Police Law and Law of Intelligence Services, Legal Theory and Sociology of Law. Selected Publications: Recht & Netz: Entwicklungslinien und Problemkomplexe, in: Marion Albers/Ioannis Katsivelas (eds.), Recht & Netz, Baden-Baden: Nomos, 2018, pp. 9–35; L'effet horizontal des droits fondamentaux dans le cadre d'une conception à multi-niveaux, in: Thomas Hochmann/Jörn Reinhardt (dir.), L'effet horizontal des droits fondamentaux, Editions Pedone, Paris, 2018, pp. 177–216; Biotechnologies and Human Dignity, in: Dieter Grimm/Alexandra Kemmerer/Christoph Möllers (eds.), Human Dignity in Context, München/Oxford/Baden-Baden: C. H. Beck/Hart/Nomos, 2018, pp. 509–559, also published in Revista Direito Público, Vol. 15 (2018), pp. 9–49; A Complexidade da Proteção de Dados, Revista Brasileira de Direitos Fundamentais e Justiça, Belo Horizonte, ano 10, n. 35, 2016, pp. 19–45.

Ingo Wolfgang Sarlet Dr. iur. (Ludwig-Maximilians-Universität München), Chair Professor for Constitutional Law and current Head of the Graduate Program in Law (LLM-PhD) at the Pontifical Catholic University of Rio Grande do Sul, Brazil (PUCRS). Principal Investigator in the Brazilian/German CAPES/DAAD PROBRAL Research Project "Internet Regulation and Internet Rights". Current research projects: protection of human dignity and fundamental rights in the digital domain, and social rights, innovation and technology. Selected Publications: A Eficácia dos Direitos Fundamentais, 13th Ed., Porto Alegre, Livraria do Advogado, 2018; Dignidade da Pessoa Humana na Constituição Federal de 1988, 10th Ed., Livraria do Advogado, Porto Alegre, 2015; Direito Constitucional Ecológico, 6th Ed., Revista dos Tribunais, São Paulo, 2019; Grundrechte und Privatrecht – Einige Bemerkungen zum Einfluss der deutschen Grundrechtsdogmatik und insbesondere der Lehre Canaris' in Brasilien, in: Festschrift für Claus-Wilhelm Canaris zum 80. Geburtstag, Berlin, De Gruyter, 2017, pp. 1257–80; Menschenwürde und soziale Grundrechte in der brasilianischen Verfassung, in: Stephan Kirste, Draiton Gonzaga De Souza and Ingo Wolfgang Sarlet (eds), Menschenwürde im 21. Jahrhundert. Untersuchungen zu den philosophischen, völker- und verfassungsrechtlichen Grundlagen in Brasilien, Deutschland und Österreich, Nomos, Baden-Baden, 2018.
Privacy Protection in the World Wide
Web—Legal Perspectives
on Accomplishing a Mission Impossible

Markus Kotzur

Abstract The paper aims to evaluate the power of law, in particular human rights law, among the various instruments currently discussed as means of effective internet governance. Focusing on selected case law of the European Court of Justice (ECJ) and the European Court of Human Rights (ECtHR), it discusses privacy rights, a "right to be forgotten", and a horizontal dimension of these rights, as well as a duty to protect these rights owed by the States and/or the International Community. The key research question is whether or not guarantees like these can—bottom-up—be used to develop sufficiently universal standards for effective internet governance. It is argued that, regardless of the different levels of protection which the right to privacy enjoys in different States and on the European/international plane, common needs and dangers can be identified. Based upon them, universal minimum standards of protection can and have to be created. For a theoretical framing, the paper finally addresses the so-called public–private dichotomy and the rapid dynamics it is facing in the age of the Internet.

1 Introduction: Little v. Big or How to Accomplish a Mission Impossible

He did it again. Austrian national Maximilian Schrems had already succeeded in 2015 in a preliminary ruling proceeding related
to the interpretation, in the light of Articles 7, 8 and 47 of the Charter of Fundamental
Rights of the European Union (...), of Articles 25(6) and 28 of Directive 95/46/EC of the
European Parliament and of the Council of 24 October 1995 on the protection of individuals
with regard to the processing of personal data and on the free movement of such data
(...), as amended by Regulation (EC) No 1882/2003 of the European Parliament and of the
Council of 29 September 2003 (...), and, in essence, to the validity of Commission Decision
2000/520/EC of 26 July 2000 pursuant to Directive 95/46 on the adequacy of the protection

M. Kotzur (B)
University of Hamburg, Hamburg, Germany
e-mail: markus.kotzur@uni-hamburg.de

© Springer Nature Switzerland AG 2022


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_2

provided by the safe harbour privacy principles and related frequently asked questions issued
by the US Department of Commerce.1

The 2015 request had been made in the context of proceedings between him and the
Irish Data Protection Commissioner. It concerned the latter’s refusal to investigate
a complaint made by the applicant regarding the fact that Facebook Ireland Ltd
transferred personal data of its users to the United States of America and kept it on
servers located in that country.2 From August 2011 on, Schrems had lodged before
the Irish Data Protection Commissioner 23 complaints against Facebook Ireland, one
of which finally gave rise to a reference for the aforementioned preliminary ruling.
What seemed to be a "mission impossible" turned into one of the most far-reaching recent landmark decisions of the European Court of Justice. Schrems won his case, and the Court held that
Article 25(6) of Directive 95/46/EC of the European Parliament and of the Council of
24 October 1995 on the protection of individuals with regard to the processing of personal
data and on the free movement of such data as amended by Regulation (EC) No 1882/2003
of the European Parliament and of the Council of 29 September 2003, read in the light of
Articles 7, 8 and 47 of the Charter of Fundamental Rights of the European Union, must be
interpreted as meaning that a decision adopted pursuant to that provision, such as Commission Decision 2000/520/EC of 26 July 2000 pursuant to Directive 95/46 on the adequacy of
the protection provided by the safe harbour privacy principles and related frequently asked
questions issued by the US Department of Commerce, by which the European Commission finds that a third country ensures an adequate level of protection, does not prevent a
supervisory authority of a Member State, within the meaning of Article 28 of that directive
as amended, from examining the claim of a person concerning the protection of his rights
and freedoms in regard to the processing of personal data relating to him which has been
transferred from a Member State to that third country when that person contends that the law
and practices in force in the third country do not ensure an adequate level of protection.3

The unlimited safe harbour doctrine was history, and data protection activists around the globe had a new hero. His success made Maximilian Schrems even more active. He has published two books on his legal proceedings against alleged infringements of data protection. He has given numerous lectures and has registered a number of websites, including blogs, online petitions and crowd-funding sites to finance further legal proceedings against Facebook.4 He has furthermore founded an association which seeks to uphold the fundamental right to data protection, and he has received various prizes. The little vs. big story continued when Schrems, by now a Robin Hood-like public figure defending private individuals against privacy-intrusive internet giants, was assigned, by more than 25,000 people worldwide, claims to be brought in a class action against undertakings which potentially endanger the fundamental right to privacy by their (online) activities.5

1 ECLI:EU:C:2015:650, para 1.
2 Ibidem, para 2.
3 Ibidem.
4 ECLI:EU:C:2018:37, para 12. Recently, Schrems succeeded in another landmark decision: Data Protection Commissioner versus Facebook Ireland Limited and Maximillian Schrems, ECLI:EU:C:2020:559.
5 ECLI:EU:C:2018:37, para 14.

Regardless of his numerous—also commercial—activities, the European Court of Justice held that Maximilian Schrems was still a consumer:
Article 15 of Regulation No 44/2001 must be interpreted as meaning that the activities of
publishing books, lecturing, operating websites, fundraising and being assigned the claims
of numerous consumers for the purpose of their enforcement do not entail the loss of a private
Facebook account user’s status as a ‘consumer’ within the meaning of that article.6

The Court, however, denied the option of a class action brought by a consumer in the courts of the place where she or he is domiciled. Whilst taking individual (consumer) rights seriously, as it usually does in its jurisprudence,7 the Court did not want to go so far as to invent a judge-made class-action procedure that would find a sufficient legal basis neither in primary nor in secondary law. Even though such a cross-border class action, uniting thousands of European users against Facebook, will not take place (at least not without the European lawmaker taking action),8 it would be quite a misunderstanding to believe that Schrems completely lost his case. On the contrary, the Court granted Schrems standing for a single action against Facebook in his home country Austria, even though Facebook is not an Austrian undertaking but has its registered office in Ireland.9
What, apart from the specific legal questions, is the underlying narrative of the little vs. big scenario just described? Even though the relevant debates date back farther than 1890, when Louis Brandeis (later a US Supreme Court Justice), together with Samuel Warren, famously penned the since then so-called "right to privacy",10 "in today's society, privacy has become more complex than simply physical interference. The birth of the World Wide Web has created a new landscape for which current legal standards are inadequate."11 The underlying narrative thus is a story of inadequacy, and Schrems sees himself as an advocate for all those negatively affected by this inadequacy. It is an inadequacy not limited to the reactions new technologies provoke; it starts with the inadequate, indeed ambiguous, notion of privacy itself. As stated by A. M. Brumis, "thanks to the Internet and social media, personal privacy has been revolutionized, public figures and private figures are becoming increasingly difficult to discern, and until changes in the law occur, privacy violations in an Internet environment are hard to determine". Brumis, however, additionally refers to a quote by W. Hartzog to make her point: "The proper legal response to the issue of social media and privacy has proven elusive because there is no fixed conceptualization of privacy".12 Here, at the very core of "privacy"—be it a real-world phenomenon, a social attribution, a
6 Ibidem, para 41.


7 Coppell and O'Neill (1992), p. 227 et seqq.; Weiler and Lockhart (1995), p. 579 et seqq.
8 One would not be misled in believing that one strategic aim of Schrems' action was to create an incentive to do so.
9 ECLI:EU:C:2018:37, para 12; see also Die Zeit, online version, 25.01.2018, http://www.zeit.de/digital/datenschutz/2018-01/maximilian-schrems-facebook-eugh-urteil-keine-sammelklage. Accessed 11 Jan. 2022.
10 Brandeis (1890).
11 Brumis (2016), p. 1.
12 Hartzog (2013), p. 50 at 51.

legal concept or an individual right—any further investigation into privacy protection in the World Wide Web has to take its starting point before trying to accomplish an otherwise impossible mission.

2 Spheres of the Public and the Private—Shades of Grey

Our investigative journey on privacy also has to take into account its very opposite—the public space, the public sphere and, in a broader sense, "publicity". The differentiation between the spheres of the public and the private is among the well-established fundamentals of Western constitutionalist and legal thought.13 Anglo-American literature pointedly uses the concept of a "public–private dichotomy".14 Along with this strict separation come—oft-criticized, for instance by "legal feminism"15—typifying status attributions, in other words clichés, and the placing of behavioral expectations on the individual. Even if black-and-white stereotypes consistently fall short, the fact alone that this constitutionalist school of thought continues to permeate and, to a degree, dominate current discourse warrants its consideration. What does the separation of the private and the public signify? The public space is where the active citizen, the political citoyen, operates.16 This is where s/he takes part in the shaping of the polity, does her/his part in the concretion of the common good and does not avoid the public eye. On the contrary: it is within the sphere of the democratic public that individuals' political actions reverberate, thereby preserving civil liberties. The private domain, in contrast, is the refuge of the bourgeois. Protected by fundamental rights, it is here that s/he can be her/himself, unimpeded by the state and unobserved by the public. Here s/he can be sure of her/his freedom, the level of said freedom rising with the degree of intimacy. Readers inclined to a comparative legal approach will find the phrase of a "right to privacy",17 or—perhaps even more illustratively—a "right to be let alone", in US Supreme Court rulings. In his famous dissenting opinion in Olmstead v. United States, 277 U.S. 438, 478 (1928), the aforementioned Justice Brandeis stated: "The right to be let alone (is) the most comprehensive of rights and the right most valued by civilized men."18 The judicature of the German Bundesverfassungsgericht, differing from its US counterpart in many nuanced ways, but alike in its

13 See: Häberle (2006) for a fundamental discussion.


14 Weintraub and Kumar (eds.) (1997).
15 Pateman (1990), p. 118 et seqq.
16 Smend (1968); Dahrendorf (1974), in: Turner and Hamilton (eds.) (1994), p. 292 et seqq.; Preuß, in: Winter (ed.) (2002), p. 179 et seqq.


17 Outlined for instance in: Pfisterer (2014).
18 Ibidem, p. 273.

general direction, has adopted an equally illustrative concept by coining the term
“Allgemeines Persönlichkeitsrecht”, or “general right to privacy”.19
The rigorous separation of the private and the public, of citoyen and bourgeois, has in its black-and-white dichotomy always been an ideal type, at all times overshadowed by many grey areas. The Internet age with its social networks has only added new shades of grey. On the one hand, the individual ever more fears total surveillance by a technologically overpowering, if not quite all-powerful, state. The NSA scandal20 serves here as only one jarring example of an Orwellian "big brother is watching you" reinvented.21 On the other hand, the same individual knowingly publicizes information of a private, even intimate character via social networks like Facebook and thereby relinquishes the constitutionally guaranteed protection of privacy, often without properly considering the long-term effects of such self-exposure.22 The dreaded "Big Brother" State is joined by many no less dreadful "little brothers and sisters" of the online community (or rather communities). To again quote A. M. Brumis:
First, the distinction between a public figure and private figure is becoming increasingly difficult to decipher, due to the Internet and social media platforms: The Rosenbloom plurality opinion, by Justice Brennan, expressed: Voluntarily or not, we are all 'public' men to some degree. Justice Brennan's words ring even more true in the digital age. (...) In the age of microcelebrity, fame – along with its associated benefits and burdens – is distributed along a spectrum, not according to a dichotomy. The Internet has turned what many would previously deem "private figures" into what could now be argued as public figures.23

A dichotomy that never really was one has turned into a spectrum that will always be composed of the "in-betweens", too. Such a development, both technology- and behavior-driven, not only has sustained effects on the changing perception of private and public space, it also raises a number of issues with regard to the protection of fundamental rights within and beyond the state.24 The supranational dimension is of course evident, as the World Wide Web, due to its very structure, escapes the regulatory power of the nation state and calls for global internet governance.25 The protection of privacy and of the private sphere plays a significant role within such a governance scheme. The pivotal question becomes whether fundamental rights guarantees in political multilevel systems26 spanning the national-constitutional, the regional and the universal public-law levels serve as sufficiently effective protection mechanisms. The following tour d'horizon is neither able

19 Comprehensively—and with particular relevance to this context—discussed by: Taraz (2016);


furthermore Albers, in: Halft /Krah (eds.) (2013), p. 15; id. (2010), p. 1061.
20 For instance: Heidebach (2015), p. 593 et seqq.
21 Orwell (1949). For surveillance under Internet conditions see also Albers (2022), in this volume;

Molinaro and Ruaro (2022), in this volume.


22 Solove (2007); see also: Mayer-Schönberger (2015), p. 14 et seqq.
23 Brumis (2016), p. 1; Lat/Shemtob (2011), p. 403.
24 Fischer-Lescano (2014), p. 965 et seqq.; Taraz (2016).
25 Voegeli-Wenzl (2007), p. 807 et seqq.; Uerpmann-Wittzack (2009), p. 261 et seqq.
26 Pernice (1999), p. 703 et seqq.; id., in: Bauer et al. (eds.) (2000), p. 25 et seqq.; id. (2006), pp. 973, 993; see also: Bilancia and Pizzetti (2004).



nor aiming to provide detailed answers hereto—no "mission accomplished" can be expected. The much more modest goal is, rather, an initial mapping of the subject area.

3 Basic Topology: Multipolar Fundamental Rights Relations and the Protection of the Individual from Her/Himself and Other Private Actors

The basic topology here is highly complex. The protection of the individual from the state or sovereign power—exercised both within and beyond the state—is no longer the primary concern; rather, it is about the protection of the individual from other private actors—discussed as the "third-party effect" or "horizontal dimension/horizontal effect" in German fundamental rights doctrine27—and from her/himself, the dogmatic keyword here being the "obligation or duty to protect".28 This duty is also reflected in international human rights law.29 As outlined by the Office of the High Commissioner in its "Principles and Guidelines for a Human Rights Approach to Poverty Reduction"30: "All human rights—economic, civil, social, political and cultural—impose negative as well as positive obligations on States, as is captured in the distinction between duties to respect, protect and fulfill. The duty to respect requires the duty-bearer to refrain from interfering with the enjoyment of any human right. The duty to protect requires the duty-bearer to take measures to prevent violations of any human right by third parties. The duty to fulfill requires the duty-bearer to adopt appropriate legislative, administrative and other measures towards the full realization of human rights. Resource implications of the obligations to respect and to protect are generally less significant than those of implementing the obligation to fulfill, for which more pro-active and resource-intensive measures may be required. Consequently, resource constraints may not affect a State's ability to respect and to protect human rights in the same extent as its ability to fulfill human rights." This holistic human rights concept allows resource-based excuses only when it comes to the fulfillment aspect. Both the respect for and the active protection of human rights constitute resource-neutral and far-reaching obligations.
In the Internet context, the latter comes into play in a very specific manner: as—see above—a duty to protect the individual, notwithstanding her/his autonomy and self-responsibility, from herself/himself. The duty to protect gains all the more importance since the vulnerable individual publicizes private information voluntarily
27 De Wall and Wagner (2011), p. 743 et seqq.; Märten (2015).


28 On this topic: Rottmann (2014), p. 966 et seqq.
29 Cf. also for the European Union level Reinhardt (2022), in this volume.
30 Office of the United Nations High Commissioner for Human Rights, Principles and Guidelines

for a Human Rights Approach to Poverty Reduction, 2006, HR/PUB/06/12, pp. 47/48. See also De
Schutter (2014), p. 290.

and thereby not only potentially causes wholly unintended and unexpected conse-
quences, but gives away today—perhaps out of the spur of the moment, the inexpe-
rience of youth, or an adolescent urge for self-promotion—what shall be forgotten
tomorrow. Self-responsibility is limited when the future consequences of the actions
are hardly foreseeable in their full dimensions. Today’s innocent joke—the funny
semi-naked party picture of a drunken high school kid—can become the end of
tomorrow’s career—the former party kid now a public figure running for office and
being confronted with the “sins” of her/his past.31 Plenty are the examples of such
distressed public figures becoming an easy prey for the press. In the words of J.
Rosen: “Around the world citizens are experiencing the difficulty of living in a world
where the Web never forgets, where every blog and tweet and Facebook update
and MySpace picture about us is recorded forever in the digital cloud. This expe-
rience is leading to tangible harms, dignitary harms, as people are losing jobs and
promotions.”32
Whereas Rosen doubted that law was a good remedy for these harms,33 the ECJ pursued a different path. In its Google ruling, the Luxembourg judges established a groundbreaking "right to be forgotten"34 or "right to oblivion", giving the concerned individual's privacy rights a stronger weight than the operator's economic interests and the general public's information interests. The Court inter alia held that
in the light of the potential seriousness of that interference, it is clear that it cannot be justified
by merely the economic interest which the operator of such an engine has in that processing.
However, inasmuch as the removal of links from the list of results could, depending on the
information at issue, have effects upon the legitimate interest of internet users potentially
interested in having access to that information, in situations such as that at issue in the main
proceedings a fair balance should be sought in particular between that interest and the data
subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the
data subject’s rights protected by those articles also override, as a general rule, that interest
of internet users, that balance may however depend, in specific cases, on the nature of the
information in question and its sensitivity for the data subject’s private life and on the interest
of the public in having that information, an interest which may vary, in particular, according
to the role played by the data subject in public life.35
Turning more specifically to the "be forgotten" interest, the Court continued:

31 One example is given by the case of Stacy Snyder (Snyder v. Millersville Univ.), 2008 U.S.
Dist. LEXIS 97943 (E.D. Pa., Dec. 3, 2008), outlined by Rosen (2011), p. 345 at 346 as follows:
“She is the young woman who was about to graduate from teachers college, and days before her
graduation her employer, a public high school, discovered that she had posted on MySpace a posting
criticizing her supervising teacher and a picture of herself with a pirate’s hat and a plastic cup and
she had put the caption “drunken pirate” under it. The school concluded that she was behaving
in an unprofessional way and promoting underage drinking. Therefore, they did not allow her to
complete her student teaching practicum. As a result, her teachers college denied her a teaching
certificate”.
32 Rosen (2011), p. 345 at 345.
33 Ibidem.
34 ECJ, ECLI:EU:C:2014:317–Google/Spain; on this topic Schiedermair, in:
Lind/Reichel/Österdahl (eds.) (2015), p. 284 et seqq.; of further relevance in this context
again Rosen (2011), p. 345.
35 ECJ, ECLI:EU:C:2014:317, para 81.

As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the
Charter, request that the information in question no longer be made available to the general
public by its inclusion in such a list of results, it should be held, as follows in particular
from paragraph 81 of the present judgment, that those rights override, as a rule, not only the
economic interest of the operator of the search engine but also the interest of the general
public in finding that information upon a search relating to the data subject’s name. However,
that would not be the case if it appeared, for particular reasons, such as the role played by
the data subject in public life, that the interference with his fundamental rights is justified
by the preponderant interest of the general public in having, on account of inclusion in the
list of results, access to the information in question.36

The protection from private actors becomes particularly relevant, as it is the providers behind social networks who, primarily driven by economic interests and protected by economic rights, entice the individual to publicize private data. But obviously these private parties, as just stated, act under fundamental rights protections of their own, which is why we speak of multipolar fundamental rights relations,37 which in turn necessitate complex, at times over-complex, weighting processes. Public information interests and their protection make the necessary balancing approach even more intricate.38 Finally, another factor comes into play when dealing with globally operating providers and networks which transcend national borders—the question of the extraterritorial application of basic and/or human rights guarantees.39 The question of effective judicial enforcement of these rights is yet another problem entirely.40 Facing all these challenges, cross-border Internet governance, defined as "the evolving policies and mechanisms under which the Internet community's many stakeholders make decisions about the development and use of the Internet",41 becomes an urgent desideratum in global politics as well as in scholarly research. A multi-stakeholder approach seems to be without reasonable alternative.

4 Privacy Protection Within the State—Constitutional Law Foundations of the General Right to Privacy

The general right to privacy after all provides a solid legal basis at the constitutional
law level. Looking for an explicit protection clause in the German Grundgesetz (GG),

36 Ibidem, para 97. And it is not only the ECJ dealing with a right to be forgotten. For a comparative perspective see, e.g., Argentina, where the leading case about a "right to be forgotten" involves a pop star called Virginia da Cunha: V. Sreeharsha, Google and Yahoo Win Appeal in Argentine Case, N.Y. Times, Aug. 20, 2010, at B4. For further reference again Rosen (2011), p. 345 at 351. Cf. also Sarlet (2022), in this volume; Schimke (2022), in this volume.
37 Karavas (2007), p. 81 et seqq.
38 See also Albers (2016), p. 19; id., in: Gutwirth/De Hert/Leenes (eds.) (2014), p. 213.
39 On the extra-territorial application of the right to privacy: Töpfer (2014), p. 31 et seqq.
40 Cf. with regard to this problem Veit (2022), in this volume.
41 https://www.nro.net/internet-governance/. Accessed 11 Jan. 2022.

however, would prove futile. Similar to the US constitution with its amendments, the Grundgesetz does not go beyond traditional basic rights guarantees. From a joint reading of Article 1(1), the human dignity clause, and Article 2(1), the right to free development of one's personality, the Bundesverfassungsgericht (German Federal Constitutional Court) conceptualized such a right in the 1980s and spelled it out later on in a number of cases:42 the right to one's own likeness, the right to one's own name, the right to control the use of one's personal data, which was coined as the somewhat awkward-sounding "right to informational self-determination",43 and the right to protection of reputation, among others.44 What is important here is the reference to human dignity.45 It not only highlights the close connection between the general right to privacy and the right to free development of one's personality, but also renders any justification of an infringement subject to a strict proportionality assessment—which, admittedly, can be seen as a typically German approach. Comparing the American notion of freedom and the German concept of dignity, E. J. Eberle highlights the quintessential feature: "First, and most fundamentally, the German constitution is anchored in the architectonic value of human dignity, meaning, at least, that each person is valuable per se as an end in himself, which government and fellow citizens must give due respect".46 This "due respect" by fellow citizens in particular refers to the horizontal dimension of human dignity as a human right.
Eberle continues explaining:
The influence of the Kantian maxim, "[a]ct so that you treat humanity, whether in your own person or in that of another, always as an end and never as a means only", is clear (although it would be an overstatement to say the GG is simply Kantian), and this gives rise to a German "constitution of dignity," as compared to the American constitution of liberty. One obvious difference between the two is that the German constitution is value-ordered around the norm of dignity, whereas the American charter is value-neutral based on an idea of liberty rooted in personal choice.

Given this background of a value-based “constitution of dignity” which goes beyond
a notion of liberty “rooted in personal choice”, the indirect third-party-effect/horizontal
effect of the general right to privacy has been directly acknowledged by the
Bundesverfassungsgericht inter alia in its rulings in the field of media law. So far
so good! Adding an extraterritorial dimension, however, quickly reveals the fragility
of this dogmatic construct—not to mention the different cultural traditions around the
globe, touched upon above by way of example in the comparison between the quite
different constitutional concepts of freedom enshrined in the US and the German
constitutions. A judicially defined, that is to say judge-made, right, which the
Grundgesetz expressly does not envisage, now has to be stretched in order to apply to
third-party private actors—an application never explicitly stated in the constitution,
although implied in Article 1(1) and (2). In addition, this right would now have to be
transposed, referencing

42 On this topic and with a particular focus on the internet: Leible/Kutschke (eds.) (2012).
43 Albers (2005a); id. (2005b), p. 537; id., in: Friedewald/Lamla/Roßnagel (eds.) (2017), p. 11.
44 Firgt (2015).
45 Sarlet (2015); Häberle, in: Isensee/Kirchhof (2004), § 2; Enders (1997).
46 Eberle (2008), p. 1 at p. 3.
26 M. Kotzur

Article 1(2) and (3), to a sphere beyond the state where the responsibility of German
sovereign power becomes ever-more remote. An extraterritorial horizontal effect of
Article 1(1) Basic Law might not be the most solid dogmatic ground to base effec-
tive Internet governance upon. Just a brief procedural side remark: violations of the
general right to privacy, particularly in civil proceedings, are first to be raised before
the specialized courts. It follows that “Regulation 1215/2012 on jurisdiction and the
recognition and enforcement of judgments in civil and commercial matters” plays a
decisive role in determining the place of jurisdiction at the EU level. This role shall
not be further discussed here, but it deserves mention nevertheless. We shall instead
proceed to other options of privacy protection beyond the State.

5 Privacy Protection Beyond the State—International Human Rights Guarantees Protecting the General Right to Privacy

5.1 Regional Human Rights Protection Systems—The ECHR

As for the protection mechanisms at the European level, the ECHR shall be consid-
ered first. Article 8 ECHR takes center stage here—it reads: “Everyone has the right
to respect for his private and family life, his home and his correspondence.” A right
to privacy becomes tangible here, based on the wording alone. The ECtHR views the
protection of personal data as a direct consequence of this right. This provision does
not, however, unfurl a direct third-party effect/horizontal effect on private actors, nor
is it explicitly grounded anywhere in the text of the ECHR. Therefore, as is the case in
national constitutional law, only an indirect third-party effect can be considered—not
a foreign concept to ECtHR jurisprudence, either. What is particularly interesting
with regard to the legal questions surrounding social networks is the dogmatic
proximity of the third-party/horizontal effect and the above-outlined duty/obligation
to protect. The Austrian scholar and judge at the Austrian Constitutional Court,
Christoph Grabenwarter, has carved this out precisely:
The approach to a duty to protect advocated here – characterized by the victim-offender
relationship – also determines the relationship between a duty to protect and the horizontal
effect. Where the duty to protect applies to the legal relationship between private actors, the
indirect horizontal effect is satisfied through the fulfilment of the duties to protect. The state,
which, through civil laws and their enforcement, ensures a balancing of interests as intended
by the ECHR and protects bearers of basic rights from infringements by private actors,
through means of criminal law (…), does not, in so doing, only fulfill its duty to protect. In
the process it also assumes all duties related to legal questions commonly discussed within
the dogmatic scope of the so-called third-party effect. Problems of the third-party effect are
therefore incorporated in the dogma of a duty to protect.47

47 Grabenwarter/Pabel (2016) (translation provided by the author), § 19 para 9, p. 160.


Privacy Protection in the World Wide Web—Legal Perspectives … 27

This approach, centered on the duty to protect as a core principle for further dogmatic
development, shall also be used with regard to the EU Charter of Fundamental Rights,
and within its scope of application. As a decidedly more modern and therefore more
advanced instrument compared to the ECHR, the Charter strikes a more discerning
balance with respect to its rights guarantees. It differentiates between “respect for
private and family life” in Article 7 and the explicit protection of personal data
in Article 8 of the Charter, the latter’s protective quality enhanced by Article 16
TFEU.48 In its Facebook ruling (Schrems v. Data Protection Commissioner),49 already
discussed above, the ECJ holds that secondary legislation, “in so far as they
govern the processing of personal data liable to infringe fundamental freedoms and,
in particular, the right to respect for private life, must necessarily be interpreted in the
light of the fundamental rights guaranteed by the Charter of Fundamental Rights of
the European Union.”50 This decision, unlike the “right to be forgotten” in the Google
case also discussed above, did not pertain to the protection of an individual using
social networks from her/himself, but rather to the passing-on of sensitive personal
data by the EU, here the Commission, to a third country.
Nevertheless, this ruling is no less significant for the contouring of
a duty to protect. In this case the Commission had assumed that the data would be
transferred to a “safe harbour” that would provide an adequate level of protection
for personal data. The ECJ does not question the concept of a “safe harbour” in
principle. It demands, however, that the level of safety of that harbour be verifiable
and actually inspected. Again, a quote from the decision:
Whilst recourse by a third country to a system of self-certification is not in itself contrary to
the requirement laid down in Article 25(6) of Directive 95/46 that the third country concerned
must ensure an adequate level of protection ‘by reason of its domestic law or … international
commitments’, the reliability of such a system, in the light of that requirement, is founded
essentially on the establishment of effective detection and supervision mechanisms enabling
any infringements of the rules ensuring the protection of fundamental rights, in particular
the right to respect for private life and the right to protection of personal data, to be identified
and punished in practice.51

The transferring state is not allowed to pass along its responsibility to control data
safety; herein lies its very duty to protect. What follows from this for the protection
of personal data in social networks is that here, too, there must be ultimate account-
ability on the part of the responsible sovereign for the protection of personal data.
This accountability-based duty, if necessary, goes as far as to protect the individual
from her/himself and also has an indirect horizontal effect on private actors with
respect to their relevant fundamental rights. In every conceivable constellation,
“the persons whose personal data is concerned [must] have sufficient guarantees enabling
their data to be effectively protected against the risk of abuse and against any unlawful
access and use of that data. The need for such safeguards is all the greater where

48 See more closely Reinhardt (2022), in this volume.
49 For further reading Buffa (2016).
50 ECLI:EU:C:2015:650, para 68.
51 Ibidem, para 78.

personal data is subjected to automatic processing and where there is a significant
risk of unlawful access to that data”.52
These groundbreaking holdings of the ECJ establish at the very least a guide-
line which provides substantive legal standards for “internet governance” to be
used in complex balancing processes in individual cases. The duty demands
judicial enforceability, that is to say effective legal remedies. Mere voluntary self-
commitments or best practice standards by providers would—even though a potential
first step53—not be sufficient at the end of the day. The question as to the degree of
protection from her/himself that the individual requires remains unresolved, however,
and it will likely pose very difficult balancing questions for courts and decision-
makers to answer on a case-by-case basis. This might serve as a general measuring
stick: the greater the power imbalance between provider and user, the less predictable
and the further-reaching future consequences are, and the more inscrutable technological
processing contexts are (for instance regarding the deletability of data), the more
vulnerable and in need of protection the individual remains, even in cases
where s/he offers up personal information voluntarily and thereby turns her/himself
into a transparent user.

5.2 Universal Public Law and Human Rights Law

Privacy protection is not a foreign concept to universal public law either. Back in
1948, the UDHR, as a resolution of the General Assembly, formulated only soft law
in Article 12: “No one shall be subjected to arbitrary interference with his privacy,
family, home or correspondence, nor to attacks upon his honour and reputation.
Everyone has the right to the protection of the law against such interference or
attacks.” Article 17 of the ICCPR transposes this protection into binding public
international law by reproducing Article 12 UDHR almost verbatim. Para 1 reads: “No
one shall be subjected to arbitrary or unlawful interference with his privacy, family,
home or correspondence, nor to unlawful attacks on his honour and reputation.” And
para 2 continues: “Everyone has the right to the protection of the law against such
interference or attacks.”

52 Ibidem, para 91.
53 See: The Internet Governance Forum. Best Practices Built By You: “The Internet’s influence
has touched people all over the globe. In 2003 and 2005, the United Nations organized the World
Summit on the Information Society (WSIS). One of the most critical outcomes from this landmark
summit was the creation of the Internet Governance Forum, or the “IGF”. Each year there are global,
national, and regional IGF events happening around the world. Every IGF offers a unique space for
an amazing range of people to share information and develop solutions on key Internet issues. It
was purposefully designed not to be a decision-making body, which allows people to speak freely,
on an equal footing, without limitations linked to the negotiations of formal outcomes. What comes
out of the IGF, however, plays an essential role in shaping decisions taken by other groups that help
the internet run.” (https://www.internetsociety.org/events/igf. Accessed 11 Jan. 2022.).

As clearly as these provisions bear out a consensus with respect to the protection-
worthiness of the individual, at least in principle, it is equally clear that the manner
and scope of protection remain to a large degree subject to the cultural context. A
dogmatic fine-chiseling of the questions of universal protection-worthiness and
horizontal effect therefore proves futile (and perhaps is not even desirable). In
addition, the concept of what is private itself—this was the opening consideration—
is subject to dynamic cultural change. Recommendations for future research, as
outlined by A. M. Brumis, thus include:
exploring the overall implications of advancing technology on our personal privacy and right
to privacy, exploring the differences in terms and use across social media platforms to see
what impact these have on privacy expectations, exploring how self-disclosure throughout an
Internet landscape impacts an individual’s right to privacy, and exploring how government
plays a role in digital privacy laws. (…). These would be areas worth exploration, as each
relates to the future of digital privacy.54

Rapidly changing online cultures may well lead us to a place where information
we would today assign to the sphere of protection-worthy privacy could soon be
viewed as naturally being part of the public sphere. It is, however, this very dynamic
that affirms the protection-worthiness of the protective good of privacy. There is
no lack of fundamental or human rights guarantees and dogmatic constructs that
facilitate such protections, neither on the national-constitutionalist level, nor on the
universal public law level. On the contrary, the web of protection rules is rather close-
knit, if one also takes into account the fragmented protection regimes of the OECD,
which include specific provisions to protect privacy. The issue lies rather with the
effectuation/implementation, and another deficit might require further consideration.
What is lacking, both at the national and international level, are sub-constitutional,
that is to say ordinary, legal provisions based on international treaties that concretize the
individual’s fundamental or human rights based claim to protection. Human rights
certainly provide a meaningful tool for internet governance and privacy protection.
They are, however, neither the only nor per se the most effective tool. A sole avenue
of recourse available only at the highly abstract level of fundamental and human rights,
with their respective uncertainties regarding justification and (rights-)balancing, cannot
(fully) eliminate protection deficits if the underlying legal fabric is insufficient.

6 Conclusions—Including a Comparative Dimension

This paper has emphasized and even appraised the virtue of forgetting.55 However,
what’s got law to do with it? Very little, might have been the answer of J. Rosen,
who, very pragmatically, endorsed some kind of expiration date for online

54 Brumis (2016), p. 1.
55 Mayer-Schönberger (2009).

published private data.56 It’s all about politics, stupid, might have been A. Brumis’s
reply:
As the field of public relations becomes increasingly technology-driven, the right to privacy
across digital platforms will continue to be an important concern for both organizations and
individual clients. Public relations practitioners must steer clear of crises, and privacy viola-
tions can certainly turn into a crisis situation if an organization allows consumer information
to get into the wrong hands, or fails to protect employees from digital data breaches. As
more organizations are interacting with stakeholders through the Internet and social media
platforms, public relations practitioners must take into account the legal aspects of privacy,
and how to interpret it in a digital environment. The inadequate legal standards that currently
exist for digital privacy has led to privacy policies that protect both consumers and organi-
zations as user information is collected. However, these policies are not consistent across
all social media platforms, nor consistent among all organizations. Therefore, public rela-
tions practitioners must be well informed in order to avoid privacy violations and in order to
protect clients and organizations from digital privacy breaches or government intrusion.57

Human rights instruments might provide a suitable solution. Finally, it is the ECJ’s
more “law-optimistic” attempt which offers a means of legal protection by a newly
shaped “right to be forgotten”. What all these approaches share is the very notion
that we are “more public and more interconnected than ever”.58 As D. Lat and Z.
Shemtob remind us:
“In this day and age – of blogs, where our private misadventures can be written about at
length; of streaming video and YouTube, where said misadventures can be seen and heard
by total strangers; of Facebook, where “friends” can post pictures of us, against our will
(maybe we can “de-tag,” but we can’t remove); of full-body scanners at the airport—Justice
Brennan’s59 words ring more true than ever, for better or worse”.

If we want to evaluate the power of law within possible internet governance schemes,
in particular the steering powers human rights could unfold, we first have to take
a comparative look at the concept of privacy as such. Only if we find a “pri-
vacy language” spoken and understood in different legal systems and legal
cultures can we go one decisive step further towards effective legal instruments.
Legal comparison, however, is a difficult job. R. Hirschl’s recent study on compara-
tive constitutional law can be invoked as a convincing witness.60 Rather than simply
looking for a blueprint of fixed solutions—“let’s do it as the others do!”—the
comparative lawyer is in permanent search of a matrix that allows her or him to weigh,
to probe, and to critically reconsider her or his own arguments against the back-
ground of experiences that others have had or solutions that others have found.61

56 Rosen (2011), p. 345.
57 Brumis (2016), p. 1.
58 Lat and Shemtob (2011), p. 403 at 416.
59 Rosenbloom v. Metromedia Inc., 403 U.S. 29 (1971): “[v]oluntarily or not, we are all ‘public’ men to some degree”.

60 Hirschl (2014).
61 Constitutionalism in Europe, the Americas or Asia should thus be engaged in a permanent dialogue on constitutionalism; for the Asian example Chen (ed.) (2014).



Simply to copy-paste a rule stemming from another legal system or to restate a judg-
ment of a foreign court has nothing to do with meaningful comparative work and
will certainly fall short of capturing the whole “privacy picture” as displayed by so
many different legal cultures and privacy traditions around the globe. A simplistic
copy-paste would both misconceive the cultural heterogeneity of the legal world and
ignore a political community’s own legal identity as cultural identity.62 Without any
doubt, the public–private distinction is at the very heart of this “identity”. Comparative
work that is both appropriate and advisable may not limit itself to the idea of comparing
“the laws” (that is to say written norms, legal texts or, in particular in common
law systems,63 judgments) but has, in a broader sense, to encompass a sensi-
tive comparison of cultures64—in our case of “privacy cultures”. Whoever wants
to undertake the endeavor of such a holistic comparison65 must neces-
sarily leave the ivory tower of pure legal thought as well as the narrow world of
law-school comparison, which often constrains itself to rather fruitless semantic
exercises. A shift from comparative law stricto sensu to broadly shaped comparative
“law in context” studies is the obvious consequence.66
An effective “right to be forgotten” concept has to be aware of these comparative
necessities. All the more so should global players take into account the perspective
of the “relevant others”. The United Nations is well aware of the privacy issue
and is thus wrangling over new forms of internet governance. The Internet Governance
Forum, founded in 2006, should be mentioned in particular. It is improbable, however,
that global (legal) discourse would easily bring about globally uniform, binding and
comprehensive treaty regimes in the near future. It is all the more important that those
already powerful actors/stakeholders, within their capabilities, accept their share of
responsibility for the protection of privacy and enter into willing-to-learn-and-listen
worldwide discussions. The ECJ, with its two (quickly deemed historic) decisions
on Google and Facebook, has emphatically fulfilled its responsibility. The Court has
assumed the active position of a committed advocate dealing with the question of
what Internet governance should accomplish. This is certainly encouraging—also
for our Brazilian-German-Forum and for what could be described as “Civility in the
Digital Age”.67

62 For further relevant discussions: Cownie (2004).


63 See in this context also Singh (1985).
64 A classic of such an approach is Häberle (1982), p. 33; id. (1998), p. 463 et seq.; later Wahl (2000), pp. 163, 173 et seq.; furthermore Varga (ed.) (1992); Ehrmann (1976).
65 Hirschl (2014), at p. 13 suggests “that for historical, analytical, and methodological reasons, maintaining the disciplinary divide between comparative constitutional law and other closely related disciplines that study various aspects of the same constitutional phenomena artificially and unnecessarily limits our horizons”.
66 Ibidem at p. 151.
67 Weckerle (2013).

References

Albers M (2005a) Informationelle Selbstbestimmung, Nomos, Baden-Baden
Albers M (2005b) Basis of fundamental rights in personal data protection: right to informational
self-determination and/or respect for private life? Juridica VIII:537–543
Albers M (2010) Grundrechtsschutz der Privatheit, DVBl:1061–1069
Albers M (2013) Privatheitsschutz als Grundrechtsproblem. In: Halft S, Krah H (eds) Privatheit.
Strategien und Transformationen, pp 15–44
Albers M (2014) Realizing the complexity of data protection. In: Gutwirth S, De Hert P, Leenes R
(eds) Reloading data protection, pp 213–235
Albers M (2016) A Complexidade da Proteção de Dados, Revista Brasiliera de Direitos Fundamen-
tais e Justiça, Belo Horizonte, ano 10, n. 35, pp 19–45
Albers M (2017) Informationelle Selbstbestimmung als vielschichtiges Bündel von Rechts-
bindungen und Rechtspositionen. In: Friedewald M, Lamla J, Roßnagel A (eds) Informationelle
Selbstbestimmung im digitalen Wandel, pp 11–35
Albers M (2022) Surveillance and data protection rights: data retention and access to telecommuni-
cations data. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Bilancia P, Pizetti F G (2004) Aspetti e problemi del constitutionalizmo multilevello. Giuffrè Editore,
Milano
Brandeis LD (1890) The right to privacy. IV Harvard Law Rev 5:193–220
Brumis A M (2016) The right to privacy in a digital age: reinterpreting the concept of personal
privacy. Inquiries 8(9). http://www.inquiriesjournal.com/articles/1450/the-right-to-privacy-in-a-
digital-age-reinterpreting-the-concept-of-personal-privacy. Accessed 11 Jan. 2022
Buffa F (2016) Freedom of expression in the internet society, Key editore
Chen AHY (ed) (2014) Constitutionalism in Asia in the early twenty-first century. Cambridge
University Press, Cambridge
Coppell J, O’Neill A (1992) The European court of justice: taking rights seriously? Leg Stud
12(2):227–239
Cownie F (2004) Legal academics. Cultures and identities, Oxford and Portland. Hart Publishing,
Oregon
Dahrendorf R (1974) Citizenship and beyond: the social dynamics of an idea. Soc Res 41:673–701
De Schutter O (2014) International human rights law: cases, materials, commentary, 10th edn.
Cambridge University Press, Cambridge
De Wall H, Wagner R (2011) Die sogenannte Drittwirkung der Grundrechte. Juristische Arbeits-
blätter 734–740
Eberle EJ (2008) The German idea of freedom. Oregon Re Int Law 10:1–76
Ehrmann HW (1976) Comparative legal cultures. Prentice Hall, New Jersey
Enders (1997) Die Menschenwürde in der Verfassungsordnung. Mohr Siebeck, Tübingen
Firgt R (2015) Strukturelle Analyse des Allgemeinen Persönlichkeitsrechts anhand des Rechts auf
informationelle Selbstbestimmung. Verlag Dr. Kovač, Hamburg
Fischer-Lescano A (2014) Der Kampf um die Internetverfassung. Rechtsfragen des Schutzes
globaler Kommunikationsstrukturen vor Überwachungsmaßnahmen. Juristenzeitung 965–974
Grabenwarter C, Pabel K (2016) Europäische Menschenrechtskonvention. C. H. Beck, Helbing
Lichtenhahn Verlag, Manz, München, Basel, Wien
Häberle P (1982) Verfassungslehre als Kulturwissenschaft, 1st edn. Duncker & Humblot, Berlin
Häberle P (1998) Verfassungslehre als Kulturwissenschaft, 2nd edn. Duncker & Humblot, Berlin
Häberle P (2004) Die Menschenwürde als Grundlage der staatlichen Gemeinschaft. In: Isensee J,
Kirchhof P (eds) Handbuch des Staatsrechts, vol 2, § 2
Häberle P (2006) Öffentliches Interesse als juristisches Problem, 2nd edn. Berliner Wissenschafts-
Verlag, Berlin
Hartzog W (2013) Privacy and terms of use. In: Stewart DR (ed) Social media and the law, pp 50–74

Heidebach M (2015) Die NSA-Affäre in Deutschland – Stößt der Grundrechtsschutz an seine Grenzen? DÖV: 593–599
Hirschl R (2014) Comparative matters. The renaissance of comparative constitutional law, Oxford
University Press, Oxford
Karavas V (2007) Digitale Grundrechte. Elemente einer Verfassung des Informationsflusses im
Internet. Nomos, Baden-Baden
Lat D, Shemtob Z (2011) Public figurehood in the digital age. J Telecommun High Technol Law
9:403–419
Leible S, Kutschke T (eds) (2012) Der Schutz der Persönlichkeit im Internet. Boorberg, Stuttgart
Märten J (2015) Die Vielfalt des Persönlichkeitsschutzes. Pressefreiheit und Privatsphärenschutz
in der Rechtsprechung des Europäischen Gerichtshofs für Menschenrechte, in Deutschland und
im Vereinigten Königreich. Nomos, Baden-Baden
Mayer-Schönberger V (2009) Delete: the virtue of forgetting in the digital age. Princeton University
Press, New Jersey
Mayer-Schönberger V (2015) Was ist Big Data? Zur Beschleunigung des menschlichen Erkennt-
nisprozesses, Aus Politik und Zeitgeschichte, No. 11–12, pp 14–19
Molinaro CA, Ruaro RL (2022) Privacy protection with regard to (tele-)communications surveil-
lance and data retention. In: Albers M, Sarlet IW (eds) Personality and data protection rights on
the internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Office of the United Nations High Commissioner for Human Rights, Principles and Guidelines for
a Human Rights Approach to Poverty Reduction, 2006, HR/PUB/06/12
Orwell G (1949) Nineteen eighty-four. Secker & Warburg, London
Pateman C (1990) The disorder of women: democracy, feminism and political theory. Stanford
University Press, Stanford
Pernice I (1999) Multilevel constitutionalism and the treaty of Amsterdam: European constitution-
making revisited. CMLREV 1999:703–750
Pernice I (2000) Europäisches Verfassungsrecht im Werden. In: Bauer H et al (eds) Ius Publicum
im Umbruch, pp. 25–46
Pernice I (2006) The global dimension of multilevel constitutionalism: a legal response to the
challenges of globalisation. In: Tomuschat FS pp 973–1006 (ch)
Pfisterer V (2014) Unternehmensprivatsphäre. Mohr Siebeck, Tübingen
Preuß UK (2002) Der EU-Staatsbürger–Bourgeois oder Citoyen. In: Winter G (ed) Das Öffentliche
Heute, pp 179–195
Reinhardt J (2022) Realizing the fundamental right to data protection in a digitized society. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Rosen J (2011) Free speech, privacy, and the net that never forgets. J Telecommun High Technol
Law 9:345–356
Rottmann F (2014) Totalüberwachung des Internets: Kapitulation vor der “Macht der Fakten”?
Deutscher und europäischer Datenschutz und die ausländischen Geheimdienste. AnwBl.
12/2014:966-978
Sarlet IW (2015) Dignidade (da Pessoa) Humana e Direitos Fundamentais na Constituição Federal
de 1988, 10th edn. Livraria do Advogado
Sarlet IW (2022) The protection of personality in the digital environment: an analysis in the light
of the so-called right to be forgotten in Brazil. In: Albers M, Sarlet IW (eds) Personality and data
protection rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this
volume)
Schiedermair S (2015) The right to be forgotten in the light of the Google Spain Judgment of
the European court of justice. In: Lind AS, Reichel J, Österdahl I (eds) Information and law
in transition. Freedom of speech, the internet, privacy and democracy in the 21st century, pp
284–299

Schimke A (2022) Forgetting as a social concept. Contextualizing the right to be forgotten. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Singh MP (1985) German administrative law in a common law perspective. Springer, Berlin,
Heidelberg
Smend R (1933) Bürger und Bourgeois im deutschen Staatsrecht. In: ibid. (1968, 2nd ed)
Staatsrechtliche Abhandlungen, pp 309–325
Solove DJ (2007) The future of reputation: gossip, rumor, and privacy on the Internet. Yale University
Press, New Haven
Taraz D (2016) Das Grundrecht auf Gewährleistung der Vertraulichkeit und Integrität informa-
tionstechnischer Systeme und Gewährleistung digitaler Privatheit im grundrechtlichen Kontext:
Wegbereitung für eine digitale Privatsphäre? Verlag Dr. Kovač, Hamburg
Töpfer E (2014) Wie das Menschenrecht auf Privatheit in seiner Krise an Profil gewinnt, vorgänge.
Zeitschrift für Bürgerrechte und Gesellschaftspolitik, 53. Jahrgang, Nummer 206/207:31–41
Turner BS, Hamiliton P (eds) (1994) Citizenship: critical concepts. Routledge, London
Uerpmann-Wittzack R (2009) Internetvölkerrecht. AVR 47:261–283
Varga C (ed) (1992) Comparative legal culture. New York University Press, New York
Veit RD (2022) Safeguarding regional data protection rights on the global internet—The European
approach under the GDPR. In: Albers M, Sarlet IW (eds) Personality and data protection rights
on the Internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Voegeli-Wenzel J (2007) Internet Governance am Beispiel der Internet Corporation of Assigned
Names and Numbers (ICANN). GRUR Int.:807–816
Wahl R (2000) Staat-Souveränität-Verfassung. In: Murswiek D, Storost U, Wolff HA (eds) Festschrift für Helmut Quaritsch. Duncker & Humblot, Berlin
Weckerle A (2013) Civility in the digital age: how companies and people can triumph over haters,
trolls, bullies and other jerks. Que Publishing, Indianapolis
Weiler JHH, Lockhart NJS (1995) ‘Taking rights seriously’ seriously: The European court and its
fundamental rights jurisprudence. CMLREV 32(2):579–627
Weintraub J, Kumar K (eds) (1997) Public and private in thought and practice: perspectives on a
grand dichotomy. University of Chicago Press, Chicago

Markus Kotzur Prof. Dr. Iur. (Universität Bayreuth), LL.M. (Duke University, NC, USA),
Professor for Public International and European Law, President of Europa-Kolleg Hamburg.
Main research areas: Global Constitutionalism, Global Governance, Human Rights Law,
EU-Institutions, European and National Constitutional Law. Selected Publications: European
Union Treaties. A Commentary (together with Rudolf Geiger and Daniel-Erasmus Khan),
C.H.Beck/Hart, Munich/Oxford 2015; Grenznachbarschaftliche Zusammenarbeit in Europa.
Der Beitrag von Article 24 Abs. 1a GG zu einer Lehre vom kooperativen Verfassungs- und
Verwaltungsstaat. Habilitationsschrift, Duncker & Humblot, Berlin 2004; Legal Cultures in
Comparative Perspective, in: M. P. Singh (ed.), The Indian Year-Book of Comparative Law
2016, Oxford University Press, Oxford 2017, pp. 21–50; Solidarity as a Legal Concept, in: A.
Grimmel and S. M. Giang (eds.), Solidarity in the European Union, Springer, Cham 2017,
pp. 37–45; Theorieelemente des internationalen Menschenrechtsschutzes. Das Beispiel der
Präambel des Internationalen Paktes über bürgerliche und politische Rechte, Berlin 2001.
Personality Rights in Brazilian Data
Protection Law: A Historical Perspective

Danilo Doneda and Rafael A. F. Zanatta

Abstract This chapter traces the influence of personality rights and European legal
thought on the development of data protection law in Brazil. We argue that the
Brazilian Data Protection Law enacted in 2018 is grounded in a solid civil law
tradition. To demonstrate this argument, we trace how Brazilian lawyers drew
on European legal thought in the 20th century and how the concept
of personality rights was intellectually constructed. We also argue that personality
rights played an important role in legal struggles during the Brazilian civil-military
dictatorship (1964–1985). The chapter also presents new data about the history of
data protection in Brazil.

1 Introduction

The enactment of the first Brazilian data protection legislation in August 2018 was
the culmination of a process dating back to 2010, when earlier versions of its text were
submitted to public comment on the Internet. Conducted first by the federal govern-
ment and later by the Brazilian parliament, the development of the text received
considerable feedback from society, following a path already taken by another piece
of legislation, the Internet Civil Rights Framework (known as the Marco Civil da
Internet 1 ). Together, these statutes form the backbone of the Brazilian legal framework
for the information society, alongside other legislation on issues ranging
from access to information to intellectual property.

D. Doneda
Public Law Institute of Brasília (IDP), Brasília, Brazil
R. A. F. Zanatta (B)
University of São Paulo, São Paulo, Brazil
1 Law 12.965 of 2014.

© Springer Nature Switzerland AG 2022 35


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_3

The particular kind of collaborative process which resulted in the General Data
Protection Law (known as LGPD2 ) and the Marco Civil made both of them partic-
ularly sensitive to certain demands from society, which contributed to their final texts.
However, while the Marco Civil was a brand-new framework built almost from
scratch and from references to specific topics—there was no other law of this kind,
combining privacy, net neutrality, and intermediary liability—the LGPD had a previ-
ously existing set of legal frameworks to relate to. Data protection is by now a
well-established legal subject and field in several other legal systems—particularly
the continental European legal system, from which the Brazilian one derives—with
its own set of concepts and standards, which it was easier and more rational to
adapt than to contrast with.
In this sense, the process that resulted in the LGPD took into consideration not merely
the need to incorporate data protection concepts and standards into Brazilian law but also
the dialogue with current Brazilian legislation and legal tradition. We argue that the
rationale of data protection, which is the protection of citizens against the possible
effects of the processing of their data, is coherent with the tradition of personality
rights, already well established in Brazilian law.3 We therefore aim to identify, in
historical perspective, the significance of personality rights in the shaping of the new
legislation, and to stress the points of intersection between LGPD provisions and
personality rights, recognizing that the strength of this relationship is very important
for the complete and necessary translation of the dogmatics of data protection (fairly
new to the country's legal tradition) into the Brazilian legal framework.

2 Personality Rights and the Role of European Legal Thinking in Brazilian Law

Personality rights play a key role in the structuring and shaping of the Brazilian data
protection framework. This section explores aspects of the history of Brazilian private
law, the influence of European legal thinking on Brazilian legal scholarship during
the nineteenth and twentieth centuries, and the challenges of protecting individuals
during the civil-military government (1964–1985). We argue that the constitution of
1988 was influenced by previous struggles for the right to privacy and the doctrine
of personality rights.

2 LGPD stands for Lei Geral de Proteção de Dados Pessoais in Portuguese.


3 Cf. also Sarlet (2022), in this volume; Molinaro and Ruaro (2022), in this volume.

2.1 Legal Formants of Brazilian Private Law

It is a difficult and tricky task to compare legal systems.4 Despite their similarities, any
comparison of the Brazilian and the European systems must also take into account that
there is no complete uniformity in European Union law. However, it
may be useful to revisit Rodolfo Sacco's methodological approach of "legal
formants"5 to discuss the "hidden similarities" and "metalegal formants" present in the
Brazilian legal system. As advocated by Sacco, comparative approaches to legal
studies should focus not only on written law and explicit legal rules, but should aim
at "highlighting structural regularities that would otherwise pass unobserved."6
Instead of speaking of legal rules, according to Sacco, "we must talk instead
of the rules of constitutions, legislatures, courts, and, indeed, of the scholars who
formulate legal doctrine."7 Applied to personality rights and personal data protection,
this methodology means that we should move beyond a mere description
of rules and the topography of law. We must evaluate the consistency between what is in
the code and what the courts apply (law in books vs. law in action), and examine the
legal formants—the living law which is, in a sense, underground and barely
seen, and which consists of statutory rules, the formulations of scholars (and their main
intellectual inspirations), and the decisions of judges. As Sacco argued, a merely
topographic analysis of law (e.g. the differences between codes or positive law) does
not take into account the deep structures of legal thinking, such as the role of scholarship
in decision-making and the interplay between positive law and the knowledge produced
by courts in ruling hard cases.
One important feature of the Brazilian legal system is the adoption of the civil
law tradition and the codification of private law that culminated in the first Brazilian
Civil Code of 1916, which was highly influenced by the European legal tradition and
the work conducted by French, German, and Italian legal elites. Nevertheless, the
role and meaning of civil codes changed substantially during the twentieth century.
Judith Martins-Costa, writing in 1998, claimed that the Civil Code does not have
as a paradigm the structure of a “perfect model designed by the wise” capable of
predicting all kinds of fattispecie (an ensemble of factual elements that are governed
by a certain legal norm) and following sets of beliefs of nineteenth-century jurists.
The inspiration for contemporary civil codes, according to Martins-Costa, is clearly
the “open models” of constitutions, with the adoption of open clauses (clausulas
gerais). In any case, if we consider the historical development of Brazilian private
law, the influence of the "codification movement" proves to be strong. Caio Mario da
Silva Pereira, one of the leading private-law jurists of the twentieth century, stressed
the importance of this open model and, after recognizing in his work "Instituições de
Direito Civil" the importance of the codifications in Prussia, France, and Austria in
the late eighteenth and early nineteenth centuries, observed that "the establishment

4 Cf. also Kotzur (2022), in this volume.


5 Sacco (1991), pp. 1–34.
6 Sacco (1991), p. 5.
7 Sacco (1991), p. 21.

of codified principles”8 allows law to evolve through systematic reasoning and legal
interpretation.
The Brazilian legal system was strongly influenced by the colonial Portuguese
legal model and the regulation of public and private life through the Ordenações. Even
after the Declaration of Independence in 1822, legal and political elites maintained the
Ordenações Filipinas in force and began considering the enactment of a
Civil Code.9 However, according to Caio Mario da Silva Pereira, the turning point
was the decision of the imperial government in 1855 to consolidate Brazilian civil
law. Augusto Teixeira de Freitas worked on this project from 1859 to 1867 and
delivered his draft of the Civil Code, strongly influenced by concepts developed in
German legal theory.10 This Draft was divided into a general part and a special one.
Dogmatics began to play a new role with the theoretical work Consolidação das
Leis Civis written by Teixeira de Freitas. In this influential work of dogmatics and
private law, Teixeira de Freitas made a clear distinction between direitos reais (legal
relationships between a person and a thing) and direitos pessoais (legal relationships
between persons). Even if his framework did not reach what we would later identify as
personality rights, whose conceptualization was still at a very early stage at that time,
the notion of an obligation of satisfaction or compensation when personal rights were
violated or offended would become the general concept associated with several legal
instruments aimed at protecting the individual.
It is not our goal to provide a history of personality rights in Brazil. We would
like to stress, instead, that personality rights were influenced by this dual movement:
the codification that started with Teixeira de Freitas and culminated with the Civil
Code of 1916 and the role of “legal dogmatics”11 in civil law, highly influenced by
German legal theory in the beginning, and later by Italian and French civil law.12
These elements are part of the Brazilian “legal formants” and must be considered in
any comparative analysis.
According to Orlando Gomes, Brazilian jurists such as Rubens Limongi França,
author of Direitos da Personalidade, took up the concept of “personality rights” in
the form proposed by legal theorist Otto von Gierke, who firmly believed that the
Roman distinction between private and public law could not be sustained, considering
that “private law, though it cares foremost for individual interests, must serve the

8 Pereira (2011), p. 66.


9 Pereira (2011), pp. 66–69.
10 Reis (2015), pp. 181–222.
11 Legal dogmatics (Rechtsdogmatik or dogmatica giuridica) is used here as a synonym of legal
scholarship, a type of knowledge about concepts and structures of law aimed at decision-making
actors. “Legal dogmatics” is also explained by MacCormick: “What may be called normativist
approaches to jurisprudence, most notable among them the Pure Theory of Law of Hans Kelsen,
provide a theory of knowledge for legal dogmatics in this traditional style. Such theories seek
to expound the presuppositions behind the kind of descriptive statements of law, or descriptively
normative statements, which black letter lawyers expound. They also try to work out and account
for the internal logical and conceptual structures which give legal dogmatics a rational form.”
MacCormick and Weinberger (2013), p. 2.
12 Gomes (1966), pp. 39–48.

common goal,” and that “public law, though it looks first to the whole, must be just
toward individuals.”13 Gierke defended the notion that private law should preserve
the "inviolable sphere of the individual" but also promote community. This idea of
"the social role that private law should play" strongly influenced Orlando Gomes. In
1889, Gierke argued that "private law must use the means at its disposal
first and foremost to guarantee and protect the personalities of individual people,
to both acknowledge and limit the rights of personality (rights of the individual)
in their general and equal manifestation, in their condition as being members of a
community, with their unique eccentricities, and in their individually acquired or
otherwise attained development, to use civil obligations next to criminal sanctions to
vindicate rights where they are harmed, and to secure compensation and redress.”14
In his classic essay Direitos da Personalidade, Gomes agrees with Gierke and
claims that "personality rights are absolute, extrapatrimonial, non-transferable,
imprescriptible, vital and necessary."15 However, to Gomes, they are more than the
rights to life and freedom. They are “subjective private rights” aimed at the “develop-
ment and expansion of the physical and spiritual individuality of the human person.”16
In his essay, Gomes calls for an agenda of dogmatic studies on personality rights,
claiming that his peers “continue to live in the past century” and that it is the task of
the young generation to “demonstrate sensitivity for these issues.”
Personality rights were first introduced in Brazilian law as a concept associated
with “an ensemble of attributes particular to the human condition,” as professor San
Tiago Dantas said in his lectures in the 1940s,17 or special human manifestations
whose protection is necessary for the normal development of persons. Brazilian legal
tradition on personality rights turned out to reflect a strong influence from Italian
legal doctrine, mainly from Adriano De Cupis, who in 1950 formulated a concept
of personality rights as a set of rights whose absence would make legal personality
void of any concrete value; this concept is still widely mentioned and used. In other
words, its absence would make all other subjective rights practically useless for the
individual; in broad terms, for De Cupis, personality rights could mean a set of
essential rights.18 While these rights began to be observed in case law, basically due
to the development of cases regarding damages to image, honor, or other personality
rights,19 the debate continued in Brazilian private law.
In 1979, Fabio Maria de Mattia returned to the issue in his article Direitos
da personalidade and claimed that the Brazilian debate was highly influenced
by late nineteenth-century German legal theory—which created concepts such as
Personalitätsrechte and Persönlichkeitsrechte—and the experience of the Italian and
Portuguese civil codes. Mattia, clearly influenced by Orlando Gomes and Rubens

13 Gierke (2016), p. 6.
14 Gierke (2016), p. 7.
15 Gomes (1966), p. 42.
16 Gomes (1966), p. 43.
17 Santiago Dantas (2001).
18 De Cupis (1950).
19 Santos (1997).

Limongi França, claimed that “personality rights are an autonomous category within
the subjective rights, considering that this autonomy comes from the essential char-
acter that they present because of the special character of their object and the
singularity of its content."20 For Mattia, postwar constitutions such as the German
Grundgesetz, approved in May 1949, conceptualized dignity, stating that human
dignity must not be violated and that it is the role of the state to protect it. Mattia
cites Article 2 of the German Constitution: "every person shall have the right to
free development of his personality insofar as he does not violate the rights of others
or offend against the constitutional order or the moral law."
European law played a substantial role in the theoretical development of person-
ality rights in Brazil. This influence of European legal theory becomes easier to
understand once one accepts the role of dogmatics and civil codes as part of the
Brazilian "legal formants."

2.2 The "Social Turn" in Brazilian Law: Instrumentalism and Authoritarianism

In order to fully understand the development of data protection law in Brazil, it is
important to observe two transformations in twentieth-century legal theory. The first
one was briefly described above, namely the emergence of “dignity” as a key concept
in post-World War II constitutions. The second one is the so-called “social turn” of
civil law and the relationship between constitutions and the generation of principles
that have profound effects on private law.
The “social turn” was proclaimed during the 1960s in Brazil by civil lawyers
such as Orlando Gomes, based on the contributions of European legal theory and
socialist approaches to the welfare state. Gomes was also influenced by Ripert (1880–
1958), author of Le régime démocratique et le droit civil moderne, who noticed
that democracy and private law emerged in a dialectic process in which civil law
merged with the “democratic spirit” based on solidarity, social progress, and the
socialization of private law. As noted by José Reinaldo de Lima Lopes and Paulo
Garcia Neto, the Brazilian critique of the liberal state and classic forms of legal
reasoning involved many pressing issues (e.g., the problem of poverty and labor
relations, the problems of the relation between government and business corporations,
the problem of development and using law as an instrument to foster economic
growth, and the problem of traditional property doctrine as an obstacle to agricultural
reform), in which the “social turn”21 of civil law was just one component.
During the 1960s and 1970s, many Brazilian jurists wrote about the “decay of legal
individualism” and the “role of the socialization of law”22 together with state inter-
ventionism. However, as noted by many legal scholars who studied the effects of legal

20 Mattia (1979), p. 252.


21 Lopes et al. (2009).
22 Tácito (1959), pp. 1–28.

instrumentalism in Brazil, such as John Henry Merryman,23 David Trubek,24 and
José Eduardo Faria,25 the civil-military coup of 1964 deeply transformed Brazilian
society and legal institutions. The “social turn” of law was also subverted. Instead
of advancing the idea of democracy and private law, the Brazilian “social turn” was
captured by an instrumental approach to law. State interventionism and develop-
mental industrial policies combined with authoritarianism, wiping out fundamental
rights. Indeed, the military used the constitution and the narratives of “protection
of democracy” to eliminate left-wing political groups and opponents of the military
regime created in 1964.
After the infamous Ato Institucional n. 5 of 1968, which eliminated fundamental
rights such as due process in criminal law and habeas corpus, activists and lawyers
began to organize the defense of civil rights especially through the Brazilian Bar
Association (Ordem dos Advogados do Brasil) and major political parties such as
the Movimento Democrático Brasileiro (MDB). Raymundo Faoro, the leader of the
Ordem dos Advogados do Brasil during the 1970s, began complex negotiations with
the military for the return of these civil and political rights. There was also a growing
movement of human rights activists working on a global scale to protect left-wing
citizens who were tortured and imprisoned.
The literature about this period of Brazilian history is very rich, and it shows
that activists,26 lawyers,27 diplomats, and politicians were able to shape a new agenda
for the return of democracy and the creation of a new constitution.28 After the oil
crisis and the global economic instabilities in the 1970s, the Brazilian government
lost economic power. The “Brazilian economic miracle” was over, and this opened
up space for a process of “redemocratization” of the country. In this process, legal
elites were able to import European theories of “fundamental rights” and apply them
in Brazil, influencing the work of the Constituinte, the political group in charge of
drafting a new federal constitution for the Brazilian Republic. The resulting Brazilian
Constitution of 1988, among other characteristics, contained several provisions which
favoured the incorporation and enforcement of personality rights. The framework
of a workable theory of personality rights in Brazilian law had reached its maturity,
as a milestone article by Gustavo Tepedino demonstrates (Tepedino 1999).

23 Merryman (1977), pp. 457–491.


24 Trubek (1972), pp. 1–50.
25 Faria (1981).
26 Pilatti (2008).
27 Barroso (1998), pp. 1–25.
28 Faoro (1981).

2.3 The First Attempts Toward a Brazilian Data Protection Law and the Fight Against a National Identification System

In 1975, the military government presented the project of a Unified Biometric Iden-
tification System (Registro Nacional de Pessoas Naturais—RENAPE) for the whole
country. This happened during the peak of Brazilian state interventionism when
industrial policy focused on technology and the rising “information economy.” The
military had created public firms such as Serpro (Serviço Federal de Processa-
mento de Dados) and wanted to invest in public policy projects that would generate
demand for these firms. The idea was to unify many different databases—Registro
Geral, Cadastro de Pessoa Física, and Inscrição no Instituto Nacional de Seguridade
Social29 —into one database with biometric information about all citizens. Providing
the data would be mandatory for all citizens, feeding the database of the public firm.
A group of sociologists, engineers, and computer scientists resisted and published
texts and manifestos against the biometric system in specialized magazines and jour-
nals. Professionals of other technology firms such as IBM also protested and claimed
that a Unified Biometric System was unsafe and potentially harmful. In 1977, these
activists turned to the French Data Protection Law as a model for Brazil. A few months
after the proposal of the French legislation, Brazilian Congressman José Faria Lima
(a liberal politician from a wealthy family based in São Paulo and a member of the
government's party, Arena) proposed a draft bill for a General Data Protection
Law.30 The draft bill stated that data collection processes required the consent of
citizens. The draft bill also adopted a set of principles for the processing of personal
data both by the private and the public sector. It also proposed the creation of a Data
Protection Authority, following the model of the French Commission Nationale de
l'Informatique et des Libertés (CNIL). At almost the same time, Senator Nelson
Carneiro, from the MDB, proposed a draft bill in the Senate on the "protection of
computer information."31 Both draft bills were rejected by the National Congress.
In 1977, the liberal newspaper Estado de São Paulo published a series of articles
against the Registro Nacional de Pessoas Naturais (RENAPE). President Geisel and
the military defended the project and had the support of the Ministry of Justice.
Intellectuals and professionals from the tech sector criticized the project and used
specialized journals such as Revista Dados to mobilize against it. As argued by Estado
de São Paulo, the project “represented not only a threat to the citizen, in his privacy
and liberty, but also a threat to capitalism itself.”32 In the article “Threat to Privacy, the
Greatest Criticism of the Project,” sociologist Maria Tereza de Oliveira argued that

29 Registro Geral could be translated as General Identifier; it is an identity document issued by the
Secretariat of Security. Cadastro de Pessoa Física could be translated as Individual Registration and
is used for fiscal purposes. INSS could be translated as Social Security Number and is used for
labor law benefits and the pension system.
30 Draft Bill PL 4.365, proposed in 1977.
31 Senate Draft Bill 96 of 1977.
32 ‘Em estudos, identidade única’, Estado de São Paulo, 09 September 1977, p. 18.

the right to privacy was protected by Article 12 of the Universal Declaration of Human
Rights33 and that this international standard should be followed in Brazil.34 Oliveira
also argued that many European countries such as Belgium, Italy, and Germany
had opposed national registration similar to that proposed in Brazil. She argued
that huge databases about citizens conflicted with democratic values and generated
incentives for a less active society because of the chilling effects of control and
constant monitoring.
Writing in 1979 for Estado de São Paulo, Ethevaldo Siqueira also criticized the
military project and argued that “privacy involves all the forms of protection of
the individual, the person, his home, his family, his ideas, his lifestyle, his right to
isolate himself and escape from external interference.” For Siqueira, “there is no
public interest that can justify the systematic violation of human privacy.”35
The same year, René Ariel Dotti gave a speech about the right to privacy in Brazil at
the annual meeting of the Brazilian Bar Association. Based on the French legislation
and the work of Alan Westin in the United States, Dotti argued that "the atmosphere
of oppression was taking over the country, making use of advanced technology".
For Dotti, RENAPE should be prohibited, as a similar national database had been
in Portugal, because it violated the fundamental right to privacy of citizens. In his
speech about the "normative treatment of private life," Dotti argued that Brazil
should use informatics to benefit citizens and that the "right to privacy should be
guaranteed by the constitution."36 Portugal and Spain, too, should serve as
inspirations for Brazil.
Despite the failed attempt at a General Data Protection Law in 1977, the discussion
about the right to privacy was strengthened and RENAPE was defeated. Nonetheless,
initiatives aiming to draw up a data protection framework in Brazil continued: Repre-
sentative Cristina Tavares (PMDB) presented Draft Bill 4646 in 1984 on the Right
to Privacy and Personal Databases, and Representative José Jorge (PDS) presented
Draft Bill 4766, also in 1984, on the Right to Privacy and Access to Databases.
In the following years, after the military government had abandoned the RENAPE
project, the Constituinte began working on a new set of rights such as habeas data
(the right to know what information about oneself is held by a public official or department).
From 1986 to 1988, members of the Constituinte discussed the fundamental rights
of Brazilians and approved a new constitution, which stated that the “right to inti-
macy and private life” was a fundamental right. It also included an article about the
fundamental right of access to information, guaranteed by the writ of habeas data.

33 Article 12. No one shall be subjected to arbitrary interference with his privacy, family, home
or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the
protection of the law against such interference or attacks.
34 We thank Jacqueline Abreu (University of São Paulo Faculty of Law) for gathering data and
kindly sharing her research on the right to privacy in Brazil. She searched the archives of Estado
de São Paulo and found many articles and news items critical of the Registro Nacional de Pessoas
Naturais.
35 Siqueira (1979), p. 24.
36 Dotti (1980), pp. 145–155.

These legal innovations formed the bedrock of the Brazilian system of data protec-
tion. During the 1990s and 2000s, this system evolved into a “patchwork model” of
different rules on data protection.37 In the 2010s, the Brazilian Congress finally
harmonized the normative system into a coherent General Data Protection Law. The
next section explores this transformation.

3 Personality and Data Protection Rights in a Fragmented Set of Legislation

In the previous section, we explored the evolution of Brazilian private law and the
struggles to establish constitutional protections for data privacy. In this section, we
show that the normative system of personal data protection benefited from a complex
process of unifying a fragmented set of legislation that was created over 20 years.
We also argue that personality rights are clearly present in the Brazilian General
Data Protection Law.

3.1 The “Principle of Consent” in Fragmented Legislation

After the promulgation of the 1988 Constitution, Brazil initiated its economic opening to
international markets and the construction of a "regulatory state"38 based on the
experience of the United States of America and many European and Latin Amer-
ican countries.39 The governments of Fernando Collor (1990–1992), Itamar Franco
(1993), and Fernando Henrique Cardoso (1994–2002) were dedicated to the recon-
struction of the Brazilian state, the creation of a strong internal market with interna-
tional standards of consumer rights (guaranteed by the Code of Consumer Defense
of 1990), and the creation of regulatory agencies with normative power. These agen-
cies were designed to regulate specific markets, promote competition, correct market
failures, and protect consumers.
The first law that dealt with issues of personal data protection after the constitu-
tion of 1988 was the Code of Consumer Defense (Federal Law 8.078/1990), which
includes a specific chapter on databases and profiles of consumers. The law states
that private firms are free to create consumer reports, profiles, and methodologies to
assess risks. However, consumers have the basic right to access information about
themselves. There is a basic right to retrieve information about what kinds of data are

37 Brazilian legal scholarship since the mid-2000s has pointed to the limits of this patchwork,
defending a comprehensive approach to personal data protection. See Doneda (2006). See also
Limberger (2007).
38 Prado (2012), pp. 300–326.
39 Dubash and Morgan (2012), pp. 261–281.

used in these profiles and databases. There is also a right to have incorrect information
corrected and to have data about the payment of debts deleted after five years.
The Law of Bank Secrecy (Complementary Law 105/2001) also imposed a series
of obligations on banks and financial institutions that work with their clients’ personal
data.40 The law says that consumers must consent to data collection processes and
that private firms have the obligation not to reveal (or not to share with third parties)
the personal data they possess.
Sharing of financial data is also covered by the Code of Consumer Protection. An
important case involving HSBC bank and decided by the Superior Court of Justice
in 2017 addressed the relationship between consent and bank secrecy.41 According
to the court, when people sign a credit card services agreement, they have the right
to choose whether or not to authorize the provision of their personal data and financial
information to other companies, even if they are partners of the company adminis-
tering the credit card services agreement. For this reason, the imposition of such
authorization in an adhesion contract is considered abusive and violates the principles
of transparency and trust in consumer relations. In deciding the case, Justice Luis
Felipe Salomão pointed out that, among basic consumer rights, protection against
unfair terms in the provision of products and services is one of the most important
provisions of the Consumer Protection Code (CDC).42 Salomão considered it abusive,
as a breach of the principles of transparency and trust in consumer relations, for the
credit card service not to offer the customer the possibility of rejecting the sharing of
data. For the Justice, the transfer of information, in addition to making the client
vulnerable, is not essential for the performance of the contracted service.
Sector-based regulatory norms also guaranteed the right of consent for personal
data collection and a series of obligations for those who process these data.43 The
Telecommunications Act (Law 9.476/1997), for example, stated that consumers of
telecommunication services must consent to data collection processes. Telecom-
munication operators must also handle their customers’ personal data with care.
There is a breach of law, enforced by the Agência Nacional de Telecomunicações
(Federal Agency of Telecommunications), if this information is shared with third
parties without the customers’ consent. The same rule applies to the sectors of
energy (Agência Nacional de Energia Elétrica—Federal Agency of Electricity) and
supplementary health (Agência Nacional de Saúde Suplementar—Federal Agency of Supplementary Health), for
example.
In 2010, the company Oi S.A. started to map Internet access habits of users of the
Velox service without their consent. The program used for data collection, called
Navegador, traced consumer profiles, which were then marketed to conduct targeted

40 Roque (2001).
41 Superior Court of Justice, Recurso Especial 1,348,523, ruled in September 2017, Justice Luis
Felipe Salomão, accessible under http://www.stj.jus.br. Accessed 11 Jan. 2022.
42 Scocuglia (2017), accessible under https://www.jota.info/justica/bancos-nao-podem-compartilhar-dados-de-clientes-diz-stj-13102017. Accessed 11 Jan. 2022.


43 Alves (2001), pp. 219–230.
46 D. Doneda and R. A. F. Zanatta

advertising. In 2014, the Department of Consumer Protection and Defense (DPDC)
filed an administrative proceeding to determine the irregularities committed. After
its conclusion, the National Consumer Secretariat (Senacon/MJ) fined the company
Oi S.A. R$ 3.5 million (more than US$ 1 million), based on the Code of Consumer
Protection and the violation of the Telecommunications Act.44
In 2002, the Brazilian Congress finally enacted the new Civil Code (Código Civil)
after decades of discussions, based on the work of Miguel Reale and other private-law
jurists. This Code was inspired by late twentieth-century European legal thinking. It
devoted an entire chapter to personality rights and protected the right of individuals
to fully develop their personhood (the idea that the State must guarantee the possibility
of developing one's capabilities and personality). It also protected the right
to private life and provided legal remedies and means of redress in case of violation
of these rights (Articles 20 and 21). The provision on the right to one's image, for
instance, established that no one may profit from a person's image without that person's
consent; if this occurs, civil liability arises for violation of the law.

3.2 The Freedom of Information Act and the Marco Civil da Internet

During Lula’s government (2003–2010), there was a turn in the national agenda.
Instead of focusing on market incentives, competition, and consumer welfare (traditional
models of economic regulation), the Brazilian government decided to focus
on the transformation of society caused by the Internet and a renewed agenda for
citizenship online.45
Backed by a partnership with the Open Government Partnership, coordinated by
the United States of America, Brazil discussed and enacted a Freedom of Information
Act (Law 12.527/2011). This law defines basic concepts of “personal data” and a set
of rights for citizens wanting to know about the performance of public officials or
to access information about the government. The Brazilian Freedom of Information
Act also defined a principle of consent for data sharing by public officials.
Finally, the “principle of consent” for the use of personal data on the Internet was
clearly established by the Internet Civil Rights Framework.46 This law, which was
constructed through a bottom-up procedure with considerable online participation,
defined basic rights for the use of the Internet in the country and has an entire
chapter on “basic rights of Internet users.” It says, in Article 7, that Internet users
have the right to clear information about how personal data are collected by Internet
Service Providers (ISPs) and Internet Application Providers (IAPs). It also says

44 Casemiro (2014), accessible under https://oglobo.globo.com/economia/defesa-do-consumidor/oi-multada-em-35-milhoes-por-invasao-de-privacidade-feita-por-velox-13348505. Accessed 11 Jan. 2022.
45 Matheus et al. (2012), pp. 22–29.
46 Medeiros and Bygrave (2015), pp. 120–130.
Personality Rights in Brazilian Data Protection Law … 47

that data collection and data processing must be grounded in "free, express, and
informed consent.” In other words, Internet users must be autonomous and must
agree with data processing before it happens. As argued by Laura Schertel Mendes
and Danilo Doneda, Marco Civil da Internet clearly affirms a “fundamental right of
data protection”47 and establishes the principle of consent as a foundational principle
for the application of the legislation.
Up until now, together with the Code of Consumer Defense, Marco Civil da
Internet has provided the legal basis for class actions and administrative procedures
focused on the violation of personal data protection on the Internet. In 2016, for
instance, the Brazilian Institute of Consumer Defense published a report on the
violations of Marco Civil in the “WhatsApp case” (the new terms of use imposed
by Facebook Inc. and the collection of metadata without clear user consent). The
institute argued that there was a violation of Article 7, VII, of Law 12.965/2014
because personal data was shared with “third parties” without clear, informed, and
express consent.48
Another important case was the class action of the Ministério Público Federal
(Brazilian Federal Public Prosecutor) against Microsoft in 2018 because of opt-out
mechanisms for data collection set by default in the Windows 10 operating system.
According to the federal prosecutors, the product violated Brazilian law because “it
collected personal data without clear and expressed consent from users, sentencing
to death the constitutional principles of dignity of the human person, the inviolability
of intimacy and private life, honor and image.”49
Until the enactment of the General Data Protection Law (Law 13.709/2018), both
Marco Civil da Internet and the Code of Consumer Defense provided a strong legal
basis for injunctions and collective redress.

3.3 The Credit Reporting Law and “Rights of Control”

During the early period of Dilma Rousseff’s government (2011–2016), Brazil enacted
an important piece of legislation on credit reporting which was inspired by the Fair
Credit Reporting Act of 1970 in the United States. Brazilian Law 12.414/2011 established
a legal framework for “positive credit reporting,” that is, a reporting system that
monitors the payment behavior of consumers, the use of financial products, and the
probability that a particular person will fulfill his or her financial obligations.50
The structure of this system, the Cadastro Positivo, rests on three pillars: (i)
the basic right of consent (credit bureaus can only open a positive credit report if
the consumer understands what is going on and agrees), (ii) the right of control

47 Doneda and Mendes (2014), pp. 3–20.


48 Zanatta (2016), accessible under https://www.idec.org.br/pdf/relatorio-whatsapp-termos-de-uso.
pdf. Accessed 11 Jan. 2022.
49 See http://www.mpf.mp.br/sp/sala-de-imprensa/docs/acp-microsoft. Accessed 11 Jan. 2022.
50 Bessa (2011).

(the consumer must have control over his or her data, with a set of basic rights
of access, modification, deletion, etc.), and (iii) the right of transparency and non-
harmful discrimination (consumers who suffer the consequences of a purely auto-
mated decision must have the right to a human analysis in order to diminish harmful
discrimination).51
As argued by Danilo Doneda and Laura Schertel Mendes in 2014, the Brazilian
Credit Reporting Law advanced personal data protection, making it an “autonomous
field”52 and detaching it from consumer protection and constitutional law. With
this new set of rights—mostly designed to ensure control, transparency, and non-
harmful discrimination—the Brazilian legal system approximated the European one.
Indeed, European Directive 95/46/EC and the draft version of the General Data Protection
Regulation (GDPR) inspired the rights of access, control, and transparency in the
Brazilian Credit Reporting Law.
This set of rights created by Law 12.414/2011 inspired the Ministry of Justice
to work on a draft version of a General Data Protection Law during Dilma
Rousseff’s government. There were two public consultations in a period of five
years and hundreds of contributions from experts, industry representatives, and
non-governmental organizations.53
In the next section, we will explore how the General Data Protection Law used
the legal background of these diverse laws and built something new in normative
terms, taking advantage of an established tradition of personality rights within the
Brazilian legal community.

4 The General Data Protection Law of 2018

Brazil’s first General Data Protection Law (LGPD) was enacted on August 14, 2018.54
The approved text has its origins in a proposal prepared by the Ministry of Justice in
2010 that went through years of official Internet debates (promoted by the ministry
in 2010 and 2015) as well as discussions with the government and stakeholders.55
The dynamics and conditions of the process which gave birth to the LGPD differ
in some relevant ways from what happened in other Latin American countries when
approving their own data protection legislation. Indeed, none of the data protection
laws previously enacted in Latin America had a concrete impact on the timing and
mood of the debate on the matter in Brazil, which was basically internal. This was the
case even though countries as close to Brazil as Argentina and Uruguay, both of which

51 Bessa (2011), pp. 45–65.


52 Doneda and Mendes (2014).
53 Boff and Fortes (2014), pp. 109–127.
54 Monteiro (2018), accessible under https://iapp.org/news/a/the-new-brazilian-general-data-protection-law-a-detailed-analysis/. Accessed 11 Jan. 2022.


55 Zanatta et al. (2018), accessible under https://www.accessnow.org/cms/assets/uploads/2018/06/Idec_TechnicalNote_DataProtection_20181.pdf. Accessed 11 Jan. 2022.



are members of the commercial bloc Mercosul, enacted their legislation and went
further in order to obtain adequacy status from the European Commission or, in the
case of Uruguay, become a signatory of Convention 108 of the Council of Europe.
One exception to this almost isolationist approach, however, was the participation
of Brazil in a working group in Mercosul (SGT13, the working sub-group on elec-
tronic commerce and digital certification). The group, after being encouraged by the
Republic of Argentina to elaborate a legal data protection framework for the coun-
tries of Mercosul (at that time Argentina, Brazil, Paraguay, and Uruguay), discussed
the proposal for five years until the four countries signed a data protection regula-
tion in 2010 which was meant to be evaluated by the executive branch of Mercosul
(Grupo Mercado Comum) at a later date. Even though the document ended up being
neither analyzed nor enacted as Mercosul legislation, this was the first (preparatory)
official document endorsed by Brazilian representatives pointing to the need to estab-
lish an internal data protection framework for Brazil (the Mercosul document would
ideally, if enacted, drive the member countries to approve their own data protection
regulations in compliance with the Mercosul standard).
In 2016, the Data Protection Bill prepared by the Ministry of Justice was sent by
the government to the National Congress, and in 2018, the final text of Law 13.709
was unanimously approved by both the Chamber of Deputies and the Federal Senate.
The President of the Republic, however, vetoed some provisions, most importantly the
creation of an autonomous Data Protection Authority (DPA), on the basis of formal
objections to the way the DPA was created.
The veto of the DPA was followed by an intense debate about its desirable nature
and shape. In the last days of his term of office, and following his own statement that
the DPA would be created before he left office, President Michel Temer issued the
executive law Medida Provisória MP 869 of 2018, creating the Autoridade Nacional
de Proteção de Dados, a DPA with no formal independence from the government
and linked to the Presidency of the Republic. Congress, however, still had to
deliberate on its creation and on several changes this executive law made to the LGPD.
The Brazilian data protection framework usually does not refer to a fundamental
right to data protection present in the Constitution, as the current jurisprudence of
the STF (Brazil's highest constitutional court) does not yet recognize a constitutional right
referring to personal data. That differs from the GDPR, which is strongly grounded in
the conceptualization of data protection as a fundamental right as enshrined in Article
8 of the Charter of Fundamental Rights of the European Union56 and recognized by
Recital 1 of the GDPR.
Despite the lack of a literal provision in the Brazilian Constitution, the LGPD
nevertheless contains several references to fundamental rights and personality rights,
both in the formal sense (as fundamental references) and in the structural
sense (the use of personality-right techniques).
In this sense, the presence of personality rights as the driving force for the protec-
tion of citizens by means of a series of provisions on the use of their personal data is
made exceptionally clear in the first article of LGPD, which not only mentions the

56 Rodotà (2009), pp. 77–82.



law’s aim to protect the fundamental rights of freedom and privacy, but also the free
development of the personality of natural persons. The notion of the free develop-
ment of personality is strictly linked to the theory of rights of personality in Brazilian
private law,57 and its mention stresses the fact that this model was the main LGPD
reference for translating the rationale of the tradition of data protection to Brazilian
law. Indeed, the “free development of personality”58 is mentioned twice in LGPD.
First, in Article 1,59 which defines the scope of the law. Second, in Article 2,60 when
mentioning the basis of data protection, together with human rights, dignity and the
exercise of citizenship.61
Article 2 is also a central article regarding the issue of personality rights in the
LGPD for another reason: it clearly mentions that most of the grounds upon which the
discipline of data protection is based are related to personality rights: the right to
privacy (Article 2, I), the inviolability of intimacy, honor, and image (Article 2, IV), the
aforementioned Article 2, VII, but also expressions of fundamental rights strongly
related to personality rights such as freedom of expression, communication, and
opinion (Article 2, III), as well as the presence of the concept of informational self-
determination (Article 2, II), directly quoting the Recht auf informationelle Selbstbe-
stimmung from the 1983 ruling by the German Constitutional Court.62
In several places, the body of the LGPD demonstrates that some of its instruments
were shaped so that, in situations where personality rights demand a higher level of
protection, these rights must be given particular and unavoidable consideration. That is
the case, for example, for the general rule of verifying the admissibility of legitimate
interest as a basis for personal data processing, which is available only where the
fundamental rights and freedoms of the data owner do not prevail over it, thus creating
a concrete limit on the use of legitimate interest grounded in the protection of the data
owner (Article 7, IX).
Another clear example of protection of personality is the concept of sensitive data,
which demands a higher level of protection because there is a higher potential for
damage to the data owner. The LGPD went even further when it strictly forbade all
communications of health data for economic purposes (Article 11, § 4).
Some procedural provisions in the LGPD also strongly resonate with tools which are
commonly used to ensure personality rights, and perhaps the most important of them
is the possibility given by Article 22 to make broad use of the available procedural
instruments, whether individual or collective.

57 See part 2 of this contribution.


58 Martins-Costa (2011), pp. 813–840.
59 Article 1 reads: “This Law provides for the processing of personal data, including by digital

means, by a natural person or a legal entity of public or private law, with the purpose of protecting
the fundamental rights of freedom and privacy and the free development of the personality of the
natural person”.
60 Article 2 reads: “The discipline of personal data protection is grounded on the following: (…)

VII—human rights, free development of personality, dignity and exercise of citizenship by natural
persons”.
61 Rouvroy and Poullet (2009), pp. 45–76.
62 As to this right, cf. Albers (2005).

5 Conclusion

Brazil lagged behind many other Latin American countries in introducing a General
Data Protection Law, doing so only in 2018. One of the reasons for the long duration
of this process was the novelty and even the complexity of some of the key tools and
concepts in the new legislation, many of them not yet present in Brazilian law, but
necessary to achieve harmonization with international and transnational standards
on data protection. One clear sign of this was the vacatio legis period established
before the LGPD entered into force: originally 18 months, it was later extended to
24 months by Medida Provisória 869 of 2018, an uncommonly long period by
Brazilian legal standards.63
For the LGPD to fulfill its mission to provide Brazil with modern data protection
legislation in order to protect the individual, it will be important to recognize the
strong link between the new legislation and the doctrine and tradition of the rights
of personality, for at least three reasons.
First, as mentioned, the fact that the recognition of data protection as a fundamental
right is still a work in progress in Brazilian jurisprudence makes it essential to ensure
that data protection is strongly linked with the tradition of personality rights, ensuring
that the protection of individuals in the face of the information society can draw on the
set of tools provided by private law to secure the data owners' position regarding the
processing of their data.
Second, bridging personal data protection and personality rights can prevent
an excessive "technicization" of the field within the legal community. Such
"technicization" could transform personal data protection into something similar
to, for instance, telecommunication regulation, making it a "small field" for a
highly specialized group of lawyers basically concerned with compliance and
economic regulation. Personal data protection, however, is not only about economic
regulation but essentially about fundamental rights and personality rights.
Third, the Brazilian legal community was able to take advantage of its long tradi-
tion of personality rights, as explained in Part 2 of this chapter. By reshaping person-
ality rights into personal data protection—making it clear that personal data protec-
tion is about the free development of personality, informational self-determination,
and risk regulation—, Brazilian lawyers and scholars can think more clearly about this
new field of rights and obligations without abandoning a well-established tradition
and its humanistic character.
We do not need to abandon legal thinking from the past. We can use the best
that this legal thinking produced in private law to think about contemporary prob-
lems caused by information technology and the digitalization of society. This is the
challenge for the application of the General Data Protection Law.

63The LGPD, except articles 52–54, finally entered into force on September 18, 2020; the rules on
sanctions in articles 52–54 came into effect on August 1, 2021.

References

Albers M (2005) Informationelle Selbstbestimmung. Nomos, Baden-Baden


Barroso LR (1998) Dez anos da Constituição de 1988. Rev de Direito Administrativo 214:1–25
Bessa LR (2011) Cadastro positivo: comentários à Lei 12,414, de 09 de junho de 2011. Editora
Revista dos Tribunais, São Paulo
Boff SO, Fortes VB (2014) A Privacidade e a Proteção dos Dados Pessoais no Ciberespaço como
um Direito Fundamental: perspectivas de construção de um marco regulatório para o Brasil.
Seqüência: estudos jurídicos e políticos 35(68):109–127
da Silva Pereira CM (2011) Instituições de Direito Civil, 24th edn. Editora Forense, Rio de Janeiro
De Cupis A (1950) I Diritti Della Personalità. Giuffrè, Milano
Doneda D (2006) Da privacidade à proteção de dados pessoais. Renovar, Rio de Janeiro
Doneda D, Mendes LS (2014) Data protection in Brazil: new developments and current challenges.
In: Gutwirth S, Leenes R, De Hert P (eds) Reloading data protection. Springer, Dordrecht, pp
3–20
Dotti RA (1980) Proteção da vida privada e liberdade de informação: possibilidades e limites.
Editora Revista dos Tribunais, São Paulo
Dubash NK, Morgan B (2012) Understanding the rise of the regulatory state of the south. Regul
Gov 6(3):261–281
Faria JE (1981) Direito, modernização e autoritarismo: mudança socioeconômica × liberalismo
jurídico. Edusp, São Paulo
Faoro R (1981) Assembléia Constituinte: a legitimidade recuperada. Brasiliense, Brasília
Gierke O von (2016) The social role of private law (trans: McGaughey E). German Law J 19(4)
Gomes O (1966) Direitos da personalidade. Rev de Informação Legislativa 3(11):39–48
Jeová Santos A (1997) Dano Moral Indenizável. Lejus, São Paulo
Kotzur M (2022) Privacy protection in the world wide web—legal perspectives on accomplishing
a mission impossible. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the
internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Limberger T (2007) O direito à intimidade na era da informática: a necessidade de proteção de
dados pessoais. Livraria do Advogado, Porto Alegre
Lopes JR, Neto G, Macedo P (2009) Critical legal thought: (1920–1940): the case of Brazil.
Fundação Getulio Vargas working papers
Martins-Costa J (2011) O princípio do livre desenvolvimento da personalidade. In: Lobo Torres R
(ed) Dicionário de Princípios Jurídicos. Elsevier, Rio de Janeiro, pp 813–840
Mattia FM (1979) Direitos da personalidade: aspectos gerais. Rev de Informação Legislativa
14(56):245–262
Medeiros Alves OM (2001) Agências reguladoras e proteção do consumidor de serviços de
telecomunicações. Rev de Direito Administrativo 226:219–230
Merryman JH (1977) Comparative law and social change: on the origins, style, decline & revival
of the law and development movement. Am J Comp Law, 457–491
Molinaro CA, Ruaro RL (2022) Privacy protection with regard to (tele-)communications surveil-
lance and data retention. In: Albers M, Sarlet IW (eds) Personality and data protection rights on
the internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Pilatti A (2008) A Constituinte de 1987–1988: progressistas, conservadores, ordem econômica e
regras do jogo. Editora PUC-Rio, Rio de Janeiro
Prado MM (2012) Implementing independent regulatory agencies in Brazil: the contrasting
experiences in the electricity and telecommunications sectors. Regul Gov 6(3):300–326
Reis T (2015) Teixeira de Freitas, lector de Savigny. Rev de historia del derecho 49:181–222
Rodotà S (2009) Data protection as a fundamental right. In: Gutwirth S, Poullet Y, De Hert P (eds)
Reinventing data protection? Springer, Dordrecht, pp 77–82
Roque MJ (2001) Sigilo bancário e direito à intimidade. Juruá, Curitiba

Rouvroy A, Poullet Y (2009) The right to informational self-determination and the value of self-
development: reassessing the importance of privacy for democracy. In: Gutwirth S, Poullet Y, De
Hert P (eds) Reinventing data protection? Springer, Dordrecht, pp 45–76
Sacco R (1991) Legal formants: a dynamic approach to comparative law (Installment I of II). Am
J Comp Law 39(1):1–34
Santiago Dantas F (2001) Programa de Direito Civil: teoria geral. Forense, Rio de Janeiro
Sarlet IW (2022) The protection of personality in the digital environment. An analysis in the light
of the so-called right to be forgotten in Brazil. In: Albers M, Sarlet IW (eds) Personality and data
protection rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this
volume)
Schertel Mendes L (2014) Privacidade, proteção de dados e defesa do consumidor - Linhas gerais
de um novo direito fundamental. Saraiva, São Paulo
Tácito C (1959) O Abuso do poder administrativo no Brasil: conceito e remédios. Rev De Direito
Administrativo 56:1–28
Tepedino G (1999) A tutela da personalidade no ordenamento civil-constitucional brasileiro. In:
Temas de direito civil. Forense, Rio de Janeiro, pp 23–54
Trubek DM (1972) Toward a social theory of law: an essay on the study of law and development.
Yale Law J 82(1):1–50
Zanatta R (2016) Consentimento Forçado: uma avaliação sobre os novos termos de uso do WhatsApp
e as colisões com o Marco Civil da Internet. Instituto Brasileiro de Defesa do Consumidor, São
Paulo

Danilo Doneda Ph.D. in Civil Law (UERJ). Professor at IDP. Director at CEDIS/IDP (Center
of Studies in Internet and Society). Member of the advisory board of the United Nations Global
Pulse Privacy Group, the Project Children and Consumption (Instituto Alana) and Open Knowl-
edge Brasil. Previously served as General Coordinator at the Department of Consumer Protection
and Defence in the Ministry of Justice (Brazil). Visiting researcher at the Italian Data Protection
Authority (Rome, Italy), University of Camerino (Camerino, Italy) and at the Max Planck Institute
for Comparative and International Private Law (Hamburg, Germany). Part of his work is available
at www.doneda.net/.

Rafael A. F. Zanatta Director of the Research Association Data Privacy Brasil. PhD Candi-
date at the University of São Paulo. Main areas of research: Data Protection Rights, Collective
Rights, Data Commons, Legal Theory and Sociology of Law. Selected Publications: A Proteção
de Dados entre Leis, Códigos e Programação: os limites do Marco Civil da Internet, in: Newton
de Lucca/Adalberto Simão Filho/Cíntia Rosa Pereira (eds.) Direito e Internet III: Marco Civil
da Internet. São Paulo: Quartier Latin, 2015, pp. 447–470; Economias do Compartilhamento e o
Direito, Curitiba: Juruá, 2017; Dados, Vícios e Concorrência: repensando o jogo das economias
digitais, Revista Estudos Avançados, v. 33, n. 96, 2019, pp. 421–446 (with Ricardo Abramovay);
Proteção de dados pessoais e direito concorrencial: razões de aproximação e potencialidades de
pesquisa, Revista Fórum de Direito da Economia Digital, Belo Horizonte, v. 3, 2019, pp. 141–
170 (with Bruno Renzetti); Dados pessoais abertos: pilares dos novos mercados digitais, Direito
Público, v. 16, n. 90, 2019, pp. 155–178 (with Ricardo Abramovay).
Realizing the Fundamental Right to Data
Protection in a Digitized Society

Jörn Reinhardt

Abstract Article 8 of the EU Charter of Fundamental Rights (CFR) guarantees the
protection of personal data and sets forth the main principles of data protection. This
chapter explores the implications of this fundamental right for a digitized society.
First, it addresses specific problems of the understanding of Article 8 CFR. This
includes the concept of data protection as a fundamental right itself: What are the
(tangible) harms of data processing? What is the object of protection? Further, it
elaborates on its meaning and scope with respect to private parties. According to
Article 51 CFR, the provisions of the Charter are addressed to the institutions and
bodies of the Union and the Member States when they are implementing Union law.
Private parties are not explicitly mentioned. However, fundamental rights standards
of data protection apply also to private companies. As a fundamental right, Article
8 CFR produces a “horizontal effect”. Despite several decisions of the Court of
Justice of the European Union (CJEU), in particular the Google Spain judgement,
the conceptual frame and the scope of the horizontal effect remain unclear. In order
to clarify the possible effects of Article 8 CFR between private parties, it is necessary
to explicate the doctrinal grounds on which these effects can be construed (direct
and indirect horizontal effects, negative and positive obligations). On this basis, the
chapter highlights regulatory implications for the data economy.

1 The General Data Protection Regulation (Regulation (EU) 2016/679 of the European Parliament

and the Council of 27 April 2016 on the protection of natural persons with regard to the processing of
personal data and on the free movement of such data—GDPR), which came into force in May 2018,
was certainly a major landmark. However, the process of “modernizing” data protection legislation
is ongoing and concerns a variety of projects on the level of secondary EU law. In particular, data
protection and e-privacy aspects still need to be coordinated. The reform of the e-privacy directive
is not yet completed.

J. Reinhardt (B)
University of Bremen, Bremen, Germany
e-mail: joern.reinhardt@uni-bremen.de

© Springer Nature Switzerland AG 2022


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_4
56 J. Reinhardt

1 Introduction

The European Union has been renewing its legal framework for data protection in
the light of an increasingly digitized and “datafied” society.1 As threats to privacy
evolved, the European Charter of Fundamental Rights (CFR) acknowledged data
protection as a fundamental right. The “right to the protection of personal data”
differs in its degree of abstraction from the other “freedoms” guaranteed under Title
II of the Charter. Although Article 8 CFR has been in force for almost twenty years
now, it still provokes a number of elementary questions concerning its interpretation
and understanding. What does the “right to the protection of personal data” actually
protect? Obviously, it is not the data itself, but rather individuals and their interests
and, on a more abstract level, the integrity of social practices. If this is the case, then
what is the relationship between the right to data protection and the other fundamental
rights provisions of the Charter, especially the right to privacy (Article 7 CFR)?
Apart from the questions concerning the understanding of the right as such, the
effects of the right to data protection vis-à-vis private actors often remain unclear.
According to Article 51 CFR, the provisions of the Charter address the institutions
and bodies of the European Union and the member states when implementing EU law.
However, challenges for the protection of personal information not only result from
state action, but also from the inherent logic of the data economy.2 The dominant
advertising business model that is shaping large parts of internet communication,
so-called “surveillance capitalism”, implies that personal data is being “harvested”
systematically and comprehensively.3 Big data technologies enable innovation and
technological progress in many areas, but also bring with them various dangers for
personal rights and can even threaten our understanding of equality and democratic
processes.4
This contribution explains the objectives and central components of the funda-
mental right to data protection and then turns to the regulatory implications for the
data economy. Obviously, not every fundamental rights issue of digitization can be
solved by means of data protection. Systemic undesirable developments such as
the accumulation of information power require approaches that extend way beyond
Article 8 CFR. Nevertheless, the protection of personal data remains a crucial aspect
for the digitized public sphere.

2 A digitized society not only produces data to an exponentially increasing extent, but reproduces
itself by means of this data and is particularly dependent on it (cf. Mayer-Schönberger and Cukier
2013). Therefore, digitization does not only refer to the technical process of “datafication”, i.e. of
converting analog aspects of the life-world into data and information. The concept also refers to the
social effects (Baecker 2018), p. 59. The process of digitization changes the texture of social life
and social practices. The transformative circumstances do not result from any one technology alone, but from
the combination of “radical” technologies and technological developments such as the internet of
things, augmented reality, machine learning and artificial intelligence (Greenfield 2017), p. 253.
3 Zuboff (2019), p. 128.
4 O’Neil (2016), p. 199.
Realizing the Fundamental Right to Data Protection … 57

2 Mechanisms and Objectives of Data Protection

Article 8(2) CFR authorizes the processing of personal data only if certain conditions
are met. It provides that personal data “must be processed fairly for specified purposes
and on the basis of the consent of the person concerned or some other legitimate basis
laid down by law”. While the open formulation in Article 8(1) CFR (“right of every
person to the protection of personal data concerning him or her”) does not yet suggest
a particular data protection concept, paragraph 2 introduces further requirements that any
processing activity has to fulfill in order to be lawful. Processing of personal data is
only possible on a legal basis if it is sufficiently specific as to the purposes for which
the data is collected, used, processed and shared. The need to specify the purpose of
data processing is a mechanism that is supposed to safeguard the rights of the data
subject. It has a restrictive aspect insofar as it requires limiting data processing to a
specified purpose (“principle of purpose limitation”).
Nevertheless, the goal and the purpose of data protection remain largely under-
determined. At the outset, the CJEU adopts a rather technical approach. According to
the Court’s jurisprudence, any operation performed upon data or information relating
to an identified or identifiable natural person is relevant in terms of Article 8 CFR.
While this is consistent with secondary law definitions, it has its difficulties. For
instance, there are data analysis technologies that make it possible to trace back
even supposedly anonymous data to a certain individual. Whether or not data or
information can be assigned to a natural person depends very much on the required
effort, the context and the respective information technology. Therefore, further
criteria are needed in order to assess whether data processing actually interferes
with fundamental rights positions or not. At the latest when it comes to balancing the
conflicting fundamental rights positions and legal interests, one needs a substantive
account of why data protection is important. The debate about the goal and purpose of
data protection is as old as the concept itself and cannot be avoided.5 Unsurprisingly,
there are plenty of suggestions for interpretation. It goes without saying that data
is not protected for its own sake. Prominent reasons for data protection include
privacy and informational self-determination.6 In addition, there are systemic and
functional aspects such as trust.7 Here, legal discourse merges seamlessly with
ethical, political, sociological and information-technology arguments. This only
increases the complexity.
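The re-identification risk mentioned above can be made concrete with a minimal, purely hypothetical sketch of a classic linkage attack: two datasets that each look harmless on their own are joined on shared quasi-identifiers (here invented postcode, birth-year and sex values), linking a "de-identified" record back to a named person. All names, records and field choices are illustrative assumptions, not data from any real source.

```python
# Hypothetical linkage attack: joining a "de-identified" dataset with a
# public register on quasi-identifiers. All records are invented.

# "Anonymized" health records: direct identifiers removed,
# quasi-identifiers retained.
health_records = [
    {"postcode": "20148", "birth_year": 1976, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "20095", "birth_year": 1982, "sex": "M", "diagnosis": "diabetes"},
]

# Publicly available register (e.g. a voter roll) containing names.
public_register = [
    {"name": "A. Example", "postcode": "20148", "birth_year": 1976, "sex": "F"},
    {"name": "B. Sample", "postcode": "20095", "birth_year": 1982, "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def reidentify(records, register):
    """Link each 'anonymous' record to every register entry sharing
    the same combination of quasi-identifiers."""
    matches = []
    for record in records:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        for person in register:
            if tuple(person[q] for q in QUASI_IDENTIFIERS) == key:
                matches.append((person["name"], record["diagnosis"]))
    return matches

print(reidentify(health_records, public_register))
# [('A. Example', 'asthma'), ('B. Sample', 'diabetes')]
```

Whether such a linkage succeeds in practice depends, as noted in the text, on the required effort, the context and the available auxiliary information; the sketch merely illustrates why the removal of direct identifiers alone does not take data outside the reach of Article 8 CFR.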
The aim of the debates around the goal or purpose of data protection is to establish
a normative foundation for the fundamental right, conceptually and dogmatically.
Any explanation of the purpose of data protection, however, is reductionist if data
protection is limited to a single value or principle.8 This first became clear in view of

5 Cf. Albers (2005), p. 361 on the coherence and plausibility of specific paradigms.
6 Cf. Simitis (1987) for a detailed analysis of the “quest for a concept” of privacy and the difficulties that
arise for the concept of privacy in an information society.
7 Waldmann (2018), p. 77.
8 Albers (2017), p. 12. It is therefore not surprising that data protection does not serve one specific purpose. Veil (2018) remains critical of a plurality of purposes.


58 J. Reinhardt

the relationship between data protection and privacy, whereby the fundamental right
of data protection needed to be detached from privacy protection and the protection
of particular private information.9 In an increasingly digitized society, the handling
of personal data can jeopardize freedom in a broader sense. This is true, among other
things, for the integrity of media, communications and democratic processes as well
as legitimate expectations of privacy. Since data protection cannot be limited to one
specific goal, the protection requirements vary.
This is apparent in the Court’s jurisprudence: The CJEU emphasizes the close
connection between data protection and privacy (Articles 8 and 7 CFR), but under-
stands Article 8 CFR as an independent guarantee10 that is of equal importance for
the exercise of other freedoms, in particular freedom of communication.11

3 Fundamental Rights as Subjective Rights and Objective Principles

As already noted, the provisions of the Charter are addressed explicitly to the insti-
tutions and bodies of the European Union and to the member states when they are
implementing European Union law. Article 51(1) CFR does not mention private
parties. That does not mean, of course, that fundamental rights standards have no
implications for private individuals or companies. Fundamental rights not only serve
as guarantees against disproportionate state interventions, limiting state interferences
and guaranteeing negative freedom, but also entail positive obligations of aid and
protection. The Charter contains sufficient elements to substantiate such obligations.
In order to understand the implications of Article 8 CFR on the data economy, it is
important to recall these elements.

3.1 Positive Obligations and Horizontal Effects

Fundamental rights in general and the right to data protection in particular not only
have a negative, but also a positive dimension. The concept of a fundamental right as
merely a negative freedom goes back to the distinction between “state” and “society”
developed in eighteenth century political thought.12 In the liberal tradition, funda-
mental rights were guarantees against government power. Consequently, freedom
was understood as “negative” freedom from state intervention. This understanding of
fundamental rights and the related concept of negative freedom encountered various
objections from the beginning. Above all, the understanding (and the scope and

9 Cf. only Gratton (2013), p. 222.


10 Reinhardt (2017), p. 540.
11 CJEU, 21.12.2016, C-203/15 and C-698/15 – Tele2 Sverige et al., para 92.
12 Grimm (1987), p. 11.

content) of fundamental rights positions was changed by the realization that societal
self-regulation does not produce a fair balance of interests.
The same debate has taken place in the Information Age. Again, the understanding
of the function and the role of fundamental rights corresponds to certain models of
order in the digital sphere.13 At the beginning of the internet era, the idea resonated
that the internet was a functioning (public) sphere as long as it was shielded from
external interventions. The “Declaration of the Independence of Cyberspace” proclaimed
by activist J. P. Barlow at the World Economic Forum in Davos in 1996 made use
of a rather naïve notion of freedom: “Governments of the Industrial World, […] I
come from Cyberspace, the new home of Mind. On behalf of the future, I ask you
of the past to leave us alone.” It not only quickly became apparent that government
influence would persist, but that self-regulation can also have dysfunctional effects.
For instance, the exercise of fundamental freedoms by some can easily undermine the
freedom of others.14 This reality has become increasingly familiar in the wake of the
development of big data technologies and the “datafication” of everyday life. Because
many do not perceive the inherent restrictions on their freedom, their submission
is voluntary. The possibilities of surveillance and control are becoming more and more
numerous. Nor is it just about monitoring online behavior—with the Internet of
Things (which has become a veritable Internet of Everything) any expression of life
can potentially be tracked and followed.
In view of the accumulation of data and information by large tech companies and
the global character of the Internet, the private/public distinction and state-centred
concepts of fundamental rights have obvious limits.15 However, it would not be plau-
sible to transfer the traditional obligations of the state to private actors without adap-
tations.16 Beyond the few tech giants, the data economy is extremely heterogeneous.
Establishing direct obligations on the part of private companies does not necessarily
mean that the affected users and data subjects will enjoy stronger fundamental rights
protections. An extension of the effect of fundamental rights does not change the
fact that private parties can, for their part, invoke fundamental rights
themselves. This does not mean, of course, that the level of protection against the big tech
companies must fall short of the obligations of public authority. With the respective
responsibilities of the private parties in question, the scope of fundamental rights obli-
gations can in fact be equivalent to the direct obligations of the state in comparable
constellations (or even more extensive).17 Special responsibility, however, requires a
corresponding justification. There are a number of criteria for assessing the “degree”
of fundamental rights obligations of private parties. Mere economic or social power
as such does not entail a special responsibility for fundamental rights. Fundamental

13 Cf. already Karavas (2007), p. 36. On the different types of “horizontality” see Frantziou (2019),
p. 37.
14 Goldsmith and Wu (2006), p. 129.
15 Cf. also Kotzur (2022), in this volume.
16 Fischer-Lescano (2016), p. 163.
17 At least, this would be in line with the jurisprudence of the German Federal Constitutional Court

for private operators of public spaces (see, for further references, Hochmann and Reinhardt 2018).

rights obligations extend to private actors inasmuch as these actors set the conditions
for the exercise of the fundamental freedoms of others.18
The straightforward legal obligations aside, human and fundamental rights are an
important element in the “self-constitutionalisation” of certain societal spheres. The
big tech companies are increasingly using fundamental rights standards as guidelines
for self-regulation. One prominent example is the evolution of community standards
for managing user-generated content on the big social networks and communication
platforms. Tech companies realize that they have obligations towards their users that
go beyond a mere contractual relationship. Due process guarantees, fundamental
rights or international human rights are touchstones in the formulation of community
standards.19
Whatever the specific function and character of the rights in question, the negative
dimension of fundamental rights and the protection against state intervention are
clearly not sufficient. Systemic dangers demand systemic responses. Article 8 CFR
was explicitly drafted as a “right to protection of personal data” and demands that the
level of protection as well as the respective instruments be adapted to the individual
risks.

3.2 The Primacy of Legislation

The legal responsibility for realizing Article 8 CFR therefore falls to European legis-
lators as well as the data protection authorities and the courts, in particular the
European Court of Justice. Specifically in the area of data protection law, the legis-
lators regularly emphasize the need to effectively protect fundamental rights.20 The
declared aim of the GDPR is the protection of fundamental rights (Article 1 para
2 GDPR). In realizing fundamental rights, questions relating to the separation of
powers have to be taken into account.21 One institutional consequence of the distinction
between negative and positive obligations is that the political legislature has a rather
broad scope for action. Positive obligations are not as determined as negative ones.
The legislator usually has a broad range of options for fulfilling fundamental rights
obligations. The rights of the Charter limit the scope of legislative decisions in two
ways. As a positive obligation, Article 8 CFR calls for a data protection concept that
is sufficiently effective. As negative obligations, the rights of the Charter ensure that
interventions in favor of privacy and data protection satisfy the justification require-
ments of the fundamental rights positions in question. Between these two poles,
there is considerable legislative leeway, which permits the legislator to react to the

18 Cf. Reinhardt (2019) with further references.


19 Cf. also for a proposal to establish self-regulation mechanisms through code Hartmann (2022),
in this volume.
20 Cf. Lynskey (2015), p. 35.
21 Möllers (2013), pp. 105 and 174.

various complex challenges of a digitized society.22 Ultimately, a coherent regulatory concept will have to make the various fundamental rights positions compatible,
while also taking into account the inherent logic and influence of technologies.23

3.3 The Multipolarity of Fundamental Rights Protection

The primacy of legislative decisions comes to an end, of course, when the legislature
operates with open and general norms. In such cases, responsibility almost inevitably
shifts to the courts.24 The CJEU is poised to take an active role. As the Google Spain
ruling shows, the court derives far-reaching requirements for the design of the legal
system from Articles 7 and 8 CFR. By creating “a right to be forgotten”,25 the court
actively interfered in the ongoing legislative process. Even though it is generally
compatible with the role of a high or supreme court to identify and counter systemic
problems,26 basic requirements have to be met. This includes taking into account the
multipolarity of fundamental rights relations. The protection of personal data vis-à-
vis private actors does not adhere to the same model concerning the justification of
state interferences with its strict proportionality test. It requires balancing the various
interests and rights positions. The balancing does not only involve the freedom of
information of search engine users on the one side and the privacy rights of those
affected on the other. It also involves the service provider and its overall role and
function in the communicative sphere. The fundamental right to data protection must
be seen, as the CJEU puts it, “with regard to its social function” and in relation to other
legal positions. They range from the information interests of users of search engines
and social media platforms and the interests of service providers in improving or
further developing their applications to the interest of rights holders in the use of
software to protect their works against copyright infringements and of homeowners
in the use of surveillance cameras against vandalism.
If the “multipolarity” of fundamental rights relations is not taken into account, the
results will be one-sided and possibly dysfunctional. This is not always reflected in

22 Another possibility for a court to respect its functional boundaries is to formulate general norma-
tive requirements and leave it up to legislators to implement them in secondary data protection law.
Similar strategies apply with respect to the member states. In the Ryneš case, the Court of Justice
did not itself carry out the balancing test, but left it to the referring Czech court to do so. CJEU,
11.12.2014, C-212/13, para 34. – Ryneš. Cf. also CJEU, 29.1.2008, C-275/06 – Promusicae as well
as CJEU, 9.3.2017, C-398/15 – Manni. This is one possible approach for preserving a certain diver-
sity in the European system of fundamental rights protection. Cf. for this debate Franzius (2015),
passim.
23 Howard (2015), p. 255.
24 Despite the comparatively tight data protection regulations in the non-public realm, the courts still have considerable discretion in weighing the conflicting fundamental rights positions. Cf. CJEU,
13.5.2014, C-131/12 – Google Spain and CJEU, 11.12.2014, C-212/13 – Ryneš.
25 On this subject in detail Sarlet (2022), in this volume, and Schimke (2022), in this volume.
26 Cf. Albers (2012), p. 266.

the Court’s methodological approach. In Google Spain, the CJEU proceeds in such a
way that data processing is initially seen as an interference in Articles 7 and 8 CFR.
This interference is then tested for proportionality.27 From this point of departure,
the Court concludes that the rights protected by Articles 7 and 8 CFR outweigh
conflicting interests.28 However, there is a difference between the level of secondary
data protection law and the level of fundamental rights which must be respected.
A change in methodology does not imply that intermediaries cannot be obliged to
protect privacy and personal data, but would shift the justification requirements.
In a recent decision, the CJEU refers to the case law of the European Court of
Human Rights (ECtHR) on press publications in order to balance the right to privacy
and the right to freedom of expression.29 The criteria which must be taken into
account are therefore, inter alia, contribution to a debate of public interest, the degree
of notoriety of the person affected, the subject of the news report, the prior conduct of
the person concerned, the content, form and consequences of the publication, and the
manner and circumstances in which the information was obtained and its veracity.30

4 Standards for Data Processing in the Digital Economy

The requirements of Article 8 CFR that apply to state interventions do not auto-
matically apply to the data processing by private parties. Data processing by public
authorities is subject to the precautionary principle, the need for a legal basis when
processing personal data and the requirements of purpose definition and purpose
limitation. At the same time, data processing by private companies clearly can have
severe implications not only for private individuals, but also for society as a whole.
“Processing” of personal data, however, is a fundamental operation in a digitized
society that ranges from individual street photographers and the surveillance cameras
of private homeowners to the big social media companies that not only design indi-
vidual services, but entire ecosystems in which they permanently observe users and
make them the subject of ongoing behavioral experiments.31 Given the heterogeneity
of the digital economy, the scope of the fundamental right to data protection cannot
be conceived in a uniform way. This is all the more the case as it remains unclear
whether the concepts of data protection against state interventions are useful to regu-
late the information economy as a whole. A regulatory concept that is sensible with
regard to state intervention is not necessarily sensible for the data economy in which

27 CJEU, 13.5.2014, C-131/12 – Google Spain, para 80.


28 CJEU, 13.5.2014, C-131/12 – Google Spain, para 81.
29 CJEU, 14.2.2019, C-345/17 – Buivids, para 65. According to Article 52(3) CFR, the Charter

rights are to be given the same meaning and the same scope as the corresponding rights of the
Convention for the Protection of Human Rights and Fundamental Freedoms, signed at Rome on 4
November 1950 (ECHR). Article 7 CFR contains rights which correspond to those guaranteed by
Article 8(1) ECHR. The same applies for Article 11 CFR and Article 10 ECHR.
30 Ibid. para 66.
31 Zuboff (2019), p. 199.

the disclosure of personal data is no longer exceptional. The mere need for high
standards of protection does not mean that the concept of protection must be the
same.32
As we have seen, the fundamental rights obligations of private actors depend
very much on the context. Article 8 CFR contains broad principles that can limit
private autonomy and economic freedoms in various ways. The abstract principles
of the fundamental right to data protection can result in very different regulatory
approaches. While it is the task of the legislator to enact general data protection rules,
the Court of Justice must take account of the principles set out in Article 8 CFR when
applying secondary law in general and GDPR rules in particular.33 The fundamental
rights principles have a normative force and shape the understanding of various
data protection instruments as set out in the regulation. This not only concerns the
mechanism of consent (Sect. 4.1). The principles are also important for the question
of the extent to which the interests of the data subject can be objectified (Sect. 4.2). Important
decisions on processing personal data cannot be left entirely to the contracting
parties. Article 8 CFR demands a legal framework that distributes the opportunities
and risks of data use. Finally, they have consequences for the understanding of the
rights to information (Sect. 4.3) and the guarantee of data security (Sect. 4.4).

4.1 Consent and Control

One important dimension of data protection as a fundamental right concerns the indi-
vidual’s control over personal data and information. However, the conceptualization
of data protection in terms of the aspect of control, especially individual control, has
notable limitations. One issue involves the role of consent as a legitimate ground
for the processing of personal data. Even though there is a long-standing critique,
especially from behavioral economics scholars, of the idea of data protection as
control, it still predominates.34 As Woodrow Hartzog recently put it with a view
to European secondary law: “an empire of data protection has been built around
the crumbling edifice of control”.35 Consent requirements are supposed to ensure a
self-determined handling of personal data. As an instrument of (informational) self-
determination, however, consent is as elementary as it is demanding.36 A conception
of informational self-determination that overestimates individual abilities can easily
have negative effects: “The idealisation of control in modern data protection regimes

32 Cf. Buchner (2006). European legislators have opted for a high degree of convergence of stan-
dards for data processing in the private and public sector. This does not mean, however, that a
fundamentally different concretization would not have been possible and that the legislator does
not have a far-reaching scope for shaping data protection in the private sector in a different manner.
Cf. Reinhardt (2017), p. 556. See also Marsch (2018), p. 269.
33 Unseld (2018), p. 181.
34 Cf. for further references Reinhardt (2017), p. 558.
35 Hartzog (2018), p. 425.
36 Cf. also the considerations of Schertel Mendes (2015), esp. p. 129 ff.

like the GDPR and the ePrivacy Directive creates a pursuit that is actively harmful
and adversarial to safe and sustainable data practices. It deludes us about the efficacy
of rules and dooms future regulatory proposals to walk down the same, misguided
path”.37
In the digital economy, data processing by private companies often depends
on the consent of users. This is in line with Article 6 GDPR. Article 8 CFR,
however, demands a freely given consent and protects, as far as possible, the conditions for
autonomous decision-making. In cases where there is an extreme imbalance of power,
for example, consent of the data subject is not an expression of self-determination.
If the negotiating positions of the contracting parties are so uneven that one side is in
fact able to dictate the terms of the contract, the conditions for contractual autonomy
are effectively undermined. This appears to be the case with the large social networks
because of their ‘hegemonic’ position. One dimension of Article 8 CFR is to secure
elementary conditions of autonomous decision-making with regard to personal data.
The GDPR takes this into account by laying down criteria for a freely given consent,
which, in turn, have to be interpreted in light of Article 8 CFR.38

4.2 The Objectification of Interests

In view of the limits of the consent mechanism, it may be in the interest of effective
data protection that the processing of certain categories of personal data is prohibited
from the outset, irrespective of possible consent. Article 8 CFR is a justification for
strong regulatory measures vis-à-vis exceptionally powerful companies and within
the context of structural power relations. Conversely, it is also within the purview of
the legislator to allow the processing of personal data without the consent of the data
subjects if the potential effects of the data processing are negligible. The positive
obligations of Article 8 CFR allow for the “objectification” of the interests of the
data subjects. At the same time, consent remains a component of self-determination.
Inherent to fundamental rights guarantees is the individual’s decision as to what extent fundamental
freedoms are exercised and which restrictions are accepted for the sake of other
advantages. For example, the installation of the Facebook Research App,39 which

37 Ibid.
38 The GDPR defines consent in Article 4(11): “any freely given, specific, informed and unam-
biguous indication of the data subject’s wishes by which he or she […] signifies agreement to
the processing of personal data relating to him or her”. The further relevant provisions are Article
6(1) GDPR (consent as a lawful basis for processing personal data) and the prohibition of consent
bundling in Article 7(4) GDPR: “When assessing whether consent is freely given, utmost account
shall be taken of whether, inter alia, the performance of a contract, including the provision of a
service, is conditional on consent to the processing of personal data that is not necessary for the
performance of that contract”.
39 On Facebook’s Research VPN see the TechCrunch report “Apple bans Facebook’s Research app that paid users for data” (February 2019), available at: https://techcrunch.com/2019/01/30/apple-
bans-facebook-vpn/. Accessed 11 Jan. 2022.

paid users for allowing the company to track all of their online and phone activity, is
not per se in violation of Article 8 CFR. If no minors are involved (although minors were
in fact involved here), an explicit consent is not in principle invalid.
Finally, Article 8 CFR requires the legislator to ensure the possibility of self-
determined decisions in the long term. This would be undermined, for example,
if service providers only allowed registration with the profile of the largest social
network. The obligation to allow a variety of verification options (including anony-
mous or pseudonymous profiles) does interfere with the economic freedoms of
service providers, but it is justified by the legislative responsibility to guarantee
the protection of personal data. Such provisions not only serve the purpose of data
protection, but they also take into account aspects of consumer protection and market
regulation. However, such legislation—like a right to data portability—can indirectly promote a data protection-friendly attitude. Article 8 CFR does not require
such provisions, but the positive obligation arising from Article 8 CFR may justify the
restrictions on the economic freedom (Article 15 CFR) of the companies concerned.

4.3 Transparency and Information Rights

The system of checks and balances that is required under Article 8(2) CFR includes
transparency in the handling of personal data. Article 8(2) CFR grants the data
subjects a right to information and rectification. The obligation of the data processing
bodies to provide information is intended to ensure that personal data are handled
in accordance with data protection law. The fundamental right to information and
rectification has no immediate effect on private companies. The extent to which the
right to information pursuant to Article 8(2) CFR applies to private companies must
be determined by the legislator. The GDPR defines information rights broadly and
only allows limited possibilities for refusing information.40 The right to information
is also secured by the guarantee of effective legal protection, Article 47 CFR. A
regulation that does not provide for the possibility of obtaining access to data relating
to a person by means of a legal remedy or of obtaining their correction or deletion
violates the right to effective judicial protection laid down in Article 47 CFR.41 An
effective protection of fundamental rights requires transparency in the data collection
activities, which often remain hidden from the user.

40 Cf. Articles 13–15 GDPR.


41 CJEU, 6.10.2015, C-362/14 – Schrems, para 95.

4.4 Data Security

Standards of data security are another important element of Article 8 CFR. Although
Article 8 CFR is not explicit about data security requirements, the right to data protec-
tion is also understood by the Court as essentially a guarantee of data security. In
the Digital Rights case, the CJEU justifies the requirement that telecommunications
metadata be stored within the European Union, as otherwise an independent control of the
standards of data protection and data security could not be guaranteed.42 Article 8
CFR demands standards of data security and technical precautions to ensure effec-
tive protection against the unlawful handling, unauthorised access and misuse of
personal data. Compliance with technical standards to ensure data security is not
only a prerequisite for the proportionality of government intervention, but equally
important in the data economy. The necessary level of data security depends on the
type of data and information. Sensitive bank or credit card data must be secured by
other means than those used for consumer data that is less susceptible to misuse. The
importance which Article 8 CFR gives to aspects of data security is now safeguarded
by the GDPR through liability and sanction mechanisms.

5 Conclusion

Article 8 CFR not only establishes a negative right against state action, but has
regulatory implications for the digitized society as a whole. Article 8 CFR guarantees
the protection of personal data, regardless of whether they are processed by the
state or by non-state actors. Nevertheless, realizing the fundamental right to data
protection requires a differentiated approach. It has to consider the scope and function
of the fundamental right as well as the separation of powers. Article 8 CFR entails
positive obligations on the part of legislators and the courts. While it is the task of
the legislator to create sufficiently effective and therefore sufficiently complex data
protection rules, it is for the courts to apply these rules in light of the data protection
principles. They both have to take into account that fundamental rights relations
between private parties are “multipolar” from the outset. The manifold interests in
“processing” data counterbalance the right to the protection of personal data. Article
8 CFR is not merely a subset of the right to privacy, but contains basic principles of
data protection that serve a variety of purposes. In addition to the protection of privacy
and personality rights, the protection of personal data is particularly important for
the exercise of fundamental freedoms and the integrity of communications.

42 CJEU, 8.4.2014, C-293/12 and C-594/12 – Digital Rights Ireland, para 68. This corresponds to a
right of the persons concerned to turn to the national supervisory authorities to protect these funda-
mental rights. CJEU, 6.10.2015, C-362/14 – Schrems, para 72. The national supervisory authorities
are competent to check whether a third country guarantees an adequate level of data protection,
cf. CJEU, 6.10.2015, C-362/14 – Schrems, para 58. For a detailed analysis of the Digital Rights
Ireland decision see Albers (2022), in this volume.
Realizing the Fundamental Right to Data Protection … 67

References

Albers M (2005) Informationelle Selbstbestimmung. Nomos, Baden-Baden
Albers M (2012) Höchstrichterliche Rechtsfindung und Auslegung gerichtlicher Entscheidungen.
In: Grundsatzfragen der Rechtsetzung und Rechtsfindung, VVDStRL, pp 257–295
Albers M (2017) Informationelle Selbstbestimmung als vielschichtiges Bündel von Rechts-
bindungen und Rechtspositionen. In: Friedewald M, Lamla J, Roßnagel A (eds) Informationelle
Selbstbestimmung im digitalen Wandel. Springer, Wiesbaden, pp 11–35
Albers M (2022) Surveillance and data protection rights: data retention and access to telecommuni-
cations data. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Baecker D (2018) 4.0 oder Die Lücke, die der Rechner lässt. Merve Verlag, Berlin
Buchner B (2006) Informationelle Selbstbestimmung im Privatrecht. Mohr Siebeck, Tübingen
Fischer-Lescano A (2016) Struggles for a global internet constitution: protecting global communi-
cation structures against surveillance measures. Glob Const 5(2):145–172
Frantziou E (2019) The horizontal effect of fundamental rights in the European Union. A
constitutional analysis. Oxford University Press, Oxford
Franzius C (2015) Strategien der Grundrechtsoptimierung in Europa, EuGRZ, pp 139–153
Goldsmith J, Wu T (2006) Who controls the internet? Illusions of a borderless world. Oxford
University Press, Oxford
Gonzalez Fuster G (2014) The emergence of personal data protection as a fundamental right of the
EU. Springer, Cham et al
Gratton E (2013) Understanding personal information: managing privacy risks. LexisNexis,
Markham, Ont
Greenfield A (2017) Radical technologies. The design of everyday life. Verso, London
Grimm D (1987) Bürgerlichkeit im Recht. In: Grimm, Recht und Staat der bürgerlichen Gesellschaft.
Suhrkamp, Frankfurt a. M.
Hartmann IA (2022) Self-regulation in online content platforms and the protection of personality
rights. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Hartzog W (2018) The case against idealising control. Eur Data Protect Law Rev 4:423–432
Hochmann T, Reinhardt J (2018) L’effet horizontal, la théorie de l’État et la dogmatique des droits
fondamentaux. In: Hochmann R (ed) L’effet horizontal des droits fondamentaux. Pedone, Paris,
pp 7–22
Howard P (2015) Pax Technica. How the Internet of Things may set us free or lock us up. Yale
University Press, New Haven, CT
Karavas V (2007) Digitale Grundrechte. Elemente einer Verfassung des Informationsflusses im
Internet. Nomos, Baden-Baden
Kotzur M (2022) Privacy protection in the world wide web—Legal perspectives on accomplishing
a mission impossible. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the
internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Lynskey O (2015) The foundations of EU data protection law. Oxford University Press, Oxford
Mayer-Schönberger V, Cukier K (2013) Big data: a revolution that will transform how we live, work
and think. John Murray, London
Marsch N (2018) Das europäische Datenschutzgrundrecht: Grundlagen – Dimensionen, Verflech-
tungen. Mohr Siebeck, Tübingen
Möllers C (2013) The three branches: a comparative model of separation of powers. Oxford
University Press, Oxford
O’Neil C (2016) Weapons of math destruction: how big data increases inequality and threatens
democracy. Random House, New York
Reinhardt J (2017) Konturen des europäischen Datenschutzgrundrechts. Zu Gehalt und horizontaler
Wirkung von Art. 8 GRCh. Archiv des öffentlichen Rechts 142:528–565

Reinhardt J (2019) Conflitos de direitos fundamentais entre atores privados. “Efeitos horizontais
indiretos” e pressupostos de proteção de direitos fundamentais. Direitos Fundamentais & Justiça
Sarlet IW (2022) The protection of personality in the digital environment: an analysis in the light
of the so-called right to be forgotten in Brazil. In: Albers M, Sarlet IW (eds) Personality and data
protection rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this
volume)
Schertel Mendes L (2015) Schutz vor Informationsrisiken und Gewährleistung einer gehaltvollen
Zustimmung. Eine Analyse der Rechtmäßigkeit der Datenverarbeitung im Privatrecht. De
Gruyter, Berlin, Boston
Schimke A (2022) Forgetting as a social concept. Contextualizing the right to be forgotten. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Simitis S (1987) Reviewing privacy in an information society. Univ Pennsylvania Law Rev 135:707–
746
Unseld C (2018) Zur Bedeutung der Horizontalwirkung von EU-Grundrechten. Mohr Siebeck,
Tübingen
Veil W (2018) The GDPR: the emperor’s new clothes—On the structural shortcomings of both
the old and the new data protection law. Neue Zeitschrift für Verwaltungsrecht, pp 686–696.
Available at SSRN https://ssrn.com/abstract=3305056. Accessed 11 Jan. 2022
Waldman AE (2018) Privacy as trust. Cambridge University Press, Cambridge
Zuboff S (2019) The age of surveillance capitalism: the fight for a human future at the new frontier
of power. Public Affairs, New York

Jörn Reinhardt Professor of Public Law at the University of Bremen (Germany). Main areas
of research: Constitutional Law, Fundamental Rights, Law of the Information Society, Legal
and Political Theory and Philosophy of Law. Selected Publications: „Recht auf Vergessen“ auf
Französisch, in: Datenschutz und Datensicherheit (44) 2020, pp. 361–364, https://doi.org/10.
1007/s11623-020-1284-2; Thomas Hochmann/Jörn Reinhardt (eds.), L’effet horizontal des droits
fondamentaux, Editions Pedone, Paris 2018; Konturen des europäischen Datenschutzgrundrechts.
Zu Gehalt und horizontaler Wirkung von Art. 8 GRCh, Archiv des öffentlichen Rechts (AöR)
142 (2017), pp. 528–565; Human Rights, Human Nature, and the Feasibility Issue, in: Marion
Albers/Thomas Hoffmann/Jörn Reinhardt (eds.), Human Rights and Human Nature, Springer,
Dordrecht/Heidelberg/London/New York 2014, pp. 137–158.
Surveillance and Data Protection Rights:
Data Retention and Access
to Telecommunications Data

Marion Albers

Abstract Communication on the Internet and in the onlife world is accompanied
by digitization and datafication which, in turn, open up increasing potential for
surveillance. This transformation of communication into data means that the Internet
facilitates unprecedented forms of mass surveillance. One illustrative example is that
states aim at imposing an obligation on Internet service providers to retain particular
data and make them, under certain circumstances, accessible to security agencies. In
Europe, data retention has been the subject of a strong protest movement and of an
ongoing public discourse on how to strike a balance between liberties and security
needs. This contribution discusses the notion and novel features of surveillance under
Internet conditions. Attention is also paid to protection needs and to interests and
rights safeguarding people against surveillance. The article then analyzes the series
of manifold and partly divergent decisions which the German Federal Constitutional
Court and the European Court of Justice have handed down on the regulations on data
retention. Although the courts have developed a spectrum of legal requirements and
defined a bundle of individual rights as well as demands for control and evaluation,
we must conclude that they are still struggling to deal with the challenges, protection
needs and specification of convincing legal frameworks. The surveillance potential
inherent to the Internet and datafication calls for many new approaches.

I am indebted to the DAAD (Deutscher Akademischer Austauschdienst) and CAPES (Coordenação
de Aperfeiçoamento de Pessoal de Nível Superior) for generously supporting the Academic Exchange
and Research Project “Internet Regulation and Internet Rights” between the Pontifical Catholic
University of Rio Grande do Sul, Porto Alegre, Brazil (PUCRS) and the University of Hamburg
(Germany), coordinated by Professor Ingo Sarlet (PUCRS) and me (Hamburg).

M. Albers (B)
University of Hamburg, Hamburg, Germany
e-mail: marion.albers@uni-hamburg.de

© Springer Nature Switzerland AG 2022
M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_5

1 Introduction

Since the advent of the Internet era, an increasing part of everyday life has shifted
onto the digital level. The Internet is providing a global infrastructure which gives
rise to a wide range of electronic communications services and novel communication
structures. Not only have new ways of communication been established, such as e-
mail, messenger services or social media platforms, but by now everyday activities
such as information retrieval, banking or shopping can quickly be managed via the
Internet. The onlife world1 comes hand in hand with digitization and “datafication”.2
These developments, in turn, are accompanied by increasing potential for and novel
dimensions of surveillance.
In the case of state and intelligence services, the revelations published in June
2013 by Edward Snowden in particular have demonstrated the extent of governmental
surveillance as well as the new methods of surveillance. He introduced the world
to the PRISM program which is used by the NSA to obtain access to the servers
of major technology companies as well as to several other programs operated by
the intelligence services of the US or other states, for example the British GCHQ.3
Knowledge of the extent and intensity of such surveillance practices has prompted
protests and human rights complaints by citizens.4 Beyond that, it has led to conflicts
between different states, especially when state representatives themselves were the
objects of surveillance. Countermeasures were initiated, among others, by Brazilian-
German collaborations in political arenas and promoted or accelerated by various regulatory efforts.5 Covering
a spectrum of objectives and regulations, the Brazilian Law 12.965/14, the “Marco
Civil da Internet”, for example, was an entirely new approach to regulating the crucial
challenges posed by the Internet.6
The conditions prevalent on the Internet and the Snowden revelations also make it
clear that surveillance can no longer be understood through the metaphors that shaped
the dictatorships of the twentieth century: a single, all-knowing oppressive force
at the top of a pyramid, as George Orwell depicted it in the novel Nineteen Eighty-
Four.7 Instead, it extends to diverse fields and is ramified and recombined time and
again. Surveillance is literally ingrained into the fabric of the Internet. This is due

1 For this term see Floridi (2015); Hildebrandt (2016), pp. 1ff.
2 This term was introduced by Cukier and Mayer-Schönberger (2013), pp. 73ff.
3 Greenwald (2014); Harding (2014).
4 See, e. g., ECtHR (Grand Chamber), Judgment of May 25, 2021—Application no. 58170/13, Big
Brother Watch and Others v. the UK; no. 62322/14, Bureau of Investigative Journalism and Alice
Ross v. the UK; and no. 24960/15, 10 Human Rights Organisations and Others v. the UK, and
Judgment of May 25, 2021—Application no. 35252/08, Centrum för rättvisa v. Sweden; both available
under http://hudoc.echr.coe.int. See also the decision of the German FCC, Judgment of May 19,
2020, 1 BvR 2835/17, available at: http://www.bundesverfassungsgericht.de.
5 Cf. Bauman, Bigo, Esteves, Guild, Jabri, Lyon and Walker (2014), pp. 127ff.
6 Doneda and Zanatta (2022), Sect. 1 and 3.2; Molinaro and Ruaro (2022), Sect. 4, both in this
volume. See also Lemos (2014).
7 Orwell (2008). See also Timan et al. (2017), pp. 733ff. with regard to the more differentiated
Bentham Panopticons, especially the prison-Panopticon.



to socio-technical factors, not least to the fact that data is a source of commercial-
izable knowledge. Advances are continuously being made in technologies capable
of gathering, storing or analyzing huge amounts of data. Given that major telecom-
munications providers or intermediaries such as Google or Facebook hold dominant
market positions, they are the ones in control of vast data collections. App providers,
tracking companies or data brokers enter the picture as well.8 All these enterprises
rush into processing, aggregating, searching, cross-referencing and selling more or
less personalized data as long as no legal regulation prevents them from doing so.
More than ever before, private players are observing people, attitudes, behaviors or
communications and collecting, analyzing and using the data. Hence, the surveil-
lance scenarios relevant nowadays often encompass the interplay between the data
collections of private companies and the access requests of government agencies.9
The sector of telecommunications and telemedia is particularly illustrative here. But
there are numerous other areas: financial institutions and account data, airlines or
transport companies and mobility data, health services or biobanks and health data.
Advances achieved with the Internet of Things and the Internet of Bodies will open
up further fields as well.
This background explains why, at least in Europe, “data retention” has become a
major issue in ongoing debates about surveillance and the focus of a very powerful
protest movement. In the context of the former EU Data Retention Directive (EU-
DRD)10 and the relevant laws of the member states, the term involves the obligation
of particular providers to retain particular personal (meta)data generated in telecom-
munications for the purpose of enabling authorities at a later point in time to gain
access to and make use of these data.11 These measures are highly contested. Some
criticized them as mass surveillance without any particular cause or as the most
invasive instrument ever adopted by the EU in terms of how it invades privacy, its

8 Cf. Katsivelas (2018), pp. 221ff.
9 Cf. the contributions in Cate and Dempsey (2017).
10 Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on
the retention of data generated or processed in connection with the provision of publicly available
electronic communications services or of public communications networks and amending Directive
2002/58/EC, Official Journal L 105/54. In 2014, the European Court of Justice (ECJ) declared this
directive invalid, see Sect. 4.3.1 of this article.
11 There are fields other than telecommunications, e. g., the retention of passenger name records,
cf. ECJ (Grand Chamber), Opinion of July 26, 2017 – 1/15, curia.europa.eu, Vedaschi and Marino
Noberasco (2017), or the retention of banking data in order to keep them available for the purposes
of combating money laundering or financing of terrorism; see inter alia Milaj and Kaiser (2017).
Legally, each field has distinct characteristics of its own.

scale and the number of people it affects.12 Others argued that the temporary reten-
tion of metadata results in only minor intrusions or risks.13 The divergent points of
view indicate that there is a need to clarify the reasons for the normative protection
against surveillance in order to determine why, under what conditions, to what extent
and how people should be protected against what form of surveillance. Given how
the Internet has spread and the surveillance measures it has made possible, this is
urgently necessary across multiple sectors.
This article addresses questions of data retention, subsequent governmental access
to telecommunications data and extracting information from these data as one of the
best examples of the new possibilities of surveillance. Section 2 analyzes the concept
of surveillance and protection needs as well as protected goods and individual rights.
Because surveillance should not be understood as a strictly homogeneous exercise,
more issues are affected than only those related to “privacy”. Section 3 deals with
breaking down data retention and subsequent data access into different elements
and identifying threats to freedom and regulatory options. Section 4 explains the
European and German legislation as well as the series of decisions of the European
Court of Justice and the Federal Constitutional Court. The last section centers on the
legal requirements developed through the network of courts and on remaining open
questions.

2 Normative Protection Against Surveillance

An increasing potential for surveillance is the downside of communication on the
Internet, and in a similar way, the Internet impacts the forms of surveillance:
Surveillance was once literally “watching”; now, it is also “seeing with data”.14

However, whether, how and to what extent surveillance is actually expanding depends
just as much or even more on other technological and societal developments. Surveil-
lance is framed and takes place within social contexts. And even if “key trends of
surveillance”15 are indicative of expansion, surveillance is not a homogeneous exer-
cise but appears in various manifestations and is uneven both in its penetration of
society and its consequences.16 Concepts and contexts have to be clarified, fields,

12 As an example of objections see European Data Protection Supervisor, “Opinion of the European
Data Protection Supervisor on the Evaluation report from the Commission to the Council and the
European Parliament on the Data Retention Directive (Directive 2006/24/EC)”, 31 May 2011,
https://edps.europa.eu/sites/default/files/publication/11-05-30_evaluation_report_drd_en.pdf
(accessed 11 Jan. 2022).
13 Cf., e. g., Schluckebier, Dissenting Opinion, Judgment of the FCC of March 2, 2010, 1 BvR
256/08, 263/08, and 586/08 (= BVerfGE 125, 260; available also in English at:
www.bundesverfassungsgericht.de), paras 312ff. In connection with this judgment, see Sect. 4.2 of
this article.
14 Bennett, Haggerty, Lyon and Steeves (2014), p. 6.
15 Bennett, Haggerty, Lyon and Steeves (2014), pp. 13ff.
16 Coleman and McCahill (2010), pp. 4ff., 8.

forms and methods must be differentiated and systematized, and the protection needs
have to be worked out more closely. Only then will it become clear what new chal-
lenges are being posed by the Internet, against what background the debates on data
retention are to be understood, and what problems are associated with this form of
surveillance.

2.1 Contextualizing Surveillance

“Surveillance Society” has always been one of the catchwords used to grasp an exten-
sion and intensification of surveillance against which people have to be protected.
Generally speaking, surveillance is a very broad term. An object is observed in
terms of a specified benchmark using certain methods, modes and instruments and
a comparison is made of the extent to which the behaviors, processes or outcomes
observed correspond to this benchmark as a reference or target value. The core
element of surveillance is the production of knowledge through observation, storage
and processing, analysis or evaluation and comparison. Surveillance never traces the
entire real picture, but selectively focuses on certain aspects instead. However,
surveillance often cannot be confined to its target in a “target-specific” manner; it is
usually associated with a range of events, i.e., more is observed than is strictly relevant
to the actual surveillance target. It is also important to take into account that surveillance is not limited
to watching or data collection but includes further data and information processing
extending from data storage and modification to analysis and use in decisions. Within
an abstract framework,17 its essential function lies in aligning the actual activities
and outcomes with the benchmark and exerting related influence on the object under
surveillance, on general conditions or on future activities or people’s behavior.18
Accordingly, surveillance results are the basis for decisions on the extent to which
corrective measures or sanctions are required, whether on an individual or an over-
arching level. Such decisions can be made later, in other contexts or by specialized
actors. Knowledge acquisition and decisions based on surveillance results may be
more or less closely linked to the preceding data processing and may differ depending
on actors and perspectives. Thus, surveillance is a process with several elements, but
it does not necessarily follow a linear course. Surveillance gains in effectiveness as
a result of the fact that persons under surveillance often adapt their behavior them-
selves. Regardless of later adjustments or sanctions, the production of knowledge
already changes the relationship between the persons involved in a positive or nega-
tive way. Beyond the processes of gathering and handling data, information and
knowledge, the surveillance methods or techniques employed and the modes chosen

17 In concrete contexts, the functions can be more specific and diverse. In asymmetrical relationships,
for example, surveillance is a means of constantly reproducing and stabilizing power relationships.
18 Cf. also the definition of Lyon (2007), p. 14: Surveillance is the “focused, systematic and routine
attention to personal details for purposes of influence, management, protection or direction”.



(overt, covert, clandestine, or implied surveillance19) are relevant because different
methods, techniques and modes each have their own consequences. Already at this
point, a multi-dimensional and differentiated picture of what “surveillance” means
emerges.
In very general terms, surveillance is an indispensable element of complex soci-
eties and is present in many different ways, without it being possible to say that
in modern society it has continued to expand in every respect.20 As far as norma-
tive assessment is concerned, surveillance is by no means necessarily negative. On
the contrary, it always has advantageous functions, depending on the perspective and
assessment benchmark. For example, those being monitored may feel a sense of reas-
surance that erroneous actions can be discovered and corrected or that responsibility
can be shared through surveillance. Hence, the normative assessment of surveillance
is not possible without a more detailed description of the relevant context and further
breakdowns. It is necessary to specify at what level of abstraction the starting point
is set, what the subject matter of surveillance in terms of target persons (e.g., an iden-
tified person, specified groups or organizations, the general population) and in terms
of the data collected is, how the social framework is structured, what participants
play a role given the particular background of social relationships, what surveillance
devices are used and what the explicit or implicit goals of surveillance and their actual
positive or negative consequences are. For normative assessment, for example, it is
important whether asymmetrical power relations are present in the background, such
as in the surveillance of citizens by the police or of consumers or network users by
intermediaries.
What is novel about surveillance in modern society and under the conditions of
the Internet? What is essentially new is the far-reaching “datafication” that goes hand
in hand with communication on the Internet and the digitization of society.21 This
development will be boosted by the Internet of Things. Observation and storage facil-
ities are, in principle, relatively simple, inexpensive, applicable everywhere and often
already woven into the functionality of certain technical processes. De facto, data
can simply be stored or retained, just in case it is needed at any time for any purpose.
Subject to compatibility requirements, the various phases of data processing can be
decoupled from each other and linked in ever new forms. This means that data can
be decoupled from its original context and flow into other contexts where, regularly
combined with other data, it can be used as a source of information and knowledge
for other purposes. And “there is no longer close proximity between watcher and
watched […] It now becomes possible to monitor behavior at long distances and
to record it for future reference”.22 The relative separation of collection, storage

19 Cf. Petersen (2012), pp. 12ff.
20 A closer analysis would emphasize, on the one hand, the intense forms of social control practiced
in communities in earlier times and, on the other hand, the implications of, for example, individual-
ization, separation of roles or mobility. Nevertheless, there is universal agreement that, along with
the increasing differentiation of global society, patterns of surveillance have multiplied.
21 See also Albers (2018), pp. 29ff.
22 Timan et al. (2017), p. 743.

and use of data increases the need for further data in order to classify and under-
stand the original data. An essential characteristic of modern surveillance is also the
fact that techniques, codes and programs shape the data as well as data processing
and change the forms of knowledge acquisition. Surveillance is “dataveillance”23
and reconstructs parts of individuals from data trails as “data doubles”.24 In the end,
although to a certain extent limited by, e. g., the possibilities of anonymous communi-
cation and technically sophisticated encryption,25 the Internet and associated socio-
technological developments make surveillance of identified persons or of groups
and organizations or bulk surveillance possible to an unprecedented degree. A lot of
specific data is accumulated in numerous contexts—location data, travel data, finan-
cial data, electronic communications metadata or content data,26 and these data are
often informative in themselves, but can also be compiled to create an in-depth profile. The
problem is on the one hand the range of the collection, linking and analysis options
and on the other hand the technically influenced selectivity of the results. Regarding
the modalities for carrying it out, surveillance is often invisible and intransparent,
not only in the data collection phase but more importantly also with regard to subse-
quent data processing. The use of artificial intelligence will significantly reinforce
this. We can also foresee that new surveillance technologies will “not only afford
a mediated gaze from a distance, but also acting, intervening, and pre-empting at a
distance”.27 Increasingly, surveillance is being and will continue to be carried out in
an interplay of different participants, in an “overlapping and entangled assemblage
of government and corporate watchers”.28 All these peculiarities can be found in
more or less pronounced form in data retention—and explain why it has become a
central paradigm of surveillance.

2.2 Protection Needs

If surveillance at a general level is a fundamental concept and an indispensable
element of complex societies, where exactly are the protection needs? In the abstract
sense of observation and comparison using a benchmark, surveillance can be consid-
ered normatively undesirable neither with regard to the people under surveillance
nor with regard to society. If, however, surveillance processes are approached more

23 For the term see Clarke (1988).
24 Timan et al. (2017), p. 736 with reference to Poster (1990).
25 Concerning anonymity on the Internet, cf. Aftab (2022) in this volume.
26 Cf. Tzanou (2017), pp. 67ff.
27 Timan et al. (2017), p 744. See also Cohen (2017), p. 457: “Networked information and communi-

cation technologies enable the intensification of surveillant attention, providing the material condi-
tions for it to become continuous, pervasively distributed, and persistent. Those capabilities in turn
enable modulation: a set of processes in which the quality and content of surveillant attention can
be modified continually according to each subject’s behavior, sometimes in response to inputs from
the subject but according to logics that ultimately are outside of the subject’s control”.
28 Richards (2013), p. 1936.

precisely in specific contexts and if the social relationships that exist in the back-
ground as well as the various perspectives of the people involved are included, addi-
tional aspects enrich the picture. We can work out who should be protected under
what circumstances against what measures for what reason, and how surveillance
must therefore be regulated. Subjects of protection can be persons, groups or orga-
nizations, or the society (and, indirectly, its citizens). Protection can be directed
against the surveillance processes themselves and the associated knowledge acqui-
sition, against the surveillance methods and technologies, or against the mode of
surveillance, for example, whether it takes place overtly or covertly. The reasons for
protection are a variety of protection needs. These include, among other things, the
possibility for individuals to behave freely, feel confident, enjoy social recognition
and participate in shaping their social role.
When we look at the individual person under surveillance, the surveillance can
cause a wide range of adverse effects resulting from the information created, the
surveillance methods and technologies or the surveillance modes. As a consequence
of actual or assumed surveillance, social relations between observed and observing
persons inevitably change. An effect that is often emphasized is the adaptation of
observed persons to surveillance in their behavior or in their communications. This
already happens regularly through “internalized discipline”.29 Behavior or communi-
cations under surveillance conditions are different from what they would be without
surveillance. Free spheres may no longer be used “freely” and thus their protective
function for selfhood as well as their social potential is restricted.30 The self-perception
of individuals as autonomous persons may be called into question, in particular in
cases of surveillance without cause, where individuals can no longer influence through
their own behavior whether they are subjected to surveillance.31 In
connection with feelings or emotions, surveillance can lead to social pressure, social
awkwardness and insecurity, shame or feelings of inferiority. Surveillance methods
such as the use of covert investigators may result in reductions in the level of trust the
person under surveillance builds in social relationships. In a similar way, surveillance
processes that are intransparent either due to their secrecy or to technical features
affect the reliability of expectations and the building of trust in the social environ-
ment, which to a certain extent are essential conditions of freedom.32 The social

29 Cf. with regard to Bentham’s prison-Panopticon Timan et al. (2017), p. 733: “the goal of surveil-
lance is to reform the individual (all aspects of the person), in order to create perfect and internalized
discipline and make punishment unnecessary”.
30 Cf. Cohen (2017), p. 459: “Subjectivity – and hence selfhood—exists in the gap between the
reality of social shaping and the experience of autonomous selfhood”; Richards (2013), pp. 1945ff.
with regard to “intellectual privacy”. See also Mitrou (2010), p. 138: “under pervasive surveillance,
individuals are inclined to make choices that conform to mainstream expectations”. See also the joint
partly concurring opinion of Judges Lemmens, Vehabović and Bošnjak, ECtHR (Grand Chamber),
Judgment of May 25, 2021—Application no. 58170/13 et al., Big Brother Watch and Others v. the
UK et al., http://hudoc.echr.coe.int, para 4-8.
31 Cf. Grimm (1991), p. 198.
32 Cf. Albers (2005), pp. 469ff.; see also Bennett, Haggerty, Lyon and Steeves (2014), p. 4: “Having a sense of control over our public persona is vitally important […]”.


Surveillance and Data Protection Rights: Data Retention … 77

role of the person under surveillance is always altered by surveillance. This is true even when, as under the conditions brought about by the Internet and modern surveillance techniques, the person under surveillance and the person carrying out the surveillance, or the different elements of the surveillance processes, are independent of each other: People know that ever more personal data are being collected and combined, but the people carrying out the surveillance, their knowledge and the resulting effects remain more or less abstract. Dealing with this uncertainty poses demanding challenges of its own. Consequently, the assertion that the mere collection or recording of data is not yet a problem and does no harm33 is untrue: it is precisely the unforeseeable possibilities of use and misuse and the unpredictable effects that make it a problem. Surveillance always changes the social role
of the people carrying out the surveillance as well. Under certain circumstances, they
can use the knowledge acquired to influence social contexts in such a way that the
spectrum of behavioral possibilities of the person under surveillance is restricted or
altered.34 The knowledge acquired or simply the operating procedures of the surveil-
lance technologies installed may also entail discrimination and “social sorting” or
“sorting out” mechanisms. Such mechanisms, which limit the possibilities of affected
persons, place “the matter firmly in the social and not just the individual realm”.35
Surveillance, and certain of its methods or modes such as comprehensive secret surveillance, can have negative effects not only on individuals or groups, but also on communication relations, social subsystems or society as a whole. The social conditions for
freedom may be affected.36 In numerous countries, Germany and Brazil included,
there are obvious examples of the social consequences of secret mass surveillance.
If we leave the traditional model of the nation state conducting surveillance and look at the global world of different states, detrimental effects extend as far as the endangerment of political action and national sovereignty, as has become clear in the wake of Edward Snowden’s revelations.
Which factual effects a particular instance of surveillance may have and how
these effects are to be judged normatively are two different questions. The extent to
which certain adverse effects are undesirable in normative terms must be answered
in overarching contexts. If a person adapts his or her criminal behavior because of
the possibility that he or she could be observed and punished, this is desirable in
normative terms; in fact, such adaptations are among the functions of criminal law.
A need for protection can, however, be substantiated if surveillance causes the person being observed to adapt behavior that he or she is at liberty to perform. One of the
well-known examples of undesirable chilling effects is that the surveillance by police
or intelligence authorities of political activity leads to individuals no longer engaging
in such activities in the future. Sorting decisions as a result of surveillance are also

33 Cf. Richards (2013), p. 1934, drawing attention to common assumptions in the USA.
34 Cf. also Richards (2013), pp. 1952ff.
35 Lyon (2003), p. 13.
36 In the context of surveillance and collection of data about individuals or groups, it must always be kept in mind that the political regime can change, for instance from a democratic to a dictatorial regime.
78 M. Albers

not undesirable under all circumstances, but preventing unjustified discrimination as a result of surveillance and subsequent decision processes is a normative goal. And if surveillance methods sweep in a large number of innocent people because more is observed than is strictly relevant to the actual surveillance target, this is normatively just as undesirable as “black box” techniques whose processes and selections are opaque.
As a result, surveillance is a multi-layered social process, which always involves a
series of both desirable and undesirable effects. A need for regulatory requirements
does not arise only when normatively undesirable impacts already exist. On the
contrary, legal regulation of surveillance aims at controlling effects in advance of their
potential occurrence and at avoiding undesirable effects. Consequently, the question
of whether certain effects are typically expected or are likely as a risk is not a purely
empirical problem.37 The estimation and assessment of adverse effects is part of a normative appraisal that cannot be completely dissociated from empirical evidence, but is relatively independent of it. This is also true because the law must take into account the limits of empirical studies in terms of both subject matter and methodology. In addition, legal regulation may have to be introduced well before effects can be precisely predicted. These protection requirements and the need for legal
regulation are reflected in legally protected interests and corresponding rights.

2.3 Protected Interests and Corresponding Rights

As we have seen, surveillance is a process with different elements requiring contextualization, and protection needs are wide-ranging. That is why highlighting a single legal guarantee, such as “privacy”, would fall short. Rather, protection against surveillance is addressed by a broad spectrum of protected interests
enshrined in the constitution, in human or fundamental rights catalogs or in statutes
that protect particular interests of individuals or organizations as well as the precon-
ditions for freedom, the rule of law, democracy, or sovereignty. Depending on the
complex of norms and the level of regulation, the provisions each have their own legal
nature, meaning and binding effect; enforcement mechanisms also vary substan-
tially.38 In the European Union and its member states, a plurality of human and
fundamental rights exists. In their respective scope of application, the human rights
of the European Convention on Human Rights (ECHR), the fundamental rights of
the EU Charter of Fundamental Rights (CFR) and the fundamental rights provided
by national constitutions are applicable and enforceable.39

37 Regarding the assumption of “chilling effects”, there is an extensive discussion concerning whether this assumption can be based on empirically supported judgments.
38 Cf. Lachmeyer and Witzleb (2014), pp. 749ff.
39 See on enforcement mechanisms beyond the territory of the EU, especially with regard to the GDPR, Veit (2022), in this volume.



Human and fundamental rights may be applicable because the object of surveil-
lance is covered by their scope of protection: observed behavior such as expressions of
opinion or political demonstrations, private homes being searched, surveilled correspondence or (tele)communication. For example, the guarantee of the inviolability of telecommunications protects individuals against the surveillance of their telecommunications. The protection does not cease to apply merely because the act of communicating could be interpreted as forgoing an expectation of privacy. Such an interpretation not
only falls short because the relevant communication is restricted to certain addressees
and is regularly not open to others. Above all, such an interpretation overlooks the
fact that the data and information in question are only created through communica-
tion.40 Human and fundamental rights can also take effect because of the effects of
surveillance: freedom of expression protects against surveillance which presumably
results in the suppression of expressions of opinion.
In addition, there is a need for protection not only from concrete surveillance,
but also at a more abstract level and in a preliminary stage where concrete rights
in regard to a particular surveillance regimen cannot yet be identified and acted
upon. Provided that we interpret the right to privacy, the right to data protection or
the guarantee of the inviolability of telecommunications in a complex way,41 they
are sufficiently abstract to be able to cover this preliminary stage. In addition to
these already established and advanced protected interests, completely new scopes
of protection are currently being developed against the background of the challenges
of the Internet: For example, the FCC has derived a “right to the guarantee of the
confidentiality and integrity of information technology systems” from Article 2 in
connection with Article 1 Basic Law42 and scholars emphasize that “a metaphorical
home 2.0 that protects an abstract space around persons”43 is necessary.
Human and fundamental rights are primarily directed against the state. On the one
hand, persons who are under surveillance are protected by particular guarantees. On
the other hand, companies such as access providers or intermediaries enjoy protection
through freedom of profession or freedom of enterprise and can assert rights of
defense if they are obliged to contribute to the surveillance or to cooperate with the
authorities.44 Beyond that, it is now recognized, albeit to varying extents in different

40 The guarantee of the inviolability of telecommunications does not protect privacy in the sense of
being left alone or in seclusion, but aims at the protection of communication and of participants as
communication partners.
41 For complex approaches to the right to privacy see, for example, Cohen (2017), p. 458: “The condition of privacy is dynamic and best described as breathing room for socially situated subjects to engage in the processes of boundary management through which they define and redefine themselves as subjects. Rights to privacy are rights to the sociotechnical conditions that make the condition of privacy possible.” For a complex concept of the right to data protection see Albers (2005), pp. 454ff.; cf. also Marsch (2018), pp. 127ff. See as well Reinhardt (2022), in this volume.
42 Judgment of the FCC of February 27, 2008, 1 BvR 370/07 and 595/07, BVerfGE (Volume of the decisions of the FCC) 120, 274, also available in English at: www.bundesverfassungsgericht.de.
43 Timan et al. (2017), p. 745; see also Koops (2014).
44 For an overview of the divergent roles and perspectives of companies see Determann (2017), pp. 372ff.

legal cultures, that human and fundamental rights are also applicable in relations
between private individuals through doctrinal figures such as third-party effects or
the duty to protect, and thus protect people against surveillance by other private
individuals or companies. More complex models of the construction of protection
may have to be developed in cases where surveillance has to be understood in the
form of “private–public assemblages”, as is increasingly the case.
Furthermore, human and fundamental rights and their scope of protection prove to
be multi-layered because they include different dimensions of protection. Tradition-
ally, they are primarily oriented towards the defense against impairments. Increas-
ingly, however, there are also protective dimensions being worked out that oblige the
state to take action. This includes guarantees that the persons affected have a right
to know and therefore to be informed of the surveillance and its results, are able to
participate at appropriate points in the surveillance and decision processes or have
access to judicial review procedures. Regulatory requirements also emerge beyond
individual-oriented guarantees and rights.
A distinction must be made between the scopes of protection with their various
dimensions of protection and the claims which may be invoked. Under certain circum-
stances, protected persons can raise claims for defense against surveillance and for its
cessation. Since surveillance is a process with various elements using certain methods and modes, such claims can themselves be differentiated. For example, a claim
to refrain from data collection as such may be unfounded, but claims to refrain from
storage for the planned duration or to refrain from using collected personal data for
one of the intended purposes may be justified. Besides, there may also be acces-
sory claims. For example, surveillance and the respective collection and storage of
personal data may be permitted, but for this very reason accessory claims to data
security and precautionary safeguards may exist. Human and fundamental rights provide rights to know; however, what exactly such claims should include, at what point in time they may be raised, and how detailed the requested information can be needs to be further elaborated. Looking at the influence of code and software programs in modern surveillance, for example, the extent to which claims to knowledge of the functioning and operating mechanisms of algorithms can be asserted requires clarification.45 Irrespective of questions of detail and the need for continuing advances,
the findings show that, in view of the different scopes of protection as well as their
different directions and dimensions, we have to work out a broadly diversified bundle
of rights.

3 Data Retention

Like “surveillance”, data retention is not a clear-cut concept. It is understood differently, especially with regard to different cultural and juridical contexts or to different

45 Cf. for algorithms Schertel Mendes and Mattiuzzo (2022), in this volume.

fields. Generally defined, the term “data retention” describes the collection or preservation of (personal) data either for a purpose which is as yet undetermined or to an extent, in content or time, that goes beyond what is necessary for a particular primary purpose, with the result that the data can later be accessed as a source of information in the event that the need arises. Stated this generally, the definition does not yet say much. When we consider the in principle unlimited data collection possibilities available under the conditions of the Internet and modern storage technologies, however, it becomes obvious that “data retention” is anything but unproblematic, depending on the context.
In the Europe-wide debates, “data retention” is mainly linked to the preservation of
personal data generated in telecommunications and to the former EU DRD46 and the
relevant laws of the member states. Due to provisions resulting from the protection of
the secrecy of telecommunications, providers of telecommunication services may, in
principle, only process and store their clients’ personal and communication data for
the purposes of transmitting communications, handling interconnection payments and invoices, marketing, and a restricted set of other value-added services. As soon as the data are no longer needed for these purposes, they have to be deleted. If a criminal investigation seeks communication details only at a later date, they can no longer be traced. Against this background, accessory data retention rules oblige providers
to retain certain categories of data, so-called metadata, for a specific period of time
in order to enable security agencies to gain access to and make use of these data for
security purposes at a later point in time. This is the subject of the provisions of the
former EU DRD, and similar rules have been established in other countries.47
Several elements are characteristic here: First of all, there is the plurality of partici-
pants: private parties have data at their disposal to which state authorities seek access.
These data are not collected separately, but are generated in specific contexts in accordance with the conditions and rules applicable in that context. The data are legitimately stored for a certain period of time for particular purposes, in the case of telecommunications companies primarily for billing purposes. Companies are then obliged to store these data for longer than is necessary for the primary purposes, so that they can be made available to the state security authorities for security purposes (secondary purposes) under certain conditions.
A closer look at “data retention” reveals, firstly, that such a structural pattern can
be found in a number of fields other than telecommunications, in some cases with
certain modifications. In the banking sector, data relating to customers and money
transactions must be retained in order to keep them available for the purposes of
combating money laundering and financing of terrorism. Data that is generated in
the context of the Internet of Things or data that is collected in biobanks also arouse
the interest of security authorities. Secondly, there are a number of key points which
require closer attention: for example, whether all or only certain of the data that private companies collect are involved, and under what conditions and for what purposes

46 See n. 10.
47 Cf. for an overview on the broader “bulk” collection of data Cate and Dempsey (2017). For the
rules in the Marco Civil da Internet and other Brazilian legislation see Molinaro and Ruaro (2022),
in this volume.

which security authorities have access to which data. Data retention proves to be a
fundamental issue of great importance in connection with surveillance. This is why
the former EU DRD and the national laws of the member states as well as the related
court decisions are the subject of extensive, often heated debate.

4 Data Retention in the European Union and Its Member States—An Endless Story

4.1 EU Data Retention Directive and Subsequent Laws of Member States

The EU DRD was prompted not least by the terrorist attacks in Madrid and London and entered into force in May 2006.48 It provided for retention obligations in the field of telecommunications which derogated from the system of protection established by the Data Protection Directive49 and the ePrivacy Directive.50 As a general principle of protection, personal data should be retained only for the period of time that is necessary for specified, explicit and legitimate purposes of data processing which, for its part, must be based on particular grounds of lawfulness.51 The ePrivacy Directive,
with its focus on protecting the privacy and confidentiality of communications by
means of a publicly available electronic communications service and the related
traffic data, confirmed this principle with the result that, as a general rule, providers
had to erase or anonymize personal data where it was no longer needed for transmis-
sion, billing or other service purposes.52 Referring to the competency to harmonize
member states’ provisions with regard to the establishment and functioning of the
internal market,53 the EU DRD amended this protection system by introducing partic-
ular obligations of the providers of publicly available electronic communications
services—e.g., telephone services, e-mail services or Internet access services—or of

48 Concerning the context and the legislative process see Granger and Irion (2014), pp. 837f.;
Zedner (2017), pp. 566ff.
49 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Official Journal L 281/31, meanwhile replaced by the General Data Protection Regulation (GDPR, Regulation 679/2016/EU), Official Journal L 119/1.
50 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector, Official Journal L 201/37.
51 Articles 6(1b, e) and 7 Data Protection Directive, cf. also Articles 5(1b, e) and 6 GDPR.
52 More closely Articles 6 and 9 ePrivacy Directive.
53 The competences of the EU were contested, because the primary objective of the EU DRD appears to be the fight against crime but an overarching EU competence in this area does not exist. This problem leads to some inconsistencies in the Directive. However, the ECJ did not object to the use of the chosen competency, see judgment of the ECJ (Grand Chamber) of 10 February 2009, C-301/06, available at curia.europa.eu.

public communications networks. Member states had to ensure that these providers
were obliged to retain certain data generated or processed by them for longer than
necessary for their business purposes, in order to ensure that these data were available
for the purpose of the investigation, detection and prosecution of serious criminal
offenses, as defined by each member state in its national law.54 In principle, a six-
month minimum storage period from the time of communication was set as well as
a maximum retention period of two years which, in special circumstances, could be
extended.55
Article 5(1) EU DRD specified a comprehensive catalog of categories of data
on communications of natural and legal persons as well as data on subscribers or
registered users stored beyond individual communications. However, Article 5(2)
EU DRD expressly prohibited the retention of data revealing the content of the
communication. The directive referred to so-called metadata. Depending on the communication medium,56 the catalog included
• data necessary to trace and identify the source and the destination of a commu-
nication, e.g., the calling and dialed telephone numbers as well as the name and
address of the subscriber or registered user, user ID(s) of communication partici-
pants as well as the name and address of the subscriber or registered user to whom
an Internet Protocol (IP) address, user ID or telephone number was allocated at
the time of the communication,
• data necessary for identifying the date, time and duration of a communication,
e.g., of the log-in and log-off of an Internet access service, together with the IP
address, whether dynamic or static,
• data necessary to identify the type of communication, i.e., the telephone or Internet
service used,
• data necessary to identify users’ communication equipment or what purports to
be their equipment, e.g., the calling and called telephone numbers, the IMSI or the
IMEI of the calling and of the called party and, in the case of pre-paid anonymous
services, the date and time of the initial activation of the service and the Cell ID
from which the service was activated,
• data necessary to identify the location of mobile communication equipment, e.g.,
the Cell ID at the start of the communication and data identifying the geographic
location of cells by reference to their Cell IDs during the period for which
communications data are retained.
Furthermore, the EU DRD required member states to make sure that the providers
respect, at minimum, certain data security principles related to retained data. Member
states also had to ensure that, in specific cases, this data and any other necessary
information relating to such data could be transmitted without undue delay to the

54 Cf. Article 1(1), 3ff. EU DRD.
55 Article 6 EU DRD.
56 Fixed network telephony, mobile telephony, Internet access, Internet e-mail and Internet telephony.

competent authorities upon their request.57 The procedures to be followed and the conditions to be fulfilled in order to gain access to retained data in accordance with, for example, the requirements of the principle of proportionality had to be defined by each member state in its national law.
To implement the EU DRD into national law, member states had to adopt measures
that can be systematized into three complexes. Firstly, regulations had to ensure that
providers or network operators in the telecommunications sector retain the data
specified in Article 5(1) EU DRD and in doing so also observe data security require-
ments. Secondly, rules had to be added obliging the companies to provide these
data to competent security authorities under certain conditions. The third complex
consisted of regulations which allowed the authorities to request and receive the data
to be transferred and to use the data transferred for their own purposes.
In Germany, the DRD was implemented in 2008 by partly directive-implementing
and partly directive-exceeding provisions amending existing laws.58 In its former
version, § 113a of the Telecommunications Act (TKG) obliged the providers of
publicly accessible telecommunications services to store certain telecommunications
service data for a period of six months. The catalog of data corresponded to Article 5(1) EU DRD. The duty of storage essentially extended to all information necessary to reconstruct who communicated or attempted to communicate with
whom, when, from where and for how long. Telecommunications service providers
that—like anonymization services—altered the data to be stored were required to
store the original and new data as well as the time at which these data were tran-
scribed.59 § 113a para 8 TKG emphasized that the contents of the communication
may not be stored. The data stored were to be deleted within one month of the end
of the storage period.60 As regards data security, the care generally necessary in the
area of telecommunications had to be observed.61
§ 113b TKG generally defined the purposes for which the telecommunications
companies were allowed to transmit stored data to public authorities. Three exclu-
sive purposes were enumerated: the prosecution of criminal offenses, the warding
off of substantial dangers to public security and the performance of intelligence
service tasks. Furthermore, and under less strict conditions, the use of these data was
permitted to identify Internet users to whom specific IP addresses had been assigned
in order to provide necessary customer and traffic data to authorities requesting these
data, e.g., for purposes of copyright protection in accordance with the relevant laws.62

57 Cf. Article 4 EU DRD. Details were left to the member states due to the limited competencies
of the European Union.
58 In particular: Gesetz zur Neuregelung der Telekommunikationsüberwachung und anderer verdeckter Ermittlungsmaßnahmen sowie zur Umsetzung der Richtlinie 2006/24/EG of 21 December 2007, BGBl I 3198.
59 § 113a para 6 TKG (former version).
60 § 113a para 11 TKG (former version).
61 § 113a para 10 TKG (former version).
62 This means that if authorities already knew an IP address through their own investigations, they should be able to request information as to which subscriber the address belonged to.

Authorizations to access and use the data retained were to be put in concrete terms by provisions of specific branches of law passed, owing to the German federal system, by both the Federal Government and the Länder (states).63 The Federal
Government is competent for criminal procedural provisions and federal authori-
ties, e.g., the Federal Intelligence Service; the Länder are responsible for regulating
their police forces and intelligence services. Provisions for prosecution had been laid
down in § 100g of the Code of Criminal Procedure (StPO). This norm allowed the
criminal prosecution authorities to collect data stored by way of precaution pursuant
to § 113a TKG without the knowledge of the person concerned.64 The authorization
was limited to cases in which data access and use were necessary for investigating the facts of the case or determining the whereabouts of an accused person, and it applied only to “offenses of considerable importance” and to offenses committed via telecommunications. Data collection could only be ordered by a judge, except in cases of
imminent danger.65 The court order could only be directed against accused persons or against persons of whom it had to be assumed, on the basis of specific facts, that they received or transmitted specific messages directed to or originating from the accused, or that the accused used their line. The order did not authorize the authorities to
directly access the data, but obliged the service providers to filter out and transmit
the data in a separate intermediate step in accordance with the requirements of the
order.

4.2 Judgment of the FCC on German Data Retention Laws

The German laws implementing the DRD triggered the largest mass constitutional
complaint in the history of the Federal Republic of Germany. Following several
preliminary injunctions, the Federal Constitutional Court (FCC) reached its principal
decision on March 2, 2010.66 The FCC had to identify the relevant fundamental rights
of the German constitution and to flesh out their scope of protection. Then it had
to deal with the constitutionality of the provisions of the TKG and the StPO. Here,
we can highlight three central questions: Is precautionary storage without cause

63 According to the requirements recognized under German law, in addition to the regulations that
allow a data controller, in this case a private company, to transmit data to the authorities, further
regulations are necessary that allow the authorities to query or receive the data (“double-door
model”). Cf. more closely Albers (2001), pp. 334f.; Gazeas (2014), pp. 228f., 501ff.
64 The Code of Criminal Procedure provides for duties of ex post notification of the persons affected by measures and subsequent judicial relief.


65 §§ 100g para 2 in conjunction with 100a para 3 StPO (former version).
66 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, BVerfGE 125, 260 (with dissenting opinions of two judges), also available in English at www.bundesverfassungsgericht.de. This judgment developed several new considerations but also built on the Court’s case law developed since the census decision of 1983. See, e.g., FCC, Judgment of July 14, 1999 – 1 BvR 2226/94 et al. = BVerfGE 100, 313; Judgment of March 3, 2004 – 1 BvR 2378/98 et al. = BVerfGE 109, 279; Decision of March 3, 2004 – 1 BvF 3/92 = BVerfGE 110, 33; Judgment of February 27, 2008 – 1 BvR 370/07 et al. = BVerfGE 120, 274.

constitutional? Which requirements must the provisions meet? What is the situation
of the providers of telecommunications and anonymization services?

4.2.1 Refining the Protection of the Guarantee of the Inviolability of Telecommunications Secrecy

Article 10(1) Basic Law responds to the threats to freedom arising from the use of
telecommunications or telecommunications technology and the facilities of third
parties.
The [...] telecommunications secrecy shall be inviolable.

The guarantee refers to the use of telecommunication techniques and services as a formal reference point. It covers the communication process from the sending of
the message to its receipt within the recipient’s sphere of control.67 In the case of
Internet-mediated communication, its scope of application must be delimited vis-à-
vis Article 5(1)2 Basic Law which protects the freedom of broadcasting services,
since Article 10(1) Basic Law only protects individual communication, not public
mass communication. However, a distinction between individual and mass commu-
nication is not possible without a link to the content of the information transmitted
which is contrary to the protective function of the fundamental right, and therefore,
the FCC explained, Article 10 Basic Law applies as long as the character of the
communication in the network is not discernible.68
Its protection is not confined to the contents of the communication.69 The confi-
dentiality of the closer circumstances of the communication process is also protected.
This includes whether, when and how often telecommunications traffic took place
or was attempted between which persons or telecommunications devices.70 Article
10 Basic Law thus also protects the corresponding information and the pertinent
data recorded in connection with this information (“metadata”). The FCC explained
this in the context of the principle of proportionality: The informative value of these
data can be extremely broad. Depending on the use of the telecommunications by
the persons affected, and in future with increasing frequency, it is possible to create
meaningful personality and mobility profiles and to draw conclusions (with a more or

67 As to problems in determining the appropriate fundamental right in the field of telecommunications
and the Internet cf. FCC, Judgment of March 2, 2006 – 1 BvR 2099/04 = BVerfGE 115, 166
(185ff.). See also Albers (2010b), p. 1064.
68 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 192.
69 It does not matter whether the communication contents are of a private nature or serve business
purposes; see, e.g., FCC, Judgment of July 14, 1999 – 1 BvR 2226/94 et al. = BVerfGE 100, 313
(358).
70 This is settled case law; see, e.g., FCC, Decision of June 20, 1984 – 1 BvR 1494/78 = BVerfGE
67, 157 (172); Decision of March 25, 1992 – 1 BvR 1430/88 = BVerfGE 85, 386 (396).
Surveillance and Data Protection Rights: Data Retention … 87

less reliable degree of probability) about the individual activities and the social envi-
ronment of the persons, about their social or political affiliations and preferences, or
about internal influence structures and decision-making processes within groups.71
Moreover, the scope of protection of Article 10 Basic Law covers not only the
cognizance of telecommunications data but extends to the subsequent information
and data processing processes.72 This means that protection extends from the gathering
and storage of data, through requests by authorities and any transmission of the data,
to its further storage for security purposes and its subsequent use. The “secrecy”
protection does not end as soon as data becomes known to providers or authori-
ties but continues to set constitutional criteria. Article 10 Basic Law thus offers the
guiding norm. Other fundamental rights, such as the freedom of the press, might
be referred to in order to strengthen its protection.73 However, the data retention
decision mentioned them only marginally.74
After having clarified the scope of protection of Article 10 Basic Law, the FCC
explained that the legal obligations of the telecommunications companies to retain
telecommunications data and transfer it to state authorities have to be regarded not
as an indirect, but as a direct infringement of the rights of those communicating. It
argued that service providers, without any margin of discretion being left to them in
this respect, would be used solely as auxiliaries for the performance of tasks by state
authorities, so that the storage of the data is legally attributable to the legislature as
a direct infringement.75 It corresponds to the refining of the scope of protection that
each legally regulated step of data processing—storage (§ 113a (1) TKG), request and
transmission (§ 113b TKG), further storage and subsequent use (§ 100g StPO)—is
an independent encroachment on Article 10 Basic Law.76 The constitutional review
dealt with these steps in a differentiated manner.

4.2.2 Constitutionality of Precautionary Data Retention Without Cause

One of the core problems of the case was whether precautionary data retention
without cause as such could be constitutional at all. The complainants invoked
the famous census judgment of the FCC.77 In connection with the requirements
of purpose determination and purpose limitation established in this judgment, the
Court had laid down a strict prohibition, in the given case addressed to the state,

71 More closely: Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para
211. See also Tzanou (2017), p. 79, who highlights these considerations.
72 Leading decision in this respect: FCC, Judgment of July 14, 1999 – 1 BvR 2226/94 et al. =
BVerfGE 100, 313 (359).
73 See FCC, Judgment of July 14, 1999 – 1 BvR 2226/94 et al. = BVerfGE 100, 313 (365).
74 Cf. judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 191, 305.
75 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 193.
76 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, paras 192ff.
77 Judgment of the FCC of December 15, 1983, 1 BvR 209/83 et al. = BVerfGE 65, 1.

of collecting and storing non-anonymous data “for indefinite or not yet definable
purposes”78 in the event that it should ever be needed for not yet specified future use.
However, the FCC delimited the data retention constellation to be assessed against
this strict prohibition. For the purpose of delimitation, it described this constellation
as “a precautionary storage of telecommunications traffic data without cause for
later transmission with cause to the authorities responsible for criminal prosecution
or warding off danger or to the intelligence services”.79 It did not consider such a
form of precautionary data retention to be absolutely incompatible with Article 10
Basic Law, but as only permissible in exceptional cases and subject to particularly
strict requirements.
The problem that the telecommunications traffic data had to be stored by the
service providers without cause for six months was discussed not as an independent
requirement for the determination of purpose80 but as part of the proportionality test.
The FCC affirmed the suitability even though it admitted that “such a storage of data
cannot ensure that all telecommunications connections can reliably be assigned to
specific users, and it may be possible for criminals to circumvent storage by using
Wi-Fi hotspots, Internet cafés, foreign Internet telephone services or prepaid mobile
telephones registered under a false name”.81 However, suitability “merely requires
that the attainment of the goal is facilitated”.82 The storage was also regarded as
necessary in the sense that there were no less drastic but similarly effective means.83
The main considerations dealt with the question of whether the storage is dispro-
portionate from the outset. The FCC classified the storage as a particularly serious
encroachment. Among the criteria relevant for this classification are the extent of
storage and the absence of cause as well as the far-reaching informative value of the
stored data. Even without recording the contents of the communication, conclusions
could be drawn on, for example, social or political affiliations and personal prefer-
ences, inclinations and weaknesses. This included connections that were engaged in
with an expectation of confidentiality. The persons concerned faced increased risk
of being exposed to further investigations without themselves having given occasion
for this and of being affected by an abuse of the data collected. The infringement
was given particular weight by the fact that the storage of telecommunications traffic
data without cause “is capable of creating a diffusely threatening feeling of being

78 See FCC, Judgment of December 15, 1983 – 1 BvR 209/83 et al., BVerfGE 65, 1, 46 and 47.
79 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 207.
80 Concerning the significance of purpose determination see Albers (2005), pp. 498ff.; von
Grafenstein (2018), pp. 231ff.


81 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 207.
82 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 207. This
consideration is settled case law and is rooted in the notion that the legislature should have a margin
of assessment in this respect. This notion and the margin of assessment are the reference point for
the now widely acknowledged duty of the legislature to establish evaluations by means of which the
accuracy of legislative assumptions at the time of the enactment of a law is later checked (duties of
the legislature to observe developments and to improve the law if necessary).
83 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 208.

watched which can impair a free exercise of fundamental rights in many areas”.84
The aspect that “trust” is a central condition for the unbiased exercise of fundamental
rights had already been emphasized by the Court in its decision on the fundamental
right to the guarantee of the integrity and confidentiality of information technology
systems.85
Despite the severity of the encroachment, the Court did not consider precautionary
storage without cause of telecommunications traffic data to be absolutely prohibited
under constitutional law. Decisive for the justifiability was how the statutory frame-
work was shaped. Firstly, the storage without cause was not carried out directly by
the state, but through an obligation imposed on the private service providers. The
data is thus, the Court argued without mentioning the rather oligopolistic situation on
the market, not yet compiled during storage itself, but remain distributed over many
individual companies. The retrieval of the data by state authorities only took place
in a second step and was then carried out on a case-by-case basis in accordance with
legally defined criteria. “The formulation of the provisions giving permission for
retrieval and further use of the stored data may ensure”, the Court argues, “that the
storage is not made for purposes that are indefinite or cannot yet be determined”.86
The separation of storage and retrieval and their respective legal frameworks are
therefore at the heart of the Court’s argument. The Court pointed out that the storage
of telecommunications traffic data does not include the contents, is limited to six
months and is not directed towards total recording of all citizens' activities but takes
up, in a manner that is still limited, the special significance of telecommunications
and responds to the specific potential danger associated with them, for instance,
that criminal offenses committed essentially on the Internet would escape observation
if telecommunications data were deleted. The Court added that the precautionary
storage of telecommunications traffic data without cause must remain an exception.
In particular, it must not “in interaction with other existing files, lead to virtually all
activities of the citizens being reconstructible”.87 Legislation aimed at precautionary
storing as comprehensively as possible all data useful for criminal prosecution or the
prevention of danger would from the outset be incompatible with the constitution.
The Court introduced a kind of “surveillance overall accounting” as a duty of the
legislator.88 It even associated the normative statement that the exercise of freedom
may not be recorded and registered in its entirety with the identity-related constitutional
proviso asserted in the Lisbon decision.89
Despite these latter restraints, the grounds of the data retention decision consider-
ably weaken the strict prohibition of precautionary collecting and storing of personal
data for indefinite or not yet definable purposes, which has been emphasized since

84 Judgment of the FCC of March 2. 2010, 1 BvR 256/08, 263/08, and 586/08, para 212.
85 FCC, Judgment of February 27, 2008 – 1 BvR 370/07 et al., para 181 et sqq. = BVerfGE 120,
274 (306ff.).
86 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 214.
87 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 218.
88 More closely Roßnagel (2010), pp. 1240f.
89 Cf. FCC, Judgment of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 218.

the census judgment. This prohibition follows from the principle of purpose limi-
tation. This principle requires not only that the purposes be determinable, but also
refers to the requirement that the data be necessary for the respective purposes. A
determination of the purpose would be useless without the correlate of necessity. It
is true that the purposes, if we follow the Court’s construction of “a precautionary
storage of telecommunications traffic data without cause for later transmission with
cause to the authorities responsible for criminal prosecution or warding off danger
or to the intelligence services”, might be described more closely with a view to the
permitted uses by the entitled authorities. This description, however, anticipates a
certain course of events from the first element of storage to the further elements of
access and use which are actually relatively separate from storage (the Court itself
has highlighted this as a condition influencing the assessment of constitutionality).
The problem arises that the necessity of the data for preventive, criminal procedural
or intelligence purposes is not certain at the time of storage. The service providers
only retain the data in the event that it proves to be necessary in the future. And
what is new about the storage duties at issue here is that the precautionary storage is
carried out without further criteria which support the prognosis of a later necessity
and which otherwise include, for example, indications or assumptions of danger or
suspicion because of assessments of situations, specific findings or previous conduct
of persons. It is precisely for this reason that the vast majority of data will prove
to be unnecessary. The lack of more detailed criteria for precautionary storage and
necessity prognosis gets, it might be thought, at the core issue of the prohibition of
precautionary storage of personal data for indefinite or not yet definable purposes. But
the FCC came to the conclusion that the chosen construction does not give rise to a
fundamental verdict of unconstitutionality.90 It then tried to set restrictions by means
of the above-mentioned “surveillance overall accounting”—which, however, faces
difficulties in operationalizing the cumulative effects of surveillance measures—and by
means of a bundle of requirements which must be complied with for constitutionality
to be granted.

4.2.3 A Bundle of Constitutional Requirements

Based on the principles of legal certainty and definiteness regarding norms91 and of
proportionality, the FCC derived concrete and in part detailed constitutional require-
ments resulting in a network of dovetailed substantive and procedural precautions.
Innovative remarks concern aspects of data security, the scope of data use, trans-
parency, enforcement as well as control mechanisms and provisions ensuring eval-
uations. Also deserving special mention is the approach conceiving the individual

90 See also the comparison with Decision no. 1258 of the Romanian Constitutional Court of October
8, 2009 in: de Vries et al. (2013), pp. 13ff.
91 On the increasing importance of the principles of legal certainty and definiteness in the jurisprudence
of the FCC cf., for example, BVerfGE 110, 33 (53ff.); 113, 348 (375ff.).

elements as part of a coherent concept with the consequence that the constitution-
ality of certain elements depends on whether other elements have been formulated
in accordance with the constitution.
The precautionary storage of telecommunications traffic data without cause must
meet the requirements of the principle of proportionality. Considerations of suit-
ability and necessity as well as some of the aspects relevant to the balancing of
interests have already been advanced by the Court in its examination of whether the
contested data retention provisions could be constitutional at all: The data remain
distributed over several private providers and the storage is limited to six months and
to traffic or metadata which arise regularly in connection with the service and are
already temporarily stored for particular purposes. Beyond that, the Court empha-
sized that the severe encroachment constituted by such storage is only proportionate
in the narrow sense if a particularly high standard of data security by means of clear
and binding obligations on private service providers is guaranteed.92 This responsi-
bility on the part of the legislator to guarantee data security responds to the fact that
the risk of illegal access to data being attractive in view of its multifaceted informa-
tive value is high93 and that private service providers have only limited incentives
to guarantee data security on their own due to cost pressure. Data security require-
ments apply both to data retention and transmission, and effective safeguards are
also needed to ensure data deletion. The data security standard must be designed
dynamically: It must be based on the present state of development of the discussion
between technical experts—for example, by resorting to legal figures under ordi-
nary law such as the state of the art—and continuously incorporate new findings and
insights. Referring to the expert statements in the written submissions and in the oral
hearing of the proceedings before the Court, the FCC named as examples the separate
storage of the data, sophisticated encryption, a secure access regime through a more refined
use of, for instance, the “four-eye principle”, and audit-proof recording of the access
to data and their deletion. In this respect, the requirements to be laid down are to be
specified either by sophisticated technical regulations, possibly graduated at various
levels of legislation, or in a general manner and then specified transparently by
binding individual decisions of the regulatory authorities addressed to the individual
companies. Constitutional law also requires monitoring that is comprehensible to the
public and involves the independent data protection officer and a balanced system of
sanctions which also attaches appropriate weight to violations of data security. The
FCC endeavored with these data security requirements to ensure that the data retained
is not used in any other way than for the intended security purposes and not beyond
the time stipulated. Finally, the constitutionality of the storage of the data depends on
the regulations on subsequent access and further use being designed in such a way
that they themselves meet the requirements of the principle of proportionality.94

92 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 221 et sqq.
93 Cf. Clarke (2015), p. 128: Data retention measures and huge volumes of highly sensitive data
result in a concentrated “honeypot” which attracts attacks.
94 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 226: “The
drafting of these provisions on use, in a manner that is not disproportionate, thus not only decides
on the constitutionality of these provisions, which in themselves constitute an encroachment, but
has also an effect on the constitutionality of the storage as such”.

The retrieval by security authorities, the transmission to them and their use of
telecommunications traffic data are also subject to a number of requirements. To begin
with, the legislator must provide for a fundamental prohibition of transmission for a
confined range of telecommunications connections relying on special confidentiality,
such as counseling over the phone in situations of emotional crisis.95 Apart from that,
because the analysis of retained data allows conclusions to be drawn that might pry
deeply into private lives and enable the creation of personality or movement profiles,
the use of retained data must be for the purpose of fulfilling tasks of paramount
importance related to the protection of legal interests. This must be reflected in the
statutory description of both the protected legal interests and the thresholds that
must be reached to enable the authorities to request and then to use the data. For
criminal prosecution, this means that the legislature must provide an exhaustive list
of the criminal offenses that are to apply here, either by having recourse to existing
lists or by creating a new one; a blanket clause or a mere reference to criminal
offenses of considerable significance is not sufficient. Furthermore, a retrieval of
the data requires at least the suspicion of a listed offense based on specific facts. In
the context of preventing danger, the use of the stored data may only be permitted
to ward off dangers to the life, limb or freedom of a person, to the existence or
security of the Federal Republic or of a Land or to ward off a danger to public
safety. The statutory norm must provide for the threshold of “actual evidence of a
concrete danger to the legal interests to be protected”.96 This means that specific
given facts support the prognosis that, without intervention, damage will occur with
reasonable probability in the foreseeable future. Considering the impairments for
citizens associated with the use of the data by intelligence services, these requirements also expressly
apply to intelligence services, even though their tasks of informing the government
and therefore elucidating particular areas of activity are not solely linked to concrete
danger situations.97 Regarding procedural safeguards, the FCC required, except for
the intelligence services,98 the retrieval and transmission of the data to be subject to a
well-defined judge’s pre-emptive review,99 although Article 10 Basic Law does not provide
for such a proviso.
The limitation to certain purposes, which reflect the outstandingly important tasks
of legal protection, must also be ensured after the data have been retrieved and
transferred to the authorities requesting it, and must be accompanied by procedural
safeguards. In particular, the data must be analyzed immediately after transmission
and deleted if it is irrelevant. The retrieving authorities may, however, forward the

95 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 238.
96 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 231. Cf. also
already FCC, Judgment of February 27, 2008 – 1 BvR 370/07 et al., para 247 et sqq. = BVerfGE
120, 274 (328f.).
97 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 233.
98 In this respect, Article 10 (2) Basic Law permits the replacement of preemptive judicial
supervision by supervision carried out by an agency or auxiliary agency appointed by parliament.
99 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 247 et sqq.

data to other bodies insofar as this is done to perform tasks for which access to
this data would also be directly permissible (the so-called “hypothetical substitute
intervention”).
The FCC also required the legislature to take effective precautions to ensure
transparency in the use of data. Secret use of data could only be considered if the
purpose of the investigation for which the data retrieval serves would otherwise be
thwarted. The legislature could assume this, in principle, in cases of warding off
danger and performing intelligence service tasks, but not in cases of criminal prose-
cution, where investigative measures are regularly carried out with the knowledge of
and in the presence of the suspect. If the data subject does not have to be informed
prior to querying or transmission of his or her data, the legislature must provide for
a duty of subsequently notifying him or her.100
Furthermore, it is constitutionally required that a legal protection procedure be
available to subsequently review the use of the data. Where persons affected had no
opportunity before the measure was carried out to defend themselves against the use
of their telecommunications traffic data, they must be given the possibility of subse-
quent judicial review. Finally, a legislative formulation that is not disproportionate
also requires effective sanctions for violations of rights, whether these are incor-
porated into the general structure of the law of criminal procedure or into current
liability law, or established by more extensive provisions.
For the indirect use of data for information pursuant to § 113 (1) TKG in the form of
a claim for information from service providers for the identification of IP addresses,
the FCC stipulated less strict requirements.101 In this case, the authorities merely
receive information on which owner was registered on the Internet for a given
IP address, this owner being determined by the service providers by recourse
to the retained data. The informative value of these data is limited. Nevertheless, the
respective rules influence the conditions of communication on the Internet and limit
its anonymity. The FCC explained that the legislature must ensure that information
may only be obtained on the basis of a sufficient initial suspicion or of a concrete
danger on the basis of facts and with a view to sufficiently weighty adverse effects
on a legal interest. There is no duty to provide for a judge’s pre-emptive review.
The person affected has, in principle, the right to learn that this anonymity has been
removed, and why.
Once it had worked out the requirements, the FCC reviewed the respective legal
regulations. Closer examination showed that the contested provisions did not satisfy
the developed constitutional requirements in key respects. They were incompatible
with Article 10(1) Basic Law as a whole.

100 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, paras 243 et sqq.
101 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, paras 254 et sqq.
Cf. also on easier access to traffic data in Brazilian regulations Molinaro and Ruaro (2022), in this
volume.

4.2.4 Regulation of Anonymization Services

§ 113a TKG required telecommunications service companies, including anonymization
services, to store and, where applicable, transmit the data listed in the law, with
the costs for the necessary infrastructure to be paid by the companies. These restrictions
were to be measured against the standard set out in Article 12(1) Basic Law.
The FCC classified neither the technical effort nor the financial burdens as a dispro-
portionate impairment of the freedom of telecommunications service providers to
exercise their profession.102 It even came to the conclusion that higher security stan-
dards are necessary, which can be imposed on the providers without any convincing
constitutional objections. Moreover, it argued that § 113a (6) TKG did not make
the offer of an anonymization service de facto impossible. Rather, anonymization
services could continue to offer their users the possibility of surfing the Internet
without private individuals being able to identify their IP address; anonymity would
only be lifted vis-à-vis the state authorities under the narrow conditions of a permitted
data retrieval. The Court did not deal separately with those anonymization service
providers who themselves could not identify their users and are now obliged to ensure
identifiability. However, the impossibility of such an offer also constitutes (only) a
regulation of the exercise of a profession which can be regarded as proportionate.
As a result, the FCC did not see any constitutional concerns with regard to Article
12 Basic Law.

4.2.5 Results

The FCC did not consider the contested form of data retention to be unconstitutional
under all circumstances. The Court took the position that the constitutionality of this
“precautionary storage of telecommunications traffic data without cause for later
transmission with cause to the authorities responsible for criminal prosecution or
warding off danger or to the intelligence services” depends on how it is shaped by
legal regulations and whether sufficient limits are set at the numerous points that
can be identified. The challenged provisions did not meet the derived constitutional
requirements and therefore violated the guarantee of the inviolability of the secrecy of
telecommunications. As far as providers of telecommunications and anonymization
services are concerned, however, a violation of Article 12 Basic Law was rejected.
That means that the State is allowed to regulate the activity of anonymization services
in such a way that under certain legal conditions an identification of their users is
possible.

102 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 300 et sqq.
As long as the effects are not highly restrictive, the legislature has been granted wide discretion
concerning which burdens, and which measures to safeguard public interests in need of regulation
as a result of commercial activities, it imposes on market players in order to integrate the associated
costs into the market and market prices.

4.3 Rulings of the ECJ

Not only in the member states, but also at the European level, the data retention
established by the former DRD has been the subject of controversial debates and
decisions by the highest courts.103 Acting on an action for annulment brought by
Ireland, in 2009 the ECJ reviewed whether the DRD was
covered by the powers of the European Union. The Court’s judgment argued that the
EU could draw on its competence to adopt measures with regard to the functioning
of the internal market.104 Questions concerning the compatibility of the DRD with
the fundamental rights of the EU Charter were decided in 2014 in response to a
request for a preliminary ruling from both the High Court of Ireland and the Austrian
Constitutional Court (“Digital Rights Ireland” judgment).105 In this judgment, the
ECJ declared the DRD invalid. In the light of the protection of fundamental rights as
developed in this judgment, two years later the focus was on what requirements the
ePrivacy Directive sets for existing or newly introduced data retention regulations in
the member states (“Tele2 Sverige” judgment).106 In its recent “Quadrature du Net
and Others” and “H. K.” judgments,107 the ECJ has developed a differentiated and
detailed approach establishing a bundle of interdependent requirements.

4.3.1 The “Digital Rights Ireland”-Judgment

In its “Digital Rights Ireland” judgment, the ECJ initially examined which funda-
mental rights were applicable. The main focus was on Article 7 and Article 8
CFR.
Article 7 CFR Respect for private and family life
Everyone has the right to respect for his or her private and family life, home and
communications.
Art. 8 CFR Protection of personal data
(1) Everyone has the right to the protection of personal data concerning him or her.

103 For an overview of the discussions and decisions in member states see the contributions in Zubik
et al. (2021).
104 Judgment of the ECJ (Grand Chamber) of February 10, 2009, C-301/06, available at curia.europa.eu.
105 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, available at
curia.europa.eu.
106 Judgment of the ECJ (Grand Chamber) of December 21, 2016, C-203/15 and C-698/15 – Tele2

Sverige and Home Secretary v. Watson, available at curia.europa.eu. The joint requests for a prelimi-
nary ruling under Article 267 TFEU were made by the Administrative Court of Appeal in Stockholm
and the Court of Appeal (England & Wales) concerning the interpretation of Article 15 I of Directive
2002/58/EC read in the light of Articles 7 and 8 and Article 52 para 1 of the Charter of Fundamental
Rights of the European Union.
107 Judgments of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, and of March 2, 2021, C-746/18 – H. K., both available at curia.europa.eu.


96 M. Albers

(2) Such data must be processed fairly for specified purposes and on the basis of the
consent of the person concerned or some other legitimate basis laid down by law. Everyone
has the right of access to data which has been collected concerning him or her, and the right
to have it rectified.
(3) Compliance with these rules shall be subject to control by an independent authority.
Article 52 CFR Scope and interpretation
(1) Any limitation on the exercise of the rights and freedoms recognized by this Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognized by the Union or the need to protect the rights and freedoms of others.
[...]

The areas of application of Article 7 CFR, which includes the right to respect for
private life and communications, and Article 8 CFR, which provides a separate
fundamental right to the protection of personal data, are not easily distinguishable
from each other.108 In its early decisions, the ECJ hardly distinguished these two
fundamental rights. The “Digital Rights Ireland” judgment is characterized by a
stronger distinction: From the point of view of the ECJ, Article 8 CFR provides
protection with regard to the processing of personal data in a way that is not restricted
to private life and establishes certain distinctive data protection requirements. Article
7 CFR, by contrast, protects private life substantively. This term is not defined, but
is concretized with a view to the data and their information content.
The Court observed that, first of all, the data to be retained would make it possible,
in particular, to know the identity of the person with whom a subscriber or registered
user had communicated and by what means, to identify the time of the communication as well as the place from which that communication took place and to know
the frequency of the communications between participants during a given period.
Those data, taken as a whole, would allow very precise conclusions to be drawn
about the private lives of persons whose data has been retained, such as the habits of
everyday life, permanent or temporary places of residence, daily or other movements,
the activities carried out, the social relationships of those persons and the social environments frequented by them.109
affect free and uninhibited communications behavior as it is safeguarded as freedom
of expression in Article 11 CFR.110 Furthermore, the fact “that data are retained and
subsequently used without the subscriber or registered user being informed is likely

108 Among other things, the background to the delimitation difficulties is that Article 52 para 3 CFR
assigns the same significance and coverage to the rights of the Charter as to the corresponding rights
of the ECHR. Data protection in the ECHR, however, is essentially guaranteed by Article 8 ECHR,
the right to respect for a person’s private life, because the ECHR does not include an explicit human
right to data protection.
109 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, paras 27 and 56.
110 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, para 28.
Surveillance and Data Protection Rights: Data Retention … 97

to generate in the minds of the persons concerned the feeling that their private lives
are the subject of constant surveillance.”111
The ECJ divided the interferences with these rights into relatively independent steps. The obligation imposed on providers to retain the specified data for a certain period of time112 constituted in itself an interference with Article 7 CFR. The access of the competent national authorities to the data113 was regarded as a further intrusion. Because the directive allowed the processing of personal data, Article 8 CFR was also affected. Like the FCC, the ECJ classified the interferences as particularly serious.
The essence of these fundamental rights which must be respected according to
Article 52(1) CFR was not adversely affected from the point of view of the Court.114
As far as Article 7 CFR was concerned, it argued that the DRD did not permit the
acquisition of knowledge of the content of the communications as such. As for Article
8 CFR, it pointed out that the DRD provided that certain principles of data protection
and data security must be observed.
The crucial point then was whether the principle of proportionality was complied
with. This principle requires that acts of the EU institutions be suitable for attaining
legitimate objectives and not exceed the limits of what is necessary and appropriate
in order to achieve those objectives.115 In some tension with the 2009 ruling,116 the ECJ, when examining compliance with the principle of proportionality in the “Digital Rights Ireland” judgment, acknowledged that the material objective of the DRD was to ensure that the retained data would be available for the purpose of the investigation, detection and prosecution of serious crime and thus to contribute to public security.117 This constituted an objective of general interest, and because the data at issue were deemed to be a valuable tool in the prevention of offenses and the fight against crime, their retention was considered to be suitable for attaining the objective.
But the ECJ raised the objection that the DRD did not lay down clear and precise
rules guaranteeing that the wide-ranging and particularly serious interference with
Articles 7 and 8 CFR is actually limited to what is strictly necessary. The Court
emphasized that the directive, firstly, covered all persons, all means of electronic
communication and traffic data without any differentiation, limitation or exception

111 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, para 37.
112 Articles 3, 5 and 6 EU-DRD.
113 Article 4 EU-DRD.
114 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, paras 39 et sqq.
115 The EU legislature has a margin of appreciation here, but the ECJ highlights that, where interference with fundamental rights is at issue, the extent of the EU legislature’s discretion regarding judicial review of compliance with those conditions may prove to be limited, depending on, in particular, the area concerned, the nature of the right at issue, the nature and seriousness of the interference and the object pursued by the interference.
116 See n. 104.
117 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, paras 49 et sqq.

being made in the light of the objective of fighting against serious crime. It explained
in more detail that the DRD did not require any relationship between the data whose
retention is provided for and a threat to public security. On the contrary, it applied to all
persons using electronic communications services and therefore even to persons for
whom there is no evidence suggesting that their conduct might have any connection
with serious crime. It did not provide for any safeguards with regard to obligations of professional secrecy.118 In particular, there were no restrictions of retention in
relation to data pertaining to a particular time period and/or a particular geographical
zone and/or to a circle of particular persons likely to be involved in a serious crime, or
to persons who could contribute, through the retention of their data, to the prevention,
detection or prosecution of serious offenses.119 Secondly, the Court held that the directive failed to provide for sufficient safeguards to ensure effective protection of the data considering their vast quantity as well as their sensitive nature and the risk of abuse. Rather, the directive should have required providers to apply a particularly high level of protection and security by means of technical and organizational measures.120 Thirdly, the
directive failed to confine the authorities’ access to the data and its subsequent use to
sufficiently serious criminal offenses121 and to lay down substantive and procedural
thresholds, for instance the requirement of a prior review of authorities’ requests
for access and use by a court or an independent administrative body.122 Fourthly, the period of time for which data were to be retained was not sufficiently differentiated in terms of categories of data and their possible usefulness for the purposes or according to the persons concerned.123 That period was set at between a minimum of six months and a maximum of 24 months without any objective criteria being discernible that would have indicated how to determine the period of retention. Finally, the Court criticized the fact that the DRD did not call for data to be retained within the EU to ensure control by an independent authority.124
As a result of these considerations, the DRD was declared invalid. However, which adjustments of member state laws this judgment required was (and is) not fully clear. It is only the directive obliging member states to introduce data retention rules or to adapt their existing rules that no longer exists. This does not say anything about the extent to which such member state rules are allowed under Union law. Against this background, a number of member states have maintained or, like Germany, reintroduced in an amended form their rules on data

118 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, para 58.
119 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, paras 56 et sqq.
120 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, paras 66 et sqq.
121 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, paras 60 et sqq. On the difficulties caused by the limited areas of competence of the EU, which also explain why the DRD rules on access by security authorities do not go into detail, see Bignami (2007), pp. 238 ff.
122 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, para 62.
123 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, para 63.
124 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, para 68.

retention. With the nullification of the DRD, the compatibility of these rules with
Union law depends on the scope, the precautions and the limitations of the ePrivacy
Directive.125

4.3.2 The “Tele2 Sverige” Judgment

Questions arising in the context of the ePrivacy Directive were the subject of the
“Tele2 Sverige” ruling handed down by the ECJ in December 2016.126 As a general
rule established in this directive, providers of publicly available electronic communications services are required to erase or anonymize personal data where it is no longer needed for transmission, billing or other service purposes. However, Article
15 ePrivacy Directive states that member states may adopt legislative measures to
restrict the scope of the rights and obligations specified in this directive when such
restriction constitutes a necessary, appropriate and proportionate measure within a
democratic society to safeguard, inter alia, public security and the prevention, investigation, detection and prosecution of criminal offenses. Under these conditions and
in accordance with the general principles of Community law, legislative measures
providing for the retention of data for a limited period justified on particular grounds
are explicitly allowed.
The ECJ explained that the interpretation of Article 15 ePrivacy Directive leads
to the result that the contested national legislation on data retention falls within
the scope of EU law. From the point of view of the Court, this is true despite the
limited competence of the EU in the area of crime prevention and prosecution,
because Article 15 ePrivacy Directive expressly authorizes the member states to
adopt legislation permitting data retention only if certain restrictions are adhered
to.127 In this respect, the ECJ reaffirmed and specified the requirements it had already
set out in its “Digital Rights Ireland” decision.
Article 15 ePrivacy Directive must be interpreted in the light of the fundamental
rights guaranteed by the CFR, particularly in the light of Articles 7 and 8 CFR,
but also in the light of Article 11 CFR. As a result and in line with the “Digital
Rights Ireland” decision, it precludes national legislation which, for the purpose of
fighting crime, provides for the general and indiscriminate retention of all traffic
and location data of all subscribers and registered users related to all means of
electronic communication.128 However, the ECJ made further observations on the

125 Directive 2002/58/EC (n. 10).


126 Judgment of the ECJ (Grand Chamber) of December 21, 2016, C-203/15 and C-698/15. The joint requests for a preliminary ruling under Article 267 TFEU were made by the Administrative Court of Appeal in Stockholm and the Court of Appeal (England & Wales) concerning the interpretation of Article 15(1) of Directive 2002/58/EC read in the light of Articles 7 and 8 and Article 52(1) of the Charter of Fundamental Rights of the European Union.
127 Cf. Judgment of the ECJ (Grand Chamber) of December 21, 2016, C-203/15 and C-698/15, paras 73 ff.
128 Judgment of the ECJ (Grand Chamber) of December 21, 2016, C-203/15 and C-698/15, para 112.

conditions under which data retention rules are permitted. Beyond precise rules
governing the scope and application of such data retention measures and imposing
minimum safeguards, the retention of data must meet objective criteria that establish
a connection between the data to be retained and the objective pursued in the context
of prevention, investigation, detection and prosecution of serious crime. The national
legislation must be based on objective evidence which makes it possible to identify
a public whose data is likely to reveal a link, at least an indirect one, with serious
criminal offenses, and to contribute in one way or another to fighting serious crime or
to preventing a serious risk to public security.129 The Court further explained that such limits might be set by using a geographical criterion where the competent
national authorities consider, on the basis of objective evidence, that in one or more
geographical areas, a high risk of preparation for or commission of such offenses
exists.130

4.3.3 The “Quadrature du Net and Others” Judgment

The grounds for the “Tele2 Sverige” judgment, in conjunction with the previous declaration of the invalidity of the DRD in the “Digital Rights Ireland” decision, required adjustments to national legislation.131 However, the member states have not abolished their data retention regulations, but have, if at all, merely modified them. They have argued that there is room for interpretation of the ECJ rulings and that they have their own competences in the field of national security and public safety. As a result, further cases have been, and still are, pending before the ECJ, and over time its rulings have turned toward more differentiated approaches. In a decision that centered solely on
the accessibility of data for security agencies, the ECJ has set lower requirements
for access to mere identification data132—entirely in line with the FCC.133 The next
detailed leading decision on data retention, the “Quadrature du Net and Others”
judgment,134 dealt with the question of whether French and Belgian regulations
were compatible with European Union law. These regulations obliged electronic

129 Judgment of the ECJ (Grand Chamber) of December 21, 2016, C-203/15 and C-698/15, para
111.
130 Judgment of the ECJ (Grand Chamber) of December 21, 2016, C-203/15 and C-698/15, para 111.
131 For an overview, see the Eurojust report “Data retention regimes in Europe in light of the CJEU ruling of December 21, 2016 in Joined Cases C-203/15 and C-698/15,” Council EU, Doc 10098/17; and European Commission, Study on the retention of electronic communications non-content data for law enforcement purposes: Final report, 2020.
132 Judgment of the ECJ (Grand Chamber) of October 2, 2018, C-207/16, paras 57 et sqq., available at curia.europa.eu. The ECJ has explicitly excluded the question of whether the Spanish Ley 25/2007 de conservación de datos relativos a las comunicaciones electrónicas y a las redes públicas de comunicaciones, which provides for obligations to retain data in accordance with the invalid DRD, is consistent with the requirements laid down in Article 15(1) of Directive 2002/58.
133 See Sect. 4.2.3 of this article and n. 101.
134 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, available at curia.europa.eu.

communications operators, access providers and hosting service providers to retain more closely defined data for one year in order to make them available to specific security
authorities under specified conditions. In addition, the French regulations include
obligations covering the automated analysis and real-time collection of traffic, loca-
tion or particular technical data—beyond data retention, the “Quadrature du Net and
Others” judgment provides insights into surveillance under the conditions of the
Internet.135
The ECJ initially explained that the processing of data by providers is regulated
comprehensively by EU law and that access to these data by security authorities is
therefore also covered, even if the purposes of national security in particular are a
matter of member state competence.136 Then it developed—in comparison to the
“Tele2 Sverige” judgment—differentiated guidelines for data retention. Although
it affirmed that Article 15 ePrivacy Directive, read in the light of Articles 7, 8 and
11 CFR, must be interpreted as precluding legislative measures that provide, as a
preventive measure for broadly defined purposes, for the general and indiscriminate
retention of traffic and location data, it recognized exceptions.
For the purposes of safeguarding national security in situations where the member
state is confronted with a serious threat to national security that is shown to be genuine
and present or foreseeable, measures of the member states obliging providers of
electronic communications services to retain, generally and indiscriminately, traffic
and location data are not prohibited. The ECJ argued that even if such a measure is
applied indiscriminately to all users of electronic communications systems, without
there being at first sight any connection with a threat to the national security, it must
nevertheless be considered that the existence of that threat is, in itself, capable of
establishing that connection.137 However, the Court has set further requirements:
Clear and precise rules must ensure that the retention of data at issue is subject
to compliance with substantive and procedural conditions. The persons concerned
must have effective safeguards against the risks of abuse. The decision imposing
such an obligation has to be subject to effective review, either by a court or by an
independent administrative body. The data retention period must be limited in time
to what is strictly necessary.138
For the purposes of safeguarding national security, combating serious crime and
preventing serious threats to public security, the targeted retention of traffic and
location data on the basis of objective and non-discriminatory factors is possible for
a certain period of time, provided that such retention is limited, with respect to the
categories of data to be retained, the means of communication affected, the persons

135 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, paras 169 et sqq.
136 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, paras 91 et sqq. See also judgment of the ECJ (Grand Chamber) of October 6, 2020, C-623/17 – Privacy International, available at curia.europa.eu, paras 34 et sqq.
137 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, para 137.
138 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, paras 134 et sqq.



concerned and the retention period adopted, to what is strictly necessary.139 Once
again, the ECJ highlighted that limits might be set using a geographical criterion or
criteria that are oriented to categories of persons whose data are likely to be relevant
for the specified purposes.140 It is essential that sufficiently clear substantive and
procedural conditions and safeguards exist. This applies to both storage by private
companies and possible access and use by state authorities. In comparison to the
preceding judgments, the ECJ differentiated more clearly between these steps of
data processing. Regulations governing the access of the competent authorities to
retained data as well as their use must not only ensure compliance with the specified
purposes but must also lay down further substantive and procedural conditions.141
For example, the authorities’ access to retained data has to be subject to a prior review carried out by a court or by an independent administrative body.142
Under similar but adjusted conditions and safeguards, legislative measures that
provide for the general and indiscriminate retention of IP addresses assigned to the
source of an Internet connection are not precluded.143 In reviewing the measures
concerning the general and indiscriminate retention of data relating to the civil
identity of users of electronic communications systems, the Court arrived at lower
requirements.144
In a nutshell, data retention rules are not excluded, but are subject to a differentiated
classification and multilayered restrictive conditions. How these conditions are to be understood in detail remains controversial in the current debate.

4.4 Current Situation of Data Retention

In Germany, the legislator did not immediately adapt the provisions declared unconstitutional by the FCC, but waited for the “Digital Rights Ireland” decision of the ECJ that

139 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, paras 140 et sqq.
140 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, paras 148 et sqq.


141 Cf. judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, para 176, and the subsequent decision of the ECJ (Grand Chamber) of March 2, 2021, C-746/18 – H. K., para 49. See also judgment of the ECJ (Grand Chamber) of October 6, 2020, C-623/17 – Privacy International: Article 15 ePrivacy Directive, read in the light of Articles 7, 8 and 11 CFR, must be interpreted as precluding national legislation enabling a State authority to require providers of electronic communications services to carry out the general and indiscriminate transmission of traffic and location data to the security and intelligence agencies for the purpose of safeguarding national security.
142 See more closely judgment of the ECJ (Grand Chamber) of March 2, 2021, C-746/18, paras 51 et sqq.
143 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, paras 152 et sqq.


144 Judgment of the ECJ (Grand Chamber) of October 6, 2020, C-115/18, C-512/18 and C-520/18, paras 157 et sqq.



expressly concerned the compatibility of the DRD with European fundamental rights.
After this decision had been handed down, the German parliament approved laws
reintroducing data retention.145 The legislator claimed to comply with the requirements of both Germany’s FCC and the ECJ. The provisions were supposed to be
implemented by providers by July 1, 2017. Among other things, the range of data
permitted to be stored was reduced, e.g., e-mail services’ data was excluded, the
retention periods were differentiated, ranging between four and ten weeks according to data categories, protective measures for professional secrets were provided, the
provisions for data security were significantly tightened, the transmission thresholds
were raised, access and use for prosecution purposes were linked to a listed catalog
of criminal offenses, and various procedural safeguards were introduced. Even after the “Tele2 Sverige” decision of the ECJ, the German legislator saw no reason to change the law already enacted.
However, constitutional complaints against this new law have once again been
submitted to the FCC.146 The administrative courts concerned with the data retention
obligations have delivered several rulings, in both interim and main proceedings, holding that the obligations are unlawful.147 They highlight, in particular, the fact that the requirement
to restrict in any suitable manner the circle of persons affected by data retention has
not been met.148 The Federal Administrative Court has referred this matter to the ECJ,
questioning whether national legislators are denied any possibility of establishing
data retention without cause in order to combat serious crime, even if they limit the
risks of personality profiling and provide strict data security and access rules. In the
meantime, an oral hearing before the ECJ has taken place, the Advocate General delivered his opinion in November 2021, and a new leading decision of the ECJ is
expected in 2022.149 In view of the existing legal uncertainties, the Federal Network
Agency has provisionally suspended the providers’ obligation to retain data.
In the pending main proceedings, the FCC faces the task of assessing the German regulation against the background of EU law and the case law of the ECJ. This is
all the more challenging in view of the fact that EU legislation and, as we have seen,
the case law of the ECJ are developing dynamically. For several years, an ePrivacy
Regulation to replace the ePrivacy Directive has been the subject of fierce political

145 Gesetz zur Einführung einer Speicherpflicht und einer Höchstspeicherfrist für Verkehrsdaten, v.
10.12.2015, BGBl. I 2218.
146 1 BvR 141/16, 229/16, 2023/16 and 2683/16. Applications for interim measures have been rejected twice by the FCC because of the complexity of the legal issues and as a result of a balancing of consequences, as is usual in proceedings for interim relief, see FCC, Ruling of June 8, 2016, 1 BvQ 42/15, and ruling of March 26, 2017, 1 BvR 3156/15.
147 Administrative Court of Cologne, Ruling of January 25, 2017 – 9 L 1009/16; Higher Administrative Court of North Rhine-Westphalia, Ruling of June 22, 2017 – 13 B 238/17; Administrative Court of Cologne, Judgment of April 20, 2018 – 9 K 3859/16.
148 Cf. Higher Administrative Court of North Rhine-Westphalia, Ruling of June 22, 2017 – 13 B 238/17, paras 80 ff.; Administrative Court of Cologne, Judgment of April 20, 2018 – 9 K 3859/16, paras 88 ff.
149 Cf. the decision for referral to the ECJ: Federal Administrative Court, Ruling of September 25, 2019, 6 C 12/18. The opinion of Advocate General Campos Sánchez-Bordona was delivered on November 18, 2021; the cases are pending before the ECJ as Joined Cases C-793/19 and C-794/19.

debate. The German legislative bodies are also considering a revision of the German
data retention regulations.

5 Developing Requirements for Surveillance in the Internet Age

Data retention proves to be a good example of modern surveillance practices under the conditions of the Internet and of the needs for protection that arise from them. It has also become clear that such surveillance offers many elements to which regulation can attach. It is necessary to break down the problem in more detail with respect to a series of questions. Which data and which persons, enterprises or authorities are involved? To what extent, for what primary purposes and for what period of time does data already accrue? How is the data collection and storage activity of enterprises regulated in this respect? What about data security measures and protection against misuse? What is the relationship between primary, secondary and possibly further purposes? For what secondary purposes shall which security authorities have access to which data, under what conditions and intervention thresholds, and in what form? Which control mechanisms already exist or should be implemented?
Data retention is also a good example of the interactions between courts, in particular between national constitutional courts, the ECJ and member states’ appellate courts.150 Although the extent to which the FCC was to have jurisdiction to examine
the legal complaints was questionable, this Court fully reviewed the contested provi-
sions151 and opened up the avenue of a strategically important review of whether data
retention provisions conformed to fundamental rights. Its judgment, issued in 2010,
has developed the orientation effects in the proceedings before the ECJ which the
FCC may have expected in the case of such proceedings. This successful outcome
may have encouraged the FCC to assign to itself far-reaching judicial review powers
with regard to both national fundamental rights and the fundamental rights of the
Charter of the European Union in cases in which European Union law is being imple-
mented, as it did in its two decisions on the right to be forgotten.152 By comparison
with the data retention decisions, we can clearly see that the ECJ adopted some of
the considerations and requirements that the FCC had elaborated. In turn, by being in
some respects more critical than the FCC, the ECJ took advantage of the opportunity

150 See more closely for these interactions Slaughter (2004); Maduro (2009); Voßkuhle (2010);
Albers (2012). With regard to data retention: Zubik et al. (2021), pp. 229 ff.
151 This was initially justified succinctly by the fact that there is nothing to preclude an examination on the basis of fundamental rights if the ECJ were to declare the directive null and void by way of a preliminary ruling. Cf. Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 182. This reasoning can be challenged because its latter part is based solely on an assumption and remains hypothetical, cf. Albers and Reinhardt (2010), p. 769.
152 Judgments of the FCC of November 6, 2019, 1 BvR 16/13 and 1 BvR 276/17. As to the right to be forgotten, see Sarlet (2022), in this volume; and Schimke (2022), in this volume.

to distinguish itself as an even better fundamental rights Court.153 The role of the
non-constitutional national courts, which are no longer oriented solely to the highest courts of their country but—by requesting preliminary rulings—also to the ECJ, and which thus gain in relative independence, should also be emphasized.
Now what are the key points that the Courts have worked out for data retention
as a paradigm for modern surveillance measures? Where do their approaches and
criteria resemble each other and where do they differ?
Both the FCC and the ECJ rightly point out that surveillance is a process with
several elements: It is not limited to data collection but includes further steps such
as storage over a period of time, access to data, its analysis and the translation
of acquired knowledge into decisions.154 The Courts break down this process to a
certain extent, assess relevant steps as autonomous impairments of the relevant fundamental rights and develop relatively independent legal requirements for each.
Thus, attention should be paid to the question of how the Courts deal with the problem
that frequently arises under the conditions of the Internet: The different steps of the
surveillance processes are more or less uncoupled from each other.155 In the case
of data retention discussed here, private companies collect and store the data, while
the security authorities access and use specific data at a later date under certain
conditions. Is the data retention in any way already surveillance in the sense of the
concept that we have fleshed out, even though it is possible that there is no access at
all by security authorities, but the data are, for lack of relevance, automatically deleted after the storage period? Is this really a matter of “mass surveillance”
by the state?
The legal assessment is facilitated by the fact that telecommunications are
protected by quite abstract guarantees that cover the entire process of communication
and handling of telecommunications data (Article 10 German Basic Law, Articles
7 and 8 EU-CFR). The FCC argues that the complainants contest an extension of
storage with regard to which service providers were obliged, solely as auxiliaries, to
carry out state tasks. Hence, the retention of the data is legally accountable to the state
and to be classified as an element of state surveillance with regard to all data held
available. However, the existing “private–public assemblage” and the uncoupling of the individual steps of the surveillance still play a decisive role: Besides the Court’s assessment of proportionality, they influence its view that the data retention
case is not a matter of storing non-anonymous data for indefinite or not yet defin-
able purposes or of bulk surveillance without cause, but “a precautionary storage of
telecommunications traffic data without cause for later transmission with cause”.156
The persuasiveness of the Court’s quite sophisticated considerations suffers from
the fact that such an approach does not properly address the new problem of datafi-
cation of all life circumstances and the broad mass surveillance potential inherent

153 Cf. also Kühling (2014); Granger and Irion (2014), pp. 844ff.
154 Cf. Sect. 2.1 of this article.
155 See Sect. 2.1 of this article.
156 Cf. Sect. 4.2.2 of this article.
106 M. Albers

therein.157 The FCC itself recognizes this and tries to counter it by requiring that
the precautionary storage of telecommunications traffic data without cause must
remain an exception and that the exercise of freedom may not be recorded and regis-
tered in its entirety. However, it does not become clear what the implementation and
operationalization of these vague remarks might look like.
The ECJ also emphasizes that there is a legal obligation imposed on providers to
retain the data. In the initial judgments, it does not differentiate as strongly as the
FCC between storage by private companies and possible use by state authorities.
This is due to the fact that the directive has set few standards for avoiding the risk of
abuse of the data during storage and confining the authorities’ access to the data.158
Against such a background, the bulk surveillance character of the established data
retention suggests itself. This explains the limitations called for by the ECJ: a link,
at least an indirect one, between the data to be retained and the objective pursued
in the context of fighting serious crimes.159 It upholds this in principle in the recent
judgments, although introducing some differentiations as well. The persuasiveness
of these considerations suffers from the fact that the proposed limitations, e.g., the use of geographical criteria, seem eclectic rather than appropriate. Data
retention in the telecommunications sector aims at retaining data while its potential
link to criminal offenses can only be worked out at a later time, because only then
will the evidence for criminal offenses have been consolidated and the relevance
of certain data becomes evident. It is not as if the legislator simply refrained from setting limits that could easily be provided for. The ongoing debates stem from the fact that the ECJ hints at the compatibility of data retention with fundamental rights provided that particular limits are set, even though those limits do not capture the problem to be solved.
The next point that deserves attention concerns the description of the interests
protected. At the outset, both Courts remain relatively abstract, because they base
their decisions on broadly defined formal rights, namely the guarantee of the invi-
olability of telecommunications (Article 10 German Basic Law) and the protec-
tion of private life and data protection (Articles 7 and 8 CFR). In assessing the
intensity of the infringement and in the context of proportionality, however, they
specify the protected interests. Both Courts highlight the informative value of the
data retained.160 The influence that technologies can have on the format or selec-
tivity of data is not the main focus here; in other cases this will require appropriate
elaboration. The data retained makes it possible to create meaningful personality
and mobility profiles and draw conclusions with a more or less reliable degree of
probability about individual activities, social relationships, political affiliations and
preferences or decision-making processes within groups. The Courts add further

157 See Sect. 2.1 of this article.


158 See Sect. 4.1 of this article with the explanation that this is due to the limited competencies of
the EU.
159 Cf. Sects. 4.3.1 and 4.3.2 of this article.
160 Sections 4.2.1, 4.2.2, 4.3.1 and 4.3.2 of this article.
aspects that have been mentioned in the general discussions on the effects of surveil-
lance: The persons affected have not given any cause for being subject to surveillance
by security authorities. Such surveillance without cause and its quantitative range
as well as the details and quality of the data may result in the “diffusely threatening
feeling of being watched”161 or generate “in the minds of the persons concerned the
feeling that their private lives are the subject of constant surveillance”.162 The holders
of the pertinent fundamental rights are protected against others gaining knowledge
about them through the telecommunications surveillance in question and against
being subjected to surveillance. There may be technical tools for self-protection;
however, part of freedom is that individuals cannot be subjected to pressure to use
them and apart from this, at least the German legislator has imposed obligations on
access providers that make such self-protection more difficult.163 The Courts also
address the increased risk of being exposed to further investigations which subse-
quently turn out to be unfounded, as well as the increased risk of being affected by
abuse of the data collected. They emphasize that the effects they have mentioned can
have further detrimental consequences for the free exercise of fundamental rights
in many fields. The interests protected are described at a rather abstract preliminary
level covering a situation before specific infringements are definable and therefore
given particular weight in the balancing process. What matters in the first instance
is the possibilities offered by the data as well as the perspective of and impact on the
persons affected. The question is now whether these impairments can be regulated
and possibly excluded by means of an appropriate legal framework, as the FCC has
assumed in its decision.
The requirements for legal frameworks must once again respond to the fact that
surveillance is a multi-layered process with different steps in specific social contexts
and that the persons concerned are affected differently by the various forms, methods
and modes of surveillance. Both Courts derive a bundle of highly differentiated inter-
dependent requirements from fundamental rights with regard to the measures data
retention calls for, the individual rights to be acknowledged, control, and evaluation.
Considering the stages of surveillance via data retention, there are manifold refer-
ence points for regulation. Data retention is based on the fact that certain data already
exist and are stored. European telecommunications law is characterized by the fact
that particular regulatory requirements for telecommunications or telemedia services
determine which data is stored for how long, by which responsible data controller
and for what primary purpose. The data retention obligations then specify which data
is to be retained by which controller for how long, for what purposes, in what form
and under what conditions. The question of when access for secondary purposes is
legal also offers a variety of reference points for regulation: Under what conditions
do which security authorities have access to which data? The legislator must describe
in more detail the investigative purposes and the dangers or criminal offenses that
are supposed to legitimize access. It must also regulate which basis of facts and

161 Judgment of the FCC of March 2, 2010, 1 BvR 256/08, 263/08, and 586/08, para 212.
162 Judgment of the ECJ (Grand Chamber) of April 8, 2014, C-293/12 and C-594/12, para 37.
163 Section 4.2.4 of this article.
which degree of probability is required for the security authorities to predict that
such dangers are imminent or to assume that crimes have already been committed.
The retained data to which access is requested must be suitable and necessary for
further investigation. Each of these elements as well as their interplay can be assessed
against the background of fundamental rights standards and may or may not satisfy
the resulting requirements. For example, the FCC has set specific requirements for
the legislator to guarantee data security on the part of service providers, intended to reduce the risks of abuse as far as possible. And in its decision, the manifold restrictions on access by the security authorities were crucial.
In its recent judgments, the ECJ has also adopted a precise and detailed approach.
In addition, both Courts concluded that a list of different individual rights must
be established in the case of surveillance. This starts with claims that the contested
surveillance be ceased. Where a general cessation cannot be successfully claimed, further rights arise. These include, for example, accessory claims to protective and data security measures, rights to information about surveillance and subsequent measures, rights to the establishment of appropriate controls, and rights to judicial review.
A particularly important role has been given to an independent control of compli-
ance with the regulations. In the data retention procedures, both the FCC and the ECJ
highlight the need for prior judicial authorization or at least control by an independent authority
at central points in data processing, especially regarding the transfer of personal data
from the providers to the security authorities.164 They also emphasize that there must
be sufficient remedies for the individual to have a possible violation of his or her
rights examined through judicial review. As a result of analyses of surveillance on
the Internet, the need to establish independent oversight or controls beyond possible
violations of individual rights is also becoming increasingly clear.165
This overlaps to a certain extent with the requirement to establish evaluations.
Their aim is not to check the legality of behavior or to examine effectiveness. Against
the background of guaranteeing fundamental rights, evaluations rather have the func-
tion of checking whether the actual circumstances and developments correspond to
the forecasts and assumptions on which the legislature based the establishment of a
particular surveillance measure, for example data retention, and the shaping of its
regulation. In addition, an evaluation has the function of assessing what effects the
chosen legal regulations have, especially with regard to violations of constitutional
and fundamental rights. The purpose of the evaluation is to ensure that the experi-
ence gained from implementation is passed on to the legislature and provides the
basis for improvements. This is particularly important in the case of data retention or
other complex surveillance measures. Here, many questions of suitability, necessity,

164 Cf. Molinaro and Ruaro (2022), Sect. 4 in this volume (regarding the importance of control by
a judicial or independent administrative body in Brazilian law).
165 In its decision on the competencies of the Federal Intelligence Service for surveillance of telecommunications abroad, the German FCC places particular emphasis on the establishment of independent, comprehensive and multifarious controls beyond individual rights, see FCC, Judgment of May 19, 2020, 1 BvR 2835/17, paras 272ff., available at http://www.bundesverfassungsgericht.de. See also Bos-Ollermann (2017), pp. 147ff.
appropriateness, effectiveness of technical security measures or abuse opportunities arise. Without evaluations, these questions cannot be answered with a reasonable degree of certainty.166

6 Conclusion and Outlook

As a result of the rise of the Internet and the onlife world, data collections of private
companies, data retention and subsequent governmental data access and use will
take on ever increasing importance as surveillance strategies. The forms surveillance
takes are impacted by digitization and by the Internet. In addition, technologies are
continuously being developed, particularly in the area of artificial intelligence, that
are capable of analyzing huge amounts of data.
Retention of data generated in telecommunications in order to enable security
agencies at a later point in time to gain access to and make use of this data for security
purposes is an illustrative example of how the Internet impacts the forms surveillance
takes and creates new challenges for fundamental rights. Prompted by mass protests
against bulk surveillance, the regulations on data retention have been brought before
the national courts and the ECJ. The courts are still struggling to describe protection
needs and to develop appropriate legal requirements. Even though a spectrum of legal
requirements and a bundle of individual rights as well as demands for oversight and
evaluation have been worked out in the leading decisions, we should not expect too
much from courts. Court proceedings and court decisions have institution-specific
limitations which mean that the questions that arise are always answered only on a
case-by-case basis, selectively and from the perspective of specified judicial reviews.
In court proceedings, the overarching societal situation with the constantly growing
number of surveillance measures and their combination is just as difficult to grasp
as the dynamic development of technical possibilities.167
The substantive discussion in this field is only in its early stages. We have to
deepen thinking about appropriate solutions to the surveillance potential inherent in the Internet and in datafication.

References

Aftab S (2022) Online anonymity—the Achilles’ heel of the Brazilian Marco Civil da Internet.
In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)

166 Cf. Albers (2010a). See also more broadly Clarke (2015), pp. 129ff. Empirical studies are complicated and rare; see, as an example, Max-Planck-Institut für ausländisches und internationales Strafrecht (2011).
167 See also Koops et al. (2019), pp. 685ff. (discussing whether the protection of privacy has to be reframed).
Albers M (2001) Die Determination polizeilicher Tätigkeit in den Bereichen der Straftatenverhütung
und der Verfolgungsvorsorge. Duncker & Humblot, Berlin
Albers M (2005) Informationelle Selbstbestimmung. Nomos, Baden-Baden
Albers M (2010a) Funktionen, Entwicklungsstand und Probleme von Evaluationen im
Sicherheitsrecht. In: Albers M, Weinzierl R (eds) Menschenrechtliche Standards in der Sicher-
heitspolitik. Beiträge zur rechtsstaatsorientierten Evaluierung von Sicherheitsgesetzen. Nomos,
Baden-Baden, pp 25–54
Albers M (2010b) Grundrechtsschutz der Privatheit. Deutsches Verwaltungsblatt (DVBl):1061–
1069
Albers M (2012) Höchstrichterliche Rechtsfindung und Auslegung gerichtlicher Entscheidungen.
In: Grundsatzfragen der Rechtsetzung und Rechtsfindung, VVDStRL vol 71, pp 257–295
Albers M (2018) Recht & Netz: Entwicklungslinien und Problemkomplexe. In: Albers M, Katsivelas
I (eds) Recht & Netz, Nomos, Baden-Baden, pp 9–35
Albers M, Reinhardt J (2010) Vorratsdatenspeicherung im Mehrebenensystem: Die Entscheidung
des BVerfG vom 2. 3. 2010. Zeitschrift für das juristische Studium (ZJS):767–774
Bauman Z, Bigo D, Esteves P, Guild E, Jabri V, Lyon D, Walker RBJ (2014) After Snowden:
rethinking the impact of surveillance. Int Polit Soc 8(2):121–144
Bennett CJ, Haggerty KD, Lyon D, Steeves V (2014) Transparent lives: surveillance in Canada. AU
Press, Edmonton
Bos-Ollermann H (2017) Mass surveillance and oversight. In: Cole DD, Fabbrini F, Schulhofer S
(eds) Surveillance, privacy and transatlantic relations. Hart Publishing, Oxford, Portland, Oregon,
pp 139–154
Cate FH, Dempsey JX (2017) Bulk collection: systematic government access to private-sector data.
Oxford University Press, New York
Clarke R (1988) Information technology and dataveillance. Commun ACM 31(5):498–512
Clarke R (2015) Data retention as mass surveillance: the need for an evaluative framework. Int Data
Priv Law 5(2):121–132
Cohen JE (2017) Surveillance versus privacy: effects and implications. In: Gray D, Henderson
SE (eds) The Cambridge handbook of surveillance law. Cambridge University Press, New
York/Cambridge, pp 455–469
Coleman R, McCahill M (2010) Surveillance and crime. Sage, London, Thousand Oaks, New Delhi,
Singapore
Cukier K, Mayer-Schönberger V (2013) Big data: a revolution that will transform how we live,
work, and think. Eamon Dolan, Houghton/Mifflin/Harcourt
Determann L (2017) Business responses to surveillance. In: Gray D, Henderson SE (eds) The
Cambridge handbook of surveillance law. Cambridge University Press, New York/Cambridge,
pp 372–391
de Vries K, Bellanova R, De Hert P, Gutwirth S (2013) The German constitutional court judgment
on data retention: proportionality overrides unlimited surveillance (doesn’t it?). In: Gutwirth S,
Poullet Y, De Hert P, Leenes R (eds) Computers, privacy and data protection: an element of
choice. Springer, Dordrecht, pp 3–23
Doneda D, Zanatta RAF (2022) Personality rights in Brazilian data protection law: a historical
perspective. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Floridi L (ed) (2015) The Onlife Manifesto: being human in a hyperconnected era. Springer, Cham,
Heidelberg, New York, Dordrecht, London
Gazeas N (2014) Übermittlung nachrichtendienstlicher Erkenntnisse an Strafverfolgungsbehörden.
Duncker & Humblot, Berlin
Granger MP, Irion K (2014) The court of justice and the data retention directive in Digital Rights
Ireland. Telling off the EU legislator and teaching a lesson in privacy and data protection. EL
Rev 39:835–850
Greenwald G (2014) No place to hide: Edward Snowden, the NSA, and the U.S. Surveillance State.
Metropolitan Books, Henry Holt, New York
Grimm D (1991) Verfassungsrechtliche Anmerkungen zum Thema Prävention. In: Grimm D, Die
Zukunft der Verfassung, Suhrkamp, Frankfurt a. M., pp 197–220
Harding L (2014) The Snowden files: the inside story of the world’s most wanted man. Vintage,
New York
Hildebrandt M (2016) Smart technologies and the end(s) of law. Edward Elgar, Chel-
tenham/Northampton
Katsivelas I (2018) Das Geschäft mit der Werbung: Finanzierungsmechanismen, personalisierte
Werbung und Adblocker. In: Albers M, Katsivelas I (eds) Recht & Netz, Nomos, Baden-Baden,
pp 207–248
Koops B-J (2014) On legal boundaries, technologies, and collapsing dimensions of privacy. Politica e Società 3(2):247–264. Available at SSRN https://ssrn.com/abstract=2581726. Accessed 11 Jan. 2022
Koops B-J, Newell BC, Škorvánek I (2019) Location tracking by police: the regulation of “tireless
and absolute surveillance”. 9 U.C. Irvine Law Rev 635–698
Kühling J (2014) Der Fall der Vorratsdatenspeicherungsrichtlinie und der Aufstieg des EuGH zum
Grundrechtsgericht. Neue Zeitschrift für Verwaltungsrecht (NVwZ) 681–685
Lachmayer K, Witzleb N (2014) The challenge to privacy from ever increasing state surveillance:
a comparative perspective. UNSW Law J 37(2):748–783
Lemos R (2014) A bill of internet rights for Brazil. Available at https://www.accessnow.org/a-bill-of-internet-rights-for-brazil/. Accessed 11 Jan. 2022
Lyon D (2003) Surveillance as social sorting. In: Lyon D (ed) Surveillance as social sorting. Privacy,
risk, and digital discrimination. Routledge, London, New York, pp 13–30
Lyon D (2007) Surveillance studies: an overview. Polity Press, Cambridge/Malden
Maduro MP (2009) Courts and pluralism: essay on a theory of judicial adjudication in the context of
legal and constitutional pluralism. In: Dunoff JL, Trachtmann JP (eds) Ruling the world? Consti-
tutionalism, international law, and global governance. Cambridge University Press, Cambridge,
pp 356–380
Marsch N (2018) Das europäische Datenschutzgrundrecht: Grundlagen–Dimensionen–Verflech-
tungen. Mohr Siebeck, Tübingen
Max-Planck-Institut für ausländisches und internationales Strafrecht (2011) Schutzlücken durch
Wegfall der Vorratsdatenspeicherung? Eine Untersuchung zu Problemen der Gefahrenabwehr
und Strafverfolgung bei Fehlen gespeicherter Telekommunikationsverkehrsdaten. Gutachten im
Auftrag des Bundesamtes für Justiz, 2nd edn. Freiburg i.Br.
Milaj J, Kaiser C (2017) Retention of data in the new anti-money laundering directive—“need to
know” versus “nice to know.” Int Data Priv Law 7(2):115–125
Mitrou L (2010) The impact of communications data retention on fundamental rights and democ-
racy—the case of the EU data retention directive. In: Haggerty KD, Samatas M (eds) Surveillance
and democracy, pp 127–147
Molinaro CA, Ruaro RL (2022) Privacy protection with regard to (tele-)communications surveil-
lance and data retention. In: Albers M, Sarlet IW (eds) Personality and data protection rights on
the internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Orwell G (2008) Nineteen eighty-four. Penguin, London
Petersen JK (2012) Introduction to surveillance studies. CRC Press, Boca Raton
Poster M (1990) The mode of information: poststructuralism and social context, 2nd edn. University
of Chicago Press, Chicago
Reinhardt J (2022) Realizing the fundamental right to data protection in a digitized society. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Richards NM (2013) The dangers of surveillance. Harv Law Rev 126:1934–1965
Roßnagel A (2010) Die “Überwachungs-Gesamtrechnung”–Das BVerfG und die
Vorratsdatenspeicherung. Neue Juristische Wochenschrift (NJW) 1238–1242
Sarlet IW (2022) The protection of personality in the digital environment: an analysis in the light
of the so-called right to be forgotten in Brazil. In: Albers M, Sarlet IW (eds) Personality and data
protection rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this
volume)
Schertel Mendes L, Mattiuzzo M (2022) Algorithms and discrimination: the case of credit scoring
in Brazil. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Schimke A (2022) Forgetting as a social concept. Contextualizing the right to be forgotten. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Slaughter A-M (2004) A new world order. Princeton University Press, Princeton
Timan T, Galič M, Koops BJ (2017) Surveillance theory and its implications for law. In: Brownsword
R, Scotford E, Yeung K (eds) The Oxford handbook of law, regulation, and technology. Oxford
University Press, Oxford, pp 731–753
Tzanou M (2017) The fundamental right to data protection: normative value in the context of
counter-terrorism surveillance. Hart Publishing, Oxford, Portland
Vedaschi A, Marino Noberasco G (2017) From DRD to PNR: looking for a new balance between
privacy and security. In: Cole DD, Fabbrini F, Schulhofer S (eds) Surveillance, privacy and
transatlantic relations. Hart Publishing, Oxford, Portland, Oregon, pp 67–87
Veit RD (2022) Safeguarding regional data protection rights on the global internet—The European
approach under the GDPR. In: Albers M, Sarlet IW (eds) Personality and data protection rights
on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Voßkuhle A (2010) Der europäische Verfassungsgerichtsverbund. Neue Zeitschrift für Verwal-
tungsrecht (NVwZ) 1–8
Von Grafenstein M (2018) The principle of purpose limitation in data protection laws. Nomos,
Baden-Baden
Zedner L (2017) Why blanket surveillance is no security blanket: data retention in the United
Kingdom after the European data retention directive. In: Miller RA (ed) Privacy and power: a
transatlantic dialogue in the shadow of the NSA-affair. Cambridge University Press, Cambridge,
New York, pp 564–585
Zubik M, Podkowik J, Rybski R (2021) Judicial dialogue on data retention laws in Europe in the
digital age: concluding remarks. In: Zubik M, Podkowik J, Rybski R (eds) European constitutional
courts towards data retention laws. Springer, Cham, pp 229–249

Marion Albers Professor of Public Law, Information and Communication Law, Health Law
and Legal Theory at Hamburg University. Principal Investigator in the Brazilian/German
CAPES/DAAD PROBRAL-Research Project “Internet Regulation and Internet Rights”. Main
areas of research: Fundamental Rights, Information and Internet Law, Data Protection, Health
Law and Biolaw, Police Law and Law of Intelligence Services, Legal Theory and Sociology of
Law. Selected Publications: Recht & Netz: Entwicklungslinien und Problemkomplexe, in: Marion
Albers/Ioannis Katsivelas (eds.), Recht & Netz, Baden-Baden: Nomos, 2018, pp. 9–35; L’effet
horizontal des droits fondamentaux dans le cadre d’une conception à multi-niveaux, in: Thomas
Hochmann/Jörn Reinhardt (dir.), L’effet horizontal des droits fondamentaux, Editions Pedone,
Paris, 2018, pp. 177–216; Biotechnologies and Human Dignity, in: Dieter Grimm/Alexandra
Kemmerer/Christoph Möllers (eds.), Human Dignity in Context, München/Oxford/Baden-Baden:
C. H. Beck/Hart/Nomos, 2018, pp. 509–559, also published in Revista Direito Público, Vol. 15
(2018), pp. 9–49; A Complexidade da Proteção de Dados, Revista Brasileira de Direitos Fundamentais e Justiça, Belo Horizonte, ano 10, n. 35, 2016, pp. 19–45.
Privacy Protection with Regard to (Tele-)Communications Surveillance and Data Retention

Carlos Alberto Molinaro and Regina Linden Ruaro

Abstract Law 12.965/2014 established the so-called Civil Framework for the
Internet. The principles of the law, especially the guarantee of net neutrality, freedom
of expression and privacy of users, have been established to maintain the openness
of the Internet. However, the Civil Framework for the Internet does not close the
debate on the Brazilian Internet regulation. Privacy and the protection of personal
data, for example, are protected by the General Law for the Protection of Personal
Data (LGPD), Law 13.709 of August 14, 2018. In its Article 60, this Law amends
Law 12.965, of April 23, 2014, regarding the right of definitive deletion of personal
data, as well as the prohibition of storage of personal data which are excessive in rela-
tion to the purpose for which the data subject has given consent. The Brazilian Civil
Framework for the Internet establishes general principles, rights, and obligations for
the use of the Internet, as well as some relevant provisions on storage, use, treatment
and dissemination of data collected online. In addition, its Regulatory Act (Decree
8.771/2016) brought the first legal definition of personal data in its Article 14, letters
A and B. Other aspects of data privacy are still governed by general principles and
provisions on data protection and confidentiality in the Federal Constitution, in the
Brazilian Civil Code and in laws and regulations for other fields and types of relation-
ships (for example, financial institutions, health, consumers, telecommunications or
medical sector). This text studies the issues of privacy, data protection, and their
retention, given the rules of the Brazilian legal system, as established in the Civil
Framework of the Internet, following the general normative order. The methodology
uses bibliographic research in national and foreign doctrine, as well as legislation

Our thanks to Prof. Ph.D. Joshua Harrys (joshuaharrys@protonmail.com) of the American Academy of Arts and Sciences for his diligent proofreading of this paper.

C. A. Molinaro
formerly: Pontifical Catholic University of Rio Grande do Sul—Law School (PUCRS), Porto
Alegre, Brazil
R. L. Ruaro (B)
Pontifical Catholic University of Rio Grande do Sul—Law School (PUCRS), Porto Alegre, Brazil
e-mail: regina.ruaro@pucrs.br

© Springer Nature Switzerland AG 2022 113


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_6
114 C. A. Molinaro and R. L. Ruaro

and legal materials pertinent to the subject and, to a lesser extent, to the intended objectives.

1 Introduction

In recent years, with rapidly evolving technology and a more globalized culture,
the importance of privacy has become paramount. The legal protection of privacy
against these new technological threats is an essential point of discussion across the globe. Privacy and the protection of personal data in an era of surveillance, as well as the dissemination of surveillance technologies and practices, raise several interdependent and emerging issues surrounding the legal protection of privacy and
scientific-technological knowledge.1 However, absolute privacy is impossible in a
society where more and more people share personal information.
Modern information and communication technologies (ICT) have facilitated new
ways of communicating, distributing and collecting information. Moreover, they
foster socialization with previously unknown individuals, who become new friends, and also facilitate the exchange of advice, warnings and much more. We can now communicate directly in a way that was unthinkable a few decades ago. In
addition to the many advantages, new modes of communication have also created
risks, especially those associated with damages for violation of privacy rights and
misuse of personal data. It is worth remembering that the right to protection of
personal data results from the right to privacy, intimacy, and confidentiality.
An essay on this topic should examine how public or private information about individuals is collected, processed, stored and maintained electronically or similarly by public or private bodies. The processing of personal data must be guided by
specific purposes and based on the consent of the data subject or on some other legitimate legal basis that dispenses with this requirement. Moreover, everyone must have the right
to access personal data in the hands of a third party or oppose their processing, as
well as the right that the data are corrected or deleted. Legally, data protection is part
of the right to privacy, which is protected as a human (and fundamental) right, both in
the contemporary national constitutions and in international treaties and conventions.
This means that data protection principles already existed before the data protection
debate.2 The connection remains essential to this day. Even countries that do not
have separate legislative acts on data protection honor the right to data protection to
a certain extent, since almost all states have adhered to at least some conventions on
human rights.3

1 On the subject, notably for the US privacy protection system, refer to Donohue’s study (2016).
2 Cf. also, with regard to personality rights in Brazilian law, Doneda and Zanatta (2022), esp.
Sections 3.2 and 3.3, in this volume.
3 A quick search on the website of “Constitute” (where the currently-in-force constitutions for
nearly every independent state in the world as well as some drafts and historical texts are covered),
developed by the authors of the Comparative Constitutions Project at the University of Texas at
Privacy Protection with Regard to (Tele-)Communications … 115

In the following, we will examine the threats that cyberspace poses to the
privacy of individuals, as well as the risks that surveillance technologies generate in
public spaces and digital communication. We will study media practices and privacy
frameworks in the fields of rights and digital management in the Brazilian legal
system, especially in the context of Internet regulation, paying particular attention
to Chapters II and III of the Civil Framework for the Internet. We will
consider the problem generated by the retention of user data in the Brazilian system,
notably in Law 8,218 of August 29, 1991 (as amended by Provisional Measure 2158-
35 of August 24, 2001), in Law 12,850 of 2013 (Law on Criminal Organizations),
in Law 12,965 of 2014 (Civil Internet Framework), and in Law 13,709 of
August 14, 2018. At the sub-statutory level, Resolutions 426 of 2005, 477 of 2007
and 614 of 2013 of the National Telecommunications Agency (ANATEL), applicable
to telephony and Internet providers, are relevant.

2 Privacy is Under Permanent Surveillance, Arbitrary Surveillance that Threatens Human and Fundamental Rights

State surveillance is spreading to all levels: government authorities, officials, police,
and more general social control agencies. For this, it relies on private initiative,
both in the form of technological innovation and in the resources for tracking
individual actions.4 Surveillance technologies and their application by nation states are
permanent threats to democracy, freedom of expression and privacy.
Surveillance by the nation states has grown impressively, notably through the
collection of traffic data. Examples of traffic data may include:
• Information that tracks the source or destination of communication being
transmitted;
• Information that identifies the location of the equipment when a communication
has been made or received (such as the location of a cell-phone);
• Information that identifies the sender and recipient (including the recipients of
the copy) of a data communication included or attached to the transmission;
• Information that identifies the equipment through which any communication is
transmitted (for example, dynamic allocation of IP addresses, file transfer records,
and so on);

Austin, shows that 171 countries in the world include and guarantee by their constitutions the right
to privacy. See https://www.constituteproject.org/, search by subject. Accessed 11 Jan. 2022.
4 In a recent book, the investigative journalist Yasha Levine examines the “private surveillance
business” that drives technology giants such as Google, Facebook and Amazon. He reveals how
these companies spy on their users and take advantage of this to provide information (with obvious
profits for themselves) to the military and intelligence services. Levine shows that the military
and Silicon Valley are effectively inseparable: Levine (2018).

• Anything, such as addresses or marks, written on the outside of a postal item (such
as a letter or package) that is in transmission;
• Tracking of online communication (including postal items and parcels);
• Telecommunications data generated whenever someone makes a phone call,
accesses the Internet or sends an e-mail.
In any case, it is indispensable, in the rule of law, to maintain democratic principles
of access to the network, as well as security in the transmission and communication
of data. The war against terror may be the most common justification
for the emergence of the surveillance state, but it is neither the only nor the most
important reason.
The question is not whether we find ourselves in a surveillance state, but rather
what kind of surveillance state we are living in. Do we have a government
without sufficient controls over public and private surveillance, or do we have a
government that protects individual dignity, freedom of expression and privacy rights,
reinvigorating the rule of law?
In the surveillance state, the government uses intelligence, espionage and data
mining (collection, alteration, cleansing, modeling and analysis) in order to
uncover the information needed to prevent potential threats, but also to manage,
administer and provide social services. States of this type are a particular case of a
political unit founded on the so-called information society: states that attempt to
identify and resolve governance issues through the collection, compilation, analysis
and production of information.
Within this state model, privacy is the breaking point between freedom and security.
This is particularly true because of the perplexity and increasing
uncertainty generated by the threats of terrorism and the transgressions that occur on the Internet
(the ample space of cybercrime). These facts have stimulated restrictions on (and
violations of) individual rights, affecting private life.5
Privacy is under permanent surveillance, arbitrary surveillance that threatens
human and fundamental rights. Surveillance is understood here in a broad sense,
as a systematic and methodical activity and mode of investigation, comprising the
surveillance of the actions or communications of one or more institutions of any type or
of people. It is a revival of Bentham’s Panopticon, more sophisticated and intrusive.
Alternatively, we can find a new vision of the same in the works of Foucault,6
where at least one or more particular activities, combined or not, are present, among
others: (a) behavioral surveillance, (b) communications surveillance, (c) data surveil-
lance (interception), (d) location surveillance and tracking, (e) body surveillance
(biometrics).

5 For these issues, we have well-founded and accurate literature, for example in English: Keller
(2017); in German: Reischl (2013); in Spanish: Rodríguez (2014); in Portuguese: Otero (2015).
6 On this subject, although from a different critical approach and by way of metaphor, consult the
doctoral thesis completed at the University of Twente, Netherlands, from 2007 to 2012 by Dorrestijn (2012),
especially Chap. 3, Technical mediation and subjectivation: Philosophy of technology after Foucault. Available
at: https://bit.ly/2FESCGM (permanent link). Accessed 11 Jan. 2022.

Against this background and in the light of systematic intrusions into privacy
caused by invasive ICT innovations, do the democratic models of contemporary
western states provide a comprehensive system of fundamental and human rights?
The problem is not to affirm the existence of a system of rights, but rather to
uphold and enforce those rights regarding surveillance or espionage practiced by
such invasive technological innovations.7

3 Democracy, Surveillance and Privacy

Since September 11, 2001, when the United States declared war on terrorism, the
global democratic system has undergone substantial changes, leading to losses in
the enforcement of individual and collective rights and thereby undermining democratic
progress. This new reality has led to neglect of the realization of rights and,
therefore, to a setback in the debate on these issues, both within the USA and in
the international order. Thus, we have lost the democratic gains of recent decades
because of the naive belief that intelligence agencies can somehow be more efficient
and effective when stepping away from democratic procedures. For the future, the
objective is to avoid a gap between two poles: efficiency and legality. The aim of
democratic states must be to guarantee intelligence services that are effective and
capable of operating within the limits of law, politics, and ethics.
The National Security Agency (NSA) spying programs revealed by former intelli-
gence contractor Edward Snowden shocked not only Internet users but the entire interna-
tional community.8 The procedures of the intelligence agency contradicted political
agreements, violated international law, and threatened the privacy of governments,
corporations, and individuals. By mid-2013, when Snowden’s earliest information
came to light, WikiLeaks had already released significant leaks of sensitive data about
the actions of the U.S. and its allies in the Middle East and a monumental amount
of communications from U.S. diplomatic officials. In mid-2010, WikiLeaks had

7 From a Latin American perspective see the important contribution of LAVITS—Red Latinoamer-
icana de Vigilancia, Tecnología y Sociedad y de la Fundación Vía Libre (Latin American Surveil-
lance, Technology and Society Network and the Vía Libre Foundation) that resulted in the book
edited by Camilo Rios, ¿Nuevos paradigmas de vigilancia? Miradas desde América Latina. Memo-
rias del IV Simposio Internacional Lavits, Buenos Aires (2016). (New Paradigms of the Surveil-
lance? Views from Latin America. Memories of the IV International Lavits Symposium. Córdoba:
Fundación Vía Libre 2017), articulated in nine main areas: (1) Security doctrine and intelligence
services. (2) Surveillance, space and territory. (3) Identification and biometrics. (4) The seduction
of the number: decisions based on algorithms and Big Data. (5) New challenges in the protection of
personal data. (6) Corporate surveillance. (7) Cultural dimensions of surveillance. (8) Resistance
and counter-surveillance. (9) Art, surveillance and technology. Available online at: https://bit.ly/2TZ91dc. Accessed 11 Jan. 2022.
8 As the scandal widened, BBC News looked at the leaks that brought US spying activities to light:
Edward Snowden: Leaks that exposed US spy programme, BBC News (US & Canada), 17 Jan
2014. Available online: http://www.bbc.com/news/world-us-canada-23123964. Accessed 11 Jan.
2022.
published information about offensives in Afghanistan and Iraq, presenting a video
in which the crew of an Apache helicopter shot and killed civilians, including
Reuters employees.9
The public embarrassment caused by the massive leak of November 2010
did not prevent other shameful episodes and exposed the greatest
financial and military power on the planet. The Snowden case introduced
further issues into the global debate on politics and technology, society and information,
transparency and privacy. Snowden reported to journalist Glenn Greenwald of
The Guardian that the NSA ignored its constitutional limits, surveilled citizens
outside the United States’ borders, and spied on heads of state. He revealed that
German Chancellor Angela Merkel and Presidents Enrique Peña Nieto (Mexico)
and Dilma Rousseff (Brazil) had their mobile phones placed under surveillance.10 The
intelligence agency collected metadata from the communications of these leaders
and close advisors, thereby achieving possible political and commercial advantages.
According to Greenwald, with the collaboration of technology companies such as
Apple, Microsoft, Google, Verizon, and Facebook, the NSA could access, collect
and analyze massive amounts of personal data from tweets, Facebook posts, Skype
conversations and chats, browsing history, SMS messages, files in SkyDrive, mobile
telephony metadata and even private banking transactions. The agency would be
able to identify IP addresses, quickly locate users and track browsing in real time.11 All these
intrusive practices—either by the national states or by private companies—against
the democratic principles affect human and fundamental rights. It should be noted
that nowadays this rights violation takes place in states that have always had models
of highly developed democracies, such as the United States of America and Great
Britain.
The traditional set of rules (national and international) that aims to protect
freedom of expression and privacy is not achieving its objective, whether due to inef-
ficiency, lack of political will, economic and financial constraints or security reasons. Conse-
quently, contemporary states find themselves at a crossroads: either maintain
a highly developed democracy and neglect security, or reinforce models for
preserving state security through intrusive measures in private life.
The dilemma has led states to solutions founded on intelligence, regardless of
how far these services contravene democratic principles.
It is worth mentioning that intelligence involves secrecy of sources
and methods, the uncovering of private facts, and even the use of funds that, though
not exempt from control, are subject to a particular regime that limits public scrutiny
of the way they are spent. Intelligence activity is therefore not a regular activity of the
democratic state. It is an exceptional activity of the state, confined either to activities
regarding foreign countries, and in this respect to the most critical issues of foreign,
economic and defense policy, or to activities within the state on matters strictly

9 Collateral Murder, Website WikiLeaks, https://collateralmurder.wikileaks.org/. Accessed 11 Jan. 2022.
10 Leleux (2014), p. 172.
11 Greenwald (2014), pp. 90 ff.

focused on identifying the threats that would probably destroy the state and the
democratic system.
Since intelligence is an activity that is often part of the administrative
and political structure of the state, we would like to address the question of why it is
necessary to control intelligence activity. The answer lies in the
fact that no state activity can be exempted from public control, which ensures that
it is carried out with legitimacy on the one hand and with cost-effectiveness, efficiency and
effectiveness on the other. We emphasize: the legality of intelligence activity must
be linked to compliance with the constitutional, legal and regulatory
norms in force in the respective country, i.e., intelligence activity must be fully
subordinated to the laws and must furthermore respect the individual rights of citizens.
The benefit lies in the appropriate relationship between the means available to the
agencies that administer and use public resources and the final product obtained:
intelligence.
We can find various types or forms of control that allow us to integrate intelligence
activity into a real democracy. For example, we can adopt non-partisan political
control, held initially by the central political power itself, to verify that intelligence
activities respond appropriately to the needs of society. In addition to
political control, there can and must be a fundamentally professional control, carried
out by the head of the intelligence agency over the behavior of its subordinates as
well as over the legitimacy and suitability of activities to society’s interests.12
We can also envisage political control that calls for zeal, objectivity, depth, prudence
and reserve in its realization and that seeks to verify both the legitimacy and the effec-
tiveness of intelligence activity, the latter being an activity that is merely reactive,
episodic and responsive to contingencies; such control tries to continually promote the neces-
sary changes, making recommendations and stimulating appropriate behaviors and
attitudes within its sphere of competence. It also requires that political parties tran-
scend their own interests. Moreover, we can establish control over those activities of
the intelligence agencies that affect the privacy of citizens. Therefore, we must verify
that these activities have a sole purpose, invoked and authorized by the Authority,
for example, national security, and ensure that interference in the private sphere is
reduced to the minimum possible. This control also includes receiving complaints
from individuals for alleged damage caused by intelligence activity. It is
exercised by different means, according to the laws of each country, assuming the
necessity of authorization for intelligence agencies to carry out privacy intrusions.13

12 Important for this issue is the work organized by Born and Caparini (2007), pp. 217–236.
13 Ugarte (2002a, b) A Procura de Legitimidade e Eficácia, pp. 40–62; Ugarte (2002a, b) Europa y
América Latina, una visión comparativa; Cepik and Ambros (2014), pp. 523–551. https://doi.org/10.1080/02684527.2014.915176.

4 Privacy, Data Privacy and Law 12,965/2014

The Brazilian Federal Constitution of 1988 shapes the country’s legal-political profile
and is characterized by its rigid form, organizing a democratic state under the rule
of law in a federative republic. The Constitution of the Republic promotes and guar-
antees freedom of expression, freedom of the press, the right to information (Article
5, IV, IX, and XIV; Article 220 of FC/1988), and the right to privacy (Article 5,
X, FC/1988). Thus, the constitutional guarantee of freedom of expression, in all its
modalities, and of privacy, in all its dimensions, is a legal requirement and an impera-
tive associated with the protection of the human and fundamental rights enshrined in the
Constitution. Similarly, most contemporary ‘Constitutional States’ embrace
these rights, the paramount feature of democratic regimes. When it comes to regu-
lating the Internet, Law 12,965/2014 was undoubtedly the turning point; the law
was introduced by means of a collaborative method involving Brazilian civil society
and through a productive process of public consultation.14
Law 12,965 of April 23, 2014, known as the Civil Framework or “Brazilian
Civil Framework for the Internet” and also called the “Charter of Internet Rights”,
establishes principles, guarantees, rights and obligations for the use of the Internet
in Brazil. In its Chapter I (Preliminary Provisions), Article 2, the law bases the
regulation of Internet use in Brazil on respect for freedom of expression, as well as
on recognition of the scale and global nature of the network. It prioritizes the protection
of human and fundamental rights, the consistent development of personality and
the exercise of citizenship through digital media. It is also based on the protection
of plurality and diversity as well as on the character of “openness” and “collaboration”.
It defends free enterprise, free competition, consumer protection and, not least, the
social purpose of the global network.
Among other legal principles applicable to the advancement and protection of the
use of information and communication technologies regulated in the Brazilian legal
system, the Civil Framework of the Internet establishes the following principles: the
guarantee of freedom of expression, communication and manifestation of thought
(as well as freedom of conscience or ideas), protection of privacy and protection of
personal data as well as the preservation and the guarantee of network neutrality.15
In Chapter II, entitled: “On the rights and guarantees of users”, among other priv-
ileges and civil liberties, Article 7 establishes that the full exercise of citizenship
finds substantial support in free access to the Internet. Therefore, it guarantees the
inviolability of intimacy and privacy, as well as their protection and compensation
for material or moral damages resulting from their violation and the confidentiality

14 The full text of the law can be consulted online on the website of the Presidency of the Republic of
Brazil, Civil House, Sub-office for Legal Affairs (http://bit.ly/1kxaoKm). An English version can be
obtained online from the Comitê Gestor da Internet no Brasil—CGI.br (Brazilian Internet Steering
Committee) (http://bit.ly/2FyKmZr). Accessed 11 Jan. 2022.
15 Law 12,965/2014, Article 3, items I to IV. Regarding network neutrality, see also
Decree 8,771 of May 11, 2016, which regulates Law 12,965/2014; its Chapter II sets out
the admitted hypotheses of data packet discrimination and Internet traffic degradation. Available
online at: http://bit.ly/1TRNpKo. Accessed 11 Jan. 2022.

of private communications stored on (own) servers or clouds (except by court order).
Law 13,709/18 added to this provision the right to definitive
deletion of personal data at the request of the user, except for data whose custody
is mandatory by law. The same rule (Article 7) guarantees that the Internet connection
will not be suspended, except for debts arising from its use. Moreover,
the rule requires maintenance of the agreed quality of the Internet connection
(here, too, the importance of “neutrality”). The rule should
be read in parallel with Chapter I, Article 3, Section IV, in conjunc-
tion with Chapter III, Section I. The guarantee of “neutrality” reflects a democratic
principle of the Internet that ensures the freedom and openness of technology
without discrimination. The Internet service provider must ensure (also with regard to
net neutrality) the quality and speed of the network, which must be the same throughout
the contract period; in the same sense, the connection class must be the same, even
at peak times.16
Users are guaranteed the right to clear and complete information in service
contracts, specifying the protection regime for connection logs and logs of
access to Internet applications, as well as traffic management practices that may affect
the quality of the service provided. They have the right not to have their personal data
disclosed to third parties, including connection logs and data on access to Internet
applications, except with free, express and informed consent or in cases provided for by
law. The law also guarantees users the right to the definitive deletion of their data at
the end of the relationship between the parties, except in cases of mandatory retention
of records provided for by law. Finally, the publicity and clarity of any terms of
use of Internet connection providers or Internet application providers are guaranteed.
Furthermore, Article 7 states that users are entitled to accessibility, taking into account
their perceptual, sensory, intellectual and mental characteristics, as prescribed by law.
In addition, the application of consumer protection rules to consumer relations on
the Internet is guaranteed.17

16 In the United States of America, on the issue of network neutrality the Federal Communica-
tions Commission (FCC) has decided that fixed broadband will be reclassified as “information
service” and mobile Internet as “interconnection service”. Framed in this way, the two modalities
of connection leave the scope of the FCC and can be marketed in accordance with the market
interest. According to the Commission, infringements of network neutrality will be based on the
existing rules in antitrust and consumer law protection laws. What the FCC will do is increase the
demand for transparency. Operators will be required to inform: how they manage the network, the
performance of the network, and commercial terms of the service. According to the FCC, this helps
consumers to choose what works best for them and allows the persons, entrepreneurs and other small
businesses to get the information they need to innovate because they are the individual consumers,
not the government or its agencies that will elect the type and the nature of Internet access that is
best suited to their individual needs (for more information see the Website of FCC: https://www.fcc.
gov/, permanent link). The repeal of the guarantee to network neutrality is the highest point in the
series of regulatory flexibilities that the FCC has been doing during Donald Trump’s management
in the White House. The Commission has already loosened rules that prevented the concentration
of radio stations and TVs.
17 Law 12,965/2014, Article 7, items I to XIII. In the Brazilian literature see: Damásio and
Milagre (2014); Souza and Lemos (2016). In English see: Souza et al. (2017). Available online at:
http://bit.ly/2G87mMv. Accessed 11 Jan. 2022.

It is important to note that the “hard core” of the Brazilian Civil Internet Frame-
work lies in Chapter II, Article 8, where the rights to privacy and freedom of
expression in communications are guaranteed as a prerequisite for the full exercise
of the right of access to the Internet. The provision implies that the respon-
sible providers must protect the records, personal data and private communications
of users, all with the aim of preserving the intimacy, privacy, honor, and image of
users. It should be remembered that, to protect intimacy and privacy, such
information may only be disclosed by judicial order, except for the possibility of
administrative authorities obtaining registration data in accordance with the law.
Non-compliance with these obligations entails the application of the penalties
provided for in Article 12 of Law 12,965/2014, in addition to the other penalties
provided for by different legal rules applicable according to the nature and gravity
of the violation and the damages resulting from it. Sanctions also apply to any
contractual clause that denies the user the guarantee of the inviolability of their data,
privacy, and confidentiality, or that fails to adopt Brazilian jurisdiction for possible
legal actions aimed at holding providers accountable.18
The Civil Framework of the Internet also extends to the user the rights of consumer
protection legislation. The affirmation of rights and the precise definition of what the
Internet represents for current social life are particularly important aspects for users,
which shows that the Civil Framework has followed the option desired
by most users and not the one intended by the market. In this sense, the Frame-
work incorporates a demand of the citizenry and not the business style of the
markets. Chapter III is essential because it deals with Internet connection and appli-
cations; it is the most sensitive part for the analysis, considering that its wording
deals with the protection of records, personal data, and private communications, as
well as liability for damages caused by content generated by third parties. The
chapter, as discussed above, also addresses another issue of concern on the subject:
net neutrality.19
Article 10 of Brazilian Internet-law stipulates that the retention and accessibility
of connection logs and access to Internet applications logs, as well as of personal
data and of the content of private communications, must be determined by the safe-
guarding of intimacy, private life, honor, and image of the persons directly or indi-
rectly involved. The provider responsible for the safekeeping of the data shall only be
obliged to make the records available, whether separately or associated with personal
data or other information that may contribute to the identification of the user or the
terminal, upon court order and in compliance with the provisions of the law. Further-
more, the content of private communications may only be made available by court
order, in the cases and in the manner established by law; however, the provision does

18 For a closer look see the well-articulated propositions in Comentários ao Marco Civil da Internet
(Comments on the Civil Framework of the Internet), from ABDET—Academia Brasileira de Direito
do Estado (BASEL—Brazilian Academy of State’s Law). Available online at: https://abdet.com.br/
site/wp-content/uploads/2015/02/MCI-ABDET..pdf. Accessed 11 Jan. 2022.
19 Footnote n. 15 and 16, retro.

not prevent access to registration data disclosing personal qualification, affiliation
and address, in accordance with the law, by administrative authorities with
legal competence to request them. In addition, security and confidentiality measures
and procedures must be communicated by the person responsible for the provision of services
and must meet the standards defined in the regulations, in compliance with the right to
confidentiality of business secrets.
The Internet legal framework requires respect for Brazilian legislation regarding
the rights to privacy, the protection of personal data and the confidentiality of private
communications and records when collecting, storing, retaining, and handling
personal data or files. Article 11 states that in any operation of collection, storage,
retention, and treatment of personal data or communications data by connection
providers and Internet application providers in which at least one of these acts takes
place in national territory, the rights to privacy, protection of personal data and
confidentiality of communications and private records must be respected. This
safeguard is necessary in an environment dominated by transnational corporations
based in countries with very flexible rules, if any such rules exist at all. In
this sense, the Brazilian Civil Framework guarantees a national forum for the resolu-
tion of litigation, as well as a guarantee to users in a potentially asymmetric dispute. The
law mentions civil, criminal and administrative sanctions, and defines four types of
penalties, which may apply individually or cumulatively. These are: (a) a warning,
indicating the deadline for the adoption of corrective measures; (b) a fine of up to
10% of the annual turnover of the economic group in the country; (c) the temporary
suspension of activities; and (d) the prohibition of operations in the country. The
law further states that it is the responsibility of the system administrator to maintain
connection records, in a confidential, controlled environment, for one year, and
that this responsibility cannot be transferred to third parties.20 The measure inhibits
the sale or leakage of data, but the legislation is flawed in that it does not define sanctions in
case of noncompliance.
A question regarding the Civil Framework of the Internet is whether the law
guarantees the privacy of users. Can any law guarantee privacy? The answer is yes,
because it is prescribed by law; but can we effectively count on full application of
the law, or do we have (as in all peripheral countries) a legal enforcement deficit? It
would be naïve to think that the law alone could resolve a set of complexities like this. Norms
arise when consensus is distant and ethical reflection has not been sufficient
to create an environment of equilibrium. The law prescribes conduct, but that does
not preclude its contempt or manipulation. In other words, the law is an important
starting point, but the establishment of a legal order on the Internet in Brazil (and in
the world) will depend on other efforts, also in the fields of education and culture.
However, Brazilian law responds positively to the question of whether instruments
are offered to citizens to safeguard their browsing and application data. Despite the
affirmative answer, some of these instruments are still weak.

20 See Article 13 of Brazilian Civil Framework, in particular, paragraph 6, which states that in
the application of the penalties for non-compliance “the nature and gravity of the infraction, the
damages resulting from it, any advantage obtained by the offender, the aggravating circumstances,
the offender’s history, and recidivism shall be considered” (free translation of the law).
124 C. A. Molinaro and R. L. Ruaro

5 Privacy, Data Retention, and Brazilian Legislation Issues

On March 24, 2015, the United Nations Human Rights Council supported the
establishment of a special rapporteur on privacy. The resolution,21 led by Brazil
and Germany and cosponsored by 46 states, including 10 other Latin American
countries, gives the right to privacy the recognition and international protection it
deserves. Why has data retention provoked such a strong reaction from digital secu-
rity experts?22 Does it matter whether citizens’ application records or IP addresses—
which can change with each new session—are catalogued? Such data, after all, is
often dismissed by government officials as meaningless. However, this informa-
tion may work more like pieces of a complicated jigsaw puzzle. Taken separately,
it may seem irrelevant; but when carefully collected and combined, it reveals our
online identities with surprising accuracy. Regarding telephone communications,
for example, information about whom people call can be used to derive extraordinarily
sensitive insights about them, such as the fact that someone has sought and received
treatment for a specific medical condition. The IP addresses collected by a web service
can even reveal whether two people spent the night in the same place. The information
released by a phone can show whether two people were close to each other, and
that proximity could reveal whether a person attended a protest or was unfaithful to a spouse.
The consequences of data retention mandates are far-reaching, but one particularly
worrying result is the undermining of the right of journalists to refuse to provide
evidence to the authorities to protect the confidentiality of their sources. One of the
most persuasive arguments against data retention is the risk of accidental exposure
through leaks and hackers. Unauthorized access to logs can lead to the sale of confidential
information, blackmail, or threats to public figures. Retained records are a continuing
source of risk to all users, because a user can never be sure that they will not
be disclosed and used for criminal purposes.23
The origin of the data retention obligation in Brazil is in Article 11 of Law 8,218,
of August 29, 1991 (as amended by Provisional Measure 2,158-35, of August 24,
2001).24
In the infra-legal environment, the possibility of data retention is provided in the
Annex of the Resolution of the National Telecommunications Agency—ANATEL
426, of December 9, 2005 (Regulation of the Switched Fixed Telephone Service).

21 United Nations. General Assembly A/HRC/28/L.27 Distr.: Limited. 24 March 2015 (Original:
English). Available online at: https://bit.ly/2PjO0IZ (reduced URL, permanent link). Accessed 11
Jan. 2022.
22 Cf. also for the heated discussions in Germany and Europe Albers (2022), in this volume.
23 Rodriguez (2015). Available online at: https://bit.ly/2RomTAH (reduced URL, permanent link). Accessed 11 Jan. 2022.
24 Article 11, Law 8,218/91: “Legal entities that use electronic data processing systems to register business and economic or financial activities, to keep books or to prepare documents of an accounting or tax nature, are required to keep the individual digital files and systems at the disposal of the Federal Revenue Service for the decadence period provided for in the tax legislation” (free translation of the law).
Privacy Protection with Regard to (Tele-)Communications … 125

Two years later, the Annex to ANATEL Resolution 477/07, which enacts the Regulation
on the Personal Mobile Service, determines in item XXII of Article 10: “to keep,
at the disposal of ANATEL and other interested parties, documents of a fiscal nature,
which include information on the connections made and received, date, time and
duration of the call, as well as subscriber registration data, for a minimum term of five
(5) years” […].25 On August 6, 2012, under the Annex of ANATEL Resolution 614,
providers of telephone services became obliged to provide data and to allow online
access to the applications, systems, resources and technological facilities they use for
the collection, processing and presentation of data and information (Article 38 of the Resolution).
On August 2, 2013, Law 12,850 was published, which defines criminal organizations
and provides for the form of the criminal investigation, the means of obtaining evidence,
related criminal offenses and criminal procedure. Under this law, authorities are guaranteed
access, without judicial authorization, only to the registration data of the person under
investigation that exclusively disclose personal identification, parentage and address,
as maintained by the Electoral Justice, telephone companies, financial institutions,
internet providers and credit card administrators. Likewise, transport companies must
allow, for five years, direct and permanent access by the judge, the Public Prosecutor’s
Office or the police delegate to their reservation and travel registration databases. In
addition, fixed and mobile telephone concessionaires must, for five years, keep records
identifying the telephone numbers of origin and destination of international,
long-distance and local telephone calls.26
In the infra-legal plan, Annex I of ANATEL Resolution 614/13, which regulates
the Multimedia Communication Service, requires in its Article 53 that Internet
connectivity providers maintain connection records and subscriber registration data
for at least one year. The Resolution establishes the definition of connection records in
Article 53.4, XVII (the set of information regarding the date and time of the beginning
and end of an Internet connection, its duration and the IP address used by the terminal
for sending and receiving data packages, among other elements that allow the access
terminal used to be identified). It should be noted that this shorter term already reflects
the debate that preceded the enactment of Law 12,965/14 (Civil Framework of the Internet).
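The statutory definition just quoted can be pictured as a minimal data structure. The sketch below is purely illustrative: the field names and values are invented, not taken from the Resolution, and duration is shown as derivable from the start and end timestamps the provider must retain.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ConnectionRecord:
    # Fields mirror the elements named in the Resolution's definition:
    # start and end of the connection, and the IP address used by the
    # access terminal. Names and values here are hypothetical.
    start: datetime
    end: datetime
    ip_address: str

    @property
    def duration(self) -> timedelta:
        # The duration element follows from the start and end times.
        return self.end - self.start

record = ConnectionRecord(
    start=datetime(2014, 5, 1, 20, 15),
    end=datetime(2014, 5, 1, 21, 45),
    ip_address="200.147.67.142",
)
print(record.duration)  # 1:30:00
```

Even this toy structure makes the privacy point of the surrounding discussion visible: each record ties an IP address to a person at a precise time span, which is exactly what makes retained logs valuable to investigators and dangerous when leaked.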
Article 10, paragraph 3, of the Civil Framework of the Internet explicitly
states that records of Internet connections and of access to applications
may be disclosed only by court order; this protection is repeated in Article 13, paragraph
5, and Article 15, paragraph 3. Article 22, in turn, outlines the purposes for which this may
occur, namely the creation of an “evidence basis in civil or criminal proceedings”,
and establishes the requirements to be met by the request of the “interested party”
for the granting of the judicial order: well-founded evidence of the occurrence of the

25 Copy of normative act at: https://bit.ly/29N0qX6 (reduced URL, permanent link). Accessed 11 Jan. 2022.
26 Law 12,850, of August 2, 2013, Article 17. Copy of normative act at: https://bit.ly/1MTFk42 (reduced URL, permanent link). Accessed 11 Jan. 2022.



offense, reasoned justification of the usefulness of the records requested for purposes
of investigation or evidentiary instruction, and the period to which the records refer.27
Regarding the terms of access to registration data, paragraph 3 of Article 10 of the
Marco Civil provides that the respect for the protection of personal data and private
communications guaranteed in the caput of the article “does not prevent access to
registration data informing about personal qualification, affiliation, and address, in
accordance with the law, by administrative authorities that have legal competence for
their request”. Regarding this provision, it is not clear who these “administrative
authorities” empowered to directly request registration data are, which enables
several governmental authorities to claim this prerogative for themselves. Finally, the
breach of confidentiality of the content of electronic communications in possession
of Internet application providers (such as Google and Facebook) is also provided for
in the Civil Framework of the Internet, in Articles 7, III, and 10, paragraph 2, which
make clear the need for a court order to do so. Unlike the regime for the provision
of records (Article 22), however, the law does not explicitly address the formal and
substantive requirements that must be met for such a court order to be granted. This
opens the door to abuse and to inconsistent case-by-case application.28

6 Conclusion

The involvement of information technologies in social and individual contexts has
become ever more pervasive and invasive.
Surveillance and control devices are not exclusive to the virtual space. Whether
through surveillance cameras or GPS monitoring, such mechanisms are used to
monitor streets, schools, airports, buses, churches, shops, bank branches, and even
cell phones. When we register on social networking sites, free e-mail services, or retail
e-commerce sites, we disclose data about our private lives, such as personal tastes and
preferences. With this, we become potential customers since, through dedicated software
interacting with Big Data, consumption patterns (profiles) are configured, sorted
by person, city, religion, sexual orientation and so on.29
The sites visited and the e-mails sent on a specific question or theme thus become
raw material for the construction of a consumption pattern, so that companies selling
products or services can, through intrusive procedures, target their advertising. It is a
strategy that exposes our expectations to the “watchful eye” of companies (the new
“Big Brother”), since marketing becomes an instrument of social control.

27 Antonialli et al. (2017), pp. 345–367. Available online at: https://bit.ly/2IIGar7 (reduced URL,
permanent link). Accessed 11 Jan. 2022.
28 Antonialli et al. (2017), pp. 345–367. Available online at: https://bit.ly/2IIGar7 (reduced URL, permanent link). Accessed 11 Jan. 2022.
29 Cf. also Schertel Mendes and Mattiuzzo (2022), in this volume, with a view to algorithms and discrimination in the field of credit scoring.



With the insertion of these new technological mechanisms, the increasingly
sophisticated dissemination of information contributes to the progressive narrowing
of the private sphere. These systems of surveillance, monitoring and control are
well illustrated by the narrative of the film “The Matrix,” in which society
ends up transported into a controlled space.
Today, information and communication technologies present numerous advances as well as problems. Mass media,
strengthened by new technologies, have abolished religious, political and cultural
boundaries. In this sense, they are directly related to the issue of privacy because, in
the name of security or social issues, we allow ourselves to be filmed by cameras in
banks, public and commercial buildings, supermarkets, shopping centers, residential
buildings, elevators, among others.
Privacy violations occur not only in physical space but also, and especially, in
virtual space.30 It is true that the contours of private life vary according to each
individual’s social circumstances: some preserve and broaden the sphere of their
privacy, while others expose themselves by ostentatiously displaying it. Even so,
everyone retains the right to choose what shall be shown and what shall be withheld
from public knowledge.
Arbitrary invasions of privacy characterize contemporary society, whether through
the creation of technological instruments or through the deliberate overexposure of
individuals in search of notoriety and social identity.
It cannot be repeated often enough that we live today in an information society
immersed in a web of surveillance, whether by the private sector or by the very actor
that should protect us from interference with our privacy: the public sector. We note
that information has revealed itself as a legal good of extraordinary value, since the
opportunities arising from the use of the Internet generate ever more economic activity
based on the information economy.
The novelty lies not in the publicity of the data available on the Internet, but
in the ease of structured search: information has always been available in scattered
form, but its organized accessibility to anyone, whether a public or a private actor,
is unprecedented.31
In the economic system in which we live, it also deserves emphasis that the private
sector is prone to privacy violations in the processing of individuals’ personal data
for the benefit of transnational companies working with information and communication
technology. Personal data have acquired value in the economic world, in the information
and communication markets, and have become the object of specialized marketing.
In the virtual environment, there is a network that appears transparent, visible,
perceptible and open to every Internet user’s eyes. However, another, less apparent
and somewhat veiled system exists, consisting of the agreements made

30 Digital cameras, camera phones and webcams connected to the Internet have effects directly linked to the loss of privacy, because their users can employ these technological advances to compromise an individual’s image, since such content can be displayed on websites, blogs, photo blogs or YouTube without the consent of the persons concerned.
31 Cf. for the unprecedented characteristics of surveillance under the conditions of the Internet also Albers (2022), Sect. 2.1, in this volume.



between providers and companies interested in obtaining personal data about
individuals, agreements that invade people’s lives. Intrusion into the personal data of
Internet users through cookies and similar tracking tools is a common practice. Tools
for collecting such data are openly available on the IT market.
Another issue is the relationship between the public and the private. The espionage
cases show that the division between the public and the private sector is becoming
increasingly precarious, because the public sector uses the services of private companies
to carry out global surveillance (for example, international intelligence agencies and
network giants like Google, Microsoft and Apple, among many others, cooperate in
the invasion and interception of people’s data on the basis of national defense arguments).
Moreover, private companies provide this type of service to the public sector simply
because it is more convenient to participate in the culture of state control than to
resist it. Thus, they become complicit in government control.
Are there sufficient means of protection in this intrusive environment? There is no
way to guarantee privacy and to “hide” from surveillance networks (public and
private). However, some tools are available to improve the attainable level of protection,
such as the use of a VPN.
A VPN (virtual private network) is one of the main ways to protect the privacy of
browsing.32 The VPN creates an encrypted tunnel between the user’s computer and
the VPN server. As a result, the ISP has no access to the data transmitted in this
tunnel: it only knows that the user has connected to a VPN server, at what times and
for how long, but it cannot see the contents of the communication. In other words,
a VPN helps to hide the IP address of the user’s computer (the identity number of
each machine). When a user downloads a file hosted in Central Europe, for example,
the VPN makes it appear that the file is being downloaded by a machine with an
American IP address, or one in some other country of the user’s choice, thus hiding
the real IP of the device, for example a computer located in Brazil.
However, the VPN may be of no use if the company that manages the network
retains information about its users and cooperates with government agencies. If
this company is required to disclose the data of a specific user,33 it can quickly search
its systems for that user’s files and deliver them to the authorities. It is therefore
crucial to look for a VPN that does not keep activity logs (records of the activities
performed by a computer).34
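The tunnelling idea described above can be reduced to a toy model: the only packet the ISP observes is addressed to the VPN server, while the real destination travels inside an encrypted payload. The sketch below uses a deliberately simplified XOR stream cipher for brevity; real VPNs rely on vetted protocols such as OpenVPN or WireGuard, and every name and value here is invented for illustration:

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. Illustration only,
    not a substitute for a real cipher."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream; applying it twice restores the input.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Shared secret between user and VPN server (simplified: real
# protocols establish it with an authenticated key exchange).
key = secrets.token_bytes(32)

inner = b"GET https://example.org/file"        # real destination + request
outer = {
    "to": "vpn.example.net",                   # all the ISP can observe
    "payload": xor_cipher(inner, key),         # encrypted inner packet
}

# The VPN server decrypts the payload and forwards the request on the
# user's behalf, so the remote site sees the server's IP, not the user's.
assert xor_cipher(outer["payload"], key) == inner
```

The model also makes the caveat in the preceding paragraph concrete: the VPN server itself necessarily holds the key and sees the decrypted traffic, which is why its logging policy matters so much.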
Other tools can be more or less useful, such as the “Tor” browser.35 Developed as
a research project of the U.S. Navy, Tor has become a free project that helps protect
the location of Internet users. After installing the application on the computer, users

32 The choice of a secure VPN depends on several variables: whether or not the company keeps user logs, how it behaves when requested to disclose data to governments, and how the payment method for the product is linked to each user’s account.
33 See for the regulation of anonymization services in the first version of the German data retention rules Albers (2022), Sect. 4.2.4, in this volume.
34 Perlmutter (2000); Held (2004); in Portuguese see Guimaraes et al. (2006).
35 Tor Project: https://www.torproject.org/. Accessed 11 Jan. 2022.

can access the Internet through “virtual tunnels” that do not allow anyone to intercept
their navigation or determine where they are located or what specific content they
have posted. Tor itself does not hide the user’s identity on the web, but it provides a
tool and user interface that allow an appropriate level of security. Tor is an Internet
network protocol designed to anonymize the data transmitted through it. Using Tor will
make it difficult, if not impossible, for any eavesdropper to see your webmail, search
history, social media posts, or other online activities. Nor will observers be able to
tell which country you are in from your IP address, which can be very valuable for
journalists, activists, business owners and others. When you run Tor, online data
collectors such as Google Ads and the little-known but powerful data aggregator
Acxiom will not be able to perform traffic analysis and collect data about your Internet
habits. In theory, surveillance organizations such as the NSA will not be able to observe you either.
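In practice, applications are usually pointed at the local SOCKS proxy that a running Tor daemon opens. The fragment below only assembles such a proxy configuration (9050 is Tor's documented default SOCKS port, 9150 the Tor Browser's); the actual request, which would additionally require a running Tor daemon and a SOCKS-capable HTTP client, is left as a comment so the sketch stands on its own:

```python
# Tor's daemon opens a local SOCKS proxy (default port 9050; the Tor
# Browser bundle uses 9150). Pointing an HTTP client at that proxy
# routes its traffic through the Tor network.
TOR_SOCKS_PORT = 9050  # adjust if your torrc configures another port

# The "socks5h" scheme asks the proxy to resolve DNS names as well,
# so hostname lookups do not leak outside the tunnel either.
proxies = {
    "http": f"socks5h://127.0.0.1:{TOR_SOCKS_PORT}",
    "https": f"socks5h://127.0.0.1:{TOR_SOCKS_PORT}",
}

# With a running Tor daemon and the third-party requests + pysocks
# packages installed, the configuration would be used like this:
# import requests
# r = requests.get("https://check.torproject.org", proxies=proxies)
print(proxies["https"])  # socks5h://127.0.0.1:9050
```

Note that this routes only the configured client through Tor; other applications on the same machine keep using the ordinary network path unless they are configured likewise.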
CYPHR must also be mentioned: a simplified encrypted messaging system that
secures the user’s conversations and whose policy is not to store user records.36
Among browser extensions, HTTPS Everywhere37 is a free extension available
for the Mozilla Foundation’s Firefox browser. It directs the user to the secure version
of websites, i.e., those whose URLs begin with HTTPS, and prevents third parties
from intercepting the information sent between the Internet user’s computer and the
server hosting the website.
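The rewriting idea behind the extension can be sketched in a few lines. The real extension applied curated per-site rulesets rather than a blanket rewrite, so the function below is only an illustration of the principle:

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url: str) -> str:
    # Blanket rewrite: swap the http scheme for https and leave
    # everything else (host, path, query, fragment) untouched.
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(upgrade_to_https("http://example.org/page"))   # https://example.org/page
print(upgrade_to_https("https://example.org/page"))  # already secure, unchanged
```

A blanket rewrite breaks sites that never offered HTTPS, which is precisely why the extension relied on per-site rules; modern browsers have since absorbed the feature as an "HTTPS-only" mode.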
TOSBack38 alerts users to changes in the terms of use of sites like Facebook,
Twitter and others, which can change without notice. It is essential, for example, for
noticing significant changes, such as the sharing of user information between two
services held by the same company.
Finally, two tools deserve prominence: SpiderOak and ProtonMail. SpiderOak
is a kind of fortified cloud. The online storage service is similar in every respect to
Dropbox or Google Drive, with the difference that it includes enhanced security
measures to ensure the privacy of its users. The company takes privacy so seriously
that it does not keep records of passwords: if users lose their password, there is no
way to log in without it. The same characteristics are present in ProtonMail, an
encrypted e-mail system with end-to-end technology. Messages are stored on
ProtonMail servers in encrypted form and transmitted in encoded form between
servers and user devices.39
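The claim that a provider which keeps no password records cannot restore a lost password follows from a standard design: the server stores only a value derived from the password by a one-way function, never the password itself. A minimal stdlib sketch of that idea (the iteration count and phrases are illustrative, not a description of any provider's actual scheme):

```python
import hashlib
import hmac
import secrets

def derive_key(password: str, salt: bytes) -> bytes:
    # One-way derivation: the stored key cannot be turned back into
    # the password. The iteration count is illustrative only.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = secrets.token_bytes(16)
stored = derive_key("correct horse battery staple", salt)  # all the server keeps

def check(password: str) -> bool:
    # Constant-time comparison of a freshly derived key with the stored
    # one; the password itself is never persisted anywhere.
    return hmac.compare_digest(derive_key(password, salt), stored)

print(check("correct horse battery staple"))  # True
print(check("wrong guess"))                   # False
```

Because `derive_key` cannot be inverted, neither the provider nor an attacker who steals `stored` can recover the password, which is also why a forgotten password means permanently lost access.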
Regardless of software and protection devices, measures must be taken to protect
privacy and the rights connected with it. There is no doubt that, at present, all over
the planet, at least some rights have been granted to users of the Internet, among them:
• Right to information: Organizations must be fully transparent about the use of
personal data;

36 CYPHR by Goldenfrog: https://www.goldenfrog.com/cyphr. Accessed 11 Jan. 2022.
37 HTTPS Everywhere: https://www.eff.org/https-everywhere. Accessed 11 Jan. 2022.
38 TOSBack: https://tosback.org/. Accessed 11 Jan. 2022.
39 SpiderOak: https://spideroak.com/; ProtonMail: https://protonmail.com/. Accessed 11 Jan. 2022.

• Right of access: Individuals should have the right to know exactly what
information is being processed, how and for what purpose;
• Right of rectification: Individuals should have the right to request a correction of
their data if it is inaccurate or incomplete;
• Right to erasure, also known as the right to be forgotten: this refers to a person’s
right to request the deletion or removal of their data without the need for a specific
reason;
• Right to restrict processing: this refers to a person’s right to restrict or suppress the
processing of their data;
• Right to transfer data: This allows individuals to retain and reuse their data for
their purposes;
• Right to object: in some cases, individuals should have the right to oppose the
use of their data, as when a company uses personal data for direct marketing, for
scientific and historical research, or for the fulfillment of a public interest mission;
• Rights regarding automated decision-making: Profiling and automated decision-
making can be useful for data holders and organizations, providing benefits such
as greater efficiency and economy. However, such procedures require adequate
safeguards, as they may pose significant risks to individuals’ rights and freedoms.
Individuals may choose not to be subject to an automated decision if it
would result in undesired consequences.

References

ABDET—Academia Brasileira de Direito do Estado, Comentários ao Marco Civil da Internet.


Available online at: https://abdet.com.br/site/wp-content/uploads/2015/02/MCI-ABDET..pdf.
Accessed 11 Jan. 2022
Albers M (2022) Surveillance and data protection rights: data retention and access to telecommu-
nications. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Antonialli D, Souza Abreu J (2017) The treasure trove’s tale: the expansion of surveillance by
the evolution and popularization of cell phones in Brazil. 5th Simposio Internacional LAVITS.
Vigilancia, Democracia y Privacidad en América Latina: Vulnerabilidades y resistencias - 29 y
30 de noviembre, 01 de diciembre de 2017. Santiago, Chile, pp 345–367. Available online at:
https://bit.ly/2IIGar7 (reduced URL, permanent link). Accessed 11 Jan. 2022
Born H, Caparini M (2007) Democratic control of intelligence services—containing rogue
elephants. Ashgate Publishing, Farnham
Cepik C (2014) Intelligence, crisis, and democracy: institutional punctuations in Brazil, Colombia,
South Africa, and India. Intell Natl Secur J 29(4):523–551. https://doi.org/10.1080/02684527.
2014.915176
Damásio JA (2014) Marco Civil da Internet - Comentários À Lei n. 12965/14. Saraiva, São Paulo
Doneda D, Zanatta R (2022) Personality rights in Brazilian data protection law: a historical perspec-
tive. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)
Donohue L (2016) The future of foreign intelligence: privacy and surveillance in a digital age.
Oxford University Press, New York

Dorrestijn S (2012) The design of our own lives. Technical mediation and subjectivation after
Foucault, 3rd edn. Wöhrmann, Zutphen/Netherlands
Greenwald G (2014) No place to hide: Edward Snowden, the NSA, and the U.S. Surveillance State.
Metropolitan Books, New York
Guimaraes AG, Lins RD, Oliveira RC (2006) Segurança em Redes Privadas Virtuais – VPNs.
Brasport, Rio de Janeiro
Held G (2004) Virtual private networking: a construction, operation and utilization guide. Wiley,
New Jersey
Keller WW (2017) Democracy betrayed: the rise of the surveillance security state. Counterpoint,
Berkeley
Leleux C (2014) IRISS—increasing resilience in surveillance societies, FP7 European research
project, deliverable D6.1: civil protection in a European context, in a report on resilience in
“democratic” surveillance societies. In: Wright D, Rodrigues R (eds) European Commission,
FP7, IRISS: increasing resilience in surveillance societies deliverable, 6.1, part 2.1.7. IRISS,
Glasgow
Levine Y (2018) Surveillance valley: the secret military history of the internet. Public Affairs, New
York
Otero P (2015) A Democracia Totalitária do Estado totalitário à Sociedade totalitária. A influência
do totalitarismo na democracia do Século XXI, 2ª Reimpressão. Principia, Lisboa
Perlmutter B (2000) Virtual private networking: a view from the trenches. Prentice Hall PTR, New
Jersey
Reischl G (2013) Unter Kontrolle: Die fatalen Folgen der staatlichen Überwachung für Wirtschaft
und Gesellschaft. Redline Wirtschaft, München
Rios C (2016) ¿Nuevos paradigmas de vigilancia? Miradas desde América Latina. Memorias del
IV Simposio Internacional Lavits, Buenos Aires. https://descargas.vialibre.org.ar/libros/lavits/
Lavits2016_BsAs_Libro.pdf
Rodriguez K (2015) Privacy is a human right: data retention violates that right. Americas Quarterly
2015. Available online at: https://bit.ly/2RomTAH, (reduced URL, permanent link). Accessed 11
Jan. 2022
Rodríguez OT (2014) Seguridad del Estado y privacidad. Editorial Reus, Madrid
Schertel Mendes L, Mattiuzzo M (2022) Algorithms and discrimination: the case of credit scoring
in Brazil. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London, in this volume
Souza CA, Lemos R (2016) Marco civil da internet: construção e aplicação. Juiz de Fora: Editar
Editora Associada Ltda.
Souza CA, Viola M, Lemos R (2017) Brazil’s internet bill of rights: a closer look. Juiz de Fora:
Editar Editora Associada Ltda. Licensed under creative commons 4.0, by Institute for Technology
and Society of Rio de Janeiro (ITS Rio). Available online at: http://bit.ly/2G87mMv (reduced
URL, permanent link). Accessed 11 Jan. 2022
United Nations. General Assembly A/HRC/28/L.27 Distr.: Limited. 24 Mar 2015 (Original:
English). Available online at: https://bit.ly/2PjO0IZ, (reduced URL, permanent link). Accessed
11 Jan. 2022
Ugarte JM (2002a) Controle Público da Atividade de Inteligência: a Procura de Legitimidade e
Eficácia. In: Anais do Seminário Atividades de Inteligência no Brasil: Contribuições para a
Soberania e a Democracia. Congresso Nacional, Brasília
Ugarte JM (2002b) Control público de la actividad de inteligencia: Europa y América Latina,
una visión comparativa. In: Congresso Internacional Post-Globalización: Redefinición de la
Seguridad y la Defensa Regional en el Cono Sur, 2002, Anais. Buenos Aires, Centro de Estudios
Internacionales para el Desarrollo

Carlos Alberto Molinaro Doctor of Law, with mention of “European Doctor” by Pablo de
Olavide University, Spain. Former free researcher at the Centre of Social Studies of the Univer-
sity of Coimbra, Carlos III University, Université Catholique de Louvain and the Pablo de Olavide
University and former Professor of the Master’s and Doctorate Program at the Pontifical Catholic
University of Rio Grande do Sul—Law School. Main areas of research: Philosophy and Crit-
ical Theory of Law, Environmental Law, Constitutional Law. Selected Publications: Privacidade e
proteção de dados pessoais na sociedade digital. 1. ed. Porto Alegre: Editora FI, 2017 (in collab-
oration with Regina Linden Ruaro and J. F. Pinar Manas); Big data, machine learning e a preser-
vação ambiental: instrumentos tecnológicos em defesa do meio ambiente. Revista Veredas do
Direito, v. 15, p. 201, 2018 (in collaboration with Augusto Fontanive Leal); Fim da privacidade:
divulgação e negociação de dados pessoais (End of privacy: a disclosure and trading of personal
data), Economic Analysis of Law Review v. 10 n. 1, 2019 (in collaboration with Regina Linden
Ruaro).

Regina Línden Ruaro Professor and Associate Dean of the Law School of the Pontifical Catholic
University of Rio Grande do Sul. Retired Federal Attorney/AGU. Doctor in Law by the Univer-
sidad Complutense de Madrid. Post-doctoral internship at the Universidad San Pablo—Ceu de
Madrid. Composes the International Research Group “Data Protection and Access to Informa-
tion”. Invited Professor of the Master in Data Protection, Transparency and Access to Infor-
mation at Universidad San Pablo de Madrid-CEU of Spain. Honorary Member of the Interna-
tional Institute of State Law Studies—IEDE. He leads the Research Group registered with the
CNPq: Personal Data Protection and Fundamental Right of Access to Information in the Demo-
cratic State of Law. Selected Publications: Privacidade e proteção de dados pessoais na sociedade
digital. 1. ed. Porto Alegre: Editora FI, 2017 (in collaboration with Carlos Alberto Molinaro e J.
F. Pinar Manas); Fim da privacidade: divulgação e negociação de dados pessoais (End of privacy:
a disclosure and trading of personal data), Economic Analysis of Law Review. v. 10 n. 1, 2019
(in collaboration with Carlos Alberto Molinaro).
The Protection of Personality
in the Digital Environment
An Analysis in the Light of the So-Called Right to be
Forgotten in Brazil

Ingo Wolfgang Sarlet

Abstract Human dignity and personality rights are particularly vulnerable in the
so-called digital age due to the use of new information technologies. Thus, law needs
to respond effectively to the new challenges posed by ever newer and more impactful
ways of endangering and violating personality rights. This has led
to recognizing new expressions of these rights, as in the case, among others, of the
right to informational self-determination and, more recently, the so-called right to be
forgotten. In Brazil, the situation is not different, because the problems and challenges
generated by the use of the internet have taken on a global scale, so that the discussion
is not only about its recognition itself, but especially regarding the content and the
scope of the right to be forgotten. Hence, this debate is placed in an outstanding posi-
tion on the political and legal agenda. The present text, after describing how and for
what reason the right to be forgotten has been recognized as an implicit fundamental
right in the Brazilian constitutional system, tries to identify, present and evaluate
critically the way this right has been applied by the Superior Courts and afterwards
denied, to a certain extent, by the Federal Supreme Court (Supremo Tribunal
Federal) in a recent decision. Although in cases not related to the internet
there are already decisions sheltering the right to be forgotten, some resistance is
still found with regard to the digital environment, especially concerning the
responsibility of search engine providers. In this context, the article seeks to show
the need to ensure, besides the right to deleting data—partly recognized by domestic
legislation—a right to deindexation vis-à-vis the search engine providers, so that

I thank the DAAD (Deutscher Akademischer Austauschdienst) and CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior) for the support granted to the Academic Exchange
and Research Project ‘Internet Regulation and Internet Rights’ (PROBRAL program) between
the Pontifical Catholic University Porto Alegre, Brazil (PUCRS) and the University of Hamburg
(Germany), coordinated by Professor Marion Albers (Hamburg) and me (PUCRS). I also thank the
Max-Planck Institute for Comparative and International Private Law, Hamburg, for the opportu-
nity to research the subject of this paper and the protection of personality rights on the Internet in
general during the period of January-February 2017 (as a Fellow with a Grant from the Institute)
and January-February 2018. This paper is one of the results linked to these projects.

I. W. Sarlet (B)
Pontifical Catholic University of Rio Grande do Sul—Law School (PUCRS), Porto Alegre, Brazil

© Springer Nature Switzerland AG 2022


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_7

the Brazilian legal system—notwithstanding its peculiarities—becomes aligned with the decision of the Court of Justice of the European Union (CJEU) in the case of Google versus the Spanish Data Protection Agency and Mario Costeja and with the new General Data Protection Regulation of the European Union. However, a critical analysis
of the decisions of the Brazilian Superior Courts reveals that, as a rule, the criteria
applied in case law, even though partly aligned with those employed by the CJEU, have been used inconsistently and even contradictorily, still lacking further improvement and complementation, besides having to be adapted to the requirements of the Constitution. In conclusion, it is advocated that, because of the preferred position of freedom of speech and information, the recognition of a right to be forgotten must be exceptional, must comply with strict criteria established by the legislator, and must be submitted to strict control as regards its constitutional legitimacy.

1 Introduction

Considering that the protection of the dignity of the human person and personality
rights represents—in the face of technological advances, especially in the field of
information technologies and those involving the internet environment—a central
problem for the political, economic and legal system, the discussion of the so-
called right ‘to be forgotten’—as happened in Europe and the USA—soon called
the attention also of Brazilian legal scholarship, case law and politics.
Although the problems related to the protection of human and fundamental rights
in this context have worldwide and transnational repercussions, going beyond the
territorial boundaries of the States and requiring compatible answers and solutions,
there are peculiarities that must be taken into account. For this reason, priority will be
given to a constitutionally adequate understanding, i. e. adapted to the peculiarities
of the Brazilian legal system, without neglecting some references to international
and foreign law when relevant to the comprehension, development and critique of
the evolution of the topic and its treatment in Brazil.
Before going any further, however, it is essential to make a brief note, even in
this introductory phase, regarding the origin, terminology and conceptualization of
the so-called right to be forgotten, aspects which in themselves are not the object of
consensus. Therefore, it is imperative to establish here a few semantic agreements
and premises that guide the subsequent considerations.
Both from the terminological and conceptual standpoint, the notion of a right to be
forgotten, known in German as Recht auf Vergessenwerden, despite its more recent
dissemination, is not in itself new. Some relatively old direct references can be found, whether in court decisions or in the literature.
Thus, for instance, in the sphere of case law, the notion of a right to be forgotten
is said to have been used in France in the well-known Landru Case (TGI—Tribunal de Grande Instance de la Seine), judged on October 14th, 1965, and confirmed on appeal,

but also in a decision dated April 20th, 1983, by the TGI Paris.1 In the literature, it is
useful to refer, likewise as an illustration, to the contributions of Agathe Lepage2 and
Théo Hassler,3 and mainly to the monographic work by Viktor Mayer-Schönberger,4
that greatly contributed to the dissemination of the term.
Even in Brazil, there are some earlier references to a right to be forgotten, in the
legal literature since the beginning of the Decade of 1990, but also in some judicial
decisions from 2012 and 2013, that will be presented and analyzed later.5
But it was the famous case Google versus the Spanish Data Protection Agency
and Mario Costeja, judged by the Court of Justice of the European Union (CJEU) on May 13th, 2014,6 that gave worldwide notoriety to the so-called right to be forgotten, although this right as such was not
mentioned in the decision.
On the other hand, independently of the direct mention of a right to be forgotten
in the aforementioned court cases, this notion has been traced back to other relevant
judicial precedents involving the conflict between freedom of speech and person-
ality rights as well as other constitutional goals and principles (e. g. democracy,
transparency). These precedents—as, for instance, in the Lebach case judged by the
Federal Constitutional Court of Germany in 1973—were utilized to provide the justi-
fication for recent cases acknowledging a right to be forgotten, also by the Brazilian
Judicial Power.7
As regards the legislative sphere of the normative regulation of a right to be forgotten, it is impossible to ignore the European Union document of November 2010, A Comprehensive Approach to Data Protection in the European Union, which refers to the need to ensure a right to be forgotten and even suggests a definition,8 as well as the mention of the right to be forgotten in the draft of a new general regulation on data protection in the European Union of March 2012. This provision was ultimately included in the European General Data Protection Regulation approved in April 2016 (Regulation 2016/679) that has come into force

1 See Heylliard (2012), p. 10.


2 Lepage (2001), p. 2079.
3 Hassler (2007), p. 2829.
4 Mayer-Schönberger (2009).
5 Rodrigues Junior (2013), pp. 1–5.
6 These and other examples can also be found in Carello (2017), pp. 45ff.
7 See the case of the Candelária Slaughter judged by the Superior Court of Justice in 2013, which will still receive special attention. See Brasil, Superior Tribunal de Justiça. Recurso Especial:
REsp 1334097/RJ, Justice-Rapporteur Luis Felipe Salomão, Fourth Panel, judged on May 28,
2013. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=201201449107&dt_
publicacao=10/09/2013. Accessed 11 Jan. 2022.
8 ‘… clarifying the so-called ‘right to be forgotten’, i.e. the right of individuals to have their data no longer processed and deleted when they are no longer needed for legitimate purposes.’ It should, however, be noted that this document—contrary to what is mentioned in some publications—does not consist of a Directive or Regulation, but is rather a kind of letter of intent in the form of a communication from the European Commission to the European Parliament about the need for a new
and more effective regulation of the matter.

on May 25th, 2018.9 In Article 17, the new regulation contains a right to erasure of
data (expressly associated with a right to be forgotten) and establishes parameters
for its application.10
For these reasons it did not take long for the term right to be forgotten (although
technically not the most appropriate11 ) to be disseminated and widely used, becoming incorporated into everyday language, in the media, in the literature and in case law
in several countries and languages. Indeed, the figure of the right to be forgotten—
and the Brazilian case reveals this—has even been associated with situations that are
not always directly related to this object and where, previously and for a long time, there was no mention of such a right at all. Despite this, it is necessary to have a
semantic agreement and, to avoid doubts about the object of our text, here too we
will use this terminology.
As to the content and justification of a right to be forgotten, although the arguments are mostly shared by the different legal systems where this right has gained visibility, we will include here, when timely, the arguments concerning the foundations of the recognition and definition of the right to be forgotten and its object in Brazil.
Furthermore, the discussion on the recognition, content, limits and possibilities
of implementing the right to be forgotten, including a possible controversy about its
condition as a human and fundamental right, has become relevant in several contexts
and environments, partly requiring distinct approaches and solutions.
Thus, while in the conventional cases involving conflicts between personality rights and the freedoms of speech and information, or other rights, principles and relevant legal interests, care is generally taken to consider and apply the habitual solutions regarding the recognition of the right to be forgotten, and possibly to develop them further, the same does not occur in the internet environment.
Indeed, on the internet, as already partly mentioned, although the same values and
fundamental rights are involved, the context and its peculiarities have caused very
complex technical, legal (as well as economic, political, ethical, sociological and
even cultural) problems and challenges that are particularly difficult to solve. As an
illustration, it is sufficient to mention here the dissemination of access to the internet,
the almost instantaneousness of the information flow and its large-scale repercussion,
including at a global level, the difficulty of establishing an adequate regulation or of
effectively imposing the discourse of fundamental rights in this context.

9 See European Union, Regulation (EU) 2016/679 of the European Parliament and of the Council
of 27 April 2016 on the protection of natural persons with regard to the processing of personal data
and on the free movement of such data and repealing Directive 95/46/EC (General Data Protection
Regulation) http://eur-lex.europa.eu/eli/reg/2016/679/oj. Accessed 11 Jan. 2022.
10 ‘Article 17 Right to erasure (‘right to be forgotten’) 1. The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay
and the controller shall have the obligation to erase personal data without undue delay where one
of the following grounds applies: ….’
11 It should be noted that, under the label of a right to be forgotten, what is at stake is—in technical terms—basically to assure individuals of a right to obtain the annulment or non-dissemination of particular pieces of information, and/or to make access to them difficult (as seen in the requests for deindexation addressed to internet search engine providers).

These circumstances also provide the opportunity to create new mechanisms to protect human dignity and personality rights, on the one hand (which are the subject
of the right to be forgotten), and, on the other, the principles of the rule of law, of
democracy and of correlated communicational freedoms, to mention only the most
relevant to our study.
Therefore, and taking into account the subject of the book of which the present text
is part, it should be mentioned that here we will focus on the right to be forgotten in
the digital sphere and in Brazil, which nevertheless requires—where there are common elements to be considered—a look at other experiences that have likewise been the object of heated controversy.
For this purpose, the path to be taken here is the following: We shall begin by
examining the possible constitutional foundations of a right to be forgotten in Brazil
(2.1), justifying its condition as a fundamental right in the material and formal sense.
Next, but still, in this context, we shall try to summarily identify the specific manifes-
tations that already exist at the level of ordinary legislation (2.2), and then present the
main decisions of the Superior Courts that involve recognition (or not) of a right to be
forgotten, focusing here on the digital environment, more specifically on the internet
(3). In sequence, we will perform a global and critical analysis of how the right to
be forgotten has been understood and applied in Brazil, highlighting the problems
connected to its content and limits, including the criteria used for its determination, especially as regards the tensions between personality protection and other
fundamental rights (4), ending the text with a few conclusions and interrogations (5).

2 The Legal-Constitutional Framework and the Inference of a Fundamental Right to ‘be Forgotten’

2.1 Constitutional Foundations

Similar to the development of the right to be forgotten in the European Union12 and several European countries, such as, among others, Germany,13 Spain,14 France15 and Italy,16 and even in other parts of the world, such as the USA (although in this case facing strong resistance17 ), in Brazil, this legal concept is not expressly

12 Here we point to the references already made to the new Data Protection Regulation of the
European Union and to the case of Google vs. the Spanish Data Protection Agency and Mario
Costeja, judged by the CJEU.
13 See recently Weismantel (2017).
14 See pars pro toto Caro (2015).
15 See pars pro toto Dechenaud (2015).
16 See recently Martinelli (2017).
17 See especially Mayer-Schönberger (2009), as well as the recent article by Kelly and Satola (2017).

established in the sense of being textually and specifically expressed in the Federal
Constitution of 1988 (hereinafter FC).
Despite this, Brazilian legal scholarship and case law have begun to recognize, especially (though not only) from 2013 onwards, a right to be forgotten, even with the status of a fundamental right, notwithstanding some contrary positions.18 On the other hand, there is no consensus about several points related to the right to be
forgotten, such as its content, limits and criteria for application.19
In this context, it is essential to take into account that the right to be forgotten
holds—at least according to the majority of the Brazilian literature—the double
condition of a fundamental right in the substantial and formal sense. This means
that (here in the substantial sense) the right to be forgotten must be founded in
certain superior values and principles and its relevance must be justified from a
philosophical and constitutional perspective, in order to be recognized and protected
as a fundamental right.20
On the other hand, its condition as a fundamental right in the formal sense corre-
sponds to the qualified legal regime of fundamental rights (precisely due to their
substantial fundamentality) in the FC, such as the prerogative of immediate appli-
cability and direct binding of state actors and even of private entities, but also its
condition as a substantial limit to the constitutional amendment power and its rein-
forced protection from restrictions imposed by the legislator, administrator, judicial
power and even by acts performed by private actors.21
From this perspective, as shown by the German experience, the acknowledgment of a right to be forgotten is founded—from a constitutional standpoint—in the
dignity of the human person, in the general clause of personality protection (or
rather, in the right to a free development of personality), besides being related to
and inferable from other special personality rights, such as the right to informational
self-determination, and the rights to a private life, honor and image.22 Furthermore,
it is widely recognized that the acknowledgment of a so-called right to be forgotten
is justified by the protection of personality in the face of potential abuses resulting
from the exercise of freedom of speech and information, not only, but especially in
the internet environment.
In a first approach—turning here to the material dimension of its fundamentality—
one must begin with the general and structuring fundamental principle of the dignity
of the human person, contained in Article 1, item III of FC, in the article that lists
the foundations of the Brazilian democracy and rule of law. This principle has also
assumed—in a practically settled manner in legal scholarship and case law—the

18 Mention should be made here of the view of Sarmento (2016), pp. 190–232.
19 Limiting ourselves here to the sphere of the Brazilian books dedicated specifically to the topic,
see Martinez (2014); Maldonado (2017); Carello (2017); Consalter (2017); Branco (2017), Sarlet
and Ferreira Neto (2018) and Frajhof (2019).
20 See, pars pro toto, Ferreira Neto (2016), pp. 278–323.
21 For the development of this point see, pars pro toto, Sarlet (2015).
22 See, pars pro toto, Buchholtz (2015), pp. 127ff.

condition of a general clause of protection of personality and the justification for the—also implicit—fundamental right to a free development of the personality.23
The direct connection between the right to be forgotten, the dignity of the human
person and the general right of personality, in the sense of a right to free development
of the personality, may be justified, in a first approach, by the fact that the capacity
and possibility of being forgotten, and the need to recognize and protect it in the
legal sphere, represent a necessary condition to also exercise what was designated as
a right to reinvent oneself24 or to a new beginning, i.e. the possibility of reformatting
(reconstructing) the personal (individual) existential and social trajectory, free of
particular ties provoked by the direct and permanent confrontation in time with
aspects involving memory (past).25
In other words, the possibility of forgetting, but also—and here there is a need
for recognition and protection vis-à-vis the state and third parties on the ampli-
fied social level—of being ‘forgotten’ and not suffering permanently and indeter-
minately the negative repercussions associated with facts (here in the broad sense)
of the past is essential not only for a healthy personal life—from the physical and
psychological point of view—but also for a social integration of the individual. As
Catarina Botelho recalls, the neurological sciences teach that one of the major functions of the human brain is to forget what is negligible and to filter out contents that harm us emotionally.26
From this perspective, this also has to do, in a certain sense, with the need to ensure
a given possibility of self-governing one’s own memory and being able to react in
some way to the ‘implacable collective memory of the internet,’ besides preventing
people from becoming prisoners of a ‘past that is destined not to go away’.27
This statement, in turn, indicates that the right to be forgotten takes on a dimension
that goes beyond the individual one, since it has already been demonstrated that being
forgotten—here specifically highlighting the internet—consists in a social process that needs to be described and understood, regardless of being forgotten by people
considered individually, and is related to what is usually called ‘collective memory’.28
This means, in a brief summary, that the thesis that the internet does not forget should
not be seen as closed and absolute, since the choice of a given item of information
to be disseminated implies that an entire range of other pieces of information that

23 In Brazilian law, see among others Tepedino (1999), pp. 44 ff.; de Andrade (2006), p. 101; de
Moraes (2003), pp. 117ff; Schreiber (2014), especially pp. 7ff.
24 See in this sense Solove (2011), pp. 15–30.
25 In this sense, precisely as regards the right to be forgotten, see Diesterhöft (2014), p. 150. It should, however, be noted that the author refers to a more restricted dimension of what is called a right to a new beginning, since he refers specifically to a ‘medial’ new beginning, emphasizing the digital environment, which, from the perspective adopted here, sounds excessively limited.
26 Botelho (2017), p. 52.
27 See Rodotà (2014), pp. 41–42.
28 On this, see Albers and Schimke (2019). It should be pointed out that the authors share the understanding that one must distinguish between different types of memory, viz. individual, social,
cultural and political, which will not be further discussed in the present text.

were not chosen will be forgotten for the current communication processes, so that,
basically and from an abstract perspective, one forgets as much as one remembers.29
Given the relevance of this aspect for the right to be forgotten, it should also be
recalled that on the internet environment information (and thus also ‘memory’)—can
be lost in several ways, namely, due to the fact that it was not stored, due to technical
damages or because it is not accessible by the available means.30 This, obviously,
should be considered when the content, limits and efficacy of the right to be forgotten
are defined.
On the other hand, one can see that the recognition of a right to be forgotten does
not entail attributing a subjective right in the sense of being able to oblige someone,
from the individual viewpoint, to forget something, which would be technically
impossible except by the coercive imposition of certain medical procedures, which,
besides being clearly illegitimate from the legal (and ethical) standpoint, would not
have the capacity to influence the sphere of collective memory. That is why—as
emphasized above—one should consider a process of social forgetting, which is
reflected on the individual level, but occurs through the possible suppression of
particular pieces of information and by making it difficult to access them. These
aspects, in turn, have a direct relationship with the problem of content, limits and
ways to implement the right to be forgotten.
Having said that and notwithstanding the fact that the right to be forgotten can
already be well anchored from the point of view of its justification and acknowledg-
ment, as a fundamental right implicit in human dignity and in the right to free devel-
opment of personality, this justification is reinforced by the possible and necessary
relationship between the right to be forgotten and some special personality rights,
be they expressly established, particularly the right to privacy (and intimacy), the
rights to honor and to one’s own image, or deduced as implicit and non-enumerated
rights by the Brazilian Federal Supreme Court (hereinafter FSC), like the rights to
informational self-determination or the right to a name and personal identity.
These special personality rights—except the right to informational self-
determination—were objects of express acknowledgment by the framers of the
FC and make it possible, especially in cases of tension and even conflict with the freedom of speech and information, to reinforce the justification for the inference of a right to
be forgotten as an implicit and necessary requirement for protection, especially in
relation to the personal and public exposure/confrontation with facts from the past—
whether one’s own or especially of third parties—of an offensive, embarrassing or
even inconvenient and untrue nature.
On the other hand, the right to informational self-determination, which, like in
other countries, is not directly enshrined in the FC, has been inferred from the prin-
ciple of the dignity of the human person as a general clause of personality protection, and from the right to inviolability of data communication enshrined in Article 5, item XII
of the FC (together with the confidentiality of correspondence and communications
by telephone).

29 See again Albers and Schimke (2019), with reference to Esposito.


30 In this sense, see Esposito (2017), pp. 1–11.

It was only in May 2020 that the FSC, in a historic decision (ADI 6387, Justice-Rapporteur Rosa Weber), recognized an implicit autonomous fundamental right to personal data protection, deduced from the principle of human dignity, the right to a free development of personality, the right to informational self-determination and the right to privacy. In October 2021, a constitutional amendment proposal (PEC 17/2019) was finally approved by the Brazilian National Congress, including a right to personal data protection in the constitutional text (art. 5, LXXIX).
Furthermore, the right to informational self-determination is related to the right
to information and also to the instrument of habeas data, a constitutional action
instituted by Article 5, item LXXII of the FC, according to which ‘habeas data shall
be granted: (a) to ensure the knowledge of information related to the person of the
petitioner, contained in records or databases of government agencies or of agencies
of a public character; (b) for the correction of data, when the petitioner does not
prefer to do so through a confidential process, either judicial or administrative.’
In brief, the right to informational self-determination (originally recognized
through case law developed by the Federal Constitutional Court of Germany) establishes a link between the protection of personality in a broad sense and the protection of personal information, ensuring that individuals can decide for themselves about the supply, use and dissemination of data (information) that concern them, although it is not an absolute right,31 a point which, however, will not be further developed here.
Moreover, although here we are dealing with a more specific aspect that refers
more directly to criminal law, it is cogent to invoke the convict’s right to resocialization, recognized by the FSC also as an implicit fundamental right, even if this acknowledgment came earlier and was originally not associated with a right to be forgotten. In
its dimension connected to the right to be forgotten, the right to resocialization
implies ensuring that the convict has a real possibility of becoming reintegrated
into family and social life and not be constantly confronted with this fact. Here, in
turn, legal scholarship and case law have repeatedly invoked a precedent from the
Federal Constitutional Court of Germany, specifically the famous Lebach I case,
which will be the object of some attention in the critical evaluation of the Brazilian
court decisions.
Also in the criminal sphere, the FSC (as well as the SCJ and many decisions of the lower courts) has associated the right to be forgotten with the constitutional prohibition of life sentences, in order to prevent, in some cases and given the circumstances, prior convictions that became final (res judicata) more than five years ago from being considered as a bad record for sentencing purposes.32
In brief, before continuing to present possible specific expressions of a right to
be forgotten in the sphere of ordinary legislation, it can be claimed that, taking
into account the elements presented, the reasons of those who refute the possibility

31 About this topic, in general terms, see especially Albers (2005). For a look at the most recent
evolution, see pars pro toto, Bull et al. (2016), and, focusing on the internet, Maisch (2015).
32 In this sense, for instance, Brasil, Supremo Tribunal Federal, Habeas Corpus: HC nº 126,315, Justice-Rapporteur Gilmar Mendes, judged on Sept 15, 2015. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=TP&docID=9947298. Accessed 11 Jan. 2022.

of recognizing a fundamental right to be forgotten in the Brazilian constitutional system are weak. They advance especially the argument that this right could not
be extracted even by interpretation, since personality rights (in this case, privacy,
intimacy, honor and image) would not cover the forgetting of facts that concern the
public interest, even more so considering the priority position of the freedom of
speech and information.33
This line of argument, however, is not convincing, since the objection concerns
the content and scope of a right to be forgotten—more precisely, the criteria for its
recognition and application—and not the fact that personality should be protected, at least to a certain extent, against the harm caused by abusive speech. Indeed, it
is noteworthy that even those who have formulated the abovementioned objection
ultimately accept a field of application—although exceptional—for a right to be
forgotten, when it refers to information without any public interest.34 In other words,
if there is a scope of protection for the right to be forgotten, even if exceptional and
limited (the same would then have to be claimed in relation to the personality rights
in general while conflicting with freedom of speech), one also has to accept, in the
same measure, the existence of this right.
Obviously, acknowledging a right to be forgotten as inferred (as an implicit fundamental right) from the dignity of the human person and from the personality rights,
or even, particularly, as a legal position associated with specific constitutional rights
and guarantees, as in the case of the criminal sphere, does not subtract the right to
be forgotten from the legal regime of fundamental rights.
This, of course, includes the fact that fundamental rights are not absolute rights, have certain limits and may be subject to the imposition of restrictions. Such limits and
the correlated imposition of restrictions (restrictive interventions) have precisely the
purpose of ensuring other fundamental rights and goods with constitutional stature,
which in turn are also limited by the right to be forgotten, as in the case of the already
mentioned freedoms of speech and information, particularly the right of access to
information.
On the other hand, any restrictions imposed on fundamental rights must be submitted to strict scrutiny, since they must comply with the requirements of the constitution, regarding, for instance, the necessity of being directly and expressly prescribed, or the proportionality test applied to legislation and restrictive interventions in general. But we will come back to this point later.

2.2 Partial Legislative Expressions

Beyond its (constitutional) qualification as a fundamental right, one can also find (even long before the debate regarding the recognition of the right to be forgotten as such) several rules at the infraconstitutional level that correspond to some of

33 See Sarmento (2016), p. 204ff.


34 See in this sense also the reference of Branco (2017), p. 143.

its specific expressions or contribute to justifying the existence of the right to be forgotten and to establishing its limits. Besides, there are already several legislative
proposals going through the National House of Representatives providing for an
express acknowledgment of a right to be forgotten and even establishing rules for its
application.
An initial and traditional dimension of a right to be forgotten—although not recognized under this terminology, which until recently was not even invoked—can be found in the domain of criminal law and criminal procedure law, especially as
regards Article 135 of the Criminal Code, Article 748 of the Criminal Procedure Code
and Article 202 of the Criminal Conviction Execution Act, provisions that are in clear harmony with the constitution (the convict's right to resocialization and the prohibition of life sentences) and that not only support and give additional content to this legal protection, but also make it concretely possible in some respects. Leaving other details aside,
the mentioned provisions state that any previous conviction will not be mentioned
in the criminal record of the rehabilitated person, nor in a certificate extracted from
the court books, except when ordered by a criminal court judge and in case of a new
criminal procedure and investigation.
In the same sense, the so-called Statute of Children and Adolescents (Estatuto
da Criança e do Adolescente, Statute no. 8,069, of July 13, 1990) contains rules
that can be used to acknowledge a right to be forgotten, protecting the dignity and
personality rights of children (up to just before 12 years of age) and adolescents (12–
18 years). Thus, besides the provision in Article 18, according to which children and
adolescents cannot be submitted to any type of inhuman, violent, terrifying, vexatious
or embarrassing treatment, Article 143 forbids ‘the dissemination of judicial, police and administrative acts that concern children and adolescents to whom the act of committing an infraction is attributed.’ In addition, the sole paragraph of the same article
provides that any news about the event must not include identifying the child or the
adolescent, and prohibits the inclusion of photographs, references to name, nickname,
parentage, relatives, residence and even the initials of the name and surname.
Another specific statement that has been associated with a right to be forgotten
is supported by the Consumer Protection Code (Statute no. 8,078, of September 11, 1990), more precisely in its Article 43, which implements, in this domain, aspects of the right to informational self-determination. According to Article 43, ‘the consumer, without prejudice to the provisions of Article 86,35 shall have free access to any of
their own data stored in reference files, index cards, records, personal and consumer
data, as well as their respective sources’. Besides this, paragraph 2 of the same article
assures that in case of ‘any inaccuracy in their data and records, the consumer shall
be entitled to require the prompt correction, and the person in charge of such records
shall communicate the alteration, within five weekdays, to any possible addressee of
the incorrect information.’
Likewise, negative information on consumers (late-paying debtors) may only be stored and used for five years (Article 43, paragraph 1 of the Consumer
Protection Code) and the affected person has the right to demand the exclusion of

35 This provision was vetoed by the President.


144 I. W. Sarlet

these pieces of information, besides being entitled to hold accountable the entities
responsible for maintaining and using data in an illegal way.
The Brazilian Civil Code (Law no. 10,406 of January 10, 2002), particularly in Articles 11, 12 and 16 to 21, all contained in the chapter on personality rights, also offers support for the protection of certain aspects connected to the right to be forgotten. Whereas Articles 11 and 12 establish general rules for the protection of personality rights, Articles 16 to 21 contain specific rules related to important aspects that also concern a right to be forgotten.
In Article 16, the Civil Code acknowledges that ‘every person has the right to
a name, which includes forename and surname.’ This right has been, despite the
absence of a specific provision in the FC, considered as a fundamental right, deduced
from the human dignity principle and the general clause of protection of personality.36
Next, Articles 17 to 19 of the Civil Code provide that (a) ‘a person’s name
cannot be used by another in publications or representations which expose them to
public contempt, even when there is no defamatory intention’ (Article 17); (b) ‘the
name of another cannot be used in commercial advertising without authorization’
(Article 18); (c) ‘the pseudonym adopted for lawful activities enjoys the protection
that is given to the name’ (Article 19).
In more general terms, Article 21 of the Civil Code provides that ‘the private life
of a natural person is inviolable, and the judge, at the request of the party concerned,
will adopt the necessary measures to prevent or make cease acts contrary to this rule.’
This rule, applied in harmony both with the constitutional framework and with other legal provisions, can be a powerful instrument for the effectiveness of some aspects related to the right to be forgotten. While the first part of Article 21 merely reproduces, at the ordinary legislative level, the right to privacy already enshrined in the FC, the second part assures the judicial protection of private life and is, at least to a certain extent, applicable to the right to be forgotten.
Particularly controversial is the statement of Article 20 of the Civil Code: ‘Unless
authorized, or if necessary to the administration of justice or maintenance of public
order, the dissemination of written material, the transmission of words, or the publi-
cation, exhibition or use of a person’s image may be forbidden, at their behest and
without prejudice to the compensation, if appropriate, if their honor, good name or
respectability are affected, or if they are used for commercial purposes.’ In fact,
considering that the FC assures the freedom of speech without the need for any kind
of previous authorization, the constitutionality at least of part of Article 20’s content
has been strongly contested.

36 See as an illustration representing the orientation in constitutional case law the decision of the
SC in RE 454903/SP: ‘The right to a name is part of the concept of dignity of the human person
and expresses their identity, the origin of their ancestry, the acknowledgment of the family, which
is the reason why the state of affiliation is an inalienable right, because of the greater common
good to be protected, derived from the very binding force of the public order precepts that regulate the matter (Child and Adolescent Statute, Article 27)’. Brasil, Supremo Tribunal Federal,
Recurso Extraordinário: RE 454903/SP, Justice-Rapporteur Joaquim Barbosa, judged on Dec 7,
2009. https://jurisprudencia.stf.jus.br/pages/search/despacho125237/false. Accessed 12 Jan. 2022.

Indeed, due to dissenting precedents from lower courts, as well as major criticism from legal scholars, the FSC, in ADI 4815 (judgment in 2015),37 though it did not declare Article 20 of the Civil Code unconstitutional in itself, gave it an interpretation in accordance with the FC, in the sense of setting aside the requirement of prior authorization by the person whose biography is involved (or their representatives) for third parties to write biographies.
Furthermore, Federal Statute no. 12,965 of April 23, 2014, the so-called Marco Civil da Internet (the Civil Framework of the Internet [hereinafter CFI]), established a set of principles and provided for guarantees, rights and duties for the use of the internet in Brazil. The CFI, even if it does not expressly mention a right to be forgotten, contains important general guidelines and concrete rules that can be applied directly or be reconstructed for the purpose of recognizing the need to accept this individual legal claim in certain cases. The following provisions of the CFI, if systematically interpreted along with the other legal provisions already listed, support the conclusion that a right to be forgotten exists in national law; they also regulate specific aspects of this fundamental right and are thus useful, at least partially, for its implementation.
From this perspective, Article 2 of the CFI establishes, on the one hand, that
the ‘Internet use in Brazil is founded on the respect for freedom of speech,’ which
is the reason why its use should always guarantee and fulfill ‘the human rights,
the development of personality and the exercise of citizenship in digital media.’
On the other hand, Article 7 contains a catalog of rights and guarantees of internet
users that also involve the protection of personal data, privacy and informational
self-determination, and even reproduces constitutional provisions, besides rendering
them concrete by regulating various aspects that are correlated to them, in the internet
environment:
Internet access is essential for the exercise of citizenship, and the users have the following
rights:
I – inviolability of intimacy and privacy, protection and compensation for property or
moral damages resulting from its breach;
VII – no supply of personal data to third parties, including connection logs, and data of
access to internet applications, except by free, express and informed consent or in the cases
provided by law;
IX – express consent to the collection, use, storage and processing of personal data, which
should be highlighted in the contract terms;
X – definitive exclusion of personal data that have been provided to the particular internet
application, on request of the user, at the end of the relationship between the parties, except
in the cases of mandatory storage imposed in this law;

37 Brasil, Supremo Tribunal Federal, Ação Direta de Inconstitucionalidade: ADI 4815, Justice-Rapporteur Cármen Lúcia. En banc court, judged on June 10, 2015. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=TP&docID=10162709. Accessed 11 Jan. 2022. For non-Brazilian readers, it
should be explained that the procedure in which this matter was decided is called Direct Action
for the Declaration of Unconstitutionality (‘Ação Direta de Inconstitucionalidade’), which, within
the sphere of an abstract and concentrated control of rules, aims to declare the unconstitutionality
of a federal or state normative act or then, among other modalities of decision, to promote an
interpretation according to the Constitution.

XIII – application of protective standards and consumer protection in consumer relations carried out on the internet.

In addition to this, Article 19, paragraph 4, provides for the possibility that the judge may grant interim relief where there is unequivocal proof of the fact, considering the collective interest in the availability of the content on the internet. However, the aforementioned Article 19 does not ensure a right to have content removed from the internet, but only establishes the liability of application providers for damage generated by third-party content.38
Besides this, according to Article 21, providers may be held subsidiarily liable for information maintained by third parties, in this case videos or other materials containing scenes of nudity or private sexual acts, when those affected by the published contents have not given their express consent, or when the providers do not comply with the terms of the notice sent by the victim.
It is important to mention that the Brazilian General Data Protection Act (GDPA, Federal Act no. 13,709), enacted in August 2018 and fully in force since August 2021,39 does not mention a right to be forgotten and also does not assure a right to deindexation, recognizing only a right to anonymization, blocking, elimination (erasure) and correction of data (Article 8º). Besides this, the new legislation partly changed the text of Article 7º, X, of the CFI. According to its new version, the internet user has the right to require the definitive erasure of personal data provided to certain internet application providers once the relationship between the parties has ended, except in the cases in which storage is mandatory under the CFI and the GDPA.
However, although the CFI provides for a right to the exclusion of content considered prejudicial and/or illegal, it should be emphasized that the right to be forgotten, at least as we see it and as it was acknowledged by the CJEU in the Google case (although not expressly mentioned in the decision), is not limited to such aspects, since it covers (and is possibly even centered on) a right to deindexation from search engines, which must also meet certain criteria to be discussed further on.

38 As well remarked by Gonçalves (2016), p. 88.


39 The GDPA, enacted by the National Congress on August 14, 2018, was scheduled to enter into force in February 2020, but due to a Provisory Measure (Medida Provisória Nr. 869/2018) issued by the President of the Republic, afterwards converted by the Congress into Federal Act Nr. 13,853/2019, the date was postponed to August 2020. Considering the resistance against the sanctions established in the new legislation, the Congress enacted Federal Act Nr. 14,010/2020, which changed the vacatio legis of articles 52 to 54 of the GDPA to August 1, 2021. During the deliberation of the Act, the Federal Government enacted Provisory Measure Nr. 959/2020, attempting once again to postpone the entry into force of the whole GDPA, but the Federal Senate did not accept this new attempt. The result was that the GDPA, except articles 52–54, effectively entered into force on September 18, 2020 (when Federal Act Nr. 14,058/2020, which converted Provisory Measure Nr. 959/2020, was published, having been promulgated by Congress on September 17), and the part with the sanctions in articles 52–54 came into effect on August 1, 2021.

On the other hand, though the current legislation does not expressly cover all aspects of the right to be forgotten, several law bills providing for and regulating a right to be forgotten are currently under discussion in the Brazilian National Congress. It should be underlined that all the bills are directly related to the internet environment, although not exclusively so in all cases.
The Law Bill presented by Representative Jefferson Campos (Projeto de Lei—PL
2,712/2015) alters and complements the CFI by adding to Article 7 ‘the removal,
at the request of the party involved, of references to records about them in search
sites, social media or other sources of information on the internet, as long as there
is no current public interest in the dissemination of the information and that the
information does not refer to facts that are genuinely historical.’
In turn, Law Bill 1,676/2015, presented by Representative Veneziano Vital do Rêgo, defines as a crime and punishes the act of filming, photographing or recording people’s voices without permission or for unlawful purposes (with a prison term of one to two years, plus a fine) and establishes a harsher punishment if this material is later disseminated (two to four years). If the dissemination takes place on the internet in any way (computer, internet or social media), the penalty is even higher: four to six years in prison and a fine. The Law Bill also offers a definition of the ‘right to be forgotten’ as the ‘expression of human dignity, representing the guarantee of dissociation of the name, image and other aspects of the personality from facts that, even if veridical, do not have, or no longer have, public interest.’
Of particular relevance for the internet environment is the circumstance that the Law Bill establishes that even true facts, if they are no longer in the public interest, may be subject to protection and removal. Besides this, it entitles holders of the right to be forgotten to demand, independently of judicial authorization, that the media in general, as well as content and search providers on the internet, stop transmitting or exclude material or references that connect them to unlawful facts or facts that besmirch their honor. It also obliges social media, search and content providers to create specific departments to deal with the right to be forgotten and to give a well-founded answer in case they do not acknowledge the application of the right to be forgotten.
Moreover, Law Bill 8,443/2017 provides that any citizen has the right to demand from any mass media outlet the removal of any information that is inappropriate or detrimental to their image, honor and name. The right is to be exercised through a petition addressed to the specific media outlet, mentioning the rights affected and describing the harm caused by the dissemination of the information. In case of refusal, which must be proved, the affected person is entitled to bring the matter before the Judiciary. An especially relevant aspect is that the bill does not extend this right to holders of elective office and public actors (e.g. judges, public attorneys, diplomats) who are being investigated or sued or who have already been sentenced. Besides, in the case of a public figure, the request must necessarily be made through judicial proceedings.
More recently, in 2020, Law Bill 4,418/20 was submitted to the National Congress, establishing and regulating the ‘right to be forgotten in criminal law’. In general terms, the proposal assures any convicted person the right not to be cited by name or identified in any other manner, in either the criminal or the administrative justice system, after six years. Any person found not guilty acquires this right immediately, while those convicted of corruption or especially heinous crimes will be able to exercise the right to be forgotten only twelve years later. In the same year, another Law Bill, Projeto de Lei 4,306, was submitted to debate in the Brazilian National Congress in order to establish the ‘right to be forgotten for children and adolescents’, punishing with imprisonment (from two up to four years) anyone who shares data of children or adolescents who have been victims of or have witnessed crimes.
If these bills (or any one of them) are approved, they may evidently have a great impact and practical consequences, besides giving rise to several legal problems, including from a constitutional perspective, which will also be the subject of our attention in the general analysis and criticism to be made further on (Sect. 4).

3 The Right to ‘be Forgotten’ and Its Acknowledgment and Protection by the Brazilian Superior Courts—Superior Court of Justice (SCJ) and the Controversial Decision of the Federal Supreme Court (FSC)40

Based on the constitutional and legal framework, as well as on contributions from national and even foreign legal scholarship, the Brazilian superior courts, namely (for now) the Superior Court of Justice (SCJ) and the Federal Supreme Court (FSC), have already had the opportunity to invoke and enforce a right to be forgotten in a number of cases submitted to their examination. However, given a casuistic practice that is not always consistent and predictable, as shall be seen further on, it is not yet possible to speak of a consolidation of the right to be
forgotten as far as case law is concerned. This is even more relevant if we consider
the increasing number of decisions from lower courts that, given their broad range,
will not be considered here.41

40 To inform non-Portuguese readers: the Superior Court of Justice (Superior Tribunal de Justiça) is a high federal court and constitutes, along with the Superior Labor Court (Tribunal Superior do Trabalho), the Superior Electoral Court (Tribunal Superior Eleitoral) and the Superior Military Court (Tribunal Superior Militar), a kind of third instance of the judicial system. The SCJ in particular gains importance for the right to be forgotten because its competence embraces the revision of lower court decisions (federal and state courts) in order to assure the unity and authority of national law. In the case of the FSC, its importance derives from the fact that it acts as a constitutional court, binding all other courts, including the SCJ and any federal high court. Access to the FSC can occur, on the model of the US system, through judicial review (since every single judge and any court can declare the unconstitutionality of any piece of legislation), or through mechanisms related to an objective, abstract and concentrated control of constitutionality.
41 For a presentation and analysis of lower court decisions (appellate level) see Carello (2017), pp. 81–142.

Furthermore, for the sake of easier understanding and mainly to allow a broad and comparative critical analysis, we shall not only follow the chronological order, but also present separately the cases that concern the right to be forgotten in general terms (offline) and those that refer to this right in the domain of the internet (online). On the other hand, considering the limits of the present paper, we will restrict ourselves to the cases that could be considered the leading ones in this matter.

3.1 Decisions Involving a Right to Be Forgotten Outside the Internet Domains

The first two cases were decided by the SCJ in the Special Appeals (Recurso Especial) no. 1335153/RJ and no. 1334097/RJ (2013). In these two cases, judged on the same day, a few parameters were established for the acknowledgment and the respective legal consequences of a right to be forgotten in Brazil. However, opposite conclusions were reached in each case: one of them assured the protection of this right, whereas in the other prevalence was given to the freedom of information and communication. This discrepancy does not necessarily amount to a
contradiction, but already points to the fact that, like in other cases in which there
is a collision of rights, it is necessary to analyze the peculiarities of each case, the
weight of the rights involved, as well as the impact resulting from their greater or
lesser protection. This must be done through a balancing procedure aiming at establishing an adequate solution from the legal point of view. Whether the result of the
balancing undertaken by the court in each case was adequate will be discussed later
(Sect. 4).
In the first case, known as the ‘Aida Curi’ case (Resp. 1.335.153/RJ), family members of Aida Curi, who was murdered in 1958 (the case became notorious at the time the murder was committed), sued Globo Comunicações e Participações for broadcasting, without prior permission, the TV program Linha Direta, which reproduced and reconstituted, even if as a documentary and decades later, the same traumatic episode. They claimed that ‘old, already healed wounds’ might be reopened in public. The plaintiffs requested the acceptance of their claim that in this case ‘their right to be forgotten’ should be recognized in order ‘to not have revived, against their will, the pain that they had suffered at the time of Aida Curi’s death, and because of the publicity given to the case decades earlier.’ They further requested compensation for immaterial (moral) damages.
Analyzing the particularities of the case, the Court stated that
1. the victims of crimes and their families, theoretically, can also hold the right to
be forgotten insofar as they cannot be obliged to be subjected unnecessarily to
‘memories of past facts that caused them unforgettable wounds;’

2. the appropriate solution of the case requires balancing the potential historicity
of the narrated facts with the protection of the personality rights of the victim
and her relatives;
3. the crime entered the public domain and thus became a historical fact, which must be accessible to the press and society. Besides, due to the broad dissemination given to the fact at the time it happened, including the investigation and trials, as well as the direct connection with the victim’s name, it would be unfeasible to ‘portray the Aida Curi case without Aida Curi;’
4. considering the specific situation, a restriction of the freedom of the press would be disproportionate compared to the discomfort generated by the remembrance of the facts by the victim’s family, particularly considering the long time elapsed since the date of the events, which has the power of reducing, even if not completely removing, the pain and shock caused by the facts and their dissemination.
In the second decision, known as the case of the Candelária Slaughter (Chacina da Candelária, Resp. no. 1.334.097/RJ), the scope of the lawsuit was to obtain civil compensation from Globo due to the broadcasting (in the same Linha Direta program) of a documentary about the facts, the investigation and the trial before the courts. The plaintiff of the original claim fought for the acknowledgment of his right to be forgotten, i.e. not to be remembered against his will for criminal facts for which he had been prosecuted and tried, but acquitted. He also alleged the absence of contemporaneity of the facts and that reopening ‘old wounds’ he had already overcome would reawaken the suspicion of society regarding his character. Considering the particularities of the case, the Court, recognizing the plaintiff’s claim, argued that
1. even if the crimes reported were famous and historical, and although the journalistic story was faithful to reality, the protection of the intimacy and privacy of the convicted and the acquitted should prevail, since the ‘useful life of the criminal information’ had already reached its end;
2. the acknowledgment of a right to be forgotten expresses ‘a cultural evolution of
society, conferring concreteness to a legal system that, between memory—which
is the connection with the past—and hope—which is the tie with the present—
made a clear option for the latter.’ Besides this, the right to be forgotten is a
‘right to hope, completely attuned with the legal and constitutional presumption
of the possibility of rehabilitation of the human person;’
3. the uncontested historicity of the facts must be concretely examined, affirming the public and social interest insofar as the personal identification of those involved is essential. Although the event is historical and a symbol of the precariousness of the State’s protection of children and adolescents, the documentary could have portrayed the facts correctly without identifying, by name or by image, those involved, particularly the appellee;
4. furthermore, allowing the dissemination of the name and image of the appellee, even though acquitted (which would only have reinforced his image as accused and involved in the crime), would amount to a second violation of his dignity, since the fact itself and its broad dissemination, including the name of the appellee as a suspect, as well as the police inquiry, already constituted a national shaming at the time they took place.
Considering its specificities, a third decision from the SCJ is worth mentioning, although it is not related to the internet environment. In this case, decided on March 28, 2020 (Recurso Especial—Special Appeal—Nr. 1.736.803/RJ, Justice-Rapporteur Villas Bôas Cueva), a right to be forgotten was also recognized by the Court, upholding the decision of the Rio de Janeiro State Appeal Court, which, as the first instance judge had done, ordered the defendant (an important magazine, Isto É) to pay immaterial damages for exposing the private life of one of the plaintiffs and her family, including children, in a report published in October 2012.
According to the SCJ, the public interest should prevail when the information disclosed with respect to a notorious criminal fact is marked by historicity, remaining current and relevant to collective memory, a situation not present in this case. The publication of a report whose content is exclusively limited to describing routine habits and private facts from the life of a contemporary person convicted of a crime, and of her family members, constitutes an abuse of the right to inform and infringes the right to privacy and the right to be forgotten.
Besides this, the SCJ argued that the media exploitation of personal data of a person released from the criminal justice system violates the constitutional principle of the prohibition of life sentences, the right to rehabilitation and the right to return to social life, guaranteed by the infra-constitutional legislation (the Criminal Code, the Criminal Procedure Code and the Criminal Conviction Execution Act). The extension of the effects of the conviction to third parties not related to the crime constitutes a transgression of the principle of non-transferability of the penalty, enshrined in Article 5, XLV, of the FC, and is especially serious when it affects children or adolescents, who are protected by law, which ensures them the right to integral protection and full development in a healthy way.
On the other hand, it is of major importance that the SCJ, as well as the state courts, acknowledged that, in the face of the evident social interest in preserving the historical and collective memory of a notorious crime, the plaintiff’s thesis that the right to be forgotten implies a prohibition of any future broadcasting of journalistic articles related to her criminal act cannot be accepted, for it clearly constitutes prior censorship, prohibited by the FC.
Particularly relevant for the discussion around the content and scope of a right to
be forgotten in Brazil is the fact that the FSC (after the Aida Curi case) acknowledged
the general repercussion of the discussion,42 since it would be possible to examine

42 Brasil, Supremo Tribunal Federal, Recurso Extraordinário com Agravo: ARE 833248 RG/RJ,
Justice-Rapporteur Dias Toffoli, judged on Feb 19, 2015. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=TP&docID=7810658. Accessed 12 Jan. 2022. It should be mentioned for the
better understanding of non-Brazilians that the FSC, by a qualified majority decision of its judges,
can accept the allegation of general repercussion (relevance and national character of the case and
a significant controversy and divergence in case law), which implies the suspension of all actions

in an extraordinary appeal (Recurso Extraordinário) the allegation that the right to be forgotten is an attribute that cannot be dissociated from the guarantee of human
dignity and that freedom of speech is not an absolute right and cannot have a higher
position than the individual guarantees of the inviolability of personality, honor,
dignity, private life, and intimacy of the human person. During the proceedings, the Justice-Rapporteur, Justice Dias Toffoli, called for a public hearing, held on June 12, 2017, besides having accepted the inclusion of various agents as amici curiae.
Members of academia, nongovernmental organizations, representatives of the media, people with technical expertise and others participated in the public hearing, in which three different currents were represented. The first, formed especially by the representatives of the media, sustained the prevalence of freedom of information and speech, and the impossibility of recognizing a right to be forgotten. The second advocated the prevalence of personality rights and, therefore, of the right to be forgotten, when there is an abuse of freedom of speech and, in general, in the case of information and opinions that are offensive to personality rights. The third current, with an intermediary approach, proposed that the prevalence of personality rights or of freedom of speech and information should be analyzed on a case-by-case basis, with careful balancing.43
Finally, almost four years after the public hearing, on February 11, 2021, the FSC decided the case and handed down a highly controversial decision. As regards the existence of the right to be forgotten, the FSC’s Full Court (except for one Justice, who preferred not to participate in the decision due to a conflict of interest) decided by majority (not unanimously) that the right to be forgotten is incompatible with the Brazilian constitutional system, at least in general terms and with the scope claimed in the Aida Curi case.
Five Justices expressly stated that the Brazilian legal framework does not
support the mentioned right (Justices Dias Toffoli, Alexandre de Moraes, Rosa
Weber, Cármen Lúcia and Marco Aurélio), while two Justices (Nunes Marques and
Lewandowski), although they likewise did not recognize the existence of the right as such,
pointed out the possibility of a case-by-case analysis. Justices Gilmar Mendes and
Luiz Fux sustained intermediary approaches: the former held that the mentioned
right may be recognized by way of a 'practical concordance' analysis,
through balancing with other rights and interests, while the latter stated that the
right cannot be recognized when the public interest is at stake. Justice Fachin was the
only one to recognize the right to be forgotten without reservations.
Although the judgment has general (erga omnes) and binding effect, a consid-
erable number of questions and possibilities remain open, including the recognition of a right to be
forgotten and of a right to deindexation with regard to search engine providers.
For this reason, the commentaries on the decision will be made after

that are going through the ordinary instances up to the definitive judgment of the case by the FSC,
when the decision begins to have general effects (erga omnes) and directly binds all of the bodies
and agents of the Judiciary.
43 For a summary presentation of the three currents, see Schreiber (2017a, b).
The Protection of Personality in the Digital Environment 153

the presentation of the jurisprudence of the SCJ related to the right to be forgotten
in the digital environment.

3.2 The Right to Be Forgotten on the Internet from the Perspective of the SCJ

Even if the general repercussion of the topic was acknowledged on the basis of a case (Aida
Curi) that did not involve the internet, the discussion at the public hearing (in which
representatives of Google and Yahoo had the floor) and the FSC's decision on the
merits also had a direct repercussion on the right to be forgotten online. Thus, taking
into account that no final decision of the FSC on a right to be forgotten on the Internet
has been recorded, and that a deeper evaluation of that
judgment will be undertaken considering both the offline and online jurisprudence
and developments in general, the goal here is to analyze the main decisions taken by
the SCJ up to the day the present text was concluded.
In this context it should be emphasized that the right to be forgotten on the internet
is not limited to the liability of search engine providers and to a right to deindex-
ation. Indeed, taking into account the scope of the right to data erasure established
in the CFI, the right to be forgotten, at least in the Brazilian case, where there is no
specific rule regarding deindexation by search engine providers, is also relevant
as regards content providers, especially the so-called social media, video- and
photo-sharing platforms, etc.
However, despite the recurrent judicial cases involving the liability of content
providers due to the non-erasure of all kinds of expressions considered untrue, incor-
rect, offensive to personality rights or even defined as crimes, it is striking
that, at least in the SCJ cases on this subject, a right to be forgotten has not
been directly invoked. For this reason, our attention will be given not to these deci-
sions, but rather to those that referred to a right to be forgotten and dedicated
themselves to understanding and applying it.
Be that as it may, simply not to ignore the point completely, it should be emphasized
that the dominant and consolidated case law of the SCJ44 can be summarized in the
following terms: (a) application and content providers, including social
networks, are not objectively liable for the insertion of illegal information on the site
by third parties; (b) they cannot be obliged to perform prior control of the content of
information posted by users; (c) however, when they are unequivocally informed
of its existence, they must remove it immediately, in order not to be held respon-
sible for the damage caused; (d) they must maintain a minimally effective system
that allows the identification of users, whose efficacy will be checked case-by-case,

44 In this sense, see Brasil, Superior Tribunal de Justiça, Recurso Especial: Resp. 1642560/SP, Justice-Rapporteur Marco Aurélio Belizze, judged on Sept 12, 2017 and Resp. 1629.255-MG, Justice-Rapporteur Nancy Andrighi, judged on Aug 22, 2017. http://www.stj.jus.br/portal/site/STJ. Accessed 11 Jan. 2022.

in order to comply with the prohibition of anonymity established explicitly by the
FC when ensuring the freedom of speech; (e) their liability is subjective, and it is
joint and several with the author of the posting when, upon having learned about the injurious
character of a given content, they do not take the appropriate measures to remove it.
Besides this, the SCJ requires the precise indication by the claimant of the URL of
the content considered illicit as a condition for obtaining the court order to remove
it, which, besides being a criterion to establish liability and avoid vague, impre-
cise orders, helps ensure effective control of the implementation of the decision.
This requirement corresponds to what is required by the CFI, in the sense that the
disputed content should be clearly and specifically identified.45
Particularly important for the present work, however, are the decisions of the SCJ
concerning the liability of search engine providers and the so-called right to dein-
dexation, as occurred in the paradigmatic (though no less polemical) case
of Google versus Agencia Española de Protección de Datos and Mario Costeja.
The first case that arrived at the SCJ became known as the Xuxa case, judged
in the Special Appeal 1316921, on June 26, 2012.46 It was a case brought
by Maria da Graça Xuxa Meneguel, then still a presenter of television programs for
children and adolescents, against Google Brasil Internet Ltda, aiming to suppress
from the search mechanisms any and all results of searches based on the term 'xuxa
pedófila' or any other expression that would associate the name of the plaintiff with
pedophilia or any other kind of criminal practice.
The SCJ, in assessing Google Search's refusal to accept the plain-
tiff's reasoning, ended up looking into the specific problem of a right to
be forgotten. As far as it matters to our study, here are, in brief, the reasons on which
the decision was based and which will be evaluated critically further on:
1. The provision in Article 14 of the Consumer Protection Code does not apply to
Google Search, whose activity is limited to operating as a search mechanism
and a search engine provider. Unlike what happens with content
providers, Google Search limits itself to indexing terms and 'indicating links
where the terms or expressions for search supplied by the user themselves can
be found.' That is why, in this type of activity, there can be no question of a
defective service;
2. Since the activity of the search engine provider is carried out in a virtual environ-
ment that allows public and unrestricted access, even if there were no search
mechanisms such as those offered by Google Search, the contents considered
unlawful would continue to circulate and be made available on the internet;

45 See, pars pro toto, Brasil, Superior Tribunal de Justiça. Recurso Especial: Resp 1642560-SP, Justice-Rapporteur Marco Aurélio Belizze, judged on Sept 12, 2017. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=201602427774&dt_publicacao=29/11/2017. Accessed 11 Jan. 2022.
46 See Brasil, Superior Tribunal de Justiça. Recurso Especial: Resp 1316921/RJ. Justice-Rapporteur: Nancy Andrighi. Third Panel, judged on June 26, 2012. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=201103079096&dt_publicacao=29/06/2012. Accessed 11 Jan. 2022.

3. Given the subjective and arbitrary character involved in the decision to remove,
or not, links, results and pages conveying offensive (illicit) contents on the
internet, this margin of discretion cannot be delegated to the search engine provider;
4. In the conflict between the protection of personality rights and the freedom of
speech and communication, the latter, especially in its collective dimension, should prevail
over the individual interests, and greater weight should be attributed to the right
to information.
A second relevant decision was taken by the SCJ in the Special Appeal 1407271/SP,
judged on November 21st, 2013.47 This was a case brought by a woman who was
dismissed by the company where she worked after a video with intimate scenes
recorded on the premises of the company was found in her corporate e-mail. This video was
posted on the internet, made available on Orkut and could be accessed through Google's
search engines. The claimant sought the delinking of all prejudicial
URLs from Google, the removal of any mention of her name from the Orkut site,
and information on all those responsible for publishing the message
that was offensive to her.
Already in the decision by the lower court judge, it was considered that there was
no way of removing all the pages that had posted the video, and the obligation was
converted into financial compensation for the damages caused. The SCJ, in turn,
refused the claimant's request. Besides reiterating arguments used in the previous
case, the balancing undertaken by the Court favored the freedom of informa-
tion, the claimant's request being deemed legally impossible because it was unreasonable
in the specific case. Furthermore, mention was made of the claimant's behavior as
naive and careless, because she had kept videos with intimate images in her
electronic mail.
A third case was the Complaint (Reclamação) no. 5,072/Acre, judged on
December 11th, 2013,48 regarding the request of Google Brasil Internet to over-
rule the decision of the lower courts, which had ordered the appellant to pay moral
damages in favor of the defendant (a judge) because his name was connected to
news about judges involved in practicing pedophilia.
During the judgment, there was an attempt to remove the fine imposed on the
claimant for not complying with the preliminary injunction, which had ordered
it to remove from the internet the records of the original page, as well as to suppress from
the search engines the name of the original plaintiff and his association with the
aforementioned matter, its reproductions and any and all topics related to pedophilia.
The outcome of this judgment was again a victory for Google, and the SCJ
overruled the decisions taken at the lower court level. The reasons invoked by the
SCJ in general replicated the arguments of previous decisions, so that here we limit

47 Brasil, Superior Tribunal de Justiça, Recurso Especial: Resp. 1407271/SP. Justice-Rapporteur Nancy Andrighi. Third Panel, judged on Nov 29, 2013. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=201302398841&dt_publicacao=29/11/2013. Accessed 11 Jan. 2022.
48 Brasil, Superior Tribunal de Justiça. Reclamação: Recl 5072/AC. Justice-Rapporteur Nancy Andrighi. Third Panel, judged on December 11, 2013. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=201002183066&dt_publicacao=04/06/2014. Accessed 11 Jan. 2022.

ourselves to mentioning the additional arguments, partly adjusted to the judgment of
the case now presented. In this sense, we highlight the consideration that, in case a
copy of the illicit text or image is recorded in the cache memory of the search engine
provider, the latter is obliged to exclude it preventively from the search engines once
it has been informed of the fact, as long as the URL of the original page is provided
and it is proven that the original page was removed from the internet. Furthermore,
since this is a specific measure to be carried out by a person different from the one who
posts the offensive content, and one related to a file that is not identical to the
original text or image, the existence of an individual request is imperative, as well
as a court order determining the removal of the cache copy. Finally, it is noteworthy
that, according to the court's reasoning, besides the fact that certain words and
expressions can be used and understood with distinct meanings and/or in different
contexts, the suppression (deindexation) of the information (even if offensive) from
the search engines would also obstruct the victim's broad access to the right of reply,
or even the broad circulation of possible material explaining the mistake.
Another case judged by the SCJ on May 10th, 2016 (Special Appeal no.
1.582.981/RJ, Justice-Rapporteur Marco Aurélio Bellizze)49 presents a few pecu-
liarities in relation to the Xuxa case. In this appeal, filed both by Google and by the
plaintiff, the lawsuit against Google concerned the liability of the Google
Brasil company for the fact that, despite the exclusion of the name of the plaintiff,
who had been affected by a third-party comment inappropriately associated with his
name and profession (lawyer), Google continued showing the aforementioned story
in its search mechanisms.
It should be noted that in the lower courts Google had been condemned and ordered
to review its search mechanisms, excluding the association of the name of the plaintiff
with the link www.tudosuper.com.br and its derivatives, under penalty of a daily fine
in case of noncompliance. The SCJ, in turn, although repeating arguments
from previous trials, decided that exceptionally the search engine providers may be
compelled to exclude from their database results that are incorrect or damaging,
'especially when there is no relationship of pertinence between the content of the
result and the criterion researched.' Furthermore, besides maintaining the imposition
of the compensatory fine, the SCJ underlined that the value of the fine must be
dynamic in character and adjusted so as to have 'real coercive force' to prevent
noncompliance with judicial decisions, as happened in that specific case.
Another important case was judged by the SCJ in November 2016 (Resp. no.
1.593.873-SP).50 It was a case brought against Google Brasil Internet (defendant) in
order to block searches based on the name of the plaintiff, because
they might show nude images of her. In brief, the arguments used in the previous
decisions were reiterated, especially in the sense that search engine providers are not
49 Brasil, Superior Tribunal de Justiça, Recurso Especial: Resp. 1582981/RJ. Justice-Rapporteur Marco Aurélio Belizze, judged on May 10, 2016. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=201502238660&dt_publicacao=19/05/2016. Accessed 11 Jan. 2022.
50 Brasil, Superior Tribunal de Justiça. Recurso Especial: Resp. 1593873/SP. Justice-Rapporteur Nancy Andrighi, judged on November 10, 2016. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=201600796181&dt_publicacao=17/11/2016. Accessed 11 Jan. 2022.

responsible for the content of the results of the searches carried out by the respective
users and cannot be obliged to eliminate results derived from searches based on a
given term or name. Besides, the SCJ considered (by a majority) that in Brazil there
is no specific legal provision imposing this responsibility on search engine providers,
and that imposing it would also mean entrusting them with the function of a kind of digital censor.
If we consider all the judgments mentioned here related to the internet, we can
see that the SCJ had been refusing to recognize the direct liability of search engine providers for
third-party contents on the internet, and had not recognized a right to be forgotten
as such in those cases.
This restrictive tendency seems to be undergoing a radical change, since the
SCJ recently decided (Resp. 1.660168/RJ, May 8th, 2018), by a tight majority (3
vs. 2), to affirm the responsibility of search engine providers for contents that can be
accessed through their search engines. In order to better understand the case, to
review the arguments handled in the former decisions and to identify the criteria that
supported the judgment, we present the main aspects of the case and the court's reasoning.
The case concerns the judgment of the special appeal filed by the companies Google
Brasil Internet Ltda, Yahoo do Brasil Internet Ltda and Microsoft Informática Ltda
against a judgment of the State Appeal Court of Rio
de Janeiro, which, in turn, had overturned the lower court judge's decision not to
acknowledge a right to be forgotten of the plaintiff, a member of the State
Prosecution Office. The plaintiff sought an injunction requiring the defendants to
install a keyword-based filter in order to avoid the association of her name
with an alleged fraud that occurred in 2007, on the occasion of a public contest (state exam)
to become a judge.
It should be noted that at that time the National Council of Justice carried out an
investigation, which, however, did not find enough evidence of wrongdoing
or violation of the law. Nonetheless, even after that decision, the name
of the plaintiff continued to be indexed and associated with data related to the topic
'fraud in competitive public examination for the bench.' This led to the filing of the
suit, based on the allegation that this affected the plaintiff's privacy and image, given
that she already held another public office in the legal area.
As regards the reasoning, it should be mentioned that the lower
court judge did not acknowledge the liability of Google as a search engine provider,
basically putting forth arguments underscoring the guidance provided by the
SCJ until then. That decision was overruled by the State Appeal Court on the ground that,
given the case's circumstances, the personality rights of the then appellant should
prevail, in order to avoid the circulation, for an unreasonable length of time, of news
that might have a negative repercussion on the present lives of individuals. Thus, the
State Appeal Court ordered the abovementioned search engine providers to install content
filters that would disassociate the name of the plaintiff from information and opinions
related to the alleged fraud, under penalty of a daily fine of R$ 3,000.00 (about USD
800.00).
In the special appeal filed, the defendants (Google and others) alleged, basically,
(a) that the court's decision involved a violation of provisions of the Civil Procedure
Code and of the Civil Code because it imposed a technically and legally unfeasible
obligation; (b) that the obligation imposed had no usefulness whatsoever, since
disassociating the appellee's name from the search engines does not prevent the main-
tenance of stories in which her name is mentioned on the internet; (c) the applicability
of the SCJ's reiterated view on the impossibility of holding the providers liable under
such conditions; (d) that the order determining the filtering of search results implies
censorship and violates the rights of consumers who use their search services.
Finally, at the level of the special appeal judgment, the SCJ decided, by
a majority, to partially grant the appeal by reducing the amount of the fine. As far as
the reasons put forth by the Justices are concerned, it should be highlighted that the
Justice-Rapporteur, Nancy Andrighi, maintained in general terms the position upheld
in several previous decisions, in which requests for the deindexation of contents posted
by third parties and accessed through the search mechanisms provided by search engine
providers were denied. Agreeing with the Rapporteur's opinion, Justice Ricardo
Villas Boas Cuêva claimed that the decision by the State Appeal Court of Rio de Janeiro
had denied the effectiveness of Article 19 of the CFI, particularly because it had
issued a generic order without identifying in a clear and specific manner the content
considered harmful and enabling its location, which is the reason why the author of
a request for the exclusion of data should indicate the URL. Furthermore, according
to the legal provision cited, providers can only be held liable for contents posted by
third parties when, once notified, they fail to take measures to make the material
considered harmful unavailable.
Regarding the dissenting opinions, which finally prevailed, Justice Marco Aurélio
Belizze granted the appeal only partially, by reducing the amount of the stipulated
fine. Regarding the acknowledgment of the right to be forgotten in the specific case
at hand, he upheld the decision by the State Court of Rio de Janeiro basically on
the following grounds: (a) there is no difference between the norms applicable in
Europe and in Brazil, because in both cases what is at stake is the liability of search
engine providers (query mechanisms), which select and establish a hierarchy of
information based on algorithms, independently of the content of the data to which
they provide access; (b) although the measure taken by the State Court of Rio de Janeiro
is not explicitly provided for, it is supported by the CFI, which may be the case in
exceptional situations in which the access to information produces a disproportionate
impact, particularly as regards interests of a private nature, and even in the presence
of a collective interest, when a long period of time has elapsed since the occurrence
of the facts whose dissemination on the internet is seen as harmful; (c) in the specific
case, even after a period of two years had elapsed, the provider still indicated as the
most relevant piece of news associated with the plaintiff's name the one related to
the public examination, and even after a whole decade that data is made available
as if there were no other, later information on the issue; (d) the plaintiff's
request is specific, in the sense that the indication of her name should no longer be
used as the exclusive criterion, unconnected from any other term, relating it to
the mentioned fact that was detrimental to her personality rights; (e) insofar as that
result appears and is maintained by the site, a feedback effect arises, since the
user, when performing a search based on the plaintiff's name and getting the link
to that particular piece of news, will access the content, which, in turn, will reinforce

the automated system's indication that that page is relevant; (f) access to that
data is not prevented, because the sources that disseminate it and even mention the
plaintiff's name remain available on the internet; (g) the order to install filters aims
at preventing that, when the plaintiff's name is used as the exclusive search criterion, the
information about an alleged case of fraud that took place more than ten years ago
be accessed first.
After the position taken by Justice Moura Ribeiro, who adhered to the dissenting
opinion, Justice Paulo Sanseverino had to deliver the tie-breaking opinion. He favored
the dissenting opinion, not granting the appeal, based on the following reasons: (a)
the specific case, differently from those in which the content providers' liability is
at stake, has to do with the right to prevent that, when a search is made using the
query mechanisms of the search engine providers and refers only to the person's name,
without any other binding criterion, the prioritized piece of information continues to
be, after so many years, the facts that impact the plaintiff's rights; (b) for this reason,
as in the case Google versus Mario Costeja González judged by the
European Union's Court of Justice, and considering the peculiarities of the specific
case, the right to information should not prevail, due to the disproportionate impact
on the plaintiff's personality rights.
In the light of the decisions mentioned here, it is necessary to make a critical
evaluation of them, which will be done in the next section.

4 General and Critical Analysis of the Current State of the Recognition and Implementation of a Fundamental Right to Be Forgotten on the Internet in Brazil

4.1 Did the FSC Close the Door to the Right to be Forgotten?

The FSC's decision of February 2021 was immediately followed by huge contro-
versy and criticism.51 On one point, however, an almost absolute consensus can be
identified, namely that the Court correctly rejected the application of the right to be
forgotten in the so-called Aida Curi case, due to its circumstances, already presented
and discussed to some extent in connection with the decision of the SCJ (see 3.1).
The most important question that remains open and must be answered is
whether the judgment of the FSC, denying the right to be forgotten in the Aida
Curi case through a directly binding decision (a mandatory precedent for all other
Brazilian judges and courts) with erga omnes efficacy, closed the door to future
developments, even those admitting some dimensions of that right.

51 Brasil, Supremo Tribunal Federal, Recurso Extraordinário 1.010.606, Justice-Rapporteur Dias Toffoli, judged on Feb 11, 2021. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=TP&docID=755910773. Accessed 11 Jan. 2022.

The answer that has been gaining more and more supporters is positive, in the
sense that the door to the right to be forgotten remains open. In fact, closely examining
the Justices' votes, a first remark to be made is that the Court underlined that
the right to be forgotten could not be accepted with the broadness and scope sustained
by the family members of Aida Curi. Besides this, the Justices argued that, in cases of
abuse of the right of freedom of speech and information, and depending on which
other fundamental rights and constitutional principles and goods are affected and
how they are affected, it is necessary, in each specific case and considering the
circumstances, to conduct a balancing approach in order to evaluate what kind of
measures are needed to solve the problem. Another aspect worth mentioning is that the Court
did not decide on the right to be forgotten in the internet domain, leaving the way
open in this regard, mainly concerning the so-called right to deindexation.
Despite the existence of other arguments sustained by the Justices and advanced
by the Court's critics that could be presented and explored, the fact is that the FSC
denied the 'label' right to be forgotten, but not the possibility of recognizing and applying,
under certain circumstances, some dimensions that have been associated with this
right. The first and probably most important dimension (or instrument) of a so-called
right to be forgotten online is the right to deindexation, which will be commented on
in the next section.

4.2 The Controversy Around the Liability of Search Engine Providers and a Right to Deindexation

A first aspect to be discussed here, although it is specifically applicable to the internet
domain, refers to the liability of search engine providers due to the fact that they
ensure access to contents considered harmful to the interests and rights of particular
persons, taken individually or collectively, and even to the interests of public and
private institutions.
This liability, which was not acknowledged in most cases, was, as we saw
in the previous section, recently recognized in a decision taken by a tight majority
(one vote of difference). This means that the SCJ adhered, assuming that this decision
will be confirmed in future trials, to the position that is apparently predominant in
Europe, particularly due to the CJEU's decision in the case of Google vs. the Spanish
Data Protection Agency and Mario Costeja, and to the fact that it is (at least partially)
supported by the EU's new General Data Protection Regulation.
Besides this, despite the lack of a specific provision of a right to deindexation
vis-à-vis search engine providers in statutory law, the Brazilian normative framework,
if considered as a whole and from a teleological and systematic perspective, assures
such a right.52

52 See Sarlet (2018). For further development of the topic (liability of search engine providers and deindexation), see, among the literature available in Brazil, Gonçalves (2016).

An additional argument is that once a right to erasure is recognized in some
instances, a right to deindexation can be deduced, since it is a way to implement
informational self-determination and control over the use and dissemination of
information on the internet.53
In this context, it cannot be ignored that, from a strictly technical point of view,
there is a difference between data processing by algorithms and by human
action, for several reasons: (a) algorithms do not refer to meanings, but rather to data;
(b) the algorithms as currently employed, here focusing on the internet (and search
engines), do not seek to reproduce the human forms of processing information,
which (without detriment to other aspects of a technical nature relevant to the right
to be forgotten) implies difficulties in establishing liability for the manner and the
results of data processing,54 difficulties that were precisely used as an argument
by Google to claim non-liability for the use of and results obtained with its search
engines, be it in the case judged by the CJEU or in the other cases in which it was
summoned to answer in court, including the trials before the SCJ.
However, the fact is that, in the wake of the Google decision by the CJEU
and from the technical point of view (on which this decision was based), search
engine providers can no longer be considered purely and simply mere intermediaries
between the users and, e.g., the content providers, since the algorithms used for their
operations imply a form of data collection and processing. Indeed, the search engines
automatically, continuously and systematically search for information published on
the internet and then proceed to select, store and organize it, for instance, by
hierarchizing the information sought in terms of its order of appearance on their
pages.55
Here one might add that, independently of the technical aspects mentioned,
including the argument that the companies and programmers that produce and utilize
algorithms do not themselves properly know the data and do not control the results of
their processing, this cannot lead to the legal immunity of search engine providers,
if only because it would imply the impossibility of protecting
users and third parties, precisely in the face of the great losses caused by the
massive accessibility of contents that could cause real and unjustified harm to their
personality rights.
It should be noted, even without entering the terrain of liability, at least not
when the contents are removed due to a court order, as acknowledged in the SCJ
case law, a situation in which there is a subjective type of responsibility, that the
possibility of, under certain circumstances, determining deindexation by search
engines is still the most appropriate instrument for an effective right to be forgotten
on the internet and, thus, for the protection of personality rights.
It should be underlined that the Law Bills going through the National Congress, such as Law Bill 2,712/2015 and Law Bill 1,676/2015,56 that provide for

53 In this sense, see Nolte (2014), p. 2240.
54 See Esposito (2017), p. 2.
55 For the European case see, among so many, Stehmeier and Schimke (2014), pp. 661–682.
56 See point 2.2 of this article.
162 I. W. Sarlet

and regulate not only a right to be forgotten in general terms, but also a right to deindexation vis-à-vis the search engine providers, create at least a solid prospect that, if enacted, they will lead to an effective change in the case-law of the SCJ, reinforcing the for now isolated decision mentioned above, unless the FSC declares them unconstitutional.
Even if the issue regarding the liability of search engine providers were solved, other
questions still remain to be considered, as they have been only partly—and not always
adequately—dealt with in the above-mentioned decisions of the SCJ. These aspects
refer both to the internet domain and to other cases involving a right to be forgotten.
However, it should be noted that in the case of the former there are peculiarities that
must be highlighted and analyzed in a particularly careful manner.

4.3 Aspects Related to the Criteria for the Application and Delimitation of the Content and Scope of the Right to Be Forgotten

Considering that the right to be forgotten is a specific and partly autonomous expression of the protection of personality in general terms and of various specific personality rights, and that, as a rule, its recognition implies affecting other rights, principles and legal interests of a constitutional stature (notably freedom of speech and information), the central theoretical and practical problems concern the determination of its limits and of the criteria for applying it to specific cases.57
In this context, we can find that the right to be forgotten, as a fundamental right,
may cover a diversified range of subjective positions, each with its own object, which may possibly be cumulated.58 Indeed, the right to be forgotten may
primarily consist of a right to require the cancellation (erasure) of certain information
(data) conveyed through the media in general and on the internet in particular.
This possibility, as mentioned, is already provided for in Brazilian law, even
if to a limited extent, in the Consumer Protection Code and in the CFI.
Another alternative, which has proved particularly controversial in Brazil, is to ensure
that the holder of the right has the prerogative of demanding from the search engine
providers that certain links of the respective search engines be deindexed. Likewise, the right to be forgotten may include the correction of information or even
the suppression of the identity of those who feel that they have been harmed. In
addition to such possibilities, there are the instruments of the right to reply and
compensation for damages caused by the dissemination of information considered
prejudicial, besides criminal liability (e. g. in the cases of verbal abuse, defamation
and slander), but they are mechanisms that do not directly imply suppressing or hindering access to information, although they may have a preventive and dissuading

57 Cf. also the considerations of Schimke (2022), in this volume.
58 Cf. also, with regard to the fundamental right to data protection in general, Albers (2016), pp. 32 ff.
The Protection of Personality in the Digital Environment 163

effect in relation to new abuses. The right to reply (through an electronic form),
used in a proportional manner, has the advantage of providing a kind of communicational adversarial proceeding, ensuring the possibility of supplying an alternative
view, whereas the resort to financial compensation and even punitive damages may
be useful in the sense of dissuading new violations of personality rights.
Likewise, taking seriously the duties of state protection of fundamental rights,
the range of alternatives is even larger, including—beyond the criminal and civil
liability—the regulation and guarantee by the State, e.g. of free competition among
the media and internet actors or the creation of judicial and extrajudicial procedural
mechanisms and guarantees. Thus, both from the point of view of the subjective and
the objective dimension (that interact and complement each other, but may also enter
a collision course), one can see that there may even be a cumulation of measures,
depending on the case (erasure, deindexation and civil and criminal liability, for instance).
This aspect is even more important when one notes that the levels (the intensity) at which fundamental rights and other legal interests of constitutional stature are affected also differ, which must be duly taken into account in the judicial control of restrictions, especially as regards the proportionality criterion. Measuring
the legitimacy of the restriction must consider the nature and impacts of the restrictive
measure. From this perspective, besides the test of suitability—i. e. of the possibility
of achieving the intended result through this measure—it is necessary to assess the
alternatives available for this purpose and verify what is the respective impact, i. e.
the intensity of the restriction imposed according to what is indicated above. At least
in principle, determinations of data erasure and of deindexation from the search engines
on the internet are weightier (from the point of view of the scope of restricting the
freedom of speech and information) than the suppression of the identity of those who
feel harmed or even rectification (depending on the case and scope) of information.
In view of the previous considerations and the range of possibilities theoretically available to implement the right to be forgotten, affirming the protection of the personality vis-à-vis the freedom of speech and information, it can be seen that, as
a rule, the decisions of the SCJ do not give due attention to the problem of a careful
differentiation and assessment of these alternatives and their impact on the rights and
interests at stake in the context of a balancing criterion.
An exception occurs in the case of the Candelária Slaughter, in which it was
pointed out that in the disputed television program the identity of the plaintiff could
have been suppressed, rendering her protection compatible with the freedom of information as regards an embarrassing public exposure. Moreover, the argument (used in
one of the cases involving Google) that with the deindexation of the search engines
one would be at the same time hindering access to the right to reply exercised by the
plaintiff may be a guideline that provides grounds to solve certain cases. The same
applies to the behavior of the one who felt harmed by the allegedly resulting losses,
although one may question whether this indeed should be considered in a judgment
issued by the SCJ.

However, although there are a few possible criteria to apply the right to be
forgotten, these are isolated arguments that, in our opinion, do not detract from the substantial correctness of the criticism presented here.
Another point that should be emphasized is that in the decisions (especially in those involving the internet), the nature of the act considered prejudicial—an examination that should be combined with that of the impact of the measures implementing the right to be forgotten—was not sufficiently taken into account, such as the distinction between true and false information, or even information of an unlawful nature, and its more or less harmful character (i. e. damage intensity).
But there are yet other aspects to be emphasized and that concern the criteria for
a recognition of the right to be forgotten and its possibilities of being implemented.
Since this was not examined in the cases involving the internet, it is necessary to refer
to the trials of the cases of Aida Curi and the Candelária Slaughter.
In both cases, it is noteworthy that the passage of time was invoked as a criterion,
but in a different sense. While in the Aida Curi case it was found that the long
time elapsed since the facts mitigated the pain and embarrassment caused by its
reconstitution, contributing to a verdict unfavorable to the plaintiffs, in the Candelária
Slaughter case the opposite occurred, as it was argued that the useful life of the
information about the investigation and criminal case had been exhausted.
Although the time elapsed may be a useful and relevant (but not exclusive) criterion for the balancing approach, in both cases it was not necessarily employed in the best possible manner. In the Aida Curi case, it can be objected that, given the
traumatic nature of the facts (murder and rape), no matter how much time has gone by,
the effects on the close family continue to be serious and their public remembrance
may potentiate them. In the Candelária Slaughter case, in turn, as the claimant had
been acquitted in the criminal case, the situation should not be confused with cases of
a criminal conviction verdict, because in this situation there are legal rules (as is the case in Brazil) prohibiting the publication of criminal records, except for use in new criminal investigations and proceedings. Besides this, the Candelária Slaughter
case cannot be directly compared with the famous German Lebach case, where the
plaintiff had been convicted and was about to be released, and the reconstitution of
the crime by a TV program would compromise severely his resocialization.
Another argument employed by the SCJ concerns the historically relevant content
of the information and the public interest in its dissemination. In both cases, this
condition was affirmed, but, even so, in the Candelária Slaughter case, it did not
result in the denial of the claim, since here the alleged exhaustion of the informational
value and the embarrassment caused by the new dissemination of the facts ultimately
worked in favor of recognizing the request for compensation. It should be noted that
in both cases true facts were at stake, whose historical and informational value was
recognized, but the criterion of the time elapsed since the crimes occurred was valued
distinctly for reasons that do not seem sufficiently convincing.
In the cases concerning the internet environment, there are some peculiarities to
be considered. In the Xuxa case, the fact in itself—making the film showing sexual
relations with a 12-year-old minor—is true, but the imputation of pedophilia (taking
into account that it is a representation) became distorted, which was prejudicial to the
image and professional activity of the plaintiff against Google. The accusation against
the judge who had been investigated for involvement in pedophilia was dismissed,
an aspect that was not recognized as relevant to justify the plaintiff’s request. On
the other hand, in the case of the employee dismissed because the videos that she
herself produced showed intimate scenes, although here the veracity of the facts
and authenticity of the videos was not challenged, in the balancing criterion greater
weight was given to the argument that it was the plaintiff herself who was careless and
therefore contributed to the dissemination of the material, an argument that makes
sense in this case and can be useful to solve similar ones.59
Although the criteria of historical relevance and public interest in the information, of time elapsed, and of exhaustion of the useful life of the criminal information have been used,
the decisions are weak as regards the individual justification of their weight in the
balancing and as regards their intrinsic consistency. On the other hand, the criterion
of veracity of the facts and the manner in which they are disseminated, the possible
offensive and unlawful content of the expressions, as well as the respective impact
on the fundamental rights of the person allegedly harmed practically were not the
object of analysis by the SCJ.
Still in this context, it seems timely to mention that one of the criticisms against
the right to be forgotten is that recognizing it—which, in our view, is demonstrated
especially in the cases of a right to the erasure of information or its deindexation
from the search engines—would imply compromising the collective memory and,
consequently, the potential annihilation of a right to memory and to historical truth.60
Conversely, however, some claim that the right to be forgotten does not aim at erasing certain facts or at wiping out or rewriting History, because it deals only with the right of individuals to rise against a particular projection of themselves, extracted from facts and their respective past evaluations, if it causes them relevant current damage and has an impact on their personality rights.61
In our view, however, this reasoning only partially supports each of these approaches.
As already mentioned, recognizing a right to be forgotten does not mean recognizing
a broad right to erasure of information. Rather, as this is a case of conflict with the
freedom of speech and information, and even of a right to memory, we are dealing with
a problem related to the content and limits of the fundamental rights involved. On the
other hand, it is also not completely correct to say that the right to be forgotten has no
connection to erasing facts or rewriting History, but only with impugning a particular
projection extracted from past facts and evaluations. Indeed, although the facts themselves cannot be erased, because they have materialized, information (including the mere depiction of facts) and certain evaluations can be the object, at least to a certain extent, of an erasure, which does not eliminate the circumstance that what was at stake—perhaps even preponderantly—is the insurgence against individualized projections about someone.

59 See, among others, Spindler (2013), p. 1001.
60 See again Sarmento (2016), pp. 11 ff.
61 See, in this sense, the observation by Schreiber (2017a, b).

Likewise, it is striking that no great weight was given to the situations which involve information and expressions that, although they may be legitimately
seen as prejudicial and highly impacting on the personality rights from the individual
perspective, are relevant from the point of view of their historicity and informational
value, independently of time elapsed. Here, indeed, it would be possible to adopt—
at least as an alternative criterion to be evaluated case-by-case, as was suggested in
the Candelária Slaughter case—the suppression of the identity of those involved,
without in this way placing an obstacle to the free and generalized access to the
content of the information.
Also, as to the criteria to recognize and apply a right to be forgotten, one can
find the absence of prioritization in using the various possibilities provided for in
positive law, which, since they already correspond to options chosen by the democratically legitimized legislator, should be the first to be taken into account, even if their
constitutionality may be challenged, depending on the case.
Thus, for instance, the legal legitimacy of decisions that—bearing in mind the
criminal law, the laws of protection of children and adolescents and of consumer
protection—determine the exclusion of such data as well as their deindexation from the search engines should, as a rule, be recognized and can, mutatis mutandis, be
extended to similar situations, as indeed follows from the CFI. The latter, besides, expressly grants the internet user a subjective right to demand the exclusion of certain pieces of information that they themselves have supplied, and already represents a mechanism—although still a partial one—to concretely implement the right to be forgotten.
Another point to be underscored, and one likewise not even mentioned in the decisions discussed, concerns the State's duties to protect fundamental rights in the
sphere of relationships between private persons, which, in the case of the protection
of the personality rights on the internet, is particularly relevant and implies problems
that are complex and difficult to solve, which also applies to the right to be forgotten.
Indeed, when the interests of private players that are powerful from an economic
and even political point of view—players that are powerful due to their capacity of
exerting pressure and even manipulating the legislative and regulatory processes, in
general even on an international scale, as is precisely the case of Google, Facebook
and others—are at stake, one finds a great imbalance between the parties involved in
the web of legal relations that are established among users, providers, etc. Moreover,
one should not ignore that the view that the digital environment is ruled by private
autonomy encounters strong criticism due to the fact that a large part of the goods
and services made available are only accessible through adhesion contracts, not to
mention the fact that, because of the need to use several of these services, in many cases a practical (factual) obligation to contract is established, which literally annuls individual autonomy and the fundamental right to free informational self-
determination, acknowledged in the FC and, as already mentioned, in the statutory
law.62

62 On the problem of the power wielded by the large corporations and their impact on autonomy in
the digital environment, see, among so many others, Hoffmann-Riem (2017), pp. 121–142.

For these reasons, in the internet environment, as in the media in general, one cannot accept a sphere of action exempt from fundamental rights, generating a kind of immunity, which is all the more dangerous the more powerful the private actors are.
That is why a strict control of the restrictions on fundamental rights, including in the sphere of private relations and in a preventive manner, is to be carried out by the Courts, paying attention primarily to legislative options, while watchful as regards their
possible unconstitutionality. In turn, this implies taking seriously the requirements of
the proportionality test, not only in the sense of forbidding an excessive intervention
(restriction) in the sphere of protection of the fundamental right affected, but also—as
a result of the duties to protect—in the sense of forbidding an insufficient protection
of one or some of the fundamental rights in question.63
In the case of the SCJ decisions regarding the right to be forgotten on the internet,
it appears to us that the mentioned aspects were not taken into account, at least
as regards the abyssal asymmetry between the parties involved and a necessary
compensatory correction in favor of the service users. Even if the SCJ denied the
liability of search engine providers because they are mere intermediaries, the fact is
that it still took a position about aspects connected to the merits of recognizing a right
to be forgotten. Besides, the SCJ—in the case of litigation involving providers of
content and of sharing—acknowledges a right to erasure of given contents, applying
in these situations the Consumer Protection Code and the Civil Framework of the
Internet, as already mentioned.
Under the circumstances presented, and to offer here a brief summary, it can be
said that, despite all the criticism of the decisions already advanced, the criteria used
by the SCJ (although, as a rule, not in the cases involving Google, because it did not accept the latter's liability) coincide in general terms with those adopted by the
CJEU in the case of Google Spain vs. Agencia Española de Protección de Datos
and Mario Costeja: (a) the nature of the information, particularly its character that is
sensitive to the private life of the person affected; (b) the public interest in access; (c)
the role that the person who feels harmed plays in public life; (d) the fact that—here
according to the European data protection regulation then in force—the data should
only continue to be accessible insofar as this is necessary for the purposes that led
to their collection and processing; (e) the non-determining character of the legality
or illegality of the content on the internet.64
Besides this, the SCJ—which is explained by the peculiarities of the case—emphasized the time factor, in the sense that the longer the time elapsed from the original
event that one should ‘forget’, the lower the impact on the rights of those involved, to
which the criterion of exhaustion of the informational value is added. In addition, the
possible contribution of the person allegedly affected by the circulation and exposure
of contents considered prejudicial was mentioned in one of the decisions, and this
is a criterion that may, depending on the specific case, contribute to an adequate
solution.

63 For the internet environment, see, pars pro toto, Schliesky et al. (2014), pp. 119ff.
64 See the summary by Holznagel and Hartmann (2016), p. 228.

Another noteworthy point concerns the SCJ's decision of May 8, 2018, which
for the first time acknowledged a right to the deindexation of search mechanisms. In
this case, as appropriately pointed out by Carlos Affonso Souza,65 the lack of indication of specific addresses (URLs) ended up imposing on the search providers a duty of generic monitoring, which in turn also renders the effectiveness of the decision much more difficult. Furthermore, and again according to the critique made by
Carlos Affonso Souza, the installation of filters involves—depending on the criteria
used (with reference to the choice of certain keywords)—the risk of either reaching
a solution that does not meet the needs of the party that feels aggrieved (and in
whose favor the right to be forgotten has been acknowledged) because the filtering
is insufficient or—which in our view is more serious—of preventing the disclosure
and dissemination of lawful contents and even of contents of general interest.
Thus, in light of the above discussion as a whole, one can see that also in Brazil
there are many open polemical issues as well as a number of important criticisms that
can be levelled at the SCJ’s decisions regarding a right to be forgotten and especially
the criteria used to recognize it.

4.4 Procedural Issues

Even though it is not possible to further explore the topic, it is necessary to say a few
words about the problem of the effectiveness of a right to be forgotten, which covers both judicial protection and extrajudicial means. In this sense, Brazilian procedural law already has a considerable collection of guarantees and instruments available, which, although not targeted specifically at the right to be forgotten, are also applicable to it.
A first guarantee, which can be considered quite robust since it corresponds to a practice well established throughout the national territory (although with major regional and local differences), is the right to broad judicial protection, guaranteed as a fundamental right in Article 5, XXXV, FC (‘the law shall not exclude
any injury or threat to a right from the consideration of the Judicial Power’). This
guarantee is complemented (Article 5, LXXIV) by the right to free legal aid for all
who show that they have insufficient funds to pay the expenses of litigation without
compromising their own sustenance and that of their family, a right that includes,
besides the costs of litigation proper, the waiver of the payment of lawyer’s fees
(unless there is a financial result if the case is won). Furthermore, the FC (Article
134) institutes a Public Legal Defense (at the federal and state level) ensuring that
private persons with a low income may have available a lawyer paid by the government, be it as plaintiffs or defendants in a judicial case and even for consultative
purposes. It is true that the Public Defense offices, too, are installed and structured in a very heterogeneous manner and with greater or lesser deficiencies in human

65 Souza (2018).

and material terms, but the system has meant a great advance and has ensured that
millions of people all over Brazil are served.
From the point of view of the procedural means available, the right to be forgotten
may be protected in any kind of proceeding, as long as the specificities of the case are
taken into account, including (the CFI also provides for this in Article 19, paragraph
3) before the Special Small Claims Court, through a more informal procedure and,
as a rule (up to a given limit as to the value involved in the case), without a need to
appoint a lawyer and without court fees.
Especially relevant for the case of data protection in general and the right to be
forgotten in particular is that the CFI established the court with jurisdiction based
on the place where the service is actually rendered. According to Article 8, sole
paragraph, item II, clauses in adhesion contracts that do not offer the contracting party the alternative of adopting Brazilian jurisdiction to resolve controversies arising from services rendered in Brazil are void.
Besides this, Article 11 provides that
In any operation of gathering, storage, custody and treatment of records, personal data or
communications by connection and internet application providers in which at least one of
these acts occurs in national territory, the Brazilian law and the rights to privacy, protection
of personal data and the confidentiality of private communications and records must be
mandatorily respected.

This rule is applied also to the data collected on Brazilian territory (and to the
content of the communications) if at least one of the terminals is located in Brazil
(paragraph 1), even if the activities are performed by a company with its headquarters
abroad, as long as it offers some service to the Brazilian users or some member of
the same economic group keeps an establishment in Brazil.
With this, Brazilian legislation is, in general terms, aligned with the European
regulation and with the Google decision of the CJEU,66 ensuring the user in Brazil
(even if a foreigner) direct access to legal protection, even though the immense difficulty of making such decisions prevail abroad is well known.
More difficult to solve is the problem of the possibility of a procedural prohibitory injunction—which is even provided for by the CFI—and of the judge's use of their power to order a variety of measures in a provisional manner, even anticipating the effects of the judgment in certain cases, as provided for in the Brazilian Civil Procedural Code.67 This, for instance, would be
the case of a relief that, before the final decision, were to determine deindexation by
the search engines, the canceling (or suspension of exhibition) and/or rectification of
certain contents, as well as the imposition of economic sanctions. The same occurs
in the case of a final decision on the merits, at the level of its enforcement, since
also in this phase there is still controversy about the constitutional legitimacy of the

66 See more closely Veit (2022), in this volume.
67 Preliminary injunction is regulated in articles 300 to 301, interlocutory relief in articles 303–304, provisional remedy in articles 305–310 and evidence-based relief in Article 311 of the Civil Procedural Code.

aforementioned measures, which points to the need to establish substantial criteria to recognize the right to be forgotten, a topic that has already been examined above.

5 Final Considerations and Open Issues

Concerning its recognition as a fundamental right in Brazil, it can be said that the
right to be forgotten has been accepted both in the literature and in various decisions
by the Brazilian SCJ. The FSC's decision of February 2021, as already seen, did not really close the door to at least some dimensions of that right, especially—but not only—in regard to the internet domain. However, even if the right to be forgotten as such is generally recognized, this does not mean that its scope of protection and its limits are uncontroversial; indeed, they remain distant from an adequate and consistent treatment by legislation, literature and judicial precedents.
Another aspect to be remembered is that although the Brazilian constitutional
and legal order covers diverse concrete expressions of the right to be forgotten,
there are still gaps to be filled. Whereas from the point of view of procedural law
(including access to Justice) there is already a major investment by the legislator, the
same cannot be said about the content and limits of the right to be forgotten, either
from its subjective perspective or much less as regards the State’s duties of protection
and their implementation.
In the case of the law bills being discussed in the National Congress, although a legal regulation of the right to be forgotten should be welcomed—covering the digital world and including, besides a right to demand cancellation of information, a right to deindexation from the search engines—the content of these bills demands careful evaluation. In general, the bills mention the prejudicial character of information regarding the personality rights and the need for the petitioner to demonstrate this condition. The absence of historical and public relevance as a criterion to recognize (or not) the right to be forgotten in each specific case is also considered, as well as the fact that, if persons who hold public positions are at stake, the protection of their personality rights (e. g. privacy, image, honor) is weaker than in other cases. From the procedural point of view, there is a reference to a direct procedure of the users vis-à-vis the media, including the internet, and, in case of refusal—in line with the orientation of the SCJ—the need to obtain a court order to prevent or repair the violation of the rights.
Another point to be mentioned is that as regards the action of the Judicial Power—
here examined based on the decisions of the Superior Courts—it is possible to identify
a still significant lack of coherence and consistency in decisions, especially as regards the criteria used to recognize the right to be forgotten in specific cases and to carry out the balancing in the face of conflicting fundamental rights.
It must be recalled that, although the SCJ mentioned the criteria of time elapsed,
of the prejudicial character of information regarding the privacy, honor and image
of persons, of the absence of historical value, of the low informational relevance or
its exhaustion and of the absence of public interest, no effort was made to determine precisely the content and scope of such references. Likewise, the nature of the information was not taken into account, such as whether or not it is true and even its substantive lawfulness from a criminal point of view. The same can be said
about using the legal rules in force that are applicable to the right to be forgotten
(prohibition of disseminating criminal records, negative data on consumers, etc.).
Even the requirements of proportionality—which also presuppose the examination
of the intensity of the restrictions imposed on the colliding fundamental rights—were
not taken seriously.
In addition, the cases that do not involve the internet—but in which criteria to apply the right to be forgotten were discussed—concerned primarily the acknowledgment of a right to compensation for the damage caused by the dissemination of facts considered prejudicial. There was no discussion of a right to prevent the dissemination, completely
or in part.
On the other hand, as regards the right to be forgotten on the internet, the
cancellation of data and, in the case of search engine providers, deindexation from the
search results should only be recognized after a strict evaluation of the peculiarities of the
specific case, always taking into account the impact of the media and the alternatives
available to ensure the protection of personality without ignoring the freedoms of
speech and information, as well as other relevant criteria, which still—as was also
seen in the case of Brazil—await adequate development.
The same applies to the recognition of a right to rectify information, to indemnification
in the sphere of civil liability and, on a smaller scale, to a right to reply,
which cannot become the rule, since, if used in a hypertrophied manner, they also
represent a menace to the freedoms of speech and of information. In the case of the
online right to reply, however, an instrument of this kind—like the forms made
available to users to request erasure or deindexation—would ensure, at a lower cost,
a flexible and effective means of countering the version considered incorrect
by the persons who feel that they have suffered damage.
Among the problems surrounding the rights to cancellation, rectification and deindexation
(beyond the discussion about the various criteria for their acknowledgment
and application in the specific case), mention should be made of the relevant
criticism—already voiced regarding the decision of the CJEU in the Google case, but
also mentioned in decisions of the SCJ—that attributing to search engine
providers, and even (in the case of cancellation) to content and sharing providers,
the prerogative of deciding which data to cancel or which links to deindex
may lead to a system of private censorship.68
Adopting—as already practiced by Google, among others—a model of online
forms available to the user (in which case the request may or may not be accepted by the
provider) limits, up to a certain point, the discretion of providers, but even so
the decision remains on the level of relations among private persons and entities,
attracting the objection that this would create a system of private censorship. A way of
getting around part of the problem would be to demand that the other party responsible

68 Holznagel and Hartmann (2016), p. 228.


172 I. W. Sarlet

for posting the content considered prejudicial on the internet be summoned in order to be
able to impugn the request, thus ensuring the necessary adversarial proceeding. But even
then, the issue of protecting third parties' right of access to information and the
problem of private censorship are not solved.69
In this context, it is necessary to make a few comments about one of the main
underlying issues concerning the solution of tensions and collisions between the
protection of personality—including a right to be forgotten—and the freedoms of speech
and information. This issue concerns whether or not each legal system adopts the
thesis of a preferential position of freedom of speech, and the respective scope of that
thesis—an aspect that has received very variable responses in different legal systems.
In the case of Brazil, from the strict perspective of the constitutional text,
there is no way to assert beyond doubt the thesis of the preferential position of
freedom of speech as a requirement of the framers of the Constitution, including in view
of the aspects highlighted by the recent FSC decision on the right to be forgotten.70
However, when we examine the most recent case law of the FSC, as well as
the position of a significant part of the literature (despite notable dissenting
approaches), it can be concluded that the thesis of the preferred position of the
freedoms of speech and information has prevailed.
This is also expressed as regards the right to be forgotten, as was shown in the
already mentioned—and insufficiently justified—decisions of the SCJ involving the
liability of search engines on the internet (decisions in favor of Google), but it does
not find support in all the decisions of the same Court involving the right to be
forgotten outside the digital domain, as occurred in the already mentioned case of
the Candelária Slaughter.
Among the cases judged by the FSC involving freedom of speech—although
not dealing with the right to be forgotten—one can mention the declaration of non-
reception, due to incompatibility with the FC, of the old Press Law enacted during
the military regime, on which occasion Justice-Rapporteur Carlos Britto stated that freedom
of speech occupies an almost absolute position and can only be limited
in the cases expressly established by the FC, specifically the right to compensation

69 It is true that Google, after the decision of the CJEU, besides granting most of the requests for
deindexation, usually informs the users after the deindexation and erasure of the search results,
although, according to the CJEU, there is no legal obligation in this sense, nor even an obligation
to ensure a prior opportunity of expression and impugnation to the owner of the internet page
(Holznagel and Hartmann 2016, p. 228).
70 It should be noted that, at the same time as any and all forms of censorship and the requirement
of a prior license to exercise freedom of expression are forbidden, the personality rights, specifically
the rights to privacy, intimacy, honor and image, were expressly characterized as inviolable,
which, in turn, was not expressly affirmed with regard to freedom of expression. In fact,
according to Article 5, ‘IV—the expression of thought is free, and anonymity is forbidden; … IX—
the expression of intellectual, artistic, scientific, and communications activities is free, independently
of censorship or license; X—the privacy, private life, honor and image of persons are inviolable, and
the right to compensation for property or moral damages resulting from their violation is ensured;
….’
and the right to reply.71 Likewise, two other cases deserve mention: the so-called
case of the 'marijuana parade', in which the FSC decided that a public and collective
demonstration in favor of legalizing the consumption of marijuana could not be
criminally classified as apology for crime,72 as well as the already mentioned case of
the unauthorized biographies, in which the FSC decided that it is unconstitutional to
require the prior authorization of the person about whom a biography is written.73
However, none of the cases named involved the dissemination of information or
expressions that were obviously untruthful or intrinsically offensive
(verbal abuse, defamation and even slander), nor situations in which
so-called hate speech can be identified. Especially regarding the latter, there
is no recent decision by the FSC, whose main precedent, from 2003, involving the
confirmation of the conviction for racism of an author and editor of a work that denied
the Jewish Holocaust during World War II, precisely does not support the thesis of
the preferred position of freedom of speech, even though three Justices dissented.74
Nevertheless, since acknowledging a right to be forgotten—especially as regards
subjective positions that imply the cancellation of data and/or significantly
hinder access to them—places a particularly restrictive burden on the
freedoms of speech and information, the guideline that measures restricting
rights should be interpreted restrictively must be taken even more seriously.75
In other words, the acknowledgment of a right to be forgotten that involves the
aforementioned mechanisms (cancelling and/or deindexation) must be exceptional
and meet a set of criteria that should be strictly controlled in the different situations,
which—as already perceived—takes on a particularly relevant dimension on the
internet, where the possibility of people participating directly in the communication
and information processes essential for democracy is acutely expressed.
Thus, not just any expression that exposes aspects of private life
will justify invoking and protecting the right to be forgotten. In the
necessary balancing between personality rights and the freedoms of speech and
information, the argumentative burden to make personality rights prevail must be particularly

71 Brasil, Supremo Tribunal Federal, Arguição de Descumprimento de Preceito Fundamental:
ADPF 130. Justice-Rapporteur Carlos Britto, judged on Apr 30, 2009. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=AC&docID=605411. Accessed 11 Jan. 2022.
72 Brasil, Supremo Tribunal Federal, Arguição de Descumprimento de Preceito Fundamental:
ADPF 187. Justice-Rapporteur Celso de Mello, judged on June 15, 2011. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=TP&docID=5956195. Accessed 12 Jan. 2022.
73 Brasil, Supremo Tribunal Federal, Ação Direta de Inconstitucionalidade: ADI 4815,
Justice-Rapporteur Carmen Lúcia. En banc court, judged on June 10, 2015. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=TP&docID=10162709. Accessed 11 Jan. 2022.
74 Brasil, Supremo Tribunal Federal, Habeas Corpus: HC nº 82424. Justice-Rapporteur Moreira
Alves, Rapporteur for Appellate Decision: Maurício Corrêa. En banc Court. Judged on Sept 17,
2003. https://redir.stf.jus.br/paginadorpub/paginador.jsp?docTP=AC&docID=79052. Accessed 11 Jan. 2022.
75 In this sense see Barroso (2007).

high, since, if there is any doubt about the constitutional legitimacy of the restriction,
the freedom of speech must be privileged—a parameter that must never be forgotten.76
In view of this, it can be stated that, also in Brazil, the determination of the content
and scope of the so-called right to be forgotten is still at an embryonic stage and
requires careful reflection capable of solving the related problems, be it on the academic
level (where it must be recognized that there are already noteworthy efforts) or—with
particular emphasis—in the political-legislative and judicial realms. In this
sense, besides the development of adequate parameters that are constitutionally
legitimate and sufficiently safe and effective, it is imperative to keep in focus the
real possibilities of a right to be forgotten, given the peculiarities of the internet.
This points, among other aspects, to the problem of the effectiveness of the right
to be forgotten—in the sense of the efficacy of decisions that may acknowledge it—both
from the strictly technical point of view and from the perspective of jurisdiction and
procedure, which can generate an incongruence between the objectives of recognizing
a right to be forgotten and reality, as shown, for instance, by the difficulty of enforcing
judicial decisions at the internal level and beyond state borders.
From the technical point of view, we have already pointed out that in
cases of deindexation the information remains on the internet pages, which can still
be accessed by using other search terms. If a search engine provider (or
even a content provider) operates in more than one country (which is precisely the
case of Google and Facebook, the most powerful and practically omnipresent ones),
even if a prohibition of access and/or deindexation is imposed in one State (or, in the
case of the CJEU, in Europe), it is still possible to find the information somewhere
else.
Besides this, there is the problem—which is not exclusive to conflicts that
involve the internet—of competition (and even conflict) between jurisdictions,
be it internally, in the relations between different States (e.g. the denial of a right
to be forgotten in one case and its recognition in another, or the use of different
criteria), be it at the level of the relations between supranational courts (as in the
case of the CJEU) and national courts.77
Further, no matter how much the development of regulation at the internal
level of the States is promoted, it is necessary to have a transnational network of
cooperation involving State actors, international organizations and the large enterprises
of the media in general and of the internet in particular, around reasonable and feasible
agendas, covering both regulation and incentives for (and even the regulation of) self-
regulation in this domain.78 Indeed, the creation of adequate regulatory frameworks
within the States will remain limited (staying only at the domestic level) without a
transnational regulatory structure, even if the latter is itself also limited in its scope and
efficacy. An example of new efforts is the case of the European Union with the new

76 See Sarlet (2014), p. 473.
77 As pointed out in the document IDPC (2014), pp. 4–5 and 13–14.
78 On the problem of internet regulation see, pars pro toto, Hoffmann-Riem et al. (2013). See also, with a focus on self-regulation, Hoffmann-Riem (2016).


regulation of data protection and its application beyond European territory,79
especially as regards data transfers, which also obliges recipient countries to take
measures consistent with the demands established by the regulation if they
wish to maintain their transactions with the states that are part of the Union.
Likewise, given the globalization of the internet and the flow of data, the formation
of islands of stricter protection of personal data (including a regime that is more open
to the right to be forgotten) may result in major distortions and asymmetries, including
competitive and economic disadvantages. It is enough to
illustrate this by recalling that firms move their head offices and even their affiliates
to escape stricter regulation, such as that of the European Union.80 These aspects,
in turn, likewise point to the already mentioned problems of transnational-scale
regulation and the creation of schemes of international cooperation, as well as to the
problem of competition and even conflict between jurisdictions.
Thus, closing this paper but leaving many windows and questions open, it can
be said that, despite the difficulties regarding its effectiveness in practical terms, in
Brazil the right to be forgotten can and even must be acknowledged as an implicit
fundamental right, directly related to the protection of human dignity and personality
rights. However, the content and limits of the right to be forgotten
must be submitted to careful regulation by the legislator, contemplating a right to
data erasure and to deindexation, as well as other appropriate entitlements to ensure
the protection of personality rights—such as a right to a digital reply, the
suppression of the harmed person's identity, etc.—and establishing criteria that are
constitutionally consistent and strict in their application.
Furthermore, it is expected that the Judicial Power, mainly the Superior Courts,
will continue to embrace the right to be forgotten and, at the same time, strictly control
its application, especially as regards its impact on the freedoms of speech and of
information, which must continue to occupy a preferred position. The FSC in
particular will play a decisive role in this context, because, among other open questions,
a specific ruling on the right to be forgotten on the internet—and mainly on a right
to deindexation—is still needed. The fact that, from the perspective of law, this
approach responds to only a small part of the problems and challenges
involved in the protection of personality rights on the internet does not appear to
us—as has also been the case in Europe and other countries—to be sufficient reason to
give up a right to be forgotten.

79 See more closely Veit (2022), in this volume.
80 Hern (2018) Facebook moves 1.5bn users out of reach of new European privacy law. Available via The Guardian. https://www.theguardian.com/technology/2018/apr/19/facebook-moves-15bn-users-out-of-reach-of-new-european-privacy-law. Accessed 11 Jan. 2022.

References

Albers M (2005) Informationelle Selbstbestimmung. Nomos, Baden-Baden
Albers M (2016) A complexidade da proteção de dados. Rev Bras De Direitos Fundamentais Justiça
10(35):19–45
Albers M, Schimke A (2019) Vergessen im Internet. Manuscript
Barroso LR (2007) Liberdade de expressão versus direitos de personalidade. Colisão de direitos
fundamentais e critérios de ponderação. In: Sarlet IW (ed) Direitos fundamentais, informática e
comunicação: algumas aproximações. Livraria do Advogado, Porto Alegre
Botelho CS (2017) Novo Ou Velho Direito? O direito ao esquecimento e o princípio da propor-
cionalidade no constitucionalismo global. Ab Instantia V (7), pp 49–71. Available via SSRN
https://ssrn.com/abstract=3130258. Accessed 11 Jan. 2022
Branco S (2017) Memória e esquecimento na Internet. Arquipélago, Porto Alegre
Buchholtz G (2015) Das Recht auf Vergessen im Internet. Eine Herausforderung für den demokratis-
chen Rechtsstaat. In: Archiv des öffentlichen Rechts (AöR). Mohr-Siebeck, Tübingen, 140:127
ff
Bull H-P et al (2016) Zukunft der informationellen Selbstbestimmung. Erich Schmidt, Berlin
Carello CP (2017) Direito ao Esquecimento. Parâmetros jurisprudenciais. Prismas, Portão
Caro MA (2015) Derecho al Olvido en Internet. El nuevo paradigma de la privacidad en la era
digital. Reus, Madrid
Consalter ZM (2017) Direito ao Esquecimento. Proteção da intimidade e ambiente virtual. Juruá,
Curitiba
de Andrade FS (2006) Considerações sobre a tutela dos direitos da personalidade no Código Civil
de 2002. In: Sarlet IW (ed) O novo código civil e a constituição, 2nd ed. Livraria do Advogado,
Porto Alegre
de Moraes MCB (2003) Danos à pessoa humana. Uma leitura civil-constitucional dos danos morais.
Editora Processo, Rio de Janeiro
Dechenaud D (2015) Le droit à l’oubli numérique. Donnés normatives. Approche comparée. Larcier
Bruxelles
Diesterhöft M (2014) Das Recht auf medialen Neubeginn: Die ‘Unfähigkeit des Internets
zu vergessen’ als Herausforderung für das allgemeine Persönlichkeitsrecht. Beiträge zum
Informationsrecht, vol 33. Duncker & Humblot, Berlin
Esposito E (2017) Algorithmic memory and the right to be forgotten on the web. Big Data Soc.
https://doi.org/10.1177/2053951717703996
Ferreira Neto AM (2016) Direito ao Esquecimento na Alemanha e no Brasil. In: Marques CL,
Benicke C, Jaeger Junior A (eds) Diálogo entre o Direito Brasileiro e o Direito Alemão.
Fundamentos, métodos e desafios de ensino, pesquisa e extensão em tempos de cooperação
internacional, vol 2. RJR, Porto Alegre, pp 278–323
Frajhof IZ (2019) O Direito ao Esquecimento na Internet: Conceito, Aplicação e Controvérsias,
Almedina, Coimbra
Gonçalves LH (2016) O direito ao esquecimento na era digital. Desafios da regulação de desvincu-
lação de URLs prejudiciais a pessoas naturais nos índices de pesquisa dos buscadores horizontais.
Dissertation, Escola de Direito da Fundação Getúlio Vargas
Hassler T (2007) Droits de la personnalité: rediffusion et droit à l’oubli. Recueil Dalloz 40:2829–
2832
Hern A (2018) Facebook moves 1.5bn users out of reach of new European privacy law. Avail-
able via The Guardian. https://www.theguardian.com/technology/2018/apr/19/facebook-moves-
15bn-users-out-of-reach-of-new-european-privacy-law. Accessed 11 Jan. 2022
Heylliard C (2012) Le droit à l’oubli sur Internet. Dissertation, Université Paris-Sud, Faculté Jean
Monnet – Droit, Économie, Gestion
Hoffmann-Riem W (2016) Regulierte Selbstregulierung im digitalen Kontext. In: Fehling M,
Schliesky U (eds) Neue Macht- und Verantwortungsstrukturen in der digitalen Welt. Nomos,
Baden-Baden, pp 27–51
Hoffmann-Riem W (2017) Reclaim autonomy. Die Macht digitaler Konzerne. In: Augstein I (ed)
Reclaim autonomy. Selbstermächtigung in der digitalen Weltordnung. Suhrkamp, Frankfurt am
Main, pp 121–142
Hoffmann-Riem W, Ladeur KH, Trute HH (eds) (2013) Innovationsoffene Regulierung des
Internets. Nomos, Baden-Baden
Holznagel B, Hartmann S (2016) Das ‘Recht auf Vergessenwerden’ als Reaktion auf ein
grenzenloses Internet – Entgrenzung der Kommunikation und Gegenbewegung. MMR: 228–232
IDPC (2014) Online privacy and freedom of expression. Available via UNESCO HQ. https://une
sdoc.unesco.org/ark:/48223/pf0000230176?2=null&queryId=e57da636-9f6d-457f-ad53-65a
b7aa5986d. Accessed 11 Jan. 2022
Kelly MJ, Satola D (2017) The right to be forgotten. Univ Ill Law Rev 1:1–65
Lepage A (2001) Droit à l'oubli: une jurisprudence tâtonnante. Recueil Dalloz: 2079
Maisch MM (2015) Informationelle Selbstbestimmung in Netzwerken. Rechtsrahmen,
Gefährdungslagen und Schutzkonzepte am Beispiel von Cloud Computing und Facebook.
Duncker & Humblot, Berlin
Maldonado VN (2017) Direito ao Esquecimento. Novo Século, São Paulo
Martinelli S (2017) Diritto all’oblio e motori di ricerca. Giuffré, Milano
Martinez PD (2014) Direito ao Esquecimento: A proteção da Memória Individual na Sociedade da
Informação. Lúmen Juris, Rio de Janeiro
Mayer-Schönberger V (2009) Delete: the virtue of forgetting in the digital age. Princeton University
Press, Princeton
Nolte N (2014) Das Recht auf Vergessenwerden – mehr als nur ein Hype? NJW 67(31):2238–2242
Rodrigues Junior OL (2013) Brasil debate o direito ao esquecimento desde 1990. CONJUR –
Consultor Jurídico, coluna de Direito Comparado publicada em 27.11.2013. https://www.conjur.
com.br/. Accessed 11 Jan. 2022
Rodotà S (2014) Il Mondo nella rete. Quali i diritti, quali i vincoli. Laterza, Roma
Sarlet IW (2014) Direitos Fundamentais em espécie. In: Sarlet IW, Marinoni LG, Mitidiero D (eds)
Curso de Direito Constitucional. Revista dos Tribunais, São Paulo, p. 473ff
Sarlet IW (2015) A eficácia dos direitos fundamentais. Uma teoria geral dos direitos fundamentais
na perspectiva constitucional. Livraria do Advogado, Porto Alegre
Sarlet IW (2018) Vale a pena relembrar o que estamos fazendo com o direito ao esquecimento. Avail-
able via CONJUR. https://www.conjur.com.br/2018-jan-26/direitos-fundamentais-vale-pena-rel
embrar-fizemos-direito-esquecimento. Accessed 11 Jan. 2022
Sarlet IW, Ferreira Neto A (2018) O Direito ao Esquecimento na Sociedade da Informação, Livraria
do Advogado Editora, Porto Alegre
Sarmento D (2016) Liberdades Comunicativas e ‘Direito ao Esquecimento’ na ordem consti-
tucional brasileira. RBDC 7(1):190–232. https://rbdcivil.ibdcivil.org.br/rbdc/article/view/76/70.
Accessed 11 Jan. 2022
Schimke A (2022) Forgetting as a social concept. Contextualizing the right to be forgotten. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Schliesky U, Hoffmann C, Luch AD, Schulz SE, Borchers KC (2014) Schutzpflichten und
Drittwirkung im Internet. Das Grundgesetz im digitalen Zeitalter. Nomos, Baden-Baden
Schreiber A (2014) Direitos da Personalidade. Atlas, São Paulo
Schreiber A (2017a) Direito ao esquecimento: críticas e respostas. In: Jornal Carta Forense.
Available via Colunas. https://www.academia.edu/35814707/Direito_ao_Esquecimento_Cr%
C3%ADticas_e_Respostas_Carta_Forense_. Accessed 11 Jan. 2022
Schreiber A (2017b) As três correntes do direito ao esquecimento. In: JOTA. Available
via Artigos. https://www.jota.info/opiniao-e-analise/artigos/as-tres-correntes-do-direito-ao-esq
uecimento-18062017. Accessed 11 Jan. 2022
Solove D (2011) Speech, privacy and reputation on the internet. In: Nussbaum M, Levmore S (eds)
The offensive internet. Speech, privacy and reputation. Harvard University Press, Cambridge, pp
15–30
Souza CA (2018) Direito ao esquecimento: 5 pontos sobre a decisão do STJ, JOTA, 13.05.2018.
Available through https://www.jota.info/coberturas-especiais/liberdade-de-expressao/direito-ao-
esquecimento-decisao-do-stj-13052018. Accessed 11 Jan. 2022
Spindler G (2013) Datenschutz- und Persönlichkeitsrechte im Internet – der Rahmen für
Forschungsaufgaben und Reformbedarf. GRUR:996-1003
Stehmeier M, Schimke A (2014) Internet-Suchmaschinen und Datenschutz. Zugleich eine
Besprechung von EUGH C-131/12 Google Spain und Google. UFITA Archiv Für Urheber-Und
Medienrecht 3:661–682
Tepedino G (1999) Temas de Direito Civil. Renovar, Rio de Janeiro
Veit RD (2022) Safeguarding regional data protection rights on the global Internet—the European
approach under the GDPR. In: Albers M, Sarlet IW (eds) Personality and data protection rights
on the internet. Springer, Dordrecht, Heidelberg, New York, London, in this volume
Weismantel J (2017) Das ‘Recht auf Vergessenwerden’ im Internet nach dem ‘Google-Urteil’ des
EuGH – Begleitung eines offenen Prozesses. Duncker & Humblot, Berlin

Ingo Wolfgang Sarlet Dr. Iur. (Ludwig-Maximilians-Universität München), Chair Professor for
Constitutional Law and current Head of the Graduate Program in Law (LLM-PhD) at the
Pontifical Catholic University of Rio Grande do Sul (PUCRS), Brazil. Principal Investigator in the
Brazilian/German CAPES/DAAD PROBRAL Research Project 'Internet Regulation and Internet
Rights'. Current research projects: protection of human dignity and fundamental rights in the
digital domain and social rights, innovation and technology. Selected Publications: A Eficácia
dos Direitos Fundamentais, 13th Ed., Porto Alegre, Livraria do Advogado, 2018; Dignidade
da Pessoa Humana na Constituição Federal de 1988, 10th Ed., Livraria do Advogado, Porto
Alegre, 2015; Direito Constitucional Ecológico, 6th Ed., Revista dos Tribunais, São Paulo, 2019;
Grundrechte und Privatrecht – Einige Bemerkungen zum Einfluss der deutschen Grundrechtsdog-
matik und insbesondere der Lehre Canaris’ in Brasilien, in: Festschrift für Claus-Wilhelm Canaris
zum 80. Geburtstag, Berlin, De Gruyter, 2017, pp. 1257-80; Menschenwürde und soziale Grun-
drechte in der brasilianischen Verfassung, in: Stephan Kirste, Draiton Gonzaga De Souza and Ingo
Wolfgang Sarlet (eds), Menschenwürde im 21. Jahrhundert. Untersuchungen zu den philosophis-
chen, völker- und verfassungsrechtlichen Grundlagen in Brasilien, Deutschland und Österreich,
Nomos, Baden-Baden, 2018.
Forgetting as a Social Concept.
Contextualizing the Right to Be
Forgotten

Anna Schimke

Abstract This chapter contextualizes the debate on the right to be forgotten, taking
into account cultural and social science considerations of forgetting on the Internet.
On this basis, it is shown how forgetting in general and the right to be forgotten in
particular can be meaningfully constructed from a legal perspective. The right to be
forgotten as described in the present contribution can be located in various areas of
law, whereby data protection law and press law are of particular importance. The
chapter tries to clarify how the two areas can be appropriately related to each other in
the case of the right to be forgotten. The 'media privilege' is proposed as a kind
of coordination mechanism that applies the area of law best suited to the medium in
question. The aim of this approach is to arrive at the most differentiated standards
for assessing the legality of public statements on the Internet and also for applying
the right to be forgotten.

1 Introduction

Forgetting and the right to be forgotten have become buzzwords in the worldwide
public and academic discussions in recent years.1 The discussion was stimulated in
particular by the Google-Spain decision2 of the European Court of Justice (ECJ) and
has since been conducted internationally. Forgetting and the right to be forgotten are
put in relation to constellations in which data once put on the Internet is to ‘disappear’
again. The constellations under discussion generally have an international aspect
because of the global networking of data on the Internet. At the same time, the

A. Schimke (B)
University of Hamburg, Hamburg, Germany
e-mail: anna.mareile.schimke@uni-hamburg.de
1 There have been numerous publications worldwide. See for the Brazilian debate for example Sarlet
(2022), in this volume, Gonçalves (2020) and Blum (2013). In Europe, several doctoral projects
have been published: Gstrein (2016), Diesterhöft (2014); Weismantel (2017), Becker (2019). See
for a comparative perspective the contributions in Werro (2020).
2 ECJ, decision of 13.5.2014—C-131/12.

© Springer Nature Switzerland AG 2022 179


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_8
180 A. Schimke

various manifestations of the right to be forgotten are reflected in the reception of
the ECJ's approach. As Ingo Sarlet explains in this volume, this also applies to the
Brazilian debate.3 In order to further stimulate the existing reception processes and
make them fruitful in the future, it is important to compare the conditions under which
the various rights to be forgotten arise. This requires a minimum of common semantic
understanding of what is legally meant by ‘forgetting’ and ‘right to be forgotten.’
Although forgetting and the right to be forgotten have become buzzwords in the
international debate, the word ‘forgetting’ remains unclear in this context. Often it is
even seen as an obstacle because it leads to misconceptions.4 In particular, fears are
expressed that it could foster the idea that the right to be forgotten gives rise to a claim
that all personal data should and could be erased anywhere in the world.5 However,
this cannot be realized technically and would not be legally feasible against the
background of conflicting rights such as freedom of expression, information rights,
and rights of the media.6 Therefore, some authors argue that the term should be
dispensed with altogether.7
This chapter takes a different approach by first trying to contextualize the legal
debate on the right to be forgotten (Sect. 2) against the background of the reception of
the right to be forgotten by the social and cultural sciences (Sect. 3). On this basis, it
is shown how forgetting can also be meaningfully conceptualized from a legal point
of view. If the right to be forgotten is constructed in this way, it becomes possible
to compare its various manifestations nationally and internationally. At the same
time, new questions open up on this basis. In Germany, for example, the right to be
forgotten in the sense proposed here can be part of both data protection law and press
law (Sect. 4). The question therefore arises as to how the two areas of law relate to
each other in the case of the right to be forgotten. In order to establish a productive
relationship between these areas of law, it is proposed to make use of the ‘media
privilege’ enshrined in European data protection law for new media on the Internet
as well (Sect. 5). Article 4 of the new Brazilian data protection law is a comparable,
albeit somewhat less specific, regulation; it stipulates that data protection law is not
applicable when data are processed for journalistic purposes. The media privilege
should be understood as a kind of coordination mechanism that applies the area of
law that best suits the medium in question. In this way, both data protection law and

3 Sarlet (2022), in this volume.
4 The debate is often traced back to Mayer-Schönberger (2011). However, earlier approaches to
forgetting and the right to be forgotten existed prior to it. See for a short overview Koops (2012),
p. 2. Many authors disagree with the terminology. See for the German debate inter alia Hornung and
Hofmann (2013), p. 170; Kühling (2014), p. 530; Spiecker gen. Döhmann (2014), p. 35. Spiecker
points out that she uses the term anyway because it has now become established. See also Sarlet
(2022), Sect. 1, in this volume.
5 Hornung and Hofmann (2013), p. 170.
6 See for the emphasis on conflicting interests especially the US debate on the right to be forgotten,
Rosen (2012). A good overview of that debate and its relationship to the European view is provided
by Bernal (2014).
7 Koreng and Feldmann (2012), p. 315.
Forgetting as a Social Concept. Contextualizing … 181

press law could be further developed in a way that reacts sensitively to new media
developments (Sect. 6).

2 Forgetting and the Right to Be Forgotten in the Legal Discussion

For some years now, jurisprudence has increasingly been discussing “forgetting”,
with the main discussion contexts in Europe and in Germany being the right to
be forgotten under Article 17 of the General Data Protection Regulation (GDPR),8
the Google-Spain decision of the ECJ9 based on the EU data protection Directive
95/46/EC, which has meanwhile been replaced, and the online archive cases of
the German civil courts.10 Particularly for the German discussion, the rulings of
the Federal Constitutional Court Recht auf Vergessen 1 and Recht auf Vergessen 2,
concerning the constitutional dimension of the right to be forgotten in data protection
law as well as in press law, are of special importance.11 While Article 17 of the
GDPR provides for the right to erasure as well as certain information duties, the
Google-Spain decision deals with a right to erasure on the basis of the former data
protection directive and the question whether and to what extent it can also be asserted
against the results list of an Internet search engine.12 The online archive cases, in
contrast, concern claims for injunctive relief on the basis of the German Civil Code
(Bürgerliches Gesetzbuch, BGB) directed against radio stations and press publishers
making their reports permanently available on the Internet.13 Forgetting and the
right to be forgotten are thus located in the academic and public discussion about
the digitization of data and their networking. Even the mere mention of the word
“forgetting” in this context seems to have a special effect: in particular the decision of
the ECJ has detached the discussion from the European context and led to worldwide
disputes about the question of forgetting on the Internet.14 Some authors assume that
this effect is due to the anthropological connotations of the word “forgetting”: the

8 A detailed analysis of its content is given by Hornung and Hofmann (2013).


9 ECJ, decision of 13.5.2014—C-131/12.
10 See for example BGH, decision of 13.11.2012—VI ZR 330/11. The online archive cases are
comparable to the droit à l’oubli in French law and the diritto all’oblio in Italian law. An overview
of the different traditions and their relationships to US approaches is given by Bernal (2014).
11 BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314; BVerfG, decision of
6.11.2019—1 BvR 276/17=NJW 2020, 314–328.


12 See for a detailed analysis of the judgment Stehmeier and Schimke (2014).
13 A brief overview of the jurisprudence is provided by Trentmann (2016).
14 See for the Brazilian debate Sarlet (2022), in this volume; see for an early analysis of the US
debate Bernal (2014); a suggestion for a differentiated approach to the construction of the right to
be forgotten that is globally viable is made by Jones (2016); see for the German debate Weismantel
(2017), Diesterhöft (2014), and Stumpf (2017); Gstrein (2016) deals with the right to be forgotten
as a human right.
182 A. Schimke

term seems to have a special proximity to humans and their natural dispositions, for
which reason its reception and discussion are largely emotional and not objective.15
In fact, there is no agreement as to what exactly the right to be forgotten should
be: should it be anchored at the level of fundamental rights or only at the level of
ordinary statutory law? In which field of law can it be found, or can there be different rights
to be forgotten in different fields of law? What is the concept behind these rights?
What distinguishes the right to be forgotten from other rights—in particular, rights to
erasure? Despite numerous international discussions, these questions remain largely
unresolved. In many cases, forgetting is equated with deletion and/or refers to human
forgetting, which is to be achieved by deleting a database on the Internet.16 Those
who forget here are either “the Internet” itself by deleting data or the individuals
who derive information from the data and can no longer do so because a piece of
data has been deleted, so that it becomes more likely that the information in question
is subsequently forgotten.17 The importance of these rights to erasure is seen above
all in the fact that outdated information, which was once available on the Internet,
could reappear again and again.18 The Internet, according to the concise formula,
does not forget.19 With regard to the individual in his/her social relationships, many
authors think that as a consequence of this development, a new beginning is no
longer possible and he/she is thus limited in his/her personal development.20 Some
also point out that at an individual level, the freedom of decision would be reduced as
soon as too much information was permanently available.21 At the same time, various
points of criticism of the rights to erasure as an instrument for creating forgetting
on the Internet are mentioned. They include the difficulties in effectively enforcing
these limited rights worldwide,22 the question of whether it is even possible to delete
all data or copies of data23 and, above all, the accusation that the decisions about
deletion requests lie with private providers of social networks and other services
on the Internet, which ultimately results in a form of private censorship.24 This is
accompanied by a relatively broad discussion on the question of the responsibility
of intermediaries for the information generated in the context of their services.25

15 The argumentation of Koreng and Feldmann (2012) follows this assumption.


16 Forgetting and deletion are equated, for example by Nolte (2011), p. 236; Boehme-Neßler (2014),
p. 825; Kodde (2013), p. 115. Forgetting is conceptualized as an internal individual phenomenon
inter alia by Koreng and Feldmann (2012), p. 312; Hornung and Hofmann (2013), p. 164.
17 Grimm (2013), p. 589.
18 Diesterhöft (2014), p. 23.
19 Mayer-Schönberger (2011).
20 Diesterhöft (2014a), p. 371.
21 Buchholtz (2015), p. 126. In the Federal Constitutional Court’s decision on the right to be forgotten,
it becomes clear that this is not necessarily relevant from a constitutional perspective. See BVerfG,
decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314, para 104 ff.
22 Spiecker genannt Döhmann (2014).
23 Spiecker genannt Döhmann (2014).
24 Arning et al. (2014), von Lewinski (2015), Rosen (2012).
25 See for an overview of different regulatory approaches van der Sloot (2015), Hofmann (2017).

Thus, important questions of Internet regulation are bundled in the context of the
right to be forgotten. Nevertheless, the examples given indicate that it remains unclear
why we are talking about forgetting here. In order to be able to conduct a meaningful
international discussion about the problems associated with the right to be forgotten,
it is necessary to clarify the underlying concept as a common basis. Only in this
way is it possible to identify where the various legal systems provide for rights to
be forgotten and which problems arise from this in each case. The jurisprudential
discussion to date has done little to achieve such clarification.26 The Google ruling
of the ECJ was received not only by jurists, but was also taken up and reflected upon by
the cultural and social sciences. Examining this reflection can not only explain why
the term forgetting is so attractive, it can also help to identify the particularities of
the right to be forgotten in relation to other rights. It will be possible to meaningfully
describe the right to be forgotten as a legal concept on this basis.

3 Forgetting and the Right to Be Forgotten in the Discussion in the Social and Cultural Sciences

In the cultural and social sciences, the legal discussion about the right to be forgotten
was received against the background of overarching concepts of memory.27 What
the various approaches have in common is that they understand memory neither as
a purely technical nor as a purely individual phenomenon. Rather, it is addressed
on the basis of different conceptions as a result of technical and supraindividual
developments.
Aleida Assmann distinguishes different forms of memory.28 In addition to indi-
vidual memory, she also identifies different concepts of collective memory in which
individuals are involved—for example through participation in memorial events—
but which are not embodied by individuals. Rather, they are artificial memories
that depend on media for their existence.29 Such media include inter alia language,
writing, photography, film, videos, and others. Collective memory is not the same as
the media—it remains associated with a group as a carrier that uses the media for acts
of collective memory and forgetting. But the medium can also pre-structure the kind

26 Weismantel (2017), p. 134, offers a legal definition of the right to be forgotten. According to
this definition, the right to be forgotten is a right to deletion or isolation of personal data on the
Internet which should no longer be accessible because they are no longer relevant, either in modal
or in temporal terms (Weismantel (2017), p. 122). This definition, however, is less a definition of
the right to be forgotten but rather a description of the legal answer to the problem raised by the
Google-Spain decision. It therefore focuses too much on the constellation of an Internet search
engine, whereas the discussion on the right to be forgotten addresses more far-reaching aspects as
well. For a criticism of definitions that focus too strongly on the Internet constellations in general,
see also Sarlet (2022), Sect. 2.1, in this volume.
27 Assmann (2016), Esposito (2017).
28 Assmann (2010), p. 11.
29 Assmann (2010), p. 19.

of memory and forgetting. Thus, language provides a different approach to memory
than a film does.30
With regard to the discussions on forgetting on the Internet, Assmann’s distinction
between memory as ‘ars’ and memory as ‘vis’ is revealing.31 An understanding
of memory as ‘ars’ ties in with the tradition of ancient mnemonic techniques and
describes a procedure aimed at identical retrieval of the input. In this understanding,
the time dimension is of no importance for memory. Rather, it is designed spatially.
Assmann assigns the concept of ‘storing’ to this memory. Storage can take place
through different writing systems or—with the help of different techniques—in the
body or in the mind.
In another tradition, memory is understood as ‘vis.’ Assmann sees Friedrich Niet-
zsche as a formative figure for this tradition. Memory as ‘vis’ emphasizes the iden-
tity-forming function of memory for both individuals and collectives. In this memory,
the dimension of time is important: remembering takes place from the present, and the past
is only ever conceptualized from the moment of its reconstruction. Here, memory is
to be understood as a process in which remembering and forgetting are equally
important for the formation of a group’s identity. Media also play an important role
in this process by initiating memory processes and shaping the nature of collec-
tive memory.32 Forgetting is constitutive for memory as ‘vis’ because the identity-
forming function of memory is generated by developing distinctions and building
horizons that support the perspectives of individual and collective identities.33 For
Assmann, therefore, forgetting is a term that is necessarily linked to the formation
of identities.34
Following this differentiation, ‘the Internet’ is an inanimate entity that can neither
remember nor forget. However, the various media on the Internet are linked to acts of
remembering and forgetting. Thus meaning and relevance arise only in connection
with individual and collective identities, which are also responsible for the fact that
insignificant, irrelevant things are faded out and thus forgotten.35 The search result
of an Internet search is both technical and social: while the search engine structures
what can be found and thus remembered in which way, the search term and the
further search path are shaped by human attention embedded in forms of individual
and collective identity.36 When we talk about forgetting on the Internet, we must
therefore take this interrelation between society and technology into account. What
is forgotten and remembered is the result of this interaction.
Elena Esposito also arrives at a comparable differentiation when she writes that
the “algorithm forgets what had been forgotten by the users.”37 She distinguishes

30 Assmann (2010), p. 20.


31 See also for the following description Assmann (2010), p. 27.
32 Assmann (2010), p. 21.
33 Assmann (2016), p. 217.
34 Assmann (2016), p. 217.
35 Assmann (2016), p. 216.
36 Assmann (2016), p. 216.
37 Esposito (2017), p. 6.

a technical and a social dimension in her concept of memory, which is a social


memory.38 To her, social memory is a matter of communication.39 Without people—
in Luhmann’s terminology: psychic systems—no communication can take place. At
the same time, communication functions independently and can be described inde-
pendently of the psychic systems.40 According to Esposito, memory is responsible
for checking for coherence within communication.41 Memory observes events and
determines a sequence between them so that an order is established overall. In this
concept, remembering means that a matter is condensed because it has become part of
the structure, which is to be considered dynamic.42 All communication content that
has not become part of the structure is forgotten.43 The more is forgotten, the greater
the ability of social memory to abstract and generalize.44 Which forms memory
takes concretely and to what extent its abilities to abstract and generalize are formed
depend on two developments: the development of media and techniques on the one
hand and the development of society on the other.45 For example, book printing
makes communication more abstract because a printed text must be understandable
regardless of the context in which it was written. At the same time, book printing was
only able to assert itself because more information had already been communicated
and literacy had increased.
According to Esposito, the Google-Spain decision concerns web memory.46 Web
memory is characterized by algorithms producing information based on data. Algo-
rithms function according to their own rationality. They deal with data without
being able to attribute meaning to it. At the same time, they produce meaning for
those who derive information from the processed data. Because remembering and
forgetting are linked to communication and thus to meaning, algorithms themselves
can neither remember nor forget. Nevertheless, they change the structure of social
memory by affecting the way information is generated. Because the Google-Spain
decision concerned the results list of a search engine, and this results list is generated
algorithmically, the decision, as Esposito highlights, pertains to a central issue of
web memory and tries to influence its functioning. However, this fails—according
to Esposito—because the data protection right to deletion operates at the level of the
data and the consequences for social memory are not taken into account. After all,
at the level of social memory, erasure does not necessarily mean forgetting. This is
due, for example, to the fact that the deletion process itself produces data from which
new information can be derived, which may simply draw attention to the event to be
deleted (the Streisand effect) or that the information in question can still be derived

38 Esposito (2002), p. 38.


39 Esposito (2002), p. 9.
40 Esposito (2002), p. 9.
41 Esposito (2002), p. 27.
42 Esposito (2002), p. 27.
43 Esposito (2002), p. 27.
44 Esposito (2002), p. 33.
45 See also for the following example Esposito (2002), p. 38.
46 See for the following description of the web-memory Esposito (2017).

from other data elsewhere. Esposito assumes that an appropriate way to handle social
memory, which is characterized by algorithms that deal with data, has not yet been
found. She sees a first approach in the fact that the algorithms themselves must be
perceived as actors and the corresponding traditional concepts of distribution and
responsibility must be adapted.
The Google-Spain decision’s reception in the cultural and social sciences calls into
question the background assumptions that lie at the root of its reception in the legal
sciences. Thus, the initial finding that the Internet does not forget falls short. As a
technical fact, the Internet itself can neither remember nor forget. The constellations
of the right to be forgotten have no relation to individual forgetting. Rather, it is
always about the production of information in particular sub-publics. This concerns
a collective dimension of memory, as described by Assmann and Esposito. Both
believe that the deletion process cannot be equated with forgetting. Deletion only
takes place at a technical level (Assmann) or refers to the data level (Esposito).
However, forgetting is not located at the data level, but at the level of collective
identity (Assmann) or at the level of communication (Esposito). It is therefore more
related to information than to data. Hence, it is important for the legal treatment of
forgetting on the Internet to differentiate between data and information, as has long
been demanded for information law in general and data protection law in particular.47
Although it cannot be said that the Internet does not forget, the Internet has
helped to change the way people forget. Esposito points out the important role of
algorithms in this context. Algorithms have also become an important topic in law.48
If law, too, understands algorithms in communication as part of social memory, it
follows that the law must observe and evaluate the social consequences of generating
information algorithmically. This means, for example, that the protection of the right
to be forgotten cannot be seen across the board in terms of a loss of freedom of
decision. This assumption is based on the hypothesis that the Internet does not forget,
and it narrows the issue to a technical perspective. If the information level, and thus
the social context in which people forget, is included, it becomes necessary to ask
exactly why information should be forgotten in a certain context. In this respect, the
goods to be protected must be determined context-specifically.
Finally, the social science reception of the Google-Spain decision draws atten-
tion to a problem that has also been addressed by various courts: requesting that
information be forgotten can lead to the opposite effect, namely to increased remem-
brance. That is certainly correct and must be considered by the claimant. A right to
be forgotten can nevertheless have a meaningful scope. Esposito takes all contexts
of possible information generation into consideration. Legally, on the other hand, a
distinction is made according to the facts at issue. And for this isolated, legally
constructed issue, it may make sense under certain circumstances that partial social
forgetting is realized through a claim to deletion. Many people are already helped
by the fact that certain hits are no longer displayed in the European Google search

47Albers (2005), p. 87.


48See inter alia Drexl (2017), p. 529; Hoffmann-Riem (2017), p. 1; Martini (2017), p. 1017; Trute
and Broemel (2016).

results list. In any case, this applies inasmuch as the request for deletion is primarily
intended to prevent accidental discoveries.49
Legally, the right to be forgotten can also be described in more detail. The Google-
Spain decision as well as the online archive cases are characterized by a situation
that is classified as legal at one point in time and illegal at a later point in time. As
the European Court of Justice rightly emphasized in its Google-Spain ruling, this
change in the assessment of legality can have various reasons50: in data protection
law, for example, consent to data processing can be revoked or new interests worthy
of protection on the part of the data subject—such as an interest in reintegration
into society—can be added during the process of weighing various concerns, which
results in a different legal assessment.
The right to be forgotten can therefore be understood as a generic term for claims
aimed at ensuring that, within a certain communication relationship, knowledge
about a person which arose from a lawful generation of information should no
longer exist in a certain context, because at the present time it could only be
generated through an unlawful generation of information. The right to be forgotten therefore
concerns a change in the assessment of lawfulness within the temporal dimension of
communication.

4 Regulatory Contexts of the Right(s) to Be Forgotten

If the right to be forgotten is defined in this way, it is not a new phenomenon—contrary
to what the current discussion context might suggest.51 In Germany, the old and very
well-known Lebach decisions of the Federal Constitutional Court are important with
regard to a right to be forgotten understood in this way.52 These decisions, handed
down in 1973 and 1999, revolved around reports about a crime committed in 1969, the
‘Soldier Murder of Lebach.’ In order to realize their dream of living on a yacht, three
men first wanted to seize weapons and use them to raise money. Two of the men
therefore carried out a raid on an armory of the German army, killing four sleeping
soldiers. After a lengthy search, the three men were arrested and sentenced for murder
or complicity in murder. The crime, the search, and the criminal proceedings were
reported extensively in the press and on the radio. In 1972, a television station wanted
to take up the case again and broadcast a documentary about it in which one member
of the trio who was not involved in the concrete attack was shown in a photo at the
beginning and mentioned by name several times. At the time of the planned broadcast

49 This is also highlighted by the Federal Constitutional Court. See BVerfG, decision of 6.11.2019—1
BvR 16/13=NJW 2020, 300–314, para 131.
50 ECJ, decision of 13.5.2014—C-131/12, para 72–75.
51 This also applies to the Brazilian legal situation. See in this respect Sarlet (2022), Sect. 2.2, in
this volume.
52 BVerfG, decision of 5.6.1973—1 BvR 536/72=NJW 1973, p. 1226 (Lebach I); BVerfG, decision
of 25.11.1999—1 BvR 348/98, 1 BvR 755/98=ZUM-RD 2000, p. 55.



of the documentary, this offender had served half of his prison sentence and was
planning to return to his hometown. He unsuccessfully sued for an injunction to stop
the broadcast. In its 1973 decision, the Federal Constitutional Court had to decide
whether the negative decisions of the civil courts infringed the plaintiff’s general
right of personality, which is derived from Article 2(1) of the German Basic Law
(Grundgesetz, GG) in conjunction with Article 1(1) GG. The court weighed
the general right of personality of the person concerned against freedom
of broadcasting (Article 5(1) GG). It differentiated between reporting at
the time of the crime and later reporting. While it considered reporting of serious
criminal offenses at the time, including the perpetrator’s name, to be permissible in
principle, it held that later reporting was in any case inadmissible if it was likely to
cause the perpetrator significant new or additional impairment compared with the
information at the time of the crime, in particular jeopardizing his integration into
society (resocialization).53 In its 1999 decision, the court restrictively clarified
that the general right of personality does not confer any entitlement never to be
confronted with the offense again.54 The decisions of the Federal Constitutional
Court clearly show that the assessment of legality can change over time with regard
to reporting. With a view to the constellation to be decided, the court also laid out clear
criteria as to when such a change in the assessment of legality may be appropriate
for constitutional reasons. In this respect, the decisions have set the standard for a
large number of civil proceedings involving reporting on criminal offenses.

4.1 The Rights to Be Forgotten on the Internet: Constitutional Dimension

Since broadcasters and newspaper publishers have also become active online, the
courts have been confronted with the question of the extent to which the developed
standards can also be applied to this constellation.55 The focus was and still is on
the question of whether old reports that have been placed in the companies’ online
archives can remain accessible there permanently and under which conditions this
could be permissible. With regard to the last point, it becomes significant
to what extent the retrievability of the old reporting with the help of Internet search
engines affects the legal evaluation of the old reporting in the online archive. At
the level of the lower courts, no uniform case law has yet been established in this
respect.56 A constitutional complaint concerning this constellation was decided
in 2019.57 It concerns the Apollonia case, which is about
the coverage of a major German news magazine in the 1980s. It reported on the

53 BVerfG, decision of 5.6.1973—1 BvR 536/72=NJW 1973, p. 1231.


54 BVerfG, decision of 25.11.1999—1 BvR 348/98, 1 BvR 755/98=ZUM-RD 2000, p. 59.
55 An overview on and an analysis of the jurisdiction is provided by Théry (2016).
56 See for a summary of the different approaches Trentmann (2016).
57 BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314.

murder trial against the complainant, who was accused and ultimately convicted
of shooting two people on the yacht Apollonia on the high seas. The plaintiff was
named in the reports. In 2002, the old, unchanged reports were placed in the journal’s
online archive and have been available there ever since. They can also be found by
entering the plaintiff’s name using Internet search engines. The reports explain the
circumstances of the crime and the proceedings in detail in a factual manner and use
the behavior of the plaintiff as an example of the consequences if otherwise relatively
inconspicuous people find themselves in extreme situations. The reports are marked
as old reports in the archive.
In its decision, the Constitutional Court points out that under the conditions of the
Internet, time takes on a specific meaning in law. In the Lebach decision, the court had to
compare two points in time, asking whether a new report on a specific past event is
admissible. Under the conditions of the Internet, old reports are permanently available
and can easily be recombined with other data. This poses a specific, legally relevant
danger for the personal development, protected by Article 2(1) in conjunction with
Article 1(1) GG. This right implies the possibility of change over time or—in other
words—the development of the person in time implies the chance of forgetting.58
According to the court, this is not only important for personal freedom. Rather,
it benefits society as a whole by ensuring that its members can freely communicate even
provocative or appellative positions.59 Therefore, the lawfulness judgment concerning
a publication online can also change over time: what might be justified in one moment
might no longer be justified a few years later, considering the legally protected
interest of the person involved in a chance to change over time. On the other hand,
the press has a constitutionally protected interest in publishing reports including the
names of the persons involved and to maintain an online archive in which these orig-
inal reports can be placed.60 This is also in public interest, because online archives
provide easy access to information and constitute an important source for journalistic
and historical research.61 The Constitutional Court sets standards for the weighing
process between the conflicting interests, which it divides into three sub-categories:
procedural guidelines, criteria for the assessment of the time factor, and considerations
concerning the retrievability of the publication on the Internet.62 With respect
to the procedural aspects of the cases, the court states that an obligation of the press
to permanently monitor the lawfulness of all publications in an online archive would
be incompatible with Article 5(1) GG. Inspection obligations can therefore only be
admissible in the form of notice-and-take-down procedures. Important aspects for the
assessment of lawfulness over time include, inter alia, the impact and subject matter of the report
and the sequence of events in which the report is embedded. Finally, the extent

58 „Die Möglichkeit des Vergessens gehört zur Zeitlichkeit der Freiheit“ (BVerfG, decision of
6.11.2019—1 BvR 16/13=NJW 2020, 300–314, para 105).
59 BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314, para 108.
60 BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314, para 112.
61 BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314, para 112 f.
62 See also for the following summary BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020,
300–314, para 116 ff.



of the negative impact on the personality right also depends on the retrievability of a
report on the Internet. Therefore, a default setting that prevents search engines from
finding and listing the reports in name-based searches could be one way to balance
the interests.

4.2 The Right to Be Forgotten in Press Law

The civil courts now have to apply these constitutional standards. The relevant cases
are decided on the basis of press law. Its central elements are claims for damages,
revocation, correction, and injunctive relief from Section 823 (1) BGB or Section 1004
(1) BGB applied mutatis mutandis in connection with Section 823 (1) and/or (2)
BGB. As a rule, the various regulations presuppose that the person concerned is
claiming that his/her personality rights under civil law were violated by a statement
published in the media. Thus, they aim from the outset at influencing the knowledge
of the public. Personality rights under civil law are understood as framework rights,
the concrete content and scope of which arise only in individual cases as a result
of a comprehensive consideration of conflicting interests.63 A characteristic of press
law is that the conflicting interests are essentially communication interests which
are protected by Article 5 GG. From the perspective of the person concerned, these
claims therefore presuppose that an infringement of personality rights has already
taken place or that its occurrence is at least imminent. Press law therefore applies a
model of ex-post regulation.64 Since weighing of interests is at the center of press
law, a highly differentiated case law has developed over time. In addition to an
approach geared to ex-post regulation and differentiated weighing standards, and in
contrast to data protection law, press law provides for a model of liability that can
reflect a graduated level of responsibility and, above all, take account of the inter-
ests involved in communication on the Internet via intermediaries (‘Störerhaftung’).
Thus, for example, intermediaries on the Internet can be sued for injunctive relief
only if they have not eliminated a violation of the law despite being aware of it. The
question of when an intermediary had or could have had the necessary knowledge
of the violation is dealt with by recurring to the legal construct of an infringement
of ‘reasonable inspection obligations’ (‘zumutbare Prüfpflichten’).65

63 Wagner (2017), para 364.


64 That does not mean that it does not have any structural effects for future communications.
65 Not that long ago the German Federal Court of Justice decided on Google’s liability for its search results under German press law, using the legal construct of ‘reasonable inspection obligations’:
BGH, decision of 27.02.2018—VI ZR 489/16. Further details on the concept of being a ‘Störer’
are provided below in Sect. 4.3. However, this has since been modified due to a new consideration of the relationship between European and German law in this context. See BGH, decision of
27.7.2020—VI ZR 405/18. The Court now only applies Article 17 GDPR and the GDPR’s liability
regime.
Forgetting as a Social Concept. Contextualizing … 191

4.3 The Right to Be Forgotten in Data Protection Law

The right to be forgotten, understood as a right that concerns a change in the assess-
ment of lawfulness over time, exists in other areas of law besides press law. In
addition, and as the above-mentioned discussion contexts already indicate, another
important manifestation of the right to be forgotten can be found in data protection
law. There, multiple reasons for the change in the assessment of lawfulness can be
identified. In European law, for example, Article 6(1) GDPR provides for legal rules
that legitimate the processing of personal data.66 Article 6(1) GDPR links the lawful-
ness of data processing to certain criteria such as consent (a), entering into a contract
(b), compliance with a legal obligation (c), or the legitimate interest of the controller
(f) by stating that the processing must be necessary in order to fulfill these and other
criteria. As the ECJ pointed out in its Google-Spain decision, the necessity must be
given at all stages of processing.67 From this it follows that an instance of processing
can be lawful at one point in time because it is necessary for specific purposes then,
whereas that is no longer true at a later point in time—for example, because an
individual has withdrawn his or her consent in accordance with Article 7(3) GDPR
and the processing cannot be based on other provisions of Article 6 GDPR. Another
reason could be that the legitimate interest of the controller no longer outweighs the
interest of the data subject due to changes in the interests involved. For example,
on the part of the data subject, an interest in resocialization might be added after a
certain period of time. The criteria by which data protection law and press law assess
the lawfulness of an instance of processing that concerns a published statement can
overlap, as long as the relevant interests are similarly protected by the European Charter of Fundamental Rights (CFR) and the GG. This is because, in principle, the GDPR, being European law, is understood and applied in the light of the CFR, whereas press law is understood and applied in the light of the GG.68 In its Google Spain decision the ECJ acknowledged that the time factor might be one relevant factor for
the assessment of the lawfulness of a processing of personal data under European
data protection law.69
Once the processing becomes unlawful in data protection law for one of these
reasons, Article 17(1) GDPR provides for a right to deletion. When the right to
deletion is based on a change in the assessment of lawfulness, it can be identified as
a form of a right to be forgotten.70

66 A similar provision can be found in Article 7 of the Brazilian Data Protection Law. For a thorough
analysis of Art. 6 GDPR see Albers and Veit (2021).
67 ECJ, decision of 13.5.2014—C-131/12, para 93–95.
68 BVerfG, decision of 6.11.2019—1 BvR 276/17=NJW 2020, 314–328, para 95; BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314, para 41 ff.


69 ECJ, decision of 13.5.2014—C-131/12, para 98.
70 Therefore, in my understanding, the right to deletion on the grounds of Article 17(1) GDPR itself can in certain constellations be a form of the right to be forgotten. In the German literature on
Article 17 GDPR, however, the right to be forgotten is generally identified in Article 17(2) GDPR,
192 A. Schimke

This right to be forgotten is embedded in data protection law, which is characterized by a legal structure that differs from the structure in press law as described above.
Data protection law in Germany is divided into various standard areas. The general
provisions of the GDPR and the German Federal Data Protection Law (Bundes-
datenschutzgesetz, BDSG) can be distinguished from the many different data protec-
tion regulations applying specifically to the police and security services, health data
protection, social data protection, or data protection by telecommunications service
providers.
Against the background of a constitutional concept of informational self-
determination, which understands informational self-determination on a first level
as a regulatory mandate for structuring the handling of personal information and
data, data protection regulations and in particular the provisions of the GDPR and
the BDSG must first be understood as a framework that structures data processing
procedures.71 Objective-legal regulations located at this structural level are thus the primary focus of data protection law. From the perspective of the violation of specific
protected goods, these constitute an ex-ante regulation because they take effect inde-
pendently of a specific risk situation and therefore, so to speak, ‘before’ it.72 The data
protection laws in Germany cannot easily be understood as a framework in this sense
because they go back to different historical origins and have thus been developed
in different contexts and according to different concepts. In particular, the ‘Verbotsprinzip’ (‘prohibition principle’) currently contained in Article 6 GDPR complicates the understanding of data protection law as a framework inasmuch as it places every instance of processing under an obligation of justification and thus takes account of a
concept that understands informational self-determination as the individual’s right
of disposal over the handling of personal information and data.
However, the fact that data protection regulations can indeed be understood as a framework becomes apparent when the individual elements that make up the applicable data protection law, which have evolved over time, are considered.73
Elements of system data protection, the development and design of communication,
and data processing techniques can be distinguished from the regulation and design
of the processing phases. In addition, there are modules that concern the information

which provides for certain information duties once a right to deletion has been asserted. See inter
alia Herbst (2021), para 49.
71 See for the underlying concept Albers (2005) and Albers (2014). See for the understanding of the BDSG as a framework for regulating data processing processes Albers (2012), p. 164. See for the
construction of the right for data protection as a fundamental right at the European level Reinhardt
(2022), in this volume. In its recent decision on the right to be forgotten, the Federal Constitutional
Court also tends to identify informational self-determination with framework-regulation. Neverthe-
less, its underlying understanding is different and much more oriented towards an individual-rights approach where the right to informational self-determination is relevant in situations of automated, non-public data processing: BVerfG, decision of 6.11.2019—1 BvR 16/13=NJW 2020, 300–314,
para 83 ff.
72 However, the regulations serve their own goods and are not only preventive measures. See Franzius (2015b), p. 266.
73 See for the different elements in detail Albers (2012), p. 181.

of those affected, give them opportunities to influence and participate, and address
guarantee and control mechanisms.
Rules that affect system data protection are applied before the concrete processing
procedures. They concern the general conditions under which data processing takes
place and are to be designed in accordance with data protection objectives. Such rules
may relate, for example, to the administrative organization or the technical instal-
lation of data processing equipment. The goals to be achieved are specified by law
and concern, for example, data security or the use of pseudonymization options.74
Regulations concerning the development and design of communication and data
processing techniques apply even earlier. They aim at technologies that
are developed in compliance with data protection regulations and that could be used
by public agencies or companies.75 A central component of data protection law is
the regulation and design of processing phases. Different processing phases—from
collection to transmission—can be distinguished, and independent legal requirements are linked to the individual phases, some of which also serve to establish a superordinate relationship. Central to the regulation of processing phases, for example, is the principle of purpose limitation, which keeps data processing procedures relatively manageable for the person concerned.76 Phase regulation can also
include the principle of necessity, which binds the individual processing steps to a
purpose to be specified in more detail.77 Essentially, this also serves to structure the
processing phases and to identify the relevant contexts in which data is processed.
Elements of phase regulation thus make data processing more manageable and, among other things, render hazardous situations for protected goods more foreseeable, so that independent regulations can then be applied to them. The duty
to inform the parties concerned serves, among other things, to make these elements
of objective regulation visible to the party concerned and thus to give him/her the
possibility of exerting influence on these processes via rights of participation and
influence.78 The rights of participation and influence under data protection law—
such as rights to deletion or correction—are thus related to data processing by the
processing body and aim to influence the knowledge of this body about the person
concerned. Finally, guarantee and control instruments such as data protection audits
or the establishment of independent data protection officers are intended to ensure
implementation and compliance with legal requirements.79
Even this cursory description of some structural elements of data protection law
makes apparent that data protection law provides regulations in essential parts which
are to structure processing contexts, and thus starts at a level that can be described

74 Security measures are handled in Chap. IV, Section 2 GDPR.
75 In this respect Article 25 GDPR is an important provision.
76 A detailed analysis of the principle of purpose limitation is given by von Grafenstein (2018). The principle of purpose limitation can be found in Article 5(1) b GDPR.
77 The principle of necessity can be found inter alia in the various legal grounds for the processing of personal data in Article 6 GDPR.
78 The rights of the data subject can be found in Chapter III GDPR.
79 Chapter IV GDPR.

independently of a concrete risk situation. Another decisive characteristic is the bipolar structure of data protection law: the modules are geared to the relationship
between the controller and the data subject. Transmissions to sub-publics and the
attempt to influence the knowledge of this sub-public are therefore not covered by
data protection law in principle.80
Overall, data protection law is less about preventing data processing than about
structuring it in a meaningful way in terms of how individuals can participate in a
self-determined way in the handling of information and data concerning them.81
The rights to be forgotten in data protection law and press law are not only
embedded in different fields of law with different characteristics; the preconditions
under which the respective manifestations of the right to be forgotten are applied
differ as well. In data protection law the right to deletion follows a default setting
friendly to personality rights. This means inter alia that the data subject’s interests are
generally regarded as worthy of protection,82 a fact that becomes important within
weighing processes on the basis of Article 6(1) f GDPR. Furthermore, the burden of
producing evidence and proof (Darlegungs- und Beweislast) that the processing was
lawful often lies with the controller. A good example of this regulatory structure is
the right to object in Article 21 GDPR. If the data subject claims that the processing
on the grounds of Article 6(1) f GDPR is unlawful in his/her specific case and gives
reasons for this claim, the processing must stop and the data subject has a right to
deletion in accordance with Article 17(1) c GDPR unless the controller demonstrates
compelling legitimate grounds for the processing which override the interests, rights,
and freedoms of the data subject. Furthermore, when a public statement concerns
certain categories of data—‘special categories of data’ such as data revealing racial or ethnic origin or political opinions, or concerning a person’s sex life—justifying a publication becomes even more difficult. For these data,
Article 9 GDPR is the relevant provision.83 It follows from Article 9 GDPR that if
the person concerned does not agree to the publication or the data has already been
published, the GDPR does not readily provide for legal grounds for the publication
of such data. In addition, the liability of the controller is interpreted broadly and once
a person is liable, he or she is fully liable (Article 26(3) GDPR). Therefore, there is
no concept of shared responsibility within the relationship between controller and
data subject in data protection law.84 Finally, data protection law can be enforced not

80 However, Article 6(1) f GDPR provides for the possibility of considering interests of third parties
within the weighing process. See in detail for the weighing process in Article 6(1) f GDPR Buchner
and Petri (2021), para 149–154.
81 Simitis (2018), p. 214.
82 Buchner and Petri (2021), para 148.
83 Article 9(1) GDPR reads as follows: “Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and
the processing of genetic data, biometric data for the purpose of uniquely identifying a natural
person, data concerning health or data concerning a natural person’s sex life or sexual orienta-
tion shall be prohibited.” Article 9(2) GDPR provides for exemptions from this general prohibi-
tion of processing special categories of data, but none of these exemptions is adapted to public
communication situations in old or new media.
84 This approach has been criticized in the legal literature. See inter alia Spindler (2012), p. 99.

only by courts but also by a supervisory body. It is in this respect a law that aims at
a controlled flow of personal data.
Within press law, on the other hand, the legally protected interests of every side
involved in the weighing process must be identified. The burden of proof lies with the
person who wants content to be removed.85 Finally, the liability regime is stricter and
more nuanced: on the grounds of Section 1004 BGB a distinction is made between
Täter and Störer. A Täter is someone who is immediately responsible for a rights
infringement, whereas a Störer can be any person who has intentionally caused a
rights infringement.86 Because this would go very far, the courts narrow down the
liability of the Störer: in order to be a Störer a person must have knowledge of
a rights infringement and must have violated reasonable inspection obligations.87
Another important difference between the application of the right to be forgotten
in data protection law and in press law is that data protection law is European law
and therefore falls under the jurisdiction of the European Court of Justice, whereas
press law as understood in this text is national law and falls under the jurisdiction of
the BGH. In terms of the fundamental rights that must be considered, it follows that
in data protection law European fundamental rights have to be taken into account,
whereas in press law the fundamental rights of the German Basic Law are applied.88 In
press law, no supervisory body is installed. In this respect, the default setting is characterized by a relatively uncontrolled flow of personal data.

4.4 Reasons for More Restrictive Application of Data Protection Law

Although data protection law and press law are characterized by different features and
are designed to govern different situations, they do overlap: whenever a (partly) public
statement is processed automatically and concerns a natural person, both regimes are
applicable. This concerns for example reporting on an identifiable natural person in
a newspaper, but also any statement about a natural person that is shared online in
social media or that is shown in the results list of an Internet search. Since in these
situations a change in the assessment of lawfulness might occur over time, the area
of overlap between data protection law and press law includes cases of the right to
be forgotten. These cases could, in other words, be decided on the grounds of press

85 See for claims on the grounds of Section 1004 BGB analogously in general Baldus (2017), para
307.
86 See for a brief explanation Spindler and Volkmann (2015), para 5–18.
87 Settled case law. See for a decision with respect to Internet search engines BGH, decision of 27.02.2018—VI ZR 489/16, para 31.


88 The relation is explained by the BVerfG in a detailed manner in its decisions on the right to be forgotten: BVerfG, decisions of 6.11.2019—1 BvR 16/13, 1 BvR 276/17=NJW 2020, 300–314, 314–328. In these decisions the court also clarified that constitutional complaints can not only be based on German fundamental rights but also on rights guaranteed in the CFR. An overview of this aspect of the decisions is provided by Kühling (2020).

law, on the grounds of data protection law, or both. Therefore, the question arises
as to how these areas of law relate to each other: if somebody feels offended by a
public statement concerning him or her, can he/she freely decide to sue the person
responsible for the statement (and/or for its dissemination) on the grounds of data
protection law (right to deletion) or on grounds of press law (right to injunctive
relief)? With respect to the legal consequences, both rights have the same effect:89
the right to deletion is not directed towards the destruction of data. Rather it focuses
on the information that can be derived from the data by demanding that it should not
be possible to derive a certain piece of information from data in a specific context.
Therefore, destroying a specific piece of data, such as a statement on a social network,
might not suffice to comply with the deletion claim. In addition, it might be necessary
to ensure that the information cannot be derived from other data, such as the repetition
of the statement that has been deleted.90 Similarly, the right to an injunction aims
to ensure that the disputed statement is not repeated in the future. Because data
protection law is characterized by a default setting friendly to personality rights, it
would be prudent for the person concerned to sue the utterer/the mediator of the
utterance on the grounds of data protection law. However, in practice there seems
to be a lot of confusion with respect to the relationship of the two regimes.91 Some
claims are based on data protection law, and similar cases are based on press law.
The courts, too—which are actually required to consider every relevant legal aspect
of a case—sometimes apply data protection law, sometimes press law, and sometimes
both, without always being clear why they chose to do so.92 So far, therefore, it is
not clear how the two regimes relate to each other.93 As has just been explained,
it would be logical for data subjects to base their claims more on data protection
law in the future. In particular, this could lead to many cases about the admissibility
of statements on social media being decided on the basis of data protection law.
However, there are good reasons to be restrictive in the application of data protection
law in these contexts.
First of all, the structure of data protection law may not be adapted to these cases.
Data protection law has a bipolar structure focusing on the relationship between the
controller and the data subject. It assumes in principle that there is an asymmetry
of information between the two parties which must be remedied. This is shown,
for example, by the default setting of data protection law, which is friendlier to
personality rights than press law. Semi-public statements on the Internet, on the other hand, often arise in situations in which multiple interests are involved:
the utterer of a statement, the information mediator, the public, and the data subject.

89 Lauber-Rönsberg (2014), p. 179.


90 Herbst (2021), para 37.
91 Similarly Lauber-Rönsberg (2014), pp. 181 ff.
92 An example of the application of data protection law is the spickmich decision of the BGH: BGH, decision of 23.6.2009, VI ZR 196/08; an example of the focus on press law is the autocomplete
decision of the BGH: BGH, decision of 14.5.2013, VI ZR 269/12; an example of the application
of both regimes is a recent BGH decision dealing with the results list of Internet search engines:
BGH, decision of 27.2.2018, VI ZR 489/16.
93 Lauber-Rönsberg (2014), p. 182.

This kind of situation cannot be easily reflected in data protection law. The courts
have so far solved this problem by interpreting the data protection regulations under
consideration of the constitutionally guaranteed freedoms of communication. As a result, data protection law and press law are assimilated in these cases:
the claims in both areas of law are in the end based on a weighing of interests.94
What is more, this balancing of interests is often not particularly context-sensitive
in the field of online media. One of the reasons for this is that assumptions about
the effect of a public statement are transferred from the mass media to the various
online media. An example is a case that was recently decided by the Regional Court
Saarbrücken (Landgericht Saarbrücken).95 The court had to decide on a claim for
injunctive relief on the basis of the BGB. The plaintiff wrote a private message via
Facebook to a well-known German actor often engaged in political discussions. The
message read: “You wanted to leave Germany, didn’t you? Why don’t you finally
keep your promise. Your understanding of democracy and your vocabulary disgust
me. With kind regards.”96 The defendant replied saying: “hey sweetie…! date!? just
the two of us? […]”.97 After replying to her, the defendant published a screenshot
of the conversation, including the plaintiff’s name, on his Facebook page, to which
over one million people have subscribed. This online conversation was commented
on and linked by third parties multiple times. Shortly after the publication by the
defendant, the plaintiff published a message in a closed network with twenty-five
thousand members who broadly share her political opinion, telling
them about the actor’s publication of her message, including her name, and asking for
advice on how she should react. The message also stated that she was now receiving
a lot of insulting messages from the actor’s supporters.
The case was decided on the basis of press law. The court did not make clear why
it did not apply data protection law. In principle, data protection law would also have
been applicable since the publication of the data by the actor constituted automatic processing of the plaintiff’s personal data; the plaintiff therefore could have sued Facebook and/or the actor on the basis of data protection law. Considering the conditions of
the claim on the basis of press law, the court focused on the question whether or
not the publication of the message infringed the personality right of the plaintiff
as enshrined in the BGB. After balancing the various interests involved, the court
concluded that the personality right was not infringed, although it was negatively
impacted by the defendant’s publication. A central argument was that the message
affected the plaintiff only in her ‘social sphere’ because its content concerned a polit-
ical discussion and not a private matter.98 Therefore, the level of protection on the

94 Lauber-Rönsberg (2014), pp. 178 ff.


95 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, pp. 115–122.
96 „Sie wollten doch Deutschland verlassen. Warum lösen Sie Ihr Versprechen nicht endlich ein. Ihr Demokratieverständnis und Ihr Wortschatz widern mich an. MFG“ (cited according to LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 115).
97 „hey schnuffi…! Date!? nur wir beide? […]“ (cited according to LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 115).


98 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 118.

side of the plaintiff is relatively low.99 Circumstances that touch the social sphere
can, according to the court, only be sanctioned if they have severe consequences for
the personality right.100 Examples of these kind of consequences are stigmatization,
social exclusion, or a pillory effect.101 According to the court, the publication of
the conversation by the defendant had led to the plaintiff being exposed as a private
person to a large public and thus also to the comments of the defendant’s Facebook
followers.102 In addition, mentioning her name served only to expose and ridicule the
plaintiff as a person.103 Thus a pillory effect occurred. In principle, the publication
of the message including the name would therefore be unlawful. However, the plain-
tiff herself ensured that her interests did not outweigh the communication interests
involved. She did so, in the argument of the court, by publishing the conversation
in the closed forum.104 According to the court, the plaintiff thereby showed that she
had no interest in her conversation with the actor remaining private.105 Otherwise she
would have chosen another way to discuss her concern about the actor’s publication
of her name and message.106 The court applied the doctrine of the deliberate opening
of the private sphere by the person him/herself (Selbstöffnung der Privatsphäre),
which the German civil courts developed for cases where somebody willingly told
the press something private. After this opening of the private sphere, the person in question generally cannot claim that the information that can be derived from this data is private.
The court took the specifics of the communication situation into account at many
points in the decision.107 However, it did not sufficiently adapt the standard of Selbstöffnung der Privatsphäre. The court presupposed that the public produced by the plaintiff meant that a cease-and-desist declaration by the defendant could no longer serve a meaningful function—the communication was already
public. The court therefore presupposed here that the public which the defendant
produced was to be equated with the public reached by the plaintiff. The decision
was therefore based on the idea of a relatively homogeneous public sphere. In fact,
the public spheres produced by the plaintiff and the defendant are very different.108
The plaintiff addressed a closed forum in which she expected that the readers of her
contribution would, in principle, be sympathetic to it. The defendant, on the other

99 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 118.


100 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 118.
101 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 118.
102 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 118.
103 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 119.
104 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 120.
105 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 120.
106 LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 120.
107 See for example its considerations on the plaintiff’s protected interest that her private message should not be published with respect to its content (and not with respect to the publication of the
plaintiff’s name): LG Saarbrücken, decision of 23.11.2017—4 O 328/17=ZUM-RD 2018, p. 117
and p. 119.
108 Both public spheres are—as the reproduction of the communication by the plaintiff shows—connected with each other, and content can quickly be switched back and forth between the two. This

hand, addressed a public that had a positive attitude towards him at the outset and that
was also networked with other public spheres in much more complex ways—such
as the public sphere of the classical mass media. If the defendant had to refrain from
spreading the message—i.e., delete his post as a result—this could trigger a considerable response by this public. This would mean, for example, that the defendant and
other celebrities were in principle not allowed to publish private messages without
anonymizing them. At the same time, it would have become clear that the messages
sent to the plaintiff from the actor’s fan community were sent because of an illegal
publication and were therefore not legitimate. All this could have contributed to the
development of a standard for communication by celebrities in social networks that applies irrespective of whether the plaintiff makes the communication, including her name, public in another forum.
As described above, the case would probably have been decided similarly if data
protection law had been applied. This is because data protection law in Germany is
brought in line with press law by way of constitutional interpretation. In this respect,
both data protection law and press law are equally not yet fully adapted to new communication formats. There are good reasons why—in the future—they should no
longer be brought in line with each other, but should rather be developed in different
ways. The hypothesis to be developed further is that a more restrictive application of
data protection law in the area of new communication formats could help to better
exploit the strengths of both areas of law, and by doing so, help to ensure more
context-sensitive application of the two areas of law.
There are therefore two reasons for a more cautious application of data protection
law in the area of new communication formats: On the one hand, the structure of
data protection law is not geared to multipolar communication situations. On the
other, a more restrictive application of data protection law could contribute to the
development of more context-sensitive application of both data protection law and
press law. The media privilege that has long been known in data protection law can
be used to achieve this end.

5 The Media Privilege as a Mechanism of Coordination

The media privilege can be found in Article 85(2) GDPR, which reads as follows:
For processing carried out for journalistic purposes […] Member States shall provide for
exemptions or derogations from Chapter II (principles), Chapter III (rights of the data
subject), Chapter IV (controller and processor), Chapter V (transfer of personal data to
third countries or international organisations), Chapter VI (independent supervisory author-
ities), Chapter VII (cooperation and consistency) and Chapter IX (specific data processing
situations) if they are necessary to reconcile the right to the protection of personal data with
the freedom of expression and information.

is an expression of the liquefied form of networked communication. The liquefaction of communication does not mean, however, that a distinction cannot be made between different publics. Cf. Ladeur's note on the liquefaction of communication in this case in Ladeur (2018), p. 122.
200 A. Schimke

Article 85(2) GDPR requires the Member States to provide for exceptions to
certain provisions of the GDPR which are applicable in principle if data are processed
for journalistic purposes and these exceptions are necessary in order to balance data
protection and communication interests. It is therefore up to the Member States to
assess which exceptions are appropriate in order to weigh communication interests
in the field of journalism against data protection interests. In this respect, European
law contains a regulatory mandate that is binding as to whether the exceptions must be made, but grants the Member States a margin of maneuver as to their design.109 Accordingly, the Member States provide for such rules in their legal systems. In Germany, such regulations are enacted for the
press in the press laws of the German federal states, the Länder (Landespressegesetze), and for broadcasting in Section 57 of the Interstate Broadcasting Agree-
ment (Rundfunkstaatsvertrag), which also contains corresponding regulations for
the online platforms of the press and broadcasting.110 The preceding regulation at
the EU level was Article 9 of Directive 95/46 EC, which contained a regulation
comparable to Article 85(2) GDPR in view of the necessity to provide for exceptions
to data protection law in the context of journalistic activities.
The media privilege was introduced into data protection law with a view to journal-
istic activities in the mass media. It was based on the assumption that the application
of data protection law in the field of mass media journalism would lead to an excessive
restriction of journalistic activity. Examples include the rights of persons affected
by journalistic research, which would convey rights to information and correction to
the person affected by the research right from the start.111 In addition, frictions were
identified between the reporting of suspicions protected by the freedom of commu-
nication and the requirement of correct data processing under data protection law.112 Further
points that could hamper journalistic work include the documentation obligations
that data protection law imposes on the controller as well as the already described
default setting of data protection law that is friendly to the right of personality, which
can be seen, for example, in the distribution of the burden to provide evidence and the
rather broad understanding of liability. Against this background, the provisions on
the media privilege were and are understood as a reconciliation of interests between
data protection law and freedom of communication: in the field of journalistic activ-
ities in the mass media, data protection law per se is regarded as hostile to freedom
of communication; exceptions to data protection are therefore regarded as friendly
to freedom of communication.113
As for the right to be forgotten in these situations: inasmuch as the right to be
forgotten is based on press law and applied to reporting of mass media offline or in
online archives, the media privilege applies. However, if the right to be forgotten is

109 Buchner and Tinnefeld (2021), para 13.


110 A detailed overview of the German laws and of the debates surrounding their understanding in
relation to Article 85 GDPR is given by Cornils (2018).
111 Buchner and Tinnefeld (2021), para 18.
112 Lauber-Rönsberg (2014), p. 179.
113 Stender-Vorwachs and Lauber-Rönsberg (2021), para ff.
Forgetting as a Social Concept. Contextualizing … 201

directed against a publication in new communication formats such as social networks or Internet search engines, there are clear uncertainties with regard to the application
of the media privilege.114 Only a few decisions have explicitly addressed the question
of whether and to what extent the media privilege also applies to new media. Of
particular importance in Germany is the spickmich decision of the BGH.115 At the
European level, the Google-Spain decision is often cited.116 In its spickmich decision,
the BGH had to decide, among other things, whether an online evaluation platform
on which students could evaluate their teachers is to be assessed under data protection
law or press law. The evaluation platform required users to register and to indicate
the name and place of their school. Then they could create profiles of the teachers
at their school, with the teachers’ names and subjects as well as a rating tool. The
rating tool included criteria such as ‘cool’ or ‘funny’ and enabled users to rate the
teachers using German school grades. An overall grade for the teacher was calculated
after four ratings. Evaluations including just the best or just the worst grade were
not considered in the calculation of the overall grade. Furthermore, students could
enter quotations from the teacher in a comment function. Teachers could notify the
network if they thought the rating was not justified. To decide which field of law
applies to this constellation, the BGH had to decide whether the assessment of the
teacher’s performance by the students was ‘journalistic’ in the sense of the German
norm implementing the European media privilege. In its landmark decision, the
BGH stated that it cannot be a case of journalistic processing if an evaluation is
based on an automatic process.117 The argumentation was based on a comparison
with investigative journalistic activity in the mass media, which is characterized
by editorial processing of content. The court classified all activity on the platform as
automatic in this sense and therefore concluded that the platform had to be assessed
under data protection law.118
In its Google-Spain ruling, the ECJ hardly addressed the question of the applica-
tion of the media privilege. It merely stated, without giving reasons, that the media
privilege does not apply to the results lists of Internet search engines.119 This decision
is surprising against the background of its previous case law. The ECJ had stressed
in its Satamedia decision that the word ‘journalistic’ is to be understood broadly and
includes, among other things, the following:
It follows from all of the above that activities such as those involved in the main proceedings,
relating to data from documents which are in the public domain under national legislation,
may be classified as ‘journalistic activities’ if their object is the disclosure to the public
of information, opinions or ideas, irrespective of the medium which is used to transmit
them. They are not limited to media undertakings and may be undertaken for profit-making
purposes.120

114 See for an overview Lauber-Rönsberg (2014).


115 BGH, decision of 23.6.2009, VI ZR 196/08.
116 ECJ, decision of 13.5.2014—C-131/12.
117 BGH, decision of 23.6.2009, VI ZR 196/08, para 21.
118 BGH, decision of 23.6.2009, VI ZR 196/08, para 19.
119 ECJ, decision of 13.5.2014—C-131/12, para 85.
120 ECJ, decision of 16.12.2008—C-73/07, para 61.

So far only a small fraction of the large number of possible online media cases has been decided at the level of the highest courts. The decisions tend towards a rather narrow understanding of the media privilege. Courts of lower instance have already had to decide on a number of communication formats, without, however, reflecting deeply on the media privilege.
The media privilege can also play a useful role in the area of online media, even though it cannot be said there that the application of data protection law per se brings about excessive limitations of communication freedoms, since the communication formats are too diverse for such a general assessment. Therefore, in the context of new media, the media privilege cannot be understood as a form of reconciling interests. It can, however, serve a
different role with respect to online media. It can contribute to the application and
further development of data protection law and press law online in a way that is
appropriate to the subject matter. This can be achieved by a comparison of the two
areas of law: which area serves the medium and the personality rights in question
better and can therefore contribute to a communication situation online friendly to
media rights and personality rights? Therefore, the interpretation and application of
the media privilege should consider the structural features of data protection law and
press law and compare the effect of their application with respect to the medium
in question. Via the media privilege, the area of law that best suits the respective communication situation is then applied. This means that the media privilege must be
understood as a kind of coordination mechanism between data protection law and
press law. The comparison of their structural features raises awareness of the particular features of the two legal regimes and therefore helps to identify potential for innovation with respect to new media within each regime. By doing so it also
serves to establish normative standards that consider the particular characteristics of
online communication situations.
Such an understanding presupposes, however, that the media privilege can be interpreted in such a way that it also includes (partially) public statements in new media. It also presupposes that data protection law and press law
serve the same objects of protection in the areas where they overlap. If this is the
case, concrete criteria for the application of the media privilege as a mechanism of
coordination can be developed.

6 Application of the Media Privilege to New Media

As described above, the decision of the BGH and the latest decision of the ECJ
follow a rather strict understanding of the media privilege. According to the BGH,
a certain threshold of editing must be reached, and this cannot occur when data
are processed automatically. In the German commentary literature on Article 85(2)
GDPR it is said that the threshold cannot be reached in cases of mere data collection
and dissemination.121 This line of argument is too strongly focused on a comparison to

121 Buchner and Tinnefeld (2021), para 24.



activities of traditional mass media. If one takes into account the goal of protection of
Article 85(2) GDPR, it is possible to come to another conclusion. Article 85(2) GDPR
protects the process of opinion formation concerning opinions that are mediated to a
public and can therefore have a specific impact.122 The media privilege is supposed
to protect this process, which is supposed to be as free as possible. It is true that mass
media play an important role in this process. But it is also true that the dissemination
of opinions to a public is not only done by mass media. In fact, new media play
an increasingly important role within this process.123 And within these new media,
a lot of processing is done automatically. In other words: what matters is not the fact that data are processed automatically but rather the impact of this processing on the level of public information.124 This is true, for example, of the results lists of Internet
search engines and Facebook’s timeline. Application of data protection law (and/or
press law) that is not context-sensitive would hinder the development of normative
standards for communication in the new media, as indicated by the example of the
actor’s publication of the private message as it was decided by the Regional Court
Saarbrücken.125 The lack of a suitable normative standard for communication in
new media has negative effects on the process of opinion formation and therefore
impairs the protective goal of the media privilege. The protection objective of the
media privilege thus argues for the inclusion of new media in its scope of application.
What is more, the comparison with classical mass media was undertaken by the BGH,
and the BGH only argues on the basis of national law. Its understanding is therefore
not binding for the understanding of the European regulation in Article 85(2) GDPR.
At the European level, in contrast, there are two contradictory decisions: on the one hand, the very broad understanding of 'journalistic' in the Satamedia decision and, on the other, a rather narrow understanding in the Google-Spain decision. The Satamedia
decision concerns the enterprise Satamedia, which wanted to disseminate information
concerning tax data previously published on government websites. These data had
been collected by another enterprise. Satamedia wanted to disseminate the collected
data via new communication channels. Here, no level of editing could be identified.
The decision dealt with the mere collection and dissemination of data; the court
concluded that this activity may be understood as 'journalistic' in the sense of Article 9 of the Data Protection Directive 95/46.126 The court linked the word 'journalistic' to the
fact that a piece of information was made public rather than to the circumstances of
how this happened (automatically or with a certain amount of editing). According
to this broad understanding, the media privilege can be applied to new media online
provided there is a communication situation in which a public is to be reached with the

122 The fact that a public is addressed by the publication is emphasized inter alia by Herb (2018)
para 13c.
123 A differentiated study focusing on the role of intermediaries in the process of opinion formation in Germany is provided by Schmidt et al. (2017).


124 Similar Ladeur (2020), p. 142.
125 See above, Sect. 8.4.4.
126 ECJ, decision of 16.12.2008—C-73/07, para 62.

help of communication technology.127 Although the ECJ's Google-Spain decision did not follow this broad understanding, the legislator of the GDPR did, namely by
stating in its recital 153 to Article 85(2) GDPR that notions such as ‘journalistic’
related to the freedom of expression should be understood broadly. In brief, Article
85(2) GDPR is therefore generally open to an interpretation that includes new media
in its scope of application.128

6.1 Protected Interests in Data Protection Law and Press Law

In order to understand the media privilege as a mechanism of coordination between press law and data protection law, the two areas must serve the same protected interests. Otherwise it would not be possible to switch between them without infringing
the interest that is protected by only one regime. The question of protected interests
has accompanied the discussion on data protection law right from the outset.129 In
Germany, various concepts can be distinguished whose reference point is the right
to informational self-determination (Article 2(1) GG in conjunction with Article
1(1) GG).130 This right raises several questions.131 Following the case law of the
Federal Constitutional Court, it has often been conceptualized as a right of disposal
that gives individuals the power to decide for themselves about the disclosure and
use of personal data.132 This concept has been and continues to be criticized from
various quarters.133 Among other things, it is accused of placing almost every (state)
handling of personal data and information under legal reservation, thus leading to a
large number of laws whose steering effects are limited.134 Related to this, and even
more decisive, is the accusation that this concept does not take sufficient account
of the object of informational self-determination—the handling of personal data
and information—and thus results in inadequate solutions.135 Where information is

127 This criterion also provides a possible differentiation between the word 'opinion' in Article 85(1) GDPR—which says that the Member States should adapt their laws to the Regulation in the field
of the freedom of opinion—and Article 85(2) GDPR. Article 85(2) GDPR is in my understanding
geared towards communication situations where content is mediated to a (sub)-public, whereas
Article 85(1) GDPR focuses more on the content of a message and also includes situations where
content is transferred between two parties. See for the relationship between Article 85(1) GDPR and
Article 85(2) GDPR and the consequences the possible interpretations have for the relationship of
press law and data protection law in the field of pictures containing natural persons Lauber-Rönsberg
and Hartlaub (2017).
128 Similar Michel (2018), p. 837.
129 An overview is provided by Simitis (2014).
130 Simitis (2014), p. 205.
131 Albers (2005).
132 Simitis (2014), p. 208.
133 Trute (2003), Bull (2011), Britz (2010), Albers (2005).
134 Franzius (2015a), p. 263.
135 Albers (2005), p. 156.

understood to be the result of communication between at least two parties generated on the part of the person who perceives a particular communication or a piece of data,
information is part of a social process.136 Solutions appropriate to the subject try to
take account of this social dimension of the right to informational self-determination.
This includes the fact that this right cannot be conceptualized as a power of disposal.
Thus, the decisive reference point for a large number of approaches to the protected
goods is also dissolved. It is replaced by a two-level concept.137 In this concept
the right to informational self-determination is conceptualized at a first level as a
formative task for the legislator and the executive. They are obliged to form regula-
tions that aim at structuring state (and, mediated through the third-party effect, also
private) handling of personal data and information in order to prevent this handling
from proceeding in ways that the individual is completely unable to control and
predict. The aim at this level is therefore to protect individuals from unlimited and
non-transparent handling of personal information and data. Regulatory requirements
for the legislator and public agencies include, for example, making the handling of
personal information and data appropriate and transparent, ensuring the accuracy
of information and data, and defining the framework and conditions for specific
instances of information and data processing in the sense of context control. An
instrument for context control can be seen, for example, in the principle of purpose
limitation, by means of which data and information processing by public agencies is
linked to task and authority standards.
On this basis, it is possible to estimate which threats to legally protected interests
exist from data processing processes in concrete contexts. This is where the second
level comes in. This level concerns the legal requirements against the background of
concrete threats to legally protected goods. For example, more specific requirements
may arise in a context where data processing concerns data and information collected
during a home search. In this case, Article 13 GG makes special demands. Similarly,
other fundamental rights may also be affected. Article 5(1) GG, for example, stip-
ulates that media companies do not have to disclose data and information about the
source of information of a media report and thus limits the possibilities of the govern-
ment obtaining knowledge about a certain person. In a comparable manner, a data and
information dimension can be taken from the various fundamental rights. They often
mediate injunctive relief claims against certain data and information processes.138
Corresponding claims are also provided by the general right of personality (allge-
meines Persönlichkeitsrecht) as it is derived from Article 2(1) GG in conjunction with
Article 1(1) GG. The injunctive relief claims that consider this right are substanti-
ated at the level of statutory law, inter alia by data protection law and by press
law.139 Informational self-determination thus recognizes a wide range of protected
goods. As soon as there is an overlap between data protection law and press law, the

136 Albers (2005), p. 87.


137 Albers (2005), especially p. 451, p. 600.
138 Albers (2005), p. 590.
139 See in detail on the different dimensions of the Allgemeines Persönlichkeitsrecht as a protected

good in press law Wagner (2017).



general right of personality as an object of protection is affected. In this respect, data protection law and press law serve the same object of protection.

6.2 Possible Criteria for the Interpretation of the Media Privilege

Understanding the media privilege as a mechanism of coordination means that, after interpreting and applying the media privilege, the regime that best fits the case in question is applied. The idea is that by doing so, both regimes might be further developed in a context-sensitive way. This can be achieved by a two-step approach to interpreting
and applying the media privilege. As a first step one should ask if we are dealing
with a medium that establishes a certain public and by doing so serves the process
of individual and public opinion formation. As a second step one should take into
account and compare the characteristics of the two regimes.
The first step is necessary to establish a link to the process of individual and
public opinion formation and therefore to the protected good of the media privilege.
This step also provides a way to make sure that the specific features of the medium
in question are considered. Here one should ask if there is a link to the process
of individual and public opinion formation, how this process is supported by the
medium, which roles the parties involved play, what influence they can have, to what
extent the infrastructure predefines content, and what kind of public results. The latter question in particular is of importance for determining the nature
and intensity of an infringement of personality rights.
The second step then is a comparison of the characteristics of the two regimes
with respect to the medium in question. As described above, both areas have specific
characteristics, each of which can be meaningfully applied in the area of new commu-
nication formats. In data protection law, this includes the balancing of information
asymmetries between two parties through the creation of transparency and oppor-
tunities for codetermination. In the case of new communication formats, this may
be the case, for example, for the collection and use of data by the social networking
service provider in the context of the use of the service. One strength of press law—
which, of course, must be further developed for new communication formats—is
the development of context-specific balancing standards that respond to different
interests. With new communication formats, for example, this strength can always
come to bear when third parties can express themselves—these statements are usually
conveyed via a network or other service and cannot be anticipated by the latter in the
same way as by the other parties involved. In this respect, there is no typical case of
information asymmetry here. Against the background of the described characteris-
tics of data protection law, it would be logical, for example, to apply data protection
law whenever a medium produces relatively predictable results.140 This could be the

140 Another possibility could be to distinguish between public and non-public communication. This presupposes a legal understanding of 'public' in the overlapping area of press law and data

case, inter alia, for certain rating platforms. For outcomes that cannot be predicted,
however, the ex-post regulatory approach of press law could be more appropriate.
This applies, for example, for results lists of Internet search engines.
When data protection law is applied, all its possibilities should be exhausted.
For example, one could consider transferring the legal construct of the scientifi-
cally recognized mathematical-statistical method (Section 31 of the old BDSG) to
rating platforms. This could have the consequence that evaluations only become
visible when a sufficient number of evaluations have been submitted, so that the
result is statistically representative. One could also consider introducing a data
protection impact assessment (Article 35 GDPR) particularly for rating platforms
or rating features, or certificates for those rating platforms/rating features that are in
accordance with data protection law. Current data protection law does not provide
adequately for such regulations, but its further development in this direction would
be in line with its regulatory structure. And the media privilege as a coordination
mechanism could help to identify this innovative potential of data protection law. The
same applies for press law. With respect to the right to be forgotten and the frequent
accusations of private censorship, it would, for example, be particularly logical to
further develop the mechanisms of liability or to think about new institutions.141

7 Conclusion

The coordination mechanism outlined in this way can also be understood as a kind of
learning instrument for fields of law. The comparative interpretation and the presup-
posed separation of the fields of law should therefore not be understood too strictly.
In fact, press law and data protection law can both be understood as different but
related ways to structure and/or influence the processing of personal data and infor-
mation and therefore as different ways to realize the constitutional right to informa-
tional self-determination. At least from a constitutional perspective, the two areas are
therefore related.142 The question is what the realizations of this relationship should
look like. One possible way can be observed in the latest judgments in both areas:
the described adaptation and gradual assimilation of the two regimes to each other.
This text provides a different perspective by suggesting using the media privilege
as a mechanism of coordination between the two regimes. This includes a compar-
ative perspective and therefore a differentiation between them. The advantage of a
comparative approach is that the strengths of both areas can be identified and, as
a result, perhaps more differentiated standards can be achieved. In the case of the

protection law. This division is applied by the Federal Constitutional Court in its decisions on the right to be forgotten. See particularly BVerfG, decision of 6.11.2019—1 BvR 16/13 = NJW 2020, 300–314, para 82 ff.
141 Suggestions are made inter alia by Ladeur (2014).
142 Lauber-Rönsberg (2014), p. 1058 therefore classifies press law as a sub-category of data

protection law.

right to be forgotten, it follows that different forms could develop in the different
areas, which react sensitively to their application context and whose prerequisites are
suitable to the context of application.143 This understanding has consequences for
the relationship of European law and national law in the fields of press law and data
protection law. Following a traditional understanding of the media privilege, press
law is conceived as national law, whereas data protection law is based on European
law. Using the media privilege as a mechanism of coordination includes the extension
of its area of application. Since the media privilege is based on Article 85(2) GDPR
and therefore on European law, a connection to European law is given every time it
is applied. Therefore, the European fundamental rights have to be taken into account
and the jurisprudence of the ECJ has to be considered when the media privilege is
applied. This means that an understanding of the media privilege as a mechanism
of coordination between data protection and press law leads to an extended need for
cooperation between European law and national law in the field of press law. Further
questions therefore concern how this cooperation should and could be constructed
in the overlapping field of press law and data protection law, taking into account the
limited competences of the European Union in the field of speech regulation.144

References

Albers M (2005) Informationelle Selbstbestimmung. Nomos, Baden-Baden


Albers M (2012) Umgang mit personenbezogenen Informationen und Daten. In: Hoffmann-Riem
W, Schmidt-Aßmann E, Voßkuhle A (eds) Grundlagen des Verwaltungsrechts, vol 2. C.H. Beck,
München, § 22
Albers M (2014) Realizing the complexity of data protection. In: Gutwirth S, de Hert P, Leenes R
(eds) Reloading data protection. Springer, Dordrecht/Heidelberg/London/New York, pp 213–235
Albers M, Veit R (2021) Art. 6 Rechtmäßigkeit der Verarbeitung. In: Wolff HA, Brink S (eds)
BeckOK Datenschutzrecht, 38th edn. C.H. Beck, München
Arning M, Moos F, Schefzig J (2014) Vergiss(,) Europa! Ein Kommentar zu EuGH, Urt. v.
13.5.2014 – Rs. C-131/12 – Google/Mario Costeja Gonzalez. CR, pp 447–456
Assmann A (2010) Erinnerungsräume. Formen und Wandlungen des kulturellen Gedächtnisses.
C.H. Beck, München
Assmann A (2016) Formen des Vergessens. Wallenstein, Göttingen
Baldus C (2017) § 1004 Beseitigungs- und Unterlassungsanspruch. In: Säcker FJ, Rixecker R,
Oetker H, Limperg B (eds) Münchener Kommentar zum BGB. vol 7. C.H. Beck, München
Becker C (2019) Das Recht auf Vergessenwerden, Mohr Siebeck, Tübingen
Bernal P (2014) The EU, the US and the right to be forgotten. In: Gutwirth S et al (eds) Reloading data
protection. Springer, Dordrecht, pp 61–77
Blum RO (2013) The right to be forgotten in Brazil. https://iapp.org/news/a/the-right-to-be-forgotten-in-brazil/. Accessed 11 Jan. 2022

143 From a different perspective Koops (2012) arrives at a comparable conclusion, saying that one
could differentiate between different rights to be forgotten (‘right to a clean slate’) in different legal
contexts. See especially Koops (2012), p. 22.
144 A general overview of the relationship between European law and the law of the Member States

with respect to the fundamental rights dimension is given by Franzius (2015a).



Boehme-Neßler V (2014) Das Recht auf Vergessenwerden – Ein Internet-Grundrecht im Europäischen Recht. NVwZ, pp 825–830
Britz G (2010) Informationelle Selbstbestimmung zwischen rechtswissenschaftlicher Grund-
satzkritik und Beharren des Bundesverfassungsgerichts. In: Hoffmann-Riem W (ed) Offene
Rechtswissenschaft. Mohr Siebeck, Tübingen, pp 561–596
Buchholtz G (2015) Das “Recht auf Vergessen” im Internet – Eine Herausforderung für den
demokratischen Rechtsstaat. AöR 140:121–153
Buchner B, Petri T (2021) Art. 6. In: Kühling J, Buchner B (eds) Datenschutz-
Grundverordnung/BDSG. Kommentar, 3rd edn. C.H. Beck, München
Buchner B, Tinnefeld MJ (2021) Art. 85. In: Kühling J, Buchner B (eds) Datenschutz-
Grundverordnung/BDSG. Kommentar, 3rd edn. C.H. Beck, München
Bull HP (2011) Informationelle Selbstbestimmung – Vision oder Illusion? Datenschutz im
Spannungsverhältnis von Freiheit und Sicherheit, 2nd edn. Mohr Siebeck, Tübingen
Caspar J (2010) Datenschutz im Verlagswesen: Zwischen Kommunikationsfreiheit und informa-
tioneller Selbstbestimmung. NVwZ, pp 1451–1457
Cornils M (2018) Der Streit um das Medienprivileg. Zur Unionsrechtskonformität der neu gefassten
Regelungen zum Mediendatenschutz. ZUM, pp 561–577
Diesterhöft M (2014a) Das Recht auf medialen Neubeginn. Die “Unfähigkeit des Internets, zu
vergessen” als Herausforderung für das allgemeine Persönlichkeitsrecht, Duncker & Humblot,
Berlin
Diesterhöft M (2014b) Datenschutzrechtlicher Direktanspruch gegen Suchmaschinenbetreiber –
Königsweg zum medialen Neubeginn? – Zum “Recht auf Vergessen” im europäischen Daten-
schutzrecht. VBLBW, pp 370–375
Drexl J (2017) Bedrohung der Meinungsvielfalt durch Algorithmen. ZUM, pp 529–543
Engeler M (2018) Art. 85 DSGVO, die Meinungsfreiheit und das datenschutzrechtliche Verbot-
sprinzip. Telemedicus of 19.03.2018. http://tlmd.in/a/3272. Accessed 11 Jan. 2022
Esposito E (2002) Soziales Vergessen. Formen und Medien des Gedächtnisses der Gesellschaft.
Suhrkamp, Frankfurt
Esposito E (2017) Algorithmic memory and the right to be forgotten on the web. Big Data Soc, pp
1–11
Franzius C (2015a) Das Recht auf informationelle Selbstbestimmung. ZJS, pp 259–270
Franzius C (2015b) Strategien der Grundrechtsoptimierung in Europa. EuGRZ, pp 139–153
Gonçalves MAD (2020) The right to be forgotten according to the Brazilian precedents. In: Werro
F (ed) The right to be forgotten. A comparative study of the emergent right’s evolution and
application in Europe, the Americas, and Asia. Springer, Cham, pp 249–264
Grimm D (2013) Der Datenschutz vor einer Neuorientierung. JZ, pp 585–636
Gstrein O (2016) Das Recht auf Vergessenwerden als Menschenrecht. Nomos, Baden-Baden
Herb A (2018) § 57 Datenschutz bei journalistisch-redaktionellen Zwecken. In: Binder R, Vesting
T (eds) Beck’scher Kommentar zum Rundfunkrecht, 4th edn. C.H. Beck, München
Herbst T (2021) Art. 17 Recht auf Löschung (“Recht auf Vergessenwerden”). In: Kühling J, Buchner
B (eds) Datenschutzgrundverordnung/BDSG, 3rd edn. C.H. Beck, München
Hofmann F (2017) Mittelbare Verantwortlichkeit im Internet. Eine Einführung in die Intermediär-
shaftung. JuS, pp 713–719
Hoffmann-Riem W (2017) Verhaltenssteuerung durch Algorithmen – Eine Herausforderung für das
Recht. AöR 142:1–42
Hornung G, Hofmann K (2013) Ein “Recht auf Vergessenwerden”? Anspruch und Wirklichkeit
eines neuen Datenschutzrechts. JZ, pp 163–170
Jones ML (2016) Ctrl + Z: the right to be forgotten. NYU Press, New York
Kirchberg E (2013) Identifizierende Altmeldungen über Strafverfahren in Online-Archiven: Beugt
sich das Recht der technischen Entwicklung? GRUR-Prax, pp 237–239
Kodde C (2013) Die “Pflicht zu Vergessen”. ZD, pp 115–118
210 A. Schimke

Koops BJ (2012) Forgetting footprints, shunning shadows. A critical analysis of the “right to
be forgotten” in big data practice. Tilburg Law School Legal Studies Research Paper Series,
08/2012, pp 229–256
Koreng A, Feldmann T (2012) Das “Recht auf Vergessen” Überlegungen zum Konflikt von
Datenschutz und Meinungsfreiheit. ZD, pp 311–315
Kühling J (2014) Die Rückkehr des Rechts: Verpflichtung von “Google & Co.” zum Datenschutz.
EuZW, pp 527–532
Kühling J (2020) Das “Recht auf Vergessenwerden” vor dem BVerfG – November(r)evolution für
die Grundrechtsarchitektur im Mehrebenensystem. NJW, pp 275–280
Lauber-Rönsberg A (2014) Internetveröffentlichungen und Medienprivileg. Verhältnis zwischen
datenschutz- und medienzivilrechtlichem Persönlichkeitsschutz. ZD, pp 177–182
Lauber-Rönsberg A, Hartlaub A (2017) Personenbildnisse im Spannungsfeld zwischen Äußerungs-
und Datenschutzrecht. NJW, pp 1057–1062
Ladeur KH (2009) Anmerkung. JZ, pp 966–968
Ladeur KH (2014) Cyber Courts. Private Rechtsprechung in den neuen Medien. Murmann, Hamburg
Ladeur KH (2018) Persönlichkeitsschutz und “Gegenschlag” auf Facebook – Anmerkung zu LG
Saarbrücken, Urteil vom 23.11.2017 – 4 O 328/17. ZUM-RD, pp 122–123
Ladeur KH (2020) Grundrechtsschutz im Mehrebenensystem durch das BVerfG, insbesondere der
Grundrechtsschutz der Betreiber von Suchmaschinen. WRP, pp 139–143
Martini M (2017) Algorithmen als Herausforderung für die Rechtsordnung. JZ, pp 1017–1072
Mayer-Schönberger V (2011) Delete. The virtue of forgetting in the digital age. Princeton University
Press, Princeton
Michel S (2018) Bewertungsportale und das Medienprivileg – Neue Impulse durch Art. 85 DSGVO?
ZUM, pp 836–843
Neunhoeffer F (2005) Das Presseprivileg im Datenschutzrecht. Mohr Siebeck, Tübingen
Nolte N (2011) Zum Recht auf Vergessen im Internet. ZRP, pp 236–240
Reinhardt J (2022) Realizing the fundamental right to data protection in a digitized society. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the Internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Rosen J (2012) The right to be forgotten. Stanford Law Rev 64:88–92
Sarlet IW (2022) The protection of personality in the digital environment: an analysis in the light
of the so-called right to be forgotten in Brazil. In: Albers M, Sarlet IW (eds) Personality and data
protection rights on the Internet. Springer, Dordrecht, Heidelberg, New York, London (in this
volume)
Schmidt JH, Merten L, Hasebrink U, Petrich I, Rolfs A (2017) Zur Relevanz von Online-
Intermediären für die Meinungsbildung, Verlag Hans-Bredow-Institut (Arbeitspapiere des
Hans-Bredow-Institut Nr. 40), Hamburg
Simitis S (2014) § 1 Zweck und Anwendungsbereich des Gesetzes. In: Simitis S (ed) Bundesdaten-
schutzgesetz, 8th edn. Nomos, Baden-Baden
Spiecker genannt Döhmann I (2014) Steuerung im Datenschutzrecht. Ein Recht auf Vergessen
wider Vollzugsdefizite und Typisierung? Kritische Vierteljahresschrift für Gesetzgebung und
Rechtswissenschaft, pp 28–43
Spindler G (2012) Persönlichkeitsrecht und Datenschutz im Internet. NJW-Beil, pp 98–101
Spindler G, Volkmann C (2015) Störereigenschaft und Abgrenzung zu den Täter- und Teilnehmer-
formen. In: Spindler G, Schuster F (eds) Recht der elektronischen Medien. Kommentar, 3rd edn.
C.H. Beck, München, § 1004, para 9–18
Stehmeier M, Schimke A (2014) Internet-Suchmaschinen und Datenschutz, UFITA, pp 661–682
Stender-Vorwachs J, Lauber-Rönsberg A (2021) Art. 85, Verarbeitung und Freiheit der Mein-
ungsäußerung und Informationsfreiheit. In: Wolff HA, Brinck S (eds) BeckOK Datenschutzrecht,
38th edn. C.H. Beck, München
Stumpf F (2017) Das Recht auf Vergessenwerden. Das Google-Urteil des EuGH: Vorbote der
zweiten Chance im digitalen Zeitalter oder Ende der freien Kommunikation im Internet? Tectum
Verlag, Baden-Baden

Théry P (2016) Online-Archive aus verfassungsrechtlicher Sicht. Dr. Kovač, Hamburg


Trentmann C (2016) Die (un)geklärte Rechtslage bei Altberichten in Online-Archiven. Kritischer
Überblick zum aktuellen Entwicklungsstand der Rechtsprechung, MMR, pp 731–735
Trute HH (2003) Verfassungsrechtliche Fragen des Datenschutzes. In: Roßnagel A (ed) Handbuch
Datenschutzrecht: Die neuen Grundlagen für Wirtschaft und Verwaltung, pp 156–187
Trute HH, Broemel R (2016) Alles nur Datenschutz? Zur rechtlichen Regulierung algorithmen-
basierter Wissensgenerierung. Berliner Debatte Initial, pp 50–65
van der Sloot B (2015) Welcome to the jungle: the liability of Internet intermediaries for privacy
violations in Europe. JIPITEC, para 1
von Grafenstein M (2018) The principle of purpose limitation in data protection laws. The risk-
based approach, principles, and private standards as elements for regulating innovation. Nomos,
Baden-Baden
von Lewinski K (2015) Der Staat als Zensurhelfer – Staatliche Flankierung der Löschpflichten
Privater nach dem Google-Urteil des EuGH. AfP, pp 1–6
Wagner G (2017) § 823 Schadensersatzpflicht. In: Säcker FJ et al (eds) Münchener Kommentar
zum Bürgerlichen Gesetzbuch, vol 6, 7th edn. C.H. Beck, München
Weismantel J (2017) Das “Recht auf Vergessenwerden” im Internet nach dem “Google Urteil” des
EuGH: Begleitung eines offenen Prozesses. Nomos, Baden-Baden
Werro F (ed) (2020) The right to be forgotten. A comparative study of the emergent right’s evolution
and application in Europe, the Americas, and Asia. Springer, Cham

Anna Schimke Research fellow at the Chair for Public Law, Information and Communication
Law, Health Law and Legal Theory at Universität Hamburg. Main areas of research: Information
and Communication Law, Data Protection Law, Digital Memory and Law, Legal Theory. Selected
Publications: Internet-Suchmaschinen und Datenschutz. UFITA (2012): 661–683; Vergessen als
neue Kategorie im Recht. In: Autengruber M et al. (eds.) Zeit im Recht – Recht in der Zeit, Jan
Sramek Verlag, Wien, 2016, pp. 87–104; Das Medienprivileg als Koordinationsmechanismus. In:
Albers M, Katsivelas I (eds.) Recht & Netz, Nomos, Baden-Baden, 2018, pp. 155–186; Rechtliche
Rahmenbedingungen der Veröffentlichung von Kinderfotos im Netz durch Eltern. NZFam (2019):
851–857.
Brazilian Internet Bill of Rights: The
Five Roles of Freedom of Expression

Carlos Affonso Pereira de Souza and Beatriz Laus Marinho Nunes

Abstract Freedom of expression was given a prominent position in the Brazilian
Internet Bill of Rights (Law no. 12.965/2014). It is mentioned five times in the
wording of the Law, each time playing a different role. From discussions on how to
protect anonymous discourse to the challenges of implementing a regime for Internet
intermediaries’ liability, this article reviews how freedom of expression is addressed
in the Brazilian Internet Bill of Rights.

1 Introduction

The Brazilian Internet Bill of Rights (Law no. 12.965/14) is essentially an assertion
of rights, created from the need to inaugurate the regulation of the Internet in Brazil
not from a criminal perspective, but through the protection of fundamental rights. The Internet
Bill of Rights’ wording specifically ensures a series of rights and guarantees for
the Internet user. However, in addition to the debate on privacy and personal data
protection, or even on the contours of network neutrality, the main issue addressed
by the Law is that of freedom of expression. The subject is addressed in five key
moments of the Brazilian Internet Bill of Rights, highlighting its relevance to the
regulation of the Internet in the country.
The discipline of Internet use in Brazil has freedom of expression as its founda-
tion, as provided for in Article 2. Soon after, in Article 3, its guarantee appears as
a principle of the same discipline. Article 8 states that the protection of freedom of
expression is a condition for the full exercise of the right of access to the Internet.
Regarding damages caused on the Internet and the consequent liability of its
agents, freedom of expression plays two important roles. The header of Article 19,

C. A. P. de Souza
Rio de Janeiro State University (UERJ), Rio de Janeiro, Brazil
e-mail: caff@itsrio.org
B. L. M. Nunes (B)
Intellectual Property Specialist, Pontifical Catholic University of Rio de Janeiro (PUC), Rio de
Janeiro, Brazil

© Springer Nature Switzerland AG 2022 213


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_9
214 C. A. P. de Souza and B. L. M. Nunes

which establishes the rule for Internet application providers’ liability, begins with
the expression “to ensure freedom of expression and to prevent censorship”.
With regards to online copyright infringement, the Brazilian Internet Bill of Rights
in the second paragraph of Article 19 also states that the application of the liability
regime determined by it is dependent on a specific legal provision. Although this
wording conditions its application to a future legislation, it is important to point
out that, according to the provision, this new specific legislation should “respect the
freedom of expression and other guarantees provided for in Article 5 of the Brazilian
Federal Constitution”.
The present Article reviews the aforementioned five references to freedom of
expression in the Brazilian Internet Bill of Rights and seeks to address them
through a practical approach, focusing on the protection established by Law no. 12.965/14.
Although the field of application of this right is extensive, the understanding of how
it is protected is of clear relevance in order to ensure, in practical cases, the appli-
cation of legal provisions in response to the constant transformations presented by
modern communication and information technologies.

2 Freedom of Expression: Basis for Protection

No other concept seems to have inspired more adages, axioms, and entries in
dictionaries of quotations than that of freedom. In the course of history, some of
the most important political and social movements were waged on behalf of freedom
of expression, just as, also in the name of such a right, unspeakable atrocities were
committed.1 Because it is so close to human nature as to identify itself with the very
condition of men, grasping the concept of freedom is a complex task. Therefore,
and only for purposes of argumentation in this text, it can be said that freedom is
the absence of restrictions of a physical or moral order, and the subject’s will is not
subjected to that of others. Thus, through the lenses of its negative aspects, the idea
of freedom might be better understood for the narrow purposes of this Article.2
The subject’s will, however, may remain unexternalized. It is precisely for this reason
that it is important to emphasize that this internal dimension of freedom is not subject
to any legal restriction. On the other hand, it is in the act of the individual, in the
expression of his or her thought that the Law applies, ordering conduct and promoting
the appeasement of social relations. Knowing under what conditions the law protects
this expression is of ultimate relevance.
Through this perception, one could imagine that the presence of a legal norm
necessarily implies the curtailment of individual will, constituting a true prison of

1 Casanova (1875), p. 29.


2 Haddad Jabur (2000), p. 141. Antonio Scalisi offers a positive definition, according to which,
freedom is “the faculty of man to explain his own personality, to interpret and to live the existential
experience, according to his own personal way of feeling the universe to which he belongs”, in
Scalisi (1990), p. 31.
Brazilian Internet Bill of Rights: The Five Roles of Freedom of Expression 215

free will. This notion was exposed, in literary terms, by Giuseppe Tomasi di
Lampedusa, author of the celebrated novel Il Gattopardo, in his short story La gioia e
la legge, in which the existence of law thwarts the feeling of happiness.3 However, the opposition
is not entirely appropriate, because along with its repressive function, which curtails
individual freedom, the law also—and above all—guarantees the fundamental right
to the dignity of the human person.
This dual mission is present in the constitutional text, which protects rights while
imposing duties and restrictions. Likewise, the Internet Bill of Rights, which is
substantive in terms of freedom of expression, makes explicit in its Article 6 that
its purpose is not only to sanction conduct, as would be expected from a criminal-law
approach to the liberties gained through use of the network. So much so that the
“nature of the Internet” itself is openly elevated to the level of an interpretive
vector, as established by Article 6:
Article 6. In interpreting this Law, the nature of the Internet, its particular uses and traditions,
and its importance in promoting human, economic, social, and cultural development must
be taken into account, in addition to the foundations, principles, and objectives set forth
herein.

In this respect, a better understanding of the reasons that justify the enactment of
the Brazilian Internet Bill of Rights points to the fact that regulating the relationships
carried out over the Internet by means of its provisions aims not only at orienting
conduct and emphasizing certain principles that should govern future regulation of
the Internet, but also at ensuring that the freedoms won through the development of
the Internet and of information and communication technologies are not eroded by
competing interests.
In global forums on regulation and network governance, the expression “Internet
freedom” is constantly repeated. Thus, it is important to clarify—and the Internet
Bill of Rights assists in this endeavor—that the freedom enjoyed on the Internet
does not exist because there is no law regulating the conduct undertaken there.
On the contrary, it exists precisely because the laws currently being drafted or
entering into force, as well as the interpretation of laws that predate the development
of the network, should seek to preserve the freedoms and liberties won through the
development of technology, always striving for a balance among the rights involved
in their application.
Freedom itself can only be fully exercised by providing the Law with the means
necessary for its effectiveness. The repressive and protective function of the law has
been synthesized since Roman times, when the famous saying was coined: ubi lex,
ibi poena; ubi periculum, ibi lex.
For the Law, the human being, condemned to freedom,4 is free to act in the world
in the way that best suits him, as long as he does not violate a legal prohibition. On
the other hand, it is the Law itself that will guarantee the exercise of this freedom,
protecting the different expressions of the right to freedom.

3 Luño (2002), p. 25.


4 Sartre (1970), p. 37.

Thus, freedom of thought, communication, religion, intellectual, artistic, and
scientific expression, as well as freedom of assembly, are protected by the legal
system. Given the challenges presented by the development of the Internet, greater
emphasis will be given to the analysis of freedom of expression and its constitu-
tional contours, since it is precisely from an interpretation of how the Constitution
protects freedom of expression that the Brazilian Internet Bill of Rights’ application
and tutelage will be better comprehended.

2.1 Freedom of Thought and Freedom of Expression

Among the various forms of expression of individual freedom, the Federal Consti-
tution confers broad treatment on freedom of thought and expression.
Freedom of thought and expression can both be characterized as intellectual
content freedom, having as a presupposition for their exercise the interaction
between individuals, with the scope of communicating the product of thought, more
specifically, their beliefs, knowledge, ideologies, political opinions and scientific
works.5
From a binomial thought, one can, for purely academic purposes, separate the
non-externalized thought from that already manifested. Thus, there is a division
between freedom of thought, defined in the right to freedom of opinion, even if it
is not manifested, and freedom of expression, through which the free expression of
thought is safeguarded.
Freedom of expression basically encompasses an individual right to the
manifestation of thought and creation. Beyond the manifestation of thought itself,
freedom of expression protects freedom of worship and religious organization,
freedom of intellectual, artistic, scientific and cultural expression, as well as
freedom of information.
Inner thought, not externalized by the individual, is protected by freedom of
opinion. When and if it is externalized, freedom of opinion will give way to freedom
of expression of thought. As Claudio Luiz Bueno de Godoy explains: “Thus the
opinion of the individual is formed, which, as an expression of freedom of thought,
already in its external aspect, has the right to propagate”.6
Freedom of opinion can be considered the starting point for all other kinds of
freedom of thought because it presents the individual with the possibility of adopting
the intellectual attitude that best suits him.
The Constitutional protection of freedom of opinion is contemplated in the initial
part of Article 5, section VI, of the Federal Constitution, according to which “freedom
of conscience and belief is inviolable”, as well as in the wording of Article
5, section VIII, which guarantees freedom of religious belief and philosophical
conviction.

5 da Silva (1999), p. 240.


6 Godoy (2001) A Liberdade de Imprensa e os Direitos da Personalidade, p. 56.

Freedom of expression, in turn, translates into the external facet of freedom
of thought, with the Constitution granting general protection to the manifestation of
thought. Such a manifestation may have as its content a religious belief, a journalistic
Article, a scientific work and so on.
The protection conferred on freedom of expression is governed by the constitu-
tional text in Article 5, section IV, of the Federal Constitution, which guarantees
the free expression of thought, forbidding anonymity, and in Article 5, section XIV,
according to which “everyone is guaranteed access to information, safeguarding the
secrecy of the source, when necessary for the professional exercise”. Article 220
also states that “manifestation of thought, creation, expression and information in
any form, process or vehicle shall not be subject to any restrictions, according to the
provisions of this Constitution”.
Freedom of expression, “in its most diverse manifestations, encompasses both the
right (ability) of expressing oneself and of not expressing oneself or not informing
oneself”. Thus, “freedom of expression assumes the primary condition of a right
of defense (negative right), operating as the right of the person not to be prevented
from expressing and/or disseminating their ideas and opinions, without prejudice
to a corresponding positive dimension, since freedom of expression implies a
right of access to the means of expression”.7
By offering an environment conducive to the development of freedom of expres-
sion, the constitutional text seeks to empower not only the individual, but also to
create conditions for the development of the Democratic State of Law itself. This
effort is not exhausted in the normative text. Rather, it dialogues with social prac-
tices to strengthen a culture of freedom of expression that encourages and enhances
participation in public life, while creating conditions for the broad development of
personality. As stated by Nicole Mader Gonçalves:
The consolidation of a Democratic State of Law, in which citizens fully exercise public
autonomy, participating in the public sphere of decision in a free and equal manner, in
addition to having security and protection for the development of their private autonomy,
that is, for reflection, thought, participation, and the freedom of expressing themselves, is
directly conditioned by the way that freedom of expression is internalized in social practices
and customs.8

From this rapid incursion into the foundations of the protection of freedom of
expression, one can then analyze the five moments in which its tutelage is triggered
in the Internet Bill of Rights and the controversies that consequently arise.

7 Marinoni, Mitidiero and Sarlet (2014), p. 459.


8 Gonçalves (2014), p. 403.

3 Freedom of Expression as a Foundation


for the Regulation of Internet Use in Brazil

The heading of Article 2 of the Internet Bill of Rights states that “the foundations
of Internet governance in Brazil are based on the respect for freedom of expres-
sion”. Soon after, it points to other foundations that, along with freedom of expres-
sion, play an essential role in determining the regulation of the network in Brazil.
Among the sections of the second Article, designated as foundations are:
“human rights, the development of personality and the exercise of citizenship in
digital media” (Article 2, Section II), “plurality and diversity” (Article 2, Section
III), “free enterprise, free competition and consumer protection” (Article 2, Section
V) and the “social purpose of the network” (Article 2, Section VI).
A first question that arises from the reading of this provision is the reason for
which freedom of expression is prominently included in the wording of Law no.
12.965/14. Freedom of expression is provided for in Article 2 of the Internet Bill of
Rights as the basis for the discipline of the network, while the other grounds are
listed in the following subsections.
There are, in fact, technical and political reasons for this treatment of freedom of
expression. In political terms, the inclusion of freedom of expression, highlighted in
the main section of Article 2, meets the demand to promptly defend the legislation
as an important step to better ensure the expression of thought on the Internet.
The worldwide network of connected devices is often associated with the poten-
tialization of forms of expression, breaking through blockages imposed by govern-
ments or companies on other means of communication. Although this is a simplistic
view of the challenges that freedom of expression faces on the Internet—since the
same network that enhances free speech can also be an effective means for its
curtailment—there is a perception, due to its global reach, that the Internet would
be a territory for the free exercise of freedom of expression.
During the process that led to the approval of the Brazilian Internet Bill of Rights,
many criticisms were directed at the bill for the simple fact that it sought to
establish parameters for the regulation of Internet use in the country. As explained
in item 2 above, the mere existence of a law that deals with issues related to the
development of technology may be seen as a restriction on the freedom that
allegedly existed because of the absence of a specific law.
The process of approving the Internet Bill of Rights therefore had to overcome
the natural distrust harbored by the technical community, which saw in the Internet
Bill of Rights—or in any law—an intrusion into the development of practices that are
transformed by the natural evolution of network use.
In addition, components of a political-partisan nature joined the aforementioned
resistance, seeking to discredit the initiative of the bill as a Federal Government
maneuver to curtail speech contrary to its interests on the network.
Thus, the emphasis given to freedom of expression in Article 2 undeniably has
a political component, seeking to counter, in a single step, the part of the technical
community that saw in the Internet Bill of Rights an intrusion into technological
progress, while at the same time evidencing that its approval would not lead
to any censorship. On the contrary, freedom of expression was consecrated with
great prominence as the foundation of the Internet discipline in Brazil.
A second component that helps clarify the role played by Article 2 of the Internet
Bill of Rights is not political but technical in nature. From the text of Law no.
12.965/14, it is clear that the legislator sought to create an environment conducive
to the expression of thought on the Internet. This environment can be perceived not
only in the general statements of its first Articles, but especially in the civil liability
regime established in Article 19.
Article 19, which will be further explored, establishes for the so-called Internet
application providers a regime that restricts their liability for third-party content
to cases of noncompliance with a court order. This exemption regime is clearly
inspired by other legislative initiatives that have had a strong impact on the
promotion of discourse and innovation in other countries, such as, for example,
Section 230 of the Communications Decency Act in the United States.9
The liability regime applicable to application providers in the Internet Bill of
Rights ensures that failure to comply with a private notification is not sufficient,
as a rule, to make the provider liable for third-party content. Given that this rule
guarantees a more favorable environment for the free expression of thought,
Article 2 of the Internet Bill of Rights makes sense in electing freedom of
expression as the foundation of Internet use in Brazil.
In fact, mentioning freedom of expression in Article 2 might seem technically
redundant, since soon after, in section II, it is stated that human rights are also a
basis of the regulation applied to the use of the network. Freedom of expression,
without a doubt, is a human right, which is why it would already be considered a
basis of such regulation in Brazil merely by reading the wording of section II.
However, in view of the political and technical motivations mentioned previously,
the role undertaken by freedom of expression and its inclusion in Article 2 is
greatly evidenced.

9 See, among others: Bankston, McDiarmid and Sohn, Center for Democracy & Technology,
Shielding the Messengers: Protecting Platforms for Expression and Innovation, 2012, https://www.
cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf (accessed 11 Jan. 2022). For a list of cases
applying Section 230, see the compilation by EFF - Electronic Frontier Foundation, https://www.
eff.org/pt-br/issues/cda230 (accessed 11 Jan. 2022).

4 Freedom of Expression as a Principle for Regulating


Internet Use in Brazil

Article 3, section I, of Law no. 12.965/14 determines that the discipline of Internet
use in Brazil has as one of its principles the “guarantee of freedom of expres-
sion, communication and manifestation of thought, under the terms of the Federal
Constitution”.
According to the phrasing of Article 3, section I, freedom of expression, in addition
to constituting the foundation for Internet regulation in Brazil, is also one of the
principles that should govern said regulation. In the mentioned Article, the freedom of
expression guarantee will be carried out “in accordance with the Federal Constitution”,
thus attracting all the experience accumulated by decades of interpretation on the
constitutional provisions on the subject, especially Articles 5, Section IV and 220 of
the Brazilian Federal Constitution.
Once the constitutional discipline of freedom of expression has been incorporated
into the Internet Bill of Rights, one question becomes particularly evident: the role played
by anonymity on the Internet and how to align its development with the constitutional
text that guarantees freedom of expression but prohibits anonymous speech.
To better understand this dilemma and how the constitutional text can be inter-
preted to enable the various ways of implementing anonymity on the Internet, it is
necessary to begin with the North American experience, since the treatment given
to anonymous discourse in the United States can offer significant indications on
the subject.

4.1 Freedom of Expression and Its Protection in the United


States

Freedom of expression is guaranteed in the US legal system by the First Amend-
ment to the Constitution. A long trajectory of decisions sought to reconcile the
text—which appears to grant freedom of expression absolute protection—with
other fundamental rights and collective interests, such as public security, eventually
shaping a strict protection aimed at preserving the freedom of the citizen to express
himself or herself freely. There are, nonetheless, exceptions in which the Judiciary
understood that individual freedom should give way to more relevant values in a
concrete case.
According to the First Amendment to the United States Constitution:
Congress shall make no law respecting an establishment of religion, or prohibiting the free
exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people
peaceably to assemble, and to petition the Government for a redress of grievances.

To align freedom of expression with other rights, the US Supreme Court jurispru-
dence has created several standards to measure whether, in a given situation, such
freedom should be curtailed.
Brazilian Internet Bill of Rights: The Five Roles of Freedom of Expression 221

One of the best-known standards is the so-called "clear and present danger" test,
according to which legislative restrictions on freedom of expression are constitutional
only if justified "by an evident public interest, threatened not by a dubious
and remote danger, but by an evident and imminent danger".10
The clear and present danger criterion originated in a judicial opinion delivered by
Justice Oliver Wendell Holmes in Schenck v. United States, decided in 1919. The case
involved the mailing of pamphlets by two individuals, aimed at obstructing enlistment
in the military during World War I. The pamphlets called upon US citizens to resist the
intimidation of enlistment, claimed that the United States' participation in the conflict
reflected the interest of a chosen Wall Street minority, and denied the government the
right to send US soldiers "in order to terrorize the citizens of other lands".11
In his opinion, Justice Holmes stated that "the most stringent protection of free speech
would not protect a man in falsely shouting fire in a theatre and causing a panic…
The question in every case is whether the words used are used in such circumstances
and are of such a nature as to create a clear and present danger that they will bring
about the substantive evils that Congress has a right to prevent."12
In the 1960s, during the period known as the Warren Court, freedom of expression
was strongly protected: the Court gave the First Amendment unprecedented
effectiveness, declaring unconstitutional several legislative acts that unduly
restricted the freedom enshrined in that provision.
Currently, the Supreme Court uses other standards to assess whether a law should be
declared unconstitutional for infringing the First Amendment. Protection of freedom
of expression has been most intense for speech of a political nature. One of the ongoing
debates concerning freedom of expression under the First Amendment is precisely the
protection of so-called "anonymous speech", and especially its protection on the Internet.
The District Court for the Western District of Washington, for example, assessing an
anonymous posting on the network, stated that "the internet is a truly democratic forum
for communication. It allows the free exchange of ideas in an unprecedented way, both in
speed and in scale. For this reason, the constitutional rights of internet users, including
First Amendment protection of anonymous speech, should be carefully guarded".13
The District Court for the Northern District of California, analyzing a case involving
anonymous criticism posted on a blog, highlighted that "people may interact
pseudonymously or anonymously, as long as their acts do not violate the

10 Excerpt of Justice Rutledge's opinion in the Thomas v. Collins case. Apud Sarmento
(2002), p. 158.
11 According to the judicial opinion proffered by Wendell Holmes, transcribed in Capaldi (1984), p. 51.
12 Rodrigues (1958), p. 148.
13 No. C01-453Z, 140 F. Supp. 2d 1088, Doe v. 2TheMart.com Inc., Dist. Court, WD Washington. Judgement 2001, accessible under http://cyber.harvard.edu/stjohns/2themart.html. Accessed 11 Jan. 2022.
222 C. A. P. de Souza and B. L. M. Nunes

law. The ability to express oneself without third parties knowing details about one's
identity can foster open communication and robust debate".14
Precisely because of the strong constitutional protection conferred on anonymous
speech, over the last fifteen years the United States courts have created a series
of conditions under which the victim of offenses contained in anonymous comments
can judicially require the identification of the author of the injurious speech.
The criteria for this identification vary significantly. However, the list of conditions
imposed in Dendrite Int'l v. Doe No. 3, decided by the Superior Court of New Jersey,
has gained special prominence and has been routinely applied to damages caused by
anonymous postings on the Internet.
According to the test resulting from that decision, a victim seeking the identification
of the author of an anonymous and damaging comment must meet five conditions:
(i) undertake efforts to notify the anonymous author that he or she is the subject of
an application for disclosure; (ii) identify precisely the statements that allegedly
violate the victim's rights; (iii) demonstrate that the action against the anonymous
author is viable and can be analyzed by the Judiciary; (iv) produce sufficient evidence
to support the claim prima facie; and (v) the court must then balance the First
Amendment protection of anonymous speech against the strength of the case presented
by the victim for revealing the identity of the author of the anonymous and offensive
speech.15
Thus, in the American constitutional tradition, anonymous speech is, as a rule,
protected by the First Amendment, while in specific situations the victim may seek
the identification of its author and, on that basis, proceed with an action for
damages.
It is important to understand this experience so that one can comprehend to what
extent the treatment of the theme in Brazil differs from that of the United States.
Although Brazil has no tradition of protecting anonymous speech, it is necessary to
understand the reasons why the constitutional protection of freedom of expression
has been tied to the prohibition of anonymity and how that prohibition can be applied
to the Internet.

4.2 The Prohibition on Anonymity as Determined by Brazilian Law

Article 5, section IV of the Federal Constitution determines that "the manifestation
of thought is free, and anonymity is prohibited". It provides the perception

14 185 F.R.D. 573, 578, Columbia Ins. Co. v. Seescandy.Com, N.D. Cal. Judgement 1999, accessible
under https://ilt.eff.org/Columbia_Ins._Co._v._Seescandy.html. Accessed 11 Jan. 2022.
15 No. 3, A-2774-00T3, Dendrite International, Inc. v. Doe. Judgement 11 July 2001, accessible under https://law.justia.com/cases/new-jersey/appellate-division-published/2001/a2774-00-opn.html. Accessed 11 Jan. 2022.

that, under Brazilian law, anonymous speech is not protected, thus distancing the
Brazilian approach from the American tradition.16
However, what is the reason, in the Brazilian experience, for the protection of
freedom of expression not extending to anonymous speech? At the root of this
precept, as explained by the most renowned commentators on the constitutional
text, is the concern with identifying the author of a manifestation of thought so
that he or she can answer for any abuse and damages eventually caused.
The ban on anonymity was not introduced into Brazilian law by the current
Constitution; it goes back to the Constitution of 1891, Article 72, Section 12.
Since then, the connection between anonymity and the need to hold the perpetrator
liable for damages caused by abuse in the expression of thought has been recurrent
in doctrine and in national case law.
Accordingly, the Federal Supreme Court has already established that:
The constitutional veto on anonymity, as is known, seeks to prevent the consummation of
abuses in the exercise of freedom of expression, since, by requiring the identification of
whoever exercises this extraordinary legal-political prerogative, essential to the very
configuration of the democratic state under the rule of law, it ultimately aims to enable
any excesses arising from the exercise of free speech to be held accountable, a posteriori,
both in the civil and in the criminal sphere.
(…)
It is evident, then, that the clause prohibiting anonymity - by enabling, a posteriori,
the criminal and/or civil liability of the offender - is a constitutional measure designed to
discourage abusive expressions of thought that injure the moral patrimony of persons unjustly
disrespected in their sphere of dignity, regardless of the means used to convey the offensive
accusations.17

Several authors interpret this provision broadly, emphasizing its purpose of
restricting harmful behavior and identifying the perpetrator of abuses committed
through the manifestation of thought.18
Based on Article 7 of the revoked Press Law, which likewise prohibited anonymity,
Edilson Farias highlights that "given that anonymity means the malicious
concealment of one's name in order to evade liability for the disclosure of material
that can cause harm to third parties, it is easy to deduce that the primary purpose of
the principle is to prevent the authors of apocryphal messages from enjoying immunity
when damages are caused to honor, intimacy and the well-being of society. Thus,
identifying the communicating agent is a burden of freedom of expression and
communication".19
Accordingly, Daniel Sarmento aptly describes the model of freedom of expression
established by the Federal Constitution of 1988 as "freedom with
liability".20 That is, the Constitution protects free speech only to the extent that identifying

16 See also more closely Aftab (2022), in this volume.
17 Federal Supreme Court - Writ of Mandamus Nr. 24,369 MC/DF, Reporter Justice Celso de Mello, 10.10.2002.
18 See: de Moraes (2002), p. 207.
19 Farias (2004), p. 183.
20 Sarmento (2013), p. 259.

the author remains possible in case abuses are committed in the expression
of thought, thereby ensuring liability.

4.3 Freedom of Expression and Anonymity on the Internet

The evolution of communication through the Internet includes, to some extent,
the preservation of anonymity. For political purposes, the provision of anonymous
means of navigation and communication has been crucial to the development of the
network’s libertarian potential, especially in countries whose governments monitor
and rigidly censor what is seen and what is posted on the Internet.21
The link between the availability of tools allowing anonymous use of the network
and important movements of political resistance in the recent past is evidence of
how closely anonymity and free speech are tied: anonymity promotes freedom of
expression and access to knowledge and information.
One tool that allows anonymous browsing, by routing the network connection through
a series of relays, is Tor, an anonymity network which allows people to use the
Internet in a way that resists traffic analysis. Supported by an extensive community of
volunteers and donors, Tor was an important tool for communication and access to
information in political movements of extreme relevance such as the Arab Spring.22
Following the revelations brought to light by Edward Snowden about the implementation
of an extensive program of mass surveillance on the net,23 tools for anonymizing
navigation and protecting the content of Internet communications have become
mainstream. Not only Tor, but also some of the most popular browsers began to
implement a "private browsing" function, currently available in browsers such as
Google Chrome and Mozilla Firefox.
In addition to its political component and the preservation of privacy, anonymity
has also become a powerful tool for creating communities on the Internet. Anonymous
discourse has generated not only online communities for the dissemination of all
kinds of material but also highly organized groups such as Anonymous.24
Probably the clearest evidence

21 Cf. Aftab (2022), Sect. 4.2, in this volume.


22 Zahorsky, Tor, Anonymity, and the Arab Spring: An Interview with Jacob Appelbaum, University
for Peace & Conflict, 2011, https://ilt.eff.org/Columbia_Ins._Co._v._Seescandy.html. Accessed 11
Jan. 2022.
23 Wikipedia, Global surveillance disclosures (2013–present), https://en.wikipedia.org/wiki/Glo

bal_surveillance_disclosures_(2013%E2%80%93present), (accessed 11 Jan. 2022).


24 Wikipedia, Anonymous (group), 2018, https://en.wikipedia.org/wiki/Anonymous_group

(accessed 11 Jan. 2022). See: G. Sands, What to Know About the Worldwide Hacker Group
‘Anonymous’, ABC News, 2016, http://abcnews.go.com/US/worldwide-hacker-group-anonymous/
story?id=37761302 (accessed 11 Jan. 2022).

of the role that anonymity plays on the network, the group has been the object of
several studies of its controversial form of organization.25
During the protests that marked Brazil's recent political history, the use of masks
by protesters sparked an intense debate over the contours of freedom of expression
in the country. May protesters cover their faces during a demonstration? How can
the State ensure that abuses committed during these demonstrations are punished?
Would it be better to ban the use of masks, or simply to determine that any masked
person may be required to identify himself or herself to a police authority for
potential accountability?
This dilemma shows how anonymity, driven by the natural behaviors of the network,
can have consequences that spill beyond the Internet. The group Anonymous chose as
its symbol the mask of Guy Fawkes, the English conspirator of the 1605 Gunpowder
Plot, popularized by comics and films. The same mask can nowadays be seen in protests
and demonstrations around the world, from deputies in Poland wearing it to protest
the vote on an intellectual property law26 to its extensive use in the 2013 protests
in Brazil.
Thus, technical, political, social and legal elements intertwine in the mosaic
that reflects the relevant role anonymity plays in the evolution of the many forms
of communication through the Internet.

4.4 The “Secret” Application Case

The ban on the app "Secret" in Brazil, a case that occurred shortly after the approval
of the Brazilian Internet Bill of Rights, brought into practice the debate on the role
of anonymity in the network and the constitutional protection of freedom of expression.
Available at the time in both the Apple App Store and Google Play, the app quickly
gained popularity because it allowed users to post brief comments without any
immediate identification of the message's authorship.
The user of the application was informed only whether the author of a message was
a "friend" or a "friend of friend". This information was obtained once the user
authorized the application to access his or her Facebook friends or the contacts of
the mobile device. The most popular comments could also be viewed in the app through
the "Explore" section. In earlier versions of the app, it was possible to upload
any photo to illustrate the message. While viewing a post, other users could like it
or comment on it. There was no indication, either to the author of the message or to
commenters, of the authorship of the original message or of subsequent comments.
The application was subject to a Public Civil Action filed by the Public Prose-
cutor’s Office of the State of Espírito Santo, through its 26th Civil Prosecutor’s Office

25 See: Olson (2014).
26 Olson, Amid ACTA Outcry, Politicians Don Anonymous Guy Fawkes Masks, Forbes, 2012, https://www.forbes.com/sites/parmyolson/2012/01/27/amid-acta-outcy-politicians-don-anonymous-guy-fawkes-masks/#3c24d98a5064 (accessed 11 Jan. 2022).

of the City of Vitória. The suit, filed against Apple, Google and Microsoft, sought
to ban the application in Brazil on the understanding that its operation violated the
constitutional prohibition of anonymity and facilitated offenses against the dignity
of the human person.
The 5th Circuit Court granted a preliminary injunction against Apple and Google,
determining that they remove the application from their virtual stores and, within
ten days, erase the already installed applications from all mobile devices in the country.
In response to this measure, the company responsible for the app took steps to prevent
damage to third-party rights. The use of photos was initially restricted for Brazilian
users. The company also informed the press that it would quickly remove all illicit
publications as soon as it was notified by the victim; such notification could be sent
through a mechanism available in the app itself since its release.27
In examining the appeal filed by Google Brazil, the Court of Justice of Espírito
Santo understood that the measure forcing Google, Apple and Microsoft to erase the
application from the mobile devices onto which it had been downloaded could not
stand. In his vote, the rapporteur noted that the invasion of another person's
computer device has been a crime in the country since the entry into force of
Law no. 12.737/12 (known as the "Carolina Dieckmann Act") and that the Judiciary
therefore could not adopt such a measure.28
In addition to the debate on the possibility of removing content directly from
private mobile devices - a topic that grows in relevance as digital inclusion
advances through the mobile Internet in the country - it is relevant to investigate
whether the Secret app case falls within the constitutional contours. This inquiry
is warranted by the diversity of Internet applications that, in various ways, use
anonymity (or at least its appearance or expectation) as an essential feature of
the tool.
The Secret app seems to integrate the extensive and complex set of applications
on the Internet that promote anonymity. By allowing users to post messages without
their immediate identification, the platform seemingly provides users an anonymous
use of their services. However, a closer look at how the application is used and
how providers respond to notifications about illegal content reveals a very different
perspective. Paradoxical as it may seem, what the application in question offers is
not the experience of full anonymity, but only an expectation of anonymity.
This conclusion follows from a reading of the application's Privacy Policy,
which very clearly stated that the alleged anonymity is maintained only among users
of the service, ensuring the possibility not only of identifying who is the author of a

27 De Lucca, Secret want to colaborate with Brazilian Authorities, and in Brazil, IDG
Now, 03 September, 2014, http://web.archive.org/web/20160826072303/http://idgnow.com.
br/blog/circuito/2014/09/03/secret-quer-colaborar-com-as-autoridades-brasileiras-e-aqui-no-bra
sil/ (accessed 11 Jan. 2022).
28 Capelas, Espírito Santo’s Court of Justice retracts decision and goes releases the Secret appli-

cation, Estadao, 12 September, 2014, http://link.estadao.com.br/noticias/geral,justica-do-es-volta-


atras-e-libera-aplicativo-secret,10000030513 (accessed 11 Jan. 2022).

message, but also sharing this information with authorities that will have the power
to order the company to disclose it for the purpose of investigating illicit activities.
The app’s Privacy Policy, in its June 11th, 2014 version, clarified that, even with
the alleged anonymity existing in the user interface, “it is still technically possible
for Secret to connect your Posts with your email address, phone number or other
personal information you have provided. This means that if a court or a competent
authority asks us to disclose your identity, we may be required to do so”.29
In addition to informing the competent authorities about the identifying data of
the author of harmful messages, Secret also provided the victim of damages caused
through its platform a tool for reporting illegal content.
Consider the app's features: it was a tool that produced the expectation of
anonymity in an environment in which users interact. Additionally, it provided means
of identifying the author of messages sent via the app, promising to share such data
with the competent authorities if necessary, as well as promoting the withdrawal of
any content deemed harmful. With these aspects in mind, would the app fall under
the constitutional prohibition of anonymous speech?
Because the app allowed the identification of the author of messages posted on its
platform, maintaining the alleged anonymity only among its users, it is questionable
whether the Secret app ever created an anonymous environment at all. It seems more
appropriate to characterize the anonymity provided by the application as relative or,
at best, as a mere expectation of anonymity, which may be broken when the author of a
message violates the established terms of use or causes harm to others.
Furthermore, as additional evidence that the environment made available by Secret
does not fall under the constitutional prohibition of anonymity, it is possible to
state that the ratio of the constitutional precept is to prevent anonymous discourse
from encouraging abusive expressions of thought whose authors hide under the mantle
of anonymity so as to evade liability.
Thus, by maintaining the possibility of identifying the author of any specific
message, by collaborating with the competent authorities for the identification of
the user responsible for the message, and by promoting a system of complaint and
removal of illegal content, it seems that the application has complied with the purpose
of the constitutional norm which protects freedom of expression, as determined in
Article 5, section IV.
As seen above, both doctrine and jurisprudence assert that the teleological
interpretation of the constitutional prohibition of anonymity is to guarantee
liability. The Secret app constructed an environment that seeks to take advantage
of the freedom that supposed anonymity confers, encouraging its users to
express their ideas and opinions freely, which goes back to the root of the protection
of anonymity in its American origin. At the same time, however, it stated that the
identification of the author of harmful messages would be possible, which seems to
be consistent with the ratio of the Brazilian constitutional precept.

29 http://web.archive.org/web/20150317192053/https://www.secret.ly/privacy (accessed 11 Jan. 2022).

It is worth recalling Francesco Ferrara's lesson that "every provision of law has
a purpose to accomplish, a certain function and end to fulfill, for whose
achievement it was created. The rule rests on a legal basis, a ratio iuris, which
indicates its real meaning".30
The purpose of the constitutional provision prohibiting anonymity therefore does
not extend to the use of the Secret application. In other words, the user of the
application is anonymous only until the moment he or she abuses freedom of
expression and causes damages to third parties, which complies with the
constitutional mandate.
One last issue can be drawn from the constitutional treatment of freedom of
expression and its connection with the Internet Bill of Rights: if applications
that generate a mere expectation of anonymity can be made available in the country,
what about those that actually offer a complete experience of anonymity, neither
revealing the author of the message nor retaining records about the use of the
service? For these tools a new interpretation will have to be constructed, since,
in the final analysis, the Constitution would more directly protect only those
applications that produce a mere expectation of anonymity for those who use the
platform while preserving the ability of the competent authorities to ascertain
any damages caused.
In addition, it is worth emphasizing that Article 15 of the Internet Bill of Rights
obliges Internet application providers constituted as legal entities that carry out
organized and professional activities for economic purposes to maintain the
respective records of Internet application access, under secrecy, in a controlled
and secure environment, for a period of 6 (six) months, in accordance with the
regulations.
Thus, by generating an environment of apparent anonymity while retaining records
in accordance with Article 15 of the Internet Bill of Rights, application providers
could make applications like Secret available in Brazil. However, as we can see
today, this issue is less complex than that of the availability of tools that
guarantee full anonymity on the Internet.

5 Freedom of Expression as a Condition for the Full Exercise of the Right to Internet Access

Article 8 of the Brazilian Internet Bill of Rights establishes that the “protection of
the right to privacy and freedom of expression in communications is a necessary
condition for the full exercise of the right to Internet access”.
One of the core articles’, it dialogues with the process of creating access to the
Internet as a right and conditions its full exercise to the fact that, once you have access
to the Internet, the right to privacy and freedom of expression is also guaranteed.
The debate on access to the Internet as a right, be it a human or a fundamental
right, transcends the limits of this work. In a brief note, it can be said that the

30 Ferrara (1987), p. 141.



discussion is currently polarized between those who understand access to the
Internet merely as a means for the realization of genuine fundamental rights (such as
access to information and freedom of expression) and those who recognize it as an
autonomous human right.
A widely cited position against the recognition of Internet access as a right is
that defended by Vinton Cerf in an article published in the New York Times.31
According to Cerf, access to the Internet should not be considered a human right
because, like all technology, it is only a "rights enabler": a means and not an
end in itself.
In the same article, Cerf states that technological progress derives from the
achievements of engineers and that it should be up to them to decide on the path
of technology, including ensuring that the Internet continues to enable the
exercise of human rights. This statement has an implicit addressee, namely
governments, and can be read as a plea for the technical self-regulation of the
Internet. The idea is as appealing as it is risky. The most obvious risk is its own
radicalization, leading to the understanding that the Internet, as a technical
resource, should not be the subject of political decisions.
This perception might lose sight of the fact that technology is not a given. Tech-
nology is not neutral, but rather the result of choices, of human decisions inherent
in its development process. It does not generate impacts on society, as something
external that moves and collides with society; on the contrary, it is a part of society.
Hence, some technologies are more or less likely to generate certain behaviors.
Social networks, for example, show how architecture shapes the effects of use.
Networks that allow users to follow whomever they please are more diversified and
informative, stimulating criticism and the exchange of ideas. Social networks that
allow their users to follow only the postings of friends, on the other hand, can
isolate the user in a veritable bubble of preferences, styles and ideologies shared
only by a small group of like-minded people.
Defending a strictly technical regulation of the Internet merely removes
institutionalized political channels from the scene, for technology is the result
of choices, of decisions that are ultimately political. The Internet is thus a
space for the realization of rights, but its construction is the responsibility not
only of the community of experts but also of governments, companies, the third
sector, academia and, of course, the Internet user, who must be both a participant
in and the purpose of all network regulation.
The Brazilian Internet Bill of Rights is, in this regard, a good example of
multistakeholder construction, involving the most diverse actors in the regulation
of the network. Essentially fundamental provisions such as Article 8 thus open the
way to a more comprehensive understanding of what the full exercise of the right of
access to the Internet should mean and of the prominent role freedom of expression
plays in this scenario.

31 Cerf, Internet Access Is Not a Human Right, New York Times, 4 January 2012, http://www.nytimes.com/2012/01/05/opinion/internet-access-is-not-a-human-right.html?_r=0 (accessed 11 Jan. 2022).

More specifically, by ensuring that the drafting of the legal provision was plural,
so that different actors could contribute during the process of creating the Bill
that gave rise to the Brazilian Internet Bill of Rights and provide the Legislative
Branch with the expertise needed to approve its final text, Law no. 12.965/14
points in a fruitful direction for understanding how a multisectoral approach
operates in the definition of legal rules.
In other words, the composition of interests that marks the legislative process is
not evident only in specific provisions that directly regulate certain activities.
Quite the opposite: in core articles such as Article 8, by stating that freedom of
expression is a condition for the full exercise of the right of access to the
Internet, the Internet Bill of Rights seeks to express a balance that affects all
sectors.

6 Freedom of Expression and Its Parameters for Internet Providers' Liability

The Internet Bill of Rights is very specific when it comes to Internet providers’
liability.32 Accordingly, Article 19 determines that:
Article 19 In order to ensure freedom of expression and prevent censorship, Internet
application providers may only be held civilly liable for damages resulting from content generated
by third parties if, after specific judicial order, the provider fails to take action to make the
content identified as offensive unavailable on its service by the stipulated deadline, subject
to the technical limitations of its service and any legal provisions to the contrary.

It is noteworthy that the article on application providers' liability begins by
indicating that the regime that follows is intended to preserve freedom of expression
and avoid censorship. This statement alone would already signal the prominent role
that freedom of expression plays in the Internet Bill of Rights and would justify its
treatment, as mentioned in Article 2, as the foundation of Internet use in Brazil.
The liability regime established for content generated by third parties according
to the Internet Bill of Rights aims to ensure that freedom of expression is not unduly
restricted, according to a systematic interpretation of the liability regime established
by Law no. 12.965/14.
Evidently, different liability regimes can generate different impacts on the exercise
of freedom of expression. A strict liability system, for example, which would hold
the application provider directly responsible for the content displayed, encourages
an active duty to monitor and exclude potentially controversial content.
Consequently, the expression of thought suffers an undue restriction, because
intermediaries fear being held liable for third-party content made available online:
even content that is not harmful at all may be removed simply because it is deemed
critical, controversial or contentious.

32 For a discussion with a view to Article 19 of the Internet Bill of Rights cf. also Schreiber (2022), in this volume. For a proposal to consider mechanisms of self-regulation, see Hartmann (2022), in this volume. As to the German Network Enforcement Act, see Schulz (2022), in this volume.
Brazilian Internet Bill of Rights: The Five Roles of Freedom of Expression 231

This was precisely the conclusion reached by the Supreme Court of Argentina in a case in which the plaintiff sought to hold Google accountable for the results of its search engine. The following section reviews this decision. Afterwards, we turn to the provisions of the Brazilian Internet Bill of Rights so as to understand how free speech is taken into account in the Law when designing a liability regime for Internet providers.

6.1 The Argentinean Precedent

On October 28, 2014, the Argentinean Federal Supreme Court of Justice issued its decision in the case involving the Argentinian model and actress María Belén Rodriguez.33 The case tackles the issue of the liability of web search providers for the content listed in their databases. Rodriguez sought to hold search providers liable for associating her name with websites containing sexual and pornographic content, as well as for displaying unauthorized photos.
The model sued Google and Yahoo! in Argentina, claiming that her rights to privacy, image and honor were violated by the availability of search results that led to adult sites. She also argued that the use of thumbnails representing her image violated the right to her own image. Although these contents had not been generated by the search providers, the actress argued that their propagation by means of the search engines made access to them easier. In addition to the removal of the links and images, compensation was sought for damages. The defendants were ordered to pay damages on the basis of fault-based liability, and additionally to remove search results of an offensive nature and implement a system that would prevent damages from recurring in the future.
The debate promoted by the Court focused on the role played by these providers in promoting access to knowledge and information, as well as on the impact a decision imposing the filtering of sites could have on the protection of freedom of expression. On the other hand, it sought to understand how damages caused on the network could be avoided without this amounting to prior censorship. This is precisely what the Brazilian Internet Bill of Rights suggests.
The first issue examined by the Supreme Court was whether search providers
could be subject to any form of strict liability, i.e. whether they could be held liable
for the content made available online regardless of any wrongful conduct on their
part. This type of liability is usually derived from the determination by the courts that
a certain activity endangers third parties’ rights. Thus, even without any notification,
either by the individual or by some competent authority, the provider would already
be liable. The court unanimously rejected this approach.

33 Municoy, Argentinean Supreme Court Rules in Favor of Google and Yahoo on Civil Liability of Search Engines in María Belén Rodriguez case, The Free Internet Project, 11 April 2014, http://thefreeinternetproject.org/blog/argentinean-supreme-court-rules-favor-google-and-yahoo-civil-liability-search-engines-mar%C3%ADa (accessed 11 Jan. 2022).
232 C. A. P. de Souza and B. L. M. Nunes

In doing so, the court noted that search engines play a “key role in the global dissemination of online content, facilitating the access and identification of relevant data for billions of users”. Thus, the court held that providers have no prior duty to monitor their platforms for potentially harmful content, since such an assessment could be highly subjective.
However, what happens if the provider receives a notification from the victim claiming that certain content is causing damage? Should the provider act to remove the material? What if it fails to remove the content, or even chooses not to do so on the understanding that there is no harm in the claimed case?
The Court then considered that a private notification would not be sufficient to generate liability. In its decision, the Argentinean Federal Supreme Court of Justice questioned the effects of creating a system in which anyone could notify the provider and thereby see the content removed. Approaching a scenario of private censorship, the Court stated that this result would ultimately deprive the Judiciary of its role as the legitimate body to ascertain whether material is lawful or unlawful.
According to the decision, only a court order or notification by a competent authority would have the power to give the provider unequivocal knowledge that certain content is unlawful and that action should be taken to remove it. However, as an exception, the Court further considered that this rule could be dismissed in cases of material whose illegality is patent, falling within the categories of child pornography, death threats, genocide and “deliberate harm to the honor” of a third party.
Therefore, the general rule, according to the Argentinean decision, is that search providers are not strictly liable for the content they index. They can only be held liable if they fail to comply with a notice which provides unequivocal knowledge of the unlawfulness of the material. The exception to this rule lies in cases where the unlawfulness is patent. In such circumstances, an individual may, upon notifying the provider, hold it liable if it does not act to remove the material once notified.
While the Court was unanimous in establishing this regime of providers’ liability, the same cannot be said regarding the question of the unauthorized use of the actress’s image in the thumbnails of the image search. Although most Justices exempted providers from any liability regarding the thumbnails, two Supreme Court justices defended the thesis that the Argentinean Intellectual Property Law would not allow such use, since it was not authorized and does not fall within the exceptions to the law (such as the use without consent of the image of others for scientific, academic or cultural purposes). According to Article 31 of Law no. 11.723: “The publication of a portrait is free when it relates to scientific, didactic and, in general, cultural purposes, or to facts or events of public interest or that have taken place in public”.
There was a divergence among the judges regarding the creation of means to
prevent harm from recurring in the future. The actress demanded that providers start
developing a filter to prevent the same illicit content from recurring and being found
once again through the search engines. This is an extremely delicate point, since
such filters are generally not accurate and may end up blocking much more than intended, including homonyms and legitimate manifestations of thought. The Court, in its majority, understood that it would not be up to the court of appeals to require the implementation of these filters, except in clearly exceptional cases, as set forth in the American Convention on Human Rights.
The consequence of this majority decision is the determination that the victim of damages must necessarily pinpoint the location of the content that he or she wishes to have removed. That is, for the courts to order the removal, the contents must be indicated with the corresponding URL or other precise means of location, thus avoiding the restriction of access to the material through generic filters and blocks.
It is important to consider the precedent this decision can set in the region. In addition to being the first decision of a constitutional court in South America to deal in depth with the civil liability of search providers, it deserves greater attention34 for the balance it achieves between the need to compensate damages caused on the Internet and the potential threat to freedom of expression posed by filters and blocks.35
In its reasoning, the Argentinean Court mentioned the Brazilian Internet Bill of Rights to assert that the solution found by the court was in accordance with that foreseen in Law no. 12.965/14.36

34 Massa, Los buscadores de Internet no son responsables de los contenidos, La Nacion, 2014, https://www.lanacion.com.ar/1739554-los-buscadores-de-internet-no-son-responsables-de-los-contenidos (accessed 11 Jan. 2022).
35 Delatour, Gastaldi, Güemes, Rosati and Varela, Supreme Court confirms the liability standard applicable to search engines, Linkedin, 2017, https://www.linkedin.com/pulse/supreme-court-confirms-liability-standard-applicable-search-marcet/ (accessed 11 Jan. 2022). In a very similar case, involving actress and model Carolina Valeria Gimbutas, the ruling confirmed that search engines are only liable for third party content when they have actual knowledge of the illegality of that content and do not act diligently. In this connection, the Court referred to the analysis made in the “Rodriguez” case. Gimbutas had filed two lawsuits against Google so that it would remove from its files the personal information related to the plaintiff and cease to use her images, in accordance with Law 25.326 on Personal Data Protection. Subsequently, the plaintiff filed a damages action in order to obtain compensation from Google for the damages caused by (i) the reproduction, dissemination and use of her image through the image search engine without the corresponding authorization (based on Article 31 of Law 11.723 on Intellectual Property) and (ii) the damages derived from the linking of the plaintiff with pages related to pornographic activities. For more information, see: Nolasco, ‘La Corte falló a favor de Google contra una modelo argentina’, La Nacion, https://www.lanacion.com.ar/2062455-la-corte-fallo-a-favor-de-google-contra-una-modelo-argentina (accessed 11 Jan. 2022).
36 It is true that both the Argentinean Court’s decision and the Internet Bill of Rights share the understanding that application providers only answer for third party content if they fail to comply with a court order. However, the Internet Bill of Rights distinguishes between connection providers (those that provide access to the network) and application providers (such as search engines, hosting, social networks, etc.). The former do not answer for the acts of their users (Article 18) and the latter only answer if they do not comply with a court order (except for copyright and “revenge porn” material, pursuant to Articles 19 et seq.). The Argentinean Federal Supreme Court of Justice quoted Article 18 of the Internet Bill of Rights, which deals with connection providers, when in fact it would have been more appropriate to quote Article 19, which deals precisely with application providers, such as Google and Yahoo!, which offer a keyword search service.

According to Darian Pavli, the Argentinean ruling may establish the conditions for the construction of a third approach to intermediaries’ civil liability, positioned between the extremes represented by the United States, which adopts a broad liability exemption as a rule (with a notice-and-takedown system for copyright), and Europe, which allows greater room for private notifications.37
A future concern is the exception provided by the court for cases of manifest illegality, in which the general rule could be dismissed, generating the provider’s liability for failing to act after gaining unequivocal knowledge of the offensive material. By listing “deliberate injury to honor” among the examples of manifest illegality, the court leaves open the danger that judgment on the illegality of content posted online will be extremely subjective. Depending on how future decisions go, such openness could even turn the exception into the rule.

6.2 Freedom of Expression and Liability as Provided by the Brazilian Internet Bill of Rights

Article 19 of the Internet Bill of Rights determines that the liability of application providers is fault-based; it does not, however, derive from non-compliance with a private notification, but rather from the failure to comply with a court order determining that specific content is illicit.
The Internet Bill of Rights established the Judiciary as the authority for deciding on the illegality of content available online. In this respect, the regime resembles that established by the Argentinean precedent, reducing the scope of private notifications.
However, it is worth highlighting that while the Argentinean decision allows for
greater exceptions, the Brazilian Internet Bill of Rights establishes specific excep-
tions in its text. Those are for cases involving copyright (Article 19, second paragraph)
and content classified as “revenge porn” (Article 21).38
The case law available in Brazil prior to the Internet Bill of Rights, reflecting the Superior Court of Justice’s position, established that application providers would be liable if, after being notified by the interested party, they failed to remove the indicated content. This liability system applied to issues involving copyright infringement, but also to any issue associated with different personality-related rights, such as honor, privacy, and image.39

37 Pavli, Case Watch: Top Argentine Court Blazes a Trail on Online Free Expression, Open
Society Foundations, 2014, https://www.opensocietyfoundations.org/voices/case-watch-top-argent
ine-court-blazes-trail-online-free-expression (accessed 11 Jan. 2022).
38 Spadaccini de Teffé, What is revenge porn and how can I protect myself?, Academia, https://www.academia.edu/35886754/What_is_revenge_porn_and_how_can_I_protect_myself (accessed 11 Jan. 2022).
39 See: Superior Court of Justice (STJ) – Special Appeal Nr. 1,193,764/SP, Reporter Justice Nancy Andrighi, 14.12.10; Superior Court of Justice (STJ) – Interlocutory Appeal in Special Appeal Nr. 1,309,891/MG, Reporter Justice Sidnei Beneti, 26.06.12.

The Internet Bill of Rights embraces a position different from that established in the preceding years. Among the many reasons for this approach, the most important is related to the outstanding role played by freedom of expression under the terms established by the Internet Bill of Rights. By recurrently mentioning freedom of expression, Law no. 12.965/14 ensures that specific protection is granted to the subject at hand.
The Internet is an unprecedented means of communication and information. If, on the one hand, it is true that no new law is needed for each new form of expression that may arise, the Internet has so profoundly innovated the ways in which one can express oneself that the Law cannot remain indifferent to its development. The delicate task set out for the Brazilian Internet Bill of Rights is to achieve a balance between developing an environment in which freedom of expression is cultivated (considering that the Internet easily both broadens and restricts speech) and, at the same time, ensuring that victims of undue content have the necessary means to identify the author and have such content removed.
It is worth mentioning that the Internet Bill of Rights conditions the liability of application providers solely on non-compliance with a judicial order. By doing so, it honors the Judiciary as the branch with legitimate power to distinguish between legal and illegal content. However, this understanding in no way prevents providers from determining their own rules and defining what may or may not be displayed on their platforms. Therefore, upon receiving private notifications indicating that content is unlawful, the provider has the freedom to decide whether to keep the content or remove it as requested.
This conclusion seems to strike a complex balance in the exercise of rights, considering that it relieves the pressure on the provider, who would otherwise have to remove each and every kind of content claimed to be unlawful, ultimately restricting freedom of expression on the Internet. Nevertheless, the provider is not prevented from acting if the material is contrary to the terms of use and other policies governing the operation of its platform.
Thus, although a private notification cannot oblige the provider to remove content under penalty of liability, it is a customary practice used for reporting the existence of potentially harmful content on the Internet. Since providers have no obligation to consistently monitor what is being shared online, the notification acts as an alert so that they can verify the origin of an alleged damage.
Therefore, if they decide to remove the content because it is contrary to the terms
governing their platform, the providers are not violating the Internet Bill of Rights in
doing so, because Law no. 12.965/14 does not prohibit the withdrawal of content in
those terms. This is not to say that the provider cannot abuse its position and actively
filter or block content in ways that unduly restrict freedom of expression.
In these cases, it will be relevant to weigh the reasons that led to the blocking, filtering or spontaneous removal of the content (did the provider seek to avoid damages derived from that content?) against the impact its implementation has on freedom of expression. Given that Article 19 exempts providers from liability, with the aforementioned exceptions, providers should treat freedom of expression as core to their activities and only take measures to filter, block or remove content in cases supported by obvious and evident reasons.
Lastly, it is important to clarify that, while for most application providers the Internet Bill of Rights innovates by ensuring that liability arises only upon failure to comply with a court order, a different treatment of the so-called “search providers” has been established by case law. The Superior Court of Justice has determined that Google, as a keyword search engine, will not be liable for content resulting from searches carried out by its users.40
Xuxa Meneghel, a famous television presenter, sought to compel Google to
remove from search results the expression “xuxa pedófila” or any other expression
or phrase that associated her name with any criminal practice. The Superior Court
of Justice (STJ), however, dismissed such request:
6. Search providers are not required to remove from their system the results derived from
searching for a particular term or expression, or results that point to a specific photo or text,
regardless of the URL of the page where it is inserted.

7. It is not possible, under the pretext of hampering the propagation of illegal or offensive content on the web, to suppress the community’s right to information. Having weighed the rights involved and the potential risk of violation of each one of them, freedom of information should be guaranteed, in accordance with Article 220, §1° of the Federal Constitution, especially considering that the Internet represents today an important mass media vehicle.41

Thus, freedom of expression, as seen in the Argentinean precedent, exerts a special influence on the providers’ liability regime established by the Internet Bill of Rights.

7 Freedom of Expression and Copyright

The second paragraph of Article 19 refers to the civil liability regime for copyright
infringement as follows:
Article 19, Section 2. This article will apply to violations of copyright and related rights
only when specific legislation to that effect is adopted; the legislation, when adopted, must
respect the freedom of expression and other guarantees provided for in Article 5 of the
Federal Constitution.

It is worth noting that the aforementioned Article states that the subject of copyright falls beyond the limits of the Internet Bill of Rights, recommending, nevertheless, that whatever solution is applied should “respect the freedom of expression”.

40 See: Superior Court of Justice (STJ) – Special Appeal Nr. 1,316,921/RJ, Reporter Justice Nancy
Andrighi, 26.06.12.
41 See: Superior Court of Justice (STJ) – Special Appeal Nr. 1,316,921/RJ, Reporter Justice Nancy

Andrighi, 26.06.12.

This recommendation has its roots in a long debate about the exercise of copyright
and how its implementation in recent decades has given rise to a series of legisla-
tive changes, especially in the criminal field, seeking to consolidate a broad legal
apparatus to hold accountable those involved in copyright infringements.
Two movements emerge in this scenario: on the one hand, a growth in the debate
on piracy and on how copyright infringements must be treated in a rigorous way;
on the other hand, the understanding that not only copyright infringements should
be analyzed, but also the behavior of authors and copyright owners, since they may
abuse the exercise of this right.42
Therefore, a need arises to recognize the role that freedom of expression plays
in this debate and how the design of one or another form of copyright exercise can
impact the expression of thought.
By guaranteeing the author and the copyright holder the possibility of preventing the unauthorized use of an intellectual creation, this right may evidently collide with other rights, in particular with freedom of expression. The publication of a critique, access to documents for research, the creation of a parody, the citation of an artistic work, among several other situations, are examples in which the tension between respect for copyright and the expression of thought becomes evident.
The second paragraph of Article 19, therefore, can be interpreted as an imposition: whatever solution is found for determining liability for damages caused to copyright, it should refrain from adopting a system which ignores the expression of thought. Freedom of expression, notwithstanding the solution applied, should always be the core reference.

8 Conclusion

The five roles of freedom of expression in the Brazilian Internet Bill of Rights clearly demonstrate the importance of this right for comprehending what Law no. 12.965/14 represents to the current legal system. Recognized internationally as a positive step towards the protection of fundamental rights in the worldwide network of connected devices, its future is challenged by the different interpretations of its principles and foundations.
In this respect, the Internet Bill of Rights’ emphasis on freedom of expression responds to both technical and political reasons. The balance sought by its implementation is complex and will depend upon the Judiciary making use of the almost seven years of academic research, network activism and legislative work dedicated to its elaboration.
The Internet will certainly pose unimagined challenges for the application of the Internet Bill of Rights. Another certainty is that, regardless of what those challenges will be, having freedom of expression as a guiding principle, not only for the interpretation and application of Law no. 12.965/2014 but for other existing laws applicable to the Internet in the country, is a step in the right direction. The goal is to consolidate Brazil as a leader in Internet regulation based on human rights, one that positively differs from the temptations to curtail free speech. The same Internet that promotes freedom also restricts it. Thus, the future of Law no. 12.965/14 will be better served by actively protecting freedoms and liberties, and promoting rights.

42 See: Lemos (2005); Souza (2005); Pereira de Souza (2013).

References

Aftab S (2022) Online anonymity—the Achilles’ heel of the Brazilian Marco Civil da Internet.
In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)
Capaldi N (1984) Da Liberdade de Expressão: uma antologia de Stuart Mill a Marcuse. FGV, Rio
de Janeiro
Casanova L (1875) Del Diritto Costituzionale. Eugenio e Filipo Cammeli, Florença
Cerf V (2012) Internet access is not a human right. New York Times, 4 January 2012, http://www.
nytimes.com/2012/01/05/opinion/internet-access-is-not-a-human-right.html?_r=0. Accessed 11
Jan. 2022
Farias E (2004) Liberdade de expressão e comunicação—teoria e proteção constitucional. Revista
dos Tribunais, São Paulo
Ferrara F (1987) Interpretação e aplicação das leis. Saraiva, São Paulo
Godoy CLB (2001) A liberdade de imprensa e os direitos da personalidade. Atlas, São Paulo
Gonçalves NPS (2014) Liberdade de Expressão e Estado Democrático de Direito. In Clève, CM
(ed) Direito Constitucional Brasileiro, vol I. Revista dos Tribunais, São Paulo, pp 391–405
Hartmann IA (2022) Self-regulation in online content platforms and the protection of personality
rights. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Haddad JG (2000) Liberdade de pensamento e direito à vida privada. Revista dos Tribunais, São
Paulo
Lemos R (2005) Direito, Tecnologia e Cultura. Editora FGV, Rio de Janeiro
Luño AEP (2002) Teoría del Derecho. Tecnos, Madrid
Marinoni LG, Mitidiero D, Sarlet IW (2014) Curso de direito constitucional, 3rd edn. Revista dos
Tribunais, São Paulo
Moraes A (2002) Constituição do Brasil interpretada. Atlas, São Paulo
Olson P (2014) We are anonymous: inside the hacker world of LulzSec, Anonymous, and the global
cyber insurgency. Back Bay Books, New York
Pavli D (2014) Case watch: top argentine court blazes a trail on online free expression. Open
society foundations. https://www.opensocietyfoundations.org/voices/case-watch-top-argentine-
court-blazes-trail-online-free-expression. Accessed 11 Jan. 2022
Pereira SCA (2013) Abuso do Direito nas Relações Privadas. Elsevier, Rio de Janeiro
Rodrigues LB (1958) A corte suprema e o direito constitucional Norte-Americano. Forense, Rio de
Janeiro
Sarmento D (2002) A ponderação de interesses na Constituição Federal. Lumen Juris, Rio de Janeiro
Sarmento D (2013) Comentário ao artigo 5º, IV. In Gomes Canotilho JJ, Mendes GF, Sarlet IW and
Streck LL (eds.) Comentários à constituição do Brasil, Saraiva/Almedina, São Paulo
Sartre JP (1970) L’Existentialisme est un humanisme. Nagel, Paris
Scalisi A (1990) Il valore della persona nel sistema e i nuovi diritti della personalità. Giuffrè, Milão
Schreiber A (2022) Civil rights framework of the internet (BCRFI; Marco civil da internet): advance
or setback? Civil liability for damage derived from content generated by third party. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)

Schulz W (2022) Regulating intermediaries to protect privacy online—the case of the German
NetzDG. In: Albers M and Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Silva JA (1999) Curso de direito constitucional positivo. Malheiros, São Paulo
Souza AR (2005) A função social dos direitos autorais. Faculdade de Direito de Campos, Campos
dos Goytacazes

Carlos Affonso Pereira de Souza Professor of Law at the State University of Rio de Janeiro
(UERJ). Director of the Institute for Technology and Society of Rio de Janeiro (ITS Rio). Main
areas of research: Internet governance and regulation. Selected Publications: O Futuro foi Repro-
gramado: como a tecnologia está transformando as leis, a política e os relacionamentos, Rio de
Janeiro: Obliq, 2018; Abuso do Direito nas Relações Privadas, Rio de Janeiro, Elsevier, 2013;
(Coord) Marco Civil da Internet: Jurisprudência comentada, São Paulo, Revista dos Tribunais,
2017.

Beatriz Laus Marinho Nunes Intellectual Property Specialist (PUC-Rio). Research Project: O
Impacto da Impressão 3D no Regime da Propriedade Intelectual e na Indústria da Moda, PUC-
Rio, 2019. Main areas of research: Intellectual Property, 3D Printing and Fashion Law. Freelance translator of articles and other documents both related and unrelated to Intellectual
Property and Law. Selected Publications: (Coauthor) ‘Copyright Limitations in Brazil’, Rout-
ledge, upcoming, 2021, p.169-175; (Translator) ‘Statute of Limitations and International Arbitra-
tion’, CBAr, 2019; (Co-authorship) Marco Civil da Internet: Jurisprudência comentada, São Paulo,
Revista dos Tribunais, 2017.
Civil Rights Framework of the Internet
(BCRFI; Marco Civil da Internet):
Advance or Setback? Civil Liability
for Damage Derived from Content
Generated by Third Party

Anderson Schreiber

Abstract This article examines the controversies involving the clash between
personality rights in cyberspace and freedom of expression, especially in relation to
the new rules of the Brazilian Civil Rights Framework of the Internet (Law 12.965)
and its consequences to the problem of civil liability of Internet providers in connec-
tion with harmful content generated by third parties. The regime is extremely restric-
tive, which represents an undeniable setback when compared to the path that was
being trod by Brazilian case law on this matter. In this context, this article analyses
the “specific court order” requirement mentioned by Article 19 of the Civil Rights
Framework of the Internet to set the civil liability of Internet providers in contrast
to the general discipline of civil liability. The compatibility of Article 19 with the
Brazilian Constitution is also discussed, in light of the fundamental rights of the
human being in cyberspace.

1 Introduction

Iracema Cristina, a psychologist at a large company, had her name included on a romantic dating website. Next to her full name and actual business telephone number, the website conveyed the following information about Iracema: “person proposing to engage in affective and sexual activities in exchange for money”.1 To get her name removed from the website, Iracema needed to file a lawsuit, in the course of which she stated, in a personal deposition, that she feared losing her job on account of the embarrassing exposure to which she was being subjected.
False profiles and other types of disclosure of untrue information may cause
irreparable damage to persons victimized by their use. The disclosure of false or

1 Superior Court of Justice—STJ—Special Appeal No. 566468/2004, Reporter Justice Jorge Scartezzini, 11.23.04. Plaintiff’s name was changed to prevent her identification.

A. Schreiber (B)
Faculty of Law, Rio de Janeiro State University (UERJ), Rio de Janeiro, Brazil
e-mail: schreiber@schreiber.adv.br

© Springer Nature Switzerland AG 2022 241


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_10

defamatory information in the virtual universe produces quite real effects, ranging from mere mistrust to the cooling of professional and personal relationships and extending, at times, to more concrete measures, such as losing a promotion, being passed over in an interview, dismissal or even the breakdown of affective relationships. Victims are frequently marked by the episode and start to be seen as careless, uninterested persons, even when they in no way contributed to the outcome. The “there is some truth in it” argument, so common in Brazilian society, is not easily dissipated, and suppression of the false or defamatory content from the Internet proves to be only the first step on a long path to be traveled by the victim in restoring her reputation.
As if this were not enough, this first step meets enormous resistance.
Removal of damaging information—even in cases where it proves to be
flagrantly untrue—is a measure avoided by many companies that exploit the field
of social networks and of relationship sites. The case of Adriana Valença appears to
be emblematic. She saw real photos taken for a portfolio (“book”) end up on the network, in a
false profile created on Facebook, which, next to her cell phone number, contained
the information that Adriana was a “call girl”. Adriana started to receive telephone
calls from “customers” interested in hiring her alleged services, and those who knew
her began to view her with suspicion. She tried, in every way, to get the false content
taken down, sending emails to the social network, but affirms that she did not
even receive an answer. Her last resort was to go to court, with all the costs and
expenses involved.2
Such lawsuits, which might seem simple and even unnecessary at first glance, are
often converted into endless legal battles. The need to defend one’s honor frequently
meets a major obstacle, one that has become a veritable banner of contemporary society:
freedom of expression in the virtual universe.

2 Freedom of Expression in the Virtual Universe. Social
Networks and Hate Speech. Technological Determinism
and the Obligation of Law. Civil Liability for Damage
Derived from Content Generated by Third Parties

The Internet is usually seen as an ally of freedom of expression. Its capacity to
expand the reach of individual expression is frequently cited as an incentive
to the free circulation of ideas. Social networks, for instance, as many technology
theorists claim, would be creating a new public space, where free speech would
tend to reach almost Arcadian levels. The Internet would thus represent a renewed

2 Court of Justice of the State of São Paulo [TJSP]—Civil Appeal No. 0173842-
95.2012.8.26.0100/2014, Reporter Associate Justice Beretta da Silveira, 01.21.14. See also
the article: Oliveira (2014). Plaintiff’s name was also changed here to prevent identification.
Civil Rights Framework of the Internet … 243

hope for the achievement of democracy, by creating an environment fully open to
ethical, cultural, political and many other types of discussion.3
A quick visit to the most accessed social networks in the world (Facebook, Twitter
etc.) reveals a less enthusiastic reality. Far from being an idyllic forum of debate,
what one sees there, more frequently, is a parade of unilateral pronouncements that do
not amount to effective dialogue. The messages published on social networks
very often end up assuming a unilateral, promotional nature, whose objective is the
self-affirmation of the identity created by the issuer. These messages are about
as open to debate as “iconoclastic messages glued to car windows”.4 The public,
in turn—composed of “friends” or “followers”, of “friends of friends”
or merely of people “you may know”—normally performs a more passive role,
almost always limited to minimalist acts such as “liking” or “sharing”, and only
seldom prepared to discuss, in fact, the ideas transmitted. For some thinkers, the
public of the Internet is not constructed and sought out as an effective interlocutor, but very often
represents a mere instrumental certifier of the existence of the issuer of the message
and of its own capacity to “publish” its opinion. In the words of Francis Jauréguiberry,
“others are sought with the sole purpose of attesting, encouraging and cajoling their
virtual Internet users”.5
The envisaged performance of these social networks as a free space for the discussion
of ideas is thus somewhat frustrating. Their promised contribution to politics—in its
genuine sense of discussion of subjects related to citizens and the exercise of citizenship—is
small or, in the opinion of some authors, even negative. These authors
argue that the spasmodic issuance of “political opinions” on social networks
ends up emptying the desire to change the real world, through a kind of public
cybernetic relief. Thus, as Thomas Frank warns, “politics basically becomes an
exercise of individual self-therapy, an individual fulfillment, not an effort geared
at the construction of a movement”.6 The technology that promised to intensify
democracy is in danger of converting itself into a mechanism for preserving
the status quo. One sees here the risk of what Zygmunt Bauman refers to as an
“honorable renouncement of the politics of what is real”:
Servers swallow and store the milestones of dissent and protest so that liquid-modern
politics can go ahead without suffering influences or interruptions – substituting confronta-
tion and argument with sentences out of context and opportunities for photos, (...) when the

3 Accordingly, see Lévy (1999), p. 167: “Cyberspace, interconnection of the planet’s computers,
tends to become the main infrastructure of economic production, transaction and management. Soon,
it will be the main piece of international collective equipment of memory, thought and communi-
cation. In short, within a few decades, cyberspace, its virtual communities, reserves of
images, interactive simulations, its irresistible proliferation of texts and signs, will be the essential
mediator of humanity’s collective intelligence.”
4 Bauman (2008), p. 138.
5 Still in the author’s words: “In searching for successful self-identification, self-manipulative indi-

viduals maintain quite an instrumental relationship with their interlocutors. The latter are only
admitted to attest to the existence of the manipulator—or, more exactly, to allow manipulators to
make ‘their virtual counterparts’ face reality.” (Jauréguiberry (2004), p. 6).
6 Frank T (2004) Le Marché de Droit Divin. Paris, Lux, apud Bauman (2008), p. 138.

dissent travels in the direction of electronic warehouses, it is sterilized, neutralized and made
irrelevant (...) Real politics and virtual politics run in opposite directions, and the distance
between them grows in proportion as the self-sufficiency of each benefits from the
lack of the other’s company.7

Apart from these concerns of a more generic and abstract nature,
it seems undeniable that virtual communication frequently assumes a self-centered
character, which ends up, in many cases, leading to radicalization and extremism. In
the Brazilian scenario, for instance, the great volume of offensive pronouncements
(including cursing and other impolite behavior) present on social networks
cannot be disregarded. Nor should one ignore the role of virtual communication in
the exacerbation of the clashes that surrounded the Brazilian presidential election
of 2014. In that election, especially in the second round, the aggressiveness of the
virtual pronouncements made by supporters of both candidates—and even by persons
who appear to be little engaged in real life—was raised to levels of real danger.
On the one hand, it is true that, one year before those elections, the social networks had
been praised as a locus of political mobilization, for having served as an important
instrument in the demonstrations of June 2013. On the other hand, it is also true
that such demonstrations were characterized precisely by the absence of defined
demands, joining persons more around a general and diffuse feeling of dissatisfaction and
rebellion than around practical objectives to be achieved. Hence, perhaps,
the involuntary association of these demonstrations with vandalism, which erupted
within them as a distinctly minority but constant element (vandalism that, like every explosion
of violence, reflects the raw feeling of not succeeding in “communicating” through
actual projects), and the general sensation that, at the end of the wave of mobilization,
“nothing happened”, “nothing changed”, given the absence of concrete consequences,
perceptible and proportionate to the grandiosity of the gesture with which “the
giant woke up”. The mobilizations stopped as suddenly and abruptly as they started, as
if they had been “unplugged” or “disconnected” from the network, with the speed
typical of “clicks”. This confirms, in a certain way, the theory of Bauman, for whom “the
‘network’ seems, disturbingly, a sand dune blown by the wind and not a jobsite where
reliable social ties may be established”8—or, one should add, where a political freedom
geared to obtaining actual transformations in society may be exercised.
The problem is not limited, however, to the absence of concrete effectiveness
of this much-discussed freedom of expression in the virtual universe; it reaches the
very essence of freedom of expression, insofar as new forms of communication on the
Internet, while apparently encouraging the exercise of such freedom, in increasing
measure also oppress it. This is not a paradox. Extremism and radicalism—derived
from the increasingly individualistic character of these new communication
environments—often drift into the verbal aggression, stigmatizing labels and
hate speech that are scattered through the network. The idea that the Internet is a
space of utmost freedom—immune, due to its absence of geographical base, to
regulatory or governmental controls—contributes, to a certain extent, to new forms

7 Bauman (2008), pp. 139–140.


8 Bauman (2008), p. 137.

of oppression, such as virtual bullying and the so-called online hate speech, revealing
what has been called the “dark side” of social networks: their increasing role in the
propagation of hate.9
Given these new forms of oppression, there are those who acquiesce, saying that
the Internet is simply like that10 ; those who do not like it should keep their distance from the
virtual world. Virtual abstinence, however, is not a concrete alternative for the new
generations, who do not restrict themselves to using social networks merely out
of predilection. The new generations need to access them also to obtain utilities
that transcend leisure, such as access to notices from class
representatives at school and university, or to information on events geared to
young audiences, which, very often, are divulged exclusively via social networks. Virtual
abstinence is not a satisfactory solution insofar as, although less violent than virtual
aggression, it equally represents a form of exclusion and, therefore, of annihilation of
freedom of expression, among other freedoms.
Therefore, the only way is to apply rules which ensure that freedom of expression
is not exercised against itself. As has already been said in relation to
freedom of contract,11 freedom of expression is “autophagic”, in the sense that, in
any environment where there is no equality of forces, the freedom of expression of
the stronger tends to subjugate the freedom of expression of the weaker. In unequal
scenarios, the absence of rules does not normally result in greater freedom but in a
mere semblance of freedom, to the extent that regulatory omission benefits
only those who, having greater economic and technical might, see themselves finally
free to pursue their own interests without needing to respect rules established in the
interest of society as a whole.
It is necessary to understand that the virtual environment, at least in its current
design, is not a paradisiacal locus for the meeting of individuals prepared
to freely debate their ideas, but is, before anything else, a space of market
activity. As Marvin Ammori recalls, the members of the legal departments of compa-
nies such as Google and Twitter “have business reasons for supporting free expres-
sion.”12 Relationship sites and social networks are quite a successful business model,
which, under the appearance of almost casual entertainment, hides an industry of
9 Clarke (2013).
10 An example of this attitude is the regrettable decision of the 4th Panel of the Superior Court
of Justice—STJ, Special Appeal No. 844736/2009, Reporter Justice Honildo Amaral de Mello
Castro, 20.27.09, where it was concluded that sending spam after an express request not to receive
it does not characterize moral damage; it is up to the user to contract an antispam service (Justice Luis
Felipe Salomão dissented, with a substantial vote). One of the Justices who accompanied the
prevailing understanding even affirmed: “Spam is something to which the Internet user is submitted.
At this point, I don’t see how we can untie use of the Internet from spam.” (‘4th Panel does not
recognize Non-Pecuniary Damage for sending erotic SPAM to an Internet user’, Superior Court
of Justice—November 03, 2009, https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=200600946957&dt_publicacao=02/09/2010. Accessed 11 Jan. 2022.)
11 Roppo (1988), p. 38.
12 And concludes: “Indeed, all of these companies talk about their businesses in the language of

free speech. Google’s official mission is ‘to organize the world’s information and make it univer-
sally accessible and useful.’ WordPress.com’s corporate mission is to ‘democratize publishing.’

figures significantly higher than those of the traditional media itself. This is an aspect that
cannot simply be disregarded in debates involving the application of legal rules to
the virtual space.
The idea that the Internet must be a space free from the application of
any kind of rule represents, today, an essentially romantic proposal.13 To keep the
Internet beyond the boundaries of the Law means to deliver it to the command of the
market: its development will cease to be guided by legal rules and will drift at
the whim of the interests of big industry, which surely does not correspond to the
vision of the future held by those who advocate utmost freedom in the network.14 To
see the Law as an enemy of freedom is a deep methodological mistake, insofar as
only in a regulated environment can the exercise of freedom take place without the fear
of abuse, which represents its very denial.
Let us go back to the case of Iracema Cristina, who feared losing her job due to
a false profile that defamed her, or of Adriana Valença, who, after being labeled
on the Internet as a “call girl”, came to be viewed with suspicion in her closest social milieu.
The perception of the devastating impact that the transmission of material about a
certain person can produce in her real life is only one of the many circumstances
that confirm the need to apply legal rules to the virtual world. The Internet cannot
represent a zone of irresponsibility in life in society.
This understanding had already been reached by Brazilian courts, before the enact-
ment of the Civil Rights Framework of the Internet, in a series of judicial proceedings
involving victims of content transmitted by third parties on social networks and rela-
tionship websites. To evaluate the treatment given to this matter by the Civil Rights
Framework of the Internet, it is essential to understand the state of Brazilian case law
prior to the enactment of the law.

Facebook’s is to ‘give people the power to share and make the world more open and connected’.”
(Ammori (2014), p. 2260).
13 This Romanticism is well represented in the famous Declaration of Independence of Cyberspace,

written in 1996 by John Perry Barlow, who, among other points, affirmed: “We are creating a world
that all may enter without privilege or prejudice accorded by race, economic power, military force,
or station of birth. We are creating a world where anyone, anywhere may express his or her beliefs,
no matter how singular, without fear of being coerced into silence or conformity. Your legal concepts
of property, expression, identity, movement, and context do not apply to us. They are all based on
matter, and there is no matter here.” (Barlow (1996)).
14 In this respect, it is interesting to note how liberal thinkers, who defended the absence of
state regulation of the Internet, have started to denounce as a kind of “censorship” the
norms created by the virtual communication companies and applied purely privately, without the
disclosure and transparency which characterize state legislation. See, on this thought-provoking
theme, Heins (2014).

3 Position of Brazilian Case Law Prior to the Civil Rights
Framework of the Internet. The Issue of Identification
of a Third Party. The Importation of the Notice
and Takedown

Civil liability for the transmission on the Internet of content generated by third parties
was a theme that gave rise to intense debates in Brazilian courts prior to the enactment
of Law 12.965, known as the Civil Rights Framework of the Internet. While victims
of untrue or defamatory content sought reparation, the business companies that owned
social networks and relationship sites maintained that they could not be held liable
for content inserted by other users, as it would be impossible for them to monitor all
the material inserted on those sites.
Although the case law scenario was still quite diversified and one could not speak
of unanimous or consolidated positions, examination of Brazilian court
judgments revealed a clear path toward overcoming the theory of no liability of the busi-
ness companies that owned social networks and relationship sites. Such
owners—which, in an effort to escape liability, prefer to call themselves mere
“administrators”, but which are actually true owners of the brand, of the web address,
of the advertising space and of all else that makes up the social network—had been
suffering, throughout Brazil, convictions for damages resulting from content posted
by users of their sites. Such decisions were based on two different legal foundations.
On the one hand, the defect in the service provided (Consumer Defense
Code, Article 14)—a foundation which depends on the characterization of the rela-
tionship between users and social networks as a consumer relationship.15 On
the other hand, the characterization of the operation of
the social network as a high-risk activity (Civil Code, Article 927, sole paragraph),
in view of the high potential for damage inherent in the creation of a space where
the content inserted assumes a public dimension, without any kind of prior filtering.16
Without ignoring or disregarding the technical difficulty of monitoring all the
content posted, the Brazilian courts had been pointing to measures that could be
adopted by the owners of social networks to prevent or attenuate the damage caused,
such as the identification of the individual who divulges false or defamatory
content. In a case involving a victim of a false profile on Orkut, the Court of Justice
of Rio de Janeiro concluded:
Even if one considers the difficulty of inspecting the contents of all that is posted on the
pages of Orkut, as the defendant company maintains, it is possible to verify whether the information
15 Accordingly, the 3rd Panel of the Superior Court of Justice—STJ had already decided, in 2012,
that: “The fact that the service provided by the Internet service provider is free of charge does not
negate the consumer relation, as the term ‘by remuneration’, contained in Article 3, §2, of the
CDC [Consumer Defense Code] must be interpreted broadly, so as to include the indirect gain of the supplier”
(Superior Court of Justice—Special Appeal No. 1308830/2012, Reporter Justice Nancy Andrighi,
05.08.12).
16 See, for illustration, Court of Justice of the State of Rio de Janeiro (TJRJ)—Civil Appeal No.

0006047-50.2009.8.19.0040/2009, Reporter Associate Justice Benedicto Abicair, 12.01.09.



has grounds, as, in fact, was done after the submission of this appeal (...) so, if the defendant
has the means, as it proved albeit belatedly, to identify the offender, and did not do
so, it will answer for the latter’s anonymity, the duty to compensate the damage sustained
being clear.17

In another case of a false profile, the victim was described, next to her true telephone
number, as a woman “in her 40s, craving sex”. Here, the
same Court of Justice of Rio de Janeiro went further, concluding that the
identification of the third party divulging the damaging content does not release
the company that owns the website from liability:
The indication, by the defendant, of the user who promoted the inclusion of said false profile
does not remove its liability, which, in the case, is objective. It is up to the supplier to
develop protection mechanisms with a view to preventing fraud, especially when occurrences,
such as that described in these records, become frequent, removing from them the nature of an act
of God.18

Generally, our courts had been considering that the business companies that
create and operate social networks must be held liable for the damage
caused to the victims of damaging content. Not only because they provide, as an
aspect inherent to their activity, a space to spread the messages of their users, but also
because they obtain economic gains from the direct or indirect exploitation of this
communicative space. In this sense, it is worth highlighting the tone given to the
matter by the Superior Court of Justice:
Those who make it technically feasible, who benefit economically from it and who actively
encourage the creation of communities and relationship pages on the internet are as respon-
sible for the control of eventual abuse and for the guarantee of the personality rights of
internet users and third parties as the very internet users who disseminate information
offensive to the most basic values of life in community, whether real or virtual.19

17 Court of Justice of the State of Rio de Janeiro—Civil Appeal No. 2009.001.14165/2009, Reporter
Associate Justice Alexandre Câmara, 04.08.09. Excerpt extracted from the vote of the Reporter,
accompanied by his peers.
18 Court of Justice of the State of Rio de Janeiro—Civil Appeal No. 2009.001.41528/2009, Reporter

Associate Justice Ernani Klausner, 03.09.09.


19 Superior Court of Justice—Special Appeal No. 1.117.633/2010, Reporter Justice Herman

Benjamin, 03.09.10. Note that the situation is analogous to what already happens outside the virtual
world in the provision of services that may be manipulated by third parties to generate damage
to users, as in the case of registration with credit protection services for debt contracted with false
documents, a case which the STJ has already decided falls within the sphere of responsibility of the
contracting financial institution, even if the institution only belatedly becomes aware of the
falsification of documents: “The action of unscrupulous persons specialized in using
the name and taxpayer identification number [CPF] of a deceased person to acquire
a credit card and use it until it is suspended for default on invoices has become commonplace. (…)
The credit card administrator, which normally executes its contracts by phone or Internet, without
requiring the physical presence of the consumer who is the user of the credit card, only becomes
aware of the fraud when it initiates extrajudicial collection procedures. The case law of this
Court is settled to the effect that the undue inclusion of the name of a consumer in credit protection
bodies produces non-pecuniary damage, generating an obligation to indemnify for those who
make the registration.” (Superior Court of Justice—Special Appeal No. 1.209.474/2013, Reporter
Justice Paulo de Tarso Sanseverino, 10.09.13).

Even the court judgments that gave weight to the technical difficulty of monitoring
the content posted on social networks did not conclude, for the most part,
in favor of the lack of liability of the companies. On the contrary, they favored a
kind of conditioned liability, triggered only from the moment when, informed of the
existence of the damaging material, the companies failed to adopt measures to remove
the offensive material from their websites. Note, in this direction, the following court
decision rendered by the 4th Civil Chamber of the Court of Justice of Rio de Janeiro
in another case dealing with a false profile on Orkut:
In the case in question, and on a first analysis, I share the understanding of the learned judge
as to the impossibility of the host provider duly reviewing in advance the 40
million pages existing on Orkut. This on a first analysis. However, if this is so, there is no
doubt that, when the exclusion of the profile is requested, the provider is obliged to exclude
it if it is false and offensive to the honor of the person portrayed.20

In this indirect way, the theory of notice and takedown began to enter
Brazilian reality. Inspired by the Digital Millennium Copyright Act, this theory
emerged in copyright law to create a kind of exemption from liability for violation
of copyright on the Internet, ensuring immunity to providers who readily complied
with the notification by the offended party to withdraw the improper material. With
the notification, the controversial duty of permanently monitoring the network is
transformed into a specific obligation to act, which can no longer be refused based
on the argument of the practical unfeasibility of monitoring and which, if complied with,
exempts the notified party from civil liability.21
The importation of the theory of notice and takedown into the field of civil liability
for damage resulting from content generated by third parties would represent, from a
certain angle, a rupture in the Brazilian civil liability system. The liable party would
only be considered as such if, after being informed, it failed to act to prevent
the perpetuation of the damage. This would be a kind of ex post civil liability, subse-
quent to the damage, geared to preventing the damage from being propagated. It is
evident that, in practice, such importation would mean that the damage sustained by
the victim during the period prior to the notification would remain unredressed
(or could only be redressed by the third party who generated the content,
almost always anonymous or, even when identified, impossible to locate or
incapable—legally or economically—of bearing the indemnity, or technically inca-
pable of adopting any other measure which could mitigate the effects of the damage
sustained by the victim). The notice and takedown would establish, to this effect, a
kind of “immunity” of the owner of the website until the time of notification, leaving

20 Court of Justice of the State of Rio de Janeiro—Civil Appeal No. 2008.001.04540/2008, Reporter
Associate Justice Horácio dos Santos Ribeiro Neto, 25.03.08.
21 The Digital Millennium Copyright Act (Public Law No. 105-304/1998) regulates in detail, in
its Title II (referred to as the Online Copyright Infringement Liability Limitation), the procedure of
notification and counter-notification, in addition to the measures which must be followed by providers
to be entitled to the limitation of liability. See particularly Section 202, which brings substantial
modification to § 512 of Chap. 5 of Title 17 of the United States Code, the compilation of federal rules
of a general and permanent nature.

without reparation at least part of the damage suffered by the victim,22 which could
raise allegations of affront to the principle of full reparation.
On the other hand, the practical effects of the importation proved to be promising.
The promised immunity would at least encourage more proactive behavior by the
owners of social networks, which would have, at the time of the notification, the
opportunity to evaluate the content posted by the third party and decide whether or
not to adopt measures for its removal from the website (as the majority of
such companies already do in relation to child pornography). It would contribute to a
healthier virtual environment, respectful of the fundamental rights of the human
being, without the need to force the victim to resort to the Judiciary, which, in
addition to being costly, takes time incompatible with the rapid diffusion of
offensive content through the virtual world. This “stimulus” to proactive behavior
did not prove, however, free from controversy. In its homeland, notice
and takedown is criticized for a kind of “chilling effect” that its abusive use may
have on the exercise of free expression.23 Such criticisms, however, are normally
linked to notifications based on copyright protection, which, by their very
nature, end up playing a defensive role for the entertainment industry, undermining
forms of artistic expression typical of the virtual environment, such as
mélanges, sampling and pasting. In matters of protection of the fundamental rights
to honor, privacy and image, the argument of the “chilling
effect” on free expression is not only less usual, it is also less convincing, at least
in the largest portion of concrete cases, which involve unauthorized disclosure of
personal messages, discriminatory messages, incitement to hate, coarse cursing and
other situations in which the exercise of free expression proves to be clearly abusive.
In any case, the notice and takedown mechanism itself, when adopted by
a legal order, must contain safeguards to prevent its abusive use and ensure its
effective operation. In the United States, for instance, the notification must meet
minimum requirements (identification of the content that violates copyright, contact data
of the notifying party, etc.), and there are express provisions for
counter-notification by the alleged violator of copyright, in addition to deadlines for the

22 Check, in this respect in particular, the prevailing vote in a decision of the Court of Justice
of Rio de Janeiro, in which it was affirmed that the doctrine of notice and takedown departs
from Brazilian legal tradition, for which “damage happens at the time of publication, the theory of
the learned prevailing vote that the plaintiff should first request the removal of the page not being
valid, as this mere step does not elide the loss already sustained. There is not in our Law an offense
which is not subject to indemnification” (Court of Justice of the State of Rio de Janeiro—Civil
Appeal No. 2008.001.56760/2008, Reporter Associate Justice Otávio Rodrigues, 12.03.08).
23 See https://lumendatabase.org/, official page of the Lumen Project, an initiative developed jointly

by the Electronic Frontier Foundation and prestigious American universities (Harvard, Stanford,
Berkeley etc.) with the declared objective of enlightening the public and preventing the legislation
of the United States (especially the system of notice and takedown) from being used abusively to
“cool off” the exercise of freedom of expression on the Internet. (Lumen is a Project of the Berkman
Klein Center for Internet & Society at Harvard University. [website], 2017, https://lumendatabase.org/pages/about. Accessed 11 Jan. 2022.)

action of the site owner.24 The danger of the case law incorporation of notice and
takedown in Brazil lies precisely in its “incorporation by halves”. As often happens among
us, especially in matters of civil liability, the “idea” of the institution was incorporated
without prior understanding and consequent acceptance of its multiple aspects.25
The restricted space available to a judge in a concrete case did not permit detailed development
of the institution and, thus, the notion of notice and takedown started to make its
way into our case law, always with the best of intentions, but in a somewhat risky
way. An essentially procedural mechanism started to appear in our judicial decisions
without a regulated procedure, without provision for counter-notification and for the
other guarantees that surrounded it at its origin. The result is a deformed version
of the original doctrine, supported more on the argument from the authority of the US
experience than on a proper understanding of that experience and its adjustment to
the naturally diverse and peculiar Brazilian scenario.
The imminent discussion of the Bill of the Internet Civil Rights
Framework promised, however, to remove these risks. What one expected from the
Legislative Branch at this point was that it act in an impartial and efficient way,
detailing the operation of notice and takedown so as to create an effective mechanism
of conflict resolution for the Internet in Brazil. Unfortunately, what ended up
happening was exactly the opposite.

4 Article 19 of the Civil Rights Framework of the Internet:
Obvious Setback. Entrenching of Protection. Neither
Notice, nor Takedown. Importation
or Misrepresentation?

Instead of regulating notice and takedown, by establishing reciprocal guarantees and
ensuring the efficiency of its operation, Law 12.965, of April 23, 2014—known as
the Civil Rights Framework of the Internet—established an extremely rigid mechanism,
which creates intense protection for the business companies that operate social networks
and reduces the degree of protection which had already been established by Brazilian
case law for Internet users. See the wording of Article 19 of Law 12.965:
Art. 19. In order to ensure freedom of expression and prevent censorship, the provider of
Internet applications can only be subject to civil liability for damages resulting from content
generated by third parties if, after a specific court order, it does not take any steps to, within
the framework of their service and within the time stated in the order, make unavailable the
content that was identified as being unlawful, unless otherwise provided by law.

The wording of the provision already reveals to which side of the scales
its content leans. In a section entitled “Civil Liability for Damages Arising

24 See U.S. Code, Title 17, Chapter 5, Section 512, especially Section 512(c)(3)—on the requirements
of the notice—and Section 512(g)(2) and (3)—on counter-notice and time limits.
25 On the issue, reference may be made to Schreiber (2014b), pp. 209–215 and 231–243.
252 A. Schreiber

from Content Generated by Third Parties”, Article 19 begins with an unusual affirmation
of purpose, invoking the values of freedom of expression and prohibition
of censorship, which, in the abstract, would militate against the very idea of liability. The
whole discussion on liability arises exactly when the exercise of freedom of expression
violates the victim’s fundamental rights, proving abusive—because contrary
to the very purpose of free expression—or illegitimate—because, on a balancing of
interests, it encroaches upon the sphere of protection of other rights of equal hierarchy in
that concrete situation. The fundamental rights of the human person (honor,
privacy, image, among others) are also protected by the Brazilian Constitution, on
an axiological threshold no lower than that of free expression. Recalling only “one side of
the coin” at the very beginning of Article 19 represents bad legislative technique
and a worrying sign of what was to come.
After its unusual beginning, Article 19 declares that the “provider of Internet
applications”—thus the Civil Rights Framework refers to whoever offers a “set
of functionalities that can be accessed through a terminal connected to the Internet”
(Article 5, VII), yielding to the already mentioned terminology, which conceals the
proprietary aspect—“can only be subject to civil liability”—the word “only” already
revealing a restrictive conception of its civil liability—for “damages resulting from
content generated by third parties” if, “after a specific court order, it does not take
any steps to, within the framework of their service and within the time stated in the
order, make unavailable the content that was identified as being unlawful, unless
otherwise provided by law.”
The mention of a “court order” deals a death blow to the whole inspiration of notice
and takedown. As already emphasized, the legal limitation on the civil liability of the so-
called “providers of applications” can only be justified as a stimulus to their proactive
activity, capable of preventing the spread of the damage regardless of the time and
cost necessary to file a lawsuit. If the victim of damage to a fundamental right needs
to resort to the Judiciary to obtain a court order directed at the company, then
Article 19 is entirely useless, for the simple reason that the possibility of resorting to
the Judiciary has always existed in Brazilian law, and failure to comply with a court order,
regardless of any consideration of civil liability, constitutes the crime of disobedience
(Article 330 of the Brazilian Criminal Code).
Even worse: in the literal words of Article 19, noncompliance with a court order
becomes a necessary condition for holding providers liable. In this context, the filing
of a lawsuit ceases to be a mere instrument for protecting the victim’s rights and
obtaining reparation for their violation and becomes a sine qua non condition of
civil liability. The victim, who before filed a lawsuit as a last resort to establish the
defendant’s liability, now needs to file the lawsuit and plead the issuance of a specific
court order, so that, only then and only in the case of noncompliance with said
order, may the owner of the site or social network be considered liable. In a
reality increasingly conscious of the overburdening of the Judiciary, Law 12.965
runs against every trend and transforms the judicialization of
the conflict into a measure necessary for the protection of the victim’s rights in the
virtual environment, an environment in which, given its very speed and dynamism, judicial
remedies tend to be less efficient and, therefore, more criticized.

The truth is that, quite contrary to protecting the victim with a more efficient
system of protection of fundamental rights, Article 19 protects the companies that
operate the network, as it further requires that the court order be “specific”—giving room
for allegations of lack of specificity, which would authorize noncompliance26—and
restricts the need for compliance to measures that must be adopted “within the framework
of their service”—opening one more door to arguments that would exclude
the need for compliance with the court order, with the aggravating circumstance
that such arguments refer to technical questions which, in this kind of legal action, the
defendants usually know in greater detail than the members of the Judiciary, as they
concern the inner workings of the very business developed by the company.
Perhaps in an effort to soften the flagrant setback, §3 of Article 19 declares that
“The compensation disputes for damages arising from content made available on the
Internet related to the honor, reputation or personality rights, as well as the removal
of related contents by Internet application providers, can be presented to special
small causes courts.” This consolation prize does not remove the inconvenience of
having to appeal to the Judiciary to trigger the civil liability of the
“providers of applications”. Nor does it represent an effective benefit, given that,
even before the publication of Law 12.965, the victims of harmful content already
relied on this possibility, provided they did not plead, on account of
non-pecuniary damages, an amount higher than the ceiling of the small
claims courts. This condition is almost irrelevant anyway since, in Brazilian practice,
indicating the amount of non-pecuniary damage is rare: most
lawyers opt to leave such measurement to the discretion of the judge, avoiding
the effect of the amount claimed on the calculation of court costs and, eventually, of
attorney’s fees. Therefore, §3 of Article 19 does not bring the victims
of damage any benefit that was not already available to them in the scenario prior to
the publication of the Civil Rights Framework of the Internet.
In short, for the human persons using the Internet who may be affected by content
damaging their fundamental rights, Article 19 brings no benefit. Quite
the contrary, it represents an obvious setback compared to the paths that had
been followed by Brazilian case law on this matter. It is a rule that shields the business
companies that operate Internet services, especially through social networks
and virtual communication spaces. Considering the international scenario, this rule
offers much more protection to the Internet industry than the rules of countries with
a long-standing tradition of protecting business interests on the network, such
as the United States. From the perspective of legal technique, there is a deep misrepresentation
of the mechanism of notice and takedown, which ends up sending conflicts
back to an already overburdened Judiciary, distancing itself so far from that institute
that it is impossible to understand how Internet law experts continue to talk
about the incorporation of notice and takedown by the Civil Rights Framework of

26 §1 of Article 19 further expands this room for defense by affirming that the court order
mentioned in the main section of the provision “must include, under penalty of being null, clear
identification of the specific content identified as infringing, allowing the unquestionable location
of the material”.

the Internet. Under Article 19, neither does the notice suffice, nor will it lead to takedown,
because what the provision does is create conditions and obstacles to effective
compliance with the court order. The rule assures the victims of damage less than
they already had under the general system of civil liability and hampers the protection of
fundamental rights, stepping back from the protection courts had already been
ensuring in this field.
The remaining question is the following: with all its flaws, is Article 19 of the
Civil Rights Framework of the Internet merely a bad rule, or is it unconstitutional? The
discussion is not simple from a technical-legal viewpoint, but a deep examination
leads to the conclusion that it is unconstitutional.

5 Unconstitutionality of Article 19 of the Civil Rights
Framework of the Internet. Violation
of the Constitutional Guarantee of Full Reparation
for Damages to Honor, Privacy and Image (Brazilian
Constitution, Article 5, X). Violation of the Principle
of Access to the Justice System (Brazilian Constitution,
Article 5, XXXV). Failure to Observe the Prohibition
of Stepping Backwards (Ratchet Effect). Axiological
Inversion

First, in dealing with content damaging fundamental human rights, Article 19 of
Law 12.965 violates the Constitution of the Brazilian Republic because it conditions
the reparation of damage on the filing of a lawsuit and the issuance of a specific court order.
The constitutional text assures full reparation of moral or property damages resulting
from violation of the intimacy, privacy, honor and image of human persons, in Article
5, item X:
Article 5 (…)
X – intimacy, private life, honor and image of persons are inviolable, the right to
indemnification for material or non-pecuniary damage resulting from their violation being
guaranteed.

Thus, the establishment by the infra-constitutional legislator of any condition on
the protection of these fundamental rights cannot be admitted, as has already been
held by our courts when examining international conventions which provide
for tariffed indemnification with fixed amounts, especially in the field
of airplane accidents. The creation, by ordinary law, of conditions or limits on civil
liability for the violation of these rights is unconstitutional, insofar as it would
restrict a protection which the Constitution intends to be full, to the point of
stating it without referring to any further activity by the legislator. Thus, by conditioning
the reparation of damage resulting from violations of the honor,
privacy and image of the human person on the prior filing of a lawsuit, Article 19 of

Law 12.965 affronts Article 5, X, of the Constitution of the Brazilian Republic. The
counterargument that the condition applies only to companies which own sites and
social networks, the victim retaining the right to claim reparation from the third party
who initially disclosed the content, is almost fictional. As already emphasized,
the very identification of the third party depends on the activity of said companies
and, even when identification occurs, the chance of obtaining reparation is not only
reduced by circumstances recurrent in the virtual environment (impossibility of locating
the individual, the user’s lack of means, etc.), but also ends up
limited to indemnification in cash, when those companies have much more
effective technical means to prevent the propagation of the damage (suppression of the
damaging content, unidentification of the victim, etc.).27
Therefore, Article 19 of Law 12.965 violates Article 5, X, of the Brazilian Constitution,
which would suffice to conclude that it is unconstitutional. One can, however,
add other elements which corroborate the divergence between Article 19 of
the Civil Rights Framework of the Internet and the constitutional text. For instance,
the prior filing of a lawsuit and the issuance of a specific court order as
requirements for liability also violate item XXXV of Article 5,28 as the guarantee of
access to the Judiciary, on a substantive reading, consists of a right of the victim, never
a duty. By imposing recourse to the Judiciary as an essential condition for reparation
of the damage sustained, Article 19 of Law 12.965 misrepresents the meaning
of Article 5, XXXV, affronting its substantive dimension.
The provision also violates the so-called “principle of prohibition of stepping
backwards”, to the extent that, by conditioning the protection of such rights on the
receipt of a “specific court order”, it regresses in relation to the degree of protection
already assured by Brazilian case law, which had been holding
defendants liable for such damages if they did not act after any kind of communication
(extrajudicial, therefore, and even electronic).29 The reduction of the degree of
protection of fundamental rights finds an obstacle in said prohibition of stepping
backwards, a notion which has been broadly developed in the field of Public Law,30
but which also applies to Private Law, especially in the Brazilian legal experience,
which has progressively freed itself from the old dichotomy between Public and Private
Law to find in the maximum fulfillment of constitutional values the guiding principle that
unifies its legal system.31

27 On the theme, see item 7, below.


28 Constitution of the Republic, “Article 5. (…) XXXV—the law shall not exclude from appreciation
by the Judiciary damage or threat to a right.”
29 Accordingly, it should be mentioned that Decree No. 7962, of March 15, 2013, which
disciplines electronic commerce, determined that “websites or other electronic means used to offer
or for conclusion of a consumption contract must make available, in a prominent place, and easy to
visualize: (…) II—physical address and email, and other information necessary to its location and
contact.”
30 Derbli (2007).
31 Tepedino (2004), pp. 1–22. Reference may also be made to Schreiber (2013).

The unconstitutionality of Article 19 is also reinforced by a deep axiological
inversion extracted from the provision itself. In fact, §2 of Article 19 creates an
exception to the rule of the caption, determining that the rule of
protection of providers—which limits their civil liability to cases
of noncompliance with a specific court order—does not apply to offenses against
copyright or related rights, which depend on a “specific legal provision”.
See the provisions of §2 of Article 19 below:
Art. 19 (…)
§2. The implementation of the provisions of this article for infringement of copyright or
related rights is subject to a specific legal provision, which must respect freedom of speech
and other guarantees provided for in article 5 of the Federal Constitution.

The approval of the Civil Rights Framework of the Internet created the following
situation in Brazilian law: if the content posted offends copyright or related
rights, the applications provider cannot invoke Article 19. Therefore, the general
rules of civil liability apply. Under them, liability is full—not expressly
limited to situations of inaction after a specific court order—and, at
most, the provider may obtain a judicial interpretation which, based on the case law
prior to the publication of Law 12.965, establishes as the initial milestone of its liability
the moment when, having become aware by any means (extrajudicial notification, electronic
communication, etc.) of the content damaging to copyright or related rights, the
provider fails to act to prevent the spread of the damage. In other words, with the
enactment of Law 12.965, the mechanism of protection of copyright became simpler,
quicker and more efficient than that reserved for the protection of the fundamental
rights of the human being (honor, privacy, image, etc.), which became dependent,
for the very triggering of civil liability, on recourse to the Judiciary and on the
issuance of a specific court order. Here one has a true axiological inversion, to the
extent that copyright and related rights—including, therefore, those of exclusively
patrimonial content—gain a stronger, faster and more effective instrument of protection
than the fundamental rights of the human being, to which the Brazilian Constitution
attributes greater, if not hierarchical, at least axiological importance, as seen from the
express mention of the dignity of the human person as a foundation of the Republic
(Article 1, III, of the Brazilian Constitution). Such inversion affronts the Constitution,
resulting, unequivocally, in the unconstitutionality of Article 19 of the Civil Rights
Framework of the Internet.

6 A Proposal for Salvation: Interpretation in Conformity
with Article 5, X, of the Constitution of the Republic.
Exegesis of Article 21 of the Civil Rights Framework
of the Internet. Identity of Reason. No Need for Judicial
Notice

In short, several paths lead to the conclusion that Article 19 of the
Civil Rights Framework of the Internet is not merely a bad rule, but
an unconstitutional provision, and thus legally null. To save Article 19
from unconstitutionality, it would be necessary to interpret it in conformity with
the full protection of personality rights assured in Article 5, X, of the Constitution.
Such an interpretation could be achieved through a reading that approximates Article
19 of Law 12.965 to another provision of the Civil Rights Framework of the Internet,
namely Article 21. In this rule, the legislator addressed a specific subject: the publication
of material containing “nudity or sexual activities of a private nature”.
Art. 21. The internet application provider that makes third party generated content available
shall be held secondarily liable for the breach of privacy arising from the disclosure of
images, videos and other materials containing nudity or sexual activities of a private nature,
without the authorization of the participants, when, after receipt of notice by the participant
or his/her legal representative, refrains from removing, in a diligent manner, within its own
technical limitations, such content.

Scholars have been studying this provision under the label of “revenge
pornography”, following US sources. The label is improper, as the literal text
of the rule alludes to scenes of nudity and sex regardless of the reason for their
disclosure. The use of the American expression can be forgiven provided that it does
not lead to a restrictive interpretation of the rule. Strictly speaking, Article 21 is
not the epitome of legislative technique, containing its own dose of improprieties—
such as the mistaken use of the expression “secondarily”32—but it has the huge
advantage of having preserved the essential element of notice and takedown: the
extrajudicial nature of the notification.33
And if the protection of the sexual intimacy of the individual, with the consequent
liability of the application provider for damage resulting from exposure of his or her nudity or

32 Civil liability here is not secondary, but the provider’s own and direct, because it derives from
the provider’s failure to act after becoming aware of the fact. The use of the expression “secondary”
must be interpreted as mere reinforcement of the idea that the provider here answers for
content transmitted by a third party; technically, however, its liability is its own and direct, not secondary.
So that there is no doubt: it is not necessary for the victim to file a lawsuit or take any step in
relation to the third party—who, frequently, cannot even be identified. The victim may act directly against the
provider of applications that fails to make the content unavailable, whether to compel it to do
so or to obtain proper reparation for damages resulting from exposure of the material in the period
between the notification and effective removal.
33 Although the legislator was not express with respect to the extrajudicial nature, he alluded here to
mere “notification”; there is no reason for the interpreter to restrict the term to judicial notifications, especially
when comparing the wording of Article 21 with that used in Article 19 of the same Federal Statute No.
12965/2014.

sexuality, can also be triggered by mere extrajudicial notification, it is not possible
to understand why the protection of other attributes of human personality—whose
protection is situated on an identical threshold by the Constitution of the Brazilian
Republic—depends, in the text of Article 19, on a “specific court order”. From the
opposition between Article 19 and Article 21 of the Civil Rights Framework of the
Internet, it becomes possible to see that the civil liability of application providers for the
unauthorized disclosure of a photograph displaying the nudity of a person does
not depend on the filing of a lawsuit, while liability for the disclosure of scenes of invasive medical
treatment, of the amputation of a limb, of physical aggression
suffered, or of other events whose transmission may damage even more severely the
intimacy, privacy or honor of that same person depends on the filing of an action in
court and the issuance of a specific court order. The improper labeling of a certain person
as a “call girl”, as happened in the already mentioned case of the false profile of
Adriana Valença on Facebook, would depend on a specific court order because,
although transmitting false information related to sexuality, it does not characterize
the transmission of a “scene of nudity or of private sexual acts” under the terms of
Article 21.
Apart from the ordinary legislator’s lack of coherence, what these examples
reveal is that, in dealing with fundamental rights of equal hierarchy (and, in said
cases, even the very same fundamental rights), the protection mechanisms must
be identical or, at least, equally efficient, at the risk of establishing a regulatory
differentiation that finds no support in the constitutional text. If sexual intimacy
is protected by mere extrajudicial notification, other forms of intimacy must
be equally protected, as must the other personality rights of the victim. Access
to the same remedies is indispensable to the maximum fulfillment of the constitutional
values expressed in the fundamental rights of the human being.
The salvation of Article 19 of the Civil Rights Framework of the Internet can
only be achieved by an interpretation in accordance with the Constitution of the
Republic which does not require a specific court order, being satisfied with mere
notification whenever the content in question damages personality rights—be it
sexual intimacy, as already recognized by Article 21 of the ordinary law, or another
attribute of human personality that deserves protection in light of the constitutional
text.
It should be noted that the Brazilian Data Protection Law (Law 13.709) was
enacted on August 18, 2018 and will come into full force and effect 18 months
from that date. Although it does not specifically regulate the provider’s liability
for damages resulting from content generated by third parties, the new law is very
protective of privacy, being heavily inspired by the European General Data Protection
Regulation (GDPR). A good example is Article 42 of the law, which establishes
broad liability for violations of personal data protection:
The data controller or operator who, due to the exercise of personal data processing, causes
someone financial, moral, individual or collective damage, in violation of the data protection
law, is obliged to repair it.

It might be possible to interpret Article 19 of the Civil Rights Framework of the
Internet systematically, taking into account the principles of the Data Protection Law
and also its Article 42, in order to establish the liability of the Internet provider
without the need for judicial notification, at least when the content generated by
third parties consists of personal data of the victim. This is a path not yet explored
by Brazilian scholars, and one that deserves deeper study.

7 The Problem of Suppression. Other Remedies Applicable
to the Virtual Environment: Unidentification,
Appropriate Indexation, Contextualization. Right to Be
Forgotten on the Internet

A constitutional interpretation of Article 19 of the Civil Rights Framework of the
Internet allows one to remove the requirement of judicialization, the main obstacle
imposed by the ordinary legislator to the protection of personality rights in the virtual
environment. The construction of an effective system of protection of these rights
depends, however, on certain additional measures, especially the development of a
broad range of remedies that may be placed at the disposal of the victim and of the
applications providers themselves.
The Civil Rights Framework of the Internet establishes, under the conditions
already seen, the final duty to “make unavailable the content that was identified as
being unlawful” (Article 19, in fine). Suppression of the content is doubtless
the strongest measure to prevent the spread of the damage, and a remedy
that proves appropriate to the most frequent situations of publication of damaging
content (unauthorized display of another’s intimacy, discriminatory content, incitement
to hatred, etc.). However, in certain cases—which, in Brazilian reality, represent not the
rule but the exception—suppression may represent interference with a non-abusive
exercise of freedom of expression. For this very reason, Law 12.965 requires, in
Article 20, that suppression be accompanied by the greatest amount of information to
the third party who transmitted the allegedly damaging content.
Art. 20. Whenever the contact information of the user directly responsible for the content,
referred to in art. 19, is available, the provider of Internet applications shall have the obligation
to inform the user about the execution of the court order with information that allows the
user to legally contest and submit a defense in court, unless otherwise provided by law or in
a court order.

While it did not establish the mechanism of counter-notification, as the US Digital
Millennium Copyright Act did, the Civil Rights Framework of the Internet guaranteed,
at least, the necessary information to the third party who sees the content he
previously transmitted suppressed by the applications provider. The third
party is also guaranteed the right to require from the provider the replacement of
the content made unavailable by “a note of explanation or with the text of the court

order that gave grounds to the unavailability”.34 What the Civil Rights Framework
of the Internet did not do—but could perfectly have done—was to provide for remedies
other than the extreme measure of suppression, not as an alternative for the provider
of applications, but as an alternative for the victim, who, in certain cases,
may have no interest in suppressing the material, much less in the whole controversy
that suppression awakens in relation to free speech on the network.
Suppose, for instance, that somebody discloses on a social network image files
portraying a certain person in childhood or adolescence, in some embarrassing
situation, all without authorization of the person portrayed. Suppression of the
material is not necessary to protect the honor of the person portrayed, but it is in his or her
interest to prevent the material from circulating accompanied by mention of his or her
name or identification of his or her face, as is already frequent on social networks
such as Facebook and Instagram. What the victim is interested in obtaining here is
the absence of identification of his or her individuality, without necessarily intending
to suppress the material from the network, material which may, for instance, portray
other people, including the third party himself (think, for instance, of a
photo of a costume party from high-school days). In cases such as this, the third
party has, in principle, the right to disclose the image, which (also) portrays him, and
the mere removal of the victim’s identification may already be a sufficient measure to
protect the victim’s rights.
Another measure the victim may be interested in putting into practice is a simple
adjustment in the indexation of the material disclosed, so as to prevent it from becoming
the main reference, or one of the main references, linked to his or her identity. There is, also,
in some cases, interest in obtaining a fuller contextualization of the material
published, with the addition of information that prevents the event
portrayed from being interpreted apart from its original context. Contextualization is an intriguing
remedy because, unlike the others, it points to a solution containing more,
not less, information on the content transmitted.
The removal of the victim’s identity from the content published by the third party,
the appropriate indexation of this content in relation to the victim’s name, and the
contextualization of the content published by the third party are all measures which could
appear sufficient, in the eyes of the victim, to protect his or her rights. Additionally, such
measures conform perfectly with the best approach to the right to be forgotten, which
can be defined not as a right to “rewrite history”, to change facts or suppress
content published on the Internet, but rather as the right “not to be
pursued for certain facts”; one thus avoids the inappropriate identification of the human
person, which would violate the right to personal identity.35

34 “Article 20. (…) Sole Paragraph. When requested by the user, who provided the content made
unavailable, the provider of Internet applications that carries out this activity in an organized,
professional manner and for economic purposes, shall replace the content made unavailable for a
note of explanation or with the text of the court order that gave grounds to the unavailability of such
content.”
35 On the subject, permit a reference to Schreiber (2014a), pp. 172–174.
Civil Rights Framework of the Internet … 261

The Civil Rights Framework of the Internet would have done well to mention such
remedies (de-identification, appropriate indexation, contextualization of the content,
etc.) as options at the discretion of the damaged party, but it ended up being
exclusively centered on the unavailability of the material, a remedy which is not
always desired by the victim and which may attract intense discussions about free
speech, discussions which, notwithstanding their legitimacy, delay the protection of
fundamental rights that could be effectuated by other, less contentious means.

8 The Question of Own Content. The Nissim Ourfali Case
and the Right to Repentance in the Posting of Own Content

Although the whole set of problems of civil liability in Law 12.965 orbits around
third-party content, the question of the user's own content still assumes legal relevance.
As a succession of concrete cases has demonstrated, the individual who
transmits material concerning himself on the network (photographs, comments, etc.)
can also be a victim of damage resulting from the unrestricted or out-of-context
transmission of this material, or from its use for purposes other than those which
encouraged its initial disclosure.
Take, for instance, the case of the boy Nissim Ourfali. To celebrate his Bar
Mitzvah, his parents hired a company specialized in making films of celebrations. In
the film, the 13-year-old boy appears as the protagonist of a comic Portuguese-language
video clip version of the song "What makes you beautiful" by the group One Direction. The
video clip displays, always with a humorous bias, photos of Nissim's life, including
trips to Israel and to Baleia Beach, on the coast of São Paulo. The film was posted
on YouTube by members of Nissim's family with the purpose
of making it accessible to distant friends and relatives. However, it was discovered
by Internet users and became a craze in the virtual world, amassing an
astonishing number of accesses. Discothèques in Rio de Janeiro and São Paulo started
to play that version of the song, and the video clip became the object of parodies on
the network, including edits with Silvio Santos and Bart Simpson in the role of
Nissim Ourfali.
Nissim's family started to receive daily phone calls from companies and agencies
interested in exploiting his image. To protect himself, Nissim, represented by his
parents, filed a lawsuit before the Court of Justice of São Paulo against the company that
owns the YouTube website, requesting the removal of the video clip based on the
protection of his rights of image and privacy. The first-instance judge dismissed the
request for prior relief, arguing that the material had been spontaneously put on the
website. The full content of the decision reads:
From what I understand of the amendment to the complaint, the video in which plaintiff
participates was posted spontaneously on the public website for sharing videos, the 'private'
mode not being an obstacle for all those who have an account on said website to access it.
From then on, it seems that the video was disseminated on the internet, as
verified by a Google search for the author's name, which returns nothing less than 790
entries relative to it (cf. print attached). In view of this, the technical possibility of removing
from the internet all the access paths which at this point have been established to the
video in question is doubtful. It is certain that uncontrolled dissemination of content is a
characteristic of the internet (of which, perhaps due to the short time of its existence, a
fair share of its users is not aware). So it is that, although the situation in which plaintiff
finds himself comprises a human aspect, I do not see, at this stage, the possibility of establishing
an affirmative covenant for the defendant, especially due to the open content as intended. It
is convenient to commence the adversarial proceedings. Thus, I dismiss the request for prior
relief for the time being.36

Although the argument may seem convincing at first—because the voluntary
placement of a video on a site such as YouTube suggests tacit authorization of what
was portrayed or made available to users of the same site—such cases frequently
involve aspects which the Judiciary cannot fail to consider even though it is a case
of prior relief. First, the video clip in question has as its leading figure a 13-year-old
minor who, besides deserving special protection in light of the Brazilian constitutional
order, is considered in legal terms totally incapable of performing acts of civil life,
including granting authorization to display his image. The act would require
representation, a formality which, due to the Internet's own characteristics, is not
normally requested from people who post videos and photos on collective websites,
nor can it be safely verified through the formalities of access to the network. The very
initiative of the minor in claiming the removal of the material already serves as a sign
that his authorization was not validly obtained, or that he was not fully informed of
the possible practical effects of the disclosure.
In fact, even if the authorization of the person portrayed had been validly
granted, it seems evident that, in the case in question, the act of putting the video on
the network, though voluntary, ended up gaining entirely unexpected repercussions,
straying far from its original purpose, which was sharing only with relatives
living abroad. Here, the Judiciary cannot run the risk of being too strict, failing to take
into account a certain lack of awareness, among persons unused to the virtual world,
of the risks involved in the circulation of videos and images on the Internet. The
huge repercussion of the video clip of Nissim Ourfali surely was not anticipated by his
family, so the initiative to request the removal of the video is not only a legitimate
measure but also a praiseworthy one in light of the need to preserve the privacy of
the minor. If the parents' perception of the risks involved was belated, such an "error"
of evaluation cannot serve as an obstacle to the protection of their rights, binding
them to an act out of mere attachment to a will that no longer exists.
After all, what would justify the continued exposure of the boy if he had already
expressly stated his intention not to have the video shown? Nissim did not sign any
contract, nor did he receive any consideration from the company that owns the website for
displaying his image. The fact that the video was put on the network was probably

36 Court of Justice of the State of São Paulo—Summary Proceedings No. 0192672-12.2012.8.26.0100/2012, 1st Civil Court—Central Civil Venue, assigned on 09.18.12.

a thoughtless and casual act. Moreover, it was an undeniably gratuitous act, without the
formalities which the legal order requires to attract obligational effect. In these
circumstances, one cannot discern any interest deserving protection in opposition
to his right to preserve his image and privacy, already sufficiently exposed by his
family members' mistake, and not to remain forever "marked" by the
episode. What comes into play here is the already mentioned right to be forgotten,
in its virtual mode, an expression of the right to personal identity.
If there is a cost in adopting the steps necessary to suppress the
material from the website, this cost can, on a stricter view, be attributed to the
website itself, which assumes such risk by operating the communicative space. On a
more liberal view, these costs can be attributed to the boy's parents. Nevertheless,
while this can be discussed during the proceedings, the right to obtain the measures
of suppression proves to be as clear as it is urgent. To understand otherwise only
because the video was posted spontaneously by his family members is to submit
him, at the frail age of 13, to the binding force of a "contract" which he never
signed and for which he received nothing. Such binding force, in so extreme a degree
of restriction, is not admitted even in business, let alone in the casual, non-economic
use of a video-sharing site. If, even in the scope of contracts, one
admits release from the obligational effect due to hardship resulting from unpredictable
and extraordinary facts, why should a teenager who saw a comic film of his Bar Mitzvah
become a craze across Brazil not be able to obtain measures intended to attenuate the huge
burden he has been suffering?
Cases such as that of Nissim Ourfali reveal that the problem of the civil liability of the
so-called providers of applications for content posted on their sites is not limited to the
damage resulting from content posted by third parties. The problem also covers
the rarer but no less delicate question of the unauthorized exploitation of
content posted by the user himself. More specifically, it also covers the lack of
measures intended to prevent the diffusion of content which the user expressly no
longer wants exhibited on the site. The Civil Rights Framework of the Internet
does not contemplate the subject, suggesting that the provider can only be called
to act in the case of content posted by a third party. However, there is no reason to
eliminate its duty to act in those cases where the user himself "regrets" transmitting
the content and claims the suppression of the material, so that its reproduction
on the same website may be prevented. Here, the whole drama of free expression does
not arise, as the disclosing party itself wishes to see the video excluded from the
site.
Looking closely, the need to adopt technical measures to remove undesired
content from the website forms part of the business risk of companies which exploit
the informal nature of content posting and constantly encourage their users to
transmit information, comments or private files (photos, videos, etc.). Even if one
could discuss on whom the cost of the measure falls—a question which could attract
different answers according to the case, depending, for instance, on the degree of
information transmitted—there is no doubt that the person who repents should have
the right to claim it, under penalty of consecrating an obligational effect ad aeternum
for an informal act that, strictly speaking, does not even qualify under the legal order
as a source of obligation.

9 Conclusion

Article 19 of Law 12.965 gives the problem of the civil liability of the so-called
providers of applications for content transmitted on one of their sites an extremely
restrictive treatment, which represents an undeniable setback in relation to
the path which was being trodden by Brazilian case law on this matter. By conditioning
civil liability on noncompliance with a "specific court order", said Article 19
promotes a dreadful hampering of the protection of the rights of the Internet user, often
fundamental rights expressly protected by the Brazilian Constitution such as honor,
image and privacy. It creates a real bubble of non-liability, insofar as it restricts the
civil liability of the business companies which operate the sites where the damaging
content is transmitted. This position limits any claim for reparation to
the "third parties", almost always anonymous, whose identity and location can only be
known, in most cases, by those same business companies which the Brazilian Civil
Rights Framework for the Internet exempts from liability. Even when known, the third
parties have no technical or economic means to attenuate the propagation of the
damage, which is why their eventual liability has little or no practical consequence.
Worse than a misplaced rule, Article 19 must be considered an unconstitutional
act, for violating different provisions of the Brazilian Constitution (Article 5, items X
and XXXV; Article 1, item III; among others). Not only does it restrict the protection
of fundamental rights, a setback from the protection already assured by
Brazilian courts, but it also promotes an intolerable axiological inversion, by permitting
more favorable treatment of rights with patrimonial content (property copyrights, for
instance) than of personality rights, even though the constitutional order
addresses the latter with primacy.
The result of the unconstitutionality is the restoration of the general discipline
of civil liability in relation to content transmitted on the Internet, with the
possibility of imputing full liability to the so-called providers of applications
for the damages sustained by the victim. This liability has no temporal limitation from
the beginning of the disclosure of the offensive material, whether based on the Consumer
Defense Code (Article 14) or on the sole paragraph of Article 927 of the Civil Code.
The only salvation for Article 19 of the Civil Rights Framework of the Internet would
be an interpretation in accordance with the Brazilian Constitution
(Article 5, item X) which, like what Article 21 of Law 12.965 already does, would
be satisfied with an extrajudicial notification to the provider. This extrajudicial
notification would be considered the starting point of civil liability in case of inertia in
the adoption of the measures necessary for the removal of the damaging content from the
website.

Business companies which operate sites and social networks unarguably have the
best technical conditions to adopt the steps necessary for the reparation or attenuation
of the propagation of the damage. Such reparation transcends the mere question of
suppression of the material transmitted and also covers measures such as concealment
of the victim's identification, adequate indexation of the material transmitted and its
contextualization, among other remedies which would be capable, in many cases,
of protecting the victim without posing a risk to the freedom of expression of the third
parties who transmit the material considered damaging.
In any case, one has to be very cautious when invoking freedom of expression in the
virtual universe. It is frequently a fallacious argument, since the huge majority
of the actual cases involving requests for the suppression of damaging material in
Brazilian case law concerns information which is obviously false, evident offenses,
racist comments and other such content. In such cases the argument, quite the
opposite of expressing the legitimate exercise of free expression, aims at cannibalizing
it through intimidation, virtual bullying, online hate speech and other virtual forms of
oppression.

References

Ammori M (2014) The “New” New York Times: free speech lawyering in the age of Google and
Twitter. Harv Law Rev 127(8):2259–2295
Barlow JP (1996) A declaration of the independence of cyberspace. In: Electronic Frontier
Foundation. https://www.eff.org/cyberspace-independence. Accessed 11 Jan. 2022
Bauman Z (2008) Vida para Consumo – A Transformação das Pessoas em Mercadorias. Zahar, Rio
de Janeiro
Clarke T (2013) Social media: the new frontline in the fight against hate speech. In: Minority
rights group international. http://minorityrights.org/2013/10/30/social-media-the-new-frontline-
in-the-fight-against-hate-speech/. Accessed 11 Jan. 2022
Derbli F (2007) O Princípio da Proibição de Retrocesso Social na Constituição de 1988. Renovar,
Rio de Janeiro
Heins M (2014) The brave new world of social media—how “terms of service” abridges free speech.
Harv Law Rev 127(8):325–330
Jauréguiberry F (2004) Hypermodernité et manipulation de soi. https://web.univ-pau.fr/RECHERCHE/CIEH/documents/Hypermodernite_manipulation_de_soi.pdf. Accessed 12 Nov 2018
Lévy P (1999) Cibercultura. Editora 34, São Paulo
Oliveira M (2014) ‘Fui motivo de piada na rua’ diz usuária que ganhou processo contra Facebook.
In JusBrasil. https://dellacellasouzaadvogados.jusbrasil.com.br/noticias/113852641/fui-motivo-
de-piada-na-rua-diz-usuaria-que-ganhou-processo-contra-facebook. Accessed 11 Jan. 2022
Roppo E (1988) O Contrato. Almedina, Coimbra
Schreiber A (2013) Direito Civil e Constituição. Atlas, São Paulo
Schreiber A (2014a) Direitos da Personalidade. Atlas, São Paulo
Schreiber A (2014b) Novos Paradigmas da Responsabilidade Civil. Atlas, São Paulo
Superior Court of Justice (2009) Quarta Turma não Reconhece Dano Moral por Envio de SPAM
Erótico a Internauta. Accessed 31 Jan. 2018
Tepedino G (2004) Premissas Metodológicas para a Constitucionalização do Direito Civil. In:
Tepedino G (ed) Temas de Direito Civil. Renovar, Rio de Janeiro, pp 1–22

Anderson Schreiber Full Professor of Private Law at the State University of Rio de Janeiro
(UERJ). Professor at Fundação Getulio Vargas (FGV). Head of the Research Project “Law and
Media”. Main areas of research: Constitutional Rights in Private Relations, Torts, Contract Law
and Media Law. Selected Publications: Manual de Direito Civil Contemporâneo, São Paulo:
Saraiva, 2021 (4th edition); Equilíbrio Contratual e Dever de Renegociar, São Paulo: Saraiva,
2020 (2nd edition); A Proibição de Comportamento Contraditório: Tutela da Confiança e Venire
Contra Factum Proprium, São Paulo: Atlas, 2016 (4th edition); Novos Paradigmas da Respons-
abilidade Civil, São Paulo: Atlas, 2015 (6th edition); Direitos da Personalidade, São Paulo: Atlas,
2014 (3rd edition).
Self-regulation in Online Content
Platforms and the Protection
of Personality Rights

Ivar A. Hartmann

Abstract As public debate moves to private content platforms online, content
moderation by the Judiciary is no longer the most common, the most efficient or
the most legitimate way of separating acceptable from unacceptable content. A new
model of self-regulation of personality rights is being implemented by such platforms,
where enforcement—and sometimes review decisions—are performed by code while
users operate as reviewers as much as content producers and consumers. Platforms
have freedom to design and implement decentralized moderation systems, but the
rules set in their architecture and community guidelines must comply with procedural
boundaries established by the efficacy of fundamental rights between private parties.
Judicial review plays a role not in evaluating the merits of content posted or shared
online in these platforms, but rather in course-correction of the procedural rules that
ensure self-regulation does not disproportionally restrict informational personality
rights such as speech, honor and privacy.

1 Introduction

John is very proud. He has just pulled an all-nighter editing a montage of pictures
and small videos of his best friends to share online. He was careful to pick the
ones where the lighting is favorable and everyone is smiling or making a funny face.
Like most teenagers their age, John and his friends produce an enormous amount of
media depicting themselves, so it takes a long time to choose the ones that make the
cut and their order or placement. Also like most teenagers today, John has the skills
and means to remix1 the pictures with videos and a background song. He sought out
to learn the basic functions of a video editing software recommended by a friend
and used it to give the montage a more professional feel. This includes inserting

1 Lessig (2009).

I. A. Hartmann (B)
Insper Learning Institution, São Paulo, Brazil
e-mail: ivarhartmann@insper.edu.br

© Springer Nature Switzerland AG 2022 267


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_11

motivational messages throughout the video. The background music is a mixture of
two songs that John and his friends love: David Guetta and Skylar Grey’s “Shot me
down” along with Taylor Swift’s “Look what you made me do”. After so much hard
work, he could not wait to post the result on Youtube and then disseminate the link on
Facebook as a way to express an uplifting message of how he feels about his friends
and the importance of trust and good character in a person’s life.
Youtube displays 6 billion hours of video every day2 without charging users a
cent to host content on its servers. The business model is only viable due to forced
advertising. John intended to make good use of this service, until he was surprised
by a notice that his video violated third-party copyrights and therefore would not be
displayed to Youtube viewers. After looking into it and asking for advice from friends
who dabble in online video sharing, John learns that the third parties in this case are
David Guetta, Skylar Grey and Taylor Swift. More specifically, the company that
holds the rights over their music.
John is finishing law school and has some knowledge of intellectual property. He
knows that there are certain things one cannot do with songs and other people’s artistic
creations without first asking for permission. But he also knows that fair use means
some freedom to put together works from different creators into something entirely
new. That is precisely what he did by overlapping different parts of the two songs in
ways that emphasize the message on the pictures and motivational statements in the
montage. It is not a mere copy because the pictures and videos of himself along with
his friends give the music a peculiar context and provide for an original creation.
John also understands, however, that this is an inevitably subjective evaluation and
that is why the Judiciary is called upon somewhat often to solve controversies like
this one. And yet Youtube’s evaluation produced a response to John in a few minutes
at 5am.
It was evidently not a judge who ascertained the intellectual property aspects
of the case and provided a verdict on whether John’s video was an original artistic
expression or not. In fact, it was not even a human with some knowledge of copyright.
Rather, it was decided by a machine running an algorithm developed by Google. The
system is called Content ID:
Videos uploaded to YouTube are scanned against a database of files that have been submitted
to us by content owners. Copyright owners get to decide what happens when content in a
video on YouTube matches a work they own.3

Thanks to this system, every time John and other people create something new
using a song or part of a video that has been submitted to Content ID’s database,
Youtube will keep the new work automatically out of display—without any human
evaluation and bypassing an impartial judicial ruling on the matter. What is curious is
that “Shot me down” is originally from Nancy Sinatra. Guetta and Grey did the same
as John: used parts of the artistic creation of others in order to produce something new.
But when the video clip of Guetta and Grey’s version of “Shot me down” was posted

2 https://www.youtube.com/yt/about/press/. Accessed 20 Jan. 2018.


3 https://support.google.com/youtube/answer/2797370. Accessed 11 Jan. 2022.

on Youtube it was not prevented from being accessible by users. That is because
the rights holder of Guetta and Grey’s works negotiated with those of Sinatra’s. It
is remixing with a prior request for permission and payment of fees. While this
type of composition is protected by Content ID, amateur remixing is not. Google
adopted this system voluntarily. In most countries the law establishes that a company
playing the role of a content intermediary, such as a platform or social media, is not
responsible for copyright violations unless it was notified and failed to remove the
allegedly infringing material. In this case, Youtube ran John’s video through prior
analysis before making it available, which is very different from waiting until a rights
holder expresses their dissatisfaction with the amateur montage. Content ID is not
legally required.
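The scanning step described above can be pictured, in grossly simplified form, as a lookup of fingerprints of an upload against a database submitted by rights holders, with the owner's chosen policy deciding the outcome. The sketch below is an illustrative assumption, not YouTube's actual system: real Content ID relies on robust perceptual audio and video fingerprints, whereas this toy hashes short byte windows, and the `register_work`/`scan_upload` names and policy strings are invented for the example.

```python
import hashlib

# Hypothetical rights-holder database: window fingerprint -> owner's chosen policy.
# Real systems fingerprint overlapping audio windows robustly; plain SHA-256
# hashing of raw bytes is used here only to keep the sketch self-contained.
reference_db = {}

def fingerprint(chunk: bytes) -> str:
    return hashlib.sha256(chunk).hexdigest()

def register_work(audio: bytes, policy: str, window: int = 4) -> None:
    """Content owner submits a work; every short window is indexed."""
    for i in range(len(audio) - window + 1):
        reference_db[fingerprint(audio[i:i + window])] = policy

def scan_upload(audio: bytes, window: int = 4) -> str:
    """Scan an upload against the database; the owner's policy wins on a match."""
    for i in range(len(audio) - window + 1):
        policy = reference_db.get(fingerprint(audio[i:i + window]))
        if policy is not None:
            return policy  # e.g. "block", "monetize", "track"
    return "allow"

register_work(b"shot me down full track", policy="block")
print(scan_upload(b"my montage uses shot me down here"))  # a window matches -> "block"
```

Even this toy exposes the false-positive risk discussed below: short, common windows will match innocuous uploads, which is why real fingerprinting trades off match length against robustness, and why the outcome is set by the owner's policy rather than by any judicial evaluation.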
Moreover, it generates a substantial cost to Google as it had to be first developed
and is later continuously tested and calibrated. There is also the hardware cost of pre-
processing billions of hours of video uploaded to Youtube. Above all, there is the cost
of decreasing revenue. The company makes money from advertising, which depends
on viewership. This, in turn, depends not only on the accessibility of each single
video, but on the perception of Youtube’s value as a content platform as a whole. The
network effect dictates that the more content and user interaction it hosts, the more
valuable it is to all users.4 This means that by barring amateur remixes developed
by its users, Youtube is acting to the detriment of its own value as a platform.
Content ID is of course old news. Algorithms are now being proposed as a way to
fight other potentially unlawful content, including revenge porn.5 The application of
automatic filters for copyright violations exposes the high potential for false positives,
but content platforms6 are slowly being asked to play the role of judge when a much
more nuanced decision is required: distinguishing between speech that is legal and
that which constitutes hate speech or violates a person’s honor.
It turns out John’s motivational message was directed to white people. The phrases
used along the video were “white lives matter”, “Muslims are terrorists”, “blacks are
lazy”, “build the wall” and “white pride”. Taylor Swift’s song was chosen specifically
because of what John feels it represents for white supremacists.7 Could an algorithm
identify and bar or remove hate speech without censoring legal speech in the process?
In a time when so many people contribute to the public debate and seemingly
harmless cultural expressions can be hijacked by certain groups and made to mean
something their original creators never dreamed of,8 can a machine properly pick
apart illegal from outrageous but legal speech?
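One reason the answer tends to be no is visible even in a toy version of the term-based approach quoted in footnote 13. The classifier below is a deliberately crude sketch: the placeholder tokens `slur_a`/`slur_b` and the `flag` function are invented for illustration, and real systems weight thousands of features rather than matching a handful of keywords. It misses hateful messages built entirely from innocuous words, and it flags speech that merely reports or condemns a slur.

```python
# Toy term-matching classifier in the spirit of the feature-based approach
# discussed in footnote 13. TERMS is a hypothetical placeholder list.
TERMS = {"slur_a", "slur_b"}

def flag(text: str) -> bool:
    """Flag text if any token, stripped of punctuation, is a listed term."""
    tokens = {t.strip(".,!?").lower() for t in text.split()}
    return bool(tokens & TERMS)

# False negative: a hateful message built from individually innocuous words.
print(flag("group X should not be allowed in our country"))  # False

# False positive: reporting or condemning the slur still triggers the filter.
print(flag("He was banned for shouting slur_a at the match"))  # True
```

The two calls show why context-blind matching cannot separate hate speech from speech about hate speech, which is precisely the nuance a court, unlike a keyword list, is expected to weigh.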

4 For a description of the network effect, see for example, Wu (2003).


5 Solon (2017).
6 “A platform is a business based on enabling value-creating interactions between external producers

and consumers. The platform provides an open, participative infrastructure for these interactions and
sets governance conditions for them. The platform’s overarching purpose: to consummate matches
among users and facilitate the exchange of goods, services, or social currency”, see Parker et al.
(2016), p. 299.
7 Khal (2017).
8 Ellis (2017).

There are increasing incentives for private platforms to try and identify illegal
speech. The safe harbor provisions in European countries, as well as in others like the
United States and Brazil, are slowly being eroded by court rulings and legislators,
creating new risks for content hosts.9 The German Netzwerkdurchsetzungsgesetz that
has come into force in 2017 is a good example. In contrast to the Brazilian option for
liability only after judicial notice, this new law forces private companies to arbitrate
speech and determine within 24 h if expression is “obviously illegal” after notice from
an offended party.10 Faced with this task, how long until private companies forsake
human manual detailed analysis and invest on quick and monumentally cheaper
software alternatives?
As the current scenario stands, there are increasing incentives to have social media
and other private content platforms use algorithms to identify illegal speech such
that John’s video would never even make it to anyone’s newsfeed on Facebook.
There are already some experiments, including by government funded research in
Germany11 and by major players such as Yahoo,12 but initial results are not always
completely satisfactory.13 Not all incentives to filter speech come from legal liability
risks, however, as many social media companies have “community standards” that
are more stringent on expression than the law they are required to apply.
Is this a kind of self-regulation? Do the characteristics of this type of practice
allow it to be classified under the traditional self-regulation categories? What is
unique about these experiments in the current rich context of the internet and artificial
intelligence that might defy traditional self-regulation theory? My goal is to answer
these questions with the purpose of providing a conceptual framework based not
only on self-regulation theory, but also on aspects of the protection of personality rights
online, including possible avenues for the efficacy of fundamental rights in private
relationships. In the first part, I will present a summary of classic self-regulation
elements, bringing them together with the notion of code as law and the restraints
imposed by the constitutional protection of personality rights. In the second part, I
will retell John’s story from a different angle while detailing and better explaining
what is at stake, how the technology is used and what are the elements of the dynamic
between regulation by code and law in this context.

9 Hartmann (2017), p. 29.


10 Germany starts enforcing hate speech law, BBC News, 1 January 2018, http://www.bbc.com/
news/technology-42510868. Accessed 11 Jan. 2022.
11 Algorithms to combat hate speech, Research in Germany, http://web.archive.org/web/201909140

55210/https://www.research-in-germany.org/en/infoservice/newsletter/newsletter-2017/october-
2017/algorithms-to-combat-hate-speech-online.html. Accessed 11 Jan. 2022.
12 Reynolds (2016).
13 The initial results are unsurprisingly not perfect: “Consistent with previous work, we find that

certain terms are particularly useful for distinguishing between hate speech and offensive language.
While f*g, b*tch, and n*gga are used in both hate speech and offensive language, the terms f*ggot
and n*gger are generally associated with hate speech. Many of the tweets considered most hateful
contain multiple racial and homophobic slurs. While this allows us to easily identify some of the
more egregious instances of hate speech it means that we are more likely to misclassify hate speech
if it doesn’t contain any curse words or offensive terms.”, Davidson et al. (2017), p. 4.

2 Classic Self-regulation and Self-regulation by Code

Regulation has always been seen as a product of the action of institutions. Although
the classic law & economics school has largely dedicated itself to identifying the
inefficiencies of regulation and, therefore, of the functioning of institutions created
and operated to this end, the critique can currently be redrawn or redirected. The
problem is not the existence of institutions, for they admittedly perform certain roles
that contribute to the optimization of value transfers in society. The main one is to
create certain restrictions on individual conduct that render such conduct predictable
to others.14
The problem, therefore, is the cost-benefit analysis of certain institutional
configurations. While the restriction of some types of action demands rigidity and
some stability, these have in turn created, with time, a new problem: the adequacy of a
specific institution and its respective regulation to a specific market.15 An institutional
format with certain rules for a certain market might be the best possible regulatory
solution in 1960 and later become the worst possible solution for that same market
in 2000. The example of copyright protection online and in social media such as
Youtube with a regulatory framework devised before the internet became popular
illustrates the issue of cost–benefit analysis in regulation and the convenience of
periodic adjustments.
Not all deficiencies that arise in this context result from regulation itself. Failures
may in part be attributed to a split between the regulator’s beliefs about human action and its reality as unveiled by studies in anthropology and psychology. It is not simple to plan the regulation of human conduct when, instead of perfectly rational beings, regulation must deal with people who exhibit bounded rationality, bounded willpower and bounded self-interest.16
Regardless of this and other critiques that behavioral law & economics directs at traditional attempts to coerce human conduct in a planned fashion, there are still other problems of state regulation that have brought about the phenomenon of self-regulation. According to Joseph Stiglitz, “regulation is necessary because social and private costs and benefits, and hence incentives, are misaligned”;17 any regulatory endeavor must therefore take three factors into account.

14 “Institutions simplify the decision problem for economic actors, by imposing restraints on each
person’s conduct which render it substantially predictable to others.” MacKaay (2000), p. 82.
15 “Their virtue lies in the relative fixity they provide. But the fixity is also their weakness. At the time of its creation, an institution may be chosen so as to provide generally the best trade-off
in the face of the circumstances of the moment. As circumstances change, institutions may come
to represent less than optimal trade-offs and yet their fixity prevents them from being instantly
adjusted. The benefit of fixity and predictability is bought at the risk of ill fit over time.” MacKaay
(2000), p. 83.
16 “[I]t is useful for purposes of behavioral law and economics analysis to view human actors as departing from traditional economic assumptions in three distinct ways: human actors exhibit
bounded rationality, bounded willpower, and bounded self-interest.”, Jolls (2007), p. 10.
17 Stiglitz (2009), p. 13.
272 I. A. Hartmann

First, information asymmetries. The regulator—the actor that designs or applies regulation—often has less data about the specific market and its particular phenomena than the regulated actors do. Second, the moral hazard in the regulator’s conduct: it may operate a regulatory system with a pernicious or even a noble goal, but one that is not aligned with the social welfare the system was created to pursue. Third, human fallibility. Mistakes are a natural part of human conduct, and their costs cannot be entirely avoided, only minimized.18
Two of the three characteristics result from the fact that the regulator and the regulated are not the same entity. In the context of self-regulation there are no information asymmetries between the regulator and the regulated because they are one and the same. There is also no risk that the regulated actor would self-apply a
regulatory scheme that unnecessarily restricts freedom and brings the state regulator
unfair advantages. The risk is precisely the opposite: constraints that turn out to be
too feeble. And that is why the Content ID and “community standards” examples
are counterintuitive, because they involve stricter standards than those required by
legislators or courts.
As Anthony Ogus indicates in his work on self-regulation, it occurs when three distinct conditions are fulfilled: (1) there is a market failure, (2) private law instruments are impossible or inadequate to correct such failure and (3) self-regulation presents a lower cost than state regulation to correct the failure.19 If these requirements are met, then self-regulation would be advantageous for at least four reasons.
First, regulatory agents—such as self-regulatory agencies—hold greater expertise about the regulated market than the government. Second, the costs of oversight and enforcement of the regulatory framework are smaller. Third, adjustments along the way are inevitable in any regulatory framework and are much easier and faster in a self-regulation scheme. Fourth, regulation costs are internalized by the regulated actors alone, instead of being transferred to people in general.20 In any
case, while it is clear that state and private actors perform the role of regulators
under different conditions, it is also noticeable that self-regulation can no longer be
discussed under the traditional public vs private dichotomy because, among many
reasons, private actors behave like the state when they (self) regulate and the state
also behaves at times much like a private entity, even if with broad social welfare
goals.21
Self-regulation models can thus adopt three formats. The first is the establishment
of self-regulatory agencies. They are responsible for creating rules, enforcing them
and adjudicating controversies on their application. An example is the Brazilian

18 Stiglitz (2009), p. 13.


19 Ogus (1995), p. 97.
20 Ogus (1995), p. 98.
21 According to Robert Hockett and Saule Omarova, there are at least four situations in which the State plays the role of a private entity: as market creator, as market influencer, as market enhancer
and as market preserver. See Hockett and Omarova (2013), p. 5.

National Council for Advertising Self-regulation (CONAR), which holds companies accountable for the advertisements they sponsor in order to censor abuses in commercial speech.
The second model is that of group negotiation. Groups of private actors with
different interests might set up contracts voluntarily creating conduct restrictions
for all parties. This model assumes that varying groups have a minimum level of
information about the market, that they can resort to the Judiciary in case of breach of
contract and that they can reach a sufficient level of internal organization that enables
them to act in an organized fashion. The common example is that of collective labor
agreements.
The third and last model is that of competition between self-regulation agencies.
Different private actor groups adhere to different self-assembled agencies. Each one
operates under specific rules of conduct that its members are bound by voluntarily.
In this model the identification or brand of each agency is essential, as it allows membership in and submission to a certain agency to be used as a positive signal by private actors to persuade those with whom they wish to do business. A common example is the environmental-standards seals adopted by companies all over the world,
especially those whose raw materials are normally extracted directly from nature. In
ostensibly displaying the Forest Stewardship Council’s seal a company signals to its
potential consumers that it has adopted a set of conduct restrictions that render its
business more sustainable than those of the competition.
None of these models covers or explains what happened to John because of his
apparent copyright violation, or what could happen, in the very near future, because
of the point of view he expressed in the video. There are many reasons for this,
chiefly that Youtube is restricting its own conduct—by censoring the most valuable
element of its platform, videos—but also that of its users. Youtube did not sign on
to a self-regulation agency and does not display Content ID as the brand of a private
collective that wishes to self-regulate. Content ID is proprietary technology. John is
not part of a private collective that has negotiated with the group to which Youtube
belongs in order to then accept the rules that censor his videos.
The apparent paradox reveals a central innovation: the users are regulated because
they are producing the content consumed in the platform. Youtube does not produce
videos. Facebook does not write posts. Instagram does not have an enormous team
of photographers in charge of taking all the pictures that circulate on its platform.
The entries on Wikipedia read by millions of internet users were written by users
themselves. In these ecosystems, the user is at the same time a consumer and a
producer of the information goods that are hosted, catalogued, organized, highlighted
and, eventually, filtered by the platform.
The consumer in this context is served by two kinds of players with distinct roles
that are intrinsically intertwined: the platform and the user. In this article I focus on
the legal aspects of the consumer also being the developer of the informational good
or service. But there is yet another characteristic of such a scenario that escapes the
existing concepts of self-regulation: the user-consumer is also a regulator when it
participates in the process of setting the rules to be followed (such as determining the concept of hate speech that is banned from the platform), in their application to concrete cases (when it decides whether specific posts, pictures or videos fall into the hate speech category), and in the enforcement of these decisions (when it presses the button to remove, demote or red-flag content as illegal). In many online private
platforms, the user is therefore gradually becoming a rule-making and rule-enforcing
actor, as I have noted and described elsewhere,22 which is also a novel aspect of this
new type of self-regulation. Here, however, I intend to focus on describing the role
played by the private companies that host crowdsourced content.
Another novel element, and one that is central to this model, is the style of enforcement. John did not have a chance to disrespect the rule created and applied by Youtube
on copyrighted material. His videos will always be subjected to Content ID verifi-
cation and his compositions will always be censored when they include elements
from copyrighted songs or videos. John does not have the option to break the rule
of self-regulation and take the risk of potential subsequent punishment. There is no
adjudication procedure to evaluate whether the video was at odds with the rules that
govern Content ID. There is no established penalty for such an impossible act.
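This ex ante logic can be sketched as follows. The sketch is a hypothetical illustration only: the fingerprinting function, the blocklist and all names are stand-ins, not Youtube's actual Content ID technology.

```python
# Hypothetical illustration of ex ante enforcement by code, in the style of
# an upload filter such as Content ID. The fingerprinting and the blocklist
# are assumed stand-ins, not Youtube's actual implementation.

COPYRIGHTED_FINGERPRINTS = {"fp_song_123", "fp_video_456"}

def fingerprint(upload: str) -> set:
    # Stand-in for audio/video fingerprinting: here, just the upload's tokens.
    return set(upload.split())

def publish(upload: str) -> str:
    # The rule is enforced before publication: a matching upload never goes
    # online, so there is nothing to adjudicate and no penalty to apply ex post.
    if fingerprint(upload) & COPYRIGHTED_FINGERPRINTS:
        return "blocked"
    return "published"

print(publish("fp_song_123 my remix"))  # a matching upload is never published
print(publish("my original content"))
```

The user cannot choose to break this rule and risk punishment; only a change to the code itself would alter what is possible.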
Understanding the way in which technology operates in this context requires accepting that law and the market are not the only avenues to restrict human behavior. As Lawrence Lessig explained when the World Wide Web was in its infancy,
there are four avenues.23 Law is the most obvious one, imposing constraints through
rules. Because rules can be disrespected, it is necessary to determine and apply punishments for disobedience. The market restricts as well: although the law does not forbid it, an elementary school teacher cannot own a private jet. The laws of nature also limit what we can do. No human being can fly, because of the law of gravity. No one can
ignore it, so there is no need to set up procedures to resolve disputes on whether
gravity was properly followed.
The fourth avenue of regulation indicated by Lessig is similar to the laws of nature in this sense. The laws of the virtual world are impossible to disrespect. It is undeniable today that regulation has been enabled over the years by modifications to the physical and logical structure of the internet. Lessig posits that this technical plane, which he calls “code”, configured by engineers and programmers, operates as a form of regulation just like written law. The architecture of the virtual world defines what we can and cannot do while in it.
This guarantees ex ante control in the enforcement of norms, inasmuch as choices in code design determine what the user is actually able and unable to do, rather than prescribing what the user may or may not do legally.24 But moving away from state-controlled regulation implies abandoning democratic rule-making. Lessig thus warns of the possibility of code being used as an oppressive form of regulation by the state, a hypothesis that suggests the need for even greater participation and leadership of

22 Hartmann (2017), p. 39.


23 Lessig (1998).
24 “Regulation in cyberspace is, or can be, different. If the regulator wants to induce a certain behavior, she need not threaten, or cajole, to inspire the change. She need only change the code—the
software that defines the terms upon which the individual gains access to the system, or uses assets
on the system”. Lessig (1996), p. 1407.

civil society actors in at least some of the regulatory layers.25 The inclusion of non-
public actors in the regulation of online behavior, as well as on policy decisions that
shape public and private internet institutions has been understood for years as the
field of internet governance,26 such that the idea of online regulation not coming from
state entities is not at all new. Quite to the contrary, the diversity of actors actually holding the reins of regulatory activity concerning internet conduct is increasingly being emphasized within the internet governance arena.27
Tim Wu addresses the strength of code from a different perspective, acknowl-
edging the function described by Lessig, but proposing yet another one, which in a
certain way opposes the perfect control of ex ante (self)regulation. To Wu, code is
also an antiregulatory tool that can be wielded by certain groups to decrease the cost of following the law, thereby generating an advantage.28 The groups that do not wish to obey the law have two options: to change it or to avoid it. The first option is political and requires great organizational capacity and economic power to perform the lobbying necessary to amend legislation. Those without economic power or mobilization ability employ mechanisms to bypass the law. Here “bypassing the law” means avoiding what traditional law demands by making changes to the law of code. It is crucial to note, however, that traditional laws can be violated
without being amended, while the laws of code and network architecture can only
ever be altered, but never merely disobeyed.29
Evidently, the number of people required to clandestinely reconfigure code in
order to disobey it is much lower than the number of people willing to (even if not
entirely capable of) disrespecting traditional laws. This asymmetry brings even more
complexity to the regulation by code, but at the same time, transparent changes in
the laws of code are even more decisive.
Only the state can produce and modify traditional laws. The laws of the natural
world are never subject to change. The laws of network architecture, on the other
hand, and of its subsettings—such as social media—are created and altered many times, unilaterally, by private groups or companies like Google. In creating and
applying Content ID the company has profoundly affected people’s behavior vis-
à-vis the production of amateur remixes and artistic compositions. A software that
preemptively identifies and blocks hate speech would have the same effect. The
algorithm has an infinitely larger impact in the protection of musicians’ interests or
the fight against hate than any article or paragraph in traditional law.

25 “If cyberspace were to become this perfect technology of justice and democracy, then there would be little reason to worry. But a perfect technology of control does not entail a perfect technology of justice, and it is this that commends a continued check.” Lessig (1996), p. 1411.
26 Mueller et al. (2007), p. 250: “[M]ultistakeholder governance should be legitimized and maintained. This norm is a logical extension of principles relating to private networks and global scope.
The Internet is in effect a global confederation of network operators and users and should not be
regulated in a top-down manner via agreements among states alone.” See also Kleinwächter (2009),
p. 384.
27 Belli (2015).
28 Wu (2003), p. 682.
29 Lessig (2006), p. 7.

3 Personality Rights and the Boundaries of Self-regulation by Code

As with the internet in general, in the self-regulation model that I propose here the online conduct of private parties—both large content platforms and regular users—carries significant weight for the exercise of fundamental rights such as freedom of expression, access to information, privacy, property and the free exercise of economic activity. That is to say, in the arena of self-regulation mediated by code
the issue of the efficacy of fundamental rights in private relations takes center stage.
There are many questions that can illustrate this point. Has Youtube violated John’s
freedom of expression in automatically and voluntarily censoring his artistic expression? Would Youtube violate musicians’ intellectual property rights if it made Content ID an optional tool for users as they upload videos? Is Content ID as currently applied in violation of the right to access information that Youtube viewers have in relation to John’s montage? The same goes for future algorithmic solutions
to online hate speech and violations of personal honor. How strict is a company
required to be when designing and operating natural language processing software?
Are there limits to how much speech can be censored?
I believe personality rights and the protection afforded them by most constitutions
should come into play here. Judicial analysis of the merits of the content in posts,
videos and pictures online has thus far proven not to be the best alternative for governing online content, and it should not be considered the first and foremost battleground in which
to answer the questions posed in the previous paragraph. As Jack Balkin noted long
ago, it is administrative law and not constitutional rights adjudication that provides
a more suitable and effective context for securing the old and new liberties critical to
the internet,30 which includes developing the tools for adequately protecting freedom
of expression and other informational personality rights online.
Self-regulation of online expression as described here implies the action of private
platforms engaging with users in order to make decentralized gatekeeping31 decisions
on the content that should be barred, demoted or obscured. Algorithms are deployed
in parallel to either make some of these judgement calls or merely enforce whatever
decision on content has already been made. As a rule, any recourse to the Judiciary is only adequate to the extent that it invites considerations of procedural fairness instead of the traditional content analysis. The efficacy of constitutionally protected personality rights in private relations operates as criterion and boundary for such procedural review. In most cases, a judge should evaluate, for example, not whether

30 “Protecting free speech values in the digital age will be less and less a problem of constitutional law—although these protections will remain quite important—and more and more a problem of technology and administrative regulation.” Balkin (2009), p. 441.
31 Gatekeeping can be defined “as the process of controlling information as it moves through a gate. Activities include, among others, selection, addition, withholding, display, channeling, shaping,
manipulation, repetition, timing, localization, integration, disregard, and deletion of informa-
tion.” Barzilai-Nahon (2008), p. 1496. More specifically, “[D]ecentralized gatekeeping consists of
numerous, microlevel interactions between individuals engaged in a particular collective endeavor.”
Shaw (2012), p. 357.

what A said about B qualifies as damaging hate speech, but rather whether a certain aspect of the platform’s architecture creates an imbalance between A and B in their capacity
to communicate. The following part of this article is dedicated to addressing this in
detail.
The development of fundamental rights theory in Brazil, including in the field of
efficacy among private parties, is strongly influenced by German law, either directly
or indirectly through Portuguese literature.32 The most basic aspect here, one that already signals a departure from traditional American constitutional law, is that constitutional rights are not foreign to relationships between private parties such as the platform and the user. Naturally, even in the United States the idea that private companies such as Facebook and Google should be treated completely differently from the state for the purposes of freedom of speech or privacy protection is losing ground. But a purer or more simplistic view of the state action doctrine has been slowly undergoing revision regardless of the internet.33 I will not entertain here the merits of horizontal
efficacy, but rather discuss elements that would provide useful guidelines for the
judicial review of self-regulation efforts involving private content platforms, users
and speech. More specifically, is it possible to identify cases where a procedural
flaw in the self-regulation system is so severely at odds with personality rights that
it should be directly corrected by a judge?
Therefore, even if in Brazilian law many people defend a prima facie direct effi-
cacy of fundamental rights in private relations—which would shed some provocative
light on the Content ID example—I prefer to frame self-regulation as a context in
which direct and indirect efficacy coexist, in accordance with Ingo Sarlet’s view.34
It is therefore an issue of carefully selecting criteria to enable the application of direct efficacy to a subset of cases.35
Daniel Sarmento offers a different understanding with regard to the prevalence
of indirect or direct efficacy, but also provides guidelines for the application of direct
efficacy to concrete cases. They are especially useful in the field of moderation of
speech in private platforms. First, material inequality, expressed in this context by the
gap between the power of Facebook and one of its users.36 There are many aspects in
which acknowledging a duty by online platforms to respect personality rights guides
legislative choices, by means of indirect efficacy, as well as isolated judicial review,
by means of direct efficacy. Among them, the one with the largest social impact is
certainly intermediary liability. The asymmetry of power between the single user
and the company that removes her post, often without any explanation, shows the

32 Pinto (2007); Canaris (2003). See also Andrade (1987), p. 245; Canotilho (2003), p. 1253; Sarlet
(2006), p. 398.
33 Mark Tushnet highlights the weaknesses of this theory such as, for example, the fact that it denies that private parties could be bound by constitutional rights not because there is no basis for this in
constitutional text but because the goal is to avoid such positions to be construed by judges. Tushnet
(2007).
34 Sarlet (2007), p. 144.
35 Sarlet (2007), p. 134.
36 Sarmento (2010), p. 264.
near state-like character of censorship, making it so that in current times platform abuse is a greater threat than being silenced by public authorities.
The liability of online intermediaries is a specific field of law with extensive litera-
ture and jurisprudence. My goal is not to go over the merits of the options available to
Congress or, when faced with a legislative vacuum (as was the case in Brazil for many
years), to the courts. It should be noted, however, that a choice of standard can push
for more private voluntary censorship—such as the recent German Netzwerkdurch-
setzungsgesetz does—or more subjective, case-by-case content analysis by courts—
such as the Brazilian Marco Civil. The latter is the lesser of two evils, however,
and was arguably a good choice for Brazil37 because it affords a more sophisticated
balancing between the respect of free speech demanded of content hosts, on the one
hand, and the protection of the privacy, honor and image of the offended parties,
on the other. The notice and take down standard set by American law in the area of
copyright protection has created incentives for excessive censorship by platforms as
a way of reducing the risk of lawsuits.38
A second guideline is the nature of the fundamental right of the parties on each side.
Personality rights closely related to “existential issues”39 are more intensely protected
than the economic interests of the platform in maintaining its advertiser base. Third,
direct efficacy should consider political and cultural diversity and not result in require-
ments that level down the diversity of values, beliefs and viewpoints. “Community
standards” adopted by social networks frequently impose homogenizing and puritan
limitations in an apparent utopian search for a sanitized environment where nobody
is ever shocked by cultural variety—because it has been erased. A useful example
is the suspension by Facebook of users who shared pictures of seminude aboriginal
tribes. The pictures did not fit the platform’s decency aesthetic and thus violated the
community standards.40 Sarmento’s lesson is especially apt here, even though it was written years before social media became popular.41
The heteronormativity that anchors many policies and concrete case decisions
by social networks in their search for a “family environment” is another poignant
example. Facebook has taken this misguided and appallingly discriminatory path
more than once.42 This highlights the need to recognize that private platforms must

37 “The ‘notice and take-down’ system, however, is flawed because it allows arbitrary removal of
content based on a simple complaint made by the interested person, without the necessary due
process of law. Furthermore, it is a system that condones censorship, temporary or permanent, or
else intimidates or restricts freedom of expression.” Mulholland (2017), p. 179.
38 Citing several case studies and research, see Seltzer (2010), p. 171. Seltzer advises against an even worse scenario of strict liability: “Moreover, the chilling effect analysis indicates that over-
deterrence is a problem deeper than the DMCA notice-and-takedown regime; it is a problem endemic
to copyright law and its secondary liabilities. As copyright expands in scope, time, and breadth,
its erroneous application and the chill of secondary liability assume greater significance.” Seltzer
(2010), p. 227.
39 Sarmento (2010), p. 267.
40 Alexander (2016).
41 Sarmento (2010), p. 329.
42 This happened in 2011 (Lee 2011) and again in 2013, when the picture was actually posted on the page of Brazil’s largest newspaper (Facebook Deletes Folha’s Post on Gay ’Kiss-in’, Folha de

respect the right to equality on a very basic level, in order to prevent hateful discrim-
ination, but also the right to freedom of expression. The Brazilian Constitution, for
example, could be interpreted to impose limits on social media companies’ sanitizing
censorship that casts a wide net over online daily interaction.
A fourth guideline is the extent to which the relationship between the private
parties has characteristics that place it closer to the public sphere.43 In the case of content platforms that host third-party speech, there is a deep connection with democratic practice in the public sphere, which would favor direct efficacy binding
the administrators of social networks and other websites to the protection of the
fundamental right to freedom of expression.
Another possible solution is the adoption of direct efficacy of fundamental rights
in private relations only insofar as the protection of human dignity is concerned,
as Jörg Neuner proposes in German law.44 This is an example of the profitable
coexistence between direct and indirect efficacy, even if certain aspects of German
law do not translate entirely well into other national legal orders. In any event, it
seems relevant that there is a connection between the defining role that free speech,
image and privacy play in individual identity, on the one hand, and human dignity,
on the other.
We must, however, understand the parts played by the state and by different private actors online. Lessig indicates that the internet was conceived by engineers and academics who resented control and was funded by the military, which had the goal of creating a decentralized network. Its configuration, its code, as a result, was structured in a way that created the free and democratic environment of the early days
of the World Wide Web. However, once corporate interests were added to the equa-
tion, the code began to be altered to allow privileges to those with larger financial
resources.
The relationship between this fear and the efficacy of fundamental rights in
private relations lies in the fact that the state often stimulates, pressures or even
forces telecommunication companies, access providers and content providers, among
others, to make changes to the code to the detriment of basic liberties—while it remains merely an indirect violator of freedom of access. In other cases, companies that own intellectual property pressure social networks such as Youtube into
adopting architecture rules that protect copyrights even at the cost of basic levels of
free speech.
The complexity of problems associated with online self-regulation systems might
require a new model of constitutional rights’ efficacy to be applied to private relations.
Vagias Karavas brings bold contributions, enumerating several new contexts for the
protection of fundamental rights between private parties in the networked society
(Castells). He believes that even Claus-Wilhelm Canaris’s theory of protection duties would result in a failed scenario in which the state contemplates the individual as a mere

São Paulo, 24 July 2013, http://www1.folha.uol.com.br/internacional/en/brazil/2013/07/1315835-


facebook-deletes-folhas-post-on-gay-kiss-in.shtml. Accessed 11 Jan. 2022.
43 Pereira (2006), p. 188.
44 Neuner (2007), p. 235.
object of public protection policy instead of as a responsible coauthor of law and a
problem-solver.45 As I have indicated, John and users in general should be considered
rule setting agents in the face of Youtube as much as the other way around.46
Karavas mentions the proposals of three authors. The first is Oliver Gerstenberg’s “polyarchy”: a new understanding of democracy. The production of norms is performed autonomously, not by higher instances or through delegated competences. Fundamental rights—including personality rights—play the role, through judicial safeguards, of guaranteeing chances to participate in the process of private production of law. This is very much in line with the idea that the standards of what
constitutes illegal speech online should be defined and enforced by users and plat-
forms. The users’ ability to participate in this process is enabled by judicial review of
the procedural rules that shape the virtual environment where the decision-making
takes place. For example: a judge could annul a rule by the platform that excludes all
users over 60 from the activities of red-flagging potentially abusive content. It should
be irrelevant to the judge in this case what the content was that an elderly user wanted to mark as inappropriate. The point is procedural fairness—more specifically, in this example, non-discrimination in the participation in enforcement activities.
The second, offered by Karl-Heinz Ladeur, allows possibly broader judicial review
than the polyarchy model, but only to the extent that the rationale of private parties
is not supplanted by that of the judge, enabling the communication needs and the self-regulation of civil society to be reconciled.47 The third, presented by Gunther Teubner, is the comprehension of fundamental rights, also in the private realm, not only as subjective rights but as institutional guarantees of the plurality of discourse in society and as keepers of societal structural diversity, as a result of the assertion that norm production by an authority is incapable of solving social conflicts.48
Thomas Vesting believes the German Federal Constitutional Court has yet to learn
how to deal with a telecommunications system that does not have the state as its
central player, but rather reflects a new model of acentric society. The Court should
then focus on providing support and stability to the processes of self-organization and
self-regulation, even if this implies correcting certain flaws. In the field of freedom
of expression, Jack Balkin frames the problem in the same way. To him,
The model of judicial protection of individual rights remains crucially important in the
digital age. But it will not be able to protect freedom of speech fully. (…) A healthy and

45 Karavas (2007), p. 80.
46 Belli and Zingales (2017).
47 „Als Netzwerk sollten in einer rechts- und sozialwissenschaftlichen Perspektive primär nicht-hierarchische Beziehungen zwischen privaten oder öffentlichen Akteuren, Ressourcen und Entscheidungen verstanden werden, deren Selbstkoordination emergente, nicht unabhängig vom Prozess ihrer Hervorbringung denkbare Regeln, Handlungsmuster und Erwartungen erzeugen kann.“, Ladeur (2009), p. 175 [“From a legal and social science perspective, networks should be understood primarily as non-hierarchical relationships between private or public actors, resources and decisions, whose self-coordination can generate emergent rules, patterns of action and expectations that cannot be conceived independently of the process of their production.”]. To Ladeur, the State could therefore „durch strategische Intervention, als Moderator oder ‚knowledge-broker‘ den Varietätspool der Relationierungsmöglichkeiten erweitern und damit die ‚kollektive Intelligenz‘ von Netzwerken steigern – ohne genau zu wissen, was am Ende herauskommen wird.“, p. 176 [“through strategic intervention, as a moderator or ‘knowledge broker’, expand the pool of possible relations and thereby increase the ‘collective intelligence’ of networks – without knowing exactly what will come out in the end.”].
48 Among many other works, Teubner discusses the challenges that digitisation, privatisation and globalisation bring to law in Teubner (2004).


Self-regulation in Online Content Platforms and the Protection … 281

well-functioning system of free expression depends on technologies of communication and a public ready and able to use those technologies to participate in the growth and development of culture.49

Despite some shortcomings of classic theories of the efficacy of constitutional rights among private parties in cyberspace, as revealed by Karavas and others, it seems that the best alternative remains, for the time being, the concomitant adoption of indirect efficacy as a rule and direct efficacy in specific cases. That is, Canaris’ theory of protective duties,50 where the legislator’s role is to produce specific normative determinations in order to ensure the necessary protection of the fundamental rights of both the self-regulators and those subject to this regulation, without overstepping the boundaries of the prohibition of insufficient or excessive protection.51 On the other hand, as the examples illustrate, the application of direct efficacy is welcome when certain actors—such as YouTube or Facebook—occupy an asymmetrical position in relation to internet users—such as John or those who had their accounts suspended because they posted pictures of topless Aboriginal women. The essential nucleus of freedom of expression, access to information and privacy is sometimes put in danger. This line of argument, supported by J. J. Gomes Canotilho’s proposal of a differentiating solution,52 should acknowledge, however, that the relationships of dominance traditionally found in society, which would be the basis for the choice between direct and indirect efficacy, are not cleanly transposed to the networked society. Still, power imbalances online are plentiful and the basic core of personality rights is at times threatened.
To be clear, I am not proposing judicial review of each post, picture or video on social media. This moderation has been taken on, and should increasingly continue to be performed, by users themselves in the context of self-regulation, especially as a way to ensure a wider diversity of views and a sturdier protection of speech against the state.53 The same goes for private platforms. It should not be up to judges in specific cases to individually revise choices, made by humans or by machines, to demote, remove or obscure content. The limits imposed on private actors online should concern procedure. In exceptional cases, when the Judiciary is called upon to reverse a decision on content, it should be with an eye on the procedural aspects that allowed a violation of personality rights so severe that direct efficacy comes into play.
In the case of John, as regards copyright, the judge’s role goes beyond merely recognizing fair use. It should also be to identify the aspects of YouTube’s architecture that resulted in this instance of free speech restriction. By unearthing and describing the system rules that produced excessive censorship in John’s case, the ruling could contribute a transparent roadmap of how to improve the self-regulation system.
In that the Judiciary would not evaluate every single case, the model of self-regulation I propose here might seem the same as notice-and-takedown

49 Balkin (2004), pp. 51–52.
50 Canaris (2003), p. 56.
51 On these limits, see Sarlet (2005).
52 Canotilho (2003), p. 1253.
53 Binenbojm (2016), p. 310.

systems or that of the German Netzwerkdurchsetzungsgesetz. There are, however, three essential distinctions. First, self-regulation involves a prominent role of users in establishing, overseeing, applying and enforcing rules in situations sensitive to personality rights. Private intermediaries are not solely in charge of making judgement calls.
Second, most legislative and jurisprudential alternatives show very little concern with procedural rules that would foster a long-term balance between the protection and realization of personality rights online, increasingly independent of judicial review. Their answer tends to focus on more centralized, content-based analysis by traditional or new state-like authorities and on more viewpoint regulation by social actors other than the users themselves. That is precisely the opposite of what I argue here.
Third, the traditional solutions—including the unfortunately less popular judicial notice-and-takedown standard adopted in Brazil—always assume that the only possible regulation mechanism is removal. This might have made sense in the age of mass media, when the space for content was scarce. Today, however, space is unlimited; human attention has instead become the scarce commodity.54 As a result, removing a statement or picture from one link is irrelevant, because it can easily be placed in another, endlessly. At the same time, removing attention—in the form of search results or newsfeed appearances—has a huge impact even when the content remains untouched. Platforms and users acting in collaboration have already come up with very diverse and effective architectural system features to allow for the increase or decrease of attention to specific content, with users themselves at the helm. Government and the Judiciary, on the other hand, have experience only with managing decisions to remove or censor. Removal is a binary decision—content stays up or is taken offline. Attention, on the other hand, is much more nuanced. Only in the context of self-regulation can fine-grained judgement calls be made about attention: about when content is displayed, under what circumstances and to whom it is proactively served, even while it remains passively accessible at all times.
The efficacy of fundamental rights in private relations requires companies such as Facebook to tread a very wide yet demarcated path in the elaboration of system design and in the application—by code or by human governance—of rules that deal with the freedom of expression of their users. There are limits that could eventually trigger judicial review and produce rulings that offer indications of how to calibrate the architecture of the platform, in addition to merely revising the analysis of the specific content in the case. The Judiciary thus has the responsibility of pointing out necessary procedural adjustments that enhance the balance between information and personality rights in social media and other online fora.

54 Hartmann (2015).

4 Conclusion

My intention was to briefly describe central aspects of self-regulation mechanisms through code. The goal was not to present a finalized and tested theory of a new kind of self-regulation, but rather to point out the reasons why it is warranted. The self-regulation performed by content platforms along with users cannot be explained by traditional notions of how to restrict speech abuse or by existing definitions of self-regulation in administrative law. Still, under an open concept that acknowledges the diversity of experiences of self-regulation and their constant evolution, the criteria for comprehension and evaluation enumerated by Gustavo Binenbojm offer at least some of the central elements of the response that the law should provide to this phenomenon: chiefly the possibility of exploring institutional spaces that foster the cooperation of state and private agents, the development of incentives, and the identification of the best forms of relationship between the state, regulated parties and interested parties.55
Ex ante enforcement is an element that strengthens the control of conduct, possibly even excessively in the case of speech, as I have tried to show here. The experience of users in these ecosystems indicates a new, more efficient form of relationship between regulated and interested parties, especially—but not exclusively—due to the level of cooperation in the production of norms. This only has a chance of operating with satisfactory results if transparency is one of the central guidelines.
As I have suggested, the efficacy of fundamental rights in the interaction between the private intermediary and the users ensures the possibility of judicial intervention to correct procedural excesses, but not to review evaluations of the merits of the content of user speech. The same goes for a possible intervention of the Executive branch in this field (Hartmann 2020).
Further developments in scholarship should focus on a deeper analysis of the application and effects of the mechanisms of self-regulation by code, in order to shed some light on the factual and legal limitations of the collateral restrictions to which individuals online are subject. Less information about such mechanisms perpetuates a context in which users have fewer means to protect their legitimate interests, while companies, in turn, benefit from a disproportionate concentration of power fostered by the utter lack of knowledge of other private and state actors56 about the new mechanisms of self-regulation that thrive in the design, control and free reconfiguration of the architecture of open online spaces in the networked society.
This specific aspect of the new self-regulation is the object of studies on algorithmic transparency and accountability.57 To know, predict and follow the direction in which machine moderation of online speech is going, one needs to know and understand how the algorithms operate. Transparency for algorithms in this scenario is not what people usually think it could be, and there are good reasons why pure transparency is not viable for complex algorithms such as those used by Facebook and Google

55 Binenbojm (2016), p. 309.
56 Pasquale (2016).
57 Diakopoulos (2016).

to sort out content. There are solutions for accountability, however.58 It is currently impossible to prevent software from playing a role in making judgement calls about the quality of expression, including the protection of honor and hate speech. But it is not impossible to prevent a lack of accountability for the ways the private platforms design, improve and operate algorithms.
One last example, on an extremely popular topic, should drive the point home. The role played by users as moderators in a code-enforced system of content self-regulation solves the problem platforms face when it is too risky for them to filter content themselves, as they are unsure of what criteria to employ or whether the legitimacy of their action might be questioned. Actively filtering content themselves might bring political in addition to financial costs. That is the challenge that fake news poses for private social media companies today. Self-regulation with users at the helm is the strategy adopted by Facebook to tackle fake news. About figuring out what content to classify as fake news, Mark Zuckerberg stated:
We could try to make that decision ourselves, but that’s not something we’re comfortable with (…) We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. We decided that having the community determine which sources are broadly trusted would be most objective.59

Seen through the lens of traditional legal concepts, this might look like an example of regular outsourcing. What I propose is that it should be acknowledged as a two-way relationship that is both necessary and efficient, especially because the future of speech moderation lies in regulation, not in isolated judicial review of millions of posts, pictures and videos. This seems to be the best way forward in the fight against fake news. One of the few academic works on the legal aspects of restricting fake news already published in the wake of the 2016 presidential election in the United States likewise points to promising structural regulation solutions instead of leaving it all exclusively to the courts.60

References

Alexander L (2016) Facebook’s censorship of Aboriginal bodies raises troubling ideas of ‘decency’.
Available via The Guardian 23 March 2016. https://www.theguardian.com/technology/2016/mar/
23/facebook-censorship-topless-aboriginal-women. Accessed 11 Jan. 2022
Andrade J (1987) Os direitos fundamentais na constituição portuguesa de 1976. Almedina, Coimbra
Ananny M, Crawford K (2016) Seeing without knowing: limitations of the transparency ideal and its application to algorithmic accountability. New Media Soc. https://doi.org/10.1177/1461444816676645

58 Ananny and Crawford (2016).
59 Tiku (2018).
60 “Not all regulations affecting fake news publishers are strictly legal in nature. Many advertising networks, social media companies, and other Internet partners enact and enforce their own restrictions relevant to the publication of fake news.” Klein and Wueller (2017), p. 10.

Balkin J (2004) Digital speech and democratic culture: a theory of freedom of expression for the
information society. New York Univ Law Rev 79(1):1–55
Balkin J (2009) The future of free expression in a digital age. Pepperdine Law Rev 36:427–444
Barzilai-Nahon K (2008) Toward a theory of network gatekeeping: a framework for exploring
information control. J Am Soc Inform Sci Technol 59:1493–1512
Belli L, Zingales N (eds) (2017) Platform regulations: how platforms are regulated and how they
regulate us. Official outcome of the UN IGF dynamic coalition on platform responsibility. Escola
de Direito do Rio de Janeiro da Fundação Getulio Vargas, Rio de Janeiro
Belli L (2015) A heterostakeholder cooperation for sustainable internet policymaking. Internet
Policy Rev 4(2):1–21
Binenbojm G (2016) Poder de Polícia. Ordenação. Regulação. Transformações Político-jurídicas,
econômicas e institucionais do direito administrativo ordenador. Fórum, Belo Horizonte
Canaris CW (2003) Direitos fundamentais e direito privado. Almedina, Coimbra
Canotilho JJG (2003) Direito constitucional e teoria da constituição, 7th edn. Almedina, Coimbra
Davidson T et al (2017) Automated hate speech detection and the problem of offensive language. In:
Proceedings of the 11th international AAAI conference on web and social media (ICWSM-17).
https://arxiv.org/abs/1703.04009. Accessed 11 Jan. 2022
Diakopoulos N (2016) Accountability in algorithmic decision making. Commun ACM 59(2):56–62
Ellis E (2017) The alt-right’s newest ploy? Trolling with false symbols. Available via Wired 10 May
2017. https://www.wired.com/2017/05/alt-rights-newest-ploy-trolling-false-symbols/. Accessed
11 Jan. 2022
Hartmann IA (2015) Liberdade de Manifestação Política e Campanhas: É preciso atenção aos
algoritmos. In: Falcão J (ed) Reforma Eleitoral no Brasil. Civilização Brasileira, Rio de Janeiro
Hartmann IA (2017) Let the users be the filter? Crowdsourced filtering to avoid online intermediary
liability. J Oxford Centre Socio-Legal Stud 1:21–47
Hartmann IA (2020) A new framework for online content moderation. Comp Law Secur Rev
36:1–10
Hockett R, Omarova S (2013) “Private” means to “public” ends: governments as market actors. Cornell Law School research paper no. 13-84. http://ssrn.com/abstract=2222444. Accessed
11 Jan. 2022
Jolls C (2007) Behavioral law and economics. Public law and legal theory paper no. 130. http://
ssrn.com/abstract=959177. Accessed 11 Jan. 2022
Karavas V (2007) Digitale Grundrechte. Elemente einer Verfassung des Informationsflusses im
Internet. Nomos, Baden-Baden
Khal (2017) Why do people keep connecting Taylor Swift’s new song to the alt-right? Complex 29
August 2017. http://www.complex.com/music/2017/08/why-do-people-keep-connecting-taylor-
swift-new-song-to-the-alt-right. Accessed 11 Jan. 2022
Klein D, Wueller J (2017) Fake news: a legal perspective. J Internet Law 20(10):6–12
Kleinwächter W (2009) Internet co-governance. Towards a multilayer multiplayer mechanism of
consultation, coordination and cooperation (M3C3) In: Mansell R (ed) The information society,
vol III (Democracy, governance and regulation). Routledge, London
Ladeur KH (2009) Der Staat der „Gesellschaft der Netzwerke“. Zur Notwendigkeit der Fortentwicklung des Paradigmas des „Gewährleistungsstaates“. Der Staat 48(2):163–192
Lee A (2011) Facebook apologizes for censoring Gay Kiss photo. The Huffington Post 19 April
2011. https://www.huffpost.com/entry/facebook-gay-kiss_n_850941. Accessed 11 Jan. 2022
Lessig L (1996) The zones of cyberspace. Stanford Law Rev 48:1403–1411
Lessig L (1998) The new Chicago school. J Leg Stud 27(2):661–691
Lessig L (2006) Code. Version 2.0. Basic Books, New York
Lessig L (2009) Remix: making art and commerce thrive in the hybrid economy. Penguin Books,
London
MacKaay E (2000) History of law and economics. In: Bouckaert B, De Gheest G (eds) Encyclopedia
of law and economics, vol 1. Edward Elgar, Cheltenham, pp 65–117

Mueller M et al (2007) The internet and global governance: principles and norms for a new regime.
Glob Gov 13(2):237–254
Mulholland C (2017) Secondary liability of service providers in Brazil: the effect of the civil rights
framework. In: Dinwoodie G (ed) Secondary liability of internet service providers. Springer, New
York, pp 171–184
Neuner J (2007) A influência dos direitos fundamentais sobre o direito privado alemão. In: Sarlet
IW, Neuner J, Monteiro A (eds) Direitos fundamentais e direito privado. Uma perspectiva de
direito comparado. Almedina, Coimbra
Ogus A (1995) Rethinking self-regulation. Oxf J Leg Stud 15(1):97–108
Parker G, Van Alstyne M, Choudary S (2016) Platform revolution: how networked markets are
transforming the economy and how to make them work for you. W. W. Norton & Company, New
York
Pasquale F (2016) The black box society: the secret algorithms that control money and information.
Harvard University Press, Cambridge
Pereira J (2006) Apontamentos sobre a aplicação das normas de direito fundamental nas relações
jurídicas entre particulares. In: Barroso L (ed) A nova interpretação constitucional. Ponderação,
direitos fundamentais e relações privadas. Renovar, Rio de Janeiro
Pinto PM (2007) A influência dos direitos fundamentais sobre o direito privado português. In:
Sarlet IW, Neuner J, Monteiro A (eds) Direitos fundamentais e direito privado. Uma perspectiva
de direito comparado. Almedina, Coimbra
Reynolds M (2016) Yahoo’s anti-abuse AI can hunt out even the most devious online trolls. Available
via Wired 29 July 2016. http://www.wired.co.uk/article/yahoo-online-abuse-algorithm. Accessed
11 Jan. 2022
Sarlet IW (2005) Constituição e Proporcionalidade: o direito penal e os direitos fundamentais entre
proibição de excesso e de insuficiência. Boletim da Faculdade de Direito da Universidade de
Coimbra 81:325–386
Sarlet IW (2006) A eficácia dos direitos fundamentais, 6th edn. Livraria do Advogado, Porto Alegre
Sarlet IW (2007) A influência dos direitos fundamentais no direito privado: o caso brasileiro. In:
Sarlet IW, Neuner J, Monteiro A (eds) Direitos fundamentais e direito privado. Uma perspectiva
de direito comparado. Almedina, Coimbra
Sarmento D (2010) Direitos fundamentais e relações privadas, 2nd edn. Lumen Juris, Rio de Janeiro
Seltzer W (2010) Free speech unmoored in copyright’s safe harbor: chilling effects of the DMCA
on the first amendment. Harvard J Law Technol 24:171–232
Shaw A (2012) Centralized and decentralized gatekeeping in an open online collective. Polit Soc
40:349–388
Solon O (2017) Facebook asks users for nude photos in project to combat ‘revenge porn’. Available
via The Guardian, 7 November 2017. https://www.theguardian.com/technology/2017/nov/07/fac
ebook-revenge-porn-nude-photos. Accessed 11 Jan. 2022
Stiglitz J (2009) Regulation and failure. In: Moss D, Cisternino J (orgs) New perspectives on
regulation. The Tobin Project, Cambridge, pp 11–23
Teubner G (2004) Societal constitutionalism: alternatives to state-centered constitutional theory?
In: Joerges C, Sand IJ, Teubner G (eds) Constitutionalism and transnational governance. Hart
Publishing, Oxford, pp 3–28
Tiku N (2018) Facebook’s latest fix for fake news: ask users what they trust. Available via Wired 19
January 2018. https://www.wired.com/story/facebooks-latest-fix-for-fake-news-ask-users-what-
they-trust. Accessed 11 Jan. 2022
Tushnet M (2007) Weak courts, strong rights: judicial review and social welfare rights in comparative
constitutional law. Princeton University Press, Princeton
Wu T (2003) When code isn’t law. Virginia Law Rev 89:103–170

Ivar Alberto Martins Hartmann Associate Professor at Insper Learning Institution in São
Paulo. MSc (Pontifical Catholic University Rio Grande do Sul, Brazil—PUCRS), LL.M. (Harvard
Law School), Ph.D. (State University of Rio de Janeiro, Brazil—UERJ). Main areas of research:
Cyberlaw, Legal Data Science and Constitutional Law. Selected publications: A new frame-
work for online content moderation. Computer Law & Security Review, v. 36 (2020), pp. 1–
10; Combining Ad Libraries with Fact Checking to Increase Transparency of Misinformation. In
Karanicolas, Michael (ed). Tackling the “Fake” Without Harming the “News”. Wikimedia/Yale
Law School Initiative on Intermediaries and Information (2021), pp. 67–84; A Right to Free
Internet? On Internet Access and Social Rights. Journal of High Technology Law, v. XIII (2013),
pp. 297–429; Let the Users be the Filter? Crowdsourced Filtering to Avoid Online Intermediary
Liability. Journal of the Oxford Centre for Socio-Legal Studies, v. 2017 (2017), pp. 21–47; Timing
Control without Docket Control. Journal of Law and Courts, Spring (2017), pp. 105–140 (in
collaboration with Diego Werneck Arguelhes).
Regulating Intermediaries to Protect
Personality Rights Online—The Case
of the German NetzDG

Wolfgang Schulz

Abstract Along with the shift of communication to the internet, hate speech and
misinformation are also relocating to the Net. Germany has attempted to tackle
this problem by enacting the infamous Network Enforcement Act (Netzwerk-
durchsetzungsgesetz, or NetzDG), which targets online platforms instead of indi-
vidual speakers. After providing an overview of the specific challenges presented
by internet-based communication, this chapter discusses the NetzDG’s regulatory
concept. It then critically examines the NetzDG’s compatibility with the e-Commerce
Directive and its impact on fundamental rights, namely freedom of expression.
Based on the findings, the chapter discusses human-rights-friendly alternatives to
NetzDG-style platform regulation.

1 Introduction

With the shift of communication to the internet, conflicts between freedom of expres-
sion and personality rights are also relocating to the Net. This article examines the
peculiarities of online communication and poses the question of the role of inter-
mediaries in these conflicts, both de facto and legally. Using the German Network
Enforcement Act as an example, it considers the consequences of compelling inter-
mediaries to restrict speech. Finally, the paper discusses what a human-rights-friendly
solution might look like.

W. Schulz (B)
Leibniz-Institute for Media Research | Hans Bredow-Institut, Rothenbaumchaussee 36, 20148
Hamburg, Germany
e-mail: schulz@hiig.de

© Springer Nature Switzerland AG 2022 289


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_12
290 W. Schulz

2 Personality Rights Risks Online

For lawyers—as opposed to politicians, from time to time—it is evident that what
is illegal offline is also illegal online. This is especially true for libel, defamation
and other privacy and personality rights infringements. Nevertheless, internet-based
communication is structurally different in ways that may lead to a reassessment of
existing rules and of the traditional means of implementing legal requirements.1 The
following attributes set online communication apart from offline communication:
Volume—The sheer number of cases can become a structural challenge. Face-
book, to take an example, stated that it makes over 100,000 content-related deci-
sions per month in the German version of Facebook alone.2 This means that the
suggestion of letting domestic courts decide on all content-related user-user or
user-platform conflicts must be ruled out as a solution.
Acceleration—While some content put on a social media platform might never be
read by anyone apart from the authors, some posts can go viral across platforms
and reach many people in a very short period of time.3 Therefore, governance
mechanisms have to come into effect quickly, before the content has been widely
distributed, since one can assume that after its distribution the content cannot be
located and addressed. There are currently no effective mechanisms to control
access to content after distribution.4
Observability—Online communication makes it easy for third parties or even for
the general public to observe communication among groups or between individ-
uals. This has made services like rating platforms possible but also poses distinct
challenges: observers may not know the contexts in which and the social rules
under which individuals or groups discuss. Many of society’s problems with the
internet in general and social media in particular result from the fact that previously
private and localized discussions, for instance at the pub or the dinner table, are
now visible to everyone, with all their exaggerations, polemics and stereotypes.
Persistence—It has become a popular saying that the internet never forgets. This
is true in the sense that at least so far, there is no technology that can ensure
that a piece of information can be deleted from all servers, wherever they are.
The so-called “right to be forgotten” is no more than a right to deletion that can
provide a data subject with a claim against a particular controller.5 However, while
search engines can help make the information in question far harder to find, this
information is not gone.
Attribution Problems—It can be hard to identify a speaker online, for instance
if one wants to file a claim against him or her. Even if there is a real-name policy, this does not mean that the users are identifiable through their username in online
1 Schulz (2019), pp. 3f.
2 Köver (2016).
3 Goel et al. (2009).
4 For technical possibilities and limits, see Gorwa et al. (2020) and Federrath (2015).
5 Cf. Paal (2018), Rn. 1–3.
Regulating Intermediaries to Protect Personality Rights Online—The Case … 291

communication. Anonymity is common online, and many users state that the
option of not disclosing their real name is essential for many of the functions that
the internet fulfils for public communication.6 Often it is also unclear whether
other actors, like the platform providers, are claiming ownership of a piece of
information as well, making attribution difficult.
Weak Social Control—Coupled with the attribution problem is the aspect of
social control. Legal concepts like privacy and its protection depend for their
implementation largely on the internalization of relevant norms. Social norms and
social control make the main difference, not legal enforcement. Those mechanisms
might be weaker online, especially when the speakers do not know each other in
“real life”. There are studies that show that when people communicate in anony-
mous forums, they are more uninhibited—and ultimately more hurtful—than
usual.7
Jurisdiction Issues—As a global technical medium, the internet enables anyone
to access any content, at least in principle. Although people initially worried
about the internet being a legal vacuum, it has now become apparent that, to the
contrary, its main challenge is that legal regimes of all nation-states can apply at
the same time. Since legal systems frame privacy in different ways, there is a risk
of jurisdictional conflicts.8
These are only some important structural characteristics of online communica-
tion. Against this background, there are political initiatives to tackle the problem of
harmful content online, including content that infringes on the privacy of others. The
effects mentioned above make it unlikely that the problem can be solved by means
of traditional legal instruments, e.g. users suing each other in domestic courts. The
attention of policymakers has therefore turned to the actors that might solve the
problem in an easy way: the providers of the intermediary services, like social media
platform providers.9 These providers generally have mechanisms in place to assess
the conformity of content with their own community standards, they can act before
things go viral and they can respond even if the user who posted the content cannot
be identified.10 Addressing the intermediaries is supposedly the easy way.

3 Approaches to Regulating Intermediaries

This article focuses on approaches within the European Union, especially Germany.
Under Articles 12 to 14 of Directive 2000/31/EC (e-Commerce Directive), there is a

6 Especially emphasized by the German Federal Court of Justice (BGH) in its spickmich.de decision, BGH
23.06.2009 – VI ZR 196/08.
7 Cf. Pötzsch (2010).
8 For a comprehensive discourse on the topic, see Internet & Jurisdiction Policy Network (2018).
9 Cf. Schulz (2019), p. 9.
10 Balkin (2018a), p. 2019.

limit to the liability of service providers for third-party content.11 The e-Commerce
Directive (Article 14) has led to the development of notice-and-takedown procedures,
but it does not regulate them in detail. Even though there are differences in scope
and concept, this can be seen as the European equivalent of section 230 of the
US Communications Decency Act.12 Directive 2000/31/EC also establishes the country-of-origin principle, under which member states in which the service does not originate lack the competence to regulate the service in the fields coordinated by the Directive.
At the European level, the Commission has—against this background—so far
refrained from regulating intermediaries specifically but has opted to encourage
measures for self-regulation. In 2016, the Commission negotiated the Code of Conduct on countering illegal hate speech online with big players in the IT industry, and it evaluates compliance on a regular basis.13
The German government started with a similar approach: the Ministry of Justice called on the industry to improve its complaints mechanisms. In 2017, however, the government declared that it was not satisfied with the performance of these self-regulatory efforts. According to the German Ministry of Justice, Facebook in particular reacted too slowly to complaints, while Twitter generally had low response rates.14
Driven by fears that misinformation and hate messages could influence the
Bundestag15 election campaign in the autumn of 2017, the German government
hurriedly worked on a draft for the Network Enforcement Act (Netzwerkdurchset-
zungsgesetz, or NetzDG).16 Even though most experts advised against the approach
taken by this Act during a hearing at the judicial committee of the Bundestag, the
coalition used its majority to pass it before the summer recess in 2017.17 The NetzDG
came into effect on October 1, 2017, and has been fully applicable since January 1,
2018.
The NetzDG’s regulatory model has since been adopted by at least thirteen other
countries, according to a report published in 2019.18 In this sense, the German govern-
ment unintentionally created a blueprint for online censorship, which is used by both
democratic and authoritarian states.19

11 Implemented in Germany in sections 8–10 TMG. Liability in such cases is contingent on the service provider taking note of the unlawfulness of third-party content. For considerations on the Brazilian legal situation see Schreiber (2022), in this volume.
12 Wang (2018), pp. 36–37.
13 The challenges of enforced self-regulation in terms of the rule of law principle and human rights aspects cannot be discussed here. There is the obvious risk of states or supranational bodies like the EU Commission trying to convince the industry to “voluntarily” implement measures that the state could not legally enact.
14 Press release by the GMJ 2017.
15 The German Parliament.
16 Cf. Schulz (2019), pp. 10f.
17 Bundesgesetzblatt (BGBl.) Jahrgang 2017 Teil I Nr. 16, issued September 07 2017, 3352.
18 Mchangama and Fiss (2019).
19 Cf. Mchangama and Fiss (2019).
Regulating Intermediaries to Protect Personality Rights Online—The Case … 293

3.1 The German Network Enforcement Act

The Act defines the social networks that fall within its scope: some rules apply only to big networks (based on the number of users in Germany), some to all. There is no definition of hate speech or fake news; instead, the Act refers to existing definitions of criminal offences under the German Criminal Code. There is a long list of offences covered by the Act, partly aiming to protect general rights and public safety, partly safeguarding individuals’ rights, such as insult, slander and defamation (sections 185–189 of the German Criminal Code). The latter are the link to protecting privacy, which is the main focus of this article.
The Act also specifies that the provider of a social network must maintain an
effective and transparent procedure for handling complaints over unlawful content.
This is the main obligation under the Act. While early draft versions of the Act
determined that providers would violate their obligations if they failed to deal with
individual cases properly, it is now the obligation of the platform providers to provide
a functioning complaints system.
The NetzDG specifies what a functioning complaints system shall provide: providers have to ensure that they delete manifestly unlawful content within 24 hours after a complaint has been filed. Content that is unlawful but not manifestly so has to be deleted within seven days. The review period may exceed seven days if more time is required for the decision-making process. Providers may also make use of a system of self-regulation in this case, in order to reduce “overblocking”.
The inclusion of an element of self-regulation follows a debate in the lawmaking process in which critics suggested, as an alternative to the NetzDG approach, a model of co-regulation drawn from the German law on the protection of minors, which is widely regarded as a success. Under this scheme, providers can form a self-regulatory body that takes on monitoring and sanctioning responsibilities, with supervisory authority shifting from the state towards the providers engaging in self-regulation.20 Lawmakers, however, only included the above-mentioned notion of self-regulation and did not build on the regulatory framework established for the protection of minors. In March 2020, the Freiwillige Selbstkontrolle Multimedia-Diensteanbieter (FSM) was accredited as a self-regulatory body by the Federal Office of Justice and has made a number of content decisions since then.21
Under the NetzDG, providers of social networks must immediately name a person
authorized to receive complaints in the Federal Republic of Germany as well as a
point of contact for law enforcement.
Providers of social networks that receive more than 100 complaints over unlawful
content per calendar year must also produce biannual German-language reports on
the handling of complaints.
The Federal Office of Justice, which is directly subordinated to the Ministry of
Justice, oversees the implementation of the Act. Breaches of the obligations under

20 Cf. Schulz et al. (2008).


21 FSM, NetzDG, https://www.fsm.de/de/netzdg#N1_2. (accessed 11 Jan. 2022).

sec. 2 para 1 or of the obligation to remove unlawful content can be punished with a regulatory fine of up to fifty million euros.22 Fines can only be issued in cases of systematic failure of the complaints-handling system. The administrative offence may be sanctioned even if it is not committed in the Federal Republic of Germany. As of January 2020, the Federal Office of Justice had reportedly initiated around 1300 procedures against social network providers based on violations of their NetzDG obligations.23 Thus far, only Facebook has been sanctioned, with a fine of 2 million euros in July 2019.24
Furthermore, the NetzDG establishes a right to disclosure: anyone whose general personality rights have been violated by criminal offences covered by the Act can, as a rule, demand that the social network in question disclose details of the person who committed the offence. Such an entitlement to information is founded on existing general principles of civil law, in particular on the principle of good faith and the law of torts, within the bounds set by data protection rules. However, prior to the NetzDG and the corresponding amendment of the German Telemedia Act (TMG), the courts held that there was no legal basis under data protection law that allowed the operators of online platforms to disclose the subscriber information (“Bestandsdaten”) of anonymous users.25 Injured parties therefore had to first lodge a criminal complaint in order to find out the name and address of the perpetrator.
With the NetzDG, the right to disclosure can actually be asserted. Operators of
social networks are no longer restricted by data protection law and may disclose the
subscriber information of the infringer to the injured party under sec. 14 para 3 TMG.
In order to do so, however, a civil court with jurisdiction over the matter must have
ruled the disclosure to be permissible under sec. 14 para 4 TMG (judicial scrutiny
reservation).26

3.2 Revisions of the NetzDG in 2021

Building on the practical experience gained since the NetzDG entered into force,
the German parliament passed a bill to amend the Act in June 2021.27 Importantly,
the changes obligate providers of social networks to provide a more user-friendly
complaints mechanism under sec. 3b NetzDG. This requirement is not just formu-
lated for the types of unlawful content the NetzDG initially covered, but also extends
to removals of content for the violation of terms of service pursuant to sec. 3b

22 The fine complies with section 4 para 2 NetzDG: it refers to section 30 para 2 OWiG and hence
the fine can add up to 50 million euros.
23 Neuerer (2020).
24 Federal Ministry for Justice (2019); Facebook filed an appeal against the order.
25 Rixecker (2018), Rt. 318.
26 Whether this is helpful or not depends on what information the provider itself has, i.e. whether the service requires a logon with a clear name.


27 Gesetz zur Änderung des Netzwerkdurchsetzungsgesetzes vom 3. Juni 2021 (BGBl. I S. 1436).

para. 3 NetzDG. In its amended form, sec. 2 para. 2 NetzDG now also imposes
more stringent transparency reporting obligations on providers, including informa-
tion regarding the use of automated content moderation systems (no. 2), the results
of counter-notification procedures (no. 11 et seq.) and extended explanations on
the interpretation of terms of service and their compliance with German civil law
requirements (no. 16, no. 17).
Most of these amendments, such as improved transparency obligations, are
welcome.28 Nonetheless, the legislator failed to address more fundamental concerns
regarding the NetzDG; arguing that there was no evidence for overblocking, it left
the Act’s core regulatory approach untouched.
In a separate legislative package that was passed recently, the German Bundestag
even broadened the scope of unlawful content under the NetzDG by amending the
German Criminal Code.29 The new law also obligates social network providers to
report certain potentially criminal content to the Federal Criminal Police Office
(BKA), thus raising concerns regarding a chilling effect on online speech.30

3.3 Critical Analysis

Apart from the more formal points—including the notion that the Federation in Germany lacks legislative competence to regulate the issue31 and that the NetzDG is invalid for an infringement of Article 5 para 1 of Directive (EU) 2015/1535 because criminal offences and the concept of regulated self-regulation were added without re-notification of the EU Commission32—criticism of the NetzDG mainly refers to violations of the e-Commerce Directive and to fundamental rights concerns.33 As discussed above, the recent amendments of the NetzDG do not manage to alleviate these concerns.

28 Cf. Heldt (2020).


29 Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität.
30 Cf. Heldt (2020).
31 Cf. Statement of the German Bar Association (DAV) 2017, 5. Bitkom statement 2017, 3. For a dissenting opinion see Roßnagel et al. (2017), p. 9, arguing that the law does not regulate content but only substantiates sec. 10 TMG, which is itself a (lawful) federal provision.
32 Member states must notify again a draft which has already been examined under the provisions of Article 5 para 1 of Directive (EU) 2015/1535 if they make significant amendments to it: the third subparagraph of Article 5 para 1 specifies that amendments made to the text are considered significant if they have the effect of altering its scope, shortening the timetable originally envisaged for implementation, adding specifications or requirements, or making the latter more restrictive (cf. CJEU in case C-433/05, Lars Sandström). Criminal offences protecting the state’s reputation were removed (sections 90, 90a, 90b StGB) and section 201a StGB was added on the recommendation of the parliamentary committee on legal affairs on 28 June 2017, which can be regarded as a “significant change” under Article 5 para 1 of Directive (EU) 2015/1535.
33 See the Statement by Digitale Gesellschaft e.V (2017), p. 3. See also the statements by BVDW e.V (2017), pp. 1f, and OSCE (2017), pp. 8f, as well as Article 19 (2017), pp. 12ff and Global Network Initiative (2017).

3.3.1 Violation of Principles of the e-Commerce Directive

The assessment under European law focuses on the liability privilege of Article 14 e-Commerce Directive. Under Article 14, a hosting provider is shielded from liability for third-party content until it obtains positive knowledge of illegal content; it loses this privilege if it then does not react “expeditiously”. This raises the question of whether member states may substantiate the term “expeditiously” by naming specific time frames, as the German NetzDG does, or whether they must defer to the e-Commerce Directive, which may have deliberately opted not to set a rigid time frame.34 There is good reason to believe that Germany overstepped the mark, because the whole purpose of the Directive is to harmonize cross-border service provision in Europe; introducing different time frames in different member states would obstruct that aim.35
Even if member states were able to substantiate the time frame, under the e-
Commerce Directive the country of origin would be responsible for ensuring the
provider’s compliance with its legal framework. Other member states could only
enforce their regulation under the exemption clause of Article 3 para 4 e-Commerce
Directive. This exception is, however, restricted to individual cases and does not
allow member states to establish their jurisdiction “through the back door”.36
The obligation to name a domestic authorized recipient might also violate Article
3 e-Commerce Directive and hamper the freedom of establishment as laid down in
the EU treaties.37

3.3.2 Human Rights Aspects

Even if we limit our consideration to possible infringements of the right to freedom of expression, the multi-level fundamental rights system in Europe, a system that has not been constructed purposefully but rather emerged over time, makes for a rather complex situation:38
The European Convention on Human Rights (ECHR)—an international treaty concluded within the framework of the Council of Europe and enforced by the European Court of Human Rights. Freedom of speech is protected under Article 10 para 1 ECHR. In Germany, the Convention has been incorporated as binding law with the rank of a federal act, thus ranking below the national constitution. The EU itself has not signed and ratified the Convention.
The EU Charter of Fundamental Rights (ChFR)—part of the European Union’s legal system and binding on European institutions and member states, but only

34 Schulz (2019), p. 12.


35 See Recital 5, DIRECTIVE 2000/31/EC.
36 Cf. Spindler (2017), pp. 14f.
37 Cf. Ladeur and Gostomzyk (2017), p. 93.
38 For a discussion of the law’s (non-)compatibility with the International Covenant on Civil and Political Rights (ICCPR), see Kaye (2017) and Article 19 (2017), pp. 5ff.

when they are implementing and applying Union law. Freedom of speech is
protected under Article 11 para 1 ChFR.
The National Constitution, here the Grundgesetz (GG, Basic Law)—all laws,
including the NetzDG, have to be in accordance with it. Freedom of speech is
protected under Article 5 para 1 GG.
These three spheres of human rights protection in Europe—national, suprana-
tional, and international—overlap and intertwine. This peculiar architecture makes
it difficult to grasp the relationship between the frameworks and the interplay between
the different courts.39 For instance, the normative hierarchy between the provisions
of the Basic Law and those of the ECHR does not automatically lead to the inap-
plicability of the latter in case of a potential conflict with the former; due to the
commitment of the—hierarchically superior—Basic Law to international treaties,
the fundamental rights of the Basic Law instead need to be interpreted in such a way
as to accommodate the guarantees of the ECHR to the largest extent possible.40
This analysis will not go into specifics regarding freedom of speech guarantees
under those fundamental rights systems but will only name points of criticism that
might be relevant in all legal assessments. The complexity of the analysis is also
increased by the fact that fundamental rights interests of various actors are involved:
(1) The “victim” offended by speech on the platform, which might or might not
also be the complainant,
(2) the provider of the social media platform,
(3) the author of illegal content that is taken down,
(4) the author of legal content that is (wrongly) taken down,
(5) the recipients of content.
The legal status of intermediaries like social media services under freedom of speech
protection is still unclear.41 They clearly enjoy freedom of speech protection for
their own statements on the platform, but whether the provision of the platform as
such and specific functions provided for the users are protected as well is heavily
debated.42 This especially pertains to the curation of social media content, such
as Facebook’s News Feed, since the editorial decisions taken by intermediaries
might also be protected. Furthermore, intermediaries enable or at least facilitate
the communication of others and might therefore indirectly fall within the scope of
free speech guarantees.43 The central question is therefore whether an establishment

39 Cf. Schütze (2012), pp. 410f.


40 The Federal Constitutional Court took this view i.a. in the Görgülü decision, Order of 14 October
2004 – 2 BvR 1481/04.
41 In the case of Delfi AS v Estonia (64569/09), the European Court of Human Rights ruled against a news platform that had been deemed liable for anonymous defamatory comments posted under an article on its website.
42 Schulz (2019), p. 15.
43 Schulz (2017), p. 375.

of a complaints system as required by the NetzDG constitutes an encroachment on the providers’ right to freedom of communication.44
Only a limited class of content is illegal to publish under any circumstances. For all other types of content, protecting freedom of speech requires a context-sensitive determination of the meaning of the act of speech. This is especially true for possible infringements of personality rights. German constitutional law requires a complex balancing act when personal details about a person are reported without consent. One is therefore unlikely to encounter any “obvious” case in this field.
If a state law is likely to make a provider remove content that is legal, this law
interferes with freedom of speech (Article 5 para 1 GG, Article 10 para 1 ECHR,
Article 11 para 1 ChFR).45
To begin with, the time frame of 24 hours for removing content that is “manifestly unlawful” triggers freedom of speech concerns. First, there is doubt over whether
obviously illegal content can be identified easily, given that context always has to be
taken into account.46 Second, each piece of content that has been flagged has to be
assessed to identify the obviously illegal parts. According to Facebook, defamation
and hate speech alone account for 100,000 takedowns per month in Germany.47
Given that figure, it seems rational for a provider to take down any flagged content
if in doubt, just to save costs.48
The seven-day deadline for the remaining (not obviously) illegal content also
raises doubts. The determination of whether a declaration is a statement of fact or a
proclamation of an opinion is essential for an assessment under German law, since the
level of protection differs accordingly. While the expression of an opinion is always
protected under Article 5 para 1 GG, a statement of fact is only protected insofar as it
is a prerequisite of or conducive to the formation of an opinion of others.49 This is a
complex issue, and it is possible that different courts might disagree on the result.50
The same is true for the question of whether a statement of fact is true or not, since a
deliberately false assertion as well as an evidently untrue statement fall outside the
scope of Article 5 para 1 GG.51 To conduct such assessments within the given time
frame puts pressure on a provider and might again push said provider towards the
simple but human-rights-adverse solution of taking down the content in almost any

44 See Ladeur and Gostomzyk (2017), pp. 32ff and 93f for an emphasis of (and an argument for the
violation of) the provider’s fundamental right to occupational freedom.
45 Cf. Reporter ohne Grenzen (2017), pp. 7f; Ladeur and Gostomzyk (2017), pp. 76f.
46 Cf. Schulz (2019), p. 12.
47 Steinlechner (2018).
48 For a more nuanced analysis based on platforms’ transparency reports see Gollatz et al. (2018) and Heldt (2019).


49 Jarass (2018), Rt. 5 f.
50 See Lee (2017) for a discussion of the difficulties of context and statistics in German court decisions.
51 Jarass (2018), Rt. 7.

case. Furthermore, the providers lack information about the context, as well as the
necessary information-gathering tools, to make a proper assessment.52
Early versions of the draft contained a specific obligation to ensure that similar
content would not be uploaded again. This triggered fears of overblocking since the
most effective way of doing this is by using upload filters, which, at the current
state of development, fail to detect irony or critical reference to content.53 This part
of the draft has been removed, but the current legislation still requires that similar content on the platform be detected and removed, which again is best achieved by automated systems that are as yet not context-sensitive. Concerns therefore remain that platforms will resort to (semi-)automated flagging systems that can have a substantial impact on the final outcome of content removal decisions.54
If an encroachment on fundamental rights occurs and we seek to assess whether it
is justified, we have to perform a proportionality test. At least according to German
methodology, the first step of that test is to see whether the legislation follows a “legit-
imate aim”. While helping to enforce criminal law certainly constitutes a legitimate
aim, reducing the impact of criminal acts—as the NetzDG mainly does—might also
be legitimate. It is noteworthy, however, that the official explanatory memorandum
for the NetzDG begins by referring not to this aim, but rather to the need to maintain a
culture of political debate. This is understandable given the rise of right-wing populist
movements at the time the law was passed. However, the desire to protect political
culture, plausible as it is, does not suffice to limit human rights; it instead burdens
the NetzDG with the complexity of moral aspirations.55 It is not the responsibility
of the state to govern the style and civility of debate in society. However, if there
is indeed a structural risk to the freedom and openness of debate, state intervention
might be justified.
Another point of criticism is that the Federal Office of Justice has a crucial
role to play in the enforcement of the Act and reports directly to the Minister of
Justice, making it by no means politically independent.56 This is especially notable
in Germany, where the independence of the media system is firmly protected by
the Federal Constitutional Court, one of the reasons being the historical use of new
media during the Nazi dictatorship.57
These are only the broad lines of criticism, showing the structural problems
with this kind of regulation. It can thus already be said that trying to make use
of intermediaries might not be the silver bullet after all.58

52 Schulz (2019), p. 13.


53 Cf. Duarte et al. (2018).
54 Cf. Kaye (2018).
55 Schulz (2019), p. 15.
56 Cf. Kaye (2017), p. 4.
57 Schulz (2019), p. 12.
58 For a more nuanced conclusion and an emphasis on the positive effects of the law, see, inter alia, Roßnagel et al. (2017) and Theil (2018).



4 Towards a Human-Rights-Friendly Solution

4.1 Regulating Intermediaries

Online intermediaries of different kinds, including search engines, social media or app platforms, play a constitutive role in today’s digital environment. They have
become a new type of powerful institution in the twenty-first century. Due to their
role in shaping the public networked sphere, they are subject to intense and often
controversial policy debates.59 As mentioned before, their intermediary function makes them convenient targets for lawmakers and regulators.
Driven by the need to better understand online governance structures, we devel-
oped a four-component system built on a concept by Lawrence Lessig.60 It considers
legal norms set by the state, contracts, social norms and computer code in their inter-
action. With regard to intermediaries, it is significant that computer code, contracts
and, to some extent, social norms are developed by them and that the state tries to
influence all of these factors through the law. The power of intermediaries also raises
questions about “private ordering” by intermediaries,61 which is not the focus of
this article, but is important nonetheless. This article discusses how the law harnesses the power of intermediaries, for instance by making them code filters or act on their contractual right to remove content.
If legislation creates incentives for a rational intermediary to act in a way that likely
has a negative impact on freedom of speech, this must be considered as interfering
with freedom of expression.62 This means that it can only be justified when it complies
with the limitations specified in the GG, the ChFR or the ECHR.
There is an incentive to fulfil the legal demands by using technology. The detection and deletion of harmful speech online are already common practice; such technologies are used, for instance, to investigate child pornography and terrorist activities.63 Technology is also used in the field of copyright infringements.64 What is significant, however, is how the technical possibilities of regulation via code influence the legal requirements—and vice versa.

59 Cf. Gasser and Schulz (2015), pp. 3f.


60 See Niva Elkin-Koren (2011), pp. 16f.; see generally Lessig (2006).
61 Cf. Niva Elkin-Koren (2011), pp. 17f. and Balkin (2018b), pp. 1182f.
62 For the freedom of expression guaranteed by the GG, this follows from the modern interpretation of an interference with fundamental rights as any act by the state that makes the exercise of the affected right impossible or substantially more difficult. Cf. Grabenwarter (2017), Rt. 100.
63 Cf. Iqbal et al. (2018).
64 See Dewey (2016) on how YouTube uses automated takedown measures and the complications that arise, especially in connection with citations and Fair Use.



4.2 The Significance of Context as a Law and Technology Problem

Privacy infringements are a special case since, at least in Germany, the constitution
demands a proper construction of the meaning of the statement in question. This in
turn requires, as mentioned before, an assessment of the context.65 It makes, to take
an example, a big difference for the legal assessment whether critical reflection or
irony is involved. Child pornography is one of the few examples for which no context
is conceivable that would justify publication.
A widely discussed case in Germany regards a selfie taken by a refugee with
Chancellor Angela Merkel. This picture was circulated by right-wing groups along
with the false statement that this particular refugee was involved in terrorist activities;
this post went viral. Much of the ensuing discussion about the case also used the
picture. As a result, using the indicator of this picture in connection with the word
“terrorist” would lead to the removal of many posts that critically examined the use
of the image by right-wing extremists.
The state of the art in context detection thus becomes the determining factor for whether an automated system can produce legally acceptable results.
Despite significant progress in the field of artificial intelligence, it appears that there
are currently no systems that can detect irony66 and critical references to illegal
content with sufficient certainty. Therefore, the legal assessment—especially the
proportionality of a measure—depends on the technological state of the art.
This is not unheard of, but legal requirements for content regulation are among
the rare cases for which it becomes relevant on a constitutional level. This has to be
taken into account when discussing a human-rights-compatible solution for privacy
challenges online.

4.3 Council of Europe Recommendation on the Roles and Responsibilities of Internet Intermediaries

One of the most comprehensive attempts to issue guidance to states on how to regulate intermediaries in a human-rights-friendly manner is the Recommendation
CM/Rec(2018)2 of the Committee of Ministers to member states on the roles and

65 This means, inter alia, that due to the importance of this right for a democratic society, where
different interpretations of a statement are equally possible, one must assume the one that is still
protected by freedom of speech. Cf. Grabenwarter (2017), Rt.139f. In Germany, the Federal Consti-
tutional Court has established a nuanced test system for assessing ambiguous statements, Order of
25 October 2005 – 1 BvR 1696/98—(see also Hochhuth (2006)).
66 Van Hee, Lefever and Hoste (2018).

responsibilities of internet intermediaries, adopted by the Council of Europe in April 2018.67
The recommendation states that protection of privacy and personal data is funda-
mental for the enjoyment and exercise of most of the rights and freedoms guaranteed
in the Convention. However, the internet has facilitated an increase in privacy-related
risks and infringements and has spurred the spread of certain forms of harassment,
hatred and incitement to violence, in particular on the basis of gender, race and
religion, which remain underreported and are rarely remedied or prosecuted. More-
over, the rise of the internet and related technological developments have created
substantial challenges for the maintenance of public order and national security,
for crime prevention and law enforcement, and for the protection of the rights of
others, including intellectual property rights. Targeted disinformation campaigns
online, designed specifically to sow mistrust and confusion and to sharpen existing
divisions in society, may have destabilising effects on democratic processes.
A wide, diverse and rapidly evolving range of players, commonly referred to as
“internet intermediaries”, facilitate web-based interactions between natural and legal
persons by offering and performing a variety of functions and services. The recom-
mendation highlights that intermediaries may carry out several functions simultane-
ously. Intermediaries may moderate and rank content, including through the auto-
mated processing of personal data, and they may thereby exert forms of control that
influence users’ access to information online in ways comparable to the media, or they
may perform other functions that resemble those of publishers. Traditional media
outlets might also offer intermediary services, for instance when they offer space for
user-generated content on their platforms. The regulatory framework governing the
intermediary function is without prejudice to the frameworks that are applicable to
the other functions offered by the same entity.
Remarkably, the recommendation is divided almost equally into proposals for
states and proposals for intermediaries themselves. The latter are based on the Ruggie
principles, which state that companies, while not directly bound by human rights law, nevertheless have a responsibility to observe human rights in their decision-making.
There are some recommendations that set limits on the NetzDG-style of regu-
lation in the interest of protecting human rights. No. 1.3.2 is of particular impor-
tance, requiring state authorities to obtain an order from a judicial or other indepen-
dent administrative authority whose decisions are subject to judicial review when
compelling intermediaries to restrict access to content. This does not apply in cases
concerning content that is illegal irrespective of context, such as child pornography,
or in cases requiring expedited measures in accordance with the conditions prescribed
in Article 10 ECHR.
Crucial in this context is No. 1.3.7, which states:
States should ensure, in law and in practice, that intermediaries are not held liable for third-
party content which they merely give access to or which they transmit or store. State author-
ities may hold intermediaries co-responsible with respect to content that they store if they

67Available at: https://rm.coe.int/1680790e14. (accessed 11 Jan. 2022). The author of this article
was the chairman of the Committee that drafted the Recommendation.
Regulating Intermediaries to Protect Personality Rights Online—The Case … 303

do not act expeditiously to restrict access to content or services as soon as they become
aware of their illegal nature, including through notice-based procedures. State authorities
should ensure that notice-based procedures are not designed in a manner that incentivises
the take-down of legal content, for example due to inappropriately short timeframes. Notices
should contain sufficient information for intermediaries to take appropriate measures. Notices
submitted by states should be based on their own assessment of the illegality of the notified
content, in accordance with international standards. Content restrictions should provide for
notice of such restriction being given to the content producer / issuer as early as possible,
unless this interferes with ongoing law-enforcement activities. Information should also be
made available to users seeking access to the content, in accordance with applicable data
protection laws.

This statement offers several valuable points, including the requirement that notices be
specific and the explicit mention of time frames, both of which address main points
of criticism of the NetzDG approach.

5 Conclusions

The core of the legal assessment of measures taken by the state is the proportionality
test. The Council of Europe Recommendation already indicates some consequences
of applying the test. Conflicts between privacy and freedom of communication are
especially delicate because of their sensitivity to context. Intermediaries do not have
the knowledge base to carry out a proper assessment, and adequate technical means are
not yet available. In consequence, measures that lead to an ad hoc assessment and/or to the
use of technology for content moderation will not meet the proportionality test and,
as a result, violate freedom of expression. The proportionality test should also guide
the regional scope of measures.68
This insight—which is bitter for the protection of privacy—can only be mitigated
by accompanying measures. One obvious measure is to enhance knowledge about
counter-speech.69 This allows for a more “informed and nuanced approach
to identifying and limiting” hate speech, e.g. based on taxonomies of actors, types of
speech and types of harm.70 In political debates, counter-speech is sometimes invoked as
an argument against regulation without actually engaging with the issue. However, extensive
research on various practices and their effects exists.71
Furthermore, it can be in the best interest of intermediaries—and should be encour-
aged by lawmakers and courts alike—to design and implement instruments of dispute

68 In its Google Spain decision (C-131/12), the CJEU purposefully limited the deletion obligation
to the search results on European search queries, leaving open the possibility of still finding the
results when using, for example, Google.com.
69 See Benesch (2014).
70 Gagliadore et al. (2015), p. 5.
71 See for example the work of Susan Benesch, especially the Dangerous Speech Project, https://dangerousspeech.org/. Accessed 11 Jan. 2022.


304 W. Schulz

resolution so that the conflicts can be brought back to their primary parties.72 Finally,
we will almost certainly witness the adaptation of social norms to the new means of
communication. When we realise that people do not attach so much significance to
an incidental statement on an online platform, this will reduce the actual harm done
by these instances of communication and consequently reduce the need for legal
action to protect privacy online.

References

Article 19 (2017) Germany: the act to improve enforcement of the law in social networks—
legal analysis. https://www.article19.org/wp-content/uploads/2017/09/170901-Legal-Analysis-
German-NetzDG-Act.pdf. Accessed 11 Jan. 2022
Balkin J (2018a) Free speech is a triangle. Columbia Law Rev 118:2011–2056
Balkin J (2018b) Free speech in the algorithmic society: big data, private governance, and new
school speech regulation. UC Davis Law Rev 51:1149–1209
Benesch S (2014) Countering dangerous speech: new ideas for genocide prevention. https://www.
ushmm.org/m/pdfs/20140212-benesch-countering-dangerous-speech.pdf. Accessed 11 Jan.
2022
Bitkom (2017) Stellungnahme zum Regierungsentwurf NetzDG. https://www.bitkom.org/Bitkom/
Publikationen/Stellungnahme-zum-Regierungsentwurf-NetzDG.html. Accessed 11 Jan. 2022
Bundesverband Digitale Wirtschaft e.V. (2017) Stellungnahme zum Entwurf eines Gesetzes
zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken des BMJV vom 14. März
2017. https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Stellungnahmen/2017/Dow
nloads/03302017_Stellungnahme_BVDW_RefE_NetzDG.pdf?__blob=publicationFile&v=2.
Accessed 11 Jan. 2022
Cheung A, Schulz W (2018) Reputation protection on online rating sites. Stanford Law Technol J
Dewey C (2016) How we’re unwittingly letting robots censor the web. https://www.washingto
npost.com/news/the-intersect/wp/2016/03/29/how-were-unwittingly-letting-robots-censor-the-
web/?noredirect=on&utm_term=.e1efcfb456a2. Accessed 11 Jan. 2022
Digitale Gesellschaft e.V. (2017) Stellungnahme zum Referentenentwurf des Bundesministeriums
der Justiz und für Verbraucherschutz für ein Gesetz zur Verbesserung der Rechtsdurchset-
zung in sozialen Netzwerken. https://digitalegesellschaft.de/wp-content/uploads/2017/03/Stellu
ngnahme_DigiGes_NetzDG.pdf. Accessed 11 Jan. 2022
Duarte N, Llansó E, Loup A (2018) Mixed messages? The limits of automated social media content
analysis. https://cdt.org/files/2017/12/FAT-conference-draft-2018.pdf. Accessed 11 Jan. 2022
Elkin-Koren N (2011) Mapping the frontiers of governance in social media. Draft paper prepared
for the 1st Berlin symposium on Internet and society, 25–27 October 2011
Federal Ministry for Justice (2019) Bundesamt für Justiz (BfJ) erlässt Bußgeldbescheid gegen Face-
book. https://www.bmjv.de/SharedDocs/Artikel/DE/2019/070319_Facebook.html. Accessed 11
Jan. 2022
Federrath H (2015) Geoblocking und die Möglichkeiten der Technik. ZUM 59(12):929–932
Gasser U, Schulz W (2015) Governance of online intermediaries—observations from a series
of National case studies. https://cyber.harvard.edu/publications/2015/online_intermediaries.
Accessed 11 Jan. 2022

72 For early encouragement of such ideas, see Lide (1996) and Perritt (2000). New perspectives cf.
Cheung and Schulz (2018). For the advantages of modes of self-regulation see Article 19 (2017),
p. 11. Cf. also for a proposal to establish self-regulation mechanisms through code Hartmann (2022),
in this volume.

Gagliadore I, Gal D, Alves T, Martinez G (2015) Countering online hate speech. http://unesdoc.
unesco.org/images/0023/002332/233231e.pdf. Accessed 11 Jan. 2022
German Bar Association DAV (2017) Stellungnahme des Deutschen Anwaltsvereins durch die
Ausschüsse Informationsrecht und Strafrecht – Regierungsentwurf des Bundesministeriums der
Justiz und für Verbraucherschutz – Entwurf eines Gesetzes zur Verbesserung der Rechtsdurch-
setzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz – NetzDG). https://www.cr-onl
ine.de/DAV_SN_41-2017.pdf. Accessed 11 Jan. 2022
German Federal Ministry of Justice and Consumer Protection (2017) Löschung von strafbaren
Hasskommentaren durch soziale Netzwerke weiterhin nicht ausreichend. https://www.bmfsfj.de/
bmfsfj/aktuelles/presse/pressemitteilungen/loeschung-von-strafbaren-hasskommentaren-durch-
soziale-netzwerke-weiterhin-nicht-ausreichend-115300. Accessed 11 Jan. 2022
Global Network Initiative (2017) Proposed German legislation threatens free expression around
the world. https://globalnetworkinitiative.org/proposed-german-legislation-threatens-free-expres
sion-around-the-world/. Accessed 11 Jan. 2022
Goel S, Anderson A, Hofmann J, Watts D (2016) The structural virality of online diffusion. Manage
Sci 62(1)
Gollatz K, Riedl M, Pohlmann J (2018) Removals of online hate speech in numbers. https://www.
hiig.de/en/removals-of-online-hate-speech-numbers/. Accessed 11 Jan. 2022
Gorwa R, Binns R, Katzenbach C (2020) Algorithmic content moderation: technical and political
challenges in the automation of platform governance. Big Data Soc. https://doi.org/10.1177/205
3951719897945
Grabenwarter C (2017) Art. 5. In: Maunz, Dürig (eds) Grundgesetz-Kommentar. Beck, München
Hartmann IA (2022) Self-regulation in online content platforms and the protection of personality
rights. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the Internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Heldt A (2020) Germany is amending its online speech act NetzDG… but not only that. Internet
Policy Rev. https://policyreview.info/articles/news/germany-amending-its-online-speech-act-net
zdg-not-only/1464. Accessed 11 Jan. 2022
Heldt A (2019) Reading between the lines and the numbers: an analysis of the first NetzDG reports.
Internet Policy Rev 8(2)
Hochhuth M (2006) Kein Grundrecht auf üble Nachrede – Der Stolpe-Beschluss schützt das Personal
der Demokratie. NJW, pp 189–191
Internet & Jurisdiction Policy Network (2018) Data & jurisdiction work plan. https://www.intern
etjurisdiction.net/uploads/pdfs/Papers/Data-Jurisdiction-Work-Plan.pdf. Accessed 11 Jan. 2022
Iqbal F, Marrington A, Hung P, Yankson B (2018) A study of detecting child pornography on
smart phone. In: Conference paper for the international conference on network-based infor-
mation systems. https://www.researchgate.net/publication/319268051_A_Study_of_Detecting_
Child_Pornography_on_Smart_Phone. Accessed 11 Jan. 2022
Jarass H (2018) Art. 5. In: Jarass H, Pieroth B (eds) Grundgesetz Kommentar. Beck, München
Kaye D (2018) Report of the special Rapporteur on the promotion and protection of the right to
freedom of opinion and expression
Kaye D (2017) Open letter to the German Chancellor concerning the draft law “Netzwerkdurchset-
zungsgesetz” (OL-DEU-1-2017). http://www.ohchr.org/Documents/Issues/Opinion/Legislation/
OL-DEU-1-2017.pdf. Accessed 11 Jan. 2022
Köver C (2016) Facebook verrät, wie viele Hasskommentare es wirklich löscht. https://www.wired.
de/collection/life/facebook-verraet-wie-viele-hasskommentare-wirklich-geloescht-werden.
Accessed 11 Jan. 2022
Ladeur K-H, Gostomzyk T (2017) Gutachten zur Verfassungsmäßigkeit des Entwurfs eines Gesetzes
zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsge-
setz – NetzDG) i.d.F. vom 16. Mai 2017 – BT.Drs. 18/12356 – Erstattet auf Ansuchen des Bitkom.
https://www.cr-online.de/NetzDG-Gutachten-Gostomzyk-Ladeur.pdf. Accessed 11 Jan. 2022
Lee D (2017) Germany’s NetzDG and the threat to online free speech. https://law.yale.edu/mfia/
case-disclosed/germanys-netzdg-and-threat-online-free-speech. Accessed 11 Jan. 2022

Lessig L (2006) Code: Version 2.0. Basic Books, New York
Lide C (1996) ADR and cyberspace: the role of alternative dispute resolution in online commerce,
intellectual property and defamation. Ohio St J Disp Resol 12(1):193–222
Mchangama J, Fiss J (2019) The Digital Berlin Wall: how Germany (accidentally) created a
prototype for global online censorship. Justitia. https://globalfreedomofexpression.columbia.
edu/wp-content/uploads/2019/11/Analyse_The-Digital-Berlin-Wall-How-Germany-Accidenta
lly-Created-a-Prototype-for-Global-Online-Censorship.pdf. Accessed 11 Jan. 2022
Neuerer (2020) Bisher rund 1.300 Bußgeldverfahren gegen soziale Netzwerke. Handelsblatt.
https://www.handelsblatt.com/politik/deutschland/gesetz-gegen-hass-im-netz-bisher-rund-1-
300-bussgeldverfahren-gegen-soziale-netzwerke/25419580.html?ticket=ST-10011715-MTG
vec7PAfNXxmhB4VRg-ap3. Accessed 11 Jan. 2022
Organization for Security and Co-operation in Europe (2017) Legal review of the draft law on better
law enforcement in social networks. https://www.osce.org/fom/333541. Accessed 11 Jan. 2022
Paal B (2018) Art. 17. In: Paal B, Pauly D (eds) Beck’sche Kompakt-Kommentare Datenschutz-
grundverordnung Bundesdatenschutzgesetz. Beck, München
Perritt H (2000) Dispute resolution in cyberspace: demand for new forms of ADR. Ohio St J Disp
Resol 15(3):675–703
Pötzsch S (2010) Einfluss wahrgenommener Privatsphäre und Anonymität auf Forennutzer. In:
Ziegler J, Schmidt A (eds) Mensch & Computer 2010: Interaktive Kulturen. Oldenbourg Verlag,
München, pp 129–138
Reporter ohne Grenzen (2017) Stellungnahme zum Entwurf eines Gesetzes zur Verbesserung der
Rechtsdurchsetzung in sozialen Netzwerken der Fraktionen von CDU/CSU und SPD (BT DS
18/12356) zur Anhörung im Rechtsausschuss des Bundestages am 19. Juni 2017. https://www.
reporter-ohne-grenzen.de/fileadmin/Redaktion/Dokumente/Internetfreiheit/20170619_Stellu
ngnahme_oeA_BT-Rechtsausschuss_NetzDG_Reporter_ohne_Grenzen.pdf. Accessed 11 Jan.
2022
Rixecker R (2018) Appendix to § 12. In: Münchener Kommentar zum BGB. Beck, München
Roßnagel A, Bile T, Friedewald M, Geminn C, Heesen J, Karaboga M, Krämer N, Kreutzer M,
Löber L, Martin N, Nebel M, Ochs C (2017) Forum Privatheit und selbstbestimmtes Leben in der
digitalen Welt: policy paper das Netzwerkdurchsetzungsgesetz. https://www.forum-privatheit.de/
wp-content/uploads/Policy-Paper-NetzDG.pdf. Accessed 11 Jan. 2022
Schreiber A (2022) Civil rights framework of the Internet (BCRFI; Marco Civil da Internet):
Advance or setback? Civil liability for damage derived from content generated by Third Party.
In: Albers M, Sarlet IW (eds) Personality and data protection rights on the Internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)
Schulz W (2017) Kontrolle vorherrschender Meinungsmacht – Dekonstruktion eines medien-
rechtlichen Schlüsselbegriffs. AfP 2017:373–379
Schulz W (2019) Roles and responsibilities of information intermediaries: fighting misinformation
as a test case for a human rights-respecting governance of social media platforms. Hoover Working
Group on National Security, Technology, and Law, Aegis Series Paper No. 1904
Schulz W, Held T, Dreyer S (2008) Regulation of broadcasting and Internet services in
Germany. https://www.hans-bredow-institut.de/uploads/media/Publikationen/cms/media/f7f756
29f2781560a1b898f16a46cf87a066822c.pdf. Accessed 11 Jan. 2022
Schütze R (2012) European constitutional law. Cambridge University Press, Cambridge
Spindler G (2017) Legal expertise commissioned by BITKOM concerning the notified German Act
to improve enforcement of the law in social networks. https://www.bitkom.org/sites/default/files/
file/import/Legal-Expertise-Official-2-0.pdf. Accessed 11 Jan. 2022
Steinlechner P (2016) 100.000 Hassinhalte in einem Monat gelöscht. https://www.golem.de/news/
facebook-100-000-hassinhalte-in-einem-monat-geloescht-1609-123459.html. Accessed 11 Jan.
2022
Theil S (2018) The German NetzDG: a risk worth taking? https://verfassungsblog.de/the-german-
netzdg-a-risk-worth-taking/. Accessed 11 Jan. 2022

Wang J (2018) Regulating hosting ISPs’ responsibilities for copyright infringement. Springer,
Singapore

Wolfgang Schulz Professor of Media Law and Public Law including Theoretical Foundations.
Director of the Leibniz-Institute for Media Research | Hans-Bredow-Institut in Hamburg and
Director of the Alexander von Humboldt Institute for Internet and Society in Berlin. Chair
of the Expert Committee for Communication and Information of the German Commission for
UNESCO. Main areas of research: regulation of media content, questions concerning the legal
basis of new communication media, communication freedoms. Selected publications: Setting
Rules for 2.7 Billion. A (First) Look into Facebook’s Norm-Making System: Results of a Pilot
Study (together with Matthias C. Kettemann), Working Papers of the Hans-Bredow-Institut, 2020;
Künstliche Intelligenz, Intermediäre und Öffentlichkeit, Bericht an das BAKOM (together with
Stephan Dreyer), 2019; Kontrolle vorherrschender Meinungsmacht—Dekonstruktion eines medi-
enrechtlichen Schlüsselbegriffs, AfP 2017, pp. 373–379.
Online Anonymity—The Achilles’-Heel
of the Brazilian Marco Civil da Internet

Sohail Aftab

Abstract This article highlights the problem of interpreting the Brazilian constitu-
tional provision that prohibits online anonymity, and views it as in conflict with many
provisions of the Marco Civil da Internet (Law no.12.965/14) regarding protection
of private life, confidentiality of communication, and freedom of speech. It draws on
the reasoning of judges in cases related to the permissibility of mobile applications
for anonymous messages. The reasoning was either in favor of an absolute privilege
to free speech, including anonymous speech, disregarding its challenges for dignity-
related interests, or focused on the literal interpretation of the provision, favoring
full disclosure of the speaker in all circumstances to ensure liability in case of viola-
tion. This article also discusses the jurisprudence of the European Court of Human
Rights where anonymity is encouraged and, at the same time, dignitarian interests
are diligently protected through a complex balancing exercise. It further explores
the theoretical importance of anonymity being a value connected to the umbrella of
privacy rights. The empirical evidence from Pakistan shows that anonymity has a
great instrumental value that protects people from threats to their lives if their speech
or actions oppose the prevailing societal ethos. The lack of conclusive research-
findings connecting speech abuses with anonymity as well as the modern challenges
to anonymity in the online sphere are explored, with the conclusion that the case
for blanket prohibition of anonymity—as in the Brazilian constitution—is not very
strong.

1 Introduction

This article asserts that the blanket prohibition of anonymity in the Brazilian consti-
tution is an impediment to the objectives of the Marco Civil da Internet, which
grants historic digital rights to Brazilian citizens, including the right to privacy

S. Aftab (B)
Government of Pakistan, Islamabad, Pakistan
University of Hamburg, Hamburg, Germany

© Springer Nature Switzerland AG 2022 309


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_13

and the protection of private life. The controversy over the proper interpretation
of anonymity in the online world surfaces intermittently whenever the Brazilian
courts have to deal with encrypted Internet communication. In 2014, a judgment of a
trial court attracted much attention when the constitutional prohibition of anonymity
was invoked to block smartphone-messaging applications. Although the judgment
was soon overturned on appeal, neither the court of first instance nor the appeal
court judges could lay down elucidating principles for balancing various interests
linked to online anonymity. The judgments of both courts offered a unidirectional
and simplex conceptualization of anonymity, viewing it as an instrument to exploit
the right to free speech and to block the accountability arising from abuse of free
speech in the shape of attacking the privacy and dignity of others. To say that online
offensive speech such as defamation, cyberbullying, and hate speech is due solely to
the anonymous character of online interaction is only one side of the coin; this not
only ignores the privacy-related interests linked to anonymous online activities, but
also the variety of functions anonymity performs in the use of digital technologies.
In contrast to the Brazilian case, the regulation of anonymity in Europe and the
jurisprudence of the European Court of Human Rights present a balancing approach
rather than pushing for a blanket prohibition of encrypted channels for online
browsing. The scholarly literature shows that anonymity is ontologically linked to
the right to privacy and recognizes it as an essential condition for the free develop-
ment of personality. Furthermore, empirical research evidences the complexity of
the nature and functions of anonymous online communication. The precise correla-
tion of the causal effects of anonymity on the type of speech cannot be determined.
This plurality of functions is reflected in the European approach to anonymity regu-
lation where case law and legal instruments expressly recognize the importance of
anonymity for legitimate purposes.
Case studies from certain jurisdictions such as Pakistan strengthen the belief that,
without anonymity, self-expression by unconventional individuals can seriously jeopardize
their right to live a free life. In such circumstances, anonymity, pseudonymity,
and the use of encrypted devices and privacy-enhancing tools become crucial for
safeguarding them against profiling and surveillance.1
Based on the considerations of the European Court of Human Rights’ jurisprudence,
the conceptual linkage of anonymity with privacy, and its instrumental value,
the article proposes that Brazilian judges and policy makers must reconsider the
prevailing legal conception of anonymity. In order for the Marco Civil to fulfill
its privacy-related objectives, clear rules should be legislated that set out the criteria under
which the courts can order the disclosure of the identity of Internet users. Any legal
lacuna or uncertainty will weaken the expectations of privacy while surfing the Internet
and, as a result, the human rights commitment of the Marco Civil da Internet.

1 For a detailed account of the concept of surveillance and its regulation at the European level Albers
(2022), in this volume.

2 The Right to Privacy and Online Anonymity in Brazil

Known as the “Citizens’ Constitution”, the Constitution of Brazil grants many rights
to its citizens and dedicates no fewer than 78 items of its Article 5 to expressly
elaborate these rights in codified form. Item No. X of Article 5 states,2 “The privacy,
private life, honor and image of persons are inviolable, and the right to compensation
for property or moral damages resulting from their violation is ensured”. Item No.
XI declares that the home is “inviolable refuge of the individual, and no one may
enter therein without the consent of the dweller, except in the event of flagrante
delicto or disaster, or to give help, or, during the day, by court order”. Similarly,
the secrecy of correspondence, communication, and data is made inviolable.3 The
Brazilian constitution also recognizes the right to one’s data in Item No. LXXII which
provides for the habeas data writ: “a) to ensure the knowledge of information related to the
person of the petitioner, contained in records or data banks of government agencies or
of agencies of a public character; b) for the correction of data, when the petitioner does
not prefer to do so through a confidential process, either judicial or administrative.”
In late 2021, a right to the protection of personal data, including data in digital
media, was incorporated into the constitution as Article 5 LXXIX of the Brazilian
Federal Constitution.
The Brazilian Civil Code provides for personality rights (Article 11), an enforcement
mechanism in the form of claims for damages in case of infringement (Article
12), and the right to image (Article 20), as well as the right to seek an injunction from a
court to prevent violation of the private life of a natural person.4 Moreover, Brazil
is a signatory of the International Covenant on Civil and Political Rights (ICCPR)
and is bound by its international obligation to protect the right to privacy.5
The protections available to private life, family, home, and communication should
be equally applied in the context of Internet-related activities. Cognizant of this fact,
Brazil has enacted an omnibus General Data Protection law (in Portuguese Lei Geral
de Proteção de Dados Pessoais-LGPD) which became effective with all its provisions
in August 2021.6 This law is greatly influenced by the European Union General Data

2 English version is available at the Brazilian Chamber of Deputies website: http://bd.camara.gov.br. Accessed 11 Jan. 2022.
3 Article No. 5 (XII) of the Constitution of the Federative Republic of Brazil.
4 For details: Costa (2012), pp. 1–19.
5 According to Article 17 of ICCPR, “1. No one shall be subjected to arbitrary or unlawful inter-

ference with his privacy, family, home or correspondence, nor to unlawful attacks on his honor
and reputation. 2. Everyone has the right to the protection of the law against such interference or
attacks.”
6 The enforcement of this law was initially planned for February 2020. It was postponed until August 2020 in order to grant tech companies a grace period to adapt their systems to the new data protection legal regime. Due to further delays and disputes, the LGPD finally entered into force on September 18, 2020, with the exception of the sanctions in articles 52–54, which came into effect on August 1, 2021.

Protection Regulation (GDPR)7 and is highly appreciated by experts dealing with
data protection laws, as it will introduce much clarity to the data protection and
privacy enforcement regime. It will either replace or supplement the more than forty
existing legal norms that deal with privacy and data protection either
directly or indirectly.8 Prior to the current legislation, only piecemeal legal provisions
existed in different laws such as consumer law, credit information law, and access
to information law.9 However, the effective enforcement of privacy-related rights
still depends on how the Brazilian courts interpret these instruments and on how
the judiciary undertakes the challenge of reconciling various conflicting provisions.
For example, the constitutional prohibition on anonymous expression is one such
provision, which can act as an impediment to the use of privacy-enhancing
technologies. The following section will further analyze this problem with reference
to the framework legislated exclusively for the protection of Internet-related basic
rights.

2.1 Marco Civil da Internet: The Magna Carta for the Digital
World

Brazil has received many accolades for the Civil Rights’ Framework for the Internet,
known in Brazil as the Marco Civil da Internet. It has been termed a Magna
Carta for cyberspace, a constitution for the Internet, and “a hallmark, not only for
the Brazilian Internet, but also for the world’s democratic process”.10 The regulation
of three distinct but interdependent values, namely data privacy, net neutrality, and
freedom of expression, in a single document has an inherent holistic dimension that
has attracted worldwide attention.11 It has the potential to serve as a model legal
framework for Latin American and other jurisdictions, which can adopt the
Brazilian civil-law approach of dealing with the Internet comprehensively from
the perspective of fundamental rights.12 Even countries beyond Latin
America have considered following the Brazilian Internet regulatory approach.13

7 See for a comprehensive analysis of the influence of GDPR because of its effectiveness and
inherent international character Veit (2022), in this volume.
8 For details: Renato Leite Monteiro, “The new Brazilian General Data Protection Law-a detailed

analysis”, available at: https://iapp.org/news/a/the-new-brazilian-general-data-protection-law-a-det


ailed-analysis/. Accessed 11 Jan. 2022.
9 Doneda and Mendes (2014), pp. 3–20.
10 De Souza, Viola and Lemos (2015), pp. 6–10 and 26.
11 Saldías (2014), p. 1.
12 Saldías (2014), p. 4.
13 Souza, Viola and Lemos (2017), p. 49: “The Internet Bill of Rights has already inspired other

nations that are interested in following Brazil’s footsteps, while other governments are already
launching their own online consultation processes for writing their versions of our Internet Bill
of Rights. In Europe, members of the Italian parliament have contacted the Internet Bill of

Marco Civil da Internet provides a broad array of rights to the users of digital
services. These rights include freedom of expression,14 protection of privacy, devel-
opment of personality, and data protection. Article 2 declares freedom of expres-
sion as the foundation of Internet discipline in Brazil, and Article 3 states that
freedom of expression, communication and thought, protection of privacy, personal
data protection, as well as preservation of network neutrality are the principles of
Internet governance in Brazil. The subsequent articles provide for inviolability of
the related interests such as secrecy of the flow of digital communication and the
data of private communication.15 Disclosure of Internet users’ personal data to third
parties is prohibited,16 and the principles of data subjects’ consent as well as purpose-
limitation are established by the legislation.17 Marco Civil reasserts the two pillars of
online communication: “The guarantee to the right to privacy and freedom of speech
in the communications is a condition for the full exercise of the right to access to the
Internet.”18
In spite of its innovative and pro-rights nature, the framework’s broad and open-ended
language has left some room for incomplete protection of Internet users’ privacy-related
interests. The data retention provisions, such as the duty to keep connection and
access logs, can be used for surveillance purposes.19 More importantly, because of the
lack of an effective enforcement mechanism and its status as a non-substantive legal
instrument, the provisions of the framework are open to arbitrary judicial interpretations
and the discretion of individual judges. Marco Civil clearly sets out that its
principles do not exclude any other Brazilian law and international treaties that deal
with the same subject matter.20 This means that Marco Civil cannot be considered in
isolation and that any other legal instrument can jeopardize Internet-related liberties
of Brazilian citizens.21
In this context, the constitutional prohibition of anonymity needs to be highlighted.
Although external to the present law, it is connected to many of its provisions that
expressly guarantee the protection of privacy, private life, and sanctity of the confi-
dentiality of Internet communication in general. According to Article 5-IV of the

Rights rapporteur, the Brazilian Internet Steering Committee (CGI.br) as well as the Institute for
Technology & Society of Rio de Janeiro (ITS Rio) in order to explore a similar process.”
14 As to the roles of freedom of expression cf. Affonso Pereira de Souza and Laus Marinho Nunes

(2020), in this volume, and with a view to anonymity esp. Sects. 9.4.2 and 9.4.3.
15 Article 7 II and III.
16 Article 7 VII.
17 Article 7 VIII.
18 Article 8.
19 De Souza (2011), p. 521: “In any case scenario the fact that remain true is that whenever registra-

tion data (essentially name and billing address) and connexion [sic] data (IP address, time of access
and addresses visited) exists stored, there is a possibility to rebuild and trace a user whereabouts on
line with simple computer query. This scenario can easily be interpreted as a form of compulsory
registration to access the Internet, extinguishing in this way the anonymity of users”.
20 § 1 (sole paragraph) of Marco Civil da Internet.
21 Rossini et al. (2015), p. 21.
314 S. Aftab

Constitution of Brazil, “the expression of thought is free, and anonymity is forbidden”. The Civil Rights’ Framework for the Internet has been unable to introduce
clarity to the apparently conflicting positions of various constitutional provisions
that, on the one hand, provide for the protection of private life, but on the other,
prohibit anonymity across the board.22 The views of judges in various courts further
highlight this confusion, as may be observed in the following section.

2.2 Anonymity Before Brazilian Courts23

The dilemma of conceptualizing anonymity became apparent a few months after the
promulgation of Marco Civil da Internet when the Fifth Civil Court in the district of
Vitória issued an injunction, banning mobile-messaging applications called Secret
and Cryptic. Secret was an Android- and iOS-based messaging app which claimed to
be encrypted and assured users that their communications were anonymous. Cryptic
was only available at the Windows online store and was advertised as providing
encrypted communication.
The case was initiated under the Public Class Action Law by the public prosecutor with the aim of requiring Apple Computer Brasil Ltda., Google Brasil Internet Ltda., and Microsoft Informática Ltda. not only to remove both applications from their online stores but also to remotely wipe them from the mobile devices of users who had already installed them. The Court, while acknowledging the urgency of the matter, granted the injunctions, referring to the constitutional provision which forbids anonymity. The reasons given in support of the injunctions rest on the idea that the legislators intended to impose a blanket ban on anonymous speech.
Judge Paulo Cesar De Carvalho observed that although freedom of thought is guaran-
teed in the constitution, the same constitutional provision guarantees privacy-related
rights along with the right to compensation if violated.24 The judgment highlights the
importance of free speech for democratic dispensation and development of person-
ality. However, such an ideal can only be achieved if the corresponding duty is not
ignored. He asserted:
To make such rights compatible with the freedom of expression, without previous censoring,
the Constitution adopted the model of freedom with responsibility, preventing anonymity.
Therefore, the prohibition of anonymity allows the responsibilization for any eventual offense
to the referred rights to personality, also constitutionally protected.25

22 As earlier stated, it remains to be seen to what extent the new data protection law (LGPD) and its subsequent interpretation by the judges will succeed in overcoming the existing ambiguity.
23 The judgments have been translated into English by Internet Lab and are available at the online

platform: http://bloqueios.info/wp-content/uploads/2016/11/08-Secret.pdf. Accessed 11 Jan. 2022.


24 The court asserts that item X of Article 5 guarantees, “the privacy, private life, honor and image

of persons [as] inviolable, and the right to compensation for property or moral damages resulting
from their violation is ensured”.
25 Case Number: 0028553-98.2014.8.08.0024.
Online Anonymity—The Achilles’-Heel of the Brazilian Marco … 315

Referring to Brazilian legal scholar Daniel Sarmento, the judge noted that the prohibition of anonymity is justified by the requirement of identifying the speaker, not only for a system of “responsibilization” but also because the identity of the speaker is important for the receiver to make a value judgment about the expressed content. The
injunction for the removal of the Cryptic and Secret apps was issued on the basis of apprehensions that, being anonymous, users’ speech could cause eventual harm. Awarding compensation to the aggrieved party for infringement of her right to privacy, honor, or image becomes a challenge in the case of an anonymous speaker.
The Court declared the use of apps for anonymous communication a violation of the
constitutional provision prohibiting anonymity.
The addressees of the judgment, namely Apple, Microsoft, and Google, responded
to it in different ways. Apple and Microsoft removed Secret and Cryptic from their
stores, while Google did not remove Secret from its store and filed an appeal against
the judgment in the Court of Justice of the State of Espirito Santo. Microsoft later
filed an interlocutory appeal in the same court.
The Court of Justice, after considering various aspects of the trial court’s judgment,
repealed the injunction with a two-thirds majority vote.26 The appeal court judges
discussed the issue in a more detailed fashion and highlighted further aspects of the
issue that had been ignored in the initial adjudication. The judges also raised concerns
regarding the possible violation of the right to privacy in the process of deleting
applications already installed.27 However, the judgment still focused on the greater
possibility of identification through IP addresses during the use of messaging appli-
cations. Referring to Articles 15 and 19, which deal with application providers’ obligation to store application access logs and intermediaries’ liability for third-party content, the judge relied on the possibility of identifying users through their IP
addresses.28 The issues relating to online anonymity, pseudonymity, cryptography,
and their relationship to the protection of personal data and privacy were not explored
sufficiently. “[A]lthough the anonymity figures as the very reason to be for the appli-
cation, I do not think there are doubts as to the possibility of identifying the user

26 Case Number: 0030918-28.2014.8.08.0024 (Google) and 0031238-78.2014.8.08.0024


(Microsoft).
27 Appeals’ Court Judge Robson Luiz Albanez (Rapporteur) remarked, “It has to be pondered,

yet, that the determinations contained in the contested decision seem to be technically unfeasible,
being able to give rise, even before a perfunctory analysis, to the violation of the right to privacy of
the users, as it imposes to the Appellant to establish a remote access to the devices of all citizens
who have installed the application in their respective smartphones, with the purpose to remove the
program from their devices.”
28 Marco Civil da Internet, Article 15: “The Internet application provider that is duly incorporated as

a legal entity and carry out their activities in an organized, professional and with economic purposes
must keep the application access logs, under confidentiality, in a controlled and safe environment,
for 6 months, as detailed in regulation.”; Article 19: “In order to ensure freedom of expression and
prevent censorship, the provider of Internet applications can only be subject to civil liability for
damages resulting from content generated by third parties if, after an specific court order, it does
not take any steps to, within the framework of their service and within the time stated in the order,
make unavailable the content that was identified as being unlawful, unless otherwise provided by
law.”

through its IP address…”, with this observation, the judge voted in favor of repealing
the initial order.
Another appeals court judge, Samuel Meira Brasil Júnior, questioned the effec-
tiveness of identifying users through their IP addresses as provided by Marco Civil.
He declared that tracking users through their IP addresses was not a perfect solu-
tion because IP addresses track the flow of data from one system to another and do
not identify the actual user. In his view, in order to protect users preemptively, the judiciary could order the removal of the applications from smartphones for an anticipated wrongdoing, even if the law prohibits such removal of data. He raised the question of how such encrypted applications could be allowed when the Constitution expressly prohibits anonymity, and rejected the identification of users through IP addresses as an insufficient measure for reaching the actual person. Based on such reasoning, the judge eventually denied the interlocutory
appeal against the lower court’s injunction.
Ronaldo Gonçalves De Sousa, the third judge of the appeals court, however, presented a freedom of speech approach and called for its prioritization in almost all situations. He rejected the court of first instance’s reasoning concerning “advanced protection” as contradicting the Federal Supreme Court’s (STF) tendency towards prioritizing free speech over threats to honor and privacy as well as its rejection of prior
censorship.29 The judgment also mentioned some of the positive uses of anony-
mous communication and suggested that, as there was no conclusive proof of abso-
lute anonymity, the ban was unnecessary. Before permitting the appeal, the judge
raised some pertinent questions with forceful analogies that indirectly highlight the
multiplicity of functions of anonymous communication.
Would it be the case, under the argument that the Constitution prohibits anonymity, of
impairing all social networks that give margin to manifestations that do not propitiate the
immediate and instantaneous identification of the interlocutor? It is reasonable to prevent
the use in Brazil of social networks like, for example, Facebook, YouTube, Twitter, Tumblr,
Qzone, WhatsApp, WeChat, Line, under the argument that its users could anonymously
make offensive posts to honor and privacy of determined individuals? Obviously not. This
would be like prohibiting the sale of a knife in virtue of the risk of somebody with bad
intentions using it to attempt against someone’s life. It would be like prohibiting the sales
of automobiles in virtue of the risk of somebody negligent offering risks to the other drivers
and to the pedestrians. What I mean to say is that the existence of potential damage of a tool
is not enough for the undiscriminated characterization of the founded concern of irreparable
damage or of difficult compensation in relation to its use.30

29 “Aside from the necessity of definitely breaking with a tradition of recurring attacks to the
expressive freedom, the STF assessed that without a context of satisfactory viability of the exercise
of freedom of speech, it would not even be possible to think on the entirety of all other rights once
it is precisely through the communication that we develop culturally and can recognize, search
and promote rights, as well as preserve and reaffirm democracy.” (STF. ADI 4.815 Rap. Minister
Carmen Lúcia. DJ. 10/06/2015). As referred to in the translated version of the judgement.
30 Continuation of the Judgement dated: 21/7/2015, Appeals Court Judge Ronaldo Gonçalves De

Sousa.

Although the judgment repealed the injunctions issued by the trial court, the interests
related to the right to privacy, private life, and personality development remained
unexplored during the entire process. The third judge justified anonymity on the
grounds of freedom of speech, suggesting that the protection of free speech even
includes “antagonistic manifestations” that must be admitted in ‘the free market of
ideas’.
The aftermath of the “Secret” case is less important as the mobile app was shut
down around the world.31 However, the courts missed the opportunity to streamline
the right to anonymous browsing as an aspect of the right to privacy and to lay down
basic principles for balancing exercises. The acceptance of anonymous speech under
the belief that free speech takes priority over dignity and privacy is another extreme and has the potential to leave personality rights without redress in cases of abusive
speech. Similarly, the literal interpretation of the constitutional ban also has the
tendency to jeopardize the Internet-related fundamental rights guaranteed by Marco
Civil da Internet.32 The ban on anonymity provided by the constitution coupled
with the data retention provisions of Marco Civil has the potential to discourage
anonymous use and deprive citizens of their right to protection of private life as well
as free development of personality.
The issue of anonymity through encrypted messaging applications is a matter
of ongoing legal controversy between law enforcement agencies and WhatsApp, a popular mobile messaging application. Various courts in Brazil imposed temporary bans on WhatsApp at least three times,33 apparently due to the failure on the
part of WhatsApp to provide the details of data transfer through its network.34 The
spokesperson of the Federal Police in Sergipe state explained that WhatsApp was
required to furnish the content of messages and other data such as geolocation for
investigative purposes.35 However, these actions were widely criticized as disproportionate, depriving more than 100 million people of the use of WhatsApp, and were consequently overturned by the Federal Supreme Court.36

31 http://www.bbc.com/news/technology-32531175. Accessed 11 Jan. 2022.


32 For detailed analysis of the law see Prodriguez and Pinho (2015).
33 1. In February 2015, a judge in Teresina determined the nationwide suspension of WhatsApp

for noncompliance with judicial requests for user data. 2. In December 2015, a judge from São
Bernardo do Campo determined the nationwide suspension of WhatsApp for noncompliance with
judicial requests for user data. 3. In April 2015, a judge from Lagarto determined the nationwide
block of WhatsApp for noncompliance with wiretap orders. http://bloqueios.info/en/timeline/. 4.
See also Freedom House report on Brazil “Freedom on the net-2019”, available at: https://freedo
mhouse.org/country/brazil/freedom-net/2019. Accessed 11 Jan. 2022.
34 See http://foreignpolicy.com/2015/12/17/why-did-brazil-block-whatsapp/. Accessed 11 Jan.

2022.
35 See http://www.independent.co.uk/life-style/gadgets-and-tech/news/whatsapp-banned-in-bra
zil-court-rules-app-must-go-down-for-72-hours-as-it-fights-government-over-a7010876.html.
Accessed 11 Jan. 2022.
36 See https://www.theguardian.com/world/2016/jul/19/whatsapp-ban-brazil-facebook. Accessed

11 Jan. 2022.

3 Anonymity Under European Human Rights’ Law

An analysis of the legal regimes for online anonymity reveals that the European
jurisdiction has largely been successful in providing protection to the dignitarian
interests of its citizens despite the fact that it does not impose a complete ban on
anonymity.37 The language and interpretation of the relevant provisions (Article
8 and Article 10) of the European Convention on Human Rights (ECHR) present a
balanced image of how anonymous communication should be treated in terms of law.
The construction of the provisions regarding the protection of both private life and
freedom of expression provides an inbuilt foundation for the balancing exercise to be
performed by the European Court of Human Rights (ECtHR). According to Article 8,
the public authority must justify the interference with the right to respect for private
and family life, home, and correspondence. The justification must be according to
law, democratic principles, and “in the interest of national security, public safety or
the economic well-being of the country, for the prevention of disorder or crime, for
the protection of health or morals, or for the protection of the rights and freedoms
of others.”38 Similarly, Article 10 also provides the right to freedom of expression,
opinion and dissemination of information subject to legal formalities, conditions,
restrictions, or penalties.39
Other international law instruments such as the Universal Declaration of Human
Rights contain more or less similar provisions, but the ECtHR has developed a rich jurisprudence on conflicting interests as a
result of sophisticated balancing exercises. Such principles provide helpful guidance
for analysis and comparative studies. In particular, the notion of “private life” has
been broadly interpreted with the realization that it is not susceptible to exhaustive
definition, and encompasses the physical and psychological integrity of a person, personal
and social identity, gender identification, name, as well as sexual orientation.40 The
protection granted to private and family life encompasses the name of a person
and all the means of personal and family identification.41 More importantly, Article

37 Cf. for regulation of anonymous speech under German law and constitution Michael (2022), in
this volume.
38 Article 8 of the Convention for the Protection of Human Rights and Fundamental Freedoms 1950

(ECHR).
39 Article 10 of ECHR: “1. Everyone has the right to freedom of expression. This right shall include

freedom to hold opinions and to receive and impart information and ideas without interference by
public authority and regardless of frontiers. This Article shall not prevent States from requiring
the licensing of broadcasting, television or cinema enterprises. 2. The exercise of these freedoms,
since it carries with it duties and responsibilities, may be subject to such formalities, conditions,
restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the
interests of national security, territorial integrity or public safety, for the prevention of disorder or
crime, for the protection of health or morals, for the protection of the reputation or rights of others,
for preventing the disclosure of information received in confidence, or for maintaining the authority
and impartiality of the judiciary.”
40 S and Marper v United Kingdom [2008] ECHR 1581. § 66.
41 S and Marper v United Kingdom [2008] ECHR 1581. § 66.

8 provides protection to the right to personal development, free development of personality not only in the personal sphere, but also in the form of establishing relationships with other human beings and with the outside world.42
The concept of privacy as a legal “right to be let alone” emerged in the United
States, but could not sustain itself as a unified and holistic value for a long time due
to its conceptual limitation of focusing only on the negative freedom. It was further
reduced to different torts independent of each other in 1960 by William Prosser.
“The law of privacy comprises four distinct kinds of invasion of four different inter-
ests of the plaintiff, which are tied together by the common name, but otherwise
have almost nothing in common except that each represents an interference with the
right [….] to be let alone”, Prosser claimed in a well-argued article.43 In Europe
however, the evolution of the right to privacy followed a different path and went far
beyond the traditional right to be let alone. Not only restraining other individuals, it even protected against state intrusions into the privacy of home and correspondence, “not merely seen as ensuing from the principle of human dignity but also as
a precondition to the free development of personality.”44 The concept of free development of personality is much broader and presupposes the essential circumstances to shape one’s persona not only as an individual being, but also as part and parcel of
society. The scholarly writings on the issue prescribe more sophisticated regulatory
approaches and legal solutions to ensure the development of decisional capacities
of data subjects, and their autonomy in cyberspace in order to protect them in their
sociality.45 Any interference from outside in the form of state surveillance or third
party observation may impair the goal of developing one’s personal identity in terms
of intimate relations of love and friendship, which otherwise cannot be developed
in a crowd.46 Therefore, the European Court of Human Rights in Strasbourg has
expanded the scope of the right to private and family protection, which is not only
available against both private and public entities, but is also an obligation of the state
to take appropriate measures for its protection. The development of personal life in
a social context covers the right to autonomously make informed decisions, freedom
of choice, the right to access to information, as well as the right to exercise control
over personal information.47

42 S and Marper v United Kingdom [2008] ECHR 1581. § 66.


43 Prosser (1960), p. 389: “1. Intrusion upon the plaintiff’s seclusion or solitude, or into his private
affairs. 2. Public disclosure of embarrassing private facts about the plaintiff. 3. Publicity which
places the plaintiff in a false light in the public eye. 4. Appropriation, for the defendant’s advantage,
of the plaintiff’s name or likeness.”
44 Rouvroy and Poullet (2009), pp. 62–63.
45 Albers (2014), pp. 213–235 and 219. Referring to the German Constitutional Court’s decision

in BVerfGE 65, 1, 43, as: “free decisions and action are possible only under certain circumstances.
If a person is unsure whether deviating behavior may be stored as information and used to his/her
disadvantage, he/she will try not to attract attention by such behavior and is no longer free to act
at will”, Professor Albers advocates extending the scope of fundamental rights to information and
data processing.
46 Rouvroy and Poullet (2009), p. 63.
47 Rouvroy and Poullet (2009), pp. 64–67.

The overriding scope of Article 8 has come to encompass anonymity as a necessary condition for personality development and a right of the person to remain anonymous for
legitimate purposes. On the other hand, anonymous speech has not been granted
protection by Article 10 ECHR, if it is used as a tool to seriously violate the right
to private and family life protected under Article 8. Anonymity in both the online
and offline worlds is neither prohibited per se nor is it an absolute right. A range
of legal instruments recognizes the importance of anonymity. The Declaration on
Freedom of Communication on the Internet adopted by the Committee of Ministers
of the member states of the Council of Europe provides that:
In order to ensure protection against online surveillance and to enhance the free expression
of information and ideas, member States should respect the will of users of the Internet not
to disclose their identity. This does not prevent member States from taking measures and
co-operating in order to trace those responsible for criminal acts, in accordance with national
law, the Convention for the Protection of Human Rights and Fundamental Freedoms and
other international agreements in the fields of justice and the police.48

In more specific recommendations, the Council of Europe emphasizes the need to develop techniques which not only permit the anonymity of data subjects but also
protect the confidentiality of the information content.49 Moreover, the Committee of
Ministers not only makes Internet users aware of online profiling and encourages
them to use technical means to counter such traceability, but also directs Internet
service providers (ISPs) to inform their users about legitimate encryption and digital
signatures.50 Similarly, the Committee of Ministers also recommends, “you may
choose not to disclose your identity online, for instance, by using a pseudonym.
However, you should be aware that measures can be taken, by national authorities,
which might lead to your identity being revealed.”51 These documents expressly
provide for encouraging users to opt for cryptographic technologies to protect their
communication and data protection rights.
In spite of the commitment to respect the right of individuals to anonymity under
Articles 8 and 10, the ECtHR has focused on
the protection of other vital interests as well, and has tried to balance the competing
values. The judgments in Delfi v Estonia52 as well as in K.U. v Finland 53 shed light
on how the ECtHR takes a variety of involved interests into consideration.
In Delfi v. Estonia, the applicant invoked the jurisdiction of the ECtHR under
Article 10 of the Convention regarding the Estonian Supreme Court’s confirmation

48 Council of Europe Declaration on freedom of communication on the Internet (Adopted by the


Committee of Ministers on 28 May 2003 at the 840th meeting of the Ministers’ Deputies).
49 Council of Europe (Recommendation No. R (99) 5 on “The protection of privacy on the Internet”

Adopted by the Committee of Ministers on 23 February 1999).


50 Council of Europe (Recommendation No. R (99) 5 on “The protection of privacy on the Internet”

Adopted by the Committee of Ministers on 23 February 1999) (Recommendation No. II-2 and III-3).
51 Recommendation CM/Rec(2014)6 of the Committee of Ministers to member States on a Guide

to Human Rights for Internet Users, adopted on 16 April 2014.


52 Delfi AS v. Estonia (2015) ECtHR 64669/09.
53 K.U. v. Finland (2008) ECtHR 2872/02.

of a lower court judgment. The lower court had held Delfi (an online news portal) liable for its failure to act diligently to prevent harm to the reputation of others caused by defamatory, offensive, and threatening comments directed at the employees of a
shipping company. While asserting the importance of both the rights of free speech
and privacy provided in Articles 10 and 8 respectively, the court acknowledged that
there is a higher risk of harm to private life in the case of online communication.54
The ECtHR highlighted the importance of reputation as protected under Article 8,
stating that it requires seriousness and balancing with other rights, particularly the
free speech right under Article 10.55 The ECtHR endorsed the stance of the Supreme Court of Estonia regarding the liability of the Internet news portal,
as the speech was manifestly unlawful.56 The Court also examined whether Delfi
was the publisher of the users’ comments or only a technical facilitator. It endorsed
the findings of the Supreme Court that Delfi’s involvement in publicizing the users’
comments was “beyond that of a passive, purely technical service provider” and
that Delfi was liable for not preventing hate speech on its portal.57 On the issue of
disclosing the identity of anonymous commenters, the Court refrained from issuing
a general or blanket prohibition of anonymity and rather acknowledged its importance in an elaborate observation.
The Court observes that different degrees of anonymity are possible on the Internet. An
Internet user may be anonymous to the wider public while being identifiable by a service
provider through an account or contact data that may be either unverified or subject to some
kind of verification – ranging from limited verification (for example, through activation
of an account via an e-mail address or a social network account) to secure authentication,
be it by the use of national electronic identity cards or online banking authentication data
allowing rather more secure identification of the user. A service provider may also allow
an extensive degree of anonymity for its users, in which case the users are not required to
identify themselves at all and they may only be traceable – to a limited extent – through the
information retained by Internet access providers. The release of such information would
usually require an injunction by the investigative or judicial authorities and would be subject
to restrictive conditions. It may nevertheless be required in some cases in order to identify
and prosecute perpetrators.58

The Delfi case is more important for the determination of intermediaries’ liability in
cases where a third party uses forums for offensive speech; nevertheless, it also sheds
light on the protection of private life and human dignity in the online sphere. The
judgment confirms that there cannot be black-and-white answers to the normative
questions regarding online anonymity, and that they involve complicated balancing
exercises, taking the peculiarities of every case into account.

54 Delfi AS v. Estonia (2015) ECtHR 64669/09. §133.


55 Delfi AS v. Estonia (2015) ECtHR 64669/09. § 137 and § 139.
56 Delfi AS v. Estonia (2015) ECtHR 64669/09. § 110 and § 117.
57 Delfi AS v. Estonia (2015) ECtHR 64669/09. § 146.
58 Delfi AS v. Estonia (2015) ECtHR 64669/09. § 148.

The ECtHR in K.U. v. Finland held the Government of Finland liable for violating
Article 8 because of its failure to formulate an effective legal regime to protect chil-
dren from Internet-related abuses.59 In this case, an anonymous person had posted
the personal details of a 12-year-old boy in the form of an advertisement on a dating
website, claiming that he was looking for another boy to establish intimate relation-
ships. The applicant’s father had complained to the Finnish police and law enforcement authorities, seeking disclosure of the perpetrator’s identity in order to initiate legal action against him. However, the Finnish authorities failed to reveal the identity of the accused. A law in force at that time prohibited ISPs from such disclosures, as it provided complete protection to anonymous speech. The ECtHR rejected the Government’s stance that
legal lacunae prevented it from prosecuting the child abuser and observed that effec-
tive steps had to be taken to identify and prosecute the person who had posted the
online advertisement. Such steps had not been taken by the Government of Finland,
which resulted in violation of the positive obligation under Article 8. The Court
observed:
Although freedom of expression and confidentiality of communications are primary consid-
erations and users of telecommunications and Internet services must have a guarantee that
their own privacy and freedom of expression will be respected, such guarantee cannot be
absolute and must yield on occasion to other legitimate imperatives, such as the prevention
of disorder or crime or the protection of the rights and freedoms of others. Without prejudice
to the question whether the conduct of the person who placed the offending advertisement on
the Internet can attract the protection of Articles 8 and 10, having regard to its reprehensible
nature, it is nonetheless the task of the legislator to provide the framework for reconciling
the various claims which compete for protection in this context.60

The European approach toward online anonymity reflects the recognition of multi-
plicity in the nature and functions of anonymous expression. Unlike the approach adopted in the Secret case by the Brazilian courts, the ECtHR refrained from issuing a sweeping declaration in favor of the prioritization of free speech or from apprehensions of advance protection with regard to anonymous users. It is not possible to precisely assign a particular positive or negative value to anonymity. Rather, its function in each particular case needs to be analyzed through a comprehensive balancing exercise, as the ECtHR did in the Delfi and K.U. cases. The following sections shall explore
further concepts and practical issues of anonymity.

4 The Value and Concept of Anonymity

Anonymous speech and publications are centuries-old practices.61 However, the typical characteristic of online communication in the shape of users’ ability to hide behind pseudonyms has generated substantial intellectual discourse in the last

59 K.U. v. Finland (2008) ECtHR 2872/02.


60 K.U. v. Finland (2008) ECtHR 2872/02. § 49.
61 Martin and Fargo (2014), p. 317.

decades, which does not mean that it was discovered only after the advent of the
Internet. It was very much present in the physical space long before cyberspace was
invented or even conceived.62 For instance, William Shakespeare is considered by some to be a pseudonym and, if that is the case, nobody knows the identity of the actual author of
spectacular works of English literature.63 There are many instances where prominent
historical figures confronted the prevailing powers while remaining undercover to
escape reprisals and persecution. The role played by anonymity or pseudonymity
in challenging oppression, discrimination, and human rights violations, the financial corruption of big corporations, the wrongdoings of the ruling elite, and the misuse or abuse of religious power by the clergy is generally acknowledged. Moreover, the protection of whistleblowers, in both the public and corporate sectors, as well as the privilege of confidentiality available to journalistic sources, enjoys legal and constitutional recognition in many circumstances.64 In the recent past, we have witnessed how various kinds of leaks played a crucial role in raising human rights awareness on the one hand and in agitating for transparency on the other. WikiLeaks
and the Panama Papers65 hit the headlines in this regard and are still making follow-
up appearances in the news worldwide.66 Due to these and many other reasons,
anonymity has its own support base particularly from American-based civil society
organizations advocating free speech and digital rights.67
The following section will explore the relationship of anonymity to the right to
privacy, its instrumental value for the protection of the right to life in Pakistan, its
effect on online behavior, as well as modern challenges faced by anonymity in the
Internet.

4.1 Anonymity and the Right to Privacy

A review of the scholarly literature on the theoretical foundation of the right to privacy
shows that anonymity is closely associated with most conceptions of privacy.68 The
modern conception of privacy as a legal right was articulated by the scholarly work

62 See for details: Wallace (1999a, b), pp. 2–3. See also for variety of anonymous writings in
historical context: Barendt (2016), pp. 14–55.
63 Palme and Berglund (2002).
64 See for example Goodwin v. United Kingdom, 22 EHRR 123, European Court of Human Rights (1996). The court held that compelling journalists to reveal their sources would seriously hamper their function as a watchdog of wrongdoings in society.
65 For details: https://panamapapers.icij.org/. Accessed 11 Jan. 2022.
66 Bastian Obermayer of Süddeutsche Zeitung has never disclosed the identity of his

source that transferred the data regarding thousands of offshore companies registered
in Panama. https://www.theguardian.com/news/2016/apr/16/panama-papers-inside-the-guardians-
investigation-into-offshore-secrets. Accessed 11 Jan. 2022.
67 For example, the Electronic Frontier Foundation and Article 19.
68 Solove (2008), pp. 12–38; Barendt (2016), pp. 12–13.
324 S. Aftab

of Warren and Brandeis.69 Their article, published in the Harvard Law Review under the title "The Right to Privacy", consolidated related interests and declared that all of them belong to a general right to be let alone. The key expressions used by Warren
and Brandeis for the elaboration of their concept of privacy provide for anonymity
as an important and integral aspect of inviolate personality. They advocated for the
sensitivity of “civilized man” to publicity and his desire to seek retreat from the world
through solitude. Basing their argument on the protection available under copyright law, where the expression of thoughts and emotions is embodied in a recorded form, they
proposed similar protection for “a general right to privacy for thoughts, emotions, and
sensations”, whether expressed in writing, or otherwise “in conduct, in conversation,
in attitudes, or in facial expression.”70 The same concept was supported by Edward
Bloustein in his "Answer to Dean Prosser". According to Bloustein, identifying in detail somebody who was until then merely a face in the crowd constitutes a wrong even if the sketch is of a benign nature.71 The nature of the wrong in such cases, according to Bloustein, is the replacement of personal anonymity or private life with public display.
Privacy is sometimes conceptualized as the right to limit access to and to obtain
control over information related to oneself. These concepts of privacy include one’s
identity both as the subject of limited-access rights and as information one has the right to control. A condition, ability, claim, or right to prevent others from
accessing the self enables people to define their own place in the social context in
terms of establishing and maintaining social relations.72 Ruth Gavison gives immense
importance to anonymity, which is one of the three elements of her conception of
privacy.73 Each of the three privacy-related elements of secrecy, anonymity, and
solitude denotes the extent to which a person is known, the subject of attention, or
physically accessed.74
The conception of privacy as control over private information is relevant to Internet communication and justifies the right to informational privacy. Alan Westin
defines privacy as “the claim of individuals, groups, or institutions to determine for
themselves when, how, and to what extent information about them is communicated
to others”.75 He regards anonymity as an important state of privacy, which is acquired

69 Warren and Brandeis (1890), pp. 193–220.


70 Warren and Brandeis (1890), p. 206.
71 Bloustein (1964), p. 1003: “The man who is compelled to live every minute of his life among

others and whose every need, thought, desire, fancy or gratification is subject to public scrutiny, has
been deprived of his individuality and human dignity. Such an individual merges with the mass.
His opinions, being public, tend never to be different; his aspirations, being known, tend always to
be conventionally accepted ones; his feelings, being openly exhibited, tend to lose their quality of
unique personal warmth and to become the feelings of every man. Such a being, although sentient,
is fungible; he is not an individual.”
72 Rachels (1975), pp. 323–333.
73 Gavison (1980), p. 428: “My conception of privacy as related to secrecy, anonymity, and

solitude…are distinct and independent, but interrelated…”


74 See Footnote No. 40 in Gavison (1980).
75 Westin (1967), p. 5.

when a person feels free from identification and surveillance despite his presence in
a public place. Therefore, at the core of many anonymous actions is the human desire for "public privacy".76 Among the many virtues of such informational self-determination, the capability of human beings to decide for themselves without external constraints is fundamental.
The basic point is that each individual must, within the larger context of his culture, his status,
and his personal situation, make a continuous adjustment between his need for solitude and
companionship; for intimacy and general social intercourse; for anonymity and responsible
participation in society; for reserve and disclosure. A free society leaves this choice to the
individual, for this is the core of the "right of individual privacy": the right of the individual
to decide for himself, with only extraordinary exceptions in the interests of society, when
and on what terms his acts should be revealed to the general public.77

The conceptual proximity of anonymity to the right to privacy increases the value of
anonymous online expression. Normative and jurisprudential discourses should also reflect such diverse conceptualizations. The expansive right to private life, in particular, has come to enshrine the interest in anonymity, as can be seen in the European approach to regulating anonymity.

4.2 Anonymity and the Right to Life in Pakistan

According to Michael Froomkin:


In places that are less free, avoiding retribution for saying the wrong thing may be a matter
of life and death. Political dissidents, ethnic minorities, religious splinter groups, people
campaigning for women’s rights or gay rights, and many others are, or have been, subject to
the risk of genuine and very palpable violence. If they wish to speak or write for their causes
they need a means to protect themselves. Anonymity is one such tool.78

The importance of anonymity is generally accepted in regimes where political and socio-cultural impediments stand in the way of self-expression and the free development of personality. A few instances from the Islamic Republic of Pakistan will support Froomkin's view that expressing oneself in a 'wrong' way can actually endanger one's life.
Pakistan is a South Asian country with a population of more than 200 million.
According to official figures, it has 83 million broadband Internet subscribers and 81 million 3G/4G subscribers.79 Nonetheless, a conservative socio-cultural outlook, as well as weak democratic values, results in a grim human rights record. Social, religious, and political realities give rise to issues such as honor killings, mob justice in reaction to blasphemy allegations, and institutionalised restrictions

76 Westin (1967), pp. 34–5.


77 Westin (1967), p. 46.
78 Froomkin (2015), p. 121.
79 https://pta.gov.pk/en/telecom-indicators. Accessed 11 Jan. 2022.

on political criticism imposed by public authorities. In such a situation, it is very difficult to avoid the negative consequences of expressing oneself in a non-conforming manner. Thus, concealing one's identity through pseudonymity or anonymity can protect the right to life. The following cases from the recent past highlight the gravity of the adverse situation faced by online performers, activists, and general users.

4.2.1 A Victim of Identification

Qandeel Baloch (born in 1990) was a self-made social media celebrity who actively
used social media for her publicity without any external help from a sponsor, a
producer, or an advertising agency. In a short period, she succeeded in attracting
thousands of followers on Twitter, Facebook, and YouTube by posting her videos,
and commenting on socio-political issues and public figures in a frank and entertaining manner. In December 2013, she came into the limelight for the first time as a participant in the Pakistani version of the Idol singing competition broadcast on a private television channel,80 where the judges of the show pushed her off the stage because of her 'coarse' voice. The following years were eventful, and she remained in the news headlines because of various controversies. Her pictures and videos with religio-political leaders, as well as her style of dress, declared "vulgar" and "un-Islamic" by the media, caused occasional stirs as well. In June 2016, she
orchestrated a relatively big storm in the electronic media when she broke the news
of her meeting with a prominent religious figure named Mufti Qavi, and posted the
video and pictures of the meeting.81 During this period, she was omnipresent in the
electronic and social media news.82
Perhaps she was at the peak of her modeling and acting career when a local news-
paper published a copy of her passport, revealing her real name (Fouzia Azeem) and
many intimate details regarding her past marriage and subsequent divorce, along with pictures of her minor son in the custody of her former husband. Like her other sensational appearances in the national media, this news also flashed for hours across social and electronic media. It is pertinent to note that the intimate details belonged to the period when
she was an ordinary person and not a celebrity. She used a fake identity because she
was aware of the consequences of remaining in the limelight beyond certain limits,

80 The format of the show is similar to many other Idols e.g. American Idol or Deutschland sucht

den Superstar.
81 He had an official position in the federal government as well. Consequently, Mufti Qawi was

removed from the position. Details of the meeting can be read at: https://images.dawn.com/news/
1175672. Accessed 11 Jan. 2022.
82 https://www.samaa.tv/social-buzz/2016/06/qandeel-baloch-selfies-with-mufti-qavi-go-viral/.

Accessed 11 Jan. 2022.



which she had already crossed. Media persons were also aware of the possible conse-
quences as they researched her village and tribe, whose dwellers were unaware of
her media-life.83
The disclosure of her actual identity had an immense psychological impact on her life. She was no longer a symbol of bravery and started worrying about her safety. She not only pleaded for her safety in television interviews, but also submitted a formal request to the Interior Minister to ensure her security.84
Not a month had passed after the revelation of her actual identity when TV chan-
nels broadcast the breaking news of her murder, committed by her brother.85 He
surrendered to the police and confessed, saying that she was a “disgrace” to the
family. The Guardian reported that the brother had strangled her to death “after
being taunted by his friends over her behavior”.86 Many commentators stated that
had her identity remained undisclosed, she would still be alive.87 This is a typical example
of how a person can express herself anonymously and how later identification can
entail life-threatening consequences.

4.2.2 The Blasphemy Allegations and Mob Justice

The blasphemy law of Pakistan88 plays a controversial role in radicalizing society by promoting and encouraging curbs on the freedom of expression. For certain offences, it prescribes the death sentence for the "blasphemer". In most

83 Young women in Pakistani villages and tribal areas mostly observe the veil very strictly, and even close relatives may not know their facial appearance.
84 https://www.dawn.com/news/1267709. Accessed 11 Jan. 2022.
85 She was killed during the night of 15 July, 2016.
86 https://www.theguardian.com/world/2016/jul/28/pakistani-model-qandeel-baloch-killed-by-bro

ther-after-friends-taunts-mother. Accessed 11 Jan. 2022.


87 See for example: (1) https://www.thequint.com/news/world/qandeel-baloch-honour-killing-we-

killed-qandeel-baloch-identity-exposed-multan-pakistan-criminal-code (2) Special follow up report


on the issue, Titled “Did the media kill Qandeel Baloch” Available at: https://www.dawn.com/news/
1275990. and (3) News analysis, titled: “Revealing Qandeel’s real identity put her life at risk”, avail-
able At: https://tribune.com.pk/story/1143410/sharing-blame-revealing-qandeels-real-identity-put-
life-risk/. Accessed 11 Jan. 2022.
88 Pakistan Penal Code deals with the offences relating to religion. The offences include injuring

or defiling a place of worship (Section 295); malicious acts intended to outrage religious feelings (Section 295A); defiling of the Holy Quran (Section 295B); use of derogatory remarks in respect of the
Holy Prophet (Section 295C); disturbing religious assembly (Section 296); trespassing on burial
places (Section 297); uttering words to wound religious feelings (Section 298); use of derogatory
remarks in respect of Muslims’ holy personages (Section 298A); misuse of epithets, descriptions
and titles etc. reserved for certain holy personages or places (Section 298 B); person of Qadiani
group etc., calling himself a Muslim or preaching or propagating his faith (Section 298C). The
punishments provided for these offences range from two years' imprisonment to life imprisonment. In the case of Section 295C, the punishment is the death sentence.

cases, people do not wait for the accused to be prosecuted formally under the blasphemy law and instead resort to mob justice.89
In April 2017, Mashal Khan, a 23-year-old Moscow Business School graduate and a student of journalism, was lynched by a frenzied mob inside a public sector university. The mob consisted of university students of different disciplines. The charges leveled against Mashal Khan related to posting "blasphemous content" on his Facebook page. Video footage showing people throwing stones at him and chanting religious slogans went viral on social media. The allegations, however, turned out to be false after his Facebook page was analyzed and after the media spoke with his friends and relatives. It was revealed that he was an extraordinary person who believed in human equality and peace, and who took a stern stance against corruption in public institutions.90 However, his "liberal" views were unconventional in a closed society, and the people opposing him exploited those views to incite the mob, accusing him of committing blasphemy on Facebook. Even after his murder, many people were unsympathetic to him, holding him responsible for what happened and asserting that "his way of thinking"91 contradicted the so-called Islamic approach.92
The Chief Justice of Pakistan took suo motu notice of the incident and assigned the probe to a joint investigation team (JIT) of thirteen officials belonging to various departments. The report of the JIT confirmed that no blasphemous content was found on either his Facebook or Twitter account. The lynching was the result of a planned strategy of the students' political body and the university administration to retaliate against Mashal Khan's media statements about financial irregularities in the university.93 More than fifty persons were tried by the anti-terrorism court, which awarded various punishments, ranging from the death sentence to two years' imprisonment. Those who were released due to lack of evidence were given a "hero's welcome" by the religious parties of the area, and some of them announced from the stage set up for them that they were proud of killing a "blasphemer".94

89 US State Department report on religious freedom in the country in 2015. Its Executive Summary can be found at: https://2009-2017.state.gov/j/drl/rls/irf/religiousfreedom/index.htm#wrapper. Accessed 11 Jan. 2022.
90 https://herald.dawn.com/news/1153981 (He was posthumously declared Herald’s person of the

year 2017); https://herald.dawn.com/news/1154008 (“Mashal Khan: Lighting a flame in our hearts


by losing his life”). Accessed 11 Jan. 2022.
91 Many orthodox Muslim believers consider the liberal views on human equality or gender equality

as against the spirit of Islam.


92 Even the religious leader (Imam) refused to perform the funeral ritual and it was

announced through the loudspeakers of the mosques that nobody should attend the
funeral. His father has requested the Supreme Court of Pakistan to provide security
to his family as they are still feeling threatened by the religious zealots and oppo-
nents. For details: http://dailytimes.com.pk/islamabad/18-May-17/mashals-father-requests-sup
reme-court-to-transfer-trial-to-islamabad. Accessed 11 Jan. 2022.
93 https://tribune.com.pk/story/1427069/plot-get-rid-mashal-khan-hatched-month-murder-conclu

des-jit/. also see https://www.dawn.com/news/1327151. Accessed 11 Jan. 2022.


94 https://tribune.com.pk/story/1629256/1-released-students-mashal-khan-case-get-heros-wel

come/. Accessed 11 Jan. 2022.



In such a situation, it is not possible to profess one's worldview according to personal judgment and to express extremely personal thoughts about religious beliefs in an open manner. Incidents such as these have a chilling effect on the freedom of expression. On social media too, Internet users in Pakistan are extremely careful in exercising the freedom of thought, as it can be interpreted as opposition to generally professed religious beliefs. The consequences of expression that opposes mainstream views are dire. To deprive such people of the facility of anonymity would jeopardize their right to speech and their right to live a free life.

4.2.3 The Persecution of Social Media Activists

The problem of enforced disappearances in Pakistan is a matter of concern to national and international human rights organizations. According to the International Commission of Jurists, the number of missing persons is not only high in politically troubled areas such as Baluchistan Province; it has become a nationwide phenomenon.95 In 2016, 728 persons were reportedly "disappeared".96 The families of many missing persons allege that the security agencies are responsible for these abductions. The indolent response of the state institutions responsible for enforcing fundamental rights shows that the practice of enforced disappearance has acquired general legitimacy as a means of suppressing dissident and critical voices. Due to the influence of civil and military regimes, the media have adopted self-censorship in covering so-called sensitive political issues.97
The government, in a bid to impose speech restrictions on social media, is adopting various legal and extralegal measures.98 The law enforcement authorities in Pakistan are in regular contact with Facebook for the retrieval of users' data, which has resulted in the blockage of many accounts.99 Moreover, social media users with critical political views are also deterred through threats and intimidation. In January
2017, four bloggers and social media activists mysteriously disappeared. Among
them, Salman Haider is a university professor and outspoken human rights activist,
and Waqas Goraya is a freelance journalist. Both of them are popular for using cyberspace to highlight various social and political issues such as human rights abuses and civilian suffering during military operations against Al-Qaida, the Taliban,

95 https://www.icj.org/enforced-disappearances-in-pakistan-un-statement/. Accessed 11 Jan. 2022.


96 http://web.archive.org/web/20171015045816/http://hrcp-web.org/hrcpweb/wp-content/uploads/
2017/05/State-of-Human-Rights-in-2016.pdf. Accessed 11 Jan. 2022.
97 Reporters Without Borders (RSF) ranked Pakistan 138th out of 180 countries in its recent annual

report. https://www.dawn.com/news/1403971. Accessed 11 Jan. 2022.


98 https://www.aljazeera.com/news/2017/12/pakistani-activist-raza-khan-reported-missing-171

206091156178.html (Raza Khan, who was recently abducted by 'unknown men', was popular on social media for his pro-peace stance on Indo-Pakistan relations. Enforced disappearances not only silence the victim but also have a chilling effect on the freedom of expression in society in general.). Accessed 11 Jan. 2022.
99 http://digitalrightsfoundation.pk/2017/04/. Accessed 11 Jan. 2022.

and other militant groups; blasphemy laws, and the rights of religious minorities.
They are also critical of the growing military influence in political affairs. Although
all of them were freed after a weeklong captivity involving torture,100 the incident had a chilling effect on social media users and the freedom of online speech.101
In an attempt to discourage any kind of online political discourse, the Federal
Minister for Interior Affairs met with Facebook Vice President for Global Policy
Joel Kaplan and urged him to close "fake accounts" in Pakistan. The Facebook VP reportedly assented to the request.102 The compulsory disclosure of users' actual
identity on Facebook would be another blow to the freedom of online expression in
Pakistan and a threat to the life of social media users.
The above instances strengthen the case against a blanket prohibition on anonymity and pseudonymity. In such a socio-political environment, if Internet users were forced to use their real names for registration, they would not be able to exercise their right to the free development of personality along with the constellation of many associated interests. The forces against anonymity usually argue in favor of the "civilizing effect" of the full disclosure of identity, and claim that anonymity creates a deindividuation effect that makes people's expression of their views prone to abuse. The following section will analyze the effect of anonymity on behavior.

4.3 Anonymity and Online Behavior

The question of adopting a particular approach toward the regulation of online anonymity103 depends on the effects of online anonymity on the behavior of Internet users. The apprehensions expressed by Brazilian judges in the Secret case are not singular, as many other quarters have voiced the same concerns. It has been argued that society would pay a "high price" for individuals who want to be anonymous, as anonymity impedes the basic objective of the justice system in terms of ensuring accountability.104 According to this approach, embracing accountability and rejecting anonymous communications is the only way forward. The threats of online anonymity using encryption technology have been enumerated as making it impossible to obtain necessary evidence, frustrating interception in anti-crime efforts, and increasing the magnitude of privacy violations beyond what otherwise would

100 http://www.bbc.com/news/world-asia-41662595 and http://indianexpress.com/article/world/i-


was-tortured-beyond-limits-pakistani-blogger-ahmad-waqass-goraya-4563002/. Accessed 11 Jan.
2022.
101 For instance, Waqas Goraya has left Pakistan and is now living in the Netherlands. He shared his experience with DW: http://www.dw.com/en/pakistani-activist-waqas-goraya-sharing-experiences-vital-for-activists/av-39352252. Accessed 11 Jan. 2022.
102 https://www.dawn.com/news/1343822/facebook-vp-nisar-discuss-removal-of-blasphemous-

content. Accessed 11 Jan. 2022.


103 For a detailed discussion on the legal treatment of online anonymity see: Barendt (2016), pp. 122–

154.
104 Davenport (2002), pp. 33–35.

have occurred.105 The evidence suggests that cybercriminals take advantage of the
anonymous feature of the Internet.106 It is believed that limited traceability and identifiability in online communication create a feeling of immunity from the law and social rules in the minds of anonymous users, who then resort to inappropriate or illegal activities.
However, the results of various research studies do not present a clear view on the
exact link between anonymity and online behavior. The mere fact of being anonymous
does not induce anti-social behavior by itself, and the motivation to resort to such
behavior is more important.107 Conflicting conclusions of different research studies
on the issue suggest that, in order to understand the effect of anonymity on Computer
Mediated Communication (CMC), other factors such as motivational background,
objectives, and relevant values should be taken into consideration.108 Many people,
including Zuckerberg, view anonymity as an impediment to conforming to social
norms, yet anonymity can encourage a collective social sense, which could translate
into a very positive participation in the social, communal, and political spheres of
life.109 Thus, the uses of anonymity are very much context-specific. Many possibil-
ities could be found depending on the circumstances of each case. The contribution
of anonymity to crimes such as hate speech may not be that of a cause, it may merely
exacerbate tendencies that already existed at the time of blocking identification.110
Similarly, researchers have challenged the traditional theories based on the concept of deindividuation, according to which anonymity takes away self-awareness and thereby results in negative outcomes in the shape of anti-normative behavior. Modern theories such as the social identity model of deindividuation effects (SIDE) suggest that specific social circumstances are more important for determining the connection between anonymity and anti-normative behavior.111

4.4 Challenges to Online Anonymity and the Need for a Broader Conceptualization

The anonymity of Internet users is in jeopardy from surveillance undertaken for security reasons by governments and for commercial interests by private entities.
The argument in favor of full disclosure of Internet users’ identities is the well-
known “nothing-to-hide” paradigm. Facebook authorities have reportedly spoken

105 Etzioni (2008), pp. 78–80.


106 Armstrong and Forde (2003), pp. 209–215.
107 Chui (2014), p. 2.
108 Chui (2014), p. 5.
109 Chui (2014), p. 7.
110 Rowland (2006), p. 533: “the development of the social identity model of deindividuation effects

(SIDE) has challenged the view that anti-social behavior is a necessary result of anonymity or loss
of identity.”
111 Christopherson (2007), pp. 3050–51.

against anonymity, arguing in favor of the “civilizing effect” of using the net with
one’s real identity.112
Internet anonymity is facing an existential challenge from many directions. Apart
from ideological opposition that views anonymity negatively per se, the profitability of the disclosure of maximum information, as well as many other technological, political, and self-induced factors, are responsible for the demise of online anonymity.113 The public and political sensitivity toward security demands that every untoward incident be prevented and that perpetrators be caught in advance by detecting their patterns of Internet usage. This motive is prevalent all over the world, particularly after the terror attacks of 9/11.114 The Snowden revelations have alarmed privacy advocates, who see no room for privacy in any part of the world.115 Users' self-created disclosure to social media sites such as Facebook and Twitter, in the shape of interconnectivity between different websites and mobile applications, the sharing of pictures, documents, and locations, as well as reactions to online posts through the tools these programs provide, has made digital profiling very easy for so-called data brokers. For these reasons, scholars take a skeptical view on the question of whether anonymity can be saved. Michael Froomkin is of the view that "for cellphones the game is fully lost".116
Big data also poses a major challenge for users seeking to browse the Internet
anonymously and to keep their identities protected from government and commercial
actors. It is a fact that online activities have already generated immense amounts of
data.117

112 Randi Zuckerberg once said that online anonymity "must go away": https://www.huffingtonpost.
com/2011/07/27/randi-zuckerberg-anonymity-online_n_910892.html. Accessed 11 Jan. 2022.
113 Froomkin (2015), pp. 123–129.
114 Froomkin (2015), p. 124.
115 Froomkin (2015), p. 128: “We know from statistics that came out in the Snowden revelations

that as of May 2012, US security services and their allies had collected technical information on
about 70% of the cell phone networks in the world. They had data on 701 out of 985 known cell
phone networks. As for the physical undersea cables that move communications transnationally, it
is believed that every one of those cables has a tap attached to it, which is why some countries now
want to build their own undersea cables, which I interpret, perhaps cynically, to mean they want to
own the taps.” (References omitted).
116 Froomkin (2015), p. 131: “To protect identities in the cell phone world would take a whole new

hardware, a whole new architecture, and given the size and value of the installed base and the power
of incumbent carriers, one has to ask if this is even possible”.
117 Tucker, Has Big Data Made Anonymity Impossible? MIT Technology Review, 116.4 (July–

August 2013): “We’re well down this path already. The types of information we’ve thought of as
personal data in the past—our name, address, or credit card records—are already bought and sold
by data brokers like Acxiom, a company that holds an average of 1500 pieces of information on
more than 500 million consumers around the world. This was data that people put into the public
domain on a survey form or when they signed up for services such as TiVo.”

The challenges to anonymity suggest that its protection requires a favorable regu-
latory approach, which is ultimately dependent on a broader theoretical meaning
and conceptualization. To view anonymity in terms of its association with name of
the author becomes a limited conception in the online world. It does not sufficiently
protect the intended interests attached to anonymity because of the possibility of
unprecedented traceability.118 Identification from small random bits of information, or even from writing style and other patterns of behavior, makes the situation complicated for a person who wants to surf the net namelessly. According to Nissenbaum,
“the value of anonymity lies not in the capacity to be unnamed, but in the possibility of
acting or participating while remaining out of reach, remaining unreachable”.119 She
therefore suggests a broader concept of anonymity, dissociating it from namelessness
and considering the “withholding the information or constellation of information”
that can be used to identify the person.120 Anonymity has also been conceived more broadly as “a kind of relation between an anonymous person and others, where the former is known only through a trait or traits which are not coordinatable with other traits such as to enable identification of the person as a whole”.121 In such a conceptualization, anonymity has a social context and does not presuppose the isolation of the individual from his or her social environment or situation.

5 Conclusion

This article has attempted to highlight various conceptual and regulatory issues asso-
ciated with anonymous activities such as online communication and Internet surfing.
It has primarily focused on the issue of blanket prohibition of anonymous expression
of thought in the Brazilian constitution and argued that any literal interpretation of
this provision has the potential to impede the Marco Civil da Internet’s commitment
to safeguard private life and ensure online data protection. Similarly, the conceptualization of anonymity by the courts that adjudicated the cases of the mobile messaging applications Secret and Cryptic failed to take into account the complex

118 Nissenbaum (1999), p. 141: “We are targets of surveillance at just about every turn of our lives.
In transactions with retailers, mail-order companies, medical caregivers, day-care providers, and
even beauty parlors, information about us is collected, stored, analyzed, and sometimes shared.
Our presence on the planet, our notable features and momentous milestones, are dutifully recorded
by agencies of federal, state, and local government, including birth, marriage, divorce, property
ownership, drivers’ licenses, vehicle registration, moving violations, passage through computerized
toll roads and bridges, parenthood, and, finally, our demise. In the great store of information, we
are identified through name, street address, e-mail address, phone number, credit-card numbers,
social security number, passport number, level of education, and more; we are described by age,
hair color, eye color, height, quality of vision, purchases, credit-card activity, travel, employment
and rental history, real-estate transactions, change of address, ages and numbers of children, and
magazine subscriptions.”
119 Nissenbaum (1999), p. 142.
120 Nissenbaum (1999), pp. 142–143.
121 Wallace (1999a, b), p. 24.

nature and functions of anonymity. The judges who opposed anonymity focused on preemptive protection against abusive speech and cybercrimes, and they voted in favor of its blanket prohibition. On the other hand, the judges who acknowledged
the positive dimensions of anonymity argued from a theory of absolute freedom of speech and called for absorbing the side effects of free speech in the shape of privacy and dignitarian harms. As a comparative reference, the legal approach of the Council of Europe was presented, under which anonymous communication is not only lawful but encouraged on both offline and online platforms. Similarly, the values of private life and personality rights are accorded equal importance, and no interest is inherently superior to the other. For these reasons, anonymity is regarded as a principle,
while the disclosure of identity is ordered as an exception when the objective is
to protect the interests related to Article 8 of the European Convention on Human
Rights. The European Court of Human Rights reminds all the national jurisdictions
of their positive obligation to make legal and regulatory arrangements to prevent the
violation of human dignity, private life, and personality development. The approach
and balancing exercise mentioned here could be a helpful guide for Brazil, which
should formulate comprehensive rules for the interpretation of relevant constitu-
tional and legal provisions. The judicial orders for disclosure and interpretation of
anonymity can thus be streamlined, and the unlimited discretion of the court would
not then result in legal uncertainty.
Furthermore, the conceptual connection of anonymity to the right to privacy should be considered in the interpretation of the legal instruments. To bestow absolute primacy on free speech pushes the balance to the other extreme, whereby significantly important interests could be left without an appropriate remedy. The instrumental value of anonymity is obvious where it comes to the rescue in jurisdictions such as Pakistan, in which self-expression often provokes reprisals and violent reactions. Anonymity
also deserves a sympathetic regulatory approach and a multifaceted complex concep-
tualization because of the ubiquitous sophisticated methods of online surveillance
and profiling.
The considerations presented here may help the Marco Civil da Internet
achieve its objectives of protecting private life as well as confidentiality of online
communication.

References

Affonso Pereira de Souza C, Laus Marinho Nunes B (2020) Brazilian Internet bill of rights: the
five roles of freedom of expression. In: Albers M, Sarlet IW (eds) Personality and data protection
rights on the Internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Albers M (2014) Realizing the complexity of data protection. In: Gutwirth S, Leenes R, Hert PD
(eds) Reloading data protection. Springer, Dordrecht, pp 213–235
Albers M (2022) Surveillance and data protection rights: data retention and access to telecommuni-
cations data. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the Internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Armstrong HL, Forde PJ (2003) Internet anonymity practices in computer crime. Inf Manag Comput
Secur 11(5):209–215
Barendt E (2016) Anonymous speech: literature, law and politics. Bloomsbury Publishing, London
Bloustein EP (1964) Privacy as an aspect of human dignity: an answer to Dean Prosser. NYUL Rev
39:962
Christopherson KM (2007) The positive and negative implications of anonymity in Internet social
interactions: “On the Internet, nobody knows you’re a dog.” Comput Hum Behav 23(6):3038–
3056
Chui R (2014) A multi-faceted approach to anonymity online: examining the relations between
anonymity and antisocial behavior. J Virtual Worlds Res 7(2)
Costa L (2012) A brief analysis of data protection law in Brazil. 28th Plenary meeting of the Consul-
tative Committee of the Convention for the Protection of Individuals with regard to Automatic
Processing of Personal Data [ETS No. 108] (T-PD). Council of Europe, June 2012. Available via
SSRN. https://ssrn.com/abstract=2087726. Accessed 11 Jan. 2022
Davenport D (2002) Anonymity on the Internet: why the price may be too high. Commun ACM
45(4)
De Souza CAP, Viola M, Lemos R (2015) Understanding Brazil’s Internet bill of rights. ITS, Rio de Janeiro
De Souza RM (2011–2012) The struggle over privacy, security, cyber-crimes and the civil rights in the Brazilian law–a historical overview. In: Bottis M, Alexandropoulou E, Iglezakis I (eds) Values and freedoms in modern information law and ethics, p 508. https://bottis.ihrc.gr/download.php?file=downloads/articles_2012_07.pdf. Accessed 11 Jan. 2022
Doneda D, Mendes LS (2014) Data protection in Brazil: new developments and current challenges.
In: Gutwirth S, Leenes R, Hert PD (eds) Reloading data protection. Springer, Dordrecht, pp 3–20
Etzioni A (2008) The limits of privacy. Basic Books, New York
Froomkin AM (2015) From anonymity to identification. J Reg Self-Reg 1:121
Gavison R (1980) Privacy and the limits of law. Yale Law J 89(3):421–471
Martin JA, Fargo AL (2014) Anonymity as a legal right: where and why it matters. NCJL Tech
16:311
Michael L (2022) The impact of jurisdiction and legislation on standards of anonymity on the
Internet. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the Internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Nissenbaum H (1999) The meaning of anonymity in an information age. Inf Soc 15(2):141–144
Palme J, Berglund M (2002) Anonymity on the Internet. http://dsv.su.se/jpalme/society/anonymity.
pdf. Accessed 11 Jan. 2022
Rodriguez K, Pinho L (2015) Marco Civil da Internet: the devil in the detail. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2015/02/marco-civil-devil-detail. Accessed 11 Jan. 2022
Rachels J (1975) Why privacy is important. Philos Public Aff 4(4):323–333
Rossini C, Cruz FB, Doneda D (2015) The strengths and weaknesses of the Brazilian Internet Bill of Rights: examining a human rights framework for the Internet. The Centre for International Governance Innovation and the Royal Institute of International Affairs. Paper Series No. 19

Rouvroy A, Poullet Y (2009) The right to informational self-determination and the value of self-development: reassessing the importance of privacy for democracy. In: Gutwirth S, Poullet Y, De Hert P, de Terwangne C, Nouwt S (eds) Reinventing data protection? Springer, Dordrecht, pp 45–76
Rowland D (2006) Griping, bitching and speaking your mind: defamation and free expression on
the Internet. Penn St L Rev 110:519
Saldías O (2014) Coded for export! The contextual dimension of the Brazilian Marco Civil Da
Internet. HIIG Discussion Paper Series No. 2014-06
Solove DJ (2008) Understanding privacy. Harvard University Press, Cambridge
Souza CA, Viola M et al (eds) (2017) Brazil’s Internet bill of rights: a closer look. Institute for
Technology and Society of Rio de Janeiro (ITS Rio)
Veit RD (2022) Safeguarding regional data protection rights on the global Internet—the European
approach under the GDPR. In: Albers M, Sarlet IW (eds) Personality and data protection rights
on the Internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Wallace JD (1999a) Nameless in cyberspace: anonymity on the Internet. Cato Institute
Wallace KA (1999b) Anonymity. Ethics Inf Technol 1(1):21–31
Warren SD, Brandeis LD (1890) The right to privacy. Harv Law Rev 4:193–220
Westin AF (1967) Privacy and freedom. Ig Publishing, New York

Sohail Aftab Central Superior Services (CSP Officer) of the Government of Pakistan and Director
in the Ministry of Information and Broadcasting, DAAD scholar and Doctoral candidate at
Hamburg University, Hamburg, Germany. Main areas of research: Fundamental Rights, especially
Free Speech and Privacy, Constitutional Law and Jurisprudence, Law of Torts, Press, Broadcasting
Media, and Social Media Laws.
The Impact of Jurisdiction
and Legislation on Standards
of Anonymity on the Internet

Lothar Michael

Abstract Anonymity is one of the most defining characteristics of communication on the Internet. From a legal perspective, it is protected as part of the fundamental
right to informational self-determination. In addition, the anonymous expression of
opinion could also be included within the scope of freedom of expression. However,
the relativity of the protection of fundamental rights implies that the speaker’s
anonymity is not absolutely but only relatively ensured. Therefore, the rights of third parties restrict the right to anonymity.

1 Introduction

The Internet is a medium of communication that also makes it possible to express and
disseminate opinions anonymously. Anonymity is part of the essence of communica-
tion on the Internet. While the provision for and guarantee of anonymity is one of the
principles of a legal framework for the Internet, this does not necessarily mean that
anonymous and non-anonymous expressions of opinion are protected in the same
way.
The following example shows the potential for conflicts: Are doctors, teachers and
hotel operators able to defend themselves against anonymous criticism from online
evaluation portals? In this example we have to keep in mind three actors and their
interests: The critic, the provider of the online evaluation platform (review site) and
the assessed person (e.g. doctor, teacher, hotel operator). This will be discussed later.
The legal evaluation of anonymous expressions of opinion is a particularly inter-
esting question in a book of essays examining German and Brazilian perspectives. We
find two different approaches in Brazilian and German constitutional law to the question of the extent to which each actor’s interests are constitutionally protected.
Article 5, paragraph IV of the Brazilian Constitution forbids anonymous speech in

L. Michael (B)
Universität Düsseldorf, Düsseldorf, Germany
e-mail: Lothar.Michael@uni-duesseldorf.de

© Springer Nature Switzerland AG 2022 337


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_14

principle.1 That should give us food for thought. The text of Article 5 of the German
Basic Law, which concerns freedom of expression, does not distinguish between
anonymous and non-anonymous speech.2 One approach derives the principle of
anonymity on the Internet from the right to informational self-determination. The
right to informational self-determination was developed by the Federal Constitutional
Court3 and recognised as a fundamental right. It protects the right of individuals to
autonomously dispose of their personal data and is therefore central to data protection
in Germany. But we should keep in mind that the opportunity to express a view on
the Internet anonymously goes beyond the right to informational self-determination
and depends on the scope of freedom of expression.
The answers to the questions (1) whether anonymous and non-anonymous expres-
sions of opinion are protected in the same way and (2) how someone can defend them-
selves against anonymous expressions of opinion on the Internet, are very contro-
versial in Germany. This is in part because the laws regarding data protection on the
Internet are in need of interpretation due to ambiguity and incompleteness. One recent
example governing a similar issue is § 24 Federal Data Protection Act (Bundesdaten-
schutzgesetz n.F. (BDSG 2018)),4 which addresses the processing of personal data
by private parties for purposes other than which it was collected. This legislation
remains open to interpretation insofar as it refers to a balance of interests: Although
§ 24, para 1, no. 2 recognizes the need for private data processing when it “is neces-
sary for the establishment, exercise or defence of legal claims”, it makes the legal
permissibility of doing so subject to an assessment of the interests of the data subject.
Such processing will not be permissible if “the data subject has an overriding interest
in not having the data processed”.
The courts are therefore very influential in the formation of standards. In spite of
several good reasons for the courts to have this weighty authority, it also raises problematic issues, since the fundamental rights of the Basic Law themselves require further interpretation.
Not only are the legal problems and their non-constitutional law solutions changing;
fundamental rights are also interpreted dynamically in Germany. Of course, this is

1 Article 5, IV states “the expression of thought is free, and anonymity is forbidden”. Constitution of the Federative Republic of Brazil: constitutional text of October 5, 1988, with the alterations introduced by Constitutional Amendments no. 1/1992 through 64/2010 and by Revision Constitutional Amendments no. 1/1994 through 6/1994, 3rd ed., Brasília: Chamber of Deputies, Documentation and Information Center, 2010, available at http://english.tse.jus.br/arquivos/federal-constitution. Accessed 11 Jan. 2022. See, with more detailed considerations on the Brazilian Constitution, statutory regulations and jurisdiction, Aftab (2022), in this volume.
2 Article 5, para 1 states in relevant part: “(1) Every person shall have the right freely to express and disseminate his opinions in speech, writing and pictures, and to inform himself without hindrance from generally accessible sources. … There shall be no censorship.” Basic Law for the Federal Republic of Germany, available at https://www.gesetze-im-internet.de/englisch_gg/. Accessed 11 Jan. 2022.
3 Federal Constitutional Court (BVerfG), Judgement of 15 December 1983 – 1 BvR 209/83, BVerfGE 65, 1.
4 Gesetz zur Anpassung des Datenschutzrechts an die Verordnung (EU) 2016/679 und zur Umsetzung der Richtlinie (EU) 2016/680 (Datenschutz-Anpassungs- und -Umsetzungsgesetz EU – DSAnpUG-EU), v. 30. Juni 2017, BGBl. I, p. 2097.

in tension with constitutional law’s claim to permanence. The limits of the so-called
‘constitutional change’ (Verfassungswandel) are correspondingly controversial.5
The German Federal Court of Justice (Bundesgerichtshof, ‘BGH’) starts from the
premise that anonymous expressions of opinion enjoy the full protection of freedom
of expression.6 This approach has been widely discussed in the academic literature.7
The Court provides no justification for this premise and possible arguments in favour
of the premise8 are put to the test below. Article 5, paragraph IV of the Brazilian
Constitution gives reason to question the approach of the BGH and it is unclear which
way the German Federal Constitutional Court (Bundesverfassungsgericht) will rule
on this question.
The topic concerns key questions regarding the interpretation of fundamental
rights. In a digital world fundamental rights are to be interpreted dynamically.
The Federal Constitutional Court has actually advanced a dynamic interpretation
of fundamental rights with regard to the right to informational self-determination.9
But the informational self-determination of those who use the Internet is not the
only fundamental right that deserves attention in a digital world. It is time to put
the interpretation of other fundamental rights—here freedom of expression—in the
context of the Internet to the test. At the same time, further classical questions are
being raised: What effects do fundamental rights have between private individuals?
How dynamically should fundamental rights be interpreted? How strongly does the
interpretation of fundamental rights dominate the interpretation of the laws? How
strongly do constitutional courts influence the jurisprudence of civil courts?
Another fundamental question asks: What role does the legislature play and what
role should it play? In Germany, the legislature has not yet acted to protect personal
rights against anonymous evaluations in specific evaluation portals. It has, however,
passed particularly far-reaching laws regulating general social networks. The very
controversial Network Enforcement Act (NetzDG)10 came into force on October 1,
2017. This law aims to prevent the spread of hate speech in commercially operated
social networks. However, according to § 1, paragraph 1, sentence 3 NetzDG, the
law applies only to general platforms for the exchange of any content and not to the
case of a topic-specific evaluation portal as examined here. Moreover, according to
§ 1, paragraph 3 NetzDG, the law is only directed against criminal content, i.e. hate

5 Michael (2014), pp. 426–480.


6 Federal Court of Justice (BGH), Judgement of 23 June 2009 - VI ZR 196/08, BGHZ (Decisions of
the Federal Court of Justice) 181, 328 ff.—spickmich.de; Federal Court of Justice (BGH), Judgement
of 1 July 2014 - VI ZR 345/13, BGHZ 201, 380, mn. 9 ff.,—Ärztebewertung I; Federal Court of
Justice (BGH), Judgement of 1 March 2016, VI ZR 34/15, BGHZ 209, 139—jameda.de.
7 Kaiser (2009), p. 1474; Gomille (2009), p. 815; Gounalakis and Klein (2010), p. 566; Schröder (2010), p. 212; Kamp (2001), pp. 210–218; Grewe and Schärdel (2008), p. 644; Schulze-Fielitz (2013), on Article 5 I, II, mn. 75.
8 Kersten (2017b), pp. 196–197.
9 Federal Constitutional Court (BVerfG), Judgement of 15 December 1983 – 1 BvR 209/83, BVerfGE 65, 1; Judgement of 27 February 2008 – 1 BvR 370/07, BVerfGE 120, 274.
10 Netzwerkdurchsetzungsgesetz v. 1. September 2017, BGBl. I p. 3352.

speech. However, the portal operators must pursue all complaints according to this
law and not only those from persons who assert a violation of their own rights.
There are constitutional objections to this law. Its protection objective goes far
beyond the protection of individual rights. A regulation that comprehensively requires
the operators of social networks to delete illegal content runs the risk of also deleting
content that deserves the protection of freedom of expression (overblocking) in the
process of fulfilling its obligations. It is troubling that the law unilaterally punishes
underblocking, but not overblocking.11 It is to be feared that the algorithms used to
check (supposedly) illegal contributions cannot differentiate between satire worthy of
protection and unlawful insults. Nor does the law distinguish between anonymous and
named contributions. There are strong arguments that the NetzDG imposes excessive
restrictions on freedom of expression and is therefore unconstitutional. This does not
mean, however, that the legislature is not allowed to regulate this area at all.12

2 Developments in the Need for Anonymity and Its Endangerment

Opinions on the legal framework for anonymity on the Internet vary considerably.
This is also due to the fact that there are very different preconceptions about this topic.
Anonymity can be seen as a blessing or a curse. The individual need for anonymity
can vary greatly from person to person. Some people seek anonymity—be it in the masses or in quiet solitude—while others suffer from anonymity and want to be noticed as individuals by as many others as possible, be it in public or in private
life. Some people are very shy on stage while others absolutely enjoy appearing
in the spotlight. The need for anonymity can also vary greatly depending on the
situation. Typically, people who receive lots of individual attention in a professional
context, and are known to a more or less limited audience, long for privacy and
anonymity in their private lives. In contrast, the exact opposite phenomenon can be
observed among people whose professional activities are highly anonymous—these
people often strive for greater personal attention in their role as private individuals.
From a legal perspective, such needs for more or less anonymity cannot be assessed.
The interpretation of fundamental rights should be based on the self-image of the
individual and be open to situational differentiation.13 Fundamental rights protect
both extroverts who enjoy individual attention and people who prefer their privacy
and anonymity. At the same time, anonymity is a quantitative phenomenon.14 The
circle of those who recognize an individual can be relatively large or small.

11 Hong (2018), https://verfassungsblog.de/the-german-network-enforcement-act-and-the-presumption-in-favour-of-freedom-of-speech/. Accessed 11 Jan. 2022.
12 Hong (2018), https://verfassungsblog.de/the-german-network-enforcement-act-and-the-presumption-in-favour-of-freedom-of-speech/. Accessed 11 Jan. 2022.


13 Thiel (2016), p. 10.
14 Thiel (2016), p. 11.

Even if the phenomenon of anonymity has a very special role on the Internet,
anonymity is by no means a new phenomenon of the digital age. A relatively high
degree of anonymity has shaped the urban life of industrial societies for genera-
tions. Since law develops over time and reacts to developments, it is worth asking
about trends in development. There is even a view15 according to which the defining
characteristic of the Internet is that it increasingly endangers anonymity. The human
need for data protection and the development of the right to informational self-
determination,16 especially in Germany in the twentieth century, fit in with such a
basic assumption. In this context, the right to be forgotten postulated by the Court
of Justice of the European Union17 should also be mentioned. Today, the interest
in the confidentiality of personal data not only requires sufficient legal protection
against a surveillance state, but also increasingly raises the question of protection
against private actors. It is also true that the Internet—and in particular the mobile
Internet and the Internet of Things—creates possibilities for deanonymization and
identification.
It would, however, be an oversimplification to conclude that only the protection
of the anonymity of Internet users—if at all18—is a legal need. Evaluation portals,
for example, also concern the personal rights and anonymity interests of those being
evaluated. Accordingly, we encounter the interest in anonymity on rating portals
in two ways: the anonymity of the evaluators and the anonymity of the evaluated
persons. These interests are opposed to each other and may conflict.
In this context, it is also important to consider the new dimensions that the Internet
has created in favour of anonymity. However, it should be noted that the Internet both
enables and endangers anonymity. Above all, the anonymity of the evaluated person
is endangered by the Internet in a new way. The very dimension of anonymous expression of opinion is itself new. In the offline world, anonymity was primarily a
phenomenon of non-political privacy and of fear of communication in and with the
public. Those who wanted to make public and political statements in an industrial
society almost always did so personally and identifiably. The Internet, and particu-
larly the format of a rating portal, opens up the possibility for people to express their
opinion to the public without disclosing their identity. Although it is true that publi-
cations under a pseudonym existed in the past, these remained the exception and were
not a mass phenomenon of daily discourse. The Internet has increased the range of
communication, i.e. everyone can access it at any time from any location, and created
a platform for users to reveal their private lives or, even more problematically, the
personal data of third parties.

15 Thiel (2016), p. 16.


16 Federal Constitutional Court (BVerfG), Judgement of 15 December 1983 – 1 BvR 209/83,
BVerfGE 65, 1.
17 Court of Justice of the European Union, Judgment of the Court (Grand Chamber), 13 May 2014, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, Case C-131/12.
18 But cf. Thiel (2016), pp. 19–22.

3 The Right to Informational Self-determination and the Right to Anonymity on the Internet

Considering online rating portals, the data protection interests of the different actors
are diametrically opposed. On the one hand, evaluators are interested in using the
portal without being tracked. On the other hand, people who are being evaluated
have an even keener interest in preventing third parties from putting information
about them on the Internet. In Germany, it is a fundamental right that everyone
shall be protected from the involuntary and other-directed creation of a personality
profile19 which is available on the web. Since even the voluntary, active Internet user
is protected from a personal profile being created, the law must also prevent third
parties from the unauthorized creation of such a profile for or about someone else.
Active Internet users leave a trail of their actions so that the consequences of their
activity can at least be traced back to them. In contrast, the dissemination of personal
information by third parties may also affect those who have not contributed to the
dissemination of their data on the Internet and could even affect somebody who
does not use the Internet. If the law protects people from themselves, then it must
also protect them from third parties. It is not plausible that the right to anonymity
of those who use the Internet voluntarily and actively outweighs the requirement
to protect those who did not even contribute to the dissemination of their data. This
interim outcome could change if other interests have priority. In particular, the right to
disseminate information could be given appropriate weight. Therefore, the question
of the protection given to anonymous expressions of opinion is central. But before
reaching this question, another point has to be clarified.
The interest of informational self-determination also weighs heavily when a
person’s profession is the subject of personal evaluations. In view of this constella-
tion, the Federal Court of Justice plainly refers to the “social sphere”, which includes
the working environment of doctors and teachers.20 Such a lack of protection would
be acceptable if such review portals were used only within the professional sphere. A
teacher, for example, has to accept that his students are talking about him. Teachers
must also accept that their students talk about them in their private environment. The
key point is that this private environment had a limited scope in the offline world.
For example, even if students talked about a teacher, the scope of those reached by
the student was limited to the immediate surroundings. Unlike a celebrity, a teacher
is a “familiar face” only in the community where the school is located. The work
of teachers and doctors, however, only concerns and addresses a rather small part of
the public. In the offline world teachers and doctors have the opportunity to separate
their private sphere from their professional sphere. The possibilities and dimensions of communication through social media go beyond the scope of this contribution.

19 Michael and Morlok (2019), mn. 432; Michael and Morlok (2016), p. 352.
20 Federal Court of Justice (BGH), Judgement of 23 June 2009 - VI ZR 196/08, BGHZ (Decisions of
the Federal Court of Justice) 181, 328 ff.—spickmich.de; Federal Court of Justice (BGH), Judgement
of 1 July 2014 - VI ZR 345/13, BGHZ 201, 380, mn. 9 ff.,—Ärztebewertung I; Federal Court of
Justice (BGH), Judgement of 1 March 2016, VI ZR 34/15, BGHZ 209, 139—jameda.de.

There are good reasons why some teachers and doctors deliberately accept a longer
commute in order to clearly separate their working environment from their private
family life. This attitude corresponds to the basic human need for informational self-
determination, namely the autonomy to shape the image that others have of you and,
in doing so, to take up different roles. The right of personality is strongly impaired,
for example, if a doctor has to accept that anyone can, at any time, read about how sensitive or unapproachable a patient found him to be. A bad rating affects
more than job competition. While a good or bad reputation was formerly limited to
a certain segment of the public, now it is transparent and generally accessible on
the web. The assessed person’s standing is not only available to “clients” but, for
example, also bank advisors and future employers. A public evaluation might even
influence private life, for instance when a potential partner finds the evaluation. Even
on holiday, a rated doctor cannot escape his reputation on the Internet.
Similar problems also show up in teacher evaluations. Let us imagine the example
of a teacher who has lost authority at his school. He is put doubly on the defensive when his own children and friends can read details of this loss of authority on the Internet.
There are good reasons why a person may be very empathetic in a private context and all the more unapproachable in a professional one. There are also good reasons for
a person to leave the respective sphere in the dark about what qualities they have
in another role: Business leaders can be considered henpecked husbands and family
patriarchs can play subordinate roles at work. Most people attach great importance
to the separation of these spheres. Rating portals blur the spheres: firstly, they may be accessible to everyone, and secondly, they contain information about people’s personalities that is of interest in both spheres. Evaluation portals typically contain statements
of personal impression such as empathy or trustworthiness that the assessing person
has of the assessed professional. The reason for this is that a layperson is best able
to assess personal characteristics. A patient, for example, cannot usually provide an
expert determination as to whether or not a specific medical treatment met the highest
professional standards. The patient can say, however, whether the doctor had a good
bedside manner. These problems only occur, however, if the evaluation portal is freely
accessible and it is a named individual who is evaluated. A completely different, less
problematic situation is the evaluation of institutions and companies—for example,
a rating portal for hotels.
We turn now to the normative analysis of the protection of anonymity. The principle of anonymity on the Internet is derived from the right to informational self-
determination. Regarding the right to anonymity on the Internet, several types of use
can be differentiated: Firstly, the sole viewing of information, secondly, the conclusion of contracts, thirdly, communication, fourthly, the mobile Internet and fifthly,
the Internet of Things. Therefore, one has to examine the situation more closely to
determine which fundamental right(s) may protect a person who is anonymously
assessing others on the web. The following deals with the dimension of anonymous
communication, more precisely with expressions of opinion about persons, which in turn impair the anonymity of third parties.
The appeal to one’s own anonymity must not become a license that legally excludes third parties’ protection against infringements of their personal rights. There are also
344 L. Michael

factual reasons: The anonymity of the evaluators makes it difficult for the assessed
party to assert defence rights. This, in turn, exacerbates the problem: anyone who
does not have to expect to be held personally accountable for his comments has one
less reason to resist unacceptable comments.
In other words, the right to anonymity is to be understood as a negative defensive
right and not as a positive right to agitation. Opinions do not fall within the scope
of the right to anonymity. In any case, the right to anonymity on the Internet is not
affected if anonymous expressions of opinion are prohibited or deleted, but rather
may be affected when the author of an opinion is disclosed. In principle, the right to
anonymity on the Internet opens up the possibility of using the Internet anonymously.
However, there is no right to violate the rights of third parties on the Internet under
the cloak of anonymity. The right to anonymity does not say anything about the
extent to which Internet users can exercise other fundamental rights (e.g. freedom of
opinion and freedom of occupation) anonymously.
The right to anonymity on the Internet is in this respect comparable to the right
of association.21 Both rights give the individual the opportunity to remain in the background as an individual and the possibility of having his or her own interests exercised through a different personality or anonymously. Freedom of association only protects the institutional right to form an association. What the association actually does, however, is protected not by freedom of association but only by the fundamental rights that are also due to the individual.
It follows that the question of the fundamental rights protection of anonymous
expressions of opinion is not only a matter of the interest in data protection, but
instead depends on whether and to what extent freedom of expression also includes
anonymous remarks.

4 The Protection of Anonymous Statements as an Open Constitutional Issue

The question of the appropriate level of protection for anonymous opinions has not
yet been satisfactorily resolved. It is certainly not out of the question that the level of
protection for freedom of expression is different for anonymous and non-anonymous
statements. The level of protection depends on two factors: First, the importance a
society attaches to the ability to exchange opinions in one form or another. Second,
the individual need for protection of a fundamental rights holder, which, for example,
is particularly high for minorities and vulnerable people.
The opportunity to express a view anonymously reduces inhibition thresholds,
which has positive and negative effects: On the one hand, any discourse is enriched
by the anonymous expression of unvarnished criticism and unpopular views that
would otherwise not have been expressed publicly for fear of the expected social pressure. Anonymity promotes passionate discussion by reducing inhibition

21 Michael and Morlok (2019), mn. 300; Michael and Morlok (2016), p. 274.

thresholds which in turn encourages people to take part in the public discourse,
especially in a critical way. Thus, anonymity can avoid “chilling effects” and can
break the “spiral of silence”.22 In addition to protecting the individual interests of
those who would otherwise remain silent, supporting critical discourse is also of
social importance. There is a social interest in protecting even anonymous discourses
if the information that is disseminated is in turn of public interest. On the other hand,
there is an increased likelihood that untruths are spread and personal rights are
violated. In particular, the reduction of inhibition thresholds has the negative effect
of reinforcing negative behaviour.
The right to freedom of expression is content neutral.23 Under constitutional law,
the quality of the anonymous contribution is irrelevant. The fear that the level of communication might decline through anonymity therefore does not argue against the protection of opinions that are expressed in this way.
In this context, the question of whether statements posted under a false identity
and, in particular, whether bots enjoy the protection of fundamental rights does not
need to be clarified. The fact that such contributions are intended to manipulate and
distort competition among opinions also argues against extending the protection of
fundamental rights to such statements. In Germany, the dissemination of objectively
false statements of fact is not protected by freedom of expression.24 The scope of
protection of freedom of expression is extended to claims of fact unless and until a
claim has been proven to be false. This also applies to contributions in assessment
portals: It often happens that evaluated doctors ask the portal operator to delete a
contribution because they doubt its authenticity, i.e. whether it is based on a real
patient’s experience in their office. According to a judgement of the Federal Court
of Justice,25 the portal operators are obliged in such cases to request the evaluator to
prove the truth of his assertions. Since rating portals usually only contain contributions that refer to concrete experiences, deception about the identity of the contributor
in these cases also implies deception about objective facts and for this reason alone
such contributions are not protected by freedom of expression.
Quite the contrary, the chance for a quantitatively wide distribution of online
discourse alone indicates that anonymous statements and their distribution ought to
be protected by freedom of expression. In this respect, the distribution of opinions has its
own weight. When considering online evaluation portals, the operator is another—
possibly not anonymous—holder of fundamental rights, who has to be taken into
consideration. The operator him- or herself may invoke his or her own rights. In
detail, this includes not only professional and entrepreneurial freedom but also the
freedom to spread the opinions of others without processing them before doing so.
The assessed person might furthermore invoke his or her rights against the operator.

22 Noelle-Neumann (1980), p. 86.


23 Michael and Morlok (2019), mn. 646, 654; Michael and Morlok (2016), p. 513, 517.
24 Michael and Morlok (2019), mn. 210; Michael and Morlok (2016), p. 208.
25 Federal Court of Justice (BGH), Judgement of 1 March 2016, VI ZR 34/15, BGHZ 209, 139—jameda.de.

The operator might finally serve as a mediator between the assessed person and the
critic, perhaps even knowing and disclosing the critic’s name.
The Internet additionally creates a forum for direct discussion between citizens
without the interference of the press. In this regard, there are certain parallels to the freedom to demonstrate: a discourse on the web can be considered a digital town square. Both political demonstrations and the discourse on the Internet open up opportunities for citizens to express themselves directly
beyond debates in parliament and discourses in the media, and, above all, to clearly
communicate messages of protest. Both are phenomena of a “swarm democracy”.26
Digital communication and in-person assemblies can even complement each other.
Today, calls for real world meetings can be made through social networks on the
Internet. From a legal point of view, the comparison of both phenomena is also
worthwhile. The legal standards are as controversial for flash mobs27 as for rating
portals.
It makes sense to compare the legal standards for public demonstrations and for
rating portals on the Internet.28 It is striking: German law forbids masks or any
other device aimed at obscuring a person’s identity during an assembly.29 Of course,
there are also critics30 who consider the authorities’ legally granted discretion to ban
masks and disguises during demonstrations to be a violation of the fundamental right
of freedom of assembly. It is true that the scope of protection of the fundamental
right to freedom of assembly also includes anonymous or disguised demonstrations.
It is also true that banning disguises requires constitutional justification and that a
blanket statutory requirement to show one’s face at public meetings would violate the
principle of proportionality. Lawmakers want to prevent violence and facilitate the
prosecution of criminal offenses. They accept the consequence that this regulation
may have a deterrent effect on peaceful protesters who fear social repression for their
views on certain matters. Here too, as on the Internet, the issue is the possibly disinhibiting effects of anonymity.
What can we learn from this for the present context? On the one hand, there are
good reasons to protect anonymity. But on the other hand, there are equally important
reasons for limiting anonymity. The different reasons for and against anonymity
have to be weighed when searching for differentiated solutions. The content of such
differentiated solutions is a question of constitutional interpretation and a question
of law design, which means also designing the structure of fundamental rights by
law (‘Grundrechtsausgestaltung’).31

26 Kersten (2017a), Schwarmdemokratie, p. 159.


27 Federal Constitutional Court (BVerfG), Decision of the Third Chamber of the First Senate of 18
July 2015 - 1 BvQ 25/15, BVerfGE 139, 378; Kersten (2017b), p. 200.
28 Gesetz über Versammlungen und Aufzüge (Versammlungsgesetz) § 17a; Kersten (2017b), pp. 198–200.
29 Jahn (1988), pp. 545–551.
30 Hoffmann-Riem (2011), p. 1152; Kersten (2017b), p. 199.
31 Häberle (2018), p. 126.

For jurisprudence, three questions follow from this: First, what are the interests
involved and their respective values? These should be identified in order, if appro-
priate, to postulate their constitutional roots. Second, what are the legal instruments?
To what extent do the laws de lege lata comply with the interests and what instru-
ments de lege ferenda would be desirable? Finally, an institutional question needs to
be addressed: to what extent are courts allowed to fill gaps in legislation and correct
legislative deficits?

5 Two Guidelines for Interpretation: The Need for Protection of Vulnerable Individuals and the Personal Responsibility of the Fundamental Rights Holders

We should keep in mind two guidelines for the interpretation of fundamental rights:
The need for protection of vulnerable individuals and the personal responsibility
of the fundamental rights holders. The first of these guidelines argues in favour
of protecting opinions when they are anonymous. However, this does not neces-
sarily mean that anonymous statements should have the same level of protection
as non-anonymous statements.32 The ideal of an ‘open society’33 is shaped by the
commitment and courage of its citizens. In an ideal world, no one would be afraid
to disclose his non-conformity with the mainstream. Constitutional law should
protect the right to openly and publicly express one’s personal views. Freedom of
opinion also protects, in principle, the right to speak about other people in a ruthlessly critical way. The right to informational self-determination does not extend to
prohibiting someone from criticizing you. But the liveliness of public discourse is
not just a matter of legal (non-)regulation. Whether one has the courage to reveal
his name and to show his face in the public controversy also depends on social and
economic circumstances. Beneficial circumstances include: representing a popular
opinion, enjoying a high social standing and economic independence.
Freedom of expression should also protect those who are actually disadvantaged
in this regard. Fundamental rights are concerned with the protection needs of individualists. Fundamental rights protection is especially important for those who represent unpopular views, who are themselves outsiders and who are economically dependent on
third parties.
Freedom of expression’s scope of protection must not be determined solely by the
ideal of the “citoyen”, who shows his political colours and stands up for his views.
Protection is not limited to the courageous, the personally independent or those who
can hope for encouragement and appreciation. In fact, freedom of expression especially protects the frail and timid who are hard hit by repressive measures as a result
of swimming against the stream. It would be an idealization of democracy to assume
that in a democratic society, a person who expresses his opinion will automatically

32 But see Kersten (2017b), p. 196.


33 Popper (1945), p. 185.

be appreciated for his openness. A frank person will in fact risk reprisals despite
their openness if their opinion is not shared by society at large. Since anonymity
also protects the weaker members of society, it helps to bridge the gap34 between
‘Bourgeois’ and ‘Citoyen’.
On this point, therefore, the conclusion has to be that the scope of protection of
freedom of expression also includes anonymous statements. However, the argument
that constitutional law should help to avoid chilling effects is neither exhausted by a
broad understanding of the scope of protection of freedom of expression, nor does it
argue in favour of protecting anonymous opinions with the same level of protection
as non-anonymous statements. On the contrary, the argument for avoiding chilling
effects speaks in favour of particularly protecting those courageous people who stand
up for provocative opinions with their names. In the context of protection of opinions,
statements that are not made anonymously therefore deserve a particularly high level
of protection. This has an effect on the balancing against countervailing rights: it is more reasonable to expect someone to bear personal (non-anonymous) criticism than anonymous criticism.
A second guideline should be the starting point for restricting anonymity on the
Internet. Protection of freedom implies the principle that freedom corresponds to
responsibility. Lawful liberty is bound by the freedom of others. An exercise of
freedom which deprives others of the assertion of reciprocal rights cannot obtain
legal protection. Accordingly, fundamental rights only have relative effects. Those
who claim unrestricted freedom on the web have to ask themselves whether this
freedom is exercised at the expense of violating the rights of third parties.
Protecting anonymous expressions of opinion poses the risk that counterclaims
become devoid of purpose. This does not, however, exclude that anonymous expressions of opinion are protected in general. Even though the protective scope of freedom
of expression covers anonymous statements, the relativity35 of the protection of
fundamental rights implies that the speaker’s anonymity is not absolutely but only
relatively ensured. The question is whether the protection of counterclaims can only
be invoked against the right to further disseminate an anonymous statement or if it
might even bring about a loss of anonymity. Put another way, does a person who
violates the rights of a third party through an anonymous statement need to expect
that the distribution of the statement will simply be stopped or that his or her identity
will be disclosed?
Neither anonymity nor freedom of expression alone, nor their combination, is per
se given priority. The positive impact that online portals have on the formation of
public opinion is also not protected limitlessly. This impact must instead be balanced
against the right of personality of the person being evaluated.
Eliminating anonymity must be the last resort (ultima ratio). Consideration of
fundamental rights requires that deterrent effects be avoided. The law must not fundamentally call into question the positive effects of disinhibition through anonymity. It
would be neither possible nor desirable to reverse the principle of anonymity on the
Internet as such. Rather, it is a matter of limiting the scope of anonymous activity

34 Smend (2010), p. 311.


35 Alexy (1985), p. 75.

on the Internet if personality rights and informational self-determination (which also protects anonymity needs) of third parties would otherwise be violated.

6 Proposed Solutions

The next question raised concerns the possibilities for non-constitutional law to
draw conclusions from the constitutional balance of interests. To make this as clear
as possible, we return to the example of the online evaluation portal. The legal
framework for online evaluation portals has to bring three constitutionally protected
interests into balance. First, the interest of the assessing person in remaining anonymous. He or she has a need to express and disseminate his or her opinion about a
service. A comprehensive evaluation is also in the public interest. According to the
principle of competition, rating portals can help to ensure that quality (or better: what their users consider quality-relevant) prevails. Secondly, the portal operator is
genuinely interested in spreading anonymously expressed opinions. The commercial
pursuit of such a platform also implicates economic interests. Finally, the assessed
person wants his or her right of personality to be protected. Economic interests are
also at stake for these people when their professional performance is assessed.
All these perspectives are relevant from a constitutional point of view. Hence,
all the conflicting interests have to be brought into practical concordance (“praktische Konkordanz”36), meaning that all rights involved in the conflict are granted
the greatest possible effect.
For this purpose, the portal’s operator should be involved in the protection of
both the rights of the users and of the assessed persons. Theoretically, it would be
conceivable to apply various instruments of administrative law, criminal law and civil
law. It is conceivable to oblige the portal operator to prevent violations of privacy
rights and, if necessary, to block or delete contributions. Civil claims brought by the
assessed person may also be possible against the user and/or portal. This raises the
question of whether an assessed person can only dispute individual assessments or
whether he can demand to be completely removed from a portal with a preliminary
injunction. Criminal sanctions against the assessor are also possible, which may
require the anonymity of the assessor to be lifted. The latter also touches on the right
to informational self-determination. Finally, legal requirements may be placed on
the content and quality of the contributions in order to increase the objective value of
competition in a way that is as fair and neutral as possible and, in particular, excludes
untrue and inappropriate assessments.
For the purpose of criminal prosecution and claims for damages for violations
of the right of personality, the evaluated person must be entitled to learn about the
critic’s identity. German law and jurisprudence only allow anonymity to be

36 Hesse (1999), p. 28.



lifted in cases of punishable statements.37 Even in this case, anonymity can only be
revoked by the prosecution and, through access to the files, revealed to the victim.
In practice, however, prosecutors are much too overworked to sufficiently prosecute
such crimes. There is also a law which makes it possible for copyright holders to be
told the identity of the person who anonymously infringed their rights. A legislative
initiative which the Federal Council (Bundesrat) launched in 2016 aimed at equating
personal rights and intellectual property. This initiative failed in the end, which
reflects an imbalance.38 Only intellectual property, which is of importance to a large business lobby, was regulated by the legislature on behalf of this lobby.
German law and the German courts do not adequately protect the personality
rights of the assessed persons. The right of personality should give the assessed person
the option to enjoin the portal operator from publishing any anonymous assessment
of individualized personal characteristics in freely accessible internet portals, even if
the assessment relates to a professional activity. This injunctive relief follows from
the equivalence of the personality rights of the evaluator and the evaluated person.
If the evaluators want to remain anonymous with their criticism on the Internet, they
cannot claim the right to name the criticized person. The Federal Court of Justice39
has seen this differently, arguing that this option would contradict the idea of freedom
of expression. In the opinion of the Court, public interest in information about medical
services and the free choice of a doctor outweigh a physician’s personality rights.
This argument must be rejected. The right of personality has to prevail in cases of
evaluations ad personam. The fact that expressions of opinion are also protected
when they are published anonymously is meaningful if the criticism refers only to
impersonal circumstances. For example, criticism concerning long waiting times
should be protected, but not criticism about a doctor’s lack of empathy. Injunctive
relief, which the assessed person is entitled to, can be sought against the portal
provider. In order to satisfy the injunction, the provider is neither obliged nor entitled
to reveal the anonymous critic’s identity. In case of doubt and if there are contested
facts, the infringed person can obtain an interlocutory injunction.
In a new decision,40 however, the Federal Court of Justice has relativized its
standards. Now, the weight of freedom of expression depends on whether the operator
of the rating portal occupies a position as a “neutral” information intermediary. A
portal is neutral if all locally competing physicians (professionals) are evaluated
in the same way. In this case, the operator of an assessment portal for physicians
in Germany offered a premium paid service that allowed physicians to supplement
their assessment page with pictures and information about their practice. In addition,
information about the (possibly better) assessments of competing physicians, which
was included on non-premium assessment pages, was replaced on the premium pages

37 Federal Court of Justice (BGH), Judgement of 1 July 2014 - VI ZR 345/13, BGHZ 201, 380,
mn. 9 ff.—Ärztebewertung I; Federal Court of Justice (BGH), Judgement of 1 March 2016, VI ZR
34/15, BGHZ 209, 139—jameda.de.
38 Lauber-Rönsberg (2014), pp. 13–14.
39 BGH, Judgement of 23 September 2014 - VI ZR 358/13, BGHZ 202, 242.
40 BGH, Judgement of 20 February 2018 - VI ZR 30/17.

with self-advertisement. The Federal Court of Justice found that the portal was not
acting as a neutral information intermediary and upheld the physician’s claim for
the deletion of all her data and reviews on the portal operator’s site. The practical
consequence of this decision, however, was not that information about the physician
was deleted. Rather, the portal operator responded immediately after the decision
was announced to change its practices so that they satisfied the Court’s neutrality
requirements. Thus, physicians must continue to tolerate anonymous assessments of
their activity on Internet portals.
The jurisprudence is thus more concerned with an objective improvement of the
neutrality and thus the information content of such evaluation portals than with
the protection of the doctors’ personal rights. The jurisprudence on the neutrality
obligation of the portal operators leads ultimately to an optimization of the interests
of the patients and the public interest in fair competition between medical services.
This approach is not sufficient to protect the personal rights of those assessed.
This deficit of individual legal protection also has objectively negative effects: The
exercise of those professions that are the subject of such public performance reviews
becomes less attractive if linked to becoming a public person. It is true that job profiles
change over time and that the law should not function to prevent such change. For
example, the social role of the doctor has changed from an unassailable authority (“Herrgott in Weiß”, the ‘god in white’) to a service provider of whom patients are critical. The law is
not intended to protect the factual unassailability of doctors, but, on the contrary, has
rightly contributed significantly to changing the doctor-patient relationship through
medical liability law. The law has also strengthened patients’ rights precisely in order
to counterbalance patients’ dependence on doctors. Only the law creates equality of
arms here. Anonymous evaluation portals have created a new imbalance whereby
doctors are subjected to personal accusations against which they are completely
defenceless. This should also be addressed by the law.

7 Quis Iudicabit?

The decisions of the Federal Court of Justice should not only be criticized in terms
of their content. Rather, the institutional question also arises as to whether it should
be the task of the supreme civil court to develop legal standards for the protection
of freedom of opinion, anonymity and personal rights on the Internet. Although the
German legal system has historically been shaped by written laws and not by case
law, standard setting is increasingly a function of both the courts and the legislature.
This has various causes: Europeanisation and internationalisation play a role here, and they are advanced not only by the executive branch and the legislative branch.
In Germany, however, the leading cause is the phenomenon of the constitutionalisation of the legal system, i.e. the derivation of the balance of interests from a weighing
of constitutional principles, in particular from fundamental rights. This, in turn, is
rooted not least in a strong constitutional jurisdiction. This is not the place to fundamentally question the limits of the legislative functions of the courts and in

particular of the Federal Constitutional Court. It is assumed in the following that there are good reasons for the courts to enforce the protection of individual rights
vis-à-vis the legislature and that fundamental rights in this way shape the standards
for the reconciliation of interests beyond an individual case.
Against this background, however, the case law of the Federal Court of Justice
appears problematic for three reasons:
Firstly, the fundamental rights premises of its jurisprudence—in particular the
assumption that anonymous expressions of opinion were subject to the protection
of Article 5(1) of the Basic Law—are not sufficiently justified. In the meantime, the
Federal Court of Justice has recognised that this approach needs to be relativized, at
least in terms of the outcome.
Secondly, however, this relativization takes place through a reconciliation of interests that primarily balances objective interests and still remains deficient in the
protection of individual rights. Even if the objectives of improving the quality of
contributions in rating portals and guaranteeing the neutrality of the portal operators
are legitimate, this is a regulatory concept that should be discussed and decided by
the legislature. The courts should concentrate on protecting the individual rights of
those concerned—be it within the framework of the law or—as here—if necessary
also praeter legem.
Thirdly, court decisions must be questioned insofar as they impose obligations
and prohibitions on the portal operators. Legal prohibitions or commands represent
encroachments on fundamental rights—even if they are carried out for the protection
of private individuals and enforced with instruments under private law.41 The reservation of the law (‘Vorbehalt des Gesetzes’) applies in principle to encroachments
on fundamental rights. This means that an interference with fundamental rights is
only justified if it has a statutory foundation. This does not mean, however, that it
must necessarily be based on a specific law. Such interventions may, of course, also
be based on general clauses as judicial interventions. The decisions of the Federal
Court of Justice are based on the general clause of § 242 BGB (“Good faith”).
Even more concrete is the new § 24 Paragraph 1 No. 2 BDSG 2018 as a legal basis
for judicial intervention—including the restriction of the basic rights of the evaluators. A generous interpretation of the reservation of the law and case law based on
general clauses is only legitimate to the extent that it can be based on an obligation to
protect fundamental rights. This requires sufficient constitutional justification, which
is lacking here. The objectives of the approach of the Federal Court of Justice are
primarily related to the public interest and not the protection of fundamental rights.
This raises the question of which institutions would be in a position to correct this judicial development of the law. This could be the legislature, but also the
Constitutional Court. The Federal Constitutional Court is known for its comprehen-
sive and dynamic interpretation of fundamental rights and for setting standards.42 It
is quite conceivable that the Federal Constitutional Court will have the opportunity
to comment on these problems. Above all, the individual constitutional complaint

41 Michael and Morlok (2019), mn. 481, 505; Michael and Morlok (2016), pp. 393–396, 407–412.
42 Jestaedt, Lepsius, Möllers, and Schönberger (2011), p. 159.
(‘Verfassungsbeschwerde’) comes into consideration. Applications by an individual
to address violations of the constitution are well known in both Germany (Article
93 (1) N. 4a Basic Law)43 and Brazil (Article 102, paragraph III of the Brazilian
Constitution).44 However, such a constitutional complaint would only be successful
if the Federal Constitutional Court came to the conclusion that the Federal Court of
Justice had fundamentally misjudged one of the fundamental rights concerned.
It is a substantive question whether we see the legislation in this area as a failure or
as an exercise of reasonable restraint. The selective rules for the Internet are a compromise. They
go too far for some and not far enough for others. It should be mentioned here that
with the NetzDG—in this respect quite parallel to the decisions of the Federal Court
of Justice—the German legislature is pursuing the approach of imposing obligations
on the intermediaries of the Internet to monitor the content of contributions.
The overall result is that both the courts and the legislature in Germany have set
out to prevent the infringement of rights by opinions expressed on the Internet. This
in itself is to be welcomed, even if one is of the opinion that some of the approaches
overshoot the mark and others do not go far enough.

References

Aftab S (2022) Online anonymity—The Achilles’ heel of the Brazilian Marco Civil da Internet.
In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer,
Dordrecht, Heidelberg, New York, London (in this volume)
Alexy R (1985) Theorie der Grundrechte. Nomos, Baden-Baden
Gomille C (2009) Prangerwirkung und Manipulationsgefahr bei Bewertungsforen im Internet.
Zeitschrift für Urheber- und Medienrecht 815–824
Gounalakis G, Klein C (2010) Zulässigkeit von personenbezogenen Bewertungsplattformen – Die
„Spickmich”-Entscheidung des BGH vom 23. 6. 2009. Neue Juristische Wochenschrift 566–571
Grewe H, Schärdel F (2008) Der digitale Pranger – Bewertungsportale im Internet. Multimedia und
Recht 644–650
Häberle P (2018) Fundamental rights in the welfare state. In: Kotzur M (ed) Peter Häberle on
constitutional theory. Nomos, Baden-Baden, pp 17–128
Hesse K (1999) Grundzüge des Verfassungsrechts der Bundesrepublik Deutschland, 20th edn. C.
F. Müller, Heidelberg
Hoffmann-Riem W (2011) Versammlungsfreiheit, § 106. In: Merten D, Papier HJ (eds) Handbuch
der Grundrechte (Band IV). C. F. Müller, Heidelberg, pp 1117–1212
Hong M (2018) The German network enforcement act and the presumption in favour of
freedom of speech. https://verfassungsblog.de/the-german-network-enforcement-act-and-the-pre
sumption-in-favour-of-freedom-of-speech/. Accessed 11 Jan. 2022
Jahn R (1988) Verfassungsrechtliche Probleme eines strafbewehrten Vermummungsverbots.
Juristenzeitung 545–551

43 “The Federal Constitutional Court shall rule on constitutional complaints, which may be filed by
any person alleging that one of his basic rights (…) has been infringed by public authority”.
44 “The Federal Supreme Court is responsible, mainly, for safeguarding the Constitution and it is
incumbent upon it: (…) to adjudicate, at extraordinary appeal level, cases decided in a sole or last
instance, when the appealed decision: a) is contrary to a provision of this Constitution; b) declares
the unconstitutionality of a treaty or a federal law (…)”.

Kaiser AB (2009) Bewertungsportale im Internet – Die spickmich-Entscheidung des BGH. Neue
Zeitschrift für Verwaltungsrecht 1474–1477
Kamp J (2001) Personenbewertungsportale. C. H. Beck, München
Kersten J (2017a) Schwarmdemokratie. Mohr Siebeck, Tübingen
Kersten J (2017b) Anonymität in der liberalen Demokratie. Juristische Schulung 193–203
Lauber-Rönsberg A (2014) Rechtsdurchsetzung bei Persönlichkeitsrechtsverletzungen im Internet.
Verantwortlichkeit von Intermediären und Nutzern in Meinungsforen und Personenbewer-
tungsportalen. MultiMedia und Recht 10–14
Michael L (2014) Die verfassungswandelnde Gewalt. Rechtswissenschaft 426–480
Michael L, Morlok M (2016) Direitos Fundamentais, tradução de de Sousa AF e Franco A. Saraiva,
São Paulo
Michael L, Morlok M (2019) Grundrechte, 7th edn. Nomos, Baden-Baden
Jestaedt M, Lepsius O, Möllers C, Schönberger C (2011) Das entgrenzte Gericht. Eine kritische
Bilanz nach sechzig Jahren Bundesverfassungsgericht. Suhrkamp, Berlin
Noelle-Neumann E (1980) Die Schweigespirale. Öffentliche Meinung – unsere soziale Haut. Piper,
Zürich/München
Popper KR (1945) The open society and its enemies. Routledge, London
Schröder M (2010) Persönlichkeitsrechtsschutz bei Bewertungsportalen im Internet. Verwal-
tungsarchiv 101:205–230
Schulze-Fielitz H (2013) Art. 5 I, II. In: Dreier H (ed) Grundgesetz, vol I, 3rd edn. Mohr Siebeck,
Tübingen
Smend R (2010) Bürger und Bourgeois im deutschen Staatsrecht. In: id. Staatsrechtliche Abhand-
lungen, 4th edn. Duncker & Humblot, Berlin, pp 309–325
Thiel T (2016) Anonymität und der digitale Strukturwandel der Öffentlichkeit. Zeitschrift für
Menschenrechte 10(1):9–24

Lothar Michael Professor of Public Law at Düsseldorf University. Main areas of research:
Fundamental Rights, Constitutional Theory, Administrative Law. Selected Publications: Grundrechte
(with Martin Morlok), Baden-Baden: Nomos, 2019 (7th edition); Direitos Fundamentais,
tradução de de Sousa AF e Franco A, São Paulo: Saraiva, 2016; Gleiches Recht der
Älteren auf gesellschaftliche Teilhabe, Baden-Baden: Nomos, 2018; Die verfassungswandelnde
Gewalt, in: Rechtswissenschaft, Baden-Baden: Nomos, 2014, pp. 426–480.
Digital Identity and the Problem
of Digital Inheritance
Limits of the Posthumous Protection of Personality on
the Internet in Brazil

Gabrielle Bezerra Sales Sarlet

Abstract Computers and information, data coding and handling systems are understood
as an extension of the human person, forging a web of unusual relations that
require protection according to the model of the possibilities of appeals, i.e. a
protection that is multidimensional and compatible with less control and greater
scope in the regulation of risks. It is important to ensure protection against risks
of material and immaterial damages, e.g. in cases of the creation of fake profiles,
violation of privacy, data retention and manipulation, stigmatization, discrimination
through registers etc. Personal data refers to all information of a personal char-
acter that are characterized by the possibility of identifying and determining the
data subject, whereas sensitive data are those that have to do with racial and ethnic
background, political, ideological and religious convictions, sexual preferences, data
about health, genetic and biometric data. The ensemble of these pieces of information
makes up the digital profiles or identities and has a political and, above all, economic
value, since it can become the raw material for the use of software that is directly
connected to the new forms of social control. Thus, data protection is, in sum, the
protection of the human person, mainly the protection of the free development of
his or her personality by guaranteeing informational self-determination, including
protection after their death. Digital inheritance is the set of data, digital assets which,
in sum, are digitized assets stored in the cloud. Considering that heritage involves
material and immaterial goods, the destination of the digital heritage of individuals
after their death becomes an urgent issue. This is, in principle, a conflict between the
enforcement of the fundamental right to inheritance and the guarantee of the right to
intimacy and privacy, which are essential to the exercise of personality rights. And
there is a demand for new forms of personality rights and inheritance rights that
encompass the notion of virtual personality and digital inheritance. Digital legacy,

This is the result of research developed in post-doctoral studies at the Pontifical Catholic University
of Rio Grande do Sul (PUCRS) and at the University of Hamburg, under the supervision of Professors
Carlos Alberto Molinaro and Marion Albers, and owes its realization particularly to the support and
granting of a scholarship by the Max Planck Institute of Hamburg.

G. B. S. Sarlet (B)
Pontifical Catholic University of Rio Grande do Sul—Law School (PUCRS), Porto Alegre, Rio
Grande do Sul, Brasil

© Springer Nature Switzerland AG 2022
M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_15

i.e., the sum of rights, assets and obligations in the digital sphere that must be passed
to the heirs, is framed in a classification of goods that cannot be economically valued
and of goods that can be economically valued. The former would have a predomi-
nantly affective value and the rest an undeniable economic value due to the direct
possibility of their monetization. The concept of digital inheritance means that multi-
national companies will have to take steps to protect the personal data of their users.
Despite the civil framework for the Internet in Brazil, there are still some legisla-
tive gaps involving both personality and virtual inheritance. The approach to digital
inheritance issues must go through the data protection issue. As long as the virtual
personality persists, essential data will be protected using passwords, encryption,
etc. However, since there is no specific regulation in Brazil, the pure and simple
application of inheritance rights to the digital universe affects both the personality
rights sphere and some of the most valuable rights and principles guaranteed by the
Constitution.

1 General Context

The radical influence of technology on daily life is currently undeniable and even
determines the resignification of the concept of human being.1 On the one hand,
due to the extreme complexity that results from the use of technology, particularly
technology applied directly to life, i.e. biotechnology, some canons of Western civi-
lization have become completely obsolete. Due to the implications of the information
age in the lives of contemporary human beings, some canons, such as the idea of
death as a temporal marker of human finitude, or canons inherited, e.g. from Shake-
spearean literature, such as the idea of a crystallized identity, or even resulting from
the work of Kant such as the free, unhindered self-determination as a substrate of
subjectivity, have lost their primary condition for humankind. Among the most devas-
tating effects, the abyssal change undergone by the concept of privacy and economics
in the last few years should be recalled.2 On the other hand, concepts such as glob-
alization, virtualization and connectivity have been added to people’s lives and, as a
rule, become indispensable to perceive the current context.3
The popularization4 of information technology, in its turn, has borne revolutionary
fruits that permeate all the way from the quantity of data that are currently made

1 Jayme (1995), p. 9.
2 About all this, see Castells (2016), pp. 57–103.
3 Castells (2003), p. 43.
4 In the first decade of the twenty-first century, the number of people connected to the internet went
from 350 million to 2 billion. Moreover, in the same period, the number of people with mobile
phones went from 750 million to 5 billion. For 2025 it is expected that most of the world population
will have access to instantaneous information, and if the rate of growth of people connected to the
internet is maintained, on that date there will be 8 billion people online. Cf. Cohen, Schmidt (2014),
p. 15.

available, the energy cost,5 the storage sites and security, the virtuality6 of social
relations to the velocity at which these data travel through the web and reach places
which had previously been unthinkable for human beings, besides the illusion of
neutrality that involves everyone in this process.7
Some consequences are already perceptible, while others point to a prognosis of
an abyssal divide in the history of humankind.8 Indeed, the revolution in the use of
information technology has caused profound changes in social relations, generating
political, economic,9 legal10 and existential11 effects.
There is an irrefutable mismatch between the parameters of comprehension and,
especially, of regulation in force and the current velocity and ephemeralness which are
characteristic of contemporary relations, especially after the advent of the internet.12

5 In 2011, internet use in the USA consumed an average of 2% of all the electricity produced worldwide.
The heat generated by the servers, active or not, requires compatible cooling, and, thus, it is estimated
that in 2020 consumption in the USA will rise to 140 billion kWh, 54% more. See Farinaccio (2016).
6 Agamben (2006), p. 35.
7 Neutrality refers to the principle that ensures that all the content transmitted by the computer web
should be treated in the same way, that is, should not suffer any kind of discrimination, be it for its
content, for its origin or for its destination. See, pars pro toto, Marsden (2010), p. 36.
8 Ramsay (2001), p. 45.
9 For a perception of the monetization of internet use, suffice it to recall that Microsoft in 2016
acquired LinkedIn, paying 26 billion dollars for the company and especially for its professional
register of 430 million users and 100 million visitors a month. The value represents 60 dollars per
user or 260 dollars per visitor a month, see The Economist (2016) LinkedUp.
http://www.economist.com/business/2016/06/18/linkedup. Accessed 11 Jan. 2022.
10 “Information Society—which is nothing more than a specific form of social organization, in which
the management, processing and transmission of information become the fundamental sources of
production and power, due to the new technological conditions that arose in this historical period.
The rise of this new society, therefore, brought about the need to rethink the role of the State in this
new context”, Leite (2016), p. 150.
11 Main characteristics of contemporaneity—a society structured in the form of a web, generating
infinite opportunities for control and surveillance depending radically on the information webs;
excessive volumes of information in comparison with the decrease of the production of knowledge;
hyperacceleration and hyperexposure—society of confession and intimacy, in which power was
replaced by attention and wealth can be translated by a person’s access to the sources of raw
material, namely, to information itself. The introduction of this information model altered the
cultural grammar of society radically, insofar as its structures were transformed and new behaviors
were generated, making those that were previously known more complex and, simultaneously,
initiating new conflicts that are not yet legally regulated because of their novelty. An immediate
consequence was the illusion that it was a completely neutral and, consequently, safe environment.
This situation caused a kind of displacement of a considerable group of people situated at the
margins of formal knowledge who, fascinated, give over their personal data, without much care,
including sensitive data, to reach the possibility of access to a simulacrum of digital citizenship and,
in this way, to feel included.
12 The virtual political sphere only reflects traditional politics and, in these terms, not all information
produces democracy, especially due to the volume, since access to the internet cannot ensure political
access and activity. Strictly speaking, digital democracy consists of an experience of the internet
and of its devices turned towards the potentiation of civil, active and responsible civil participation
in conducting public business.

Due to the weakening of the previous paradigms,13 typical of this era, many
challenges began to be imposed on all those who deal with the sciences, particularly
the human sciences and the applied social sciences.
One of the main challenges to be met is the analysis of the Copernican revolution
imposed by augmented reality, by virtualization, by the personification of avatars,
by the invention of new symbolic exchanges, by the overexposure of private life
on the social networks, by the excess of information, particularly personal informa-
tion, by re-structuring commercial transactions and by the need for rapid and precise
responses that have no precedent in Western civilization and even actually deter-
mine the appeal for a new modality of juridification, i.e. challenges coming from
the impacts of identity digitalization and, consequently, a reshaping of personality
protection in the digital environment.
This article is the result of an investigation, based primarily on the Brazilian legal
system, about the posthumous protection of personality on the internet. It highlights
the problem of digital inheritance, a topic that includes unusual property-related
aspects14 and shows new outlines for personality rights. It aims to propose some
possibilities of constructing a theory of probate rights compatible with the principle
of the dignity of the human person, as well as with the spheres of freedom and private
autonomy, focusing on the Brazilian legal system as far as personality rights are
concerned.
For this purpose, it will start from the problem of digital identity and the protection
of personality on the internet. Then it will discuss the problem of the digital inheri-
tance, its consequences for the legal domain, especially the new outlines of probate
law because of the post mortem guarantee of personality rights, and the potential
solutions in case law, legal doctrine and the laws that have already been adopted or
are presently under construction in Brazil.

2 Digital Identity

Strictly speaking, the internet challenges time and space and even produces immor-
talities in cyberspace, besides changing peoples’ lives, both individually and
collectively. Currently, computers and systems of information, data encoding, and
processing are seen as an extension of the human person,15 forging a web of
relationships that need adequate protection.

13 Braaten (1991), p. 44; Habermas (2001), p. 10.
14 Property is made up not only of economically measurable legal relations, but equally by
extraproperty relations, which, when violated, affect the person’s property. Therefore, personality rights are
part of the concept of property insofar as they are taken as immaterial elements that are inalienable
and due to the possibility that, once they are violated, they will generate economic elements such
as the effect of the right to indemnification and/or reparation. About the broader configuration of
the concept of property, see Branco and Mendes (2015), p. 323.
15 Sarlet (2013), p. 5.

The unique product of the web of meanings of the experiences, undergone or
not, throughout a lifetime is what is called personal identity. This is a notion from
philosophy and logic, archetypal for various areas of investigation about human
beings, and it is characterized by a plurivocity and an emphasis on the chronological
and spatial aspects for identification.16 The civil system of protection of the human
person, in turn, necessarily involves the transformations of the concept of identity,
which was initially seen from an individualist perspective and not as an asset or value,
a biographical synthesis produced in a relational dimension that fabricates a property,
be it of an intellectual, ideological, ethical, religious, sexual or professional nature.
Thus, it requires a bundle of rights that will correspond to the core of personality
protection, but that will also take the aspects of probate into account.
Consequently, the protection of identity can be broken down into at least two
aspects, viz. protection of the personal identity proper, which, in brief, also aims
at the free development of the personality, such as honor, reputation etc., and into
the necessary protection considering the current techniques used to identify persons.
As part of the texture of personality rights,17 the right to identity18 is generally the
basic premise for the current shape of the democratic Rule of Law and, therefore, it
is guaranteed worldwide by a great number of constitutions and by Brazil’s Federal
Constitution of 1988 (henceforth, FC/88).19
The internet promoted the maturing of the concept of digital identity,20 enabling
e.g. interaction between persons in communities of similar interests21 and, in this
way, creating the illusion of a private environment under the aegis of contextualized
privacy.
In this context, one must take into account that the communicational environment
is anchored in the perspective of Homo Faber, which means that each person
engenders his own private sphere,22 turned outward, on the basis of the prior selec-
tion,23 dissemination and removal of personal information, in which the average
person is necessarily led to become the maker of their own life in a trajectory that,
by means of continuous exposure, i.e. through a superproduction of data, leads them
from anonymity to a celebrity status.24
Personal data, in this sense, are all the strictly personal information characterized
by identifiability and determinability of its holder, while sensitive data are those that

16 Meyer (2011), pp. 23–25.
17 Cupis (1959), p. 45; Schreiber (2014), p. 6.
18 Cicco (2017); Moraes (2003), p. 27.
19 About the fundamentality of the right to identity in the FC/88, see: Sarlet (2018).
20 To understand the distinction between virtual and digital identity, see Meyer (2011), p. 52.
21 Negroponte (2009), p. 195.
22 Fena, Jennings (2000), p. 15.
23 Balkin (2009), p. 428; Dwings (2015), p. 67.
24 Lévy (2002), p. 47.

deal with racial and ethnic origin, political, ideological and religious convictions,
sexual preferences, data on health, genetic data, and biometric information.25
The ensemble of this information makes up the digital profiles or identities and
has political and especially economic value. It is material and immaterial property
and can be employed as raw material for the use of software attached to new forms
of social control and the current modalities of immortalization.26
Digital identity27 is, thus, the result of the use of this category to facilitate the
clarification of the phenomenon of digitalization in a human being’s life, especially
regarding their singularization as a person, inside or outside the cybernetic environment.
It consists of a set of information transformed into bits or pixels that represent a
human person and can be used in the relationship with machines or with other users,
e.g. passwords, data on recognition of the face, the voice, the iris, fingerprints.
Digital identity, however, does not consist only of the spontaneously supplied data, but
also of the digital footprints or shadows, i.e. the history of all transactions performed
by the user28 that form the records of the sites and portals of access to the internet.
Proportionally to the use of the internet, the digital shadows or footprints include
images on surveillance cameras, banking transactions, telephone calls, medical data,
scanner copies and hospital exams, credit information, history of convictions, espe-
cially criminal,29 in other words, all the information that can be accessed in the
datacenters.
Data protection is, in brief, the protection of human persons in a version updated
to the demands of contemporaneity, especially regarding the protection of the free
development of their personality by ensuring their informational self-determination,
which necessarily includes post mortem protection. One of the main consequences
of this new cultural dimension is the production of digital property.30 In this way,
a demand for new outlines of personality rights and probate rights arises, namely,
outlines that will include the notion of virtual personality31 and digital inheritance.
The usual violations of personality rights, by means of global attacks or by the
isolated conduct of private persons, have included data theft, which represents, among
other cybernetic crimes, the theft of digital identity,32 the creation of false profiles
in social media, the theft of financial assets, and even, because of the lack of secure

25 Biometry is the anthropometric act of marking, according to the criteria of Alphonse Bertillon.
It is a general and subtle form of control by the State that allows prosecution.
26 Cha (2005). In this sense the view of Statement 531 of the VI Jornada de Direito Civil (6th
Meeting on Civil Law) promoted by the Council of the Federal Justice in March 2014 is relevant.
It says: “The protection of the human person dignity in information society includes the right to be
forgotten”.
27 Júlio (2005).
28 Bock (2017), pp. 370–417.
29 Molinaro and Sarlet (2013), p. 69.
30 Lima (2010), p. 95; Reis (2010), p. 95.
31 Oliveira (2002), pp. 121–143.
32 The so-called Streisand effect must be included in the analysis of the guarantee of the right to
data protection, since publicity about the damage may trigger many people’s curiosity, as occurred
in the case involving the actress Barbra Streisand.

parameters, usurping personal data after the user’s death. Likewise, there are cases
of discrimination, inciting hatred, digital public shaming, revenge porn etc.
In general, there are two types of data usurpation: the use of already existing
accounts and profiles, and the creation of accounts or profiles based on stolen data,
which has become a very recent problem for the electoral systems of many countries33
when surveying voting intentions.
It should be recalled that this is a behavior that directly affects, among others,
the fundamental right to intimacy enshrined in article 5, X of the FC/88, which, by
the way, was useful for the protection of personality, from which various rights and
guarantees are extracted, including the fundamental right to the free development
of personality. It should also be underscored that Brazilian civil law confirms the
inviolability of the private life of the natural person along the lines of article 21 of
the Brazilian Civil Code (BCC), also covering the protection of the deceased, as
provided by the sole paragraph of article 12 of the BCC, particularly as regards the
objective honor.

3 Digital Inheritance and the Posthumous Protection


of Personality on the Internet

Digital inheritance is the composition of a set of data, of digital assets that, in brief,
are virtual goods that are usually kept on the internet. As property involves both
material and immaterial goods, the question of the destination given to
the digital property of a person after he or she dies becomes imperative, especially
considering that the attribute of matter about the concept of property is an operational
rather than an essential construction.34
In brief, there is a profusion of concerns resulting from the concept of digital
death and its effects when there is a dearth of decisions by the deceased about their digital property.
Basically, a conflict is revealed between applying the fundamental right to inheritance
and guaranteeing the right to the inviolability of communications, to intimacy and
to privacy, essential to exercising personality rights, insofar as the person’s action on
the internet is basically connected to the intimate sphere in the composition of an
expression of personality which often is different from the one that is manifest to the
heirs.

33 Globo (2017).
34 Abreu (2003), pp. 30–45.

3.1 Problems and Approaches

Traditionally, the digital legacy,35 i.e. the sum of rights, goods, assets and obligations
within the digital sphere that should be transmitted to the heirs is structured in a
classification of goods that are not susceptible to economic valuing and goods whose
value can be measured economically, in which the former have predominantly an
affective value, such as photographs that make up the documental memory, and the
others have an undeniable property value due to the direct possibility of monetary
conversion.36 As long as the data are stored on personal computers or on a USB stick,
their sharing along the lines of the current laws is unarguable.
Indeed, the problems lie in the regulation of the posthumous disposition of the
data stored in the cloud and in the management of digital accounts. Examples of this
are e-mail accounts, access to insurance portals, passwords and records of financial
transactions in internet banking, access to mileage and to the system of scores in sites,
online purchases and subscriptions, portal and blog maintenance, files and purchasing
of space in the cloud, digital counseling services, investments and statements of online
transactions in the stock exchange, credits in virtual coins such as bitcoins and others,
management of avatars, management of profiles in social networks, digital assets
on music and game platforms, management of accounts such as Twitter, LinkedIn,
Amazon, Skype, Netflix, YouTube, eBay, WhatsApp, Instagram etc.
The internet, insofar as it is a deterritorialized field of action for persons, is, in
general, averse to attempts at State regulation. It is claimed that the internet is a
free environment37 and, therefore, an arena of private autonomy. Currently private
entities38 are the main target of the modalities of regulation designed to implement
full digital democracy, whose essential core is the strengthening of the human person
as an active subject through the expression of free informed consent of its participants.
Thus, a virtual will would be the best option, especially because of the legal gaps,
although it is not yet commonly used, above all in Brazil. On the other hand, the
simplistic solution of the use of passwords by third parties for the cases of
access to accounts or to profiles is, to say the least, inappropriate, since, besides
the many conflicts regarding family and probate law, it represents a crime of false
identity defined as such in article 307 of the Brazilian Criminal Code.39
By the way, the concept of digital inheritance, because of its complexity and novel
character, generally implies a new way of articulation of the transnational companies
regarding the protection of the personal data of their users.

35 Bock (2017), p. 372.
36 About digital assets and the rise of regulation in Brazil, see Castro J, Hirata L (2017).
37 About the problem of regulation and some control, it is interesting to see the shape of the
manifesto of cyberspace independence, although it is apparently naïve and rather disconnected
from reality. Barlow (1996).
38 Sarmento (2006), p. 34.
39 Article 307 of the Brazilian Criminal Code: “To attribute to oneself or to a third party a false
identity to obtain advantage for oneself or another, or to cause damage to somebody else”.

There is no concept that is solidified either in legal scholarship or in the legislation
about the limits and availability of the digital inheritance, because this is a recent
topic. It gained worldwide proportions in a case that emerged in 2004, in which
Yahoo was requested to give the parents access to the contents of the e-mail of
Justin Ellsworth, an American soldier who died in the Iraq war. Among the few
representative decisions about this matter are also the judgments of German civil
courts that ultimately enabled the mother of a Facebook user to access the account
that had been transformed into a memorial. This is an innovative case because, besides
literally using the nomenclature digital inheritance, it revealed several gaps in the
law and the legal doctrine. One of the results the court of first instance arrived at was
the necessary distinction between strictly personal data and common digital data,
as well as the reflection on the reshaping of the various possibilities of protection,
which includes posthumous protection. According to the plaintiff, who is the mother
of the 15-year old girl, the request to transform the account into a memorial, which
had been made by an unknown person, prevented full access to it, creating obstacles
to the search for answers to her daughter’s suicide. In the final outcome, the Federal
Court of Justice, supported by the principle of universality of inheritance and family
protection, accepted the claimant’s request and held Facebook liable.40
In the current Brazilian context, there are also cases of family members who try to access the data of a person who has died or become disabled. There is not yet large-scale monetization of users' activity on the internet, nor is the notion of digital property concrete for the ordinary citizen. This would be the main reason for the precarious treatment of the topic in national legal scholarship and case law, which makes it difficult to enjoy rights, particularly due to the lack of specific normative parameters, implying flagrant legal uncertainty.
From this perspective, it is useful to record some measures that companies like Microsoft,41 Google and others have begun to employ regarding digital death.42 Google, if there is a virtual will, accepts passing the information on to the responsible heir, allowing access to the deceased user's accounts, requests to close them and even the reporting of intrusion by third parties. Regarding inactive accounts, Facebook offers two possibilities: transformation into a memorial or deletion. In these terms, it has provided users with the opportunity to indicate a digital heir who, although unable to erase or reconstruct the memory by altering the posted content or to access the messages exchanged, may accept new friend requests and change the profile photo, i.e. manage the memorial. Instagram, through an online form to which the user's birth and death certificates can be attached, accepts the deletion of the account or its transformation

40 See Alexander (2016), pp. 301–308, and, with a detailed analysis of the court decisions and the German legal situation, Heidrich (2022), in this volume.
41 Requests for information regarding the account of a deceased person should be sent to the e-mail address msrecord@microsoft.com or to the physical address: Next of Kin, One Microsoft Way, Redmond, WA 98052.
42 Apple (2018).
364 G. B. S. Sarlet

into a memorial.43 Twitter offers the possibility of removing photos and files of deceased persons if their death is confirmed. Microsoft enables access to the content of e-mail accounts (Hotmail, Outlook and Live) by the representative of the deceased user, requiring a court order for this. Amazon, for instance, does not accept the transfer of licenses to third parties, alleging that what was acquired is not a purchase but a service, a license for use, and in this situation it acts in disregard of consumer protection.44 The cloud data storage service Dropbox45 allows access to the data and files after the user's death upon petition, provided the petitioner presents a court order. Finally, the ambiguity of applying copyright to e-mail accounts should be underscored, since the use of the great majority of them is perceived as a license to use a service, which expires when the user dies.
Ultimately, one can observe a general tendency to adopt the DAP (Digital Asset Protection) trust system, which allows the customer to designate a person or a company46 to access and manage their files in case of death or severe disability.
Of outstanding relevance to this topic is the act of consent as a cognitive process in which all information about the relevance, adequacy, purpose, time of collection, storage, processing and transmission of the data obtained must be clarified in advance, in clear, precise, appropriate and sufficient language, so as to enable the internet user's production, renunciation, alteration, use, assignment, disposal or refusal. It is thus important to ensure protection against the risks of material and immaterial damage, e.g. the creation of false profiles, violation of privacy, retention and manipulation of data, stigmatization, discrimination by means of registries, etc.47 This is in fact the starting point for the idea of the virtual will, that is, an expression of freedom presupposing ample, correct and precise information.

3.2 On the Legal Regulation of Digital Inheritance in Brazil

3.2.1 General Approach to Succession Rights in Brazil

According to Beviláqua, inheritance is the totality of the goods that someone leaves
when they die and that are acquired by the heirs.48 Inheriting is to succeed to the
ownership of property according to article 91 of the Brazilian Civil Code (henceforth

43 Instagram (2017).
44 UN Resolution 39/248, April 9, 1985, establishes the guarantee of consumer protection as a duty of all member States.
45 Dropbox (2017).
46 As an example, see Perpetual Websites. Available via internet. http://web.archive.org/web/20180811110313/http://www.perpetualwebsites.net/. Accessed 11 Jan. 2022.
47 Beyer and Cahn (2012), pp. 40–41.
48 Bevilaqua (1978), p. 2.

BCC), i.e. it is circumscribed to legal relations endowed with economic value, since personality rights are, as a rule, non-transmissible.49
As a direct result of the fundamental right to private property, the right to inheritance is enshrined in article 5, XXX of the FC/88. The law of probate and succession, a corollary of private autonomy, is triggered by the real or presumptive death of a natural person according to article 6 of the BCC. It exists to preserve private property and, in terms of its social function, mainly addresses family maintenance,50 and it should be underscored that the inheritance also encompasses obligations contracted by the deceased.
If there is no express provision to the contrary, the transmission of property to the heirs is instantaneous, due to the principle of saisine enshrined in article 1784 of the BCC. The main effects of applying the legal fiction originating in the institute of saisine51 are: the opening of the inheritance upon the real or presumed death of the person and the immediate attribution of both possession and ownership of the inheritance to the heirs, who thereby acquire ad causam standing, especially the power to protect the inheritance against attacks by third parties. It consists of a subjective mutation, so that the goods are transferred just as they stood in the previous possession. The application of saisine, which since its medieval inception has referred exclusively to the property aspects of inheritance, must obviously be relativized for the sake of protecting the personality rights of the deceased person.
The law establishes in article 1845 of the BCC the list of necessary heirs, since the
Brazilian lawgiver limited the freedom to dispose of the goods insofar as it protected
the legitime (see articles 1845 and 1846 of the BCC).
Article 80 of the same statute, in accordance with what was underscored above, treats the right to an open succession as immovable property, and, emphasizing this guarantee, the transmission is deemed to occur at the moment the succession opens, not when the inventory is opened, at the time of the small-estate probate or at the time of the partition proper. Likewise, because of the principle of indivisibility of the inheritance and its regulation along the lines of co-ownership, derived from article 1791 of the BCC, any of the co-heirs can claim the inheritance against a third party. Further on this point, mention should be made of article 1825 of the BCC, which establishes that the action petitioning for the inheritance, even if brought by only one of the heirs, may encompass all the hereditary assets. It should also be mentioned that, according to articles 1793–1795, an heir cannot cede any individual asset on their own, unless there is a court order and the right of preference of the other heirs is strictly observed.
Succession52 is thus divided into intestate and testate succession, the latter referring to the act of disposition of last will, according to articles 1786 and 1788 of the BCC. Article 227, paragraph 6 of the FC/88 innovated by establishing equality among offspring and by generating isonomic effects on the order of

49 Morato (2012), p. 153.
50 Dias (2008), p. 24.
51 Bevilaqua (1956), p. 43; Miranda (1966), p. 22.
52 Hironaka (2011), p. 34.

succession of article 1829 of the BCC of 2002, providing that succession occurs by classes that exclude one another, in an order of preference. Thus, even in the hypothesis of an estate in abeyance, in which after a period of five years the legacy is transmitted to public ownership, Brazilian civil law recognizes only intestate heirs, testamentary heirs or optional heirs, highlighting the possibility of exclusion due to unworthiness (article 1814) or disinheritance (articles 1961 and 1964).
From article 1847 of the BCC it is inferred that the calculation of the legitime must consider half the total value of the assets existing at the time the succession opens, after the debts and funeral expenses are deducted and the values of the assets subject to collation are added. After transmission of the inheritance, each of the heirs must confirm the acquisition, accepting it or refusing it by waiver. Both acceptance and waiver have retroactive effects.53 Partial acceptance or waiver is not admissible in Brazilian law. Waiver, provided for in articles 1804 and 1805 of the BCC, is a potestative right exercised through a unilateral, irreversible and solemn legal transaction undertaken by means of a public instrument, not dependent on ratification, and distinguished into express and translational waiver; it is applicable only when there is a finding of full capacity and the agreement of the spouse, save in cases of a regime of full separation of property, and insofar as it does not cause losses to the creditors.
It is relevant to underscore that only in the absence of necessary heirs can the testator dispose of the assets in full. According to articles 1857 to 1859 of the BCC, a will is a strictly personal legal transaction and an act of last will that is not restricted to values or property but may extend to existential matters.54 It can be used, for instance, to acknowledge children, to forgive an unworthy heir, to give last directives along the lines of a living will, to appoint a guardian, or to exempt previously donated assets from collation. The validity of the will is consequently conditional on compliance with the elements intrinsic to legal transactions, i.e. the testator's capacity, the spontaneity of the statement of will, the lawfulness of the object, respect for limits and formal elements, etc.55
When it came into force, the New Code of Civil Procedure (henceforth NCCP) did not relevantly change the regulation of inheritance or of probate proceedings in the national legislation. In general, the most prominent change was the impossibility of opening the inventory ex officio. It included the possibility of processing and settling the partition through a public deed recording and surveying the assets when there is no will and the parties are in agreement and capable. It established as venue the domicile of the deceased, regardless of the place of death, and, where the domicile is not certain, that of the real estate; if there are multiple domiciles, any of them can be chosen. It should also be

53 Hironaka and Pereira (2004), p. 7.
54 Here there are similarities between the institutes of the living will and the digital will, since each is an act containing a last disposition of will which, if the person becomes incapacitated, takes effect even during the testator's lifetime. Rizardo (2011), p. 219; Lippmann (2013), p. 17.
55 Pereira (2012), p. 184.

explained that any of the parties with standing, and likewise the parties interested in the property, may request that proceedings be started. There is also the possibility of granting an injunction which, in brief, anticipates the use and fruition of goods according to articles 647–649 of the NCCP and to article 2017 of the BCC, which outlines the principles governing the act of partition.
To understand the institute of small-estate probate in its common and summary modalities, it is useful to clarify the content of articles 664, 665 and 667 of the NCCP. The common modality is limited to inheritances of up to 1,000 minimum wages, but there is a new provision for its application in cases in which the parties, and likewise the Prosecution Office, are in agreement, even if the interests of minors are involved. Regardless of values, the summary small-estate probate occurs when the parties are capable and there is no litigation. Conveniently, the lawmaker provided for a court order as a substitute for probate in small conveyances of money of no more than approximately R$ 10,000.00 (ten thousand reais), according to Law 6858 of 1980.

3.2.2 On the Regulation of Digital Inheritance in Brazil

It is undeniable that, despite the efforts that produced the Brazilian Civil Rights Framework for the Internet, and despite what can be inferred from the constitutional text, civil law, consumer law, the Law of Access to Information and the constitutional enshrinement of the institute of habeas data, there are still many gaps regarding issues involving both the posthumous dimension of the personality and digital inheritance.
The fact is that the problem of digital inheritance necessarily hinges on the matter of data protection,56 which still lacks specific regulation in Brazil, despite the general guarantee of protection of fundamental rights and personality rights, especially through the recent legislation (Federal Act N. 13.709/2018), which did not include a solution for this problem. Moreover, this law has only recently come into effect.
This is in fact a topic correlated with the protection, derived both from the Consumer Protection Code (henceforth CPC) and from the BCC, against abusive contractual clauses, on which both statutes confer ex tunc nullity. Such nullity must be asserted by the Prosecution Office, civil organizations and state bodies through public civil actions, and by those who suffered losses through individual actions.
In this way, the protection circumscribed to adhesion contracts, a common modality in the digital world, is reaffirmed, and it is emphasized that this involves a presumption of absolute vulnerability. It is also relevant to reaffirm the provision for the interpretation most favorable to the adhering party in cases of ambiguity or contradiction, which can be inferred from article 423 of the BCC, besides what is provided for in article 47 of the CPC and in article 170, V of the FC/88, which deals with consumer protection.

56 Rodotà (2006).
Thus, despite the normative bases already established, insofar as Brazil lacks specific legislation for cases of digital inheritance, the pure and simple application of probate law to the digital universe, besides failing to capture the specificities of the topic, would affect both the sphere of personality rights and some of the most valuable constitutionally ensured rights and principles.57
In this sense, as to the disposition of digital property and considering its novel character,58 the modality of the virtual will arises, aimed, though not exclusively, at the disposition of existential aspects. It consists of a statement of will by a natural person in full enjoyment of their capacities and full exercise of their autonomy, whose object is an authorization or a total or partial restriction of the management of the assets that make up the digital property, in view of the future impossibility of doing so because of death or potential disability.
The adaptation of this offshoot of the institute of the will to digital reality is noteworthy because of its necessary relationship with the right to free development of the personality, with informational self-determination and with other fundamental rights enshrined in the FC/88. As regards the transmissibility of goods that can be converted into money, it is important to highlight the wording of article 82 of the BCC on the concept of personal property, in connection with article 83 of the same statute insofar as it deals with energies that have economic value, which therefore entails the due application of probate law to such property.
It is relevant to explain the relationship between the virtual will, the term more commonly used in legal scholarship, and the idea of a digital codicil, which would be very appropriate for existential situations. According to article 1881 of the BCC, a codicil is an act of last will through which a person establishes special provisions regarding their burial and/or the legacy of low-value personal property.
The problem that remains is the real impossibility of quantifying the monetization of certain digital files and, thus, of establishing a definitive boundary between the two modalities that make up digital property, especially considering the dynamics of the virtual world itself, which constantly alters the traditional forms of measuring real values and of monetization. As an example of the tenuous boundary between the protection of property and of personality, one can point to the use of radiofrequency identification technology, known as RFID, currently employed through subcutaneously implanted microchips to facilitate the tracking and use of data ranging from personal, biometric and banking information to passwords for access to sites and accounts.59

57 Chehab (2015), p. 56.
58 Rohrmann (2005), p. 195.
59 Barros (2006).

Here it is appropriate to underscore the content of the law bills that sought to regulate this matter: PL 4099/2012 and its appendix, PL 4847/2012.60 Under the latter, a Chapter 2-A would be added: "Article 1797-A. Digital inheritance is granted as the intangible content of the deceased, everything that can be stored or accumulated in a virtual space, under the following conditions: passwords; social networks; internet accounts; any virtual and digital good and service owned by the deceased. Article 1797-B. If the deceased, being capable of making a will, has not done so, the inheritance will be transmitted to the legitimate heirs.
Article 1797-C. The heir must: (a) transform it into a memorial, restricting access to friends; (b) erase all of the user's data, or; (c) remove the account of the former user." Meanwhile, PL 4099/2012, which was pending in the Federal Senate, sought exclusively to ensure to the heirs the transmission of all digital contents and goods, as well as access to the content stored in e-mail accounts and social networks owned by the deceased. To this end, only a sole paragraph would be added to the article: "Article 1788. […] Sole paragraph. The heirs will receive all the contents of digital accounts or files owned by the deceased."
It should be added, however, that the Brazilian Federal Senate shelved both law bills in December 2018. Moreover, the provisions and the protection offered by both bills were plainly insufficient, above all because of the lack of a broad discussion of the topic in Brazil and because they did not encompass the whole of the problem, especially in terms of facing the idea of non-temporality brought by the internet and its repercussions, not only in the legal sphere, but particularly as regards the limits of factual applicability and real efficacy.

3.2.3 On the Posthumous Protection of Personality on the Internet in Brazil

It is evident that, if death occurs, one cannot immediately substitute the owner of digital property without affecting or even violating the dignity and personality rights of the deceased, particularly their rights to intimacy and privacy. It thus becomes clear that systematic interpretation is the exegetic approach best suited to establishing guidelines for judicial solutions in the sphere of digital law, taking as its central nexus the provisions of the FC/88 in order to build constellations based on the rules, legal scholarship and precedents related to the topic.
The enactment of the FC/88 was auspicious for the protection of personality rights, especially due to the absence of restrictions on the rights and guarantees provided in the constitutional text and due to its compatibility with the rules that make up the

60 Brasil, PL 4847/2012. Available via internet. http://www.camara.gov.br/proposicoesWeb/fichadetramitacao?idProposicao=563396. Accessed 11 Jan. 2022.

protection of human rights. The BCC, on the other hand, going against the legislative tendency, instead of creating an opportunity for full and, in this sense, dialectical protection, followed the pattern that prioritizes a list of defensive rights and therefore does not reach the level of protection demanded today.61
It is important to emphasize the protection of the personality guaranteed by Brazilian law even after death, according to article 12 of the BCC, mainly because it allows the modeling of balancing parameters.62 The Superior Court of Justice, in an emblematic judgment on the protection of the honor and image of the world-famous soccer player Garrincha, stated that "the image and honor of someone who dies still deserve protection, as they do not become nobody's things, because they remain eternally remembered in memories as immortal goods that are prolonged well beyond life and are even above it".63
Despite criticism of the restrictive and confusing character of articles 12, 20 and 21 of the BCC, as well as their incongruence with the non-transmissibility recognized for personality rights, there is an undeniable need to create a protection compatible with the constitution that covers not only the image and objective honor but above all the privacy64 of the deceased.65 The BCC does not specifically contemplate the post mortem protection of personality on the internet. There is still a lack of case law, and the outstanding paradigm is a decision of the State Appeal Court of Mato Grosso ordering that a Facebook profile be deleted under penalty of a daily fine, given the demonstrated violation of personality rights.

3.2.4 Provisions that Make up Digital Law in Brazil and Affect the Posthumous Protection of the Personality on the Internet

As to the construction of memory, the full protection already guaranteed by Brazilian law should be emphasized. Although Federal Statute 12,965 of April 23, 2014, the so-called Marco Civil da Internet (Brazilian Civil Rights Framework for the Internet), is not a general data protection law, it establishes principles for dealing with certain questions arising from digital inheritance, and the content of some articles underscoring the guarantee of privacy should be highlighted in particular. According to article 3, the discipline of internet use in Brazil rests on the following principles: guarantee of freedom of speech, communication and expression of thought in accordance with the FC/88; protection of privacy; and protection of personal data. According to article 6, in the interpretation

61 Tepedino (2004), pp. 23–58.
62 Schreiber (2018).
63 Brasil, Superior Tribunal de Justiça (2003), Recurso Especial 521697/RJ. Min. Rel. César Asfor Rocha. Available via internet. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_registro=200300533543&dt_publicacao=20/03/2006. Accessed 11 Jan. 2022.
64 Doneda (2006), p. 2.
65 Schreiber (2014), pp. 154–158.

of this law, besides the foundations, principles and objectives provided, the nature of the internet, its uses and habits will be considered, as well as its importance for the promotion of human, economic, social and cultural development. From the content of article 7 one can infer the rules on moral and material damages in case of violation of intimacy and private life, as well as on the confidentiality of the flow of communications and of stored communications. It provides for the right not to have personal data supplied to third parties except with the user's consent, the right to the definitive exclusion of personal data supplied to a specific internet application, and the publicity and clarity of any policies on use adopted by connection and application providers. Articles 10 and 11 structure the basis for guaranteeing the right to data protection in Brazil, referring to the regulation issued by Decree 8771/16.
In general, the user is assured, among others, of the following rights:66 inviolability of intimacy and private life, their protection and indemnification for material or moral damage resulting from their violation on the internet, save by court order as required by law; inviolability and confidentiality of their private communications, and definitive exclusion, at their request, of personal data they may have supplied to a given internet application at the end of the relationship between the parties, save for the hypotheses of mandatory retention of records provided for in this law; and the holding and making available of records of connection and of access to internet applications dealt with in this law, as well as of data, in a way that preserves the intimacy, private life, honor and image of the parties directly or indirectly involved.
Strictly speaking, the General Law on the Protection of Personal Data and the provisional measure that created the National Data Protection Authority do not directly address the problem of digital inheritance. This legislation has only recently entered into force, but this legislative experience, influenced by the European General Data Protection Regulation (GDPR), is becoming ever more decisive for the creation and formulation of guidelines and solutions related to the protection of fundamental rights in the internet environment.

4 Final Considerations

In view of the foregoing, one could say that a fundamental aspect of regulating the protection of personality rights in relation to digital inheritance is the verification of the insufficient effectiveness of the current legislation in Brazil. It is thus necessary to consolidate a new legal framework for fundamental rights, particularly for personality rights, that takes into account the peculiarities of the digital and virtual world without breaking with the regulation and main principles that apply to the analog world. In this field there is a need for a posthumous protection of the personality offering solutions to handle, for instance, the breakdown of chronological time by the internet and the other aspects already presented.

66 Hoffmann et al. (2015), p. 217.



Although controversial, the expression digital inheritance has become established in legal doctrine, especially after the Berlin trial and the corresponding court decisions already mentioned. Besides this, there are still several dimensions to be unveiled regarding rights violations on the internet and the monetization of the use of data, particularly personal data and, among them, sensitive data. Thus, although far from perfect, the terminology is useful for thinking about the need for more effective protection of the ensemble of data collected and produced, inside and outside the digital world, over a lifetime, which is an essential part of the personality and likewise of the property of persons.
The absence of secure normative parameters in Brazil regarding this subject creates great legal uncertainty and even a certain unfamiliarity among users regarding their rights, generating a kind of inducement to the inappropriate use of passwords and the usurpation of digital identity. However, it does not hurt to reaffirm that the legislator, above all because of the dynamic aspect that characterizes personality and permeates relations on the internet, is not expected to provide abstract, prêt-à-porter solutions, but rather parameters for balancing the different interests to be protected that will guide the judiciary, the administrative authorities and especially the common user, forming a new constellation of personality rights and, thus, an effective protection of digital identity and, consequently, of digital inheritance.
Of chief importance in this regard is the compulsory strengthening of private autonomy, despite the simulacra created by companies through service contracts, and, for this purpose, the proposed law bill on digital inheritance that was discussed in the Federal Senate was clearly innocuous, since it was not functional and did not supply an adequate legal construction. It does not hurt to recall, as an example, that the multiplicity of contracts with internet providers and their posthumous consequences generally prevent a mere application of the theory of contracts and of probate law in their current form. Indeed, the conception of inheritance entails both its universality and the transmissibility of property, and it is inconceivable to apply it when what is at stake is strictly personal data such as personal e-mail, etc.
There is undoubtedly a worldwide tendency to adopt the DAP trust system that
allows the customer to designate a person or company for access to files and data
management in case of death or severe disability. It consists in the consolidation of
the process of knowledge and consent as elementary for the free development of the
personality and for the fruition of human and fundamental rights also in the digital
environment.
In this sense, it would even be valid to include a virtual will clause as a prior condition when opening an account on the internet. In any case, ultimately, the option of a virtual will appears to be the most appropriate. It is therefore crucial to distinguish between assets that can and cannot undergo economic valuation, which in turn would enable a differentiated protection of the ensemble of data that, while making up the digital identity proper, at the same time makes up the intimate sphere of the digital memory. Finally, it is essential to remember that, even though there is no specific legislation on this subject in Brazil, the already existing constitutional and legal framework and

the centrality of the protection of the human person have been invoked by judges and courts in several decisions, developing some quite interesting and consistent criteria and strengthening, at least in part, the protection of personality rights on the internet.

References

Abreu R (2003) A emergência do patrimônio genético e a nova configuração do campo do patrimônio. In: Abreu R, Chagas M (eds) Memória e patrimônio: ensaios contemporâneos. DP&A, Rio de Janeiro, pp 30–45
Agamben G (2006) Lo abierto: el hombre y el animal. Adriana Hidalgo, Buenos Aires
Alexander C (2016) Digitaler Nachlass als Rechtsproblem? Überlegungen aus persönlichkeitsrechtlicher, datenschutzrechtlicher und vertragsrechtlicher Sicht. Kommunikation und Recht, pp 301–308
Balkin JM (2009) The future of free expression in a digital age. Pepperdine Law Rev 36:427–444
Barlow J (1996) A declaration of the independence of cyberspace. Available via internet. https://
www.eff.org/cyberspace-independence. Accessed 11 Jan. 2022
Barros M (2006) Sistema permite identificar mercadorias, animais e até pessoas usando sinais
de radiofrequência. Available via internet. http://www1.folha.uol.com.br/fsp/informat/fr0103200
601.htm. Accessed 11 Jan. 2022
Bevilaqua C (1978) Direito das Sucessões. Editora Rio, Rio de Janeiro
Bock M (2017) Juristische Implikationen des digitalen Nachlasses. Archiv für die civilistische Praxis (AcP), pp 370–417
Braaten J (1991) Habermas’s critical theory of society. State University of New York Press, New
York
Branco PG, Mendes GF (2015) Curso de direito constitucional, 10th edn. Saraiva, São Paulo
Brasil, PL 4847/2012. Available via internet. http://www.camara.gov.br/proposicoesWeb/fichadetr
amitacao?idProposicao=563396. Accessed 11 Jan. 2022
Brasil, Superior Tribunal de Justiça (2003), Recurso Especial 521697/RJ. Min. Rel. César Asfor
Rocha. Available via internet. https://scon.stj.jus.br/SCON/GetInteiroTeorDoAcordao?num_reg
istro=200300533543&dt_publicacao=20/03/2006. Accessed 11 Jan. 2022
Beyer GW, Cahn N (2012) When you pass on, Don’t leave the passwords behind: planning for
digital assets. Probate and Property 26:40–41
Castells M (2003) A galáxia da internet: reflexões sobre a Internet, os negócios e a sociedade. Jorge
Zahar, Rio de Janeiro
Castells M (2016) A sociedade em rede. A era da informação: economia, sociedade e cultura, vol
1, 10th edn. Paz e Terra, São Paulo, pp 57–103
Castro J, Hirata L (2017) Ativos do Brasil se destacam e são opção entre emergentes. Avail-
able via internet. http://www.valor.com.br/financas/5152374/ativos-do-brasil-se-destacam-e-sao-
opcao-entre-emergentes. Accessed 11 Jan. 2022
Cha A (2005) After death, a struggle for their digital memories. Available via internet. https://
www.washingtonpost.com/archive/politics/2005/02/03/after-death-a-struggle-for-their-digital-
memories/074e8451-e756-4f6f-8c47-01b86f3e465b/. Accessed 11 Jan. 2022
Chehab G (2015) A privacidade ameaçada de morte. LTr, São Paulo
Cicco MC (2017) O ‘novo’ perfil do direito à identidade pessoal: o direito à diver-
sidade. Annali della Facoltà Giuridica dell’Università di Camerino, 6, 2017. Available via
internet. http://d7.unicam.it/afg/sites/d7.unicam.it.afg/files/DeCicco_O%20novo%20perfil%
20do%20direito%20à%20identidade%20pessoal_0.pdf. Accessed 11 Jan. 2022
Cohen J, Schmidt E (2014) The new digital age: reshaping the future of people, nations and business.
John Murray, London
Cupis AD (2014) I diritti della personalità. Guiffrè, Milano

Dias MB (2008) Manual das Sucessões, 5th edn. Revista dos Tribunais, São Paulo
Doneda D (2006) Da privacidade à proteção de dados pessoais. Renovar, Rio de Janeiro
Dropbox (2017) Política de privacidade do Dropbox. Available via internet. https://www.dropbox.
com/pt_BR/privacy. Accessed 11 Jan. 2022
Dwings L (2015) The right to be forgotten. Action Intell Prop 9(J45):67:45–80
Farinaccio R (2016) Afinal, quanta energia elétrica a internet utiliza para funcionar? In: Tecmundo.
Available via internet. https://www.tecmundo.com.br/internet/104589-quanta-energia-eletrica-
internet-utiliza-funcionar.htm. Accessed 11 Jan. 2022
Fena L, Jennings C (2000) Priv@cidade.com: como preservar sua intimidade na era da internet.
Futura, São Paulo
Globo O (2017) Perfil falso usou fotos de brasileiro para influenciar eleição dos EUA. Available via
internet. https://oglobo.globo.com/mundo/perfil-falso-usou-fotos-de-brasileiro-para-influenciar-
eleicao-dos-eua-21821791. Accessed 11 Jan. 2022
Habermas J (2001) Moral consciousness and communicative action. MIT Press, Cambridge, MA
Heidrich F (2022) The digital estate in the conflict between the right of inheritance and the protection
of personality rights. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the
internet. Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Hironaka G, Pereira R (2004) Direito das Sucessões e o novo Código Civil. Del Rey, Belo Horizonte
Hironaka G (2011) Morrer e suceder: passado e presente da transmissão sucessória concorrencial.
Revista dos Tribunais, São Paulo
Hoffmann C, Luch AD, Schulz SE, Borchers KC (2015) Die digitale Dimension der Grundrechte:
Das Grundgesetz im digitalen Zeitalter. Nomos, Baden-Baden
Instagram (2017) Relatar a conta de uma pessoa falecida para transformar em memorial no
Instagram.
Jayme E (1995) Identité culturelle et intégration: le droit international privé postmoderne. Recueil des Cours, vol 251. Nijhoff/Brill, Boston, MA
Júlio B (2005) Identidade e interação social em comunicação mediada por computador. Universidade
Nova de Lisboa. Available via internet. http://www.bocc.ubi.pt/pag/julio-bruno-identidade-intera
ccao-social.pdf. Accessed 11 Jan. 2022
Lévy P (2002) Ciberdemocracia. Instituto Piaget, Lisboa
Lima I (2018) Herança digital: direitos sucessórios de bens armazenados virtualmente. Available
via internet. http://bdm.unb.br/handle/10483/6799. Accessed 11 Jan. 2022
Lippmann E (2013) Testamento vital: o direito à dignidade. Matrix, São Paulo
Marsden CT (2010) Net neutrality: towards a co-regulatory solution. Bloomsbury Academic,
London
Meyer J (2011) Identität und virtuelle Identität natürlicher Personen im Internet. Nomos, Baden-
Baden
Miranda P (1966) Tratado de direito privado: parte especial, 2nd edn. Borsoi, Rio de Janeiro
Molinaro CA, Sarlet IW (2013) A sociedade em rede, internet e Estado de vigilância: algumas
aproximações. Revista da AJURIS, Porto Alegre 40:132
Moraes MC (2003) Danos à pessoa humana—uma leitura civil-constitucional dos danos morais.
Renovar, Rio de Janeiro
Morato AC (2012) Quadro geral dos direitos de personalidade. Revista da Faculdade de Direito da
Universidade de São Paulo, São Paulo, 106/107
Negroponte N (2009) Vida digital. Companhia das Letras, São Paulo
Norris P (2001) Digital divide: civic engagement, information poverty, and the internet worldwide.
Cambridge University Press, Cambridge
Oliveira VC (2002) Comunicação, identidade e mobilização social na era da informação. Revista
Fronteiras—Estudos Midiáticos, São Leopoldo, 4, 2
Pereira CM (2012) Instituições de direito civil: direito das sucessões, 19th edn, vol 6. Forense, Rio
de Janeiro

Ramsay I (2001) Consumer protection in the era of informational capitalism. In: Wilhelmsson T,
Tuominem S, Tuomoca H (eds) Consumer Law in the Information Society. Kluwer, Boston
Rizardo A (2011) Direito das sucessões, 6th edn. Forense, Rio de Janeiro
Reis C (2010) Dano moral, 5th edn. Forense, Rio de Janeiro
Rodotà S (2006) La conservación de los datos de tráfico en las comunicaciones electrónicas. In:
Segundo Congreso sobre Internet, derecho y política: análisis y prospectiva [monográfico en
línea], IDP, Derecho y Política, 3 UOC. Available via internet. https://www.raco.cat/index.php/
IDP/article/download/49964/50870/. Accessed 11 Jan. 2022
Rohrmann CA (2005) Curso de Direito Virtual. Del Rey, Belo Horizonte
Santos B (2016) Apesar de expansão, acesso à internet no Brasil ainda é baixo. Avail-
able via internet. https://exame.abril.com.br/brasil/apesar-de-expansao-acesso-a-internet-no-bra
sil-ainda-e-baixo/. Accessed 11 Jan. 2022
Sarlet IW (2018) Direitos Fundamentais em espécie. In: Sarlet IW, Marinoni LG, Mitidiero D, Curso de Direito Constitucional, 7th edn. Saraiva, São Paulo
Sarlet IW (2013) Do caso Lebach ao caso Google versus Agência espanhola de proteção de
dados. Available via internet. https://www.conjur.com.br/2015-jun-05/direitos-fundamentais-leb
ach-google-vs-agencia-espanhola-protecao-dados-mario-gonzalez. Accessed 11 Jan. 2022
Sarmento D (2006) Direitos fundamentais e relações privadas, 2nd edn. Lumen Juris, Rio de Janeiro
Schreiber A (2018) Os Direitos da Personalidade e o Código Civil de 2002. Available via
internet. http://www.andersonschreiber.com.br/downloads/os_direitos_da_personalidade_e_o_c
odigo_civil_de_2002.pdf. Accessed 11 Jan. 2022
Tepedino G (2004) A tutela da personalidade no ordenamento civil-constitucional brasileiro. Temas
de Direito Civil, 3rd edn. Renovar, Rio de Janeiro, pp 23–58

Gabrielle Bezerra Sales Sarlet Dr. Iur. (University of Augsburg, Germany), Master’s Degree
and Bachelor of Laws from the Federal University of Ceará (UFC). Professor in the Graduate
Program in Law (LLM/PhD) at the Pontifical Catholic University of Rio Grande do Sul, Brazil
(PUCRS). Main areas of research: Constitutional Law, Human and Fundamental Rights in the
Digital Domain, Bioethics, Digital Law, Neuroscience, Literature and Law. Selected Publications:
Überzählige Embryonen in der Reproduktionsmedizin: Ein Rechtsvergleich zwischen Deutsch-
land und Brasilien, Schriften zum Bio-, Gesundheits- und Medizinrecht Band 13, Baden-Baden,
Nomos, 2013; Notas sobre a identidade digital e o problema da herança digital: uma análise
jurídica acerca dos limites da proteção póstuma dos direitos da personalidade na internet no
ordenamento jurídico, Revista de Direito Civil Contemporâneo, Vol. 17 (2018), pp. 17–33.
The Digital Estate in the Conflict
Between the Right of Inheritance
and the Protection of Personality Rights

Felix Heidrich

Abstract Online shopping via Amazon or Zalando, email communication, posting and sharing videos, photos and messages on Facebook, YouTube or Instagram, domain rights, Bitcoin transfers, copyright and trademark issues: the free development of the individual in the online environment produces a large volume of data. In a series of
decisions between 2015 and 2018, German civil courts for the first time had to deal
with aspects of “digital estate”: The parents of a teenage girl who had died in an
accident sued Facebook for access to the full social media account of their daughter
because they hoped, i.a., to obtain background information about the death of the
teenager. However, the actual significance of “digital estate” goes way beyond this
case. This chapter deals with questions arising from constitutional law. It outlines
in a first step a differentiated notion of “digital estate” and then introduces the facts
and arguments of the case before the German courts. In its main parts, it analyses in
detail the legal interests of the people concerned: the right to inherit on the one hand
and the secrecy of telecommunications as well as the protection of personal data on
the other hand, all guaranteed by fundamental rights of the German Constitution and
further regulated in statutory law. It arrives at the result that neither the secrecy of
telecommunications nor data protection in favour of the testator’s communication
partners create obstacles to the heirs’ right to access the testator’s digital account data.
Therefore, apart from providing legal certainty and as regards the cases discussed,
already existing rules offer sufficient ground for convincing solutions.

The author wishes to thank Dr. Ralph Zimmermann and Prof. Dr. Marion Albers for their valuable
support.

F. Heidrich (B)
Leipzig University and Federal Administrative Court, Leipzig, Germany
e-mail: felix.heidrich@uni-leipzig.de

© Springer Nature Switzerland AG 2022
M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_16

1 Introduction

In 2017 approximately 771 billion e-mails were sent in Germany. In May of the same
year the social network Facebook had approximately 30 million users throughout the
country. In all, 45.9% of the population used social network services at least once a
month. And the relevant numbers will increase in the future.1 Every individual leaves a vast amount of data throughout his or her life, and in an era of progressive digitalisation more and more data is stored digitally on the servers of Internet providers
all over the world. Over time, the almost unlimited storage capacity inevitably raises
one serious question: “[…] [W]hat happens to these ‘digital footprints’ after death
[translated by the author]?”2 Heirs may have a considerable interest in accessing
this data. The “digital estate” leads to a conflict between the right of inheritance
and the protection of personality rights, which is being discussed intensively and controversially worldwide, including in Germany.3
As in many other countries, the German statutory regulations do not include
specific provisions regulating the succession to data.4 Internet providers follow
their own—mostly different—paths5 when dealing with the accounts and data of
deceased users.6 Neither the judiciary nor legal scholars have so far reached a clear
and satisfying conclusion concerning the legal treatment of digital estates.7 This is
reflected to a particular extent in the 2017 appeal decision of the Higher Regional
Court (Kammergericht, KG) Berlin8 that reversed a decision of the Regional Court

1 Concerning e-mail usage: Statista, ‘Dossier E-Mail Nutzung’, Statista, [website], 2018, https://
de.statista.com/statistik/studie/id/24350/dokument/e-mail-nutzung-statista-dossier/, (accessed 11
Jan. 2022), p. 14; concerning social networks: Statista, ‘Dossier Soziale Netzwerke’, Statista,
[website], 2017, https://de.statista.com/statistik/studie/id/11852/dokument/soziale-netzwerke-sta
tista-dossier/, (accessed 11 Jan. 2022), pp. 11, 13.
2 Hohenstein (2018), p. 5.
3 See with a view to Brazil Bezerra Sales Sarlet (2022), in this volume.
4 See also for the Brazilian situation Bezerra Sales Sarlet (2022), in this volume. The situation is different in several states of the U.S., where provisions on the “digital footprint” exist; see Raude (2017), p. 434 with reference to Seidler (2016), pp. 10 f.
5 Specified by Brisch and Müller-ter Jung (2013), pp. 447 f. as well as, on the basis of minor cases, Deusch (2014), pp. 3 f.
6 Facebook provides a legacy contact feature that looks after the memorialised account or lets the user in life choose to have the account permanently deleted. Google provides an inactive account manager to let users in life choose at which time the account will be considered inactive and what will happen to the existing account data: granting trusted contacts access or deletion. GMX deactivates the account after six months without login. If not reactivated within another six months, the account is deleted. Heirs may have the account deleted by submitting a death certificate. If they want to have access to the account, they have to present a certificate of inheritance.
7 Hohenstein (2018), p. 5.
8 KG Berlin, 21 U 9/16, Judgment of 31 May 2017, Zeitschrift für Erbrecht und Vermögensnachfolge, vol. 24, no. 7, 2017, pp. 386 ff.



(Landgericht, LG) Berlin.9 Meanwhile, in a landmark decision,10 the Federal Court
of Justice (Bundesgerichtshof) ruled that heirs must be granted access to the account
data of a deceased Facebook user. In these proceedings German courts dealt for the
first time with legal problems arising from digital estates.
The latest coalition agreement between the CDU, CSU and SPD parties aims at
creating legal certainty11 in the field of digital estates.12 The “Digital Restart” working
group of the Justice Ministers’ Conference (Justizministerkonferenz), in fact, does
not perceive a need for legislative action concerning access to testators’ accounts and
their data, but points out that a clear and unambiguous legal situation would facilitate
the application of the law.13 In light of the latest decision of the Federal Court of
Justice it remains to be seen whether or not the Federal Government will initiate
a law-making process at all. A statutory rule would, however, conflict with neither
draft Article 11 of the Regulation on Privacy and Electronic Communications14 nor
the General Data Protection Regulation.15
Both the judicial and the legislative decisions are complex to a special degree,
because not only must they address procedural aspects of (European) private inter-
national law,16 but they must also reconcile a vast number of norms and valuations
at the material level: national provisions on inheritance, telecommunications and
(European) data protection law, but also constitutional law with its effects on inferior
law.17

9 LG Berlin, 20 O 172/15, Judgment of 17 December 2015, Zeitschrift für Erbrecht und Vermögensnachfolge, vol. 23, no. 4, 2016, pp. 189 ff.
10 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de.
11 CDU, CSU, and SPD (eds.), Ein neuer Aufbruch für Europa. Eine neue Dynamik für Deutschland. Ein neuer Zusammenhalt für unser Land. Koalitionsvertrag zwischen CDU, CSU und SPD, https://www.spdfraktion.de/system/files/documents/koalitionsvertrag_2018-2021_bund.pdf, (accessed 11 Jan. 2022), p. 131, lines 6175 f.
12 The authors use the term “digital property”, which includes “user accounts” and “data stocks”.
13 Justizministerkonferenz (ed.), Bericht der Arbeitsgruppe “Digitaler Neustart” der Konferenz der Justizministerinnen und Justizminister der Länder vom 15. Mai 2017, https://www.justiz.nrw.de/JM/schwerpunkte/digitaler_neustart/zt_bericht_arbeitsgruppe/bericht_ag_dig_neustart.pdf, (accessed 11 Jan. 2022), p. 349.
14 Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), COM(2017) 10 final.
15 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Official Journal L 119/1.
16 DAV (ed.), Pressemitteilung 6/2018.
17 Bock (2017), p. 371.

This article outlines the debate concerning digital estates in Germany18 from
a constitutional perspective. The main issues when dealing with access to digital
account data involve the relationship between the right of inheritance and the protec-
tion of the secrecy of telecommunications. Both dimensions enjoy protection under
German constitutional law. Because of the sheer number of Internet service providers all over the world, which use a large variety of user contract types, aspects of private international law and the drafting of contracts by way of general terms and conditions (AGB) are not covered below.
focus on e-mail and social network accounts. In a first step the concept of a “digital
estate” will be the subject of detailed consideration (16.2). After a short summary
of the German judicature, focussing on the recent decision of the Federal Court of
Justice (16.3), it will be shown that digital estates enjoy protection under the funda-
mental right of inheritance (16.4), and neither the secrecy of telecommunications
(16.5.1) nor data protection (16.5.2) in favour of the testator’s communication part-
ners create obstacles to the heirs’ right to access the testator’s digital account data.
In the conclusion and outlook (16.6) section, I answer the question of the necessity
of a future statutory rule, and sketch out perspectives for digital estates.

2 The Digital Estate

The definition of “digital estate” entails a number of uncertainties.19 A generally
accepted formula is still missing.20 However, all views, whether representing a broad
or a narrow understanding of the term, have in common that a digital estate is a
collective term21 for various rights and legal relations of the testator.22
If “digital estate” were restricted to testators’ accounts and data on the Internet that continue to exist after death,23 data stored on the local hardware of the deceased would be disregarded.24 A broader understanding points to “the entirety of digital assets,
[…] and as a consequence the entirety of the testators’ accounts and data on the
Internet [translated by the author]”.25 Meanwhile, the definition that covers “the
entirety of the testator’s legal relations regarding information technology systems,

18 The issues raised by digital estates have for a long time neither been addressed by legal scholars,
see Herzog (2013), p. 3745, nor seem to have been of interest to particular users; see Martini (2012),
p. 1145, which has been illustrated by the lack of related provisions in wills, contracts of inheritance
and health care proxies. See, again, Herzog (2013), p. 3745.
19 Raude (2017), p. 434. See also Bezerra Sales Sarlet (2022), in this volume.
20 Alexander (2016), p. 302; Gloser (2016a), p. 12; Hohenstein (2018), p. 7.
21 Alexander (2016), p. 302.
22 Raude (2017), p. 434.
23 Pruns (2013), p. 3161.
24 Salomon (2016), p. 325.
25 Bräutigam (2013), p. 93.

including all of his or her electronic data [translated by the author]”26 seems to be
prevailing among legal scholars. Since all digital data are identical in quality and form, data located on the testator’s own storage media must be included as well.27 The
“digital silhouette, becoming more and more the pars pro toto of human personality
[translated by the author]”,28 is not revealed solely by e-mail or social network
accounts, but also through content stored locally (such as on hard drives, pen drives, SIM cards or smartphone memory) or online (such as in clouds).
In systematic terms, the rights and legal relations involved in digital estates may
be divided into three groups.29 Firstly: rights to data stored on storage media with the
testator (“offline data”). Secondly: rights to data stored on foreign servers (“online
data”) as well as contractual relationships between the testator and Internet providers.
Thirdly (not examined further here): other legal relations concerning the digital space that are not primarily directed at the usage of an information technology system.30
Currently there is a great deal of debate on the legal classification of data under civil law.31 Because of their lack of corporeality, data cannot be considered things.32
A considerable number of legal scholars oppose the idea of an independent “property
right” concerning data.33 Consequently, other criteria—depending on the storage
location of the data—have to be developed.34
Offline data, stored on local storage media in the testator’s possession (first group), become corporeal at the moment of storage.35
Access to online data on foreign servers (second group) is specifically granted
by account-based user contracts (with regard to e-mail, social network or cloud
accounts).36 Accounts themselves are neither corporeal objects nor subject to intel-
lectual property rights.37 The relationship between user and provider is of a contrac-
tual38 nature, albeit often with elements of different contract types.39 The contractual
relationship grants a substantive right to access the particular account data as well as

26 Deusch (2014), p. 3; see also Raude (2017), p. 434.


27 Hohenstein (2018), p. 7.
28 Martini (2012), p. 1146.
29 Kunz (2017), § 1922 BGB mn. 601.
30 Kunz (2017), § 1922 BGB mn. 601 ff.
31 Bock (2017), p. 379.
32 Alexander (2016), p. 302; Stresemann (2018), § 90 BGB mn. 35; concerning e-mails Seidler (2016), p. 74.
33 Bock (2017), p. 380 with fn. 80; concerning e-mails, again, Seidler (2016), p. 74.
34 Bock (2017), p. 380, p. 414.
35 Alexander (2016), p. 302; Bock (2017), p. 380; Herzog (2013), p. 3749.
36 Alexander (2016), p. 303; Kunz (2017), § 1922 BGB mn. 618 f.
37 Bock (2017), p. 377.
38 On particular applicable contract types Bock (2017), pp. 377 f.; on the typological classification of e-mail and social network contracts Seidler (2016), pp. 62 ff., 129 ff.; with regard to contracts with the social network Facebook LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 190; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 387; left open by Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 19.
39 Bock (2017), p. 377.

a right to information about the password.40 Online data—as well as offline data—
may be subject to intellectual property rights (such as copyright concerning photos
and videos).41 The relationship to access and host providers is contractual42 to the
same extent as the one to DENIC regarding ownership and the right to use a certain
domain name.43
The legal transactions of the third group relate to the exchange of corporeal
objects or the provision of services in the offline world.44 In this regard the account-
based user contract (such as with an online shop or a payment service provider)
has to be distinguished from the contract concluded through the account (such as a
purchase agreement, a contract for the management of the affairs of another, or a
giro contract).45
The article will, however, focus on aspects of the second group: online data in
e-mail and social network accounts as well as the underlying account contract.

3 The Current State of Digital Estates in Germany:
Decisions of the LG Berlin (17 December 2015), the KG
Berlin (31 May 2017), and the Federal Court of Justice
(12 July 2018)

The proceedings before German courts mentioned above also concerned only certain
aspects and legal problems related to digital estates. The three decisions nevertheless
are important for the most prominent questions discussed in that context, which the
present article also deals with.
A mother sued the social network Facebook for unrestricted access to the entire
user account and related account data (especially messages stored within the account)
of her daughter, who died on 3 December 2012 at the age of 15 for as yet unknown
reasons after she was hit by a metro train. The plaintiff hoped to obtain information
from the communication content about a possible suicide of her daughter as well as
to avert a claim for damages asserted by the metro driver. Although the daughter had
handed over her access data to her parents, the mother was not able to access the
communication content. After a third person had informed Facebook about the death,
the account was memorialised by the provider. This meant that only registered friends
of the deceased had continued access. After a login by the parents only a notification
about the memorialisation appeared; further access was denied.

40 Herzog (2013), p. 3749; Kunz (2017), § 1922 BGB mn. 619.


41 Bock (2017), p. 381.
42 Kunz (2017), § 1922 BGB mn. 620.
43 Herzog (2013), p. 3750.
44 Kunz (2017), § 1922 BGB mn. 624.
45 Kunz (2017), § 1922 BGB mn. 625.

The Federal Court of Justice has now clarified that the contractual right to be
granted access to the account of the testator and all data contained therein is heritable
according to the principle of universal succession enshrined in § 1922(1) German
Civil Code (BGB). None of the postmortal protection of the deceased’s person-
ality rights, the secrecy of telecommunications, provisions on data protection, or the
general right of personality in favour of the testator’s communication partners would
be an obstacle to inheritance.46 In so holding, the Federal Court of Justice rejected the decision of the KG Berlin, which had explicitly left the question of inheritability open,47 and confirmed the ruling of the LG Berlin.48
Like the courts below, the Federal Court of Justice regarded neither the general terms and conditions of Facebook49 nor the nature of the user contract50 as obstacles to the inheritability of the right to be granted access. It is not the contractual obligations of the social network provider but rather the content added by the users that is of a personal nature.51 The contractual obligations would merely be account-related.52 Following the lower courts, the Federal Court of Justice ruled that a strict
differentiation between personal data on the one hand and asset-related data on the
other, as partly argued by legal scholars,53 would be neither possible in practice nor
inherent in German inheritance law.54 A different treatment of digital and analogue
estates would lead to inappropriate results and could not be justified.55 Since right
holders, normally the next of kin, might exercise only defensive rights and rights of
revocation towards third persons rather than claiming access to the testator’s digital
estate, the postmortal protection of the testator’s personality rights does not exclude
inheritability either.56
According to the Federal Court of Justice, the protection of the secrecy of telecommunications in favour of the testator’s communication partners, which was extensively
46 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.
bundesgerichtshof.de, mn. 17 ff.
47 KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 387 ff.
48 LG Berlin, 20 O 172/15, Judgment of 17 December 2015, pp. 190 ff.
49 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 24 ff.; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 388; LG Berlin, 20 O 172/15, Judgment of 17 December 2015, pp. 191 f.
50 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 33 ff.; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 388; LG Berlin, p. 189.
51 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 35 ff.
52 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 40 ff.
53 Hoeren (2005), p. 2114; Martini (2012), pp. 1147 ff.
54 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 47 ff.
55 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 50.
56 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 53.



discussed by the KG Berlin, does not prevent the right of access to the testator’s
digital account data.57 While the KG Berlin argued that § 88(3) Telecommunications
Act (TKG) would render the plaintiff’s right of access unenforceable, the Federal
Court of Justice, in turn, found that granting access to digital account content would
not violate the provision, since heirs could “in any case” not be qualified as “other
persons” in the sense of § 88(3) s. 1 TKG.58 § 88(3) TKG provides that service
providers shall be prohibited from procuring for themselves or other persons any
information regarding the content or detailed circumstances of telecommunications
beyond that necessary for the commercial provision of telecommunications services,
unless provided for by the TKG or any other legal provision with explicit reference
to telecommunications traffic. The KG Berlin previously argued that heirs are, from
a naturalistic point of view, different natural persons from the testator and that the
passing on of account data to the heirs would not remain within the limits of what
is necessary for the commercial provision of telecommunication services. Because
of the lack of any other legal provision that permits the passing on of digital account
data, the secrecy of telecommunications would be an obstacle to the right of access.59
In the Federal Court of Justice’s view, heirs would themselves become participants
in the telecommunications process by entering into the contractual relationship of
the testator with the service provider according to § 1922(1) BGB.60 Furthermore, a
treatment different from letter mail would be unreasonable.61
The GDPR, applicable since 25 May 2018, would likewise not prevent the granting
of access, to the extent that it applies to the case at all, since passing on digital
account data would fall under Article 6(1) lit. b) var. 1 and Article 6(1) lit. f)
GDPR.62

4 Digital Estates as Part of the Right of Inheritance

To the extent that the selected rights and legal positions concerning digital estates
are covered by the protection of fundamental rights, these constitutional implications
must be taken into account when applying and interpreting provisions on protection
of the secrecy of telecommunications that could be an obstacle to the heirs’ right to
access the testator’s digital account data.

57 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 54 ff.
58 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 56.
59 KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 392 ff.
60 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 58 ff.
61 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 62.
62 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 54 ff.


The Digital Estate in the Conflict Between the Right of Inheritance … 385

4.1 The Right of Inheritance According to Article 14(1) s. 1 Alt. 2 GG

Article 14(1) s. 1 alt. 2 GG forms the constitutional basis of German inheritance
law. The provision’s material scope of protection encompasses the guarantee of
inheritance as a legal institution as well as an individual right.63
Article 14(1) s. 2 GG—similarly to the right to property—leaves it up to the
legislature to determine the content and boundaries of the right of inheritance.64
The legislative determination clearly defines the individual’s right of inheritance and
makes it an enforceable right because the relevant statutory law provisions shape the
necessarily abstract fundamental principles of inheritance law.65
The legislature may exercise extensive discretionary power in order to reach a
reasonable balance between individual and common interests.66 It is for the
legislature to determine which legal positions constitute assets of the testator and shall be passed on to
the heirs after death.67 The legislature must, however, ensure the fundamental content
of the constitutional guarantee contained in Article 14(1) GG, be consistent with all
other constitutional provisions, and must in particular adhere to the principles of
proportionality and equality.68 The possibilities for limiting the right of inheritance
go further than those for limiting the right to property, because inheritance involves
the transition of assets.69
Unlike other norm-shaped fundamental rights, Article 14(1) s. 1 GG contains
a limited guarantee of continued existence (Bestandsgarantie).70 With respect to
the right to property the Federal Constitutional Court (Bundesverfassungsgericht)
states: “What powers are conferred on an owner at a particular time result from
[…] considering as a whole all provisions applicable at that time dealing with the

63 Federal Constitutional Court (FCC), 1 BvR 513/78, Order of 16 October 1984, Decisions of the
Federal Constitutional Court (BVerfGE) 67, 329, 340; 1 BvR 720/90, Order of 14 December 1994,
BVerfGE 91, 346, 358; 1 BvR 1644/00 and 1 BvR 188/03, Order of 19 April 2005, BVerfGE 112,
332, 348, accessible under http://www.bverfg.de, mn. 62; established jurisprudence.
64 FCC, 1 BvR 412/65 and 1 BvR 524/65, Order of 1 December 1965, BVerfGE 19, 202, 206; 1 BvR 810/70, 1 BvR 57/73 and 1 BvR 147/76, Order of 8 December 1976, BVerfGE 44, 1, 17; 1 BvR 513/78, Order of 16 October 1984, BVerfGE 67, 329, 340; 1 BvR 720/90, Order of 14 December 1994, BVerfGE 91, 346, 360; 1 BvR 2161/94, Order of 19 January 1999, BVerfGE 99, 341, 351, accessible under http://www.bverfg.de, mn. 43; 1 BvR 1644/00 and 1 BvR 188/03, Order of 19 April 2005, BVerfGE 112, 332, 348, accessible under http://www.bverfg.de, mn. 62.
65 FCC, 1 BvR 2161/94, Order of 19 January 1999, BVerfGE 99, 341, 351, accessible under http://www.bverfg.de, mn. 43.
66 Depenheuer and Froese (2018), Article 14 GG mn. 526.
67 Jarass (2020), Article 14 GG mn. 107; Federal Administrative Court, V C 26.69, Judgment of 15 June 1970, Decisions of the Federal Administrative Court (BVerwGE) 35, 278, 286 f.
68 FCC, 1 BvR 513/78, Order of 16 October 1984, BVerfGE 67, 329, 340; 1 BvF 1/01 and 1 BvF 2/01, Judgment of 17 July 2002, BVerfGE 105, 313, 355, accessible under http://www.bverfg.de, mn. 118; 1 BvR 1644/00 and 1 BvR 188/03, Order of 19 April 2005, BVerfGE 112, 332, 348, accessible under http://www.bverfg.de, mn. 62; established jurisprudence.
69 Wieland (2013), Article 14 GG mn. 78; Epping (2019), mn. 458.
70 Epping (2019), mn. 443.

ownership position. […] The entirety of the laws in accordance with the constitution
determining the content of property thus constitute the subject-matter and scope of
the protection of continuing property use/ownership rights granted by Article 14(1)
s. 1 GG [translated by the author].”71
This means that legal positions inheritable according to statutory law at a particular
time also enjoy protection under Article 14(1) s. 1 alt. 2 GG.72 As a consequence,
legal positions and contractual relationships involved in digital estates, which are
inheritable under German inheritance law, form part of the constitutionally protected
right of inheritance.

4.2 The Principle of Universal Succession

German inheritance law is based on the principle of universal succession. It constitutes
one possible way of determining the transfer of rights to heirs,73 but is not
guaranteed by Article 14(1) s. 1 alt. 2 GG.74 Due to the absence of specific legal
provisions on digital estates, online data from e-mail and social network accounts
as well as the underlying account contracts are subject to this principle, too.75 But
just as the “analogous” inheritance may contain non-inheritable legal positions, there
might be reasons for excluding parts of the digital estate from being passed on to the
heirs. These aspects will be examined in more detail in the following sections.

4.2.1 Passing on of Assets as a Whole

The principle of universal succession is rooted in § 1922(1) BGB. According to
this principle a person’s assets (inheritance) are passed as a whole to one or more
other persons (heirs) upon the death of the person (devolution of an inheritance). The
assets concept of § 1922 BGB is vague,76 but has to be understood in a broad sense77
as “all rights and obligations as a whole [translated by the author]”,78 the entirety
of inheritable legal relationships, including all liabilities.79 The interpretation of the
term “assets” in the sense of § 1922(1) BGB is flexible in a way: at its heart it is limited
by the monetary value of a legal position or relationship, but simultaneously it is open

71 FCC, 1 BvL 77/78, Order of 15 July 1981, BVerfGE 58, 300, 336.
72 Epping (2019), mn. 443.
73 Kunz (2017), § 1922 BGB mn. 25.
74 Leipold (2020), Einleitung mn. 54.
75 Hohenstein (2018), p. 7; Leipold (2020), § 1922 BGB mn. 33.; Steiner and Holzer (2015), p. 262.
76 Kunz (2017), § 1922 BGB mn. 63.
77 Alexander (2016), p. 304; Herzog (2013), p. 3747; Klas and Möhrke-Sobolewski (2015), p. 3473; Solmecke, Köbrich and Schmitt (2015), p. 291.
78 Hoeren (2005), p. 2113.
79 Leipold (2020), § 1922 BGB mn. 17.

to the inclusion of valueless positions as well as to the exclusion of ones with monetary
value.80 For this reason the assets concept of § 1922(1) BGB only constitutes the
framework, which permits treating legal positions and relationships with monetary
value as inheritable in principle: monetary value might be evidence of inheritability,
but in a case-by-case review other indicators also have to be considered.81
Rights in rem in moveables constitute heritable assets just like contractual rela-
tionships including their principal, ancillary and alternative claims.82 Nevertheless
some positions with asset value83 are excluded from inheritability or inheritability is
made conditional on a legal transaction.84 Furthermore the real or assumed will of
the testator can be an obstacle to the inheritability of legal positions and relations.85
Positions with asset value can be excluded from inheritability if they are personal to a
special degree, especially when serving the testator’s personal purposes or individual
needs or being for other reasons closely related to him or her.86 Especially the latter
criterion was subject to much debate about the inheritability of online data and the
relevant account contracts, since a digital estate often comprises private messages,
videos, photos or the like.

4.2.2 Inheritability of Data

In connection with the inheritability of data according to the principles outlined
above, a distinction must be made between online and offline data. But it is equally
true that not individual objects or (sets of) data but rather legal positions are subject
to inheritance.87
To the extent that data is stored on the testator’s storage media, the property right
to the storage media is passed on to the heirs according to § 1922(1) BGB. In a sense,
locally stored data share the “destiny of the volume as a corporeal object [translated
by the author]”—irrespective of their content and classification as asset-related or
personal—and have to be “viewed in the context of the ownership of the physical
medium [translated by the author]”.88 If third persons hold intellectual property rights
in that data they can exercise them against the heirs.89

80 Kunz (2017), § 1922 BGB mn. 70; with the same conclusion Müller-Christmann (2020), § 1922
BGB mn. 24; Leipold (2020), § 1922 BGB mn. 19.
81 Kunz (2017), § 1922 BGB mn. 73.
82 Detailed Herzog (2013), p. 3747; see Leipold (2020), § 1922 BGB mn. 21, 25; Pruns (2013), p. 3163.
83 Herzog (2013), p. 3747.
84 Kunz (2017), § 1922 BGB mn. 72.
85 Herzog (2013), p. 3748.
86 Müller-Christmann (2020), § 1922 BGB mn. 24.
87 Detailed Herzog (2018), pp. 475 f.
88 Ludyga (2018), p. 3; see Bräutigam (2019), nach § 1922 BGB mn. 7; Herzog (2013), p. 3749; Hoeren (2005), p. 2114; Hohenstein (2018), p. 8; Steiner and Holzer (2015), p. 262.
89 Herzog (2018), p. 476.

By contrast, in the case of data stored on third-party servers, testators neither own
nor possess the storage media. Neither of these legal positions can be passed on
to the heirs according to § 1922(1) BGB. But since the usage of the respective storage
space is based on a contractual relationship between the testator and the provider, the
contract itself, including the rights to access, change and delete all data stored within
the account, passes to the heirs according to the principle of universal succession. To the
extent that these data are subject to the intellectual property rights of third persons,
there is no difference from offline data.90 Some legal scholars, however, challenge
the principles concerning online data, with respect to both particular contractual
relationships and specific types of data. The following sections will discuss further
why the objections are not convincing.

4.2.3 Inheritability of Personal Contractual Relationships

The inheritability of social network contracts is challenged by the view that the
particular contractual relationship has a mainly personal character.91 The
personal character of the principal contractual obligation (Hauptleistungspflicht) is
decisive.92 The obligor could have an interest worthy of protection only in connection
with a particular person, with the result that an arbitrary change in the identity of the
obligee could be unreasonable.93
However, the principal contractual obligations of the provider are limited to
granting access to the different functions and to providing communication content
for registered users.94 The personalisation is performed by the users, and does not
have any effect on the (principal contractual) obligations of the provider. The person-
alised profile merely pools the functions available for the particular user and serves
as an organisational framework (zur “Ordnung der Verhältnisse”).95 Since the range
of functions offered by the provider always remains the same, it is almost impos-
sible to argue that the provider’s obligation would be changed in its content if the
contracting partner changed and that there would be a need to protect the interests of
providers.96 Consequently, social network contracts and particularly e-mail service
contracts have no personal character that prevents inheritability.97

90 On all of these aspects see Herzog (2018), p. 476.


91 Klas and Möhrke-Sobolewski (2015), p. 3474.
92 Klas and Möhrke-Sobolewski (2015), p. 3474.
93 Klas and Möhrke-Sobolewski (2015), p. 3474.
94 Correctly emphasised by KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 388; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 35.
95 In this direction KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 388.
96 KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 388.
97 Raude (2017), p. 436 with reference to Seidler (2016), pp. 88 ff., 134 ff.

4.2.4 Inheritability and Confidentiality Obligations

In the context of potentially personal contractual relationships, it is often discussed
whether legal positions qualify as non-inheritable because providers rely on a
particular relationship of trust on the part of their users.98 But the relevant confiden-
tiality provisions of the “analogous” world cannot be invoked in favour of digital
estates, since there is, to begin with, no special bipolar bond of trust between
providers and users.99 Admittedly,
particular user accounts often contain personal data, but the users do not store them
in the account in order to share them with the provider—comparable to a doctor or
priest—but rather to share them with other users. In terms of the unity of the legal
system, these findings correspond to the legislature’s assessment within German
criminal law: Internet providers are not subject to criminal sanctions according to
§ 203(5) German Criminal Code100 (StGB) and—in turn—do not enjoy the right to
refuse to give evidence according to § 53 German Code of Criminal Procedure101
(StPO).102 Furthermore, the contractual obligation to provide users with messages
and other content is from the outset account-related and not restricted to a certain
individual.103 The providers cannot ensure that the person logged in is identical to
the designated recipient of a message or other content. There is always a risk—
similar to analogous means of communication—that third persons in possession of
the respective login data (or the key to the letterbox) will take note of messages or
other content.104

98 LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 191; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 39 ff.
99 LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 191; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 388 f.; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 39 ff.; Bock (2017), p. 386; Herzog (2013), p. 3749; Klas and Möhrke-Sobolewski (2015), p. 3477; Kunz (2017), § 1922 BGB mn. 630 f.; different view Martini (2012), p. 1150.
100 The provision sanctions the disclosure of a secret of another person after the death of that person. Offenders can only be members of certain groups of professionals or persons, such as lawyers, notaries, doctors, representatives, public officials, or publicly appointed experts, who are formally obliged by law to conscientiously fulfil their duties, etc.
101 According to this provision, a person may refuse to testify on professional grounds. The purpose is to protect the bond of trust between certain professions, such as clergymen, lawyers, notaries, doctors, representatives and employees of press and broadcasting corporations, and the people who seek their help.
102 Explicitly LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 191.
103 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 40 ff.
104 With regard to the communication partner’s protection of legitimate expectation Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 41. But the court’s reasoning also applies to the testator.

4.2.5 Inheritability of Asset-Related and Personal Data

Some legal scholars, who, together with the majority, recognise that user contracts105
between the testator and a provider in principle descend to the heirs,106 nevertheless
argue that it is necessary to distinguish the account-related user contract from the
data contained in the account.107 The background to this view is that the user contract,
asset-related in principle, is subject to the principle of universal succession, while
specific account content—even if characterised as asset-related—shall—due to its
personal character in individual cases—not belong to the heirs but to the next of
kin as the testator’s right holders.108 The contractual right of the heirs to be granted
access to the testator’s account content would thus not include personal data of the
testator.109 According to another view, the passing on of this data is excluded as long
as the testator in life did not rule otherwise.110 Both views are based on considerations
concerning the postmortal protection of the testator’s personality rights.111 The first
one builds on the presumption that e-mails exclusively concerning non-pecuniary
positions of the general right of personality (such as love e-mails) are relevant for
the memory of the deceased and may affect his or her personality rights even after
death.112 The latter points out that the concentration of digital data within an account
allows for the creation of comprehensive personality profiles and that the absence of
protection of these data after death may impose a restriction on the free development
of the testator’s personality in life.113
In line with the vast majority of legal scholars,114 such a distinction between
asset-related and personal data and their different legal treatment has to be rejected
for practical and legal reasons.

105 The classification of the contract’s legal nature is not undisputed but plays a minor role for
inheritability; see Raude (2017), p. 435; Federal Court of Justice, III ZR 183/17, Judgment of 12
July 2018, accessible under http://www.bundesgerichtshof.de, mn. 19.
106 Bräutigam (2019), nach § 1922 BGB mn. 8; Herzog (2013), p. 3749; Hoeren (2005), p. 2114; Ludyga (2018), p. 3; Müller-Christmann (2020), § 1922 BGB mn. 101; Pruns (2013), pp. 3166 f.; Steiner and Holzer (2015), p. 263; LG Berlin, 20 O 172/15, Judgment of 17 December 2015, pp. 190 ff.
107 Hoeren (2005), p. 2114.
108 Hoeren (2005), p. 2114.
109 Hoeren (2005), p. 2114.
110 Martini (2012), p. 1152.
111 Steiner and Holzer (2015), p. 262.
112 Hoeren (2005), p. 2114.
113 Martini (2012), pp. 1149 ff. with reference to Federal Court of Justice, I ZR 44/66, Judgment of 20 March 1968, Decisions of the Federal Court of Justice (BGHZ) 50, 133, 139.
114 Stated by KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 389 while leaving the question open; see also LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 191; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 47 ff.

According to both views, restriction of the right to access digital account content
requires a prior examination of the data with regard to its personal content. To avoid
escalating disputes on the character of particular data, a clear identification of the data
as personal or asset-related is necessary. Such a distinction is, however, practically
impossible.115 This results from the fact that accounts often contain asset-related as
well as personal data, and even individual data may feature both asset-related and
personal elements.116
The question of who could legally undertake such an examination also remains unan-
swered.117 Even if providers were not legally excluded by § 88(3) TKG118 or §§ 91 ff.
TKG,119 it would at least entail an unreasonable amount of effort.120 Furthermore, it
is questionable whether Internet providers, who cannot even guarantee the effective
protection of the personal data of living persons, should become “guardians of the
[testator’s] postmortal protection of personality rights [translated by the author]”.121
Probate courts would also be unable to cope with this task.122
By contrast, the testator’s right holders123 cannot rely on a legal basis for the
provision of account content.124 The state’s duty to protect human dignity deriving
from Article 1(1) s. 2 GG calls for postmortal protection of the testator’s personality
rights that outlasts his or her death.125 This right to protection, however, is not inher-
itable, nor does it descend to the testator’s right holders. The latter merely exercise
the right in trust on behalf of the testator without being subject to the right them-
selves.126 Consequently, right holders can only exercise defensive rights and rights
of revocation,127 but no pecuniary rights.128 For this reason, right holders do not

115 LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 191 following Solmecke, Köbrich
and Schmitt (2015), p. 291; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018,
accessible under http://www.bundesgerichtshof.de, mn. 51.
116 Bock (2017), p. 392; Martini (2012), p. 1152; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 51.
117 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 51.
118 Different view KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 389.
119 Different view Herzog (2013), p. 3750.
120 Bräutigam (2019), nach § 1922 BGB mn. 10; for a different view see Martini (2012), p. 1152, who, nevertheless, admits the increased effort as well as the need for generalised decisions by the respective Internet providers.
121 Alexander (2016), p. 305; likewise Bock (2017), p. 392.
122 Bock (2017), p. 392.
123 The testator may in life informally appoint right holders. According to the respective construction of a will these may be heirs or close relatives, see Bock (2017), p. 390.
124 Gloser (2016a), p. 17; Herzog (2013), pp. 3748, 3750; Ludyga (2018), p. 5.
125 FCC, 1 BvR 435/68, Order of 24 February 1971, BVerfGE 30, 173, 194.
126 Kunz (2017), § 1922 BGB mn. 302.
127 Federal Court of Justice, I ZR 44/66, Judgment of 20 March 1968, BGHZ 50, 133, 137; I ZR 135/87, Judgment of 8 June 1989, BGHZ 107, 384; Leipold (2020), § 1922 BGB mn. 158 f.
128 Kunz (2017), § 1922 BGB mn. 302.

need to obtain knowledge of personal account content;129 they must only take note
of the usage of such data in a way that constitutes a violation of the testator’s human
dignity.130
It could be argued that asset-related data stocks would be “infected” by personal
content, with the effect that heirs would be denied access to all digital account data.131
But this would lead to the creation of huge amounts of orphaned data that would, as
a consequence, be attributed to the particular Internet provider as a kind of “de facto
heritage” and could be analysed and used indefinitely.132 Furthermore, such a result
could not be justified by the postmortal protection of the testator’s personality rights.
The different treatment of asset-related and personal account content just outlined
seems to be barred for another reason, too: The legal position subject to inheritance
does not consist of the individual data sets stored in the account but of the right of
access to the entirety of data based on the user contract with the Internet provider
that has been passed on to the heirs according to § 1922(1) BGB.133
Not least, German inheritance law contains provisions that serve as an indica-
tion that the law does not distinguish between asset-related and personal legal posi-
tions.134 Such a differentiation is alien to the “analogous” estate. There is no objective
reason135 for the different treatment of letters and diaries on the one hand and e-mails
as well as Facebook messages on the other hand.136

129 The Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://
www.bundesgerichtshof.de, mn. 53, is content with stating that the postmortal protection of person-
ality rights does not establish a right of close relatives to access personal account data superior to
inheritance law.
130 Correctly emphasised by KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 389 f.; subsequently Ludyga (2018), p. 5.
131 See Bräutigam (2013), pp. 16, 24, who, nevertheless, wants to forward all “infected” data to the right holders.
132 Mackenrodt (2017), pp. 541 f., who, however, takes the reasoning of the KG Berlin according to § 88(3) TKG as a basis. See also id. (2018), p. 47.
133 Raude (2017), p. 435.
134 Ludyga (2018), p. 4; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 49.
135 Herzog (2013), p. 3750; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 50.
136 LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 191; Ludyga (2018), p. 4; Steiner and Holzer (2015), p. 263.



4.2.6 Inheritability and Data Protection of Testators

The inheritability of legal positions concerning digital estates is also not excluded due
to the application of data protection law in favour of the testator. Neither national
(Federal Data Protection Act [BDSG],137 §§ 91 ff. TKG, §§ 11 ff. Telemedia
Act [TMG]) nor European Union law encompasses data protection of deceased
persons.138
For instance, recitals 27, 158 and 160 of the GDPR exclude the data of deceased
persons from the scope of application of the regulation. The GDPR leaves the protec-
tion of these data to the discretion of the Member States. The German legisla-
ture has not yet—also against the backdrop of the right to be forgotten (Article
17 GDPR)—taken action in that direction.139
As much as postmortal data protection may be desirable,140 it is not necessary
according to constitutional law.141 As long as the legislature remains inactive in
protecting the data of deceased persons, there will be no interference with digital
estates.

4.3 Consequences

According to § 1922(1) BGB, user contracts with the provider of e-mail and social
network services are inheritable. As a result, the contractual right to access all data
contained in the account falls within the scope of protection of Article 14(1) s. 1 alt.
2 GG. The constitutional protection of digital estates therefore needs to be balanced
against the protection of the secrecy of telecommunications.
Notwithstanding the above, inheritability may be excluded by contractual
agreements.142 To the extent that such agreements are contained in the
providers’ general terms and conditions, problems arise with respect to their incorpo-
ration into the contract, as well as with respect to the test of reasonableness according
to § 307 BGB.143 Nevertheless, these aspects will not be discussed further.

137 For a different view, Martini (2012), p. 1148; Spilker (2015), p. 58.
138 Bock (2017), pp. 398 f., 401; Ludyga (2018), p. 5.
139 Bock (2017), p. 401.
140 In this direction Alexander (2016), p. 307; Hohenstein (2018), p. 9.
141 For a different view, Martini (2012), pp. 1148 ff.; Spilker (2015), p. 58.
142 See Alexander (2016), p. 306; Bock (2017), pp. 411 ff.; Mackenrodt (2018), pp. 43 ff.; Raude (2017), p. 437; Seidler (2016), pp. 143 ff.; LG Berlin, 20 O 172/15, Judgment of 17 December 2015, pp. 191 f.; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 388.
143 See Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 24–32.



5 Secrecy of Telecommunications and Data Protection of the Testator’s Communication Partners

The question whether the protection of the secrecy of telecommunications in favour
of the testator’s communication partners may exclude inheritability has been the
subject of considerable debate. Furthermore, provisions concerning protection of
data of communication partners could be an obstacle to this. Consequently, both
lines of argument played an important role in the recent decision of the Federal
Court of Justice, too.

5.1 Safeguarding the Secrecy of Telecommunications of the Testator’s Communication Partners

Similar to the right of inheritance, the secrecy of telecommunications is protected by
constitutional law. Article 10(1) GG formulates the fundamental right of individuals
and § 88 TKG transfers its protective content to statutory law. When applying and
interpreting § 88(3) TKG in the context of the heirs’ right to access the testator’s
digital account data, the constitutional implications not only of Article 10(1) GG but
also of Article 14(1) s. 1 alt. 2 GG must be adequately considered. Only if de lege
lata no reasonable balance between Article 14(1) s. 1 alt. 2 GG and Article 10(1) GG
can be achieved would there be a need for an amendment of telecommunications law
de lege ferenda as implied by the latest coalition agreement.

5.1.1 § 88 TKG as an Expression of the State’s Duty to Protect

The provision of e-mail and social network content by the respective service providers
falls within the scope of protection of Article 10(1) GG. Due to the principle of
speciality, the right to informational self-determination (enshrined in Article 2(1)
GG in conjunction with Article 1(1) GG) is inapplicable.144 Furthermore, the right
to the guarantee of the confidentiality and integrity of information technology systems
(enshrined in Article 2(1) GG in conjunction with Article 1(1) GG, too) is subsidiary
to Article 10 GG and the right to informational self-determination.145

144 FCC, 1 BvR 330/96 and 1 BvR 348/99, Judgment of 12 March 2003, BVerfGE 107, 299, 312,
accessible under http://www.bverfg.de, mn. 43; 1 BvR 668/04, Judgment of 27 July 2005, BVerfGE
113, 348, 364, accessible under http://www.bverfg.de, mn. 79; 2 BvR 902/06, Order of 16 June
2009, BVerfGE 124, 43, 56, accessible under http://www.bverfg.de, mn. 49; 1 BvR 256/08, 1 BvR
263/08 and 1 BvR 586/08, Judgment of 2 March 2010, BVerfGE 125, 260, 310, accessible under
http://www.bverfg.de, mn. 191.
145 2 BvR 902/06, Order of 16 June 2009, BVerfGE 124, 43, 57, accessible under http://www.bverfg.de, mn. 51; Hermes (2013), Article 10 GG mn. 45.


The Digital Estate in the Conflict Between the Right of Inheritance … 395

The secrecy of telecommunications protects the confidentiality of information that is transmitted to individual recipients146 not physically but with the aid of telecommunications traffic. The provision covers not only the content of communications,
but also the confidentiality of the immediate circumstances of the communications
event, which include, in particular, whether, when and how often telecommunica-
tions traffic occurred or was attempted between what persons or telecommunica-
tions equipment.147 Not only the messages, thoughts or opinions148 exchanged fall
within the material scope of protection, but also—due to their (potential) communicative character149—all forms of expression (speech, pictures, sound, symbols or
other data),150 including the communication services of the Internet.151 Protection is
granted from the time the sender releases the message “out of his or her hand” until
the message has entered the (exclusive) sphere of the recipient.152
E-mails stored on the mail server of a provider qualify as information transmitted
by means of telecommunications traffic to individual recipients, and thus fall within
the scope of protection of Article 10(1) GG. The temporal scope of application
comprises the whole period of time the messages are stored on the servers of the
respective provider. The danger continues to exist to the extent that providers or
investigating authorities can access the server.153 The same is true for social networks’
data, to the extent that instant messages154 and content shared with a particular set of users (friends or friends of friends) are concerned. The latter, too, falls within the

146 Gusy (2018), Article 10 GG mn. 59.


147 FCC, 1 BvR 1494/78, Order of 20 June 1984, BVerfGE 67, 157, 172; 1 BvR 1430/88, Order
of 25 March 1992, BVerfGE 85, 386, 396, accessible under http://www.bverfg.de, mn. 47; 1 BvR
2226/94, 1 BvR 2420/95 and 1 BvR 2437/95, Judgment of 14 July 1999, BVerfGE 100, 313, 358,
accessible under http://www.bverfg.de, mn. 161; 1 BvR 330/96 and 1 BvR 348/99, Judgment of
12 March 2003, BVerfGE 107, 299, 312 f., accessible under http://www.bverfg.de, mn. 47; 2 BvR
2099/04, Judgment of 2 March 2006, BVerfGE 115, 166, 183, accessible under http://www.bve
rfg.de, mn. 68, 71; 1 BvR 370/07 and 1 BvR 595/07, Judgment of 27 February 2008, BVerfGE
120, 274, 307, accessible under http://www.bverfg.de, mn. 183; 1 BvR 256/08, 1 BvR 263/08 and
1 BvR 586/08, Judgment of 2 March 2010, BVerfGE 125, 260, 309, accessible under http://www.
bverfg.de, mn. 189.
148 FCC, 1 BvR 1494/78, Order of 20 June 1984, BVerfGE 67, 157, 171.
149 Hermes (2013), Article 10 GG mn. 41.
150 FCC, 1 BvR 1611/96 and 1 BvR 805/98, Order of 9 October 2002, BVerfGE 106, 28, 36, accessible under http://www.bverfg.de, mn. 19; 2 BvR 2099/04, Judgment of 2 March 2006, BVerfGE 115, 166, 182 f., accessible under http://www.bverfg.de, mn. 67.
151 FCC, 1 BvR 370/07 and 1 BvR 595/07, Judgment of 27 February 2008, BVerfGE 120, 274, 307, accessible under http://www.bverfg.de, mn. 183; 2 BvR 902/06, Order of 16 June 2009, BVerfGE 124, 43, 54, accessible under http://www.bverfg.de, mn. 43.
152 Hermes (2013), Article 10 GG mn. 38 with reference to FCC, 2 BvR 2099/04, Judgment of 2 March 2006, BVerfGE 115, 166, 181 ff., accessible under http://www.bverfg.de, mn. 62 ff.; 2 BvR 902/06, Order of 16 June 2009, BVerfGE 124, 43, 56, accessible under http://www.bverfg.de, mn. 48.
153 FCC, 2 BvR 902/06, Order of 16 June 2009, BVerfGE 124, 43, 54 f., accessible under http://www.bverfg.de, mn. 46.


154 Thus consenting, but dissenting with regard to (“status update”) postings that address the whole “network community”, Gusy (2018), Article 10 GG mn. 63.


396 F. Heidrich

definition of information transmitted to individual recipients, since it is distinctive for this kind of data that access to the respective content is restricted by means of an individually adjustable user configuration155 and communication is thus comparable
to e-mails addressed to a large number of recipients. There is the same need for
protection of confidentiality.
The secrecy of telecommunications is an entitlement of the communication partic-
ipants156 with regard to third parties, not in relation to each other.157 But according to
the general rules, Internet service providers as private parties are not directly bound
by Article 10(1) GG. They are primarily bound by § 88 TKG, the manifestation of
the secrecy of telecommunications in statutory law.158 Concerning access to digital
account data, the conflict of fundamental rights thus takes place in a three-person
constellation among private parties, mediated by the state, and so shifts to the level
of applying and interpreting § 88 TKG.

5.1.2 The Application and Interpretation of § 88(3) TKG According to Constitutional Law

The core of the debate on the exclusion of inheritability in accordance with the
secrecy of telecommunications is the application and interpretation of § 88(3) s.
1 TKG. While the KG Berlin interpreted the provision strictly in the sense of the
secrecy of telecommunications,159 the Federal Court of Justice made clear that §
88(3) s. 1 TKG does not exclude the heirs’ right to access the testator’s digital
account content,160 and thus ensured practical concordance in the individual case.
The application of § 88(3) TKG to the heirs’ right of access—ultimately left open by the Federal Court of Justice161—nevertheless meets the requirements
set out by constitutional law. The view that considers e-mail162 and social network
providers163 as service providers in the sense of § 88(2) TKG is convincing. Even

155 Hermes (2013), Article 10 GG mn. 39; Schwabenbauer (2012), pp. 20 f.


156 Gusy (2018), Article 10 GG mn. 24; Hermes (2013), Article 10 GG mn. 26; Jarass (2020),
Article 10 GG mn. 10; Stettner (2011), § 92 mn. 43.
157 Hermes (2013), Article 10 GG mn. 17; see Stettner (2011), mn. 93.
158 Klesczewski (2013), § 88 TKG mn. 1; Mayen (2018), § 88 TKG mn. 1.
159 KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 390 ff.
160 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 54 ff.


161 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 56.


162 Bock (2013), § 88 TKG mn. 22; Heun (2018), § 88 TKG mn. 19 ff., 47; Munz (2015), § 88 TKG mn. 13; Säcker (2013), § 3 TKG mn. 65; Seidler (2016), p. 112; VG Köln, 21 K 450/15, Judgment of 11 November 2015, Kommunikation & Recht, vol. 19, no. 2, 2016, pp. 141 ff. The legislature undertook the same classification, see BT-Drs. 16/3078, p. 13. See Biermann (2018), mn. 38; Deusch (2014), p. 6, for implicitly consenting views; see Martini (2012), p. 1148 for a different view.
163 Detailed KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 390 f.; see LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 192; Raude (2017), p. 437; differentiating between

though these providers do not convey signals themselves, they are attributed the
conveyance performed by a third party (such as an access provider), since they de
facto appropriate the conveyance for their own purposes.164 Consequently, not only
web-based e-mail services such as Gmail165 but also social network services qualify
as telecommunication services in the sense of the TKG. This view takes into account
the constitutional protection mandate of Article 10(1) GG, as the confidentiality of
digital communication content is shielded from dangers emanating from the providers
of telecommunications services. Users cannot establish who, in the course of the specific communication process, is responsible for performing which particular services. Yet the same dangers exist throughout. To the extent
that users regard e-mail service providers as the only ones providing services, the
latter have to ensure appropriate protection for the communication content entrusted
to them.
The application of § 88(3) TKG is not excluded from the time a user takes notice
of a message within his or her account, since the danger of influence caused by the
third party entrusted with the transmission persists until the message has entered the
(exclusive) sphere of the recipient.166
Finally, the application of § 88(3) TKG is not excluded because all communication
partners167 would have consented to the passing on of the account content to the
heirs.168 At least the assumption that the testator’s communication partners would
have consented is subject to strong doubts.169

e-mail-like communication via instant messengers and content shared with a limited number of
users on one hand, and content available to the general public on the other hand Deusch (2014),
p. 6; id. (2017), p. 399; see Seidler (2016), pp. 138 f. for a different view; dissenting with respect
to the material scope of Article 10 GG Schenke (2019), Article 10 GG mn. 43; left open by Federal
Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesger
ichtshof.de, mn. 56.
164 VG Köln, 21 K 450/15, Judgment of 11 November 2015, p. 144.
165 The Federal Network Agency (BNetzA) classified the Internet-based e-mail service Gmail as a telecommunications service and bound Google to register the service. The Administrative Court (VG) Köln dismissed the lawsuit of Google. The Higher Administrative Court (OVG) Münster suspended the appeal proceedings on 26 February 2018 to refer to the ECJ according to Article 267 TFEU for a preliminary ruling, i.a., whether Article 2 lit. c) Directive 2002/21/EC of the European Parliament and of the Council of 7 March 2002 on a common regulatory framework for electronic communications networks and services (Framework Directive)—the European Union law basis of § 3 Nr. 24 TKG—also applies to web-based e-mail services, see OVG Münster, 13 A 17/16, Order of 26 February 2018, Kommunikation & Recht, vol. 21, no. 5, 2018, pp. 348 ff.
166 Klesczewski (2013), § 88 TKG mn. 13; Mayen (2018), § 88 TKG mn. 23; id. (2018), pp. 467 f.; Munz (2015), § 88 TKG mn. 12; LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 192.
167 Detailed Hermes (2013), Article 10 GG mn. 58; see also Mayen (2018), p. 469.
168 Biermann (2018), mn. 44; Bock (2017), p. 408; Solmecke, Köbrich and Schmitt (2015), p. 292; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 396 ff.; left open by LG Berlin, 20 O 172/15, Judgment of 17 December 2015, p. 192; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de.
169 See for different arguments Bock (2013), § 88 TKG mn. 44; Mayen (2018), § 88 TKG mn. 52; id. (2018), p. 469; Klesczewski (2013), § 88 TKG mn. 35; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 397.

An interpretation of § 88(3) s. 1 TKG that excludes the heirs’ right to access the testator’s digital account content without exception by qualifying heirs as “other
persons” and the access to digital account data as exceeding “that necessary for
the commercial provision of telecommunications services”170 is unconstitutional. It
would mean that the secrecy of telecommunications takes precedence over the right
of inheritance in every case. This result is neither predetermined in the abstract—
due to the extraordinary significance of both fundamental rights171 nor reasonable in
a concrete perspective. The merely limited strengthening of the secrecy of telecom-
munications is out of proportion to the manifold, considerable burdens on the heirs
when administering not only the digital but also the entire estate. Nevertheless, the
provision can be interpreted in conformity with constitutional law.
Prerequisites are that the wording of the particular provision is open to more than
one possible interpretation, at least one of the interpretations meets the requirements
set out by constitutional law, and the particular interpretation does not conflict with
the objective of the norm.172
Heirs Are Not “Other Persons”.
The broad interpretation of § 88(3) s. 1 TKG stating that heirs are not “other persons”
carried out by the Federal Court of Justice meets these criteria.
The wording of § 88(3) s. 1 TKG is ambiguous and leaves room for the interpretive
result of the Federal Court of Justice. From a naturalistic point of view, heirs have
to be considered as neither senders nor recipients and therefore not involved in the
communication process.173 From another point of view, and according to § 1922(1)
BGB, they enter into the legal position of the testator as the contractual partner174
with the effect that they can be considered as legally the same person.175

170 KG Berlin, 21 U 9/16, Judgment of 31 May 2017, pp. 392 ff.


171 The right to property and the right of inheritance together form the basis of the property system
stipulated in the GG, FCC, 1 BvR 720/90, Order of 14 December 1994, BVerfGE 91, 346, 358;
2 BvR 552/91, Order of 22 June 1995, BVerfGE 93, 165, 173 f., accessible under http://www.
bverfg.de, mn. 23; the secrecy of telecommunications is part of the right to free development of
the personality by means of the private exchange of information excluding the general public, and
protects human dignity, FCC, 1 BvR 1494/78, Order of 20 June 1984, BVerfGE 67, 157, 171; 1
BvF 3/92, Order of 3 March 2004, BVerfGE 110, 33, 53, accessible under http://www.bverfg.de,
mn. 105; 2 BvR 2099/04, Judgment of 2 March 2006, BVerfGE 115, 166, 182, accessible under
http://www.bverfg.de, mn. 64.
172 Epping (2019), mn. 67. On principles and limits of the interpretation in conformity with the constitution, Hesse (1995), mn. 79 ff.


173 Bock (2017), p. 406; KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 394.
174 Biermann (2018), mn. 40; Leipold (2020), § 1922 BGB mn. 35; Salomon (2016), p. 327; Seidler (2016), p. 115, p. 120; Steiner and Holzer (2015), p. 264; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de.
175 Biermann (2018), mn. 40.

Under the latter perspective, the right of inheritance and the secrecy of telecommunications are balanced according to the principle of practical concordance on a case-by-case basis.176 This means that the legal positions protected by
constitutional law are related to each other in such a way that every position is given
effect. The principle of unity of the constitution calls for limitations of all positions
concerned in a way that gives effect to all of them to the broadest possible extent.
The particular limitations must, in turn, be proportionate in the individual case.177
This is accomplished not by confining the scope of protection of the secrecy of telecommunications to the transmission process until an e-mail or other message reaches the account of the testator,178 but by elevating the heirs to participants in
the communication process by means of § 1922(1) BGB, with the result that they
themselves enjoy protection under the secrecy of telecommunications.179 Hence, on
the one hand the scope of protection of Article 14(1) s. 1 alt. 2 GG in favour of
the heirs remains completely unaffected, while the secrecy of telecommunications in
favour of the communication partners of the testator on the other hand—in the light of
the Federal Constitutional Court’s jurisprudence on the temporal scope of protection
of Article 10(1) GG with regard to e-mails180—will not be impermissibly confined in
its scope. As to the rest, the level of protection will not be disproportionally limited
either. In particular, the secrecy of telecommunications will not completely make way
for the right of inheritance.181 The limitation regarding the access of heirs to accounts
of the testator covers a personally as well as materially predefined area. Apart from
this, digital communications remain entirely protected. Alternatives maintaining the
full protection of the secrecy of telecommunications are not available.
In the final analysis, this interpretation also does not conflict with the objec-
tive of § 88(3) TKG. § 88(3) s. 1 TKG aims at the realisation of the legislature’s
duty to protect the secrecy of telecommunications enshrined in Article 10(1) GG182
and hence a defence against access by third parties.183 The communication partners
themselves are no danger to the secrecy of telecommunications, since they enjoy joint

176 On the necessity to balance the conflicting fundamental rights positions in accordance with the
principle of practical concordance, partly without any further explanation Brisch and Müller-ter
Jung (2013), pp. 450 f.; Herzog (2013), p. 3751; Klas and Möhrke-Sobolewski (2015), pp. 3477 f.;
Solmecke, Köbrich and Schmitt (2015), p. 291. On the legislator’s task of balancing the conflicting
fundamental rights positions in accordance with the principle of practical concordance, Mayen
(2013), p. 78.
177 Hesse (1995), mn. 72.
178 For a different view, Bock (2017), p. 409. Drawing a comparison to letters Brisch and Müller-ter Jung (2013), pp. 446, 450 f.; Gloser (2018), p. 18.


179 Seidler (2016), p. 115; Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 60.


180 FCC, 2 BvR 902/06, Order of 16 June 2009, BVerfGE 124, 43, 54 ff., accessible under http://www.bverfg.de, mn. 42 ff.


181 In this direction, however, Kuntz (2016), p. 191; Raude (2017), p. 438.
182 Mayen (2018), § 88 TKG mn. 1.
183 Salomon (2016), p. 327.

protection against encroachments by third parties.184 Thus, “spilling” of the communication content by one of the communication partners does not fall within the scope of protection of Article 10(1) GG.185 Accordingly, the secrecy of telecommunications in favour of the communication partners cannot prevent the passing on of
digital account data from the testator to his or her heirs during the former’s lifetime.
But it then seems inconsistent to reach a different conclusion for the passing on as
set forth in a will or when the passing on follows from § 1922(1) BGB. The death
of a communication partner does not cause a danger for the secrecy of telecommu-
nications, but falls within the category of general risks of life.186 Communication
partners will not be exposed to access to communication content by any third party,
but rather by the legal successors of the testator.187 § 88(3) s. 1 TKG, however, does
not grant protection to the communication partners against the legal successors of the
testator, since the former can be expected to independently look after their interests
in confidentiality, in terms of their communication with the testator.188
Considerations from a systematic perspective do not exclude such an interpreta-
tion either. On the one hand this is made clear by comparison with provisions on the
secrecy of the post, enshrined in Article 10(1) GG: Postal operators are allowed to
deliver mail to heirs189 or grant access to post office boxes of the testator190 without
violating the secrecy of the post. From a functional perspective there is no difference
between the provisions on protection of the secrecy of the post and on protection
of the secrecy of telecommunications.191 On the other hand German inheritance law
indicates that the legal framework subordinates the interest of the testator in confi-
dentiality as well as his or her communication partners to the constitutional right
of inheritance.192 Moreover, it would hardly be possible to justify the unequal treat-
ment of content of the communication partners stored on the testator’s smartphones
or other devices—easily accessible to heirs—and content stored online.193
That Necessary for the Commercial Provision of Telecommunications Services.
At the same time, constitutional considerations do not support the interpretation that
granting access to digital account data stays within the limits of what is necessary
for the commercial provision of telecommunication services.
Admittedly, the wording of § 88(3) s. 1 TKG leaves room for different interpre-
tive results. On the one hand it might be argued that not every fulfilment of legal

184 Hermes (2013), Article 10 GG mn. 93.


185 Hermes (2013), Article 10 GG mn. 93; Mackenrodt (2017), p. 541.
186 Bock (2017), p. 408.
187 Biermann (2018), mn. 41.
188 Biermann (2018), mn. 41 with reference to Bock (2017), p. 404.
189 Bock (2017), p. 409; Brisch and Müller-ter Jung (2013), p. 450; Ludyga (2018), p. 4.
190 Ludyga (2018), p. 6; Pruns (2014), p. 2180.
191 Bock (2017), pp. 408 f.
192 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 63.


193 Klas and Möhrke-Sobolewski (2015), pp. 3477 f.

obligations—such as that enshrined in § 1922(1) BGB—stays within the limits of what is necessary for the commercial provision of telecommunications services. The
limits would have to be determined with respect to the specific telecommunications
service. §§ 91 ff. TKG lay down a “maximum limit of what is permitted” but do
not permit Internet providers to grant the heirs access to the digital account content
of the testator.194 On the other hand heirs enter into the legal position of the testator.
For this reason it could be argued that Internet providers must provide the heirs—as
previously the testator—with the digital account data of the testator in the context of
the commercial provision of telecommunications services.195
The latter interpretation leads to the same result as not considering heirs to be
“other persons” and is therefore proportionate, too.
However, in the end, the purpose of § 88(3) s. 1 TKG prevents such an inter-
pretation. The limitation of the procurement of information to “what is necessary”
aims first and foremost at obtaining information in order to safeguard the regular
process of telecommunications, eliminating operational disruptions, and simplifying
accounting.196 Indications of what is permitted in order to serve these purposes follow
from §§ 91 ff. TKG.197 But the provisions contain no permission to pass on communication content to heirs. A provider is, of course, allowed to access data of its contractual partners in order to fulfil its contractual obligation to provide the relevant communication content.198 But if the fulfilment of every
legal obligation (including the passing on of digital estates to the heirs according to
§ 1922(1) BGB) were to remain within the limits of what is necessary and therefore
fall under § 88(3) s. 1 TKG, § 88(3) s. 3 TKG—which does not refer to § 88(3) s.
1 TKG199 would be deprived of its scope of application.200 § 88(3) s. 4 TKG, which
refers to § 138 StGB, would also become superfluous.201

194 Biermann (2018), mn. 43; Deusch and Eggendorfer (2017), pp. 97 f.; Mayen (2013), pp. 81 f.;
Mayen (2018), pp. 470 f.; Kuntz (2016), p. 191; KG Berlin, 21 U 9/16, Judgment of 31 May 2017,
p. 394. Left open by Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible
under http://www.bundesgerichtshof.de.
195 Ludyga (2018), p. 6; Pruns (2014), pp. 2178 f.; Steiner and Holzer (2015), p. 264; LG Berlin, 20 O 172/15, Judgment of 17 December 2015, pp. 192 f.; with the same conclusion Bock (2017), pp. 410 f.
196 Bock (2013), § 88 TKG mn. 26; Klesczewski (2013), § 88 TKG mn. 23.
197 Bock (2013), § 88 TKG mn. 26; Heun (2018), § 88 TKG mn. 33; Mayen (2018), § 88 TKG mn. 74; Munz (2015), § 88 TKG mn. 15. Narrower Klesczewski (2013), § 88 TKG mn. 25.
198 Biermann (2018), mn. 43.
199 Bock (2017), p. 410.
200 Mayen (2018), pp. 470 f.
201 KG Berlin, 21 U 9/16, Judgment of 31 May 2017, p. 393.

5.1.3 Consequences

The interpretation of § 88(3) s. 1 TKG in conformity with the constitution shows that heirs are not to be considered as “other persons” and that the secrecy of telecommunications is no obstacle to their right to access digital account content of the testator
stored in his or her e-mail and social network accounts.
With that said, the question of the necessity of an amendment of the TKG may also be answered: the legislature remains free to enact a clarifying provision in order to eliminate doubt, but such an amendment is certainly not necessary.

5.2 Data Protection of the Testator’s Communication Partners

In the light of the newly applicable GDPR, the Federal Court of Justice has intensively
examined the concerns of communication partners with respect to data protection.202
Nevertheless, this does not lead to a different evaluation either. Regardless of whether the GDPR is applicable in a particular case,203 the provision of digital account data to heirs can be justified according to Article 6(1) lit. b var. 1 GDPR as well as Article 6(1) lit. f GDPR.204

6 Conclusion and Outlook

The recent decision of the Federal Court of Justice on digital estates answered some, but by no means all, of the vital questions concerning this topic. The core statement of the proceedings was, however, clear: heirs are entitled to access the testator’s digital account content. The present article has shown that this result is consistent with German constitutional law. In accordance with the assessment
provided by the Justice Ministers’ Conference, there is no need for further statutory
regulation. According to the latest news it is also doubtful whether the legislature
will take action—as implied by the coalition agreement—in that direction.205

202 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.
bundesgerichtshof.de, mn. 68–93.
203 Left open by Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 70.


204 Federal Court of Justice, III ZR 183/17, Judgment of 12 July 2018, accessible under http://www.bundesgerichtshof.de, mn. 70–93.


205 Legal Tribune Online, ’Die juristische Presseschau vom 18. September 2018’, Legal Tribune Online, [web blog], 18 September 2018, https://www.lto.de/recht/presseschau/p/2018-09-18-maassen-ende-facebook-kommentar-loeschung-kavanaugh-vorwuerfe/, (accessed 11 Jan. 2022).

It remains to be seen how Internet providers will develop their practices in the
future. To avoid uncertainty or disputes, it is advisable for people to make arrangements for their own digital estates.206 Corresponding information campaigns could support the public in that regard. Nevertheless,
courts will also have to deal with various aspects of digital estates in the future.

References

Bezerra Sales Sarlet G (2022) Digital identity and the problem of digital inheritance. Limits of
the posthumous protection of personality on the internet in Brazil. In: Albers M, Sarlet IW (eds)
Personality and data protection rights on the internet. Springer, Dordrecht, Heidelberg, New York,
London (in this volume)
Biermann B (2018) Digitaler Nachlass. In: Scherer S (ed) Münchener Anwalts Handbuch Erbrecht,
5th edn. C.H.Beck, München, pp 1952–1980
Bock M (2013) § 88 TKG. In: Geppert M, Schütz R (eds) Beck’scher TKG-Kommentar, 4th edn.
C.H.Beck, München
Bock M (2017) Juristische Implikationen des digitalen Nachlasses. Archiv für die civilistische
Praxis 217(3):371–417
Bräutigam P (2013) Der digitale Nachlass—empirischer Befund und Fragestellung. In: DAV (ed) Stellungnahme Nr. 34/2013 durch die Ausschüsse Erbrecht, Informationsrecht und Verfassungsrecht zum Digitalen Nachlass, pp 16–29. https://anwaltverein.de/files/anwaltverein.de/downloads/newsroom/stellungnahmen/2013/SN-DAV34-13.pdf, (accessed 11 Jan. 2022)
Bräutigam P (2013) Glossar. In: DAV (ed) Stellungnahme Nr. 34/2013 durch die Ausschüsse Erbrecht, Informationsrecht und Verfassungsrecht zum Digitalen Nachlass, pp 93–98. https://anwaltverein.de/de/newsroom/id-2013-34?file=files/anwaltverein.de/downloads/newsroom/stellungnahmen/2013/SN-DAV34-13.pdf, (accessed 11 Jan. 2022)
Bräutigam P (2019) Nach § 1922 BGB. Anhang. Digitaler Nachlass. In: Burandt W, Rojahn D (eds)
Erbrecht, 3rd edn. C.H.Beck, München.
Brisch K, Müller-ter Jung M (2013) Digitaler Nachlass—Das Schicksal von E-Mail- und De-Mail-
Accounts sowie Mediencenter-Inhalten. Anforderungen an Internet-Provider nach dem Tode des
Account-Inhabers. Computer und Recht 29(7):446–455
Alexander Chr (2016) Digitaler Nachlass als Rechtsproblem? Überlegungen aus persönlichkeitsrechtlicher,
datenschutzrechtlicher und vertragsrechtlicher Sicht. Kommunikation und Recht 19(5):301–307
Solmecke Chr, Köbrich T, Schmitt R (2015) Der digitale Nachlass—haben Erben einen Auskunftsanspruch?
Überblick über den rechtssicheren Umgang mit den Daten von Verstorbenen. Multimedia und
Recht 18(5):291–295
Depenheuer O, Froese J (2018) Art. 14 GG. In: Huber PM, Voßkuhle A (eds) Grundgesetz, vol 1,
7th edn. C.H.Beck, München
Deusch F (2014) Digitales Sterben: Das Erbe im Web 2.0. Zeitschrift für Erbrecht und Vermögen-
snachfolge 21(1):2–8
Deusch F, Eggendorfer T (2017) Das Fernmeldegeheimnis im Spannungsfeld aktueller Kommu-
nikationstechnologien. Kommunikation and Recht 20(2):93–99
Deusch F (2017) Anmerkung zu KG Berlin, 21 U 9/16, Judgment of 31 May 2017. Zeitschrift für
Erbrecht und Vermögensnachfolge 24(7):399–400
Epping V (2019) Grundrechte, 8th edn. Springer, Berlin

206 On that matter Gloser (2016b), pp. 101 ff.; Salomon (2016), pp. 328 ff.; Steiner and Holzer (2015), pp. 265 f.

Gloser S (2016a) “Digitaler Erblasser” und “digitale Vorsorgefälle”—Herausforderungen der Online-Welt in der notariellen Praxis—Teil I. Mitteilungen des Bayerischen Notarvereins, der Notarkasse und der Landesnotarkammer Bayern 2016(1):12–19
Gloser S (2016b) “Digitaler Erblasser” und “digitale Vorsorgefälle”—Herausforderungen der
Online-Welt in der notariellen Praxis—Teil II. Mitteilungen des Bayerischen Notarvereins, der
Notarkasse und der Landesnotarkammer Bayern 2016(2):101–108
Gusy C (2018) Art. 10 GG. In: Huber PM, Voßkuhle A (eds) Grundgesetz, vol 1, 7th edn. C.H.Beck,
München
Hermes G (2013) Art. 10 GG. In: Dreier H (ed) Grundgesetz, vol 1, 3rd edn. Mohr Siebeck, Tübingen
Herzog S (2013) Der digitale Nachlass—ein bisher kaum gesehenes und häufig missverstandenes
Problem. Neue Juristische Wochenschrift 66(52):3745–3751
Herzog S (2018) Der digitale Nachlass und das Erbrecht. Universalsukzession gilt auch in der
digitalen Welt—vorsorgende Rechtspflege stärken. Anwaltsblatt 68(8):472–481
Hesse K (1995) Grundzüge des Verfassungsrechts der Bundesrepublik Deutschland, 20th edn.
C.F.Müller, Heidelberg
Heun S-E (2018) § 88 TKG. In: Eßer M, Kramer Ph, v. Lewinski K (eds) Datenschutzgrund-
verordnung, Bundesdatenschutzgesetz und Nebengesetze, 6th edn. Carl Heymanns Verlag,
Köln
Hoeren T (2005) Der Tod und das Internet—Rechtliche Fragen zur Verwendung von E-Mail- und
WWW-Accounts nach dem Tode des Inhabers. Neue Juristische Wochenschrift 58(30):2113–
2117
Hohenstein (2018) Die Vererblichkeit des digitalen Nachlasses. Spannungsfeld zwischen postmor-
talem Persönlichkeitsrecht und dem Recht der Erben. Kommunikation und Recht 21(1):5–10
Jarass HD (2020) Art. 10 GG. In: Jarass HD, Pieroth B (eds) Grundgesetz für die Bundesrepublik
Deutschland, 16th edn. C.H.Beck, München
Jarass HD (2020) Art. 14 GG. In: Jarass HD, Pieroth B (eds) Grundgesetz für die Bundesrepublik
Deutschland, 16th edn. C.H.Beck, München
Klas B, Möhrke-Sobolewski Chr (2015) Digitaler Nachlass—Erbenschutz trotz Datenschutz? Neue Juristische
Wochenschrift 68(48):3473–3478
Klesczewski D (2013) § 88 TKG. In: Säcker FJ (ed) Telekommunikationsgesetz, 3rd edn. Deutscher
Fachverlag, Fachmedien Recht und Wirtschaft, Frankfurt am Main
Kuntz W (2016) Digitaler Nachlass: Zugang der Erben zum Facebook-Nutzerkonto. LG Berlin,
Urt. v. 17.12.2015, Az. 20 O 172/15. juris—Die Monatszeitschrift 2016(5):190–192
Kunz L (2017) § 1922 BGB. In: Otte G (ed) J. von Staudingers Kommentar zum Bürgerlichen
Gesetzbuch mit Einführungsgesetz und Nebengesetzen, Buch 5, Erbrecht, vol Erbfolge, Sellier
and De Gruyter, Berlin
Leipold D (2020) § 1922 BGB. In: Kessal-Wulf S (ed) Münchener Kommentar zum Bürgerlichen
Gesetzbuch, vol 11, 8th edn. C.H.Beck, München
Leipold D (2020) Einleitung. In: Kessal-Wulf S (ed) Münchener Kommentar zum Bürgerlichen
Gesetzbuch, vol 11, 8th edn. C.H.Beck, München
Ludyga H (2018) “Digitales Update” für das Erbrecht im BGB? Zeitschrift für Erbrecht und
Vermögensnachfolge 25(1):1–6
Mackenrodt O (2017) Der “digitale Nachlass” und die Verweigerung des Zugangs zu einem Interne-
taccount gegenüber den Erben—Anmerkung zu KG ZUM-RD 2017, 524. Zeitschrift für Urheber-
und Medienrecht—Rechtsprechungsdienst 21(10):540–542
Mackenrodt O (2018) Digital inheritance in Germany. J Euro Consumer Market Law 7(1):41–48
Martini M (2012) Der digitale Nachlass und die Herausforderungen postmortalen Persönlichkeitss-
chutzes im Internet. Juristenzeitung 67(23):1145–1155
Mayen T (2018) § 88 TKG. In: Scheurle KD, Mayen T (eds) Telekommunikationsgesetz, 3rd edn.
C.H.Beck, München
Mayen T (2018) Das Fernmeldegeheimnis und der digitale Nachlass. Warum das Erbrecht nicht
ausreicht, um den digitalen Nachlass rechtlich zu bewältigen. Anwaltsblatt 68(8):466–471
The Digital Estate in the Conflict Between the Right of Inheritance … 405

Mayen Th (2013) Verfassungsrechtliche Rahmenbedingungen. Fernmeldegeheimnis. In: DAV (ed)


Stellungnahme Nr. 34/2013 durch die Ausschüsse Erbrecht, Informationsrecht und Verfas-
sungsrecht zum Digitalen Nachlass. https://anwaltverein.de/de/newsroom/id-2013-34?file=files/
anwaltverein.de/downloads/newsroom/stellungnahmen/2013/SN-DAV34-13.pdf, (accessed 11
Jan. 2022), pp 66–82
Müller-Christmann B (2020) § 1922 BGB. In: Hau W, Poseck R (eds) Beck’scher Online-
Kommentar Bürgerliches Gesetzbuch, 54th edn. C.H.Beck, München
Munz M (2015) § 88 TKG. In: Taeger J, Gabel D (eds) Kommentar zum Bundesdatenschutzge-
setz und zu den Datenschutzvorschriften des Telekommunikationsgesetzes und des Telemedi-
engesetzes, 2nd edn. Deutscher Fachverlag, Fachmedien Recht und Wirtschaft, Frankfurt am
Main
Pruns M (2013) Keine Angst vor dem Digitalen Nachlass! Erbrechtliche Grundlagen—Alte
Probleme in einem neuen Gewand? Neue Wirtschafts-Briefe. Steuer- und Wirtschaftsrecht
2013(40):3161–3167
Pruns M (2014) Keine Angst vor dem digitalen Nachlass! Erbrecht vs. Fernmeldegeheimnis. Neue
Wirtschafts-Briefe. Steuer- und Wirtschaftsrecht 2014(29):2175–2186
Raude K (2017) Rechtsprobleme des digitalen Nachlasses: Der Anspruch der Erben auf Zugang zum
Account des Erblassers in sozialen Netzwerken. Zeitschrift für Erbrecht und Vermögensnachfolge
24(8):433–439
Säcker FJ (2013) § 3 TKG. In: id. (ed) Telekommunikationsgesetz, 3rd edn. Deutscher Fachverlag,
Fachmedien Recht und Wirtschaft, Frankfurt am Main
Salomon P (2016) “Digitaler Nachlass”—Möglichkeiten der notariellen Vorsorge. Zeitschrift für
die notarielle Beratungs- und Beurkundungspraxis 20(9):324–331
Schenke RP (2019) Art. 10 GG. In: Stern K, Becker F (eds) Grundrechte-Kommentar, 3rd edn. Carl
Heymanns Verlag, Köln
Schwabenbauer T (2012) Kommunikationsschutz durch Art. 10 GG im digitalen Zeitalter. Archiv
des öffentlichen Rechts 137(1):1–41
Seidler K (2016) Digitaler Nachlass. Das postmortale Schicksal elektronischer Kommunikation.
Wolfgang Metzner Verlag, Frankfurt am Main
Spilker B (2015) Postmortaler Datenschutz. Die Öffentliche Verwaltung 68(2):54–60
Steiner A, Holzer A (2015) Praktische Empfehlungen zum digitalen Nachlass. Zeitschrift für
Erbrecht und Vermögensnachfolge 22(5):262–266
Stettner R (2011) Brief-, Post- und Fernmeldegeheimnis. In: Merten D, Papier H-J (eds) Handbuch
der Grundrechte, vol 4. C.F.Müller, Heidelberg, pp 335–388
Stresemann Chr (2018) § 90 BGB. In: Säcker FJ (ed) Münchener Kommentar zum Bürgerlichen
Gesetzbuch, vol. 1, 8th edn. C.H.Beck, München
Wieland J (2013) Art. 14 GG. In: Dreier H (ed) Grundgesetz, vol 1, 3rd edn. Mohr Siebeck, Tübingen

Felix Heidrich Research Assistant at Leipzig University, Referent at the Federal Administra-
tive Court in Leipzig. Main areas of research: Fundamental Rights, Administrative Law, Euro-
pean Union Law. Selected Publications: Ein (Bären-)Dienst an der Europäischen Demokratie? Zur
Aufhebung der Drei-Prozent-Sperrklausel im Europawahlrecht, Zeitschrift für Europarechtliche
Studien, Vol. 17 (2014), pp. 259–272 (together with Markus Kotzur); “Freitage für das
Weltklima”, Sächsische Verwaltungsblätter, Vol. 28 (2020), pp. 157–165.
Algorithms and Discrimination: The
Case of Credit Scoring in Brazil

Laura Schertel Mendes and Marcela Mattiuzzo

Abstract The chapter aims at analyzing how the current debate on algorithmic
discrimination is reflected in the case of credit scoring in Brazil. Firstly, it discusses
the concepts of algorithm and algorithmic discrimination and explains why such
concepts are particularly meaningful in a data-driven economy. It presents how Big
Data, combined with algorithms, has fundamentally altered some decision-making
processes in our everyday lives, and turns to one application in particular—credit
scoring—to discuss how this may pose challenges for Brazilian law, especially
regarding the risk of discriminatory outcomes. After analyzing the currently evolving
normative data protection framework in Brazil—including the new General Data
Protection Act—the chapter discusses whether the existing or suggested legal tools
are sufficient to deal with the challenges of automated decision-making processes
and their potential asymmetric outcomes. The last section presents policy proposals
for dealing with credit scoring and discrimination, reviewing the literature on
algorithmic governance as well as some of the main disagreements
among specialists, highlighting the debates on the limits of transparency as a viable
policy proposal. Finally, it turns to the challenges of anti-discrimination proposals
when applied to credit scoring in Brazil.

1 Introduction

One of the most important functions of data processing through algorithms is to
provide the basis for economic decisions, which in turn can contribute to risk
mitigation. Risk assessment and mitigation become all the more relevant if an economic
sector is characterized by information asymmetries. That is why data processing

L. S. Mendes (B)
University of Brasília (UnB), Brasília, Brazil
e-mail: laura.schertel@unb.br
M. Mattiuzzo (B)
University of São Paulo, São Paulo, Brazil
e-mail: marcela.mattiuzzo@usp.br

© Springer Nature Switzerland AG 2022 407


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_17

plays a prominent role in high-risk sectors, such as the credit and private insurance
industries. But its relevance is also increasing in other industries. In a nutshell, infor-
mation and data processing in the private sector proves to be a means of simplifying
economic decisions and increasing efficiency in an environment characterized by
information deficits.
Technological development has made important contributions here: Not only is it
possible to collect and process more information for risk assessment than ever before,
but this assessment can be quantified by a score—or a prognosis of future behavior
of an individual. This score is created through an automated procedure, in which
the existing data is incorporated into an algorithm and the individuals are assigned
to a specific risk category. Recent developments in information technology—which
could be summarized under the now famous term “Big Data”—provide even more
incentives for making use of forecasts in the private sector, as they allow more
information to be processed and new correlations between data and future behavior
to emerge.
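As a purely illustrative sketch, the procedure described above can be put into a few lines of Python. All of the variables, weights, and cut-off values below are invented and do not reflect any actual scoring model; the point is only structural: existing data goes in, a number comes out, and the number assigns the individual to a risk category.

```python
def credit_score(on_time_ratio, debt_to_income, years_of_history):
    """Toy additive scoring model; all weights are invented for illustration."""
    score = (
        500
        + 300 * on_time_ratio             # reward a history of on-time payments
        - 200 * debt_to_income            # penalize high indebtedness
        + 10 * min(years_of_history, 20)  # reward a longer track record
    )
    return max(0, min(1000, round(score)))

def risk_category(score):
    """Map the numeric score onto a coarse risk band."""
    if score >= 700:
        return "low risk"
    if score >= 450:
        return "medium risk"
    return "high risk"

s = credit_score(on_time_ratio=0.95, debt_to_income=0.3, years_of_history=8)
print(s, risk_category(s))  # 805 low risk
```

Real scoring models are, of course, far more complex and often learned from data rather than hand-weighted, but the structure is the same: a prognosis of future behavior compressed into a single number and a category.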
In the credit report and credit score business, automating decision-making
processes was at first seen as a means to overcome the well-known biases and
discriminatory tendencies ingrained in individuals, companies and public entities.
Nevertheless, it soon became clear that the statistical method, which supposedly
received objective data as inputs and therefore should generate objective results as
outputs, could, in many circumstances, reproduce the existing biases and lead to
more of the same discrimination. That is because, firstly, causation and correla-
tion can often be predefined by the individuals (data controllers), who transmit the
same biases already present in old-fashioned decision-making to the algorithm. If
someone believes women to be overall ill-suited for some kinds of jobs—say, being a
mechanical engineer—and that person designs an algorithm that internalizes such
logic, then regardless of the quality of the input, the output may present the same biases
as the designer. Secondly, even if the algorithm is programmed to identify its own
correlations by collecting raw existing data—which should eliminate the problem of
direct bias transfer—it could still reproduce discriminatory correlations present in
such data. In other words, algorithms could learn discriminatory patterns present in
society and normatively replicate them as “objective truth”. Even if the algorithm’s
designer is not biased towards men for mechanical engineering jobs, if there are
enough datapoints that suggest gender to be a significant variable to determine such
fitness—say, there are more men than women engineers—the output could reproduce
the existing discriminatory conditions.1
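This second mechanism can be sketched minimally in Python. The "historical" records below are fabricated, and the "learner" is deliberately naive, simply turning each group's empirical hire rate into a prediction, but the sketch shows how a skewed past, rather than a biased designer, can produce skewed outputs:

```python
# Fabricated historical hiring records, (gender, hired) pairs reflecting a
# skewed past; this is invented data, not any real dataset.
history = ([("M", True)] * 80 + [("M", False)] * 20
           + [("F", True)] * 20 + [("F", False)] * 80)

def learned_hire_rate(records, gender):
    """A deliberately naive 'learner': the empirical hire rate of each group
    becomes the model's prediction for new candidates of that group."""
    outcomes = [hired for g, hired in records if g == gender]
    return sum(outcomes) / len(outcomes)

# The 'model' now scores women lower purely because the training data is skewed.
print(learned_hire_rate(history, "M"))  # 0.8
print(learned_hire_rate(history, "F"))  # 0.2
```

No one coded a preference for men here; the disparity enters entirely through the data, which is precisely the point made above about patterns being replicated as "objective truth".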
Furthermore, since algorithms rely to a great extent on statistical discrimina-
tion, that is, differentiating individuals based on probabilistic group characteristics,
it becomes paramount to fully comprehend whether the processes and criteria used
to classify individuals are correct, fair, transparent and, ultimately, legitimate. A
question to be answered in that regard is the following: did all the individuals who

1Barocas and Selbst (2016). The authors identify five mechanisms by which data mining’s ill effects
may occur, and go deeper into clarifying why these discriminatory outcomes emerge.

possess the same characteristics, and are substantially similar, receive parallel treat-
ment? Seen in this light, it is clear how the claim for equal treatment and against
potential discriminatory behavior has become an important issue in the context of
algorithm-based decision-making.
In this chapter, we aim at analyzing how the current debate on algorithmic discrim-
ination is reflected in the case of credit scoring, especially in regard to the Brazilian
legal scenario. To do so, we have dedicated Sect. 2 to setting out parameters and
establishing definitions which will be useful throughout the text. That section is
mainly concerned with broadly defining both algorithms and algorithmic discrim-
ination, and explaining why such concepts are particularly meaningful in what we
refer to as a Big Data (or data-driven) economy. It presents how Big Data, combined
with algorithms, has fundamentally altered some decision-making processes in our
everyday lives, and turns to one application in particular—credit scoring—to discuss
how this may pose challenges for Brazilian law, especially regarding the risk of
discriminatory outcomes.
Section 3 will analyze the currently evolving normative data protection framework
in Brazil and how it applies to our discussed topic. Our goal is, firstly, to highlight how
the reform of the Brazilian Credit Report Act brought algorithm-based
decision-making to the heart of the debate and, secondly, to discuss whether the existing
legal tools are sufficient to deal with the challenges of automated decision-making
processes and their potential asymmetric outcomes.
Section 4, lastly, is focused on presenting policy proposals for dealing with credit
scoring and discrimination. We do so by presenting the literature on algorithmic
governance and the discussions set out in this arena, as well as some of the main
disagreements among specialists, highlighting the debates on the limits of trans-
parency as a viable policy proposal. Finally, we turn to the challenges of anti-
discrimination proposals when applied to credit scoring in Brazil. We do not aim
to exhaust the debate, nor to provide definitive answers for all the many questions
that surround algorithmic discrimination in credit scoring, but to provide a roadmap
of the debate.

2 Algorithms and Statistical Discrimination—Profiling, Ranking, and Scoring

Before delving deeper into the credit scoring discussion, we present in this section an
overview of the themes that will later prove essential to the debate on policy proposals
for algorithmic discrimination. Firstly, we describe what an algorithm is—and, more
specifically, what a computer algorithm is. Secondly, we tackle why this concept
is particularly relevant in our data-driven economy. Thirdly, we clarify what makes
algorithmic discrimination peculiar, present a preliminary typology of potential kinds
of discrimination and analyze how credit scoring fits into this scenario.

2.1 Algorithms in the Data-driven Economy

An algorithm is often described as a set of steps and instructions that determines how
something should be done. It can be absolutely anything, and in no way is this concept
dependent on the use of modern computer power. One can conceive of an algorithm to
get dressed, an algorithm to get the bus to work, to execute a recipe, and endless other
tasks, for an algorithm is nothing but a formula in which tasks are put in a specific
order. Though this is a correct description, it fails to provide sufficient information for
the purposes of this chapter. We will therefore adopt the explanation of Thomas
Cormen, who is careful enough to point out that there is a difference between any given
algorithm and those—a subset of which we discuss here—that run on computers.
Computers, unlike humans, do not understand the meaning of “sufficient”, “almost”,
“bad” or any other word that implies a subjective evaluation of the world around us.
That is why an algorithm that tells your smartphone to change the brightness on its
screen whenever there is “almost no battery left” would be useless. A computer can
interpret percentages, but it will not be able to determine what “almost no battery”
means, unless someone tells it so. As Cormen puts it, “You might be able to tolerate
it when an algorithm is imprecisely described, but a computer cannot. (…) So a
computer algorithm is a set of steps to accomplish a task that is described precisely
enough that a computer can run it”.2
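Cormen's point can be made concrete with a toy Python version of the smartphone example; the 15% threshold below is an arbitrary choice that stands in for the vague notion of "almost no battery left":

```python
LOW_BATTERY_THRESHOLD = 0.15  # arbitrary, explicit stand-in for "almost no battery"

def adjust_brightness(battery_level, current_brightness):
    """Dim the screen once the battery falls below the explicit threshold.

    Both arguments are fractions between 0.0 and 1.0.
    """
    if battery_level < LOW_BATTERY_THRESHOLD:
        return min(current_brightness, 0.3)  # cap brightness to save power
    return current_brightness

print(adjust_brightness(0.10, 0.9))  # 0.3, below the threshold, so dimmed
print(adjust_brightness(0.50, 0.9))  # 0.9, above the threshold, so unchanged
```

The subjective notion has disappeared: every term in the instruction is now precise enough for a computer to run.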
It is also important to note that the goal of algorithms, as they are used today and
as we intend to discuss them here, is mainly to solve problems and help people,
companies and governmental authorities to make decisions. When you look for flights
from São Paulo to Berlin, you want answers to your question. More than that, you want
to find the correct answer to that question. Here, again, we run into a nuance of
algorithms: the program will only be as good as the information (or input) provided
for it, and it will be correct whenever it uses such information according to its
specifications. When looking for the “best” flight from São Paulo to Berlin, the
algorithm will need to know whether by “best” we mean “shortest” or “cheapest”.
If the algorithm is programmed to find the shortest route, in terms of kilometers
traveled, it may treat the time spent in an airport waiting for a connecting flight as
irrelevant, and thus provide us with an answer that, though incorrect according to our
preferences, is correct from the standpoint of the program. The issue here is not with
the algorithm, but with the input provided and with the alignment of our expectations
and the program’s goal.
In this sense, it becomes clear that a fundamental goal of algorithms is to
make forecasts by means of probability. Though algorithms cannot provide accu-
rate answers to all questions, they can analyze inputs and provide well-informed
guesses. If the amount and quality of the data points available to the algorithm are
sufficient, the guess can come ever closer to predicting the real outcome. Let us take the
example of dice throwing. For any given die, one can safely assume the probability
of it producing a result equal to 5 is one in six. That is true because
if we randomly roll a die an infinite number of times, there is an equal chance

2 Cormen (2013), p. 1.

of each face of the die coming up as a result, and the number 5 is only present on
one of the faces. Though that is true of any ideal imaginary die, it is less so for a
particular die. That is, real-world dice can be rigged. If a clever player adds a rigged
die to the game, what is the probability of that die turning up the number 5? To answer
the question, we must observe successive throws. After we reach a certain number
of throws, say 200, we will have sufficient information to predict, with more or less
accuracy, what the result of subsequent throws will be.
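The estimation just described can be sketched as a short simulation. The die's bias toward the number 5 is invented for illustration; after 200 simulated throws, the observed frequency approximates the true probability:

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the simulation is reproducible

# A rigged die: face 5 comes up far more often than the fair 1/6.
# The bias below is invented purely for illustration.
faces = [1, 2, 3, 4, 5, 6]
weights = [0.1, 0.1, 0.1, 0.1, 0.5, 0.1]

throws = random.choices(faces, weights=weights, k=200)
counts = Counter(throws)

# The relative frequency of 5 is our estimate of its true probability.
estimate = counts[5] / len(throws)
print(round(estimate, 2))  # close to 0.5, far from the fair 1/6 of roughly 0.17
```

This is, in miniature, what a scoring algorithm does: it infers a probability from observed behavior and uses it to predict the next "throw".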
It is important to stress that the higher the incentives for the economic use of data
processing through algorithms as a basis for decision-making and the more readily
available and cheaper the technologies that make it possible, the more urgent the
discussion on the consequences of such procedures for individuals and their associated
risks.
Algorithms need one basic input to provide us with relevant answers: data. What
has led to the exponential growth in their use and impact on our lives is the ever-
increasing amount of information available in the form of data. The term “Big Data”
has been coined to convey this phenomenon.3 As Mayer-Schönberger and Cukier
put it, Big Data is not just about size, but especially about “the ability to render into
data many aspects of the world that have never been quantified before”.4 With the
internet and the development of computing power, virtually all information can now
be turned into data, from DNA sequencing to car engines functioning. The creation
and complexification of algorithms walk hand in hand with this process.
The most important function of Big Data is producing forecasts on the basis of
numerous data and information: from climatic disasters to economic crises, from
the outbreak of an epidemic to the winner of a sports championship, from buyers’
behavior to customers’ solvency. Thus, Big Data analysis can be used to make fore-
casts of general facts about the economy, nature or politics as well as about individual
behavior. For the topic covered here, the latter function is of main interest, as it relies
on personal data to generate information and knowledge about the behavior of the
individual, providing the basis for economic decisions. A Big Data analysis can there-
fore directly affect the life of the individual—and can cause discriminatory outcomes
that impact her life in particular.
According to Mayer-Schönberger and Cukier, there is no precise definition for Big
Data, but it can be characterized by three tendencies. Firstly, the quantity of data and
information. Big Data analysis not only gathers more data than ever before; rather, it
attempts to collect all the data and information for a particular situation, not merely
a sample—as they put it, in Big Data, n = all. Secondly, due to the large amount
of information available, the data can be inaccurate. As the scale increases, so do
the chances of error. The third property of Big Data is finding correlations instead

3 There is much discussion on whether this term is appropriate and what it in fact means. Boyd and
Crawford present a good summary of it: “Big Data is notable not because of its size, but because of
its relationality to other data. Due to efforts to mine and aggregate data, Big Data is fundamentally
networked. Its value comes from the patterns that can be derived by making connections between
pieces of data, about an individual, about individuals in relation to others, about groups of people,
or simply about the structure of information itself”. Boyd and Crawford (2011), pp. 1–2.
4 Mayer-Schönberger and Cukier (2014).

of causalities. This means that a relationship between two facts or characteristics is
determined according to a statistical analysis. Big Data is not about understanding
the inner functioning of a process, nor about finding the cause of a fact.
In this context, it is clear that the phenomenon of Big Data presents many chal-
lenges for our discussion, especially regarding how automated decision-making can
generate asymmetric impacts and discriminatory outputs. This is especially true when
it comes to decisions that can have a significant impact on the financial situation of the
person concerned. The grounds for discriminatory behavior seem to be even vaster
in times of Big Data. On the one hand, fuzzy and inaccurate data can be included in
the analysis and thus compromise its quality and precision. On the other, the process
carried out by the algorithm can be opaque, which both calls the legitimacy of the
decision into question and reduces the possibilities of correcting inaccuracies.
More significant for the purposes of this chapter is the substitution of correlation
for causality.5 The way science and scientific discovery functioned for much of
human existence was through causality. We tend to look at the world searching for
actions and their consequent reactions. It is a feature developed and stimulated in
scientific research, because it is usually thought to be the only way by which we can
correctly understand what goes on around us. Humans have a natural pull towards
comprehending the universe they live in, and causation is but a manifestation of this
desire.
In the world of Big Data, however, causation moves aside and leaves space for
correlation. A correlation is the probability of one event happening once another event
also takes place. It is a statistical relationship between such events. Instead of trying
to figure out the inner workings of a phenomenon, correlations let us understand the
world by use of proxies:
By letting us identify a really good proxy for a phenomenon, correlations help us capture
the present and predict the future: if A often takes place together with B, we need to watch
out for B to predict that A will happen. Using B as a proxy helps us capture what is probably
taking place with A, even if we can’t measure or observe A directly.6

Big Data has opened the possibility of ever greater use of correlations, since the
number of proxies one can work with has exponentially increased. A famous
example is the development of Google Flu Trends, a tool Google publicized in 2008
as an attempt to predict the spread of the influenza virus in different regions of the
United States, and later the world, taking as input the search queries of users. As
explained by the developers of this tool in a paper published by Nature, “because
the relative frequency of certain queries is highly correlated with the percentage of
physician visits in which a patient presents with influenza-like symptoms, we can
accurately estimate the current level of weekly influenza activity in each region of
the United States, with a reporting lag of about one day.”7 Naturally, there is no cause

5 As Nate Silver has rightly stated, the search for correlations is embedded in a complex context in
which prejudice and subjective assumptions are reflected; this means that even statistical methods
are not completely objective. See Silver (2012).
6 Mayer-Schönberger and Cukier (2014), p. 53.
7 Ginsberg et al. (2009).

and effect between typing search terms on your computer and contracting the flu,
but this correlation served as a proxy for the spread of the disease.
After a run of several years providing results that helped authorities fight and
prevent the disease, Flu Trends faced issues that eventually led to its termination. In
early 2013, the system was providing inaccurate results, claiming the likelihood
of flu in some regions of the US to be almost twice that reported by the
Centers for Disease Control and Prevention (CDC).8 The problems encountered by Google were later
examined by several authors, most notably by Lazer et al. in an article in Science.9
The researchers point to two issues that led to Flu Trends’ downfall: (i) big data
hubris and (ii) algorithm dynamics. In regard to (i), they call attention to the fact that
the data-driven economy has led many to entirely replace traditional approaches
to research with “big data alternatives”. In their words, this is problematic because the
data produced by Google and used by tools such as Flu Trends bears no particular
attention to validity and reliability, both of which are essential for scientific analysis.10
Concerning (ii), the authors point to the constant alterations made by engineers to
the search algorithm, and by users to how they approach search, as causes of some of
the wrong measurements.11
The case of Flu Trends has much to teach us, and though it is unclear what the
central issue with the algorithm was, it is clear it had something to do with what
is called in statistics a spurious correlation. A correlation is spurious whenever it
indicates causation where cause and effect do not exist. Simply put, it is a correlation
that shows two variables influencing each other, when in fact they simply present
similar patterns.12 Tyler Vigen has famously dedicated an entire book to spurious
correlations, aimed at showing how one can be fooled by similarly sloping graphs.
Examples presented by him include the number of people who drowned by falling
into a pool and the number of films Nicolas Cage appeared in from 1999 to 2009—
there is a 66.6% correlation between the two—and the per capita cheese consumption
in the US, which correlates at a 94.71% rate with the number of people who died by
becoming tangled in their bedsheets from 2000 to 2009.
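The mechanics behind such coincidences are easy to reproduce. In the sketch below, two invented yearly series that merely trend in the same direction produce a high Pearson correlation despite having no causal connection whatsoever:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Two invented, causally unrelated yearly series that both happen to trend upward.
pool_drownings = [102, 98, 109, 115, 110, 120, 118, 125, 130, 128]
film_releases = [2, 2, 3, 3, 3, 4, 4, 4, 5, 5]

r = pearson(pool_drownings, film_releases)
print(round(r, 2))  # a high correlation despite no causal connection
```

Any decision procedure that treats such a coefficient as evidence of a real relationship will be systematically misled, which is the core of the spurious-correlation problem.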

8 Butler (2013).
9 Lazer et al. (2014), pp. 1203–1205.
10 Lazer et al. (2014), p. 1203. “Elsewhere, we have asserted that there are enormous scientific

possibilities in big data (9–11). However, quantity of data does not mean that one can ignore
foundational issues of measurement and construct validity and reliability and dependencies among
data (12). The core challenge is that most big data that have received popular attention are not the
output of instruments designed to produce valid and reliable data amenable for scientific analysis”.
11 Lazer et al. (2014), p. 1204: “In improving its service to customers, Google is also changing the

data generating process. Modifications to the search algorithm are presumably implemented so as
to support Google’s business model—for example, in part, by providing users useful information
quickly and, in part, to promote more advertising revenue. Recommended searches, usually based
on what others have searched, will increase the relative magnitude of certain searches. Because GFT
uses the relative prevalence of search terms in its model, improvements in the search algorithm can
adversely affect GFT’s estimates”.
12 Beware Spurious Correlations, Harvard Business Review (2015). Available via https://hbr.org/

2015/06/beware-spurious-correlations. Accessed 11 Jan. 2022.



All the topics presented here acquire one more layer of complexity once we
consider the recent developments of computer science in the field of artificial intelli-
gence (AI). AI is interested in the development of “smart” machines, be they robots,
cars, computers, or other devices. A field within AI called machine learning (ML) is of
particular interest to computer scientists, and has developed quickly over the past
decade. ML is mainly concerned with giving machines the ability to learn by
themselves from inputs provided by humans. As Pedro Domingos very clearly stated,
ML changes the rules of the game because:
Every algorithm has an input and an output: the data goes into the computer, the algorithm
does what it will with it, and out comes the result. Machine learning turns this around: in
goes the data and the desired result and out comes the algorithm that turns one into the other.
Learning algorithms – also known as learners – are algorithms that make other algorithms.
With machine learning, computers write their own programs, so we don’t have to.13
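Domingos' inversion can be illustrated with the smallest possible learner: instead of hand-coding a decision rule, we hand the program labeled examples (invented here) and it produces the rule, in this case a threshold, by itself:

```python
def learn_threshold(examples):
    """A minimal one-dimensional 'learner'.

    examples: list of (value, label) pairs with boolean labels.
    Returns the threshold t that maximizes training accuracy of the
    rule 'predict True iff value >= t'.
    """
    candidates = sorted({v for v, _ in examples})
    best_t, best_acc = None, -1.0
    for t in candidates:
        correct = sum((v >= t) == label for v, label in examples)
        accuracy = correct / len(examples)
        if accuracy > best_acc:
            best_t, best_acc = t, accuracy
    return best_t

# Invented labeled data: values below 10 are False, values of 10 and above are True.
data = [(3, False), (5, False), (8, False), (10, True), (12, True), (15, True)]
print(learn_threshold(data))  # 10: the program recovers the rule from the data alone
```

Real ML systems learn vastly more complex rules, but the inversion is the same: data and desired results go in, and the decision rule comes out, which is also why the resulting rule can be opaque even to its operators.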

As the rise of ML emphasizes, another pressing issue with algorithms, beyond the
substitution of correlation for causation, is the opacity of algorithms in their decision-
making processes. The problem acquires more relevance given that algorithmic
solutions have been widely adopted by both the private and public sectors.14
Perhaps the most famous case of public algorithmic use is that of COMPAS.
COMPAS stands for Correctional Offender Management Profiling for Alternative
Sanctions. It is a tool originally intended to provide jail management with “critical
inmate management information”, covering everything from mental health screening
to gang tracking. As equivant,15 the owner of COMPAS, states, the tool works by way
of a nine-level decision-tree model that classifies inmates on a risk spectrum ranging
from one to nine, nine being the highest and one the lowest. Though COMPAS
was originally designed for jail monitoring, it has been used for other purposes in the
United States, namely for recidivism risk assessment.16 One case in which COMPAS
was used for such purpose was that of Eric Loomis in the state of Wisconsin. In 2013,
Loomis was accused of eluding the police in the city of La Crosse, after driving a car
used in a shooting. He had previously been convicted of third-degree sexual assault
and, after an assessment by COMPAS, was deemed to be at high risk of committing
another crime; he was thus sentenced to six years in prison.
Loomis’ lawyers appealed the sentence, claiming the defense had no access to the
risk assessment carried out by COMPAS, given its proprietary nature, even though
the result was expressly taken into consideration by the judge in his decision.
The case reached the Supreme Court of Wisconsin, which maintained the judge’s
decision, stating that COMPAS was not the only reason the decision was based upon.

13 Domingos (2015), p. 6.
14 Though here, given the nature of credit scoring, we will be largely concerned with analyzing
private repercussions.
15 As of January 9, 2017, Court View Justice Solutions Inc., Constellation Justice Systems Inc., and

Northpointe Inc. have rebranded to form equivant.


16 State v. Loomis. 881 N.W.2d 749 (Wis. 2016). In: https://harvardlawreview.org/2017/03/state-v-

loomis/. Accessed 11 Jan. 2022.



A writ of certiorari was later brought forward and denied by the Supreme Court, but
the case nonetheless highlights the importance of the use of algorithms by public authorities.
Another example is that of CrimeRadar,17 a tool designed in Brazil aimed at
predicting crime rates and patterns in the city of Rio de Janeiro. To this day, we have
no knowledge of it being used by public authorities to fight crime, but that is certainly
a trend elsewhere—algorithms such as PredPol18 are in use by police in several
American states and have changed the way in which police departments operate.
On the private side, examples are likewise vast, if not more numerous. One field
in which algorithms thrive is recruitment. As noted by the Wall Street Journal,19
companies such as Unilever are eliminating traditional resume-driven processes and
instead relying on software to select candidates for job offers. The goal expressly
stated by the company was to diversify the pool of applicants. Danieli, Hillis and
Luca clarify why the use of algorithms in this field is so attractive:
To see the close relationship between algorithms and hiring, consider the simple fact that
hiring is essentially a prediction problem. When a manager reads through resumes of job
applicants, she is implicitly trying to predict which applicants will perform well in the job and
which won’t. Sales organizations are trying to predict which sales associates will successfully
close deals. School districts are trying to predict which teachers will bring a classroom to
life. Police departments are trying to predict which officers will keep a neighborhood safe.20

Also, using algorithmic systems leads to extremely consistent results and helps firms
save a considerable amount of money—the business of hiring is very time- and money-consuming.
It is certainly true that humans are biased and have been known to apply those
biases in recruitment.21 In this sense, using algorithms to select candidates has the
potential to curb discrimination. However, the more we rely on such tools, the
harder it becomes for candidates deemed unfit to enter the market, and even for them to
understand what precisely makes their applications less attractive than others. That
was the case of Peter Lane, a History graduate from Cardiff University who, after
applying for over 50 positions and going through at least 15 interviews, had no
offers in hand, nor any helpful feedback on how to approach his next attempts.
His interactions were mostly with machines, and even his interviews were video-based.22
Another example is the case of Kyle Behm, who had trouble finding a job after
being diagnosed with bipolar disorder even though he had almost perfect SAT scores.

17 A pop-up on CrimeRadar’s website says, before you are allowed to use the tool: “The probability
estimates featured in the FUTURE function of this app are based on a predictive algorithm. As
such, the accuracy of such information is subject to various uncertainties, and it is strongly advised
that users do not rely exclusively on this function for decision-making purposes”.
18 For more information on PredPol, see: http://www.predpol.com. Accessed 11 Jan. 2022.
19 Gee (2017).
20 Danieli et al. (2016).
21 To mention but one example, a study from Northwestern University, conducted by Lincoln Quillian, showed that between 1990 and 2015, white applicants received 36% more callbacks than black applicants, and 24% more callbacks than Latinos.
22 Finley (2018).
416 L. S. Mendes and M. Mattiuzzo

Behm brought a suit against seven companies for their use of a personality test developed
by the workforce management company Kronos. Given these not-so-successful
stories, some of the companies that adopted automated hiring appear to be
changing their practices, according to Cathy O’Neil, but that remains a topic to be
watched.23
As the case of hiring makes clear, lack of clarity proves to be a serious concern
once we talk about the legal consequences of algorithmic discrimination: firstly,
because if the algorithm is opaque, it is challenging to affirm that discrimination did in
fact occur; secondly, because it can be hard to prevent it; and thirdly, because algorithms,
if carelessly maneuvered, may reinforce rather than fight discriminatory outcomes.
If statistical discrimination has endogenous proxies, and the learner algorithm uses
such proxies to build its program, we may feed discrimination back into the
system in a loop.
In this new economy, filled with Big Data processing algorithms that establish
correlations among the most varied bits of information in order to provide us with
answers to our questions, many issues arise. Here, we will focus on one: the possibility
of results yielded by algorithms being discriminatory. The rest of this chapter
will be devoted to clarifying what we understand algorithmic discrimination to be
and to dwelling on one specific application of algorithms in the data-driven economy:
credit scoring. As we shall see, this is one field of application where group profiling
can lead (and has already led) to unwanted results.

2.2 Algorithmic Discrimination

To clarify what precisely we mean by algorithmic discrimination, we will draw on
references from three main sources: first, the works of Frederick Schauer, especially
his book Profiles, Probabilities and Stereotypes; then the economic theory
of statistical discrimination; and finally Gabriele Britz’s Einzelfallgerechtigkeit
versus Generalisierung.
After analyzing the concept of statistical discrimination in the economic literature,
we will discuss four different forms of algorithmic discrimination and then examine
the specific case of credit scoring against the background of the proposed typology.
Our focus is to hint at the protection requirements needed for the individual submitted
to the credit scoring process, in order to avoid discriminatory outcomes.
When thinking of the concept of discrimination, we usually imagine a scenario
where a person is singled out from a group owing to a specific trait she possesses.
Someone is taken as less deserving of a job because that person did not go to a top
university, or is not invited to a party because she is considered a bad guest. Though that
is a common way of thinking about discrimination, the idea we are mostly concerned
with in this work is that of a discriminatory outcome stemming from belonging to a
group and being judged by the traits of that group; the scenario where the individual

23 O’Neil (2018).

characteristics of a person are disregarded, and that person is seen solely as a
member of a larger group of people.
According to Schauer, one of the problems of the concept of prejudice—and we
claim his use of prejudice is equal to our use of discrimination—is our linguistic
application of the word. To him, we create confusion for we use the same word to
describe different circumstances.
Schauer explains the problem by delving deeper into the concept of generalization.
According to him, there are two main types of generalizations, the sound and the
unsound. Sound generalizations can be (i) universal—the famous example used by
Socrates in explaining syllogism, “All humans are mortal”, meaning the entirety of
the human race does one day die, is a universal generalization in the sense that it
holds true for 100% of cases; and (ii) nonuniversal—meaning the generalization does
not intend to describe the entirety of a group, but rather a feature shared by most
individuals in that group. When one says “Brazilians have European heritage”, it
is clear the sentence does not apply to all Brazilians and that some people born in
Brazil may in fact not have European roots. But the generalization is still sound and
useful if it holds for most cases, as research has proven.24
Owing to our linguistic use, Schauer adds a third and final group of sound gener-
alizations to the mix, that of generalizations which are not universal, nor describe a
feature shared by most members of a group, but that “accurately portrays the members
of the class as having a greater prevalence of some trait than has some larger class of
which the group is ordinarily taken to be a part, even though the trait appears in less
than a majority of the members of both groups”.25 He uses the following example to
clarify what he means by this category of generalization: when one claims “bulldogs
have bad hips”, it certainly does not mean all bulldogs have hip problems, and it
also does not mean most bulldogs have hip problems; it simply means that bulldogs,
when compared to the larger class of dogs, tend to have hip problems more
frequently than other breeds. This use of a generalization is statistically sound so
long as bulldogs do in fact have bad hips in a larger proportion than most dogs. In
short, this third category of generalization relies heavily on a comparative dimension.
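The comparative test behind this third category can be sketched in a few lines of Python; the prevalence figures below are invented purely for illustration and are not Schauer’s data:

```python
def comparative_generalization_sound(rate_in_group: float, rate_in_superclass: float) -> bool:
    """Schauer's third category: the generalization 'Xs have trait T' is
    statistically sound when T is more prevalent among the Xs than in the
    larger class the Xs belong to, even if fewer than half of either group
    actually has T."""
    return rate_in_group > rate_in_superclass

# Invented figures: suppose 30% of bulldogs but 5% of dogs in general have hip problems.
print(comparative_generalization_sound(0.30, 0.05))  # True: sound in the comparative sense
print(0.30 > 0.5)  # False: yet the trait remains a minority trait among bulldogs
```

The point the sketch makes explicit is that soundness here is relative: the claim survives even though it fails both the universal and the majority tests.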
Unsound generalizations, in turn, fail to fulfill the above standards. If
one claims “Aries are impulsive”, for example, it is quite easy to verify that (i) this is
not a universal characteristic—not all people born between March 21 and April 20
are impulsive; (ii) there is no evidence that these people are any more impulsive than
those born in other periods of the year; and (iii) describing someone as impulsive
is not predictive of that person being an Aries, or vice versa. A well-known unsound
generalization in law is found in the studies by Cesare Lombroso on the “criminal man”.
Lombroso was an Italian physician and founder of the Italian school of positivist
criminology. His studies were devoted to proving that criminals were born that way
and that certain physical characteristics could help identify criminality. His research
concluded that the criminal individual combined certain traits, such as excessively
long arms, an asymmetric cranium and face, etc. To this day, there is no concrete evidence

24 http://www.pnas.org/content/112/28/8696. Accessed 11 Jan. 2022.


25 Schauer (2006), p. 11.

Lombroso was right, and the fact that a person possesses long arms is in no way predictive
of whether that person will commit or has already committed a crime.
Going back to prejudice, Schauer states that we use the term in reference
to two different scenarios. We describe something as prejudicial when a statement
relies on unsound statistical generalizations, but also when it relies on statistically
sound, non-universal generalizations. In this sense, to say Aries are impulsive is
as prejudicial as claiming that male drivers have a higher risk of being involved in car
accidents, even though there is no evidence to support the first statement, while there
is evidence to support the second: it is statistically proven that male drivers
cause a higher percentage of car accidents.26 That is so because we attribute
to the word “prejudice” a negative connotation, and thus are not comfortable with
applying it to the second scenario, since it is not true that all men are less cautious
when driving cars. As Schauer puts it, “all human beings (…) deserve to be treated
as individuals and not simply as members of a group, so the argument goes, and
actuarial decisions about human beings are in most instances morally wrong”.27
The problem with this line of thought is that actuarial decisions (or discrimination)
about human beings are extremely common in any legal system, and oftentimes
indispensable. Whenever the law states that only people above a certain age limit are
allowed to vote, or to drink alcoholic beverages, we make an actuarial decision about
human beings. There certainly are individuals who at the age of 15 would be
able to cast a vote, or exercise sufficient discretion to drink alcohol responsibly, but
we ignore that possibility in the name of other values. The same goes for determining
the speed limit on a highway. Naturally, different individuals are able to drive safely
at different speeds, but we adopt a value—oftentimes statistically tested—at which
the number of accidents falls to levels considered acceptable. It is worth
noting that some people may still not be able to drive as safely as others under the
limits that currently exist, but we as a society decide this risk is bearable. Actuarial
decisions, moreover, are not limited to the law. We constantly apply that same logic to
countless other daily situations: we choose to board a plane despite knowing there are
risks associated with it, and we decide to use exams for admission to universities,
colleges or schools, despite knowing they are unable to account for all cognitive
abilities and end up leaving many talented applicants out.
Other than understanding discrimination as a common feature of our legal
systems—and of life in general—another important clarification lies in better understanding
what precisely statistical discrimination (SD) is. SD is an economic theory,
whose origins are attributed to Kenneth Arrow and Edmund Phelps,28 that tries
to explain how inequality can be an issue even when individuals are not actively
engaging in racial, gender or other forms of discrimination.29 They claim that is so

26 See: https://www.oxera.com/wp-content/uploads/media/oxera_library/the-use-of-gender-in-insurance-pricing.pdf. Accessed 11 Jan. 2022.
27 Schauer (2006), p. 19.
28 Phelps (1972); Arrow (1973).
29 In opposition to such form of discrimination, the economic literature identifies what is called taste-based discrimination, as defined by Becker in his The Economics of Discrimination. In that



because one sometimes—and as we will see, algorithms do so very frequently—uses group characteristics to evaluate individuals.
It is worth noting that the theory assumes such discrimination is rational
and stems from the fact that information is limited but agents must nonetheless make
decisions. Consequently, they are “likely to utilize easily observable traits such as
gender, race, education, etc. as proxies for productive characteristics”.30 In other
words, agents make educated guesses about other individuals based on characteristics
they can observe, which are used as substitutes for characteristics they cannot.
Take the labor market as an example. Employers may be prone to hire men rather than
women because they understand the group “women” as one likely to face tougher
career paths—women often have to choose between work and family, and they do not
always choose work. The employer knows nothing about the particular situation of
the woman she will interview, but may well incorporate this generalization into her
decision-making process. It is essential to note that the economic approach to discrimination
differs from the legal notion of discrimination: the economic theory does not discuss
whether a given form of categorization is fair, it simply analyzes whether it is
rational.
One further important aspect is that statistical discrimination may
occur due to exogenous or endogenous differences between groups.31 In the first
case, the variable that renders one group distinct from the other is external, whereas
in the second it is internal and might feed back into the differentiation.32 Going
back to our previous example, it may be that women have historically been more
involved in child-raising and household duties, but this outcome can be a result of
their being given fewer career opportunities in a male-dominated work environment.
The consequence of the discrimination, in this case, leads to the confirmation of the
initial assumption. Differently, if one takes the market for car insurance, it is easy to
observe that insurance for young male drivers is more expensive than insurance for young
female drivers. Gender is an apparently irrelevant proxy that nonetheless correlates
strongly with the rate of accidents and is thus frequently used by companies for
pricing. In this second scenario, gender is no longer an endogenous variable but
an exogenous one, for nothing in the fact that men pay more for insurance leads
this group to get involved in more car accidents.
Economically, these observations are important because statistical discrimination
based on endogenous aspects can be inefficient. The outcome—say, fewer women
in the labor market—could be modified, leading to overall higher welfare,
if the initial assumption were not present.
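A toy simulation can make the endogenous case concrete; the update rule and rates below are invented assumptions, not an empirical model:

```python
# A toy model of an endogenous feedback loop. The decision-maker's belief about
# a group's success rate determines how many opportunities its members get;
# fewer opportunities then depress the group's observed success, which is fed
# back as the next round's belief. All parameters are invented for illustration.

def simulate(belief: float, rounds: int = 5) -> float:
    for _ in range(rounds):
        hired_share = belief                   # opportunities track the belief
        observed_success = 0.9 * hired_share   # less opportunity, less observed success
        belief = observed_success              # output returns as input: the loop closes
    return belief

print(round(simulate(0.8), 3))  # 0.472: the belief decays round after round,
                                # "confirming" itself along the way
```

Breaking the initial assumption (fixing `hired_share` independently of `belief`) stops the decay, which is the welfare point made above.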

book, he claims some individuals have a “taste” for discrimination and as such includes in the utility
function of such individuals a “coefficient for discrimination”, representing such preference.
30 Goodman and Bryce (2016), p. 3.
31 Moro (2009).
32 This result is what is usually referred to as a feedback loop. A feedback loop occurs when the output of a system—say, an algorithm—is fed back into the system as an input. In other words, a given effect of the system returns as its cause. The result of fewer women being hired returns to the decision-making system as an input for the decision-maker and thus reinforces the conclusions it creates.

We use the term algorithmic discrimination in this chapter to encompass both scenarios
that entail statistically unsound statements and those that, though statistically sound,
in some form treat humans not as individuals but as part of a group. That is so not
because we believe actuarial decisions to be necessarily problematic, but because,
in our view, a sound categorization can nonetheless be unfair. With that in mind, we
present four main types of algorithmic discrimination:
(i) Discrimination by Statistical Error—any and all mistakes that are genuinely
statistical, encompassing anything from the data being incorrectly captured
to a problem in the model that makes it fail to account for part of
the available data. Basically, this is the type of discrimination that stems from
a mistake by the engineers/data scientists responsible for the algorithm;
(ii) Discrimination by Generalization—in this case, though the model works
well and is statistically correct, it leads to a situation in which some people
are mistakenly classified into certain groups.33 For example, if Jane Doe lives
in a neighborhood commonly associated with low income, and the model has
nothing but her address to decide whether or not she is a good candidate for
credit, it will classify her as belonging to a group of which she may nevertheless
not be a part. It will do so owing to a correct algorithm and correct data, but it
nonetheless results in incorrect generalizations;
(iii) Discrimination by Use of Sensitive Information—the reason we consider
this category to pertain to the discriminatory bundle, though it may well be,
and oftentimes is, statistically sound, is that it relies on legally protected
data or proxies for it.34 Two more features are relevant for profiling to be
considered discriminatory in this case: other than dealing with sensitive data,
the classification must rely on endogenous characteristics35 or it must single
out historically discriminated-against groups.36

33 Such incorrect classification can be due to spurious correlations, but not necessarily. Curiously, the problem can also stem from the fact that the algorithmic system does not have enough information on the individual, and as such classifies her based on the information it does possess.
34 Hurley and Adebayo (2016).
35 As mentioned earlier, traits used in decision-making can be endogenous or exogenous. Our claim is that, when the characteristic under consideration has endogenous effects, it will always lead to discrimination. If, however, the trait is exogenous, despite being sensitive, the result may not be discriminatory. An example would be the already mentioned higher propensity of young male drivers to get involved in car accidents when compared to young female drivers. Gender may be a sensitive characteristic, but in this case it is neither endogenous nor does it single out a historically discriminated-against group.
36 Imagine that, instead of singling out young male drivers, statistically accurate data demonstrated that young female Muslims are more prone to get involved in car accidents. Let us also assume this is an exogenous variable. We believe the problem here lies in the targeted population. That is to say that, in the view we defend here, if the data showed that young male Protestants in the United States, or young Catholic males in Brazil, were the group more prone to be involved in car accidents and therefore subjected to higher insurance, the result would not be discriminatory. As will be seen in Sect. 4, this is a very significant observation for public policy proposals.

(iv) Discrimination by Inadequate Correlation—unlike in the previous category,
the inadequacy stems not from the group singled out by the discrimination,
but from the relationship between the information used by the algorithm
and the fulfillment of a right. The closer the connection between the two, and
the more severely the right in question is impaired, the more likely it is that the
use will be discriminatory.37
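As a rough aid to the reader, the four types can be restated as a checklist; the field names and boolean rules below are our own simplification and deliberately omit the further conditions the chapter attaches to types (iii) and (iv), such as endogeneity, historically targeted groups, and proportionality:

```python
# A simplified sketch of the four-type typology applied to a toy credit
# decision. Fields and rules are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Decision:
    data_correct: bool              # was the input data accurate? (type i if not)
    model_correct: bool             # was the statistical model sound? (type i if not)
    group_proxy_only: bool          # judged only by group membership? (type ii)
    uses_sensitive_data: bool       # relies on protected traits or proxies? (type iii)
    info_relevant_to_purpose: bool  # does the data fit the decision? (type iv if not)

def discrimination_types(d: Decision) -> list:
    types = []
    if not (d.data_correct and d.model_correct):
        types.append("statistical error")
    if d.group_proxy_only:
        types.append("generalization")  # may misclassify the atypical case
    if d.uses_sensitive_data:
        types.append("sensitive information")
    if not d.info_relevant_to_purpose:
        types.append("inadequate correlation")
    return types

# Jane Doe, scored only by her address (the type (ii) example):
jane = Decision(True, True, True, False, True)
print(discrimination_types(jane))  # ['generalization']
```

Note that a single decision can fall under several types at once, which is precisely why the categories are analyzed separately below.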
At this point, the reader may be wondering what this chapter is in fact about, and why
we chose to devote its first pages to rather hermetic descriptions of computer science
and economics concepts. As noted previously, our discussion of credit scoring and
algorithmic discrimination can only be properly understood once there is common
ground on what we mean by a computer algorithm, how it can lead to discriminatory
outcomes, and why the development of such tools is of particular relevance today.
Having discussed the concepts and the context, we now turn to the specific application
of algorithmic discrimination this chapter is interested in: credit scoring.
Applying this typology of algorithmic discrimination to the specific case of credit
scoring can clarify each of the discriminatory categories as well as their usefulness
for the law. Regarding the first category, it is easy to see that discriminatory
outputs can occur either because the data used in the decision is incorrect or because
the statistical procedure proves to be incorrect. In fact, incorrect data, and therefore
erroneous results, are a very common problem in credit scoring. The United States Federal
Trade Commission has stated in an opinion to Congress that the
confirmed error rate of credit reports in the US is between 10 and 21%, depending
on the nature of the error.38 Other studies in the US have also shown that errors in
customer credit ratings can lead to huge economic losses if the person is assigned
to the wrong risk category.39
Risk assessment also creates a need for protection due to the risk of generalization,
as described in the second category mentioned above: information-based
risk management decisions are not grounded solely on personal data indicating
the individual’s credit risk or risk of breach of contract. Frequently, these decisions
are also based on other characteristics, which are initially unrelated to the risk under
assessment but which, according to statistical experience, often coincide with this
same risk.40
As we already saw, the economic literature calls this phenomenon statistical
discrimination because the correlation between the apparently neutral features and the
targeted data is determined by a statistical method. From a legal-doctrinal perspective,

37 In Schauer’s view, the problem in this case is not with discrimination per se but rather with exclusion. The same holds true if we imagine the example in footnote 26 referring to health rather than car insurance. It would be less clear that young males should pay more than young females for coverage, even though the group of young males is by no means a historically impaired group. The reason, Schauer argues, stems not from discrimination, but rather from a sense of exclusion from access to a facility that helps fulfill an essential right: the right to healthcare.
38 FTC (2012).
39 Buchner (2006).
40 Britz (2007).

this practice can in some cases raise questions of individual justice and unequal treatment.
Statistical discrimination allows differentiated treatment to be performed
on the basis of a personal characteristic, since that characteristic is, according to statistical
assumptions, a property relevant to the decision-making. Since the property
one is actually looking for is usually hard to measure (credibility, solvency, labor productivity,
etc.), a “proxy feature” that stands for this main feature is used instead.
The problem of injustice due to generalization arises for the person who proves
to be an atypical case: Although she carries the proxy characteristic, she does not
show the other expected qualities of the group. An example is the use of a customer’s
address as part of a credit score analysis, if it is assumed that characteristics relating
to the client’s assets can be derived from her place of residence. Therefore, it is
possible that merely living in a “poor” area means a negative rating in a scoring
procedure without further examination of the actual solvency and assets of the credit
applicant.
Another problem raising equality considerations arises from differentiated
treatment based on classical discriminatory and stereotyping characteristics, such
as nationality, gender, age, or sexual identity (third category). On the one hand,
these characteristics are closely related to the core of the personality and prove to be
virtually invariable. On the other hand, they stand for historical differences in
treatment and group stereotyping. Moreover, their use as a basis for decision-making
can cause accumulation effects, i.e., the discrimination of certain groups in
society can be intensified.
Here the need for protection differs from discrimination by generalization, since
it is not simply a violation stemming from the miscategorization of the “atypical”
person; rather, all persons in the group are affected. That is because discrimination
based on sensitive information may be, and oftentimes is, statistically sound. This
use of discriminatory characteristics is mainly found in the insurance sector,
where nationality and gender, for example, may be used as negative criteria in credit
risk assessment.
Under the fourth type of discrimination—discrimination by inadequate
correlation—different constellations are conceivable in which the nature and relevance
of the data are important for the decision. This category comprises cases in
which the nature of the information materially does not do justice to the purpose of
the decision-making process, for example if credit information is required for
purposes totally unrelated to the customer’s solvency. An example is the case where
car insurers calculate the customer’s accident risk based on her credit history, as
there is no factual or causal relationship between the two sets of data. In addition,
the information must also provide unambiguous conclusions about the assessed item
rather than just a suspicion, as in the case of a rent warning file, in which people are
given a negative rating simply because they have been looking for a home for a long
time.
While types 1 and 3 of the discrimination forms described are objective and require
only the use of inadequate data (inaccurate or sensitive information) and a harmful
result for the individual, types 2 and 4 (respectively, discrimination by generalization
and by inadequate correlation) are more difficult to identify, since
not every generalization or inadequate correlation will lead to discriminatory
outputs. This will depend on a proportionality and compatibility test, in which the
reasons for the need to use such types of data must be weighed against the effects on
the individual’s life.41
Experience in Germany has shown that credit bureaus may have economic incentives to
treat the exercise of a person’s rights as a negative criterion. It is known that credit
bureaus negatively assessed information regarding the exercise of the right of access
to one’s own credit score. While from the data subject’s point of view she is only exercising
her right of access, from the credit bureaus’ point of view it is assumed that
those who demand information about their creditworthiness present an increased risk
of default. This practice was considered abusive and was prohibited by the German
Data Protection Law (§ 6 para. 3 BDSG—previous version).
Credit reports are commonly described as black boxes, as their results are difficult
to understand. The scoring procedure is criticized for being incomprehensible to the
individual, since she normally receives no information about the internal structure
of the algorithm. She also does not know what precisely influences her score. It is
obvious that this lack of transparency has an influence on the error rate of credit
information, since the procedure cannot be reviewed either by the person concerned
or by the supervisory authority. The same applies to insurance companies’ warning
systems, which are likewise characterized by low transparency: what data flow into these
systems, how these data influence the insurance contract, whose data are stored there
and how they are stored is oftentimes not public knowledge.
We must also worry about discrimination with regard to due process in
decision-making, when the individual is not given the opportunity to present the
details of her case. When the normally complex personal decision-making process
is replaced by automatic data processing, it is essential to guarantee the possibility of
participation of the affected person.

3 Credit Scoring in Brazil—The Current State of the Discussion

Data protection in Brazil has recently undergone a very significant change with the
approval of the Brazilian General Data Protection Act.42 Even before that, however,
the country already had an important data protection framework, which
comprises privacy rights provided in the Brazilian Constitution and several statutes
dealing directly with personal data, such as the Consumer Protection Code, the

41 A study from the Institute for Technology & Society of Rio de Janeiro (ITS Rio) has analyzed how segmentation can affect vulnerable groups in a disproportionate way, indicating criteria for assessing the lack of proportionality and compatibility between the data used in the analysis and the results for the person’s life (ITS, p. 5).
42 English non-official version available at: https://www.pnm.adv.br/wp-content/uploads/2018/08/Brazilian-General-Data-Protection-Law.pdf. Accessed 11 Jan. 2022.



Credit Information Act and the Marco Civil da Internet (Brazilian Internet Rights
Framework).
The Brazilian Constitution directly addresses matters regarding information by
providing for the fundamental rights of freedom of expression43 and of access to
information and transparency.44 It also acknowledges the inviolability of private life and
privacy45 and of telephonic, telegraphic and data communications,46 and establishes
the home as the inviolable refuge of the individual.47 Furthermore, it now
also includes a right to personal data protection and provides for the writ of
habeas data,48 which aims to give citizens a way to access and correct data
about themselves held by public entities and third parties.
Since the challenges of data processing in the data-driven economy are constantly
changing, the concept of privacy and the instruments for its protection have also undergone
constant development. As in other parts of the world, these transformations can
be seen in the Brazilian data protection framework, resulting both from court
decisions and from legislative changes.

3.1 Brazilian General Data Protection Act

In 2018, Brazil approved its first General Data Protection Act (Law n. 13,709/2018,
or GDPA) after a very long period of legislative debate, several different bills and
numerous difficulties. Some articles of Bill 53/2018, which became the new Act, were
vetoed by the Presidency, most importantly those concerning the creation of the Data
Protection Authority. In late 2018, however, an Executive Order, later converted into
law, inserted articles 55-A to 55-L into the GDPA, effectively allowing for the creation
of the Autoridade Nacional de Proteção de Dados Pessoais (ANPD). Although the
Act’s vacatio legis had been extended—the law came into force in September 2020—
article 55-A clearly established that the authority could already have started its activities. In
fact, however, the ANPD only initiated its enforcement actions in November 2020.
Moreover, because the Act has been in force for a very short period of time, it is
unclear how precisely its dispositions will be applied. Still, it is worth mentioning
that some of its provisions will be directly applicable to automated decisions, namely
articles 20 and 21:
Article 20. The data subject has the right to request review of decisions taken solely on
the basis of automated processing of personal data that affects her/his interests, including
decisions intended to define her/his personal, professional, consumer or credit profile or
aspects of her/his personality.

43 Article 5º, IX; Article 220, Federal Constitution.
44 Article 5º, XIV; Article 220; Article 5º, XXXIII; Article 5º, XXXIV, Federal Constitution.
45 Article 5º, X, Federal Constitution.
46 Article 5º, XII, Federal Constitution.
47 Article 5º, XI, Federal Constitution.
48 Article 5º, LXXIX; Article 5º, LXXII, Federal Constitution.
Algorithms and Discrimination: The Case of Credit Scoring … 425

§1 Whenever requested to do so, the controller shall provide clear and adequate information
regarding the criteria and procedures used for an automated decision, subject to commercial
and industrial secrecy.

§2 If there is no offer of information as provided in §1 of this article, based on commercial
and industrial secrecy, the national authority may carry out an audit to verify discriminatory
aspects in automated processing of personal data.

Article 21. Personal data concerning the regular exercise of rights by the data subject cannot
be used to her/his detriment.

As the provision states, decisions based on solely automated processing that somehow
affect individuals’ interests may be subject to review. What is not yet clear, however,
is how “solely automated” will be defined, or what degree of review will be required
for the provision to be considered fulfilled. It is also worth mentioning that article 20
originally required such review to be carried out by a natural person, but that
requirement was later removed from the law, which makes the question of the
sufficient degree of review all the more pressing. It is equally unclear whether
decisions based primarily, but not exclusively, on automated means will also be
subject to the legislation, and if so, how.
Furthermore, the Act establishes a set of principles that can directly influence
the debate regarding algorithmic discrimination:
Article 6 Activities of processing of personal data shall be done in good faith and be subject
to the following principles:

I – purpose: processing done for legitimate, specific and explicit purposes of which the
data subject is informed, with no possibility of subsequent processing that is incompatible
with these purposes;
II – suitability: compatibility of the processing with the purposes communicated to the
data subject, in accordance with the context of the processing;
III – necessity: limitation of the processing to the minimum necessary to achieve its
purposes, covering data that are relevant, proportional and non-excessive in relation to
the purposes of the data processing;
V – quality of the data: guarantee to the data subjects of the accuracy, clarity, relevancy
and updating of the data, in accordance with the need and for achieving the purpose of
the processing;
VI – transparency: guarantee to the data subjects of clear, precise and easily accessible
information about the carrying out of the processing and the respective processing agents,
subject to commercial and industrial secrecy;
VIII – prevention: adoption of measures to prevent the occurrence of damages due to the
processing of personal data;
IX – nondiscrimination: impossibility of carrying out the processing for unlawful or
abusive discriminatory purposes; and
X – accountability: demonstration by the agent of the adoption of measures which are
efficient and capable of proving the compliance with the rules of personal data protection,
including the efficacy of such measures.

A very important principle, directly applicable to credit scoring, is without doubt
nondiscrimination. Although it is not yet clear what unlawful or abusive discrim-
ination means for the purposes of the Act, it is clear that the law accepts statis-
tical discrimination in general—otherwise scoring would be altogether prohibited in
Brazil. It remains to be seen, however, how the authorities will reconcile these ideas
in applying these rules to credit scoring.

3.2 The Consumer Protection Code and the Regulation of Consumer Databases

The modern concept of data protection originally emerged in Brazil as a consumer
protection issue. The Consumer Protection Code (Law 8.078 of 1990) succeeded
in arranging a framework to address privacy and data protection demands through
principle-based norms that were broad enough to offer solutions to new conflicts
related to information technology. As one of us has argued in another article49:
Four pillars of the Brazilian consumer protection system explain how it could promote and
enforce data protection standards: a) specific regulations for consumer databases that address
the rectification and notice process; b) a broad clause governing damage claims (overall
liability); c) a public consumer redress structure, which includes both an administrative and
a judicial system of redress (small claims courts); and d) a broad conceptualization of who
are consumers.

Article 43 of the Consumer Protection Code provides for specific rights and safe-
guards regarding personal information stored in databases, specifically: (a) a right of
access to all such personal information; (b) the principle of data quality,
according to which all stored data shall be objective, accurate and presented in
comprehensible language; (c) a right to be notified, through written communication,
before any negative personal information is stored; (d) a right to
rectification of any inaccurate data being stored and (e) the principle of storage
time limitation, which establishes a maximum of five years for the storage
of negative personal information. According to Antonio Herman Benjamin, who
worked on the drafting of the Consumer Protection Bill, Article 43 was inspired by
the U.S. Fair Credit Reporting Act.50
Combined with the general strict liability clause provided in Article 6, VI,
and Article 14 of the Consumer Protection Code, these data protection standards
assume great relevance, since consumers may register complaints against credit
information database operators at the Public Consumer Protection Bodies,51 which
will handle individual complaints through an extra-judicial conciliation proce-
dure, and because the Brazilian judicial system has a variety of small claims courts,

49 Doneda and Mendes (2014), pp. 3–20.


50 See Benjamin et al. (2005).
51 There is a National Register of Consumer Complaints (SINDEC, https://sindecnacional.mj.gov.br/home) as well as a website where complaints can be filed.



which facilitate consumer litigation and dispense with the need to hire a lawyer. In fact,
courts have already recognized a right to compensation, for instance, when negative
personal data about a consumer is stored without prior notification, provided the
consumer had no prior record in these databases.52
Finally, it is important to note how the concept of “consumer” put forward by
the Brazilian Consumer Protection Code—which allows for application in a variety
of scenarios beyond the strict contractual relation between consumers and traders—
gains relevance in this context.53 A consumer can direct damage claims to the firm
with which she has a contract, as well as exercise the rights to correction and disclo-
sure against the party responsible for a database. For this reason, the data protection
norms of the Consumer Code have had a much broader application, advancing
beyond contractual consumer relations.54 However, in spite of the relevance of this
norm to credit information databases, it does not apply to the processing of borrowers’
payment histories (“positive information”), nor does it offer any approach concerning
automated decisions and their boundaries.

3.3 The Credit Information Act (Law 12.414 of 2011)

The fact that the Consumer Protection Code did not address issues involving the
processing of positive credit information resulted in a scenario of legal uncertainty.
The Credit Information Act—Law 12.414 of 2011—therefore emerged to regu-
late credit information systems, especially borrowers’ payment histories. The norm
furnishes detailed regulation of credit information databases, establishing
a legal framework that simultaneously encourages data flows and protects personal
data. As mentioned, the recent passing of the General Data Protection Act adds a
layer of complexity to the debate, and reconciling all of these statutes may prove
a challenge.
Law 12.414 of 2011 was reformed by Bill 441/2017. The Credit Information
Act lays down a variety of rules, ranging from the creation of a payment history to the
establishment of responsibilities in case of damages. It determines, for example,
when a payment history can be created (Article 4), what information can be stored

52 https://ww2.stj.jus.br/docs_internet/revista/eletronica/stj-revista-sumulas-2013_35_capSumula385.pdf; http://www.stj.jus.br/sites/STJ/default/pt_BR/Comunica%C3%A7%C3%A3o/noticias/Not%C3%ADcias/STJ-define-tese-em-repetitivo-sobre-inscri%C3%A7%C3%A3o-em-cadastro-de-inadimplentes. Accessed 11 Jan. 2022.
53 The conceptualization of consumer in the Code comprises four definitions: (a) according to the
standard definition, a consumer is any physical person or corporate entity who acquires or uses a
product or service as a final user (Article 2°); (b) a consumer is also a collectivity of persons who
participate in consumer relations (Article 2, § 2°); (c) a consumer is, furthermore, anyone who has
suffered damages caused by a commercial activity (Article 17); and (d) any person who is exposed
to a commercial practice, such as advertising or databases, is also considered a consumer (Article
29). In: Marques (2011), pp. 385–386.
54 Tepedino (1999), pp. 199–216.

(Article 3, §2 and §3), what the rights of the data subject are (Article 5), what the
duties of the data processor are (Article 6), who supervises the databases (Article
17) and who is liable in case of damages (Article 16). Many of its norms correspond
to the principles provided in Convention 108 of the Council of Europe and in the
European Directive 95/46/EC, but the Credit Information Act also resembles
typical U.S. regulation of credit reporting.
Consumer consent was the touchstone of the Credit Information Act, as provided
by the previous version of Article 4. The law thereby conferred on the consumer the
prerogative over the creation, transfer and cancellation of her credit history. That is
no longer the case. With the reform, the Act now allows credit scoring companies
to create profiles and establishes an opt-out model, should the consumer wish her
data to be deleted. According to Article 5, consumers shall obtain the cancellation of
their records upon request and, as determined by Article 9, the sharing of information
is allowed.
Similar to the Consumer Protection Code, the Credit Information Act prescribes
the principle of quality or accuracy of personal data (Article 3, §1) and the rights
to access, rectification and cancellation of data (Article 5, II and III). In addition,
it guarantees the consumer access to the main criteria used in the credit rating
process, that is, the consumer has the right to know the criteria upon which a
calculation of credit risk is based (Article 5, IV). Concerning risk assessment, the law
ensures the right to ask for a review of any decision made exclusively by automated
means (Article 5, VI).55
The Credit Information Act also includes a very important provision: it offers
an explicit legal basis for the purpose limitation principle, which was only implicit
under the Consumer Protection Code. The principle of purpose, which has also gained
much prominence in the General Data Protection Act, spreads through the entire credit
information system. Firstly, the Act defines the strict scope of its application, which
covers solely databases for risk assessment in credit and commercial
transactions (Article 2, I). Secondly, it establishes the right of the data subject to have
the processing of personal information limited to the original purposes of collection
(Article 5, VII). Thirdly, Article 7 describes the purposes for which data collected
under the Act may be used: either to conduct risk analysis or to assist decisions regarding the
granting of credit or other commercial transactions that involve financial risk. That
is to say that these databases cannot be used for marketing or any other activity not
explicitly provided for in the law.56
The prohibition against the storage and use of sensitive and excessive information,
as provided by Article 3, §3, is a decisive rule of the Credit Information Act,

55 This rule is comparable to Article 22 of the GDPR and aims to ensure the possibility of human
intervention in a process that can significantly affect the data subject’s life. According to Article 22 of
the GDPR, “The data subject shall have the right not to be subject to a decision based solely on
automated processing, including profiling, which produces legal effects concerning him or her or
similarly significantly affects him or her”.
56 In this context, one notices another similarity to the European Directive, particularly Article 6,
1, b, which determines that personal data should be “collected for specified, explicit and legitimate
purposes and not further processed in a way incompatible with those purposes”.

also in line with the General Data Protection Act. Pursuant to the norm, excessive
information is defined as information unrelated to the credit risk analysis. Sensi-
tive information, in turn, is described as information related to social
and ethnic origin, health, genetic information, sexual orientation and political, reli-
gious and philosophical beliefs. The prohibition is intended to prevent certain types
of information from leading to profiling, discrimination and violations of the principle of
equality.57
Regarding enforcement, instead of creating a dedicated authority, the Act
designates the existing public consumer protection bodies at the
federal, state and local levels as responsible for supervision (Article 17). The law
also establishes that the administrative penalties provided in the Consumer Protection
Code are applicable to credit scoring. Both provisions apply, however, only
when the data subject qualifies as a consumer. The main problem with this arrangement
derives from the consumer protection entities’ lack of expertise in dealing with personal
data, which may explain their timid approach to this issue, in
spite of the law explicitly attributing such competence to them.

3.4 Judicial Decisions on Credit Scoring

The Superior Court of Justice (STJ, for its Portuguese acronym) has defined the
practical application of the Credit Information Act on two occasions. The first case
deals with the lawfulness of credit scoring and its boundaries, and the second
with access to credit information by the registered person and the procedural formalities
involved.
In Special Appeal (“Recurso Especial”) n. 1.419.697, it was established that
credit scoring is compatible with Brazilian law and is a licit commercial practice,
as provided by Articles 5, IV and 7, I, of the Credit Information Act. However,
the decision specified that the limits imposed by the Consumer Protection Code, with
respect to privacy and the transparency of contractual relations, must be observed in
the credit risk evaluation.
The decision also stipulated that, although consumer consent is not mandatory
for the consultation, access to the credit history source as well as to the personal infor-
mation evaluated must, if requested, be granted. Finally, it established that the use
of excessive or sensitive information is grounds for damages (Article 3, I and II) and
that refusing credit based on incorrect data gives rise to strict solidary liability
between the service supplier and the person responsible for the database (Article 16).
Another relevant aspect of this decision is that a specific section was dedicated
to privacy protection and transparency about consumer information, as provided by
the Credit Information Act (Article 3) and the Consumer Protection Code (Article

57 It is possible to make another parallel to the European Directive, namely, Article 8, which concerns
the processing of special data categories. In the General Data Protection Act, the provision on Article
5, II provides the definition of sensitive data.

43). The limitations imposed by these norms were summarized in five duties for the
service supplier: veracity, clarity, objectivity, prohibition of excessive information,
and prohibition of sensitive information. Regarding the prohibition of excessive
and sensitive information, the Reporting Justice highlighted the distinction: sensitive
information relates to race and ethnicity, sexual orientation and religious belief, whereas
excessive information refers to personal tastes, such as soccer team preferences.
Neither can be used for scoring purposes.
To summarize the ruling, five theses were suggested by the Reporting Justice
and later unanimously approved by the Second Section of the Court, namely: (1)
“The credit scoring system is a method developed to evaluate the credit concession
risk, based upon statistic models that consider diverse variables, with the attribu-
tion of a score to the evaluated consumer (credit risk score)”; (2) “This commercial
practice is licit, being authorized by Article 5, IV and Article 7, I, of the Law n.
12.414/2011 (Credit Information Act)”; (3) “In the credit risk evaluation, the limits
imposed by the consumer protection system in respect to privacy safeguards and the
maximal transparency of contractual relations must be observed, in accordance with
the Consumer Protection Code provisions and Law n. 12.414/2011”; (4) “Despite
consumer’s consent being unnecessary, clarifications must be given, if requested,
about the considered data source (credit history), as well as about the personal infor-
mation evaluated”; (5) “Failure to observe these legal limits in using the credit score
system configures an abusive exercise of rights (Article 187 of the Civil Code),
and may give rise to strict solidary liability between the service supplier and the
responsible for the database, the source and the consulting (Article 16 of Law n.
12.414/2011), for the occurrence of damages in the use of excessive or sensitive
information (Article 3, I and II of Law n. 12.414/2011) or in situations where credit
is refused based on outdated or incorrect data”.
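The Court’s first thesis describes credit scoring as a score produced by statistical models over diverse variables. The following minimal sketch, for illustration only, shows what such a model can look like in its simplest logistic-regression form; every variable name, weight and cutoff below is our own invented assumption, not the content of any actual bureau model, which is proprietary and far more complex.

```python
import math

# Hypothetical weights for illustrative variables (all values invented).
WEIGHTS = {
    "payment_history": 2.0,   # share of bills paid on time (0..1)
    "debt_ratio": -1.5,       # outstanding debt relative to income
    "years_of_history": 0.1,  # length of the credit history in years
}
INTERCEPT = -0.5

def credit_score(applicant: dict) -> int:
    """Map an applicant's variables to a 0-1000 score via a logistic model."""
    z = INTERCEPT + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    probability_of_repayment = 1.0 / (1.0 + math.exp(-z))
    return round(probability_of_repayment * 1000)

applicant = {"payment_history": 0.9, "debt_ratio": 0.4, "years_of_history": 5.0}
print(credit_score(applicant))  # 769
```

The legal questions discussed above attach precisely to such a construction: which variables may lawfully enter `WEIGHTS`, and what the data subject may learn about them.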
In Special Appeal 1.304.736, the Court established the grounds for a consumer’s
judicial demand for access to the credit history source and the information used for
credit scoring, raised in the decision discussed above. The decision recognized
a legal interest in bringing proceedings to compel the exhibition of
documents whenever the plaintiff intends to know or verify her own
documents, or any documents of her interest, notably those referring to her person
that are in the possession of others. To this end, for the document to be exhibited,
it must be of common interest.
In this context, the decision established two requirements for a legal interest
in bringing proceedings: first, proof of a request to obtain the data—or at least of
an attempt to do so—before the institution responsible for the scoring; and, second,
proof that the desired credit was refused owing to the score attributed by the
system.
The decision imposes severe restrictions on the consumer’s access to information
regarding the criteria used to calculate her credit score and could be inter-
preted as a hindrance to addressing algorithmic discrimination in the credit reporting
context.

It should be noted that both decisions date from before the General Data Protection
Act was approved, so this piece of legislation was not part of the Justices’ analysis.
Both were also handed down before the Credit Information Act was reformed.

3.5 The Reform of the Credit Information Act

Bill 441/2017 (in Portuguese, Projeto de Lei Complementar) was proposed by
Senator Dalírio Beber (of the Brazilian Social Democratic Party, or PSDB in
its Portuguese acronym). The Bill proposed modifications to Supplementary Law
n. 105/2000, which deals with bank secrecy, and to the Credit Information Act, with
the goal of increasing the number of (positive) consumer entries in the credit reporting
system, since there were at the time only five million entries in a potential universe
of one hundred million consumers.58 In the Government’s view, low adherence to the
system was due to the many bureaucratic requirements for creating a credit report,
as well as to consumers’ inertia, since the model provided for an opt-in system, in
which the express consent of the consumer was necessary to open a consumer file.59
In this context, the proposed amendment of the Credit Information Act comprised
three central modifications. The first concerned consumers’ consent as
a requirement to open a consumer file and process positive information—it changed
the system from an opt-in to an opt-out model. The second was the suppression of
the agents’ liability clause. The third was the modification of the Bank Secrecy Law,
which aimed to allow the flow of financial information between databases controlled
by different agents.
The main goal of the Bill was the adoption of an opt-out model for consumers, as
opposed to the opt-in model. The opt-out model prevailed with regard to access to
credit scores by companies that want to do business with the consumer: such
companies may now access a consumer’s credit score without his or her consent.
Nevertheless, to access the consumer’s detailed credit history, companies
still need prior authorization, meaning that an opt-in model remains for this specific
purpose.

4 The Way Forward—Challenges for Anti-Discrimination Policies in Credit Scoring

Having set the ground for how credit scoring functions in Brazil, we now turn to our
thoughts on the challenges faced by policy proposals aimed at hindering algorithmic
discrimination in credit scoring. First, we will present an overview of the literature

58 Cf. the article by the Secretariat of Economic Policy of the Ministry of Finance: Pinho Mello et al. (2018).
59 Idem.

on algorithmic governance, and then move on to present this matter as it relates to
the Brazilian credit scoring regulation.

4.1 Algorithmic Discrimination and Governance

To deal with the issue of algorithmic discrimination and regulation, one must first
survey the literature on algorithmic governance. Scholars who discuss how
discrimination can be a factor in algorithmic decision-making have come to realize
that policy solutions are needed and have thus focused on outlining what
those solutions might be. This subsection will review the most important debates that
have arisen in this field, focusing on (i) the commonly discussed policy solutions,
(ii) the disagreements among authors on these proposals, and (iii) our view on the
topic.
The literature on algorithmic governance, though fairly recent, is already vast.
Several authors debate when and how (or whether) algorithms should be regulated.
Most notably, some groups of scholars have come up with principles that could
govern algorithmic decision-making. The Fairness, Accountability and Transparency
in Machine Learning Organization (FAT-ML) is one such institution. It has
compiled a list of what it believes to be the key principles that should be observed by
companies and governments when dealing with algorithms: responsibility, explain-
ability, accuracy, auditability, and fairness.60 In the United States, the Association
for Computing Machinery (ACM) followed the same path and devised its own prin-
ciples, adding awareness, access and redress, data provenance, and validation and
testing to the list.61
Responsibility, in the FAT-ML’s view, is connected to the idea that those
designing algorithmic systems must be aware of the people who will be impacted
by the decision-making process and should therefore, to some extent, provide alterna-
tives for redress—both at individual and societal levels. This idea connects to the
ACM’s principles of awareness—which is mostly focused on making an algorithm’s
builders and users mindful of the possible consequences of its use, especially of the
biases that can arise from it—and of access and redress—which claims regulators
should adopt mechanisms that allow individuals impacted by algorithmic decisions
to question them and redress the possible harm caused.
As Doshi-Velez et al. put it, the idea of explanation (or explainability), when
applied to decision-making, usually means “the reasons or justifications for that
particular outcome, rather than a description of the decision-making process in

60 The Fairness, Accountability and Transparency in Machine Learning Organization, https://www.fatml.org/resources/principles-for-accountable-algorithms. Accessed 11 Jan. 2022. The authors who subscribe to the principles are: Diakopoulos, Friedler, Arenas, Barocas, Hay, Howe, Jagadish, Unsworth, Sahuguet, Venkatasubramanian, Wilson, Cong Yu, Zevenbergen.
61 Association for Computing Machinery US Public Policy Council (USACM), ‘Statement on Algorithmic Transparency and Accountability’, 12 January 2017, http://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf. Accessed 11 Jan. 2022.

general”.62 Therefore, what they consider to be an explanation, and what we adopt in this
chapter, is a “human-interpretable description of the process by which a decision-
maker took a particular set of inputs and reached a particular conclusion”.63 It is
important to note that explanation is not the same as transparency, for being able to
understand the process by which a decision was made is not the same as knowing
every step taken.
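The distinction drawn by Doshi-Velez et al. can be pictured with a toy sketch: instead of disclosing the entire model (transparency), an explanation reports only the main factors behind one particular outcome. The feature names and weights below are our own invented assumptions, reusing the kind of hypothetical scoring model sketched earlier; no real scoring system is described.

```python
# Hypothetical model weights (invented for illustration).
WEIGHTS = {"payment_history": 2.0, "debt_ratio": -1.5, "years_of_history": 0.1}

def explain_decision(applicant: dict, top: int = 2) -> list:
    """Return the `top` factors that most influenced this applicant's score,
    ranked by the absolute size of their contribution (weight * value)."""
    contributions = {k: WEIGHTS[k] * applicant[k] for k in WEIGHTS}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return ranked[:top]

# An answer to "what were the main factors in this decision?":
print(explain_decision({"payment_history": 0.9, "debt_ratio": 0.4,
                        "years_of_history": 5.0}))
```

Such a per-decision summary is human-interpretable without revealing every step of the model, which is precisely why explanation and transparency must not be conflated.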
The principle of accuracy, according to Diakopoulos and Friedler, means “sources
of error and uncertainty throughout an algorithm and its data sources need to be
identified, logged, and benchmarked”.64 Put bluntly, it is only by understanding
where mistakes come from that one can hope to mitigate them. The ACM expresses
a similar idea with the principle of data provenance, which states that “a description
of the way in which the training data was collected should be maintained by the
builders of the algorithms, accompanied by an exploration of the potential biases
induced by the human or algorithmic data-gathering process.”
The principle of auditability is also a constant in discussions of algorithmic governance.
It entails the idea that a third party must be able to review the method used by the
algorithm to reach its conclusion.65 As we shall see below, there is much debate on
how such disclosure could be possible, and whether it should take place at all in some
circumstances.
Fairness may be the most obvious, but least clear, of all the proposed principles. The
idea behind fairness is that algorithms might reach discriminatory outcomes. But
as seen in item 2 above, what counts as a discriminatory outcome often proves to be a
challenging question, and there can be reasonable disagreement on the topic. The
ACM, though not expressly advocating for this principle, puts forward the validation
and testing standard, according to which “[institutions] should routinely perform tests
to assess and determine whether the model generates discriminatory harm”.
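One simple form the ACM’s routine testing can take is comparing outcome rates across groups. The sketch below is illustrative only: the data is fabricated, and the 0.8 (“four-fifths”) threshold is borrowed from U.S. employment-testing practice purely as an example benchmark, not as a rule of Brazilian or any other law.

```python
def approval_rate(decisions):
    """Share of approvals in a list of 1 (approved) / 0 (refused) decisions."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group approval rate to the higher one (1.0 = parity)."""
    ra, rb = approval_rate(group_a), approval_rate(group_b)
    return min(ra, rb) / max(ra, rb)

# Fabricated sample data: 1 = credit approved, 0 = refused.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # 75% approved
group_b = [1, 0, 0, 1, 0, 0, 1, 0]   # 37.5% approved

ratio = disparate_impact_ratio(group_a, group_b)
print(round(ratio, 2))                           # 0.5
print("flag for review" if ratio < 0.8 else "no flag")
```

A check of this kind does not by itself establish unlawful discrimination; it merely flags models whose outcomes diverge enough across groups to warrant the closer scrutiny the ACM standard calls for.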
Although not explicitly present in either of these manifestos, a principle widely
discussed in the literature and in policy makers’ circles is transparency. Algorithms
have famously been called “black boxes” by Frank Pasquale due to the opacity of
their decision-making processes, which turns them into a source of distrust. As

62 Doshi-Velez et al. (2017), p. 2.
63 Doshi-Velez et al. (2017), pp. 2–3. They go further to say that an explanation should be able to
answer at least one of the three following questions: (i) what were the main factors in a decision; (ii)
would changing a certain factor have changed the decision; and (iii) why did two similar-looking
cases get different decisions, or vice-versa.
64 Diakopoulos and Friedler (2016).
65 Sandvig et al. (2014): “Although the complexity of these algorithmic platforms makes them seem
impossible to understand, audit studies can crack the code through trial and error: researchers can
apply expert knowledge to the results of these audit tests. By closely monitoring these online plat-
forms, we can discover interactions between algorithm and data. In short, auditing these algorithms
demands a third party that can combine both expert and everyday evaluations, testing algorithms
on the public’s behalf and investigating and reporting situations where algorithms may have gone
wrong”.

such, tools aimed at bringing algorithms to light have been considered by some to be an
essential step in any proposed regulatory solution.66
There is no consensus on precisely what the ideal combination of these
proposals is, and authors often disagree on their importance and usefulness. The most
visible disagreement comes from those who believe transparency—and even explain-
ability—are not adequate or sufficient tools to deal with algorithms, arguing that
accountability is a better option in many cases.
Pasquale and Citron are among the authors who believe transparency is a mean-
ingful solution for algorithmic discrimination, specifically when applied to credit
scoring. In their words:
We believe that each data subject should have access to all data pertaining to the data
subject. Ideally, the logics of predictive scoring systems should be open to public inspection
as well. There is little evidence that the inability to keep such systems secret would diminish
innovation.67

They are clear in stating that the “threats to human dignity” are sufficient to justify
disclosing not only the dataset and the overall functioning of the system to authorities,
but also the code and modeling of algorithms to the public in general.68
Another author who follows a similar line of thought is Zarsky, who puts the
argument forward in the context of automated predictions in government initiatives.
In short, he claims that “the most basic and popular justification for transparency is
that it facilitates a check on governmental actions.”69 He states that, in this context,
accountability and transparency are usually treated as synonyms, but argues
that the concepts are fundamentally different: accountability concerns the idea that
individuals are ethically responsible for their actions, whereas transparency is a
tool—and not the only tool—aimed at facilitating accountability.
Some experts, however, have pointed towards the limitations of transparency
solutions. Lawrence Lessig famously presented such a view with respect to government
transparency when he claimed that turning the panopticon upon the rulers and
building civic omniscience had its issues. He builds his argument on the
ideas expressed by Brandeis in Other People’s Money, especially the argument that
full disclosure of information would help the public judge quality and thus allow
the people to regulate the market. But as Lessig warns, “not all data satisfies the
simple requirement that they be information that consumers can use, presented in a
way they can use it”.70 Though the goal of his paper was not to discuss algorithmic
discrimination, many of his observations hold true for this context.

66 Pasquale (2013).
67 Pasquale and Citron (2014), p. 26. They also say: “The FTC’s expert technologists could test
scoring systems for bias, arbitrariness, and unfair mischaracterizations. To do so, they would need to
view not only the datasets mined by scoring systems but also the source code and programmers’ notes
describing the variables, correlations, and inferences embedded in the scoring systems’ algorithms”.
68 Pasquale and Citron (2014), pp. 30–31.
69 Zarsky (2013), p. 1533.
70 Lessig (2009).
Algorithms and Discrimination: The Case of Credit Scoring … 435

Specifically regarding algorithms, Kroll et al. offer four main arguments why
transparency is not a sufficient policy proposal.71 First, it may simply be
unattainable: there will either be well-grounded public reasons to withhold
disclosure, such as national security or the prevention of strategic behavior aimed
at gaming the system,72 or reasons that concern the individuals under scrutiny, for
example where the data collected is highly sensitive, in which case it may be in the
individual’s best interest not to have full transparency applied to the database.
Second, it may be insufficient: even if a rule is public, “[the] methods
are often insufficient to verify properties of software systems, if these systems have
not been designed with the future evaluation and accountability in mind”,73 as is
often the case. Third, whenever the algorithm incorporates randomness—which is
arguably a fundamental function of computerized systems—full transparency guarantees
nothing.74 Fourth, “intelligent” systems that change over time and adapt
to their environment, such as ML algorithms, cannot be properly comprehended
through transparency solutions. Following this line of thought, the authors argue that
accountability, in the form of procedural regularity, is a better policy proposal that
should be studied in more depth, for it can provide the desired outcome even in the
face of secrecy.
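The lottery example in footnote 74 can be made concrete with a short sketch (our hypothetical illustration in Python, not taken from Kroll et al.): the drawing rule below can be published in full, yet its output is irreproducible unless the random seed is also disclosed and fixed.

```python
import random

def lottery(participants, n_winners, seed=None):
    """Fully transparent rule: assign each participant a random
    number; the lowest numbers win. Publishing this code discloses
    the entire algorithm."""
    rng = random.Random(seed)
    draws = {p: rng.random() for p in participants}
    return sorted(participants, key=lambda p: draws[p])[:n_winners]

people = ["ana", "bruno", "carla", "diego"]

# Two runs of the same, fully disclosed algorithm need not agree,
# so transparency alone lets no one verify a past drawing:
run1 = lottery(people, 2)
run2 = lottery(people, 2)

# Reproducibility requires disclosing the random inputs as well:
assert lottery(people, 2, seed=42) == lottery(people, 2, seed=42)
```

The sketch mirrors the argument in the text: even complete disclosure of the code verifies nothing about a particular outcome once randomness is involved, unless auditing also covers the random inputs themselves.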
Desai and Kroll, in turn, claim that transparency, though it has its place, is
limited, for it may even reinforce discrimination by providing a false sense of
clarity. In their words:
Many of the current calls for transparency as a way to regulate automation do not address
such limits [of the algorithmic systems], and so they may come up short on providing the sort
of legal-political accountability they desire, and which we also support. Instead, as software
(and especially machine learning systems, which separate the creation of algorithms and
rules from human design and implementation) continues to grow in importance, society may
find, and we argue, that identifying harms, prohibiting outcomes, and banning undesirable
uses is a more promising path.

Edwards and Veale also highlight the limitations of transparency and explainability,
aiming specifically at the General Data Protection Regulation in Europe. They point to
the uselessness of transparency, both for its unfeasibility and for its lack of fit to users’
needs—namely, its inability to redress substantive justice—and suggest a framework
focused on building better algorithmic solutions from the start (solutions that are

71 Kroll et al. (2017).


72 The authors use the example of tax evasion. If tax evaders knew precisely how the government
red flags possible fraud scenarios, the algorithm in place would be useless.
73 Kroll et al. (2017), p. 633.
74 In their words, “a simple lottery provides an excellent example: a perfectly transparent algorithm—use
a random number generator to assign a number to each participant and have the partici-
pants with the lowest numbers win—yields results that cannot be reproduced or verified because the
random number generator will produce new random numbers when it is called upon later”. Kroll
et al. (2017).

effectively concerned with policy objectives) and giving agencies and institutions
the power to oversee algorithmic integrity.75
In our view, placing too much faith in transparency is indeed misguided, not
because of the goal transparency aims to attain, but because it may prove
insufficient in fulfilling its promise. In the specific context of credit scoring, disclosure
of databases and of decision-making processes usually faces resistance, whether on
grounds of commercial secrecy or of alleged gaming of the systems. From
a policy standpoint, especially considering the insertion of ML into credit scores
and the randomness element indicated by Kroll et al., transparency is a tool whose
usefulness is limited to informing data subjects of their current scores and allowing
such individuals to correct and update information. But advocating for open source
code in this context, as will be seen in the next subsection, is unlikely to provide
practical and sufficient solutions.

4.2 Regulating Credit Scoring and Algorithmic Discrimination

In this subsection we aim, first, to present how our typology of discrimination
helps shed light upon possible discriminatory outcomes that may arise under the Credit
Information Act. To do so, we will show how some provisions of the Act speak
to our discussion, and also how the General Data Protection Act may come to
influence the debate. Second, we intend to connect such provisions to the literature
on algorithmic governance and suggest some policy proposals that could assist in
fighting discrimination.
Recalling our typology, we claimed there are four types of discrimination
arising from algorithmic systems:
(i) Discrimination by statistical error,
(ii) Discrimination by generalization,
(iii) Discrimination by use of sensitive information,
(iv) Discrimination by inadequate correlation.
Naturally, the first type of discrimination may arise in any case in which an algorithm
is in use, so long as the program presents errors—wrong prediction models, for
instance.
The General Data Protection Act has established a principle regarding data quality
that speaks to this concern, as well as a right for any individual to correct or complete
data available in datasets. There is, however, no provision aimed at ensuring that
models are statistically correct, much less at verifying their accuracy. One could claim

75 Edwards and Veale (2017), pp. 22–23. “As the history of industries like finance and credit shows,
rights to transparency do not necessarily secure substantive justice or effective remedies. We are
in danger of creating a “meaningless transparency” paradigm to match the already well known
“meaningless consent” trope”.

the reason for the lack of regulation here is that market incentives are sufficient
to address this problem. In other words, no norm is needed because it is not in a
company’s best interest to have its algorithm present statistical imprecisions, simply
because inaccuracy of this kind would mean inefficient resource allocation. Classifying
a good borrower as a bad one ultimately means less revenue. Still, it is worth
noting that there is no overarching obligation for the private or the public
sector to show that their models are statistically sound and present no mathematical
mistakes.
In Germany, for example, such an obligation does exist. The DSAnpUG-EU, or
the new Bundesdatenschutzgesetz (BDSG), in section 31, states that the method
used for credit scoring needs to be a scientifically recognized statistical procedure.76
This provision is viable in Germany because the country has a well-established data
protection system and a central authority responsible for supervising the application
of the norm. Still, especially with the approval of the General Data Protection Act
and the creation of a central authority in Brazil, such a rule could go a
long way in preventing type 1 discriminatory outcomes.
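What a requirement of a scientifically recognized statistical procedure could mean in practice may be sketched, in a deliberately simplified and hypothetical form, as a calibration check: within each score band, the default probabilities a model predicts should track the default rates actually observed, and large gaps would be the kind of statistical error a supervisor could flag.

```python
def calibration_by_band(records, n_bands=5):
    """records: iterable of (predicted_prob, defaulted) pairs, with
    predicted_prob in [0, 1] and defaulted in {0, 1}. Returns, per
    score band, (mean predicted probability, observed default rate,
    number of cases), or None for empty bands."""
    bands = [[] for _ in range(n_bands)]
    for prob, defaulted in records:
        idx = min(int(prob * n_bands), n_bands - 1)
        bands[idx].append((prob, defaulted))
    report = []
    for band in bands:
        if not band:
            report.append(None)
            continue
        mean_pred = sum(p for p, _ in band) / len(band)
        observed = sum(d for _, d in band) / len(band)
        report.append((mean_pred, observed, len(band)))
    return report

# Toy portfolio: a well-calibrated model keeps the gap between
# predicted and observed rates small in every band.
sample = [(0.10, 0), (0.15, 0), (0.20, 1), (0.70, 1), (0.80, 1), (0.85, 0)]
report = calibration_by_band(sample, n_bands=2)
```

The function name and the banding scheme are our invention for illustration; real supervisory tests of scoring models are considerably more elaborate, but they rest on the same comparison of predicted against observed outcomes.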
Turning to discrimination by generalization, the issues credit scoring may present
are three-fold. Firstly, there is the problem of accuracy. Secondly, the challenge of
data correction. Thirdly, the question of insufficient data leading to miscategorization.
The Brazilian model for accuracy assurance is far from ideal. The entire credit
information system is based upon the notion that correction lies with the consumer:
she must be able to identify mistakes, bring them forward, and require the
company responsible to modify the information. The idea is very much in line with
individual complaints and derives from consumer law methodology, but it is bound to
fail. First, because the information asymmetry is enormous—putting it bluntly, the
consumer knows far less about the database than the company does, and her effort to
verify whether the information collected is accurate is completely disproportionate to
the effort it would take the databroker. Second, because there are far superior ways
to ensure accuracy: Brazilian legislation could adopt audit mechanisms, periodic
reports on the databases, etc., as has been the case in other jurisdictions.77
The General Data Protection Act builds on this idea and establishes a principle
aimed at tackling the quality of data as an overarching goal that should be pursued
by companies regardless of individual complaints. Also, because the provisions
on liability in the GDPA are far more severe than those of the Credit Information

76 § 31 (1, no. 2) BDSG: “Die Verwendung eines Wahrscheinlichkeitswerts über ein bestimmtes
zukünftiges Verhalten einer natürlichen Person zum Zweck der Entscheidung über die Begrün-
dung, Durchführung oder Beendigung eines Vertragsverhältnisses mit dieser Person (Scoring)
ist nur zulässig, wenn […] die zur Berechnung des Wahrscheinlichkeitswerts genutzten Daten
unter Zugrundelegung eines wissenschaftlich anerkannten mathematisch-statistischen Verfahrens
nachweisbar für die Berechnung der Wahrscheinlichkeit des bestimmten Verhaltens erheblich sind”.
77 Hurley and Adebayo (2016). Congress directed the FTC to conduct a study of credit report

accuracy and provide interim reports every two years, starting in 2004 and continuing through
2012, with a final report in 2014. The reports are being produced under section 319 of the Fair and
Accurate Credit Transactions Act, or FACT Act: https://www.ftc.gov/news-events/press-releases/
2013/02/ftc-study-five-percent-consumers-had-errors-their-credit-reports. Accessed 11 Jan. 2022.

Act—controllers are liable whenever damage is caused, and obliged to repair such
damage—the tendency is for the individual to be better protected even when she is not
as engaged in verifying the accuracy of credit scores.
A different issue is that of data correction. According to the Credit Information
Act, if the consumer identifies a mistake in the database, it is not clear how precisely
the problem should be addressed. Article 5, III of the Credit Information Act states
that the consumer can request the correction of the data, but it does not specify how
such a request can be made, nor, more importantly, what happens in case there is a
dispute between the consumer and the database operator regarding the accuracy of
the information. Who bears the burden of proof? If we take Brazilian consumer law
as a parameter, the burden should fall upon the database operator, but if we claim the
Credit Information Act should be the sole source of the answer, we simply fall short
of providing a response. In the GDPA, the request for data correction is explicit in
Article 18, III. The Act goes further and states that the request for data correction
shall be complied with by the data controller in a timely manner (Article 18, §5—
the law also states that the exact deadline for controllers shall be determined by the
authorities via infralegal regulation).
With regard to miscategorization due to incompleteness, Brazilian law is silent on
what bundle of information is sufficient to render useful credit scoring results.
The existing mechanism that could serve to address such a problem is the right to
revision in Article 5, VI of the Credit Information Act. But again, for this right to
have any meaning, the consumer would either have to identify the mistaken
information in her score—which we have already established to be difficult due to
the accuracy model present in the law—or to infer that her low score is due to a
miscategorization of the sort presented in item (ii): the database assumes that, because
she lives in a neighborhood usually associated with low income, she is a bad borrower,
for it has no other sources of information about that particular person.
Obviously, placing such a burden upon the consumer seems less than ideal.
It should also be said that, due to Article 21 of the GDPA and Article 7-A, III of
the Credit Information Act, the individual has a further level of protection in case of
type 2 mistakes. Personal data that refers to the exercise of rights by the individual
cannot be used to her detriment. A practice that falls under this norm is, for instance, the
lowering of the credit score when the consumer checks her own score, which was
reported by data protection activists in Germany and motivated the
amendment of the former German Data Protection Act (BDSG).78 Another example is
the blacklists created by employers to register employees who have filed cases in labor
courts.79
Moving on to our third category, discrimination by use of sensitive information,
the Credit Information Act explicitly addresses such a problem in Article 3, §3, I and
II, as mentioned in Sect. 3, and so does the GDPA, in Article 11, stating that use
of sensitive personal data is only allowed in limited circumstances. However, the
lack of cases that deal with this issue—there has not been one single instance in

78 Mendes (2015), p. 175.


79 Mendes (2014), p. 77.

which a consumer claimed the data being used by a company was sensitive and thus
prohibited—is telling of the problem faced by Brazilian legislation. Because Brazil’s
central data protection authority was established only very recently, there is still
no enforcement, a gap stemming from the inexperience of the consumer system in dealing
with this matter. In addition, and likely also related to the lack of enforcement, there
is no standard to address the issue of proxies. The Act prohibits using sensitive or
excessive information, but it says nothing about information that is neither sensitive
nor necessarily excessive, yet still provides a good proxy for either of those
categories. Again, were there stronger enforcement, and consequently case law that
helped build criteria for what precisely counts as sensitive or excessive data, this problem
would likely be mitigated. The creation of a DPA, and the coming into force of the
GDPA, may change this scenario.
The fourth type of discrimination—by inadequate correlation—was evidently
possible in Brazil, but after the reform of the Credit Information Act, became unlawful
for the purposes of credit scoring. Previously, such a scenario arose whenever the
databrokers used the consumer’s request for information about her own credit score
to determine the score itself. Now, after the inclusion of Article 7-A, III, the
fulfillment of the individual’s right to access information about herself cannot be
used against the consumer.
We now move on to our thoughts on how the aforementioned problems could be
mitigated through modifications to the Credit Information Act and through the application
of the overarching data protection system in Brazil. When it comes to type 1 issues,
we believe a provision such as the one in the new BDSG would be sufficient to address
the problem, but it is also possible that the challenge will solve itself due to
market incentives. Especially with the creation of a central data protection agency
and the expectation of enforcement, companies are unlikely to risk their business by
providing inaccurate scoring methods.
As for discrimination by generalization, there are still many challenges to be faced
before a normative system able to overcome this problem emerges. Nevertheless,
important measures have been taken to reduce discrimination by generalization,
such as guaranteeing the right to obtain review of automated individual decisions,
as granted by Article 22(3) GDPR and, in Brazil, by Article 20 of
the GDPA. The Credit Information Act grants the individual a similar right to request
revision of automated decisions. In both cases, the idea is that the atypical case
could be demonstrated with evidence brought by the affected person.
Consumers should have an easier way to access data, and a clear procedure to
request corrections from databases. Ideally, we believe this procedure should be
homogeneous for all databrokers, in order to diminish consumers’ transaction costs.
Still, the issue we believe to be the hardest to address here is that of proxies. A
way forward could be that of the German regulation, which, though not providing an
overall prohibition on the use of proxies, states that no score based solely on address
information shall be allowed and that, if such address data is in fact used, the data
subject must be notified in advance.80 This solution is partial for it deals with only

80 § 31 (1, no. 3 and 4) BDSG.



one type of data—zip codes—and we are not aware of norms in other jurisdictions
that specifically tackle this problem. Both the GDPR and the GDPA, when stating
that the consumer has the right to a review of automated decisions, are not explicit
on the problem of proxies.
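The proxy problem discussed above can be illustrated with a toy sketch (a hypothetical construction of ours, with invented data): the scoring rule below never touches the sensitive attribute, yet because zip code and group membership are correlated in the data, the facially neutral score reproduces the sensitive divide.

```python
# Toy data: (zip_code, sensitive_group, repaid_loan). Group and zip
# code are correlated by construction, as they often are in reality.
applicants = [
    ("A", 0, 1), ("A", 0, 1), ("A", 0, 0),
    ("B", 1, 0), ("B", 1, 0), ("B", 1, 1),
]

def score_by_zip(zip_code):
    """Score = historical repayment rate in the applicant's zip code.
    The sensitive attribute is never read by this function."""
    repayments = [r for z, _, r in applicants if z == zip_code]
    return sum(repayments) / len(repayments)

# Average score by sensitive group, even though the score "ignores" it:
avg = {}
for group in (0, 1):
    scores = [score_by_zip(z) for z, g, _ in applicants if g == group]
    avg[group] = sum(scores) / len(scores)

# avg[0] is about 0.67 while avg[1] is about 0.33: the zip code has
# acted as a proxy, and excluding the sensitive field prevented nothing.
```

The same mechanism underlies the German legislature's concern with address data: a variable that is neither sensitive nor excessive on its face can still carry most of the information of a prohibited one.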
With regard to discrimination by use of sensitive information, though Brazilian
law clearly states that the use of such information for scoring is prohibited, there is
little to no guidance on what precisely sensitive or excessive information means.
What are traits that pertain to social and ethnic origin, health, genetic information,
sexual orientation, and political, religious and ideological convictions? If a company
has enough information to infer that someone is gay, but nothing that by itself would
indicate sexual orientation, is the use of this bundle of data prohibited? For this
debate to evolve, enforcement is paramount. Again, the creation of a centralized data
protection authority may go a long way in providing clearer criteria. But a necessary
complementary measure is to build capacity and raise awareness within the consumer system.
Discrimination by inadequate correlation, lastly, was addressed in the context of
credit scoring with the addition of Article 7-A, which is similar to the provision put forward
in the former German legislation. A similar concept was introduced by the General
Data Protection Act in Article 21. Although the concept behind Article 21 is correct
and useful, it is not yet clear what its real boundaries will be, since the fulfillment of a
right could be interpreted in a very broad fashion.
The Brazilian system would also benefit from an understanding that the concept of
excessive information, which already exists in the Credit Information Act, also covers
situations where discrimination arises from inadequate correlations. But this would
require the further development of this concept by case law and doctrine, in order to
specify its definition and facilitate its application.
Finally, the purpose limitation principle, established in Article 5, I, b of the GDPA,
could be applied in order to avoid spurious correlations in the use of data, since it
prohibits subsequent processing that is incompatible with the initial purposes of the
processing.
It is clear that the challenge of this last type of algorithmic discrimination lies
in better establishing the criteria used to distinguish legitimate from inadequate use.
Although the GDPA has made important progress in this direction, its consistent
interpretation will depend on the relevant activities of the Data Protection Authority.

5 Final Considerations

The governance of algorithms, and specifically of credit scoring systems, is a highly
disputed and convoluted field. As we have shown throughout this chapter, the topic is
scarcely discussed in Brazil, but even in jurisdictions where the debate on algorithmic
discrimination is more present, the issues are still far from resolved. The solutions
usually put forward by academia—transparency and accountability—address some
of the concerns, but not all. The rise of machine learning has raised many
doubts about how effective these solutions can be in certain scenarios.

Brazilian legislation could adopt some of the suggestions that have been put
forward elsewhere in the Credit Information Act, but the algorithmic discrimination
debate is still far from over and will likely consume policy makers’ agenda for a long
time.
Furthermore, the debate on credit scoring poses other specific challenges for
Brazil, since the establishment of a specialized authority to enforce data protection
rules may address some of the problems concerning algorithmic discrimination. It is
clear that the ANPD’s role will be essential in many aspects: it could provide clarity
on when the GDPA should prevail over the Credit Information Act, in cases where
conflicts of interpretation may arise; it would be a technical body able to deal with
the complex issues arising out of data protection debates, some of which the existing
authorities have difficulty in handling; and it would be a source of legal certainty for
any entity subject to the Brazilian legislation that deals with data protection and
credit scoring, most notably facing debates such as when algorithmic discrimination
in fact takes place and how it should be addressed.

References

Afonso da Silva V (2005) A Constitucionalização do Direito: Direitos Fundamentais e Relações


entre Particulares. Malheiros, São Paulo
Arrow K (1973) The theory of discrimination. In: Ashenfelter O, Rees A (eds) Discrimination in
labor markets. Princeton University Press, Princeton
Barocas S, Selbst A (2016) Big data’s disparate impact. Calif Law Rev 104:671–732
Benjamin AH et al (2005) Código brasileiro de Defesa do Consumidor comentado pelos autores
do anteprojeto. Forense Universitária, Rio de Janeiro
Boyd D, Crawford K (2011) Six provocations for big data. Presented at a decade in Internet
time: symposium on the dynamics of the Internet and society. https://ssrn.com/abstract=1926431.
Accessed 11 Jan. 2022
Buchner B (2006) Informationelle Selbstbestimmung im Privatrecht. Mohr Siebeck, Tübingen
Britz G (2007) Freie Entfaltung durch Selbstdarstellung. Mohr Siebeck, Tübingen
Burkell et al (2014) Facebook: public space, or private space? Inf Commun Soc 17:974–985
Butler D (2013) When Google got flu wrong. Nature, 13 Feb 2013. https://www.nature.com/news/
when-google-got-flu-wrong-1.12413. Accessed 11 Jan. 2022
Cormen TH (2013) Algorithms unlocked. MIT Press, Cambridge
Danieli O, Hillis A, Luca M (2016) How to hire with algorithms. Harvard Bus Rev 17 October
2016. https://hbr.org/2016/10/how-to-hire-with-algorithms. Accessed 11 Jan. 2022
Diakopoulos N, Friedler S (2016) How to hold algorithms accountable. MIT Technol Rev 17
Nov 2016. https://www.technologyreview.com/s/602933/how-to-hold-algorithms-accountable/.
Accessed 11 Jan. 2022
Domingos P (2015) The master algorithm: how the quest for the ultimate learning machine will
remake our world. Basic Books, New York
Doneda D (2006) Da Privacidade à proteção de dados pessoais. Renovar, Rio de Janeiro
Doneda D, Mendes L (2014) Data protection in Brazil: new developments and current challenges.
In: Gutwirth S, Leenes R, De Hert P (eds) Reloading data protection: multidisciplinary insights
and contemporary challenges. Springer, Dordrecht, Heidelberg, London, New York
Doshi-Velez F et al (2017) Accountability of AI under the law: the role of explanation. Harvard
public law working paper, no 18-07

Edwards L, Veale M (2017) Slave to the algorithm? Why a ‘right to an explanation’ is probably not
the remedy you are looking for. Duke Law Technol Rev 16(1):18–84
Federal Trade Commission (2012) Report to congress under section 319 of the fair and accurate
credit transactions act of 2003. December 2012. https://www.ftc.gov/sites/default/files/doc
uments/reports/section-319-fair-and-accurate-credit-transactions-act-2003-fifth-interim-federal-
trade-commission/130211factareport.pdf. Accessed 11 Jan. 2022
Finley S (2018) I didn’t even meet my potential employers. BBC News 6 Feb 2018. http://www.
bbc.com/news/business-42905515. Accessed 11 Jan. 2022
Gee K (2017) In Unilever’s radical hiring experiment, resumes are out, algorithms are in
Ginsberg J et al (2009) Detecting influenza epidemics using search engine query data. Nature
457:1012–1014
Goodman BW (2016) Economic models of (algorithmic) discrimination. Paper presented at 29th
conference on neural information processing systems, Barcelona, Spain
Hurley M, Adebayo J (2016) Credit scoring in the era of big data. Yale J Law Technol 18(1):148–216
Institute for Technology & Society of Rio de Janeiro (2017) Algorithm transparency and governance:
a case study of the credit bureau sector
Kroll J et al (2017) Accountable algorithms. Univ Pennsylvania Law Rev 165:633–705
Lazer D et al (2014) The parable of Google Flu: traps in big data analysis. Science 343:1203–1205
Lessig L (2009) Against transparency. The New Republic, 9 Oct 2009. https://newrepublic.com/art
icle/70097/against-transparency. Accessed 11 Jan. 2022
Marques CL (2011) Contratos no Código de Defesa do Consumidor. O Novo Regime das Relações
Contratuais. Revista dos Tribunais, São Paulo
Mayer-Schönberger V, Cukier K (2014) Big data: a revolution that will transform how we live,
work, and think. First Mariner Books, New York
Mendes LS (2014) Privacidade, proteção de dados e defesa do consumidor. Linhas gerais de um
novo direito fundamental. Saraiva, São Paulo
Mendes LS (2015) Schutz vor Informationsrisiken und Gewährleistung einer gehaltvollen
Zustimmung. Eine Analyse der Rechtmäßigkeit der Datenverarbeitung im Privatrecht. De Gruyter,
Berlin
Moro A (2009) Statistical discrimination. In: Durlauf SN, Blume LE (eds) The New Palgrave
dictionary of economics. Palgrave Macmillan, Basingstoke
O’Neil C (2018) Personality tests are failing American workers. In: Bloomberg View, 18
Jan 2018. https://www.bloomberg.com/view/articles/2018-01-18/personality-tests-are-failing-
american-workers. Accessed 11 Jan. 2022
Papacharissi Z (2010) A private sphere: democracy in a digital age. Polity Press, Cambridge
Pasquale F, Citron D (2014) The scored society: due process for automated predictions. Washington
Law Rev 89:1–33
Pasquale F (2013) The Emperor’s new codes: reputation and search algorithms in the finance sector.
Draft for discussion at the NYU “Governing algorithms” conference
Phelps E (1972) The statistical theory of racism and sexism. Am Econ Rev 62:659–661
Pinho Mello JM, Mendes M, Kanczuk F (2018) Cadastro Positivo e democratização do crédito.
In: Folha de São Paulo, March 2018. https://www1.folha.uol.com.br/opiniao/2018/03/joao-man
oel-pinho-de-mello-marcos-mendes-e-fabio-kanczuk-cadastro-positivo-e-democratizacao-do-
credito.shtml. Accessed 11 Jan. 2022
Sandvig C et al (2014) An algorithm audit. In: Gangadharan SP (ed) Data and discrimination:
collected essays. Open Technology Institute
Schauer F (2006) Profiles, probabilities, and stereotypes. Harvard University Press, Cambridge
Silver N (2012) The signal and the noise. The art and science of prediction. Penguin, London
State v. Loomis. 881 N.W.2d 749 (Wis. 2016). In: Harvard Law Review. https://harvardlawreview.
org/2017/03/state-v-loomis/. Accessed 11 Jan. 2022
Tepedino G (1999) As Relações de Consumo e a Nova Teoria Contratual. In: Tepedino G Temas
de Direito Civil. Renovar, pp 199–216
Zarsky T (2013) Transparent predictions. Univ Ill Law Rev 2013(4):1503–1570

Laura Schertel Mendes Professor for Civil Law at the University of Brasília (UnB) and at
the Public Law Institute of Brasília (IDP). PhD in private law at the Humboldt University in
Berlin, Germany. Director of the Center for Internet and Society at IDP (CEDIS/IDP). Main areas
of research: Data Protection, Consumer Law, Algorithmic Governance. Selected publications:
Privacidade, proteção de dados e defesa do consumidor: linhas gerais de um novo direito funda-
mental. Saraiva, 2014; Schutz vor Informationsrisiken und Gewährleistung einer gehaltvollen
Zustimmung: eine Analyse der Rechtmäßigkeit der Datenverarbeitung im Privatrecht. De Gruyter,
2015; (Co-author) The Regulation of Commercial Profiling: a Comparative Analysis. European
Data Protection Law Review, v. 2, 2016, pp. 535–554; (Co-author) Discriminação algorítmica:
conceito, fundamento legal e tipologia. Revista de Direito Público, v. 16, 2019, pp. 39–64;
(Co-author) Datenschutz und Zugang zu Informationen in der brasilianischen Rechtsordnung—
Einführung. In: Indra Spiecker gen. Döhmann; Sebastian Bretthauer (Hrsg.) Dokumentation zum
Datenschutz, 2019, pp. 225ff.

Marcela Mattiuzzo Ph.D. Candidate at the University of São Paulo, Master in Constitutional
Law from the University of São Paulo, former visiting researcher at Yale Law School and former
chief of staff of the Office of the President at the Administrative Council for Economic Defense
(CADE). Partner at VMCA Advogados. Main areas of research: Data Protection, Antitrust, Algo-
rithmic Governance, Artificial Intelligence. Selected publications: Online Advertising Platforms
and Personal Data Retail: Consequences for Antitrust Law. Competition Policy International
Antitrust Chronicle, v. 7, p. 1, 2015; (Co-author) Discriminação algorítmica: conceito, fundamento
legal e tipologia. Revista de Direito Público, v. 16, 2019, pp. 39–64; (Co-author) Data Portability:
The Case of Open Banking and the Potential for Competition. In: David S. Evans; Allan Fels
AO; Catherine Tucker (Eds.) The Evolution of Antitrust in the Digital Era: Essays on Compe-
tition Policy. Competition Policy International, 2020, v. 1, pp. 183 ff.; “Let the algorithm decide”:
is human dignity at stake? Revista Brasileira de Políticas Públicas, v. 11, n. 1, 2021, pp. 342–369.
Safeguarding Regional Data Protection
Rights on the Global Internet—The
European Approach Under the GDPR

Raoul-Darius Veit

Abstract Enforcing law on the Internet is a general challenge that cannot be limited
to data protection rights. It is, however, in the context of these rights that the
complexity of the issue becomes particularly clear: the unlimited archiving of (poten-
tially false) personal information and data has initiated a debate over the possibilities
of “forgetting on the Internet”. The global cross-border dimension of (anonymous)
communication and (nontransparent) network-like flow of data on the Internet makes
it difficult not only to determine who is processing what data where and for which
purposes, but also to identify the applicable jurisdiction and even more so to enforce
national law against the communication partner and/or data processors (potentially)
acting from abroad. Against this background, a binding global data protection frame-
work appears to be the most suitable instrument to meet these challenges. While, as
will be shown, this remains a distant goal and alternative approaches to data protection
governance have not proved to be successful in the international context, significant
progress with respect to the harmonization of data protection laws has been made
at the level of the European Union: with the recently passed General Data Protection
Regulation (GDPR), the EU is setting an example of how to tackle the enforcement
issues in a regional, multi-national context. The GDPR provides for different instru-
ments to ensure the effective enforcement of its provisions not just within the EU
but also extraterritorially. The legal act is complemented by data protection agree-
ments with non-EU states, of which the former so-called EU-US Privacy Shield can
be considered the one with the greatest practical impact. The aim of this chapter is to
analyze the approach pursued by the European Union in the General Data Protection
Regulation and evaluate it with respect to its effectiveness and normative legitimacy.

R.-D. Veit (B)
University of Hamburg, Hamburg, Germany
e-mail: raouldarius.veit@gmx.de

© Springer Nature Switzerland AG 2022


M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2_18

1 Introduction

It is common knowledge that for any laws, legal rules, or rights to be effective, they
require mechanisms and institutions that safeguard their application and enforcement.
Traditionally, in Western social thought, the creation and enforcement of rules are
seen as capacities exclusively reserved for and therefore closely linked to nation-
states.1 In a democratic state, parliaments pass laws as reflections of democratically
adopted policies, and their enforcement is primarily assigned to the national executive
and judiciary. In this way, the nation-state fulfills its profound democratic obligation
to sustain the rule of law.
In the age of the Internet, however, this traditional paradigm is increasingly coming
under pressure. The current trends of globalization and digitalization affect every
aspect of life and thus every field of law: the shift from offline to online and conse-
quently from national to transnational in economic and social interaction has led to
legal issues occurring more and more frequently in cross-border constellations where
various authorities and courts each tasked with enforcing their domestic laws meet
and jurisdictional conflicts arise.
This increase of transboundary interaction, fueled particularly by the rise of the
Internet as a relatively new global network, has likewise turned governance into a
matter that requires a transnational or even global scope, which, in turn, has resulted
in a shift in the global legal order in the last twenty years: in reaction to the observation
that nation-states alone can no longer effectively respond to this new scenario due to
their territorial and jurisdictional limitations, an ever-increasing number of regional
and international organizations have come to the fore, competing for influence in the
task of providing appropriate legal frameworks in response to these developments.2
With respect to data protection law, certainly as one of the most prominent subjects
of postnational governance in general and Internet regulation in particular, the Euro-
pean Union is considered to be the most influential stakeholder with its former
core instrument, Data Protection Directive 95/46/EC (hereinafter: Directive 95/46),3
having served as a blueprint for data protection laws all across the globe.

1 Viellechner (2017), p. 106.


2 The resulting legal setting is often referred to as ‘postnational law’ and is characterized by a
variety of different layers of law coming from different levels of governance that are increasingly
interwoven without one level being ultimately superior to the other. In the search for a (theoretical)
structure for this postnational reality, the concepts of (global) constitutionalism and pluralism are
the main competitors. For this discussion as well as good arguments for a pluralist vision see Krisch
(2010).
3 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the

protection of individuals with regard to the processing of personal data and on the free movement
of such data, Official Journal L 281/31.

After a long and cumbersome legislative procedure,4 the European Union recently
gave this framework a much-anticipated update: the General Data Protection Regu-
lation (hereinafter: GDPR),5 which came into force in May 2016 and has been appli-
cable since 25 May 2018. In legal scholarship, the quality of this new legal act has
been subject to controversial debate; while some praise it as “a milestone”6 or a “role
model for other policy areas,”7 others see in it an “unconstitutional aberration” that
“will fail.”8 Prominent points of discussion are, among others, the new legal act’s
‘perforated’ structure9 as well as its (lack of) sustainability.10
This article seeks to contribute to this debate, but with a different focus; in light of
the aforementioned challenges, this paper analyzes the EU data protection regime’s
extraterritorial reach under the scope of the new GDPR and assesses whether its
instruments provide a feasible and normatively justified approach to the enforcement
of data protection law and rights in the era of the Internet.
For this purpose, the first part (Sect. 2) will outline the specific challenges to data
protection law and rights and their enforcement which have emerged in recent years.
In response to the rise of the Internet and the pressure on traditional enforcement
mechanisms along with it, various alternative approaches to Internet governance have
been put forward, however, with little influence or success in the international context
(Sect. 2.1). While a legal data protection regime with a global scope therefore appears
to be the most suitable instrument to meet the new challenges, such a global regime
remains a distant goal. However, different regional laws have been adopted, of which
the European Union’s Directive 95/46 is often seen as the most influential due to its
wide international impact and broad territorial scope, and its successor is expected to
have similar influence (Sect. 2.2). The second part (Sect. 3) will present relevant case
law of the European Court of Justice (ECJ) in the context of the external dimension
of European data protection law, which will also reveal the Court’s impact on the
provisions of the GDPR. Drawing on these findings, the third section (Sect. 4) will

4 For first-hand insights into this procedure by the rapporteur for the General Data Protection
Regulation in the European Parliament see Albrecht and Jotzo (2017), pp. 40–44.
5 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on

the protection of natural persons with regard to the processing of personal data and on the free
movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation),
Official Journal L 119/1.
6 Dammann (2016), p. 314.
7 Albrecht (2016), p. 289.
8 Giesen (2016), p. 23.
9 ‘Perforated’ in the sense that instead of setting one consistent data protection standard for all the

EU Member States, as originally intended, the legal act contains various provisions allowing for
additional concretizing, specifying, or even deviating national legislation (‘opening clauses’). It
is difficult to say how wide the national margin of maneuver is as the opening clauses are quite
different in their wording and scope. The possibilities and limits of national legislation within the
framework of the GDPR have been subject to debate since the publication of the final version of the
legal act. See, for instance, Wagner and Benecke (2016), pp. 353–361. By now, all twenty-seven
Member States have adopted national legislation to “fill” these “gaps.”
10 Rather critical: Roßnagel (2016), p. 561, Zarsky (2017), p. 995.

then focus on the GDPR and shed light on the extraterritorial mechanisms it provides,
the new territorial scope (Sect. 4.1) as well as the third-country data transfer regime
laid down in Chapter V of the regulation (Sect. 4.2), in order to assess whether the
updated European regime pursues a feasible approach to the enforcement of data
protection rights on the Internet that is normatively justified. Finally, the findings
will be summarized (Sect. 5).

2 The Specific Challenges of Enforcing Data Protection Law

While the challenges to the enforcement of law and rights in an interconnected online
world cannot be limited to a specific field of law, it is in the context of personality
and data protection rights that they are particularly evident. The anonymity in public
Internet fora offers ideal conditions for uninhibited behavior of their users, including
hate speech and slander, and raises the question of how and to what extent we should
safeguard the personality rights of the affected users,11 particularly when infringe-
ments occur in cross-border constellations (for example, on international social media
platforms) with different notions of how to balance freedom of expression and the
personality rights involved.12
With respect to privacy and data protection rights,13 the dynamic development
of new information and communication technologies that rely on global networks,
notably the Internet, and their ever-increasing use in all aspects of human life generate
massive flows of data across borders, including personal information relating to the
users. This globalization of data and information movements is complemented and

11 To counter issues such as online hate speech, libel, slander, and defamation, Germany passed the
Network Enforcement Act, Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken
(Netzwerkdurchsetzungsgesetz, NetzDG), 1 September 2017, BGBl. I p. 3352. This law introduces
an obligation for telemedia service providers to maintain a procedure that allows them to remove
or block access to content that is “manifestly unlawful” within 24 hours of receiving the complaint.
On the NetzDG see Schulz (2022), in this volume. For alternative approaches to tackle offensive
communication online with a self-regulatory emphasis see Ladeur (2013), pp. 283–286; Hartmann
(2022), in this volume.
12 Even in the EU, where legal cultures are relatively homogeneous, no consensus on this question

could be found and jurisdictions are not so clear-cut. On this matter see Márton (2016). On the
legal landscape in Brazil with comparative accounts see Lana de Freitas Castro and Pereira Winter
(2014).
13 Data protection and privacy are related to each other but at the same time distinct rights; data

protection is a predominantly European concept and is both broader and narrower than privacy. It is
narrower because it protects privacy only as far as personal information and data are concerned, and
it is broader because it does not solely serve the protection of privacy but encompasses a “complex
bundle of interests and legal positions aiming at protecting the individual in his or her sociality,”
see Albers (2014), pp. 224–229. On the distinction of the concepts in the jurisprudence of the ECJ
and the European Court of Human Rights see Kokott and Sobotta (2013), pp. 222–228.

accelerated by the constant and rapid emergence of new data processing technolo-
gies (such as big data and its algorithm-based analysis) and appliances (‘Internet of
Things’ or rather ‘Internet of Everything’) that notwithstanding their obvious benefits
also bear new challenges and (at least in part) require new regulatory approaches.14
In the social dimension, it is safe to assert that we find ourselves in an information
society where the distinction between the public and private domain has become
blurred and individuals are no longer mere ‘victims’ of the data collectors but are
actively involved in the generation and cross-border transfer of data by shifting
everyday activities into the online world.
While the necessity of the free movement of data and information across borders
for the expansion of international trade and cooperation is generally undisputed,15 this
new digital reality also presents new risks and dangers for the individual’s personal
rights and freedoms. The ubiquitous and no longer point-to-point but network-like
flow of information and data across the globe naturally goes hand in hand with
the individual to whom the information relates losing knowledge and control over
who is processing which data for what purposes—and to whom and for how long
this personal information and data shall be accessible.16 The lack of transparency
of the collection, processing, and (potentially transboundary) transfer of personal
data and information impedes the free development of the individual; it encourages
self-censorship and “social cooling”17 and thereby also entails negative impacts on
democratic processes.18 The recent data scandal involving Facebook and Cambridge

14 Traditional (European) data protection paradigms such as the focus on personal data as well as
the principle of purpose limitation appear to be opposed to the rationalities of, for example, big
data analysis. For its (possible) incompatibility with the GDPR see, for instance, Zarsky (2017),
pp. 1004–1018. For constructive ideas in this regard see Ladeur (2013), pp. 286–287, who suggests
that the regulatory focus of data protection laws should no longer be on the individual, but on
networks and the collective effects of the processing of data flows. Also see Broemel and Trute
(2016), p. 50.
15 The GDPR acknowledges this in recital 101. More generally, Article 1(1) GDPR reflects the dual

objectives of EU data protection law: the protection of personal data and the free movement of such
data. Likewise, Article 3 VIII of the Marco Civil da Internet lays down the freedom to do business
as a principle underlying Internet governance in Brazil.
16 The easy accessibility of potentially outdated or even false personal information and data for an

indefinite period of time, especially through Internet search engines, has initiated a debate over the
need and possibilities of ‘forgetting on the Internet.’ This debate was fueled by the EU Commission’s
GDPR proposal that in its Article 17 provided for a ‘right to be forgotten and to erasure.’ The matter
attracted further attention when the European Court of Justice in its Google judgment also used
this terminology, see ECJ, C-131/12, Google Spain v González, judgment of May 13, 2014, paras
89–99, accessible under http://curia.europa.eu. This ruling was received quite controversially in
German and Brazilian scholarship. However, the right to be forgotten is not to be misunderstood
as being just a synonym for the right to deletion. For closer consideration of this subject see Sarlet
(2022), in this volume, and Schimke (2022), in this volume.
17 Social cooling as a result of a latent fear of being surveilled was the main topic at the last congress

of the Chaos Computer Club, see Bruehl and Tanriverdi (2017).


18 Social interaction—both with the state and other individuals—is determined by what the inter-

acting parties know about each other. When one party has to fear a profound informational imbal-
ance, they will be inhibited in the outright expression of their thoughts and opinions, which can, in
turn, affect democratic processes. See Boehme-Neßler (2015), pp. 1282–1287.

Analytica has demonstrated the severity of the repercussions that the uncontrolled,
nontransparent collection and analysis of personal information and data can have. In
light of the Snowden revelations concerning the scale of global surveillance practices
of various states that have fueled the adoption of legal responses in various regions
of the world, including the EU19 and Brazil,20 this lack of transparency and loss of
control gain yet another dimension that is well captured by Orwell’s famous metaphor
of Big Brother potentially eavesdropping at any moment:
There was of course no way of knowing whether you were being watched at any given
moment. How often, or on what system, the Thought Police plugged in on any individual
wire was guesswork. It was even conceivable that they watched everybody all the time.21

Data protection is a complex field of law that encompasses a bundle of protected


interests, but if understood and designed appropriately, it is an important tool to coun-
tervail the aforementioned developments while also taking into account conflicting
rights. In order to be effective, data protection regulation must, inter alia, provide
basic limits to data collection and processing as well as transparency and control
mechanisms. Only when such legal guarantees are in place is it possible to estimate
what information is collected in which contexts, how it is used, where it is transferred
to, which negative consequences its use might entail and, eventually, also to exercise
one’s rights.22
However, despite this reality, the legal standards for the protection of personal data
differ drastically, both conceptually and substantively, from country to country,23 a
setting that contrasts with the borderless and ubiquitous flow of data, making the
determination of the applicable law vital not only for data subjects but also for data
processors. While the former have a legitimate interest in knowing when, where,
and for what purposes personal information relating to them is being processed and,
in case of a violation of their rights, what redresses are available in terms of where
and to whom they can turn, the latter could see themselves confronted with a broad

19 In David Bernet’s film “Democracy—Im Rausch der Daten,” a documentary about the legislative
process of the GDPR, Jan Philipp Albrecht states that Snowden’s disclosures were crucial to the
adoption of the law.
20 Both the lawmaking process of the Brazilian Internet Bill of Rights (Marco Civil da Internet)

and the Data Protection Draft Bill 5,276/2016 were impacted by the Snowden revelations. For
insights into the legislative process of the Marco Civil and the ‘Snowden effect’ on it see Rossini
et al. (2015), pp. 5–7. Draft Bill 5,276/2016 was one of three different draft bills on data protection in
Brazil, and it was advanced thanks to the Federal Ministry of Justice, which returned to prioritizing
the issue as a result of the Snowden leaks. For this, as well as for an overview on the different draft
bills, see de Souza Abreu et al. (2016) pp. 18–29. The General Law on the Protection of Personal
Data (Lei Geral de Proteção de Dados Pessoais) is based on Draft Bill No. 5,276/2016, but also
contains elements of the other drafts. For more detail see below Footnotes 52, 53.
21 Orwell (2016), p. 3.
22 Albers (2014), pp. 227–229.
23 For an overview of existing data protection laws see Greenleaf (2017), pp. 10–13. This

diversity of legal frameworks is, at least in part, rooted in the different models of data protection
that are commonly used across the globe and can be distinguished as comprehensive, sectoral,
self-regulatory, and technology-based, see Curtiss (2016), p. 106.

array of (sometimes conflicting) legal provisions with which they should be eager
to comply in order to avoid sanctions. This all illustrates the urgent necessity of
effective international data protection governance.

2.1 Alternative Approaches to Data Protection Governance

These technical, economic, and social developments all call for a solid, and most
importantly, transnationally effective response to ensure privacy and data protection
rights in the digital sphere. However, Internet governance is generally a complex
matter as the Internet and law have opposite structural characteristics; while the
Internet is global and rather dynamic, law is traditionally national and slow to adapt
to such changes. The most obvious and also most developed approach to
dealing with this tension between the global dimension of the Internet and the limited
territorial reach of (national) law is to place domestic or regional legal instruments on
the international platform and to work toward a legal framework with a global scope.
However, since the rise of the Internet, different alternative suggestions regarding its
governance have been put forward, two of which will be presented briefly, since they
could also be considered in the context of international data protection regulation.
One relatively early idea as to how to respond to the rationalities of the new
global network was to establish a separate (self-regulatory) regime for
cyberspace as a new ‘fourth legal dimension’ with its own universally applicable
rules and voluntary arbitration, irrespective of (national) jurisdictional boundaries.24
While such an approach is indeed innovative in the sense that it breaks with tradi-
tional governance paradigms and also offers solutions to the problems of conflicting
jurisdictions and the diversity of notions about privacy and data protection, it is in line
with neither the political nor the legal reality and also raises questions about legiti-
macy. Establishing a separate jurisdiction for the Internet with its own institutional
control would require national states and other political entities to delegate not only
their legitimate democratic power but also their obligation to uphold the rule of law
(at least in part) to private institutions. Regardless of the question of its desirability,
such an institutional shift is more than unlikely, particularly since nations and other

24 For that (early) idea see Johnson and Post (1996), p. 1367 as well as Mayer (1996), pp. 1789–
1790. On a similar note, but more recently and in the specific context of data protection law,
the suggestion was made to install self-organized private institutions (‘information brokers’) as
representatives of the hybrid public–private interests of Internet users which transcend the users’
individual limited privacy concerns and put the focus on the trans-subjective components of data
processing on the Internet, see Ladeur (2013), p. 287. This idea deserves further contemplation but
it also raises questions about legitimacy. Possibly, the decisions of these self-organized institutions
could ultimately be subject to the jurisdiction of state courts in order to enhance the democratic
legitimacy of the institutions. This way, the democratically constituted political entities would not
entirely outsource their (often) constitutional obligation to protect their citizens.

political entities have already passed extensive Internet laws, implicitly rejecting the
idea that the Internet is a field beyond the scope of their power.25
Another alternative approach to reconciling the conflicting characteristics of the
Internet and law is to restructure the former along the physical national borders
by imposing territorial restrictions through technical means (‘re-territorialization of
the Internet’)—a suggestion with a trajectory quite contrary to granting the Internet
its own jurisdiction. The interim injunctions of a French court in the Yahoo! case in
200026 first sparked the discussion about using the techniques of zoning and geoloca-
tion to ‘re-territorialize’ the global network.27 However, the feasibility and accuracy
of these techniques were disputed soon after, and they are considered ineffective
because the restrictions they impose can easily be circumvented.28
As a consequence of Edward Snowden’s disclosures about the spying practices
and techniques especially in the U.S., the idea to restructure the Internet along (supra-)
national lines was reignited; the reactions to the surveillance scandal were particularly
vigorous in Germany and Brazil since the confidential documents revealed that top-
ranking politicians including Chancellor Merkel and President Rousseff were directly
affected.29 In Germany, Telekom AG suggested establishing ‘national routing’ or
‘Schengen routing’, meaning that when both sender and recipient are located within
national or EU territory, the data will not be routed via the U.S.30 Angela Merkel even
went so far as to demand a separate communications network for the EU altogether.31
Although the argument in favor of such ‘data nationalism’32 seems to be evident—
more protection against third-country spying—the idea is no longer being pursued,
at least not in the sense in which it was originally proposed.33 National routing, like
the techniques of zoning and geolocation, would lead to a decentralization of the
Internet, which is entirely contradictory to its open and freely accessible structure.
This openness is not only crucial to the dynamic processes of digitalization and

25 For this reason, among others, the idea of the Internet as its own territory was rejected quite soon
after it was first suggested. See Miller (2003), p. 234.
26 Tribunal de Grande Instance de Paris, N° RG: 00/05308, LICRA v. Yahoo! Inc. and Yahoo France,

Ordonnance de référé de 22 mai 2000 and Ordonnance de référé de 20 November 2000.


27 Zoning means the attempt to transfer offline geographical structures onto the Internet and limit

its access accordingly, and geolocation is a procedure that makes it possible to assign IP addresses
to geographical locations, see Hoeren (2007), pp. 3–6.
28 Hoeren (2007), pp. 5–6. That is not to say that such techniques are not employed at all in the

context of data protection. Google, for instance, uses geoblocking to ensure that de-referencing is
effective in all Member States, see ECJ, C-507/17, CNIL, Judgment of 24 September 2019, para
32, accessible under http://curia.europa.eu.
29 This also sparked the idea for a new exclusive Internet cable between Brazil and Europe, see

Cáceres (2014).
30 For a detailed description of the progression of this debate (not just) in Germany see Geminn

et al. (2016), pp. 23–59.


31 Vasagar and Fontanella-Khan (2014).
32 For the term as well for the historical roots of the (political) idea see Kuner (2015a), pp. 2089–

2092.
33 For chances of the re-territorialization paradigm surviving see Büttner et al. (2016), pp. 149–151.

a central value for various social realities,34 but it can also catalyze democratic
processes, as became apparent in the Arab Spring. Finally, the Snowden revelations
have shown that large-scale data-sharing is taking place between EU intelligence
services and those of third countries,35 which casts further doubt on the effectiveness
of re-territorializing the Internet.
In conclusion, despite their theoretical appeal, neither the idea of an entirely
separate jurisdiction for the Internet nor its re-organization along national lines have
had a lasting influence on Internet governance in general and international data
protection governance in particular.

2.2 The Absence of a Global Legal Regime and the Dominance of the EU

In the absence of feasible alternatives that provide effective protection of privacy and
data protection rights in an environment of global flows and ubiquitous processing of
personal data, the only adequate response to the enforcement issues outlined above
appears to be a harmonized, globally applicable and enforceable legal framework.
However, despite calls from a broad variety of stakeholders36 and with the number
of national omnibus data protection laws growing constantly,37 there is, as it stands,
no binding and therefore effective global regime in place38 and there is little chance
that this objective will be achieved in the foreseeable future. The main obstacle to

34 Büttner et al. (2016), p. 149.


35 Kuner (2017) pp. 914–915 with further references.
36 Google’s Global Privacy Counsel Peter Fleischer demanded such standards, suggesting that the

APEC Framework is “the most promising foundation” to build on, see Fleischer (2007). At their
annual conferences the world’s privacy and data protection commissioners have passed various
resolutions dealing with international data protection standards and cross-border enforcement,
among them the 2009 Madrid Resolution on “International Standards on the Protection of Personal
Data and Privacy”, accessible at https://globalprivacyassembly.org/wp-content/uploads/2015/02/
The-Madrid-Resolution.pdf (accessed 11 Jan. 2022). Its supporters also include civil society coali-
tions that have organized conferences discussing global privacy standards, see, for example, http://
thepublicvoice.org/2009/07/global-privacy-standards-in-a-global-world-1.html (accessed 11 Jan.
2022).
37 For an overview of this development see Greenleaf (2017), pp. 10–13. In 2018, Brazil passed

a much-anticipated General Law on Data Protection (Lei Geral de Proteção de Dados), which is
clearly inspired by the GDPR and also has a comprehensive approach. Before this law was adopted,
the legal landscape was rather piecemeal with provisions in various sectoral laws. On this see
Doneda and Mendes (2014), pp. 3–20.
38 The UN has taken several steps toward a global framework: in 1990, the Guidelines for the

Regulation of Computerized Personal Data Files (Resolution 45/95) were adopted and in 2013,
the General Assembly passed a resolution affirming the application of the right to privacy online
(Resolution 68/167). In addition, the UN Human Rights Commission is working to promote the right
to privacy in the digital age and appointed a Special Rapporteur on the right to privacy in 2015, see
http://www.ohchr.org/EN/Issues/DigitalAge/Pages/DigitalAgeIndex.aspx (accessed 11 Jan. 2022).
However, all of these measures are non-binding and have had little practical impact. Nevertheless,

the achievement of this goal is the diversity of ideas about privacy and data protec-
tion across the globe, which is not only rooted in different cultural and historical
backgrounds but is also owed to particular political and economic interests.39
Significant progress with regard to the harmonization of data protection standards,
however, has been made at the regional, supranational level. Among the various
frameworks from different multinational organizations that have been adopted since
the 1980s,40 the European Union’s first data protection act, Directive 95/46, stands
out in two ways. Firstly, by way of its provisions on international data transfers, the
Directive introduced a ‘border control’ of data flows and thereby indirectly expanded
its influence even beyond the territory of the EU.41 Secondly—and most likely as
a consequence of its extraterritorial reach—, this legal act has had an outstanding
impact on privacy and data protection legislation throughout the world, serving as
the basis for most of the international privacy laws in the last two decades,42 and is
considered to be “the most influential body of data protection worldwide.”43
However, regardless of its unprecedented role as an international benchmark for
privacy and data protection legislation, Directive 95/46 was drafted at a time when the
World Wide Web was still in its infancy and people were not aware of, and could not
foresee the aforementioned challenges. For obvious reasons, including its territorial
scope,44 an update of the framework was therefore overdue.
With the European General Data Protection Regulation, which entered into force in May 2016 and has applied since 25 May 2018, this update has arrived. This

some authors promote the UN guidelines as the most promising basis for an international regime,
see de Hert and Papakonstantinou (2013), pp. 323–324.
39 While in the EU both privacy and data protection are constitutionally guaranteed fundamental

rights that are assumed to also owe their existence to the experiences of Nazi surveillance during
World War II, there are no explicit equivalents in, e.g., the U.S. constitution, Cunningham (2016),
p. 422. The strong opposition to a comprehensive data protection framework in the U.S. derives
from various factors, including the high value placed on the principles of free speech as well as the
traditional reliance on market-based solutions, see Cunningham (2013), p. 447, who also asserts that
the U.S. is nevertheless slowly conforming to the EU approach. The profitable and rapidly growing
data aggregation industry in the U.S. most likely also adds to the reluctance of the government to
establish a comprehensive regime.
40 The most important regional privacy and data protection frameworks are the OECD Privacy Guidelines, the Council of Europe’s Data Protection Convention 108, the Asia Pacific Economic Cooperation (APEC)’s Privacy Framework, and the Economic Community of West African States
(ECOWAS) Supplementary Act on Personal Data Protection. For the history and gradual devel-
opment of the regional and international sources of data protection law, see de Hert and Papakon-
stantinou (2013), pp. 274–288. In May 2018, the Committee of Ministers of the Council of Europe
adopted an amending protocol to modernize the Data Protection Convention 108, which is now
open for signature. The consolidated text of the modernized treaty is available online at https://search.coe.int/cm/Pages/result_details.aspx?ObjectId=09000016807c65bf (accessed 15 Dec. 2021).
41 See Chapter IV of the Directive.
42 Cunningham (2016), pp. 426–428. Also see Greenleaf (2012), pp. 72–79.
43 Kuner (2014), p. 55. With reference to Directive 95/46, the EU has also been referred to as “the

decisive leader not only for EU Member Nations, but worldwide,” Cunningham (2013), p. 427.
44 Kranenborg (2014), p. 241 rightly points out that the scope did not necessarily cover modern data

processing activities such as cloud storage.


Safeguarding Regional Data Protection Rights on the Global Internet … 455

new (general) Regulation is the main instrument of the European data protection
law reform45 and shall apply equally in every Member State,46 replacing not only
Directive 95/46 itself but also to a great extent the national frameworks that are to
this day largely determined by it.
Unsurprisingly, EU representatives expect this new legal act to have an influence on the global development of data protection law similar to that of its predecessor;
some even go so far as to claim that it will “change the world as we know it.”47
In scholarship, the majority tends to share this view and to confirm that the GDPR
has a highly influential role in the global context,48 while others criticize it for its
lack of pluralistic openness and assert that the EU approach is increasingly insular.49
However, in light of the broad influence of the European standards on national data
protection laws worldwide,50 there is no reason to fear that the EU will become
isolated in international data protection governance. On the contrary, it is the states
that refuse to adopt comprehensive regimes that increasingly find themselves on the
sidelines.51 The General Law on Data Protection (Lei Geral de Proteção de Dados,
LGPD)52 in Brazil can be considered an indication of the continuing influence of the
EU regime also under the GDPR. The content and structure of this comprehensive
framework are to a great extent clearly inspired by the European legal act.53

45 Along with the GDPR, the EU has passed two more legal acts to regulate data protection in
specific sectors: Directive (EU) 2016/680 for the area of crime investigation and prosecution and
Directive (EU) 2016/681 for the field of passenger name record (PNR) data. As part of the Digital
Single Market policy of the Juncker Commission, a proposal for a third act, the ePrivacy regulation,
has been published, which is intended to complement the GDPR with respect to data protection in
electronic communication, see COM(2017) 10 final.
46 The legal act, however, contains numerous ‘opening clauses’ that allow or even demand additional

legislation by the Member States. On this see above, Footnote 9.


47 Albrecht (2016), pp. 287–289.
48 See, for instance, Safari (2017), pp. 809–848 as well as Curtiss (2016), pp. 100–105, arguing that

the GDPR’s worldwide influence will be most visible in the private sector.
49 Kuner (2014), p. 65 demands that the GDPR should include a provision requiring the Commission

to consider the enactment by third countries of regional or international data protection instruments
when assessing the adequacy of their standards. This statement, however, disregards Article 45(2)(c)
GDPR, which explicitly mentions the international commitments or other obligations entered into
as a criterion to be taken into account when assessing the adequacy.
50 See Greenleaf (2012), pp. 72–77.
51 For the example of the U.S. see Cunningham (2013), pp. 452–453.
52 Lei Nº 13.709, de 14 de Agosto de 2018 (providing for the protection of personal data and amending Lei nº 12.965, de 23 de abril de 2014, the Marco Civil da Internet), Official Journal of the Union (Diário Oficial da União), section 1, No. 157, 15 August 2018, pp. 59–64.
53 Before the enactment of the LGPD, there were three draft bills on data protection from different

authors competing for adoption: bills No. 4.060/2012, 330/2013, and 5.276/2016, of which bill
No. 5.276/2016, an initiative by the Ministry of Justice, was the one that was the closest to the
EU approach. The content of the adopted LGPD mainly follows the patterns of former draft bill
5.276/2016, but also contains elements of draft bills 4.060/2012 and 330/2013, which were all
merged during the legislative process. On this see Mangeth and Marinho Nunes (2018).

While the GDPR mostly appears to be little more than an updated version of
Directive 95/46, it also comes with substantive and institutional innovations, among
them concepts of non-European origin such as privacy impact assessments as well
as privacy by design and by default,54 new individual data protection rights,55 but
also new mechanisms to ensure more effective transborder enforcement within the
EU.56 With respect to its extraterritorial application, the GDPR continues with the
patterns of the Directive but also introduces new features, and the new penalty regime
certainly discourages indifference. However, before these extraterritorial instruments
are presented and their feasibility and legitimacy are assessed (Sect. 4), an overview
of the ECJ’s jurisprudence on the external dimension of EU data protection law will
be given, as it shaped the regime in a major way and also influenced the content of
the GDPR.

3 The External Dimension of EU Data Protection Law


in the Jurisprudence of the European Court of Justice

In general, the development of the EU data protection regime and in particular of


the GDPR cannot be fully understood without taking into account the jurisprudence
of the ECJ, which has not only shaped Directive 95/46 but has also determined the
content of the new legislation.
In recent years, the European Court of Justice in Luxembourg has become very
active in the area of fundamental rights protection, and it is particularly in the field
of data protection where this development can be observed.57 While the majority
of the decisions in this context concerned the substantive content of both the data
protection legislation and the fundamental rights to private life and to data protection
as guaranteed in Articles 7 and 8 of the European Charter of Fundamental Rights
(CFR),58 there are a number of judgments in which the Court also engaged in the
extraterritorial dimension of EU Data Protection Law.

54 See Articles 35 and 25 GDPR. The introduction of these concepts does indeed reflect the Regu-
lation’s consideration of and openness to privacy policies from all over the world, see Schwartz
(2013), p. 2002.
55 Such as the ‘right to be forgotten’ or the right to data portability, see Articles 17 and 20 GDPR.
56 These are the introduction of a one-stop shop with a lead supervisory authority (Article 56 GDPR)

as well as the cooperation and consistency mechanisms (Articles 60 and 63 GDPR). On these as well as
the entire institutional control system under the GDPR see Schantz and Wolff (2017), pp. 295–335.
57 For an overview of the ECJ’s jurisprudence in the field of data protection before and after the

coming into force of the Charter of Fundamental Rights see Skouris (2016), pp. 1359–1364.
58 Charter of Fundamental Rights of the European Union, 2012/C 326/02, Official Journal C 326/391.

For a closer look at the substantive content of the fundamental right to data protection guaranteed in Article 8 CFR see Reinhardt (2022), in this volume.

In Lindqvist,59 the ECJ first dealt with the third country data transfer regime under
Directive 95/46 and took a rather cautious stance. The judgment from 2003 held that
it does not constitute a data transfer to a third country60 when an individual in a
Member State uploads personal data onto a website that is also hosted within the
EU, even if this website is accessible to anyone with an Internet connection. In its
reasoning, the Court established that if in such a case a third country data transfer
were to occur, the exception of Article 25 Directive 95/46 would turn into a general
rule for the Internet, making the whole network subject to European Data Protection
Law.61
Little of this reserved stance, however, was apparent in the subsequent judgments
of the Court in the matter. In Digital Rights Ireland,62 the Court declared the former
Data Retention Directive 2006/2463 to be invalid due to violations of Articles 7 and 8 CFR; the external dimension of European data protection law was therefore not
directly at issue. However, the Court also based the invalidity partly on the fact that
the Directive did not require the data to be stored within the EU, which, in turn, meant
that the constitutionally required control, as laid down in Article 8(3) CFR, would
not be ensured.64 This passage of the judgment indeed implied that the Court took the
view that the constitutional requirement of institutional control of data processing
deriving from Article 8 CFR also applies to data that is transferred from the EU to
third countries.65
A landmark judgment that was published only one month later and attracted global
attention, inter alia, due to its implications for the scope of European data protection
law, is Google Spain.66 In this judgment, the Court established a broad territorial
scope of Directive 95/46, finding that it applies to the activities of a search engine
operated by an enterprise that is seated outside EU territory but has an establishment
in an EU Member State promoting and selling advertising space and directing its
services toward the population of that Member State.67 According to the Court, “the
European Union legislature sought to prevent individuals from being deprived of
the protection guaranteed by the directive […] by prescribing a particularly broad

59 ECJ, C-101/01, Lindqvist, Judgment of 6 November 2003, accessible under http://curia.europa.eu.


60 In the sense of Article 25(1) Directive 95/46.
61 ECJ, Footnote 59, para 69.
62 ECJ, C-293/12 and C-594/12, Digital Rights Ireland, Judgment of 8 April 2014, accessible under

http://curia.europa.eu.
63 Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on

the retention of data generated or processed in connection with the provision of publicly available
electronic communications services or of public communications networks and amending Directive
2002/58/EC, L 105/54.
64 ECJ, Footnote 62, para 68.
65 Kuner (2014), p. 63. The Court confirmed this interpretation in later judgments, particularly in

Schrems, see below, Footnote 75.


66 ECJ, C-131/12, Google Spain v AEPD and González, Judgment of 13 May 2014, accessible

under http://curia.europa.eu. On another major aspect of the judgment see above, Footnote 16.
67 ECJ, Footnote 66, para 55.

territorial scope.”68 Methodologically, it based its broad interpretation on the fact that
the Directive’s goal is to guarantee the effective protection of fundamental rights,69
reaffirming the extraterritorial dimension of the fundamental rights to private life and
to data protection first established in Digital Rights Ireland. However, the ECJ did not
make clear how far exactly this “broad territorial scope” reaches or where its limits
are. In its recent judgment in CNIL,70 the Court specified the territorial boundaries of
the “right to de-referencing” as established in Google Spain. It held that access by internet users – including those outside the Union – to the referencing of a link referring to information regarding a person whose center of interests is situated in the Union is likely to affect that person within the Union itself and that this consideration may, in
general, justify the existence of a competence of the EU legislature to lay down an
obligation for search engine operators to carry out a de-referencing on all versions
of the search engine.71 Following an analysis of both Directive 95/46 and the GDPR,
however, the Court came to the conclusion that, currently, EU data protection law does
not contain such an obligation.72 Nevertheless, such de-referencing shall, in principle,
be carried out in respect of all the Member States73 and, what is more, national
authorities of the Member States remain competent to order, where appropriate and
according to national standards of protection, search engine operators to carry out a
de-referencing concerning all versions.74
Certainly one of the most significant rulings of the ECJ in relation to the third country data transfer regime of the EU is Schrems,75 which, much like Google Spain,
sparked a global public reaction. The two main holdings of the Court were (1) that an
adequacy decision of the Commission pursuant to Article 25(6) Directive 95/46 does
not preclude national data protection authorities from overseeing data transfers to
third countries and (2) that the Commission’s Decision 2000/52076 establishing that

68 ECJ, Footnote 66, para 54.


69 ECJ, Footnote 66, para 53, later confirmed in another judgment, see ECJ, C-230/14, Weltimmo,
Judgment of 1 October 2015, para 30, accessible under http://curia.europa.eu.
70 ECJ, C-507/17, CNIL, Judgment of 24 September 2019, accessible under http://curia.europa.eu,

see also above, Footnote 28.


71 ECJ, Footnote 70, paras 57 and 58, emphasis added.
72 ECJ, Footnote 70, paras 64 and 65.
73 ECJ, Footnote 70, para 66.
74 ECJ, Footnote 70, para 72. The ECJ hereby delimits its scope of jurisdiction with a view to

the variety of national values among the Member States, more specifically the potentially divergent
interests of the public in accessing information. Constitutionally, this is in line with, inter alia, Article
53 CFR according to which nothing in the Charter, including Article 8 CFR, shall be interpreted
as restricting human rights and fundamental freedoms as recognised, in their respective fields of
application, inter alia, by the Member States’ constitutions.
75 ECJ, C-362/14, Maximillian Schrems v Data Protection Commissioner, Judgment of 6 October

2015, accessible under http://curia.europa.eu.


76 2000/520/EC: Commission Decision of 26 July 2000 pursuant to Directive 95/46/EC of the

European Parliament and of the Council on the adequacy of the protection provided by the safe
harbor privacy principles and related frequently asked questions issued by the U.S. Department of
Commerce (notified under document number C(2000) 2441) (Text with EEA relevance), Official
Journal L 215.

the Safe Harbor Principles issued by the U.S. Department of Commerce77 provided
an “adequate level of protection” as required by Article 25(1) Directive 95/46 is
invalid.
In the context of their first holding, the European judges strengthened the position
of the national supervisory authorities in stating that, in accordance with Article 8(3)
CFR, they are “vested with the power to check whether a transfer of personal data
from its own Member State to a third country complies with the requirements laid
down by Directive 95/46.”78 After defining criteria for the assessment of the adequacy
of a third country’s data protection rules,79 the judges, as in Digital Rights Ireland,
invalidated the Commission’s decision because of its violation of the fundamental
rights guaranteed by Articles 7, 8 and 47 CFR.80 Their main considerations for this
finding were that the Commission’s decision did not establish that U.S. law provided
for an effective redress for EU citizens, nor for any limitation on the powers of the
U.S. public authorities to interfere with these rights.81 The Court held that legislation,
such as Decision 2000/520, that authorizes storage of all the personal data of all
the persons whose data has been transferred from the EU to the U.S. without any
differentiation, limitation, or exception being made, is not limited to what is strictly
necessary which, however, is a requirement of Article 8 CFR.82 It further established
that the generalized, unlimited access by public authorities to the content of electronic
communications compromises the core of the fundamental right to private life laid
down in Article 7 CFR,83 but remained silent as to the essence of the right to data
protection.
On the same note, the Court once more reaffirmed the extraterritorial dimension
of Article 8 CFR in finding that the adequacy assessment “implements the express
obligation laid down in Article 8(1) of the Charter to protect personal data and […]
is intended to ensure that the high level of that protection continues where personal
data is transferred to a third country.”84

77 Accessible online at https://rm.coe.int/16806af271 (accessed 11 Jan. 2022).


78 ECJ, Footnote 75, para 47.
79 ECJ, Footnote 75, paras 70–78. These will be presented in detail in the context of the indirect

extraterritorial effects of the GDPR, see below Sect. 4.2.1.


80 ECJ, Footnote 75, paras 91–98.
81 ECJ, Footnote 75, paras 87–89.
82 ECJ, Footnote 75, paras 91–93. Referring to the Commission’s own assessment of the situation,

the ECJ also found the U.S. authorities’ access to data to be incompatible with the EU standards of
purpose limitation and proportionality that are also set out in Article 8 CFR, see ibid. para 90.
83 ECJ, Footnote 75, para 94. Interestingly, the European Court of Human Rights (ECtHR) recently

ruled that it does not necessarily have to be the content of data that leads to an infringement, but
that the combination of metadata may give as much information about an individual. See below,
ECtHR, Footnote 208.
84 ECJ, Footnote 75, para 72. In another passage, the Court states that if the supervisory authorities

were bound by an adequacy decision, persons whose data has been or could be transferred to the
third country concerned would be denied their rights guaranteed by Article 8(1) and (3) of the
Charter, see ECJ, Footnote 75, para 58, and thereby stresses the extraterritorial dimension of the
institutional control.

The demise of the Safe Harbor Agreement, however, does not mark the end of
international data transfer agreements of the EU being challenged before the ECJ.
The EU-U.S. Privacy Shield, the successor of the Safe Harbor Agreement,85 has
already been brought before the Luxembourg Court by privacy activist groups.86
Moreover, the Irish Commercial High Court in October 2017 rendered a judgment
in which it raised doubts as to the validity of the Standard Contract Clauses (SCC)
decisions of the Commission with respect to data transfers from the EU to the U.S.,
and announced the referral of the matter to the ECJ for a preliminary ruling.87 After considering all of the submissions by the parties to the proceedings,88 the Justice referred the case to the Court in May 2018,89 shortly before the GDPR became applicable.
In July 2020, the ECJ published its decision in this case, another landmark judg-
ment, which in scholarship is often referred to as Schrems II.90 In accordance with the
extensive questions referred, the Court did not limit itself to ruling on the validity of
the SCC Decision91 but decided on other substantive aspects of the external dimen-
sion of EU data protection law as well, including the interpretation of Article 46
GDPR as well as the validity of the Commission’s Decision 2016/1250 implementing
the EU-U.S. Privacy Shield.92
In the context of its second holding, the ECJ stated that the provisions of the GDPR
dealing with data transfers to third countries (Chapter V) “are intended to ensure the
continuity of [the level of protection guaranteed by the GDPR] where personal data
is transferred to a third country” and that this level of protection must be guaranteed

85 For more detail on these agreements see below, Sect. 4.2.2.


86 Two separate actions for annulment of the Commission’s decision have been brought, one by the
group Digital Rights Ireland Ltd, the other by La Quadrature du Net et al. The former, however,
was held inadmissible by the Court, see ECJ, T-670/16, Digital Rights Ireland v Commission, Order
of 22 November 2017, accessible under http://curia.europa.eu. The latter was declared void, see
ECJ, T-738/16, La Quadrature du Net, Action of 25 October 2016, accessible under http://curia.eur
opa.eu (accessed 15 Dec. 2021).
87 The High Court Commercial, Case No. 2016/4809 P., The Data Protection Commissioner v

Facebook Ireland Ltd. and Maximilian Schrems, Judgment of 3 October 2017.


88 Carolan (2018).
89 ECJ, C-311/18, Reference for a Preliminary Ruling of 9 May 2018, Data Protection Commis-

sioner v Facebook Ireland Limited, Maximilian Schrems, accessible under http://curia.europa.eu.


90 ECJ, C-311/18, Data Protection Commission v Facebook Ireland Ltd, Maximillian Schrems,

Judgment of 16 July 2020, accessible under http://curia.europa.eu. Before responding to the questions referred, the judges explained why those questions must be answered in the light of the provisions of the GDPR rather than those of the Directive, see paras 77–79.
91 2010/87/: Commission Decision of 5 February 2010 on standard contractual clauses for the

transfer of personal data to processors established in third countries under Directive 95/46/EC of
the European Parliament and of the Council, Official Journal, L 39/5.
92 Commission Implementing Decision (EU) 2016/1250 of 12 July 2016 pursuant to Directive

95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided
by the EU-U.S. Privacy Shield (notified under document C(2016) 4176) (Text with EEA relevance),
Official Journal L 207/1.

irrespective of which provision of the chapter serves as legal basis for the transfer.93 In
relation to the institutional dimension, the Court further strengthened the position of
the national supervisory authorities by way of holding them competent (and obliged),
unless there is a valid adequacy decision in place,94 to suspend or prohibit a data
transfer to a third country pursuant to SCCs adopted by the Commission, if those
clauses are not or cannot be complied with in that third country and where the
controller or processor has not itself suspended the transfer.95 With regard to the
SCC Decision of the Commission, the ECJ, following an analysis of the guarantees
contained in the standard data protection clauses in the annex to the Decision, ruled
that it does not violate Articles 7, 8 and 47 CFR and is therefore valid.96 In their
reasoning, the judges state that the fact that these clauses do not bind the authorities
of third countries may, depending on the third country in question, entail the necessity
for the controller or processor established in the EU to supplement the guarantees
contained in the standard clauses, but it cannot affect the validity of the Commission’s
decision.97 Thus, the Court emphasized the individual responsibility of the controllers
and/or processors. In its last holding, however, the Court ruled that Decision (EU)
2016/1250 implementing the EU-U.S. Privacy Shield is invalid.98 After a detailed
analysis of both the provisions and recitals of the Privacy Shield Decision and the
applicable US national law, the judges came to the conclusion that the Commission,
in finding that the US ensures an adequate level of protection under the EU-US
Privacy Shield, disregarded the requirements of Article 45(1) GDPR read in the light
of Articles 7, 8 and 47 CFR.99
Another instance in which the ECJ dealt with the EU data protection regime in
the context of data transfers to third countries was in its Opinion on the draft EU-
Canada PNR agreement issued in July 2017.100 The Court was called upon by the
European Parliament pursuant to Article 218(11) TFEU101 to give its opinion on the
compatibility of the planned agreement between the EU and Canada on the transfer

93 ECJ, Footnote 90, paras 93 and 92. For more details on Chapter V and the Court’s holdings in
this context see below, Sect. 4.2.
94 If there is a valid adequacy decision and a complaint is lodged by a person, the authorities remain

competent to examine whether the transfer in question complies with the requirements laid down
by the GDPR and, if relevant, bring an action before the national courts, see para 120.
95 ECJ, Footnote 90, paras 106–122.
96 ECJ, Footnote 90, paras 137–149.
97 ECJ, Footnote 90, paras 123–136.
98 ECJ, Footnote 90, para 201. The Court found the validity of Decision (EU) 2016/1250 to be

relevant for the purposes of assessing both the obligations of the controller and recipient of personal
data transferred to a third country pursuant to the Standard Contract Clauses and also any obligations
to which the supervisory authority may be subject to suspend or prohibit such a transfer, see ECJ,
Footnote 90, para 154.
99 ECJ, Footnote 90, paras 162–199. For more details on the reasoning see below, Sect. 4.2.2.
100 ECJ, Opinion 1/15, Draft EU-Canada PNR agreement, Opinion of 26 July 2017, accessible

under http://curia.europa.eu.
101 Consolidated version of the Treaty on the Functioning of the European Union, Official Journal

C 326/47.

of passenger name record data, and it ruled that the draft, in its current form, must
not enter into force because several of its provisions are incompatible with Articles
7, 8, 21, and 52 CFR.102 While, as an international agreement generally constituting
a “legitimate basis laid down by law” as required under Article 8(2) CFR,103 the
draft, according to the judges, in many aspects fails to limit the interferences with
these rights to what is strictly necessary, among other reasons,104 because it is not
sufficiently precise as regards the scope of PNR data to be transferred105 and because
it permits the transfer of sensitive data to Canada as well as the use and retention of
such data by the Canadian authorities.106
With respect to the external dimension of European data protection law, the judges
held that Article 8 CFR “requires, inter alia, that the high level of protection of
fundamental rights and freedoms conferred by EU law continues where personal
data is transferred from the European Union to a non-member country,”107 while also restating that the means to provide for an equal standard may differ.108 To further
justify this holding, the Court referred to the Charter’s preamble in which the fourth
paragraph states the necessity “to strengthen the protection of fundamental rights
in the light of changes in society, social progress and scientific and technological
developments.”109 Furthermore, the Canadian authorities were only to be allowed
to transfer the PNR data to authorities in other third countries if those countries
provided for an essentially equivalent level of protection.110
In conclusion, the European Court of Justice has played a fundamental role both
in shaping the substantive content and expanding the external dimension of the European data protection regime. After taking a rather cautious stance at first, it subsequently widened the territorial scope of EU data protection law (Google Spain, CNIL) and increasingly reinforced the international data transfer regime (Digital Rights Ireland, Schrems I & II, and the EU-Canada PNR agreement). Throughout all the judgments, the
main consideration was the fundamental right to data protection (and its concretiza-
tion in the GDPR), which would lose its effectiveness and could easily be circum-
vented if the high standard of protection it provides ended at the European territorial
border. As will be established below, the judgments also affected the content of the

102 It was the first time that the ECJ decided on the compatibility of a draft international agreement
with the fundamental rights laid down in the Charter. Opinion 1/15 therefore has implications for
other agreements on international data transfers such as the PNR agreement with the USA.
103 ECJ, Footnote 100, paras 145–147.
104 For an overview of the other reasons given and amendments required by the ECJ see Docksey

(2017), pp. 768–773.


105 ECJ, Footnote 100, paras 155–163.
106 ECJ, Footnote 100, paras 164–167.
107 ECJ, Footnote 100, para 134.
108 ECJ, Footnote 100, para 134, with explicit reference to its findings in the Schrems case, see

above.
109 ECJ, Footnote 100, para 135.
110 ECJ, Footnote 100, para 214. The Court thereby reaffirmed what it first established in the Schrems

case.

reformed regime under the GDPR with regard to both its territorial scope and the
rules on international data transfers.

4 The External Dimension of EU Data Protection Law


Under the General Data Protection Regulation

As the analysis of the relevant ECJ case law has revealed, the external dimension of
the EU data protection regime is founded on the constitutional obligation to safeguard
the European standard and effectively enforce it in an increasingly globalized and
interconnected world.111 The EU Data Protection Directive 95/46 was one of the
first comprehensive and binding data protection frameworks to tackle the issue of
transborder data flows and thereby consider the borderless nature of the Internet. It
achieved its broad reach even beyond EU territory as well as its international success
mainly through two normative instruments: (1) its territorial scope that included
non-European data controllers when they made use of equipment situated on EU
territory112 and (2) its restrictions on personal data transfers to third countries or
entities that do not provide an adequate standard of protection.113
The new General Data Protection Regulation continues with these trends but also
further extends the external dimension of EU data protection law by introducing
additional instruments, most notably the new (extra-)territorial scope in Article 3(2)
GDPR, which has been referred to as “the single most important provision in the entire
[…] Regulation” for any non-EU party, “perhaps followed by [Chapter V] dealing
with the transfer of personal data to third countries.”114 From the perspective of data
controllers and processors worldwide who aim to interact with the European market,
the new sanction regime, which introduces fines of up to 4 percent of an enterprise’s
worldwide turnover in the case of non-compliance, makes the significance of these
provisions clear.115
In the age of the Internet, extraterritoriality in comprehensive and effective data
protection frameworks is indeed inevitable,116 and extraterritorial provisions can no
longer be rejected categorically. The research focus must be on the extent to which
a particular claim is legitimate. Therefore, the aforementioned mechanisms of the
GDPR will be analyzed and their normative legitimacy assessed below.

111 This notion is also apparent in the GDPR. See, for instance, Article 44 as well as Recitals 23
and 101.
112 Article 4(2)(c) Directive 95/46.
113 Article 25, 26 Directive 95/46.
114 Svantesson (2014), p. 71.
115 See Article 83(5) GDPR. On 21 January 2019, the French national data protection authority, CNIL, imposed a financial penalty of fifty million euros on Google for non-compliance with GDPR provisions.
116 Cunningham (2016), p. 421.
464 R.-D. Veit

The instruments in the GDPR can be divided into those with direct and those with
indirect extraterritorial effect.

4.1 Direct Extraterritorial Effect: The Marketplace Principle

The central innovation that the General Data Protection Regulation introduces to the
external dimension of EU data protection law is its widened territorial scope, which is
defined in Article 3 GDPR. Article 3(1) GDPR continues the pattern of Article
4(1)(a) Directive 95/46 and states that the Regulation applies to data controllers and
processors with an establishment in the Union if the processing takes place in the
context of the activities of this establishment. However, the new provision now makes
explicit that it is irrelevant whether the processing takes place in the Union or not.117
Article 3(3) GDPR, similarly to Article 4(1)(b) Directive 95/46 before it, extends
the territorial scope to data controllers who are established in places where Member
State law applies by virtue of public international law.
The actual innovation of the GDPR is laid down in Article 3(2) GDPR. It has been
argued that this provision “will likely bring all Internet services providers […] under
the scope of the EU Regulation as soon as they interact with data subjects residing
in the European Union.”118 It produces direct extraterritorial effect by extending the
Regulation’s scope to data controllers and processors established outside the
Union where their processing activities relate to (a) the offering of goods or
services to data subjects who are in the Union or (b) the monitoring of their
behavior as far as that behavior takes place within the Union. By abandoning the
hitherto existing requirement of the use of equipment situated on EU territory,119 the
European legislator went beyond a mere territoriality-based approach and established
a ‘marketplace principle’ which focuses on the scope of a controller’s and processor’s
targeting of data subjects in the European market. Where this provision applies, it
is complemented by the obligation for non-EU based controllers and processors to
designate a representative in the Union who shall be mandated to be addressed by
supervisory authorities and data subjects,120 which creates a territorial contact point
for more effective enforcement.
Up until the judgment of the ECJ in the Google case, the territorial scope of
Directive 95/46 was subject to a heated debate centered around its (in)adequacy
in the Internet setting; in this context, it was deemed “overly […] complex to the

117 This clarification can be seen as an approval of the ECJ’s interpretation of Article 4(1)(a)
Directive 95/46 in the Google Spain judgment, see ECJ Footnote 66, paras 50–60.
118 Svantesson (2014), p. 71. However, this statement was made with regard to the wording of the provision in the EU Commission’s proposal. The finally adopted version narrowed the scope significantly and, taking this into account, the statement would have to be amended to: “as soon as they interact with data subjects who are in the Union within the Union”.
119 See Article 4(1)(c) Directive 95/46.
120 Article 27 GDPR.

degree of being dysfunctional.”121 The Directive was drafted in the early 1990s, that
is, at a time when the Internet as a global information and communication network
was still in its infancy. Naturally, its provisions, including those on the territorial
scope, did not yet reflect the digital reality of large-scale cross-border movements
of (personal) data that has since evolved. In order for the legal act’s normative claim
not to be rendered void in the face of these developments, the ECJ, in Google,
therefore interpreted the criteria of Article 4(1)(a) Directive 95/46 extensively and
held the Directive applicable even in situations where the actual data processing is
not carried out on EU territory.122 While this expansion was viewed by some as a
necessity for the effective and complete protection of fundamental rights,123 others
consider it judicial overreach.124 Thus, the judgment did not set aside all the concerns
that were raised in the context of the territorial scope of Directive 95/46.
Against this background, placing the focus on the degree of interaction with the
European market is an innovative game changer in EU data protection regulation that
appears to be more appropriate in the online environment than the (albeit watered
down) requirement of a territorial nexus. That being said, it is not clear whether
the new provision will necessarily increase legal certainty for non-EU-based data
controllers and processors. Considering the wide impact of Article 3(2) GDPR, the
wording of its criteria is rather vague. However, as with any legal act of the European
Union, the explanatory recitals of the GDPR offer some insight as to the intended
meaning of its provisions.125
With respect to Article 3(2)(a) GDPR, recital 23 states that, in order to determine if
data subjects in the Union are being targeted for the offering of goods and services,126
it must be “ascertained whether it is apparent that the controller or processor envis-
ages”127 these activities. The decisive element therefore is the subjective intention
of a controller or processor, which is naturally difficult to determine, but the recital
subsequently lists objective factors that are intended to give guidance. As examples
for such indicators, the recital names “the use of a language or a currency gener-
ally used in one or more Member States with the possibility of ordering goods and
services in that other language” and “the mentioning of customers or users who are

121 Svantesson (2014), p. 67.


122 See above (Sect. 3). On Article 4(1)(a) Directive 95/46 and its teleological interpretation by the
ECJ also see de Hert and Czerniawski (2016), pp. 233–235.
123 Hijmans (2014), p. 560.
124 Gömann (2017), pp. 578–581, who takes into account textual, systematic, and historical arguments. Critical due to the lack of clear jurisdictional limits is also Kuner (2014), p. 69.
125 According to Article 296(2) TFEU, legal acts of the EU shall state the reasons on which they are based. The EU legislator fulfills this obligation by adding recitals to adopted legal acts. In its judgments, the ECJ often refers to them when defining the content and meaning of a provision, but they are not themselves subject to interpretation.
126 This also includes the offering of “information society services” as defined in Article 1(1)(b) of Directive (EU) 2015/1535, see European Data Protection Board (EDPB), Guidelines 3/2018 on the territorial scope of the GDPR (Article 3), 12 November 2019, p. 16, accessible under https://edpb.europa.eu.
127 Emphasis added.

in the Union,” whereas passive elements such as the mere accessibility of a data
controller’s or processor’s website in the Union are not sufficient. Considering that
these ‘objectivizing’ criteria were developed in the context of a different field of law,
they have yet to prove their practical workability in data protection law.128 However,
recital 23 is only an interpretative guideline and does not contain an exhaustive
list of all factors to be taken into account for the determination of the intention to
target the EU market. Thus, there is room for the development of further, possibly
more appropriate criteria that may better reflect the fact that EU data subjects (in
some cases) have a market choice, too.129 As, in practice, (EU) data subjects often
make their choices on the basis of a search engine’s results, one useful indicator for
targeting intentions could be a data controller’s or processor’s expenditures on an
Internet referencing service to the operator of a search engine in order to facilitate
the access to their websites by data subjects in the EU.130 Article 3(2) of the Brazilian
LGPD reads very similarly to Article 3(2)(a) GDPR. Therefore, the provision raises
the same issue of finding appropriate criteria for the establishment of the intention
to target.
Regarding the interpretation of Article 3(2)(b) GDPR, recital 24 states that “it
should be ascertained whether natural persons are tracked on the Internet including
potential subsequent use of personal data processing techniques which consist of
profiling a natural person” so as to establish if a controller or processor aims to
monitor the behavior of EU data subjects within the Union. According to the wording,
the mere act of tracking is sufficient to trigger Article 3(2)(b) GDPR; as opposed to
Article 3(2)(a) GDPR read in conjunction with recital 23, the subjective intention
or motivation of a data controller or processor to monitor a data subject’s behavior
is irrelevant. At the same time, the explicit clarification that the subsequent use of
the tracked user’s data for profiling purposes is included can be understood as an
indication that tracking carried out only for technical maintenance purposes is
exempted from the scope.131 Article 3(2)(b) GDPR is intended to
cover those third country data controllers and processors that systematically track

128 As pointed out by Gömann (2017), p. 585, these criteria are a clear reference to the ECJ’s findings in Pammer and Alpenhof, a European international procedural law case. See ECJ, C-585/08 and C-144/09, Pammer and Alpenhof, judgment of 7 December 2010, accessible under http://curia.europa.eu. The EDPB also established that the criteria developed in this case are, inter alia, to be taken into consideration when considering whether goods or services are offered to data subjects in the Union, see EDPB, Footnote 126, pp. 17–19.
129 Article 3(2)(a) GDPR has been criticized for not being clear enough as it might also cover such data controllers and processors who act globally but do not necessarily target the EU market, see de Hert and Czerniawski (2016), p. 239. However, reading the provision in conjunction with recital 23, it becomes clear that the legislator did not intend to cover those cases; the mere accessibility of a website is not sufficient. On the contrary, there must be an element apparent in which the controller’s or processor’s intention to target EU data subjects becomes manifest. See also, including examples, EDPB, Footnote 126, pp. 15–16.
130 This factor was considered by the ECJ in a different context. See ECJ, Footnote 128, para 81.
131 The EDPB has clarified that in its view, “the use of the word ‘monitoring’ implies that the controller has a specific purpose in mind for the collection and subsequent reuse of the relevant data about an individual’s behavior within the EU”, see EDPB, Footnote 126, p. 20.

the preferences and behavior of the users of their services through tools such as
cookies or social plug-ins that register the websites visited by a user.132 While its
wording is rather broad, the scope of the provision is limited in two ways: it applies
only to behavior that takes place in the Union and only to data subjects who
are in the Union. This is a reasonable limitation of the EU regime’s reach. In any
case, establishing jurisdiction on the grounds of the monitoring of behavior is an
innovative approach in data protection law and certainly more appropriate in the
Internet setting than the requirement of a territorial link.133
As pointed out above, Article 3(2) GDPR constitutes a paradigm shift in the
territorial scope of data protection laws. Nevertheless, its criteria, particularly those of
Article 3(2)(a) GDPR, have yet to prove whether they provide for workable solutions
and more legal certainty for data controllers and processors in practice. The task
of concretizing the GDPR’s often vague provisions (and thereby adding to their
workability) is assigned primarily to the new Data Protection Board134 as well as,
ultimately, the European Court of Justice. Criticism, however, is not only directed at
the wording of the provision, but also at the establishment of a marketplace principle
in general. Prominent points of criticism are (1) the broad variety of marketplaces
the Internet encompasses, which could potentially entail conflicts of interest as well as
high compliance costs, particularly for smaller businesses, and (2) the lack of actual
enforceability in practice.135 Regarding the first aspect, it is to be noted that businesses
operating in different markets simultaneously may indeed see themselves confronted
with diverging, sometimes even conflicting, legal requirements. This, however, is a
natural consequence of the global nature of the Internet and is something for any
business intending to act internationally to take into consideration. Costs resulting
from diverging data protection and privacy standards could be reduced by complying
with the EU regime, as the arguably most demanding standard, wherever in the world
business is conducted, as some enterprises already do.136 In this sense, the EU
standards serve as compliance benchmarks. The alleged lack of enforceability is at
the very least doubtful. The new sanction regime constitutes a serious incentive for
compliance and could, if necessary, be complemented by “market access restriction

132 Schantz and Wolff (2017), p. 113.


133 Adjusting the scope to the changed digital setting was clearly the intention of the legislator,
which is reflected by the fact that recital 24 is one of the very few parts of the GDPR where the
term Internet is explicitly mentioned.
134 According to Article 70(1)(e) GDPR, it is among the tasks of the Board to “examine […] any question covering the application of this Regulation and issue guidelines, recommendations and best practices in order to encourage consistent application of this Regulation.” The EDPB has acted on this task and published its first guidelines on the territorial scope of the GDPR, see EDPB, Footnote 126.
135 For references see Klar (2017), p. 534.
136 At the event Die neue EU-Datenschutzgrundverordnung, Bucerius Law School, Hamburg, 8 December 2016, Anne Zeitler, Director of Privacy & Data Protection Officer at eBay International AG, stated that eBay will implement the standards of the GDPR not only on EU territory but globally.

measures” as an expression of exercising market sovereignty.137 In addition, the
obligation set out in Article 27(1) GDPR for those data controllers and processors
who fall under the scope of Article 3(2) GDPR to designate a representative in the
Union is likely to enable more effective on-site enforcement.138
In conclusion, irrespective of the need for further specification of its criteria,
Article 3(2) GDPR as compared to the territorial scope of Directive 95/46 can be
said to be a more appropriate response overall to the aforementioned technological,
economic, and social developments that have resulted in personal data relating to
EU citizens being processed outside the Union on an unprecedented scale, which,
in turn, prompted the ECJ to expand those provisions of the Directive that were
insufficient in this regard. With Article 3(2), the GDPR abandons the strict necessity
of a territorial nexus while also requiring sufficient links to the Union: compliance
with the EU standards is obligatory only for those who want to do business on the EU
market by targeting “data subjects who are in the Union” to offer goods or services or
monitoring their behavior as far as it “takes place within the Union.”139 What is more,
the extended territorial scope helps to prevent forum shopping and levels the playing
field for both EU and non-EU competitors. With the introduction of the marketplace
principle, the ECJ will no longer have to lower the requirements of a territorial link
in order for the EU regime to be applicable in the Internet setting. Should the Court
be called upon to interpret the new scope, its principal task will be to set out clear
and workable criteria for the practical application of the new provision to provide
for legal certainty.
With regard to the normative foundation and justification of Article 3(2) GDPR,
it must be stated that extraterritorial claims in general140 and the use of a market-
oriented approach in particular are not peculiar to the European regime. In Brazil,
for instance, both the Marco Civil da Internet and the recently passed Lei Geral de
Proteção de Dados contain similar provisions.141 In fact, even the United States,
the economically most significant state among the opponents of a comprehensive
data protection regime, employs a comparable principle in different fields of law,
including online privacy.142 The European co-legislators, in correspondence with the
ECJ’s jurisprudence outlined above, consider the widening of the territorial scope

137 The term “market access restriction measures” is inspired by the suggestion of Svantesson to determine justifiable jurisdiction over the Internet by focusing on marketplace control instead of persons, conduct, or physical links. See Svantesson (2014), pp. 96–100.
138 Recital 80 makes clear that the representative “should be subject to enforcement proceedings in the event of non-compliance”.


139 Some, however, see Article 3(2)(b) GDPR as going even beyond the marketplace principle. See Klar (2017), p. 536.


140 See Svantesson (2014), p. 55, who observes a “tendency toward wide extraterritorial jurisdictional claims” in data protection laws worldwide.


141 Article 11 § 2 of Marco Civil reads as follows: “The provisions of this article apply to activities conducted by foreign-based legal entities, if they offer services to the Brazilian public or at least one of the members of the legal entities’ economic group has an establishment in Brazil.” Likewise, Article 3(2) LGPD reads very similarly to Article 3(2)(a) GDPR.
142 With regard to the Children’s Online Privacy Protection Act (COPPA), for example, the U.S. Federal Trade Commission (FTC) has in its FAQs explicitly stated that “[f]oreign-based websites and

of the data protection regime a necessity in order to protect the fundamental right to
data protection in the online world.143 The duty to protect human rights even beyond
the jurisdictional territory is supported in scholarship144 and is also in line with
international law.145 Moreover, the extension of the territorial scope of European data
protection law is not without just cause; it stems from technological, economic, and
societal developments and has to be seen as an instrument to adjust the constitutional
obligation to protect fundamental rights accordingly. Incentivizing compliance by
way of market pressure is an effective and normatively just way to fulfil this duty.

4.2 Indirect Extraterritorial Effect: Adequacy and Beyond

Like its predecessor, the General Data Protection Regulation contains a chapter that
is entirely dedicated to the regulation of data transfers to third countries and much
resembles the regime of Directive 95/46. This chapter’s provisions have indirect
extraterritorial effect because they do not apply directly in third countries, but only
encourage compliance by outlawing data transfers to third countries which do not
ensure an adequate standard of protection.146
Compared to the Directive, the articles of the GDPR are more detailed, take into
account the ECJ’s recent jurisprudence, and reflect a generally more accommodating
default: instead of prohibiting data transfers on a general basis unless the destination
country ensures an adequate level of protection, Chapter V starts by setting out
“General principles for transfers” in Article 44 GDPR, which stipulates that personal
data transfers, including onward transfers,147 to non-EU countries or other entities
may take place only if, subject to the other provisions of the GDPR, the conditions
laid down in the chapter are complied with. Thus, data controllers and processors
who intend to transfer personal information and data to a third country must comply
with both the general provisions of the GDPR and the specific requirements set
out in Chapter V. Infringements of the provisions of this Chapter are subject to

online services must comply with COPPA if they are directed to children in the United States, or if they knowingly collect personal information from children in the U.S.”, see https://www.ftc.gov/tips-advice/business-center/guidance/complying-coppa-frequently-asked-questions (accessed 11 Jan. 2022). For this and further examples also see Klar (2017), p. 536.
143 This becomes evident in Recital 23, where it states that the Regulation should apply to non-EU based data controllers and processors that fall under the scope of Article 3(2) GDPR “[i]n order to ensure that natural persons are not deprived of the protection to which they are entitled under this Regulation”.
144 On extraterritoriality in human rights, including examples from EU and U.S. law, see Torremans (1995), pp. 290–296. For the constitutional obligation to safeguard fundamental rights extraterritorially see Hoffmann-Riem (2014), pp. 61–62.
145 For an in-depth analysis, taking into account all sources of international law as laid down in Article 38 of the Statute of the International Court of Justice, see Svantesson (2014), pp. 76–94.
146 Kuner (2015b), p. 239, argues that the distinction between ‘extraterritorial in scope’ and ‘extraterritorial in effect’ has become irrelevant.
147 The explicit inclusion of onward transfers is a central change as compared to the Directive 95/46.

high administrative fines.148 Furthermore, Article 44 GDPR states that the Chapter’s
provisions “shall be applied in order to ensure that the high level of protection for
natural persons guaranteed by this Regulation is not undermined”,149 which is to be
viewed as an interpretative guideline.150
With regard to its structure, Chapter V of the GDPR maintains the pattern of Direc-
tive 95/46 and provides for three pathways for lawful transfer to third countries: (1)
transfer to a country with an adequate standard of protection (Article 45 GDPR), (2)
transfer subject to appropriate safeguards (Article 46 GDPR), and (3) derogations for
specific situations (Article 49 GDPR). These three options are formally independent
from each other in the sense that, for instance, the invalidity, suspension, or repeal of
an adequacy decision does not affect the possibility of using the other instruments.151
The recently passed Lei Geral de Proteção de Dados in Brazil follows the same
model, but differs in structure and detail. Article 33 LGPD sets forth the general rule
that an international transfer of personal data is permitted only in the cases it
subsequently enumerates. Article 33(1) LGPD then legitimizes transfers to countries or
international organizations only when they provide a level of protection adequate
to the provisions of the law.152 Unlike in the case of adequacy, Article 33(2) LGPD
allows data transfers also when the controller offers and substantiates guarantees of
compliance with the principles and rights of the data subject and the data protection
regime provided in the act, and the provision sets forth four different options which
much resemble those for appropriate safeguards in Article 46 GDPR, albeit with less
detail.153 Finally, Articles 33(3)–(9) LGPD provide for other legitimate grounds for
international data transfer. Structurally, these provisions are similar to the derogations
set out in Article 49 GDPR, but they differ in their content. While Article 49(1) GDPR
repeats the general criteria for the lawfulness of processing set out in Article 6(1)
GDPR,154 this is not the case for Articles 33(3)–(9) LGPD.

148 These fines can be up to €20 million or 4 percent of the total worldwide turnover of the preceding financial year. See Article 83(5)(c) GDPR.
149 This corresponds with the teleological interpretation of Directive 95/46 by the ECJ according to which the level of data protection as guaranteed under the Directive “could easily be circumvented” if the adequacy assessment did not require a level “essentially equivalent” to that guaranteed in the EU by virtue of Directive 95/46, see ECJ, Footnote 75, para 73.
150 See also ECJ, Footnote 90, para 92: “That level of protection [as mentioned in Article 44 GDPR] must therefore be guaranteed irrespective of the provision of that chapter on the basis of which a transfer […] is carried out”.
151 Article 45(7) GDPR states that a repeal, amendment, or suspension of an adequacy decision “is without prejudice to transfers of personal data […] pursuant to Articles 46–49”.


152 Article 34 LGPD assigns the task of evaluating the level of protection to the national authority. At the same time, Article 33 (parágrafo único) LGPD empowers certain legal entities of public law and the parties accountable to request the national authority to evaluate the level of protection provided by a particular country or international organization.
153 However, Article 35 LGPD states that the content of the guarantees referred to in Article 33(2) LGPD will be defined by the national authority, and the article also lays down criteria for the authority to take into consideration when carrying out this task. For more detail on Article 46 GDPR see below, 4.2.3.
154 On Article 49 GDPR see also below, Sect. 4.2.3.

4.2.1 The Adequacy Regime as the Core Principle

The ‘adequacy regime’ remains the core principle of European third country data
transfer regulation and is now laid down in Article 45 GDPR. According to Article
45(1) GDPR, a transfer of personal data to a non-EU destination country may take
place “where the Commission has decided that the third country, a territory or one or
more specified sectors within that third country, or the international organization in
question ensures an adequate level of protection.” Where such an adequate standard
of protection is determined, Article 45(3) GDPR empowers the Commission to make
a formal adequacy decision by way of an implementing act, which then legitimizes
data transfers to that destination without further authorization. The wording of Article
45(1) GDPR contains two central changes in comparison to Article 25(1) Directive
95/46. Firstly, it abandons the word ‘only’ and thereby implicitly emphasizes the
existence of alternative legitimate grounds for data transfers, and secondly, it gives
the Commission more flexibility as to the assessment of adequacy. The adequacy of
a protection standard can now explicitly be attested not only to a third country, but
also to smaller geographical units within that country, such as a “territory or one or
more specified sectors” as well as to an international organization. This additional
specification allows the Commission to take into account regional or sectoral
developments and could be an instrument to further catalyze developments such
as the influence of California’s privacy legislation on other states and the federal
lawmakers in the U.S., which has been referred to as the ‘California effect’.155
Article 45(2) GDPR now specifies the criteria for assessing adequacy, which are
much more detailed than they were in the Directive. The emphasis is on the existence
of effective rights for the data subjects as well as their effective enforceability through
administrative and judicial means.156 The new provision is more pluralistic in the
sense that it obliges the Commission to take into account international commitments
of the destination entity in relation to the protection of personal data.157 The focus
on effective rights and their effective enforcement reflects the elements laid down in
Article 8 CFR as they were established by the ECJ in the Schrems case in which the
Court held that, while the means of the third country may very well differ from those
employed in the EU, including self-regulatory approaches,158 they must “neverthe-
less prove, in practice, effective to ensure protection essentially equivalent”159 and

155 See on this Schwartz (2014).


156 Article 45(2)(a), (2)(b) GDPR. As became apparent in its judgment in Schrems II, the ECJ considers the criteria laid down in Article 45(2)(a) GDPR to be a concretization both of the fundamental rights in Articles 7 and 8 CFR and Article 47 CFR. See ECJ, Footnote 90, paras 169–185 and 186–197 respectively.
157 Article 45(2)(c) GDPR.
158 ECJ, Footnote 75, para 81. The Working Party has published criteria that a self-regulatory regime needs to fulfil in order to be considered as a valid ingredient of adequate protection. See Article 29 Working Party, ‘Working Document: Judging industry self-regulation: when does it make a meaningful contribution to the level of data protection in a third country?’, WP 7, 14 January 1998, p. 6.
159 ECJ, Footnote 75, para 74. Also see ECJ, Footnote 90, para 162.

that the main considerations of the Commission when assessing adequacy must be
“the content of the applicable rules in that country […] and the practice designed to
ensure compliance with those rules.”160 Furthermore, the European judges obliged
the Commission to review periodically whether its finding of adequacy is still factu-
ally and legally justified.161 This reasoning was also implemented in the GDPR,
which now requires the formal adequacy decision to feature a mechanism for peri-
odic review162 and mandates the Commission to “monitor developments […] that
could affect the functioning of the decision adopted” on an ongoing basis.163 Overall,
the Commission’s discretion as to the adequacy assessment is reduced; the criteria
provided for in EU law must be read in the light of the Charter, and the review of the
assessment should be strict.164
Under the Directive, this adequacy regime, as far as it was accepted on a general
basis, was criticized for having two central flaws:165 firstly, the Directive did not
specify how an “adequate level of protection” is assessed; Article 25(2) Directive
95/46 remained quite unclear as to the criteria of the assessment. Secondly, the
opinions of the EU Member States differed on what constitutes an adequate level of
protection. Another concern was that, due to the absence of on-site audit mechanisms,
judgments of adequacy would be made in a nonempirical way, according to the
analysis of formal indicators.166
Against this background, the GDPR provides for clearer and more detailed provi-
sions and renders this criticism invalid. Article 45 GDPR now sets out precise, non-
exhaustive criteria for the Commission to work with, and by imposing the obligation
to monitor developments and adjust the decision accordingly, it prevents legal dead-
lock. The issue of divergent stances among the Member States is now obsolete as
the competence to assess the adequacy of a third country’s data protection regime
is exclusively reserved for the Commission. Admittedly, the GDPR equips the national supervisory authorities with the power to suspend data flows,167 which is in line with the recent holding of the ECJ that the authorities are not bound by an adequacy decision and may take their own view.168 However, this capacity is limited to individual data transfers to third countries or other entities and does not
generally affect the adequacy decisions of the Commission.169 Under the GDPR,

160 ECJ, Footnote 75, para 75. Regarding the decision implementing the EU-U.S. Privacy Shield,
the ECJ came to the conclusion that the Commission did not sufficiently assess the content of the
applicable rules in the US, see ECJ, Footnote 90, paras 163–198.
161 ECJ, Footnote 75, para 76.
162 Article 45(3) GDPR.
163 Article 45(4) GDPR.
164 ECJ, Footnote 75, para 78. On the necessity to interpret the criteria for the adequacy assessment

in light of the Charter rights also see ECJ, Footnote 90, paras 169–185 and 186–197.
165 Greenleaf (2012), p. 77.
166 Bennett and Raab (2006), p. 103.
167 Article 58(2)(j) GDPR.
168 ECJ, Footnote 75, para 66.
169 Nevertheless, if the Commission has adopted a valid adequacy decision and a complaint is lodged

by a person concerning the protection of his or her data protection rights, the authorities remain
Safeguarding Regional Data Protection Rights on the Global Internet … 473

adequacy judgments cannot be made in a non-empirical way, based on the black letter of the law, but must take into consideration the existence and effective functioning of independent supervisory authorities that ensure and enforce compliance
in the destination country.170 This explicit accentuation of the procedural dimension
is an essential improvement of the updated regime.
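The monitoring obligation described above can be sketched schematically. The following is an illustrative sketch only, not a compliance tool: the class and field names are invented for illustration, the listed factors paraphrase (non-exhaustively) Article 45(2) GDPR, and the four-year interval reflects the periodic review mandated by Article 45(3) GDPR.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative paraphrases of some Article 45(2) factors (non-exhaustive):
ARTICLE_45_2_FACTORS = (
    "rule of law and respect for fundamental rights",                # Art. 45(2)(a)
    "effective functioning of independent supervisory authorities",  # Art. 45(2)(b)
    "international commitments entered into by the third country",   # Art. 45(2)(c)
)

@dataclass
class AdequacyDecision:
    country: str
    adopted: date
    # "at least every four years" (Art. 45(3)); a shorter interval is permitted
    review_interval: timedelta = timedelta(days=4 * 365)

    def review_due(self, today: date) -> bool:
        """Has the formal periodic-review deadline been reached? Note that
        Art. 45(4) additionally requires ongoing monitoring of developments."""
        return today >= self.adopted + self.review_interval

# Example: the adequacy decision for Japan, adopted on 23 January 2019
japan = AdequacyDecision("Japan", date(2019, 1, 23))
print(japan.review_due(date(2023, 6, 1)))  # True: the four-year window has elapsed
```

The sketch deliberately separates the fixed review deadline from the open-ended monitoring duty, mirroring the distinction the text draws between Article 45(3) and Article 45(4).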
However, despite constituting the core principle of EU data transfer regulation, the
adequacy regime has relatively little practical importance. To date, the Commission
has recognized only thirteen countries as ensuring an adequate level of protection,171
and major economic players do not feature among them. Nevertheless, the Commission's rigid policy has not disrupted data flows to those third countries without an adequate standard; rather, this highlights the practical relevance of the other pathways to lawfulness offered under the EU regime.

4.2.2 The U.S. Exception: From Safe Harbor to the Privacy Shield

The United States is not among the countries whose level of protection has been
attested as adequate by the EU Commission. But considering how deeply interwoven
the European and U.S. markets are and the economic impact a sudden rupture of the
trade relations between the two players would have,172 both sides had incentives
to find a solution. Thus, negotiations were initiated and in 2000, the Commission
issued a decision173 that found the U.S. "safe harbor principles"174 to constitute an adequate standard of protection. In short, this framework provided for a voluntary self-regulatory program for U.S.-based companies that required adherence to seven fundamental privacy principles mirroring the EU regime and was overseen by a U.S. federal agency, the Federal Trade Commission.175 This compromise
was heavily criticized mainly for its mere self-regulatory nature as well as its defi-
cient enforcement scheme,176 and in October 2015 it was invalidated by the ECJ
in the Schrems case for violation of fundamental rights.177 Shortly after this judg-
ment was issued, the EU and the U.S. entered into negotiations for a replacement,

competent to examine whether the transfer in question complies with the requirements laid down by the GDPR and, if relevant, bring an action before the national courts, see ECJ, Footnote 90, para 120.
170 Article 45(2)(b) GDPR.
171 These are Andorra, Argentina, Canada, Faroe Islands, Guernsey, Israel, Isle of Man, Japan,

Jersey, New Zealand, Switzerland, United Kingdom, and Uruguay. Adequacy talks are taking
place with South Korea, see https://ec.europa.eu/info/law/law-topic/data-protection/data-transfers-
outside-eu/adequacy-protection-personal-data-non-eu-countries_en (accessed 11 Jan. 2022).
172 For estimates see, for instance, Cunningham (2013), pp. 436–437.
173 See Footnote 76.
174 See Footnote 77.
175 See U.S. Department of Commerce, U.S.-EU Safe Harbor Overview, https://2016.export.gov/

safeharbor/eu/eg_main_018476.asp (accessed 11 Jan. 2022).


176 More thoroughly on the regime as well as the criticism directed at it Cunningham (2013),

pp. 436–440.
177 ECJ, Footnote 75.

which culminated in the establishment of the EU-U.S. Privacy Shield,178 which was
implemented in EU law by a new adequacy decision of the Commission.179
In July 2020, this adequacy decision of the Commission was also held invalid by
the European Court of Justice.180 Prior to the judgment, the views on the efficacy of
the Privacy Shield differed.181 Its structure was similar to that of the Safe Harbor agreement; it also provided for a voluntary self-regulatory program, and compliance
was subject to supervision by the same U.S. institutions. However, it was generally
more detailed, introduced new “supplemental principles,” and was complemented by,
inter alia, commitments of various U.S. officials, including the establishment of an
ombudsperson as an oversight mechanism for national security interference through
which authorities in the EU can submit requests on behalf of EU citizens regarding
intelligence practices of the U.S.182
Nevertheless, the ECJ held that the limitations on data protection arising from the
domestic US law, which the Commission assessed in the Privacy Shield Decision,
namely Section 702 of the FISA and E.O. 12333, read in conjunction with PPD-28, "are not circumscribed in a way that satisfies requirements that are essentially
equivalent to those required, under EU law, by the second sentence of Article 52(1)
of the Charter”.183 What is more, the Court stated that the ombudsperson mechanism
contained in the Privacy Shield Decision does not provide for any cause of action
before a body which offers guarantees essentially equivalent to those required by
Article 47 CFR to the persons affected.184 As a result, EU-based controllers and/or
processors willing to transfer personal data to the US will have to resort to the
alternative legal grounds for data transfers provided for in the GDPR.

4.2.3 Alternative Pathways to Legitimate Data Transfer

For those data controllers and processors who aim to transfer data to a non-EU
entity that does not provide for an adequate standard of protection, the GDPR offers
alternative pathways to lawfulness. They can opt to provide appropriate safeguards
or, exceptionally, transfer personal data by way of derogation.

178 The text is accessible online at https://www.privacyshield.gov/EU-US-Framework (accessed 11


Jan. 2022).
179 See above, Footnote 92.
180 ECJ, Footnote 90, paras 150–201. The ECJ emphasized the immediate effect of the invalidation

and pointed out that Article 49 GDPR provides for legal grounds for data transfers to third countries,
see para 202.
181 EU stakeholders had taken a more skeptical view. The Article 29 Working Party, for instance,

directed criticism both at the commercial aspects and the governmental access to data transferred to
the U.S., see Article 29 Working Party, 'EU-U.S. Privacy Shield – First annual
Joint Review’, WP 255, 28 November 2017. For a more positive notion see, for example, Kuner
(2017), pp. 912–913.
182 See Commission Implementing Decision, Footnote 92, Annex III.
183 ECJ, Footnote 90, para 185. See also paras 178–184.
184 ECJ, Footnote 90, para 197. For the different reasons for this holding see paras 190–196.

According to Article 46(1) GDPR, data controllers and processors may, via bilat-
eral or multilateral agreements that contain ‘appropriate safeguards’, compensate for
the absence of an adequacy decision in the country they aim to transfer the data to
if ‘enforceable data subject rights’ and ‘effective legal remedies’ are available. The
GDPR distinguishes between such safeguards that require further authorization by
a supervisory authority and those that do not. Article 46(2) GDPR contains non-exhaustive examples of appropriate safeguards not requiring further authorization:
a legally binding and enforceable instrument between public authorities, binding
corporate rules (BCRs),185 standard data protection contract clauses (SCCs),186 codes
of conduct, and a certification mechanism. Pursuant to Article 46(3) GDPR, appropriate safeguards that are subject to further authorization may be provided for, in particular,
by (1) contractual clauses between the controller or processor and the controller,
processor, or recipient in the third country and (2) provisions to be inserted into
administrative arrangements between public authorities.187
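The distinction drawn in Article 46(2) and (3) GDPR between instruments that do and do not require prior authorization can be represented as a simple lookup. A minimal sketch: the instrument labels are shorthand paraphrases of the provisions named above, not statutory wording.

```python
# Article 46 instruments, keyed by whether they require prior authorization
# by a supervisory authority. Labels are illustrative paraphrases.
SAFEGUARDS_NEED_AUTHORIZATION = {
    # Article 46(2): no further authorization required
    "legally binding instrument between public authorities": False,
    "binding corporate rules (BCRs)": False,
    "standard data protection clauses (SCCs)": False,
    "approved code of conduct": False,
    "approved certification mechanism": False,
    # Article 46(3): subject to authorization by the supervisory authority
    "ad hoc contractual clauses with the recipient": True,
    "provisions in administrative arrangements between public authorities": True,
}

def requires_authorization(instrument: str) -> bool:
    """Look up whether a (paraphrased) Article 46 instrument needs authorization."""
    return SAFEGUARDS_NEED_AUTHORIZATION[instrument]

print(requires_authorization("standard data protection clauses (SCCs)"))  # False
```

The two-valued split also reflects why SCCs and BCRs dominate in practice: they avoid the case-by-case authorization (and consistency-mechanism) step that Article 46(3) instruments entail.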
As compared to its predecessor, the new provision is more detailed, introduces
new options for legal data transfers, and, by explicitly accepting BCRs and self-
regulatory mechanisms, demonstrates the ambition to reconcile the high standard
maintained in the EU with the diverging approaches to data protection worldwide.
With regard to the substantive requirements of Article 46 GDPR, the ECJ, in its
recent judgment in Schrems II, clarified that the provision, because it appears in
Chapter V, must be read in the light of Article 44 GDPR and that, therefore, the
level of protection provided for by the GDPR must be guaranteed irrespective of
the provision of that chapter on the basis of which a transfer is carried out.188 As a
result, the judges stated that the term ‘appropriate safeguards’ is to be understood
as requiring such measures that are “capable of ensuring that data subjects whose
personal data are transferred to a third country pursuant to standard data protection
clauses are afforded […] a level of protection essentially equivalent to that which
is guaranteed within the European Union”.189 As regards Article 46(2)(c) GDPR
in particular, the Court ruled that, when determining the adequacy of the level of
protection in a third country when data is transferred there pursuant to SCCs adopted
under this provision, the criteria to be taken into consideration are, in particular, both
the contractual clauses themselves and the relevant aspects of the legal system of the
third country as regards any access by its public authorities to the data transferred.190

185 The detailed requirements for the approval of BCRs are laid down in Article 47 GDPR.
186 Under Directive 95/46, the EU Commission issued decisions adopting sets of Standard Contrac-
tual Clauses. According to Article 46(5) GDPR, they shall remain in force until amended, replaced,
or repealed. In its recent judgment in Schrems II, the ECJ confirmed the validity of Decision 2010/87
in light of Articles 7, 8 and 47 CFR, see above, Sect. 3.
187 See Article 46(3) GDPR. In these cases, the supervisory authority shall apply the consistency

mechanism, Article 46(4) GDPR. The EDPB has provided further guidance as to the application
of Articles 46(2)(a) and 46(3)(b), see EDPB, Guidelines 2/2020, 15 December 2020, accessible
under: https://edpb.europa.eu.
188 ECJ, Footnote 90, para 92.
189 ECJ, Footnote 90, para 96.
190 ECJ, Footnote 90, paras 102–104.

Regarding the assessment of the latter, the factors to be taken into account in the
context of Article 46 GDPR correspond to those set out, non-exhaustively, in Article
45(2) GDPR.191 The Court stressed that the contractual mechanism provided for in
Article 46(2)(c) GDPR “is based on the responsibility of the controller or his or
her subcontractor established in the European Union and, in the alternative, of the
competent supervisory authority.”192 Thus, not all safeguards required by Article 46
GDPR must necessarily be provided for in a Commission decision such as the SCC
Decision but it is, above all, for the controller or processor to verify whether the law
of the third country of destination ensures adequate protection.193
The third option for data controllers and processors to transfer data to entities
without an adequate protection standard, which is intended as the last resort, is to
do so by way of derogations. Article 49(1) GDPR sets forth seven exhaustive condi-
tions194 and introduces a new general clause that requires the balancing of the transfer
purpose and the interests and rights of the data subject affected.195 These deroga-
tions, however, are not tailored to generally providing an adequate long-term legal
basis for repeated, mass, or structural data transfers. Only in cases where it would
be genuinely inappropriate or even impossible for the transfer to take place on the
basis of appropriate safeguards as laid down in Article 46 GDPR should the deroga-
tions be applied.196 The term ‘derogation’ is misleading insofar as a transfer on the
basis of Article 49 GDPR may constitute a derogation from the general requirement
of an adequate standard of protection, but such a transfer, pursuant to Article 49
GDPR, does not exempt the controller or processor from the obligation to respect
the fundamental rights of the data subject.197 The European Data Protection Board
has recently published guidelines seeking to provide guidance as to the application
of Article 49 GDPR.198
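The three-tier structure of Chapter V discussed in this section (adequacy decision first, appropriate safeguards second, derogations strictly as a last resort) can be summarized as a decision procedure. This is a schematic sketch under simplifying assumptions: each pathway is reduced to a boolean, whereas in practice each involves a substantive legal assessment.

```python
from enum import Enum, auto

class TransferGround(Enum):
    ADEQUACY_DECISION = auto()       # Article 45 GDPR
    APPROPRIATE_SAFEGUARDS = auto()  # Article 46 GDPR (e.g. BCRs, SCCs)
    DEROGATION = auto()              # Article 49 GDPR, last resort only
    NOT_PERMITTED = auto()

def transfer_ground(adequacy_decision: bool,
                    appropriate_safeguards: bool,
                    derogation_applies: bool) -> TransferGround:
    """Schematic ordering of the Chapter V pathways: an adequacy decision makes
    further instruments unnecessary, appropriate safeguards come second, and a
    derogation is considered only where neither of the others is available."""
    if adequacy_decision:
        return TransferGround.ADEQUACY_DECISION
    if appropriate_safeguards:
        return TransferGround.APPROPRIATE_SAFEGUARDS
    if derogation_applies:
        return TransferGround.DEROGATION
    return TransferGround.NOT_PERMITTED

# A transfer to a country without an adequacy decision, covered by SCCs:
print(transfer_ground(False, True, False).name)  # APPROPRIATE_SAFEGUARDS
```

The strict ordering of the branches encodes the point made above that Article 49 is not a general long-term basis for repeated or structural transfers, but applies only where an adequacy decision and appropriate safeguards are genuinely unavailable.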

191 ECJ, Footnote 90, para 104.


192 ECJ, Footnote 90, para 134.
193 ECJ, Footnote 90, paras 128 and 134.
194 These are: (1) consent, (2) necessity for the performance of a contract between the data subject

and the controller, (3) necessity for the performance of a contract between a third party and the data
controller in the interest of the data subject, (4) necessity for important reasons of public interest,
(5) necessity for establishment, exercise, or defense of legal claims, (6) necessity for the protection
of vital interests of the data subject or other persons, and (7) transfer from a public register.
195 See Article 49(1)(subpara. 2). When a controller or processor makes use of this option, Article

49(6) stipulates the additional obligation to document the assessment made as well as the suitable
safeguards employed in the records referred to in Article 30 GDPR.
196 See, with regard to the old regime, Article 29 Working Party, ‘Working Document on a Common

Interpretation of Article 26(1) of Directive 95/46/EC of 24 October 1995’, WP 114, 25 November


2005, p. 9.
197 Article 29 Working Party, Footnote 196, p. 9.
198 EDPB, Guidelines 2/2018 on derogations of Article 49 under Regulation 2016/679, 25 May

2018, accessible under https://edpb.europa.eu.



4.2.4 Criticism and Normative Legitimacy

This international data transfer regime of the EU, as an (indirect) extraterritorial mechanism, has of course been subject to criticism. On a more general note, it
has been asserted that by prohibiting data transfers—regardless of the location of
the data—the European Union is “bypassing national sovereignty.”199 Moreover,
the regime under the Directive was said to lack jurisdictional 'safety valves,'
and in this context it was suggested to precisely define the external dimension of
the fundamental right to data protection in the sense that some ‘essential’ elements
should be determined and only those should have extraterritorial applicability.200
Further, particularly following the decision in Schrems, the EU and the ECJ were
criticized for concerning themselves with surveillance practices in third countries,
when surveillance is clearly taking place within the territory of the EU and the
Member States are engaging in intelligence-sharing on a large scale.201
In response to this criticism, it must be said that the EU data transfer regime does
not bypass the sovereignty of third countries. Like the marketplace principle, it is
an expression of the constitutional obligation to safeguard fundamental rights on the
Internet and to protect citizens from dangers arising from outside the jurisdictional
territory,202 an inherently legitimate purpose. If, in the age of the Internet and global
data flows, the protection of personal information and data were limited to territorial
borders, the normative claim of a right to data protection would be rendered void.
The idea that a country may restrict data flows to third countries if these countries
do not provide a certain degree of protection is not a peculiarity of the EU regime,
but is also reflected in other international data protection frameworks.203 In its recent
judgments, the ECJ has established the essential elements of Articles 7 and 8 CFR204
and thereby defined the limits to the external dimension of the fundamental rights.205

199 Cunningham (2013), p. 452.


200 Kuner (2015b), pp. 241–244.
201 See Kuner (2017), pp. 898–899 with references.
202 See Recital 114: “In any case, where the Commission has taken no decision on the adequate

level of data protection in a third country, the controller or processor should make use of solutions
that provide data subjects with enforceable and effective rights as regards the processing of their
data in the Union once those data have been transferred so that that they will continue to benefit
from fundamental rights and safeguards.” (emphasis added).
203 See Article 12(3) of the Data Protection Convention 108 and No. 17 of the OECD Privacy

Guidelines.
204 In Schrems, Footnote 75, the Court held that interfering legislation must “lay down clear and

precise rules governing the scope and application of a measure,” impose minimum safeguards for
the data subjects to be “effectively protected against the risk of abuse and any unlawful access and
use of that data” (para 91), and that derogations and limitations may apply only insofar as strictly
necessary (para 92).
205 However, it needs to be noted that EU data protection law consists of (functionally) different

elements or, more metaphorically, “building blocks” that complement each other but do not neces-
sarily need to be applied all at once but rather in a differentiated manner with a view to the field
of regulation in question, on this, in the German national context, see Albers (2012), pp. 183–231,
particularly pp. 230–231. Taking this flexible nature of EU data protection law into consideration

With regard to the accusation of hypocritical behavior, it must be stated that, as much as a coherent surveillance regime would add to the legitimacy of the EU's
claim to safeguard its protection standards extraterritorially, the lack thereof cannot
serve as an argument to justify a violation of fundamental rights by a third country.206
Moreover, this criticism disregards that the European Union is a complex political
and legal body sui generis that is separate from its Member States. Due to this
specific structure, certain policy areas and the legislative competences they entail,
including national security, are reserved to the Member States and the ECJ would
exceed its jurisdiction if it decided on data protection law with regard to matters of
national security.207 However, an important player regarding the task of establishing
a common European (minimum) standard in relation to data protection and (national)
security law is the European Court of Human Rights (ECtHR).208 The relationship
of the European constitutional courts is complex and dynamic. If they manage to
coordinate their respective jurisprudence, this could be a way for the EU to adopt a
common or at least more harmonized stance on both data protection and surveillance
law.209

when assessing whether the level of protection in a third country is essentially equivalent to the EU
standard could be helpful for the task of reconciling diverging notions of privacy and data protection.
206 See also Kuner (2017), p. 899.
207 The jurisdiction of the ECJ is limited to the interpretation and application of EU law, see Article

19 TEU. Correspondingly, the GDPR, as laid down by Article 2(2)(a) and like the Directive before
it, does not apply to activities which fall outside the scope of Union law, which explicitly includes
activities concerning national security, see Recital 16. However, this has not stopped the Court from
finding national legislation which allows for a general and indiscriminate retention of all traffic and
location data of all users of electronic communication to violate Articles 7, 8, 11 and 52 CFR. See
ECJ, C-203/15 and C-698/15, Tele 2 Sverige AB/Watson and Others, Judgment of 21 December
2016, accessible under http://curia.europa.eu, paras 62–112. On this matter see also ECJ, C-207/17,
Ministerio Fiscal, Judgment of 2 October 2018, accessible under http://curia.europa.eu, paras 29–
43. In a recent referral for a preliminary ruling, the Court was called upon to clarify the scope of its
jurisdiction with respect to national security law. The referring British court asked, inter alia, how
and to what extent the findings of the ECJ in the Tele 2 Sverige case apply to a national requirement
to a provider that it must provide bulk communications data to the security and intelligence agencies
of a Member State if the use of such data is essentially necessary to protect national security. In
its judgment, the ECJ held that “national security remains the sole responsibility of each Member
State” but it interpreted the term rather narrowly and distinguished it from “objectives of combating
crime in general, even serious crime, and of safeguarding public security”, see ECJ, C-623/17,
Privacy International, Judgment of 6 October 2020, accessible under http://curia.europa.eu, paras
74–75. All in all, the Court did not use the opportunity to define clear jurisdictional limits with
regard to national security law.
208 In a recent judgment, the ECtHR ruled that parts of the (former) surveillance program of the

British GCHQ violate Articles 8 and 10 of the European Convention of Human Rights (ECHR).
See ECtHR, nos. 58170/13, 62322/14 and 24960/15, Big Brother Watch and Others v. the United
Kingdom, Judgment of 13 September 2018, accessible under http://hudoc.echr.coe.int.
209 In Schrems II, the ECJ, when asked whether the ECHR or national law and practices in the

context of national security in one or more Member States are to be included when determining
whether a data transfer to a third country on the basis of the SCC Decision constitutes a violation of
rights, held that this is not the case, see Footnote 90, paras 97–101. This is not a convincing solution
as it disregards both the function of the ECtHR in the framework of the multilevel cooperation of the

Evidently, international data transfer regulation is a highly political field, which the collision of the EU and U.S. approaches to the subject makes particularly obvious.
In this specific context, it has been argued that the problem is not the political
disagreement about how to regulate international data transfer, but that both sides
are “unwilling to consider positions that go beyond the underlying assumptions of
their own system.”210 This statement disregards the fact that the EU has taken signif-
icant steps towards pluralistic openness of its legal framework. The reformed data
transfer regime under the GDPR constitutes a compromise between the obligation to
protect constitutional values and the legal and political reality. It incentivizes other
states to adopt an adequate protection standard by granting unrestricted data flows
to countries with such a data protection framework and, at the same time, it provides
for sufficient alternative “paths to accommodation”211 for diverging approaches to
the protection of personal information and data. The legal act consolidates policies
that had proven to be deficient and introduces new instruments for international data
transfers, which widens the options for those willing to transfer data from the EU to a
non-EU entity. Furthermore, both the ECJ and the EU legislator have acknowledged
the pluralist nature of international data protection law.212 The Court has established
that, as long as the protection standard is essentially equivalent, a third country is
free in the choice of means.213 The GDPR now obliges the Commission to take into
account international commitments of a state to the protection of personal data214
and also sets out the obligation for the Commission and EU supervisory authorities
to, inter alia, further engage in international dialogue and cooperation to facilitate
the effective enforcement of data protection law and also promote the exchange and
documentation of data protection legislation and practice, including on jurisdictional
conflicts with third countries.215
Overall, the data transfer regime of the EU under the GDPR can therefore be
considered a normatively just instrument, as it manages to reconcile the constitutional
obligation to safeguard data protection rights on the Internet with the pluralistic
reality of international data protection law. Naturally, the enforcement of the EU
data transfer regime is subject to limits. Where EU data protection law reaches
beyond EU territory, it is dependent on the cooperation of the third states for its
effective enforcement. The EU cannot expect that its standards will be accepted by
every country in the world, so it has to engage in negotiations if it does not want
to abandon its normative claim in the digital context. With regard to a future legal
basis for data transfers between the EU and the US, this means that considerable
concessions on both sides will have to be made. Against this background, it would

European constitutional courts and the necessity of a common stance on the legality of surveillance
practices.
210 Kuner (2017), p. 917.
211 For this term see Schwartz (2013), pp. 2001–2003.
212 On the pluralist nature of global data protection law Kuner (2014), pp. 66–67.
213 ECJ, Footnote 75, para 74.
214 Article 45(2)(c) GDPR.
215 Article 50 GDPR.

have been helpful if the ECJ had given a more detailed explanation of what, in its
view, are the minimum requirements for a legal system to be considered as providing
a level of protection “essentially equivalent” to the EU standard. On the other hand,
judicial self-restraint is necessary in order to provide the EU Commission with more
flexibility when entering into (political) negotiations.

5 Conclusion

The rise of the Internet has brought changes to the rationalities of economies and soci-
eties across the world that put pressure on traditional political and legal paradigms.
Because of the loose connection between the object of regulation and the subject
of protection, the challenges are particularly evident in the field of data protec-
tion law. While a global legal regime, as the most desirable response, is not on the
horizon, outstanding progress regarding the regulation of transnational data flows
was achieved by the EU through its Directive 95/46, which is seen as the most
influential instrument in international data protection regulation. With its update, the
new General Data Protection Regulation, the EU is expected to further deepen its
influence on international data protection governance, mainly because of the new
marketplace principle as well as the revised third country data flow ‘border control’
regime.
This chapter has assessed these instruments and established that they originate
from the constitutional obligation of the European Union and its institutions to safe-
guard the fundamental rights to private life and to data protection, which are facing
new challenges in the era of the Internet due to ubiquitous, often nontransparent
cross-border data processing and the global surveillance practices of various states.
It argues that the focus on the degree of interaction with the EU market is a legitimate
approach to fulfilling this obligation and that it is not a peculiarity of the European
regime, but is also used in (data protection) laws in other countries, such as Brazil
or the United States.
Safeguarding data protection rights on the Internet is a complex and difficult task.
The General Data Protection Regulation approaches it by adjusting its territorial
scope to the reality of a globalized data-driven economy and continuing to control
data flows to third countries. Of course, like most instruments of international law,
the enforcement of the EU data transfer regime is subject to limits. Wherever in the
world legal regimes collide, they cannot claim absolute and unconditional applica-
bility and enforcement. The creation of international standards is always a political
process that follows its own rationalities and is determined by reciprocal adjustment.
Modern data protection laws therefore need to be designed so that they are open
to conflicting notions of privacy and political and economic interests while at the
same time maintaining their normative claim. In this context, scholarship plays an
important role in helping to find harmonizing solutions where conflicts of regimes
arise, ideally through interdisciplinary and comparative approaches.

This chapter has argued that in many ways, the EU data protection regime reflects
the pluralistic nature of international data protection law and is open to international
data protection policy innovations. In this sense, it is a more flexible instrument than
its predecessor. With regard to the goal of a globally harmonized data protection
standard, the GDPR is certainly a step in the right direction.

References

Albers M (2014) Realizing the complexity of data protection. In: Gutwirth S, Leenes R, De Hert P
(eds) Reloading data protection. Springer, Dordrecht, pp 213–235
Albers M (2012) § 22 Umgang mit personenbezogenen Informationen und Daten. In: Hoffmann-
Riem W, Schmidt-Aßmann E, Voßkuhle A (eds) Grundlagen des Verwaltungsrechts, C.H. Beck,
Munich, pp 107–234
Albrecht JP, Jotzo F (2017) Das neue Datenschutzrecht der EU. Nomos, Baden-Baden
Albrecht JP (2016) How the GDPR will change the world. Eur Data Protect Law Rev 3(3):287–289
Bennett CJ, Raab CD (2006) The governance of privacy. The MIT Press, Cambridge, Massachusetts
Boehme-Neßler V (2015) Big data und Demokratie - Warum Demokratie ohne Datenschutz nicht
funktioniert. Deutsches Verwaltungsblatt 130(20):1282–1287
Broemel R, Trute HH (2016) Alles nur Datenschutz? Zur rechtlichen Regulierung algorithmen-
basierter Wissensgenerierung. Berliner Debatte Initial 27(4):50–65
Bruehl J, Tanriverdi H (2017) Der überwachte Mensch zensiert sich selbst. Süddeutsche Zeitung.
http://www.sueddeutsche.de/digital/kongress-des-chaos-computer-club-der-ueberwachte-men
sch-zensiert-sich-selbst-1.3808605. Accessed 11 Jan. 2022
Büttner B et al (2016) Reterritorialisierung und Privatheit. In: Büttner B et al (eds) Die Reterritorialisierung des Digitalen. Kassel University Press, Kassel, pp 139–152
Cáceres J (2014) Brasilien und EU planen gemeinsames Datenkabel. Süddeutsche Zeitung. http://
www.sueddeutsche.de/politik/folgen-der-nsa-affaere-brasilien-und-eu-planen-gemeinsames-dat
enkabel-1.1896673. Accessed 11 Jan. 2022
Carolan M (2018) High Court to rule on data issues to bring to EU Court of Justice. The Irish Times.
https://www.irishtimes.com/business/technology/high-court-to-rule-on-data-issues-to-bring-to-eu-court-of-justice-1.3361504. Accessed 11 Jan. 2022
Cunningham M (2013) Diminishing sovereignty: how European privacy law became international
norm. Santa Clara J Int Law 11(2):421–453
Cunningham M (2016) Complying with international data protection law. Univ Cincinnati Law Rev
84(2):421–450
Curtiss T (2016) Privacy harmonization and the developing world: the impact of the EU’s general
data protection regulation on developing countries. Washington J Law Technol Arts 12(1):95–122
Dammann U (2016) Erfolge und Defizite der EU-Datenschutzgrundverordnung. Zeitschrift für
Datenschutz 6(7):307–314
De Hert P, Czerniawski M (2016) Expanding the European data protection scope beyond territory:
Article 3 of the general data protection regulation in its wider context. Int Data Privacy Law
6(3):230–243
De Hert P, Papakonstantinou V (2013) Three scenarios for international governance of data privacy:
towards an international data privacy organization, Preferably by a UN Agency. I/S: J Law Policy
Inf Soc 9(2):271–324
De Souza Abreu J, Sousa Nakagawa FM, Pacetta Ruiz J (2016) Data protection in Brazil, draft
report/review of legal background. Internetlab. http://www.internetlab.org.br. Accessed 11 Jan.
2022
Docksey C (2017) Opinion 1/15: privacy and security, finding the balance. Maastricht J Eur Comparat
Law 24(6):768–773
482 R.-D. Veit

Doneda D, Mendes LS (2014) Data protection in Brazil: new developments and current challenges.
In: Gutwirth S, Leenes R, De Hert P (eds) Reloading data protection. Springer, Dordrecht, pp
3–20
Fleischer P (2007) Call for global privacy standards. Google Public Policy Blog.
https://publicpolicy.googleblog.com/2007/09/call-for-global-privacy-standards.html. Accessed 11 Jan. 2022
Geminn CL, Ledder S, Pittroff F (2016) Routing und die Arena seiner Aushandlung. In: Büttner B
et al (eds) Die Reterritorialisierung des Digitalen. Kassel University Press, Kassel, pp 23–59
Giesen T (2016) Euphorie ist kein Prinzip des Rechtsstaats. In: Datenschutz S (ed) Zukunft der
informationellen Selbstbestimmung. Erich Schmidt Verlag, Berlin, pp 23–47
Gömann M (2017) The new territorial scope of EU data protection law: deconstructing a
revolutionary achievement. Common Market Law Rev 54(2):567–590
Greenleaf G (2012) The influence of European data privacy standards outside Europe: implications
for globalisation of convention 108. Int Data Privacy Law 2(2):68–92
Greenleaf G (2017) Global data privacy laws 2017: 120 national data privacy laws, including
Indonesia and Turkey. Privacy Laws Bus Int Rep 145:10–13
Hartmann IA (2022) Self-regulation in online content platforms and the protection of personality
rights. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Hijmans H (2014) Right to have links removed: Evidence of effective data protection. Maastricht
J Eur Comparat Law 21(3):555–563
Hoeren T (2007) Zoning und Geolocation - Technische Ansätze zu einer Reterritorialisierung des
Internet. Multimedia Und Recht 10(1):3–6
Hoffmann-Riem W (2014) Freiheitsschutz in den globalen Kommunikationsinfrastrukturen.
Juristenzeitung 69(2):53–63
Johnson DR, Post D (1996) Law and borders—The rise of law in cyberspace. Stanford Law Rev
48(5):1367–1402
Klar M (2017) Die extraterritoriale Wirkung des neuen europäischen Datenschutzrechts. Daten-
schutz und Datensicherheit 41(9):533–537
Kokott J, Sobotta C (2013) The distinction between privacy and data protection in the jurisprudence
of the CJEU and the ECtHR. Int Data Privacy Law 3(4):222–228
Kranenborg H (2014) Article 8—Protection of personal data. In: Peers S et al (eds) The EU charter
of fundamental rights. Hart Publishing, Oxford and Portland, pp 223–265
Krisch N (2010) Beyond constitutionalism. Oxford University Press, Oxford
Kuner C (2014) The European Union and the search for an international data protection framework.
Groningen J Int Law 2(1):55–71
Kuner C (2015a) Data nationalism and its discontents. Emory Law J Online 64:2089–2098
Kuner C (2015b) Extraterritoriality and international data transfers in EU data protection law. Int
Data Privacy Law 5(4):235–245
Kuner C (2017) Reality and illusion in EU data transfer regulation post Schrems. German Law J
18(4):881–918
Ladeur KH (2013) New institutions for the protection of privacy and personal dignity in internet
communication. Braz J Public Policy 3(2):282–296
Lana de Freitas Castro E, Pereira Winter P (2014) O Conflito de Jurisdições em Caso de Violação
de Direitos da Personalidade por Publicação na Internet. Revista de Estudos Jurídicos UNESP
18(28). https://dialnet.unirioja.es/descarga/articulo/5191698.pdf. Accessed 11 Jan. 2022
Mangeth AL, Marinho Nunes B (2018) A proteção de seus dados pessoais está em jogo no Senado. ITS Rio.
https://feed.itsrio.org/senado-vs-c%C3%A2mara-seus-dados-pessoais-em-jogo-97d7b0cefc54. Accessed 11 Jan. 2022
Márton E (2016) Violations of personality rights through the internet: jurisdictional issues under
European law. Nomos, Baden-Baden
Mayer C (1996) Recht und Cyberspace. Neue Juristische Wochenschrift 49(28):1782–1791
Miller SF (2003) Prescriptive jurisdiction over internet activity: the need to define and establish the
boundaries of cyberliberty. Indiana J Glob Legal Stud 10(1):227–254
Safeguarding Regional Data Protection Rights on the Global Internet … 483

Orwell G (2016) 1984. Enrich Spot Ltd., Hong Kong
Reinhardt J (2022) Realizing the fundamental right to data protection in a digitized society. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Rossini C, Brito Cruz F, Doneda D (2015) The strengths and weaknesses of the Brazilian internet bill
of rights: examining a human rights framework for the internet. Global commission on internet
governance paper series, vol 19
Roßnagel A (2016) Wie zukunftsfähig ist die Datenschutz-Grundverordnung? Datenschutz und
Datensicherheit 40(9):561–565
Safari BA (2017) Intangible privacy rights: how Europe’s GDPR will set a new global standard for
personal data protection. Seton Hall Law Rev 47(3):809–848
Sarlet IW (2022) The protection of personality in the digital environment: an analysis in the light
of the so-called right to be forgotten in Brazil. In: Albers M, Sarlet IW (eds) Personality and data
protection rights on the internet. Springer, Dordrecht, Heidelberg, New York, London (in this
volume)
Schantz P, Wolff HA (2017) Das neue Datenschutzrecht. C.H. Beck, München
Schimke A (2022) Forgetting as a social concept. Contextualizing the right to be forgotten. In: Albers
M, Sarlet IW (eds) Personality and data protection rights on the internet. Springer, Dordrecht,
Heidelberg, New York, London (in this volume)
Schulz W (2022) Regulating intermediaries to protect privacy online—The case of the German
NetzDG. In: Albers M, Sarlet IW (eds) Personality and data protection rights on the internet.
Springer, Dordrecht, Heidelberg, New York, London (in this volume)
Schwartz PM (2013) The EU-US privacy collision: a turn to institutions and procedures. Harv Law
Rev 126(7):1966–2009
Schwartz PM (2014) In practice: the ‘California effect’ on privacy law. The Recorder.
https://www.law.com/therecorder/almID/1202635738034&In_Practice_The_California_Effect_on_Privacy_Law/#ixzz2pMcyFT6O. Accessed 11 Jan. 2022
Skouris V (2016) Leitlinien der Rechtsprechung des EuGH zum Datenschutz. Neue Zeitschrift für
Verwaltungsrecht 35(19):1359–1364
Svantesson DJB (2014) The extraterritoriality of EU data privacy law—its theoretical justification
and its practical effect on U.S. business. Stanford J Int Law 50(1):53–102
Torremans P (1995) Extraterritoriality in human rights. In: Neuwahl NA, Rosas A (eds) The
European Union and Human Rights. Kluwer Law International, The Hague, pp 281–296
Vasagar J, Fontanella-Khan J (2014) Angela Merkel backs EU internet to deter US spying. Financial
Times. https://www.ft.com/content/dbf0081e-9704-11e3-809f-00144feab7de. Accessed 11 Jan.
2022
Viellechner L (2017) Governing through transnational arrangements: the case of Internet domain
allocation. In: Paul R et al (eds) Society, regulation and governance: new models of shaping social
change? Edward Elgar Publishing, Cheltenham/Northampton, pp 106–120
Wagner J, Benecke A (2016) National legislation within the framework of the GDPR. Eur Data
Protect Law Rev 2(3):353–361
Zarsky TZ (2017) Incompatible: the GDPR in the age of big data. Seton Hall Law Rev 47(4):995–
1020

Raoul-Darius Veit Ph.D. Legal clerk at the Higher Regional Court Hamburg (Hanseatisches
Oberlandesgericht) and former Research Assistant at Prof. Dr Marion Albers’ Chair for Public
Law, Information and Communication Law, Health Law and Legal Theory at Hamburg University.
Member and Scholarship Holder of the Albrecht Mendelssohn Bartholdy Graduate School of Law
(AMBSL). Visiting Researcher at PUCRS and FGV Rio de Janeiro in 2017 and 2018 as part of the
Brazilian/German CAPES/DAAD PROBRAL Research Project “Internet Regulation and Internet
Rights”. Main areas of research: Data Protection Law, German Constitutional Law, European Law
and the Interrelations of National and EU Law. Selected Publications: Commentary of Art. 6 and
9 General Data Protection Regulation (GDPR), in: Heinrich Amadeus Wolff/Stefan Brink (Eds.),
Data Protection Law, Beck’scher Online-Kommentar, C. H. Beck München, 2018 et seqq., since
the 25th Ed. (in collaboration with Marion Albers); Commentary of §§ 22 and 23 Federal Data
Protection Act (Bundesdatenschutzgesetz—BDSG), in: Heinrich Amadeus Wolff/Stefan Brink
(Eds.), Data Protection Law, Beck’scher Online-Kommentar, C. H. Beck München, 2018 et seqq.,
since the 25th Ed. (in collaboration with Marion Albers).
Index

© Springer Nature Switzerland AG 2022
M. Albers and I. W. Sarlet (eds.), Personality and Data Protection Rights on the Internet,
Ius Gentium: Comparative Perspectives on Law and Justice 96,
https://doi.org/10.1007/978-3-030-90331-2

A
Accuracy, 88, 124, 205, 411, 415, 425, 428, 432, 433, 436–438, 452
Adequacy regime, 13, 471–473
Aida Curi, 149–151, 164
Algorithm, 6, 7, 10, 12, 80, 117, 126, 158, 161, 184–186, 268–270, 275, 276, 283, 284, 340, 407–416, 419–421, 423, 425, 430, 432–437, 440, 441, 449
Algorithmic discrimination, 1, 12, 407, 409, 416, 420, 421, 431, 432, 434, 436, 440, 441
Algorithmic governance, 407, 409, 432, 433, 436
Anonymity
-before Brazilian courts, 314
-concept of anonymity, 11, 322, 333
-prohibition on anonymity, 222, 330
-right to anonymity, 7, 337, 342–344
-under European human rights’ law, 318
Anonymization services, 84, 86, 94, 128
Anonymous expression of opinion, 337
Apollonia case, 188
App blocking, 317
Appropriate safeguards, 470, 474–476
Arbitrary watchfulness, 115, 116
Artificial intelligence, 3, 56, 75, 109, 270, 301, 414
Auditability, 432, 433
Autonomy, 22, 40, 63, 64, 166, 217, 319, 343, 358, 362, 365, 368, 372

B
Balancing, 10, 11, 24, 26, 28, 29, 57, 61, 91, 103, 107, 149, 150, 152, 155, 163–165, 170, 173, 197, 206, 233, 278, 298, 309, 310, 317, 318, 321, 322, 334, 370, 372, 399, 476
Behavioral economics, 63
Big data, 3, 12, 56, 59, 117, 126, 332, 407–409, 411–413, 416, 449
Binding Corporate Rules (BCRs), 475
Blasphemy, 325, 327, 328, 330
Brazilian Civil Code, 8, 37, 113, 144, 311, 361, 364
Brazilian Civil Framework for the Internet, 113, 120
-see also Marco Civil da Internet
Brazilian Civil Rights Framework for the Internet, 264, 367, 370
-see also Marco Civil da Internet
Brazilian dictatorship, 35
Brazilian doctrine, 138
Brazilian Internet Bill of Rights, 213–216, 218, 225, 228–231, 233–235, 237, 450
-see also Marco Civil da Internet
Brazilian private law, 36, 37, 39, 44, 50

C
Cambridge Analytica, 6
Candelaria Slaughter, 150, 164
Causation, 408, 412–414
Chilling effect, 43, 77, 78, 250, 278, 295, 329, 330, 345, 348
Civil liability, 46, 163, 171, 219, 223, 231, 233, 234, 236, 241, 242, 247, 249, 251–254, 256–258, 261, 263, 264, 315
Code, 11, 37–40, 44–47, 60, 75, 80, 85, 143–145, 151, 157, 169, 247, 249, 251, 252, 264, 267, 270, 274–276, 279, 282–284, 292, 293, 295, 300, 304, 327, 362, 366, 427, 430, 433, 434, 436, 440, 475
Commission Nationale de l’Informatique et des Libertés (CNIL), 42, 452, 458, 462, 463
Communication
-on the internet, 1, 2, 5, 69, 72, 74, 93, 190, 244, 320, 337
-temporal dimension, 187
Confidentiality of communications, 82, 123, 309, 322
Consent, 8, 42, 44–47, 57, 63–65, 96, 113, 114, 121, 127, 145, 146, 187, 191, 232, 298, 311, 313, 362, 364, 371, 372, 428–431, 436, 476
Constitution, 8, 9, 12, 25, 36, 37, 40, 41, 43, 44, 49, 78, 85, 89, 91, 113–115, 120, 134, 138, 142, 143, 145, 172, 214, 216, 217, 220, 222, 223, 227, 228, 236, 241, 252, 254–258, 264, 276, 279, 296, 297, 301, 309, 311, 312, 314, 316–318, 333, 337–339, 353, 356, 359, 370, 377, 386, 398, 399, 402, 423, 424, 454, 458
Constitutional law, 24, 26, 30, 31, 48, 68, 89, 91, 276, 277, 298, 337, 339, 345, 347, 348, 375, 377, 379, 380, 393, 394, 396, 398, 399, 402, 483
Consumer Protection Code, 45, 143, 154, 162, 167, 367, 423, 426–430
Content generated by third party, 11, 122, 230, 241, 242, 247, 249, 251, 252, 258, 259, 315
Content moderation, 11, 267, 287, 295, 303
Control, 5, 7–9, 27, 43, 47, 48, 59, 63, 66, 69, 71, 74–76, 86, 90, 96, 98, 104, 107, 108, 115, 116, 118, 119, 126, 128, 134, 145, 148, 153, 154, 161, 163, 167, 175, 189, 193, 205, 244, 248, 274, 275, 279, 283, 290, 291, 302, 319, 324, 355, 357, 360, 362, 413, 449–451, 454, 456, 457, 459, 468, 480
Copyright, 11, 61, 84, 214, 233, 234, 236, 237, 239, 249, 250, 256, 259, 264, 268, 269, 271, 273, 278, 279, 281, 300, 324, 350, 364, 377, 382
Correlation, 310, 408, 411–414, 416, 420–422, 434, 436, 439, 440
Council of Europe
-recommendation on the roles and responsibilities of internet intermediaries, 301
Credit Information Act, 427–431, 436, 438–441
Credit scoring, 12, 126, 407, 409, 414, 416, 421, 423, 426, 428–432, 434, 436–438, 440, 441
Cross-border
-data transfer, 449
-law enforcement, 463–480
Cryptography, 315
Cyberbullying, 310

D
Damages, 11, 39, 50, 92, 114, 119, 120, 122, 123, 140, 145, 146, 149, 151, 153, 155, 162–165, 171, 172, 190, 213, 222, 223, 226–228, 230–233, 235, 237, 241, 242, 245, 247–259, 261, 263–265, 311, 314–316, 349, 355, 360, 362, 364, 371, 382, 425–430, 438
Data doubles, 75
Data-driven economy, 407, 409, 410, 413, 416, 424, 480
Datafication, 1, 2, 5, 56, 59, 69, 70, 74, 105, 109
Data protection, 7, 8, 12, 18, 27, 35–37, 40, 42–44, 47–51, 55–58, 60–66, 91, 96, 97, 106, 113, 114, 134, 135, 137, 146, 160, 162, 167, 169, 175, 180, 186, 191–197, 199–204, 206–208, 213, 233, 258, 294, 303, 311–313, 333, 338, 342, 344, 355, 356, 360, 367, 370, 371, 377, 379, 380, 393, 402, 409, 423–427, 438–441, 445–451, 453–459, 462, 463, 465, 467, 468, 470, 471, 474, 475, 477–481
Data Protection Directive (EU-DPD), 5, 82, 181, 203, 446, 463
Data protection law
-adequate level of, 18, 27, 66, 459, 461, 469, 471–473, 477
-external dimension of, 13, 447, 456, 457, 460, 462–464
-global regime for, 453
-harmonization of, 445, 454
-in favour of the testator, 393
-in favour of the testator’s communication partners, 377, 380, 383, 394
Data protection rights, 1, 7–13, 44, 185, 320, 445, 448, 451, 453, 456, 472, 479, 480
Data retention
-bundle of constitutional requirements, 90
-concept, 80, 105
-German data retention laws, 85
-judgment of the FCC, 85, 87, 104
-precautionary data retention without cause, 87
-rulings of the ECJ, 95
-separation of storage and retrieval, 89
-structural pattern, 81
Data Retention Directive (EU-DRD)
-nullification, 99
-obligations of the providers, 82
Data security, 9, 63, 66, 80, 83, 84, 90, 91, 97, 103, 104, 108, 193
Data subject, 9, 23, 24, 57, 59, 63–65, 93, 113, 114, 136, 187, 191, 193, 194, 196, 199, 290, 313, 319, 320, 338, 355, 423–425, 428, 429, 434, 436, 439, 450, 464–468, 470, 471, 475–477
Data transferability, 65
Dataveillance, 75
Decision-making, 4, 9, 12, 28, 37, 38, 64, 87, 106, 130, 280, 293, 302, 407–409, 411, 412, 414, 415, 419, 420, 422, 423, 432, 433, 436, 439
Declaration of Independence of Cyberspace, 59
Defamation, 162, 173, 290, 293, 298, 310
Democracy, 9, 40, 41, 78, 115, 117–119, 135, 137, 138, 173, 197, 219, 243, 275, 280, 316, 346, 347, 357, 362, 450
Digital economy, 8, 62, 64
Digital estate
-account contracts, 382, 386, 387
  confidentiality obligations, 389
  personal contractual relationships, 388, 389
-definition, 380
-(online) data (on foreign servers), 381, 388
  asset-related, 390–392
  personal, 388, 390, 392
-right to access, 380, 384
Digital heritage, 9, 12, 355
Digital identity, 12, 358–360, 372
Digital legacy, 355, 362
Digital Rights Ireland, 66, 95–97, 99, 457–460
Digital testament, 363
Digitization, 2, 56, 69, 70, 74, 109, 181, 280
Dignity of the human person, 47, 134, 138–140, 142, 144, 215, 226, 256
Disclosure, 29, 63, 122, 125, 168, 201, 204, 223, 224, 241, 246, 250, 257, 258, 261, 262, 264, 294, 309, 310, 313, 318, 319, 322, 325, 327, 330–332, 334, 389, 427, 433–436, 450, 452
Discrimination, 6, 48, 77, 78, 120, 121, 126, 279, 323, 355, 357, 361, 364, 407–409, 415–423, 425, 426, 429–432, 436–441
Duties of protection, 170
Duty to protect, 7, 17, 22, 26, 27, 80, 391, 394, 399, 469
Dynamic interpretation of fundamental rights, 339, 352

E
ECommerce Directive, 289, 291, 292, 295, 296
Encryption, 75, 320, 330, 356
EPrivacy Directive, 55, 64, 82, 95, 99, 101–103
European Charter of Fundamental Rights, 56, 456
European Convention of Human Rights, 8, 478
European Court of Human Rights, 11, 17, 62, 309, 310, 318–320, 323, 334, 448, 459, 478
Explainability, 432, 434, 435
Extraterritorial application
-direct, 464
-indirect, 469
Extraterritoriality, 463, 469

F
Facebook, 6, 18, 19, 21, 23, 27, 30, 31, 47, 64, 71, 115, 118, 126, 129, 166, 174, 175, 197, 198, 203, 225, 242, 243, 246, 258, 260, 268, 270, 273, 277, 278, 281–284, 290, 292, 294, 297, 298, 307, 316, 317, 326, 328–332, 363, 370, 377, 378, 379, 381–383, 392, 449, 460
Fairness, 276, 280, 432, 433
Federal Supreme Court, 148, 223, 231–233, 236, 317, 353
Forgetting
-as a legal concept, 183
-as a social concept, 179
-in the social and cultural sciences, 183
-on the internet, 1, 10, 179, 181, 182, 184, 186, 445, 449
Freedom of expression
-as a condition for the full exercise of the right to internet access, 228
-as a foundation for the regulation of internet use, 218
-as a principle for regulating internet use, 220
-protection in the United States, 220
Freedom of thought, 216, 217, 314, 329
Free speech, 218, 223, 224, 231, 238, 242, 245, 260, 261, 276, 278, 279, 281, 297, 309, 310, 314, 316, 317, 321–323, 334, 454
Fundamental rights, 7–9, 11, 12, 16–18, 20–24, 27, 41, 43, 47, 49–51, 55–66, 68, 78–80, 85–87, 89, 95–97, 99, 103–109, 112, 115, 116, 118, 120, 133, 134, 136–142, 144, 145, 160, 162, 163, 165–168, 170, 171, 175, 182, 191, 192, 195, 205, 208, 213, 215, 218, 220, 228, 229, 237, 241, 250, 252–256, 258, 261, 264, 267, 270, 276–283, 289, 295–300, 312, 317, 319, 329, 337–340, 342–348, 351–355, 361, 365, 367, 368, 371, 372, 375, 377, 380, 384, 385, 394, 396, 398, 399, 424, 454, 456, 458, 459, 462, 465, 469, 471, 476–478, 480

G
General Data Protection Act, 146, 407, 423, 424, 427–429, 431, 436, 437, 440
-see also Lei Geral de Proteção de Dados
General Data Protection Regulation (GDPR), 5, 13, 48, 49, 55, 60, 63–66, 78, 82, 135, 181, 190–194, 200, 202–204, 207, 208, 258, 312, 371, 379, 384, 393, 402, 428, 435, 440, 445, 447, 449, 450, 453–456, 458–476, 478–481, 484
General right of personality, 139, 188, 205, 206, 383, 390
Geolocation, 317, 452
German Civil Code, 181, 383
German Network Enforcement Act, 5, 11, 230, 289, 293
Google Brasil, 154–157, 314
Google Spain, 5, 9, 55, 61, 62, 167, 303, 341, 449, 457, 458, 464
Guarantee of anonymity, 337
Guarantee of the confidentiality and integrity of information technology systems, 79, 394
Guarantee of the inviolability of telecommunications, 79, 86, 106

H
Habeas data, 43, 141, 311, 367, 424
Hate speech, 5, 11, 173, 242, 244, 245, 265, 269, 270, 273–277, 284, 289, 292, 293, 298, 303, 310, 321, 331, 339, 340, 448
Honor killing, 325
Horizontal dimension of human rights, 25
Horizontal effect of fundamental right, 22
Human dignity, 16, 25, 40, 112, 133, 137, 140, 144, 152, 175, 279, 319, 321, 324, 334, 391, 392, 398, 434

I
Identification, 42, 93, 94, 100, 117, 122, 125, 150, 153, 222, 223, 225–227, 232, 241, 242, 247, 248, 250, 253, 255, 260, 265, 273, 283, 315, 316, 318, 321, 325–327, 331, 333, 341, 359, 368, 391
Identity, 10, 31, 89, 96, 102, 124, 127–129, 140, 144, 162, 163, 166, 175, 184, 186, 222, 227, 243, 246, 257, 260, 279, 310, 315, 318–324, 326, 327, 330–332, 334, 341, 345, 346, 348–350, 355, 356, 358–360, 362, 388, 422
Information age, 59, 356
Informational self-determination, 25, 50, 51, 57, 63, 133, 138, 140, 141, 143, 145, 161, 166, 192, 204, 205, 207, 325, 337–339, 341–343, 347, 349, 355, 360, 368, 394
Information society, 28, 35, 51, 57, 68, 116, 127, 357, 360, 449, 465
Information technology, 3, 51, 57, 89, 126, 133, 134, 214, 356, 357, 380, 381, 408, 426
Intelligence services, 6, 9, 16, 70, 84, 85, 88, 90, 92–94, 108, 112, 115, 117, 453
Intermediaries, 1, 9, 10, 62, 71, 79, 161, 167, 182, 190, 203, 230, 234, 278, 282, 289, 291, 292, 297, 299–303, 315, 321, 353
Intermediary liability, 8, 11, 36, 213, 277, 287
Internet
-evolutionary stages, 3
-governance, 7, 10, 17, 24, 26, 28–31, 218, 239, 275, 313, 447, 449, 451, 453
-law enforcement on the, 302, 322
-re-territorialization of the, 452
-socio-technical arrangement, 1, 2
-technical foundations, 2
Internet of Bodies, 2, 3, 71
Internet of Things, 2, 3, 6, 56, 59, 71, 74, 81, 341, 343, 449
Internet regulation
-cross-sectional issues, 1, 4, 5
-fundamental issues, 1, 4, 5, 82
IP address, 83, 84, 93, 94, 102, 115, 124, 125, 129, 313, 315, 316, 452

J
Jurisdiction, 10, 13, 26, 85, 90, 103, 104, 122, 169, 174, 175, 188, 195, 291, 294, 296, 310, 312, 318, 320, 334, 338, 351, 399, 437, 440, 445, 448, 451–453, 458, 467, 468, 478

L
Law in context, 31
Lebach, 135, 141, 164, 187, 189
Legal comparison, 30, 31
Lei Geral de Proteção de Dados (LGPD), 8, 36, 48–51, 311, 314, 450, 453, 455, 466, 468, 470
-see also General Data Protection Act
Liability, 6, 11, 66, 93, 122, 146, 153, 154, 156–163, 167, 172, 190, 194, 195, 200, 207, 213, 214, 219, 223, 224, 227, 230–237, 247–249, 252, 255–257, 264, 270, 278, 292, 296, 309, 315, 321, 323, 351, 426, 429–431, 437

M
Marco Civil da Internet, 4, 8–11, 35, 46, 47, 70, 145, 309
-see also Brazilian Civil Framework for the Internet
-see also Brazilian Civil Rights Framework for the Internet
-see also Brazilian Internet Bill of Rights
Marketplace principle, 464, 467, 468, 477, 480
Media privilege
-as mechanism of coordination, 199, 202, 204, 206–208
-in Brazilian law, 10
-in the GDPR, 199, 202, 208
Memory
-as ‘ars’ and as ‘vis’, 184
-collective memory, 139, 140, 151, 165, 183, 184
-cultural memory, 183
-social memory, 185, 186
-web memory, 185
Metadata, 47, 66, 72, 75, 81, 83, 91, 118, 459
Mob justice, 325, 327, 328

N
National routing, 452
Neutrality, 5, 8, 36, 113, 120–122, 213, 312, 313, 351, 352, 357
Normative legitimacy, 445, 463, 477
Notice and takedown, 247, 249–253, 257, 281, 282
NSA-Scandal, 21

O
Objectification of interests, 64
Online archive cases, 181, 187
Online communication, 116, 202, 289–291, 310, 313, 321, 322, 331, 333, 334
Online evaluation portals, 12, 337, 345, 349
Online rating portals, 342
Overblocking, 293, 295, 299, 340

P
Personal dignity, 50
Personality profile, 342, 390
Personality rights, 5, 8, 11, 35–41, 44, 46, 48–51, 66, 114, 133–138, 140, 142–144, 150, 152, 153, 155, 157–159, 161–163, 165, 166, 170, 172, 173, 175, 190, 194, 196–198, 202, 206, 241, 248, 253, 257–259, 264, 267, 270, 276–278, 280–282, 289, 290, 294, 298, 311, 317, 334, 349, 350, 355, 356, 358–361, 365, 367–371, 373, 378, 383, 390–392, 448
Platform governance, 271–284, 291–304
Positive obligations, 8, 22, 55, 58, 60, 64–66, 322, 334
Power relations, 64, 73, 74
Press law, 10, 172, 179–181, 190–192, 194–197, 199–208, 223
Principle of purpose limitation, 8, 57, 90, 193, 205, 449
Privacy
-privacy language, 30
-privacy protection, 8, 20, 24, 28, 29, 58, 114, 128, 277, 429
-privacy protection beyond the State, 26
-private/public distinction, 59
Privacy shield, 5, 13, 460, 461, 472–474
Private life, 8, 11, 23, 27, 38, 43, 46, 47, 96, 106, 116, 118, 122, 127, 138, 144, 151, 152, 167, 172, 173, 254, 309–311, 313, 314, 317, 318, 321, 324, 325, 333, 334, 340, 343, 358, 361, 371, 379, 424, 456, 458, 459, 480
Profiling, 6, 103, 126, 130, 310, 320, 332, 334, 409, 414, 416, 420, 428, 429, 466
Proportionality, 25, 61, 62, 66, 84, 86, 88, 90, 91, 96, 97, 105, 106, 142, 163, 167, 171, 299, 301, 303, 346, 385, 423, 459
Protection of Records, 122
Provider, 3, 5, 11, 24, 28, 46, 61, 65, 69, 71, 79, 81–91, 93, 94, 97–99, 101–103, 105–108, 115, 121–123, 125, 126, 128, 133, 136, 146, 147, 153–162, 166–169, 171, 182, 192, 206, 214, 219, 226, 228, 230–236, 241, 247, 249, 251–253, 256–260, 263, 264, 279, 291–299, 315, 320, 321, 333, 337, 350, 351, 371, 378, 380–384, 388–397, 401, 403, 448, 464, 478
Pseudonymity, 310, 315, 322, 323, 326, 330
Publicity, 20, 121, 149, 247, 319, 324, 326, 360, 371
Public-private-dichotomy, 17, 20
Public space/sphere, 7, 20, 21, 29, 56, 59, 115, 198, 199, 217, 242, 279

Q
Qandeel Baloch, 327
Quadrature du Net and Others, 95, 100, 101

R
Regulation
-self-regulation, 11, 59, 60, 174, 229, 230, 267, 270–274, 276, 277, 279–284, 292, 293, 295, 304, 471
Responsibility, 5, 22, 26, 27, 31, 59–61, 65, 74, 91, 123, 133, 157, 161, 182, 186, 190, 194, 229, 248, 258, 259, 282, 293, 299, 301, 302, 314, 318, 347, 348, 427, 432, 461, 476
Right of inheritance
-individual right, 385
-legal institution, 41, 385
-limited guarantee of continued existence, 385
-universal succession, 383, 386
Right to a free development of the personality, 138
Right to a new beginning, 139
Right to be forgotten
-constitutional dimension, 188
-European Court of Justice, 17, 179, 187, 195, 449
-in data protection law, 10, 179–181, 187, 190–192, 194, 195, 207
-in press law, 181, 190, 192, 195
Right to data protection, 56, 58, 66, 79, 96, 114, 360, 371, 459, 477
Right to deindexation, 133, 146, 153, 154, 160–162, 170
Right to deleting data, 133
Right to erasure of data, 136
Right to honor, 140
Right to know, 43, 80, 130, 428
Right to one’s own image, 140
Right to personal identity, 260
Right to privacy, 17, 19–21, 24–26, 29, 30, 36, 43, 50, 56, 62, 66, 79, 114, 115, 120, 124, 140, 144, 151, 228, 309–311, 313, 315, 317, 319, 323–325, 334, 453
Right to reinvent oneself, 139
Right to resocialization, 141, 143
Right to respect for private and family life, 318

S
Safe Harbour Agreement, 474
Safe Harbour Doctrine, 18
Satamedia decision, 201, 203
Schrems Case I, 462
Schrems Case II, 460, 462, 471, 475, 478
Search engine, 5, 24, 61, 133, 136, 153–163, 165–167, 169–172, 174, 181, 183–185, 188–190, 195, 196, 201, 203, 207, 231–233, 236, 290, 300, 449, 457, 458, 466
Search engine companies, 3
Secrecy of telecommunications
-application and interpretation of statutory law in conformity with constitutional law, 398
-duty to protect, 394, 399
-individual right, 385
-practical concordance with the right of inheritance, 399
Secret application case, 226, 228
Self-constitutionalization, 60
Shareconomy, 3, 5
Social media, 7, 19, 21, 29, 30, 61, 62, 70, 129, 147, 153, 195, 196, 269–271, 275, 278, 279, 281, 282, 284, 290, 291, 297, 300, 326, 328–330, 332, 342, 360, 377, 448
Social network, 3, 5, 11, 21, 24, 26, 27, 60, 64, 65, 153, 182, 196, 199, 201, 229, 233, 242–253, 255, 260, 265, 278, 279, 293–295, 316, 321, 339, 340, 346, 358, 362, 369, 378, 380–383, 386, 388, 393–397, 402
Specific court order, 241, 251, 252, 255, 256, 258, 264, 315
Spickmich decision, 196, 201
Spiral of silence, 345
Standard Data Protection Contract Clauses, 475
Statistics, 298, 332, 413, 430
Superior Court of Justice, 45, 135, 148, 234, 236, 241, 245, 247, 248, 370, 429
Surveillance
-bulk surveillance, 75, 105, 106, 109
-core element of, 73
-essential function, 73
-mass surveillance, 69, 71, 77, 105, 224
-methods, 73, 76, 78
-modes, 76
-process with several elements, 73, 105
-protected interests, 9, 78, 79
-protection needs, 9, 69, 72, 75, 76, 78, 109
-public-private assemblages, 6, 9
-requirements for surveillance in the Internet Age, 104
-surveillance society, 73
-under the conditions of the Internet, 9, 74, 101, 104, 105, 127
Surveillance overall accounting, 89, 90

T
Tele2 Sverige, 95, 99–101, 103
Telecommunications, 4
Telecommunications Act (TKG), 45, 46, 84, 85, 87, 93, 94, 384, 391–394, 396–402
Telecommunications providers, 71
Traffic data, 82, 84, 88–94, 97, 105, 106, 115
Transparency, 6, 9, 45, 48, 65, 90, 93, 118, 135, 206, 283, 295, 323, 407, 409, 423–425, 429, 430, 432–436, 440, 449, 450
Trust, 45, 57, 76, 89, 268, 364, 372, 389, 391

W
WhatsApp, 47, 316, 317, 362

X
Xuxa case, 154, 156, 164

Y
Youtube, 30, 127, 261, 262, 268, 269, 271, 273, 274, 276, 279–281, 300, 316, 326, 362, 377

Z
Zoning, 452
