
108 The Claim to Privacy

Bigo, Didier. (2006) Security, Exception, Ban and Surveillance. In Theorizing Surveillance. The Panopticon and Beyond, edited by David Lyon. Devon: Willan.
Garland, David. (1996) The Limits of the Sovereign State. British Journal of Criminology 36(4): 445–471.
de Goede, Marieke. (2012) Speculative Security. Minneapolis: University of Minnesota Press.
Haggerty, Kevin D., and Richard V. Ericson. (2000) The Surveillant Assemblage. British Journal of
Sociology 51(4): 605–622.
de Hert, Paul, and Serge Gutwirth. (2006) Privacy, Data Protection and Law Enforcement.
Opacity of the Individual and Transparency of Power. In Privacy and the Criminal Law, edited by
Erik Claes, Antony Duff, and Serge Gutwirth. Oxford: Intersentia.
Lyon, David. (2001) Surveillance Society. Monitoring Everyday Life. Buckingham: Open University
Press.
Mitsilegas, Valsamis. (2003) Money Laundering Counter-Measures in the European Union: A New
Paradigm of Security Governance versus Fundamental Legal Principles. The Hague: Kluwer Law
International.
Mitsilegas, Valsamis. (2005) Contrôle des Étrangers, des Passagers, des Citoyens: Surveillance et Anti-terrorisme. Cultures et Conflits 58: 155–182.
Mitsilegas, Valsamis. (2009) EU Criminal Law. Oxford: Hart.
Mitsilegas, Valsamis. (2012) Immigration Control in an Era of Globalisation: Deflecting
Foreigners, Weakening Citizens, Strengthening the State. Indiana Journal of Global Legal Studies
19(1): 3–60.
Solove, Daniel J. (2008a) Data Mining and the Security-Liberty Debate. University of Chicago Law
Review 75(1): 343–362.
Solove, Daniel J. (2008b) Understanding Privacy. Cambridge: Harvard University Press.

Security and the Claim to Privacy


Louise Amoore
Durham University

When US President Barack Obama publicly addressed the data mining and analysis activities of the National Security Agency (NSA), he appealed to a familiar sense of the weighing of the countervailing forces of security and privacy. “The people at the NSA don’t have an interest in doing anything other than making sure that where we can prevent a terrorist attack, where we can get information ahead of time, we can carry out that critical task,” he stated. “Others may have different ideas,” he suggested, about the balance between “the information we can get” and the “encroachments on privacy” that might be incurred (Obama 2013). In many ways, conventional calculations of security weigh the probability and likelihood of a future threat on the basis of information gathered on a distribution of events in the past. Obama’s sense of a trading-off of security and privacy shares this sense of a calculation of the tolerance for the gathering of data on past events in order to prevent threats in the future. In fact, though, the very NSA programs he is addressing precisely confound the weighing of probable threat, and the conventions of security and privacy that adhere to strict probabilistic reasoning. The contemporary mining and analysis of data for security purposes invites novel forms of inferential reasoning such that even the least probable elements can be incorporated and acted upon. I have elsewhere described these elements of possible associations, links, and threats as “data derivatives” (Amoore 2011) that are decoupled from underlying values and do not meaningfully belong to an identifiable subject. The analysis of data derivatives for security poses significant difficulties for the idea of a data subject with a recognizable body of rights to privacy, to liberty, and to justice.
Consider, for example, two years before the controversies surrounding NSA
and the PRISM program, when the US Director of National Intelligence, James
Clapper, testified before the joint hearing of committees on the virtues of the
algorithmic piecing together of fragments of data:

The most valuable national intelligence is the huge collection of databases of routinely collected information that can be searched by computer algorithm. An analyst may know that a potential terrorist attacker is between 23 and 28 years old, has lived in Atlanta Georgia, and has travelled often to Yemen. That analyst would like to be able to very rapidly query the travel records, the customs and border protection service, the investigative records of the State Department … we made important progress after the December 2009 attempted bombing of an aircraft over Detroit, but there remains much more to be done. (US Senate Committee 2011:16)

What is sought in the Director of National Intelligence’s vision is not the probable relationship between data on past activities and a future terrorist attack, but more specifically a potential terrorist, a subject who is not yet fully in view, who may be unnamed and as yet unrecognizable. The security action takes place on the terrain of a potential future on the horizon, a future that can only be glimpsed through the plural relations among data points.
Why does the analysis of fragments of correlated data derivatives matter to our sense of privacy, both as a concept and as a body of rights? Contemporary forms of security do not primarily seek out a complete data set of information on a person, but rather seek to assemble sufficient correlated data points to make inferences about risk or threat possible. The question of what constitutes personal or private data thus becomes more complex than identifying the accessing of private content. In the public debate on the PRISM program, for example, the idea of “meta-data” mobilizes a distinction between the spoken content of a telephone conversation and the meta-elements of call number, location, time of call, and so on. In fact, though, the complete content of a call event does not need to be captured in order for a security action to take place. The derivative meta-data of associative links between elements represent highly valuable data points for social network analysis techniques that map so-called “patterns of life.” In a sense, the content of the “dots to be connected” matters less to this kind of analysis than the relations between them. The absence of a complete record thus becomes a security virtue because it makes it more difficult to assume that content is innocent—in effect, apparently innocent content can become suspicious at the level of meta-data because it is subsequently associated with other things. For the individual, who is both the subject of privacy rights and the data subject of data protection laws, contemporary security appears indifferent to the person as such, while attentive to the multiple links and associations among plural people and things.
Put simply, contemporary forms of security are less interested in who a suspect might be than in what a future suspect may become; less interested in the one-to-one match of the watch list or alerts index database, and more interested in the signals of real-time predictive analytics. The one-to-one match with an identified individual gives way to what Gilles Deleuze (1992:3) calls the “dividual”—a “torn chain of variables” dividing the subject within herself, recombining with the divided elements of others. The signature of the dividual that is sought by the security software does not “belong” to a subject as such—it does not sign off on past events, it signals possibilities. Thus, for example, an agreement to “depersonalize” data in a database after a period of six months, as in the case of the EU-US agreement on passenger name record (PNR) data, scarcely matters when what is left is a dividuated signature that allows for the writing of new code. So, a subject’s specific and named journey on a particular date may no longer be attributable in the database, but it persists as a source for new algorithmic code to act on future subjects.
Among the implications of the data derivative, then, is the capacity for little pieces of past data to become mobile, to detach themselves from identifiable data subjects, and to become active in new security decisions. Some dividuated data elements are disaggregated from the remainder and become drawn into association with other elements derived from other subjects. In this process of disaggregation and reassembly, the person appears less as a singular body than as a plural set of variables which can have an onward life as they connect with the lives of others. For example, the 2009 New York subway suspect, Najibullah Zazi, the 2009 Northwest Airlines Detroit bomber, Umar Farouk Abdulmutallab, and the 2010 Times Square bomber, Faisal Shahzad, could be offered as examples of the failure of systems of data analysis to act upon individuals. Yet, though none of the suspects were preemptively intercepted on the basis of the analysis of their travel or financial data, the fragments of data did reappear as a chain of variables for future algorithmic searches. In the analysis of the US Attorney General, they had ceased to be identifiable subjects and had emerged as “operatives with British or American passports or visas who had visited South Asia and had returned to the US over (redacted) time period” (2010:4). It is the chain of variables of their data fragments that lives on in the algorithmic rules of contemporary border security analytics.
If the subject of data-driven security measures is a fractionated subject whose missing elements are a resource to analytics technologies, then can we meaningfully conceive of a right to privacy of the dividual? Such a form of privacy would adhere not to identifiable individuals, but to the associations and links between multiple pieces of multiple people’s lives. If what is collected, analyzed, deployed, and actioned in contemporary security is not strictly personal data elements, but a “data derivative”—a specific form of abstraction like a financial derivative, that can have value that is decoupled or only loosely tied to its underlying coordinate—can privacy have meaningful purchase on this slippery derivative? One response has been to seek to restrict the use of personal data and to assert the right to be deleted or forgotten (Mayer-Schönberger 2009). But the data derivative makes associations and links that will always exceed a specified use, will travel, and circulate with new effects and implications. Similarly, the design of new “sovereign information sharing” systems that promise to protect the “integrity of the database” appears to protect individual privacy while allowing their derivatives to circulate (Agrawal and Srikant 2010). By running the analytics within separate databases (airline passenger name record data, national authorities’ alerts index data, and so on), and aggregating only the high-risk scores and positive matches in a “master calculation,” the system uses the associations between subjects without revealing the full picture to the data owners. The right to privacy of the individual is thus inadequate to the task of inhibiting the spread of unmoored and dividuated pieces of data that are only partially attributable to a subject.
So, what might it mean to have data protection, or to limit use to a defined purpose, in a world of abstracted data derivatives? The gathering of evidence of terrorist activities after the fact is seeping into the preemptive search for indicators of intent capable of anticipating future possible infraction. In this sense, for example, what is significant about the UK’s GCHQ generating 197 leads from PRISM in 2012 is not the 197, but how the 197 are arrived at. The 197 represent persons of interest who have come to attention after a process of running large volumes of data through the analytics software. This process is described by the industry as one of “knowledge discovery” in which the algorithm “picks out the strongest relationships” and partitions and segments the data on this basis. Because the knowledge discovery process does not begin with a defined purpose or criteria, it defies the logic of restricting the purpose of data analysis. It is the analytics that govern what the elements of interest should be—this travel associated with this financial transaction, associated with this presence in an open source online community, associated with this presence on Facebook, for example. So, put simply, the vast bulk of the filtering and analysis happens before any named individuals or lists are identified. Can a data subject ever give meaningful consent for their data to be analyzed in bulk, along with terabytes of the data of other people’s lives and transactions? Taking seriously the implications of contemporary data analytics, it is not strictly the privacy of the individual that is infringed by programs such as PRISM; rather, the harm is in the violence done to associational life, to the potentiality of futures that are as yet unknowable (de Goede 2012). In effect, the data do not have meaning until sense is made of them by the relational structure of the analytics. One could protect data very effectively and yet still the inferred meanings of the analytics would limit life chances and close down potential futures.
Perhaps the most significant harm, though, lies in the violence done to politics itself, and to the capacity to make a political claim. Where politics exists because of intractability and irresolvability, because of difficulty, as Thomas Keenan (1997) puts it, contemporary data-led security offers resolution through computation, promising to resolve the incalculable and the intractable. The computational and algorithmic turn in security has no tolerance for the emergent or half-seen figure at the edges of perception, no interest in an unfulfilled future potential. “All possible links” are wrought in the correlative relations that are drawn between bodies as they are disaggregated into degrees of risk. There is a challenge for law and human rights when a juridical sensibility of evidence and evaluation meets a set of security measures that demand the projection of future possible events yet to take place. Where the balance of probability may discount unverified fragments, computation lends greater weight to the associated elements, such that once assembled together their singular uncertainty is less easy to see.
So, is privacy as a body of rights adequate to the task of protecting the capacity to make a political claim? If the politics of privacy is to locate space to challenge the analysis of “big data” (Boyd and Crawford 2013), then one step may be to make a distinction between privacy as a body of rights and privacy as a claim that can be made. The body of rights, as Costas Douzinas (2000, 2012) compellingly reminds us, embodies an imaginary unity of the self, a clearly identifiable subject, while the rights claimant is fractured and split. While the right to privacy itself is readily reconciled within the technology—“anonymizing,” “masking,” or “depersonalizing” the data, for example—the fractured subject falls beneath the register of recognizable rights. What matters politically, then, is that claims to privacy are made, even in the face of the impossibility of meaningful juridical redress. As the layering of software calculations and automated security decisions becomes ever more difficult to prise open and call to account, the claim to privacy nonetheless illuminates the limit points of what can be made transparent or accountable. Where the search for a definitive body of privacy rights will necessarily fall short of marking a place for all future claims, the act of making a claim is a political demand that can be made by those whose experiences and lives are not registered in an existing body of rights.

References
Agrawal, Rakesh, and Ramakrishnan Srikant. (2010) Enabling Sovereign Information Sharing Using Web Services. SIGMOD, Paris, France.
Amoore, Louise. (2011) Data Derivatives: On the Emergence of a Security Risk Calculus for Our
Times. Theory, Culture and Society 28(6): 24–43.
Boyd, Danah, and Kate Crawford. (2013) Critical Questions for Big Data. Information,
Communication and Society 15(5): 662–679.
Deleuze, Gilles. (1992) Postscript on the Societies of Control. October 59: 3–7.
Douzinas, Costas. (2000) The End of Human Rights: Critical Legal Thought at the Fin-de-Siècle. Oxford: Hart.
Douzinas, Costas. (2012) The Paradoxes of Human Rights. Constellations 20(1): 51–67.
de Goede, Marieke. (2012) Speculative Security: The Politics of Pursuing Terrorist Monies. Minneapolis:
University of Minnesota Press.
Keenan, Thomas. (1997) Fables of Responsibility: Aberrations and Predicaments in Ethics and Politics.
Stanford, CA: Stanford University Press.
Mayer-Schönberger, Viktor. (2009) Delete: The Virtue of Forgetting in the Digital Age. Princeton, NJ: Princeton University Press.
Obama, Barack. (2013) Remarks at White House Press Conference, August 9 2013. Available at
http://www.whitehouse.gov/the-press-office/2013/08/09/remarks-president-press-conference.
(Accessed November 29, 2013).
US Attorney General. (2010) Review of Passenger Name Records Data. Washington, DC: Office of
the Attorney General.
US Senate Committee. (2011) Ten Years After 9/11: Is Intelligence Reform Working? Washington, DC: US Senate Committee on Homeland Security and Governmental Affairs.

Data Protection, with Love


Rocco Bellanova8
Peace Research Institute Oslo (PRIO) and Université Saint-Louis - Bruxelles

The paper is not critical, but neither is it neutral. For we happen to like, no, even better, to love the Zimbabwe Bush Pump in all of its many variants. But even if affection moves our writing, this is not an exercise in praise. Rather, we want to analyze the specific quality that attracts us to the Zimbabwe Bush Pump. (de Laet and Mol 2000:225, emphasis in original)

How to Engage with the Politics of Privacy in the Age of Preemptive Security?
My suggestion is to start with data protection, which, following De Hert and Gutwirth (2006), is not exactly the same as privacy. Extrapolating from their analysis of the two as different “legal tools,” I would say that privacy and data protection are two slightly different rationales of power relations: one of privacy based on the “opacity of the individual” and one of data protection on the “transparency and accountability of the powerful” (Gutwirth and De Hert 2008:275; emphasis in original).9 These rationales (attempt to) orientate two different loose dispositifs, each

8 Author’s notes: I would like to thank Anthony Amicelle, J. Peter Burgess, Katja de Vries, Anne Duquenne, Raphaël Gellert, and Mareile Kaufmann for their comments. The editorial feedback of Marieke de Goede and Mark B. Salter helped to sharpen my argument (in particular on the “undecidability” of data protection).
9 A twofold conceptualization of privacy and data protection is also present in the text of the European Union Charter of Fundamental Rights, which provides for two different articles: Article 7 on the “respect for private and family life” and Article 8 on the “protection of personal data.” The differentiation between privacy and data protection is not without critics (for example, Rouvroy and Poullet 2009), nor is it a clear-cut one (for example, Bygrave 2001). The notion of data protection is a rather European construct: In the United States, there is a stronger tendency to speak about informational or data privacy (Solove, Rotenberg and Schwartz 2006), or contextual integrity (Nissenbaum 2004), and to consider it not as an autonomous concept but as a sub-category of privacy (Solove 2006).
Copyright of International Political Sociology is the property of Wiley-Blackwell and its
content may not be copied or emailed to multiple sites or posted to a listserv without the
copyright holder's express written permission. However, users may print, download, or email
articles for individual use.
