
Slide 1

My name is Charles Woods, and I am nervous. I am also an assistant professor of English at
Texas A&M University-Commerce. In 2022, I founded the Digital Rhetorical Privacy Collective
along with five colleagues: Morgan Banville, Chen Chen, Gavin Johnson, Cecilia Shelton,
and Noah Wason. The Digital Rhetorical Privacy Collective is an interactive, coalitional resource
which hosts events and features reading lists, activities, assignments, lesson plans, and other
teaching materials for teaching about digital privacy and surveillance. The DRPC’s mission is to
bridge the scholarly and public conversations about surveillance and privacy to enact coalitional
action dedicated not only to ending oppression under surveillance capitalism but also to building
equitable futures for all. My talk today is called, “What is Rhetorical Privacy?: The Uncertainty
of Binaries Operating in Communication and Across Culture.”

Slide 2
A quick introduction: Digital rhetorical privacy is a state of being in which a user is confident their
digital data is free from unauthorized observation by nefarious computer technologies and other
users. Defining “nefarious” means looking at the data that is collected and how it is used. This
multi-pronged approach was originally developed to study data privacy in Terms of Service
documents and draws on historical and recent scholarship in digital rhetorics, genre studies, and
technical communication. It’s a pliable approach, but it is incomplete, and today, I seek feedback
on extensions and complications of this approach.

Digital Rhetorical Privacy includes intertwined analytic elements. The six original elements are
temporality, transparency, language, data usage, digital surveillance, and meaningful access.
Through collaboration with colleagues from the DRPC, I have extended the framework to include
three additional elements: design (with Johnson), data usage for training AI (with Banville), and
interlocking consent, which I developed. A few other notes as we embark on this presentation:
privacy and surveillance exist on a continuum. Privacy is contextual (rhetorical) and relies on
complex histories and legalities. We can’t set surveillance aside in discussions of privacy.

Slide 3
In relation to the theme of the symposium, I am interested in data leaks, which I would describe
as a failure of privacy protections. I think Terms of Service documents are the most influential
genre in the world, and, as has been noted, we aren’t reading them. And I think understanding
Terms of Service documents as a place for safety—or security—rooted in an understanding of
data privacy and digital surveillance helps us take, perhaps, a coalitional approach like the
DRPC’s in working towards equity on the privacy-surveillance continuum today. I ask: who stands to
benefit when privacy erodes? And I ask us to look at the design of privacy policies and ask: How
does the design of the policy support and/or work against the stated values of the company/data
collecting body? Is user consent positioned as a value within the design of the policy? What, if
any, design elements help to balance asymmetrical power structures embedded in privacy
policies? Who is made safe?

Slide 4
Terms of Service documents offer sites for critical evaluations of failure or safety from rhetorics
of science, technology, and medicine. To me, direct-to-consumer genetics companies like
23andme and Ancestry.com offer a place to intervene in and across all three sectors. 23andMe
works with pharmaceutical companies, the American law enforcement apparatus, other Big Tech
companies and other third parties, all of which is outlined in their Terms of Service documents.
For example, we might consider 23andMe selling its genetic data to Spanish pharmaceutical
company Almirall in January 2020 an act of digital aggression. Today, I have noted relationships
between my project and connections to projects about adoption and DTC genetics and am
excited to learn more about Avery Edenfield and Rachael Jordan’s work. So, today, I offer a
rhetorical analysis and criticism of 23andMe documents as a genre that exists as part of a post-
surveillance culture.

Slide 5
In a forthcoming co-authored piece in Communication Design Quarterly, I define “post-
surveillance” as an “affective orientation highlighted by users not only expressing collective
surveillance apathy regarding the implementation of New surveillance technologies with the
intention of bodily control but also a willingness to participate in practices that aid in the
expansion of global surveillance infrastructures” (p. #). We detail a post-surveillance era defined
by collective surveillance apathy regarding the emergent data infrastructures of our global
society and digital lives, emphasizing the temporal element of surveillance and stopping just
short of detailing a post-surveillance culture that exists across time and space and accounts for
people’s interlocking attitudes, beliefs, and values about, and ideologies, epistemologies, and
aesthetics regarding surveillance. For me, Terms of Service documents are the most influential
genre in this post-surveillance culture, which is defined by The Privacy Aesthetic.

Slide 6
The Privacy Aesthetic represents a move from analysis to aesthetics: from a privacy rhetoric to a
privacy aesthetic. The Privacy Aesthetic stratifies an oblique orientation to pressing ontological
concerns and seeks to deconstruct dominant and oppressive epistemologies regarding ubiquitous
privacy and surveillance. “The Privacy Aesthetic” includes attuning to the oblique ubiquity of
rhetorics of privacy and surveillance; recognizing the influence of ToS documents; integrating
interlocking consent as a value; understanding the intersection of “the body” and “the digital” as
essential for new surveillance technologies; and, considering the importance of space regarding
data collection. I am interested in merging this orientation to data privacy and digital surveillance
with an emphasis on the design of privacy policies, including the work of Tijmen Schep.

Slide 7
In her work, artist Dinie Besems explores privacy as a luxury and how we invite devices into
our private spaces, and designer Jesse Howard values privacy in the redesign of digital
technologies. In coordination, artist Tijmen Schep (2016) outlines eight principles coalescing
around security and authenticity: (1) privacy first, (2) think like a hacker, (3) collect as little data
as possible, (4) protect the data, (5) understand identity, (6) open the black box, (7) turn the user
into a designer, (8) technology is not neutral. The Privacy Aesthetic I describe aligns with these 8
principles to contend with privacy erosion caused by constant privacy slippage.

Slide 8
Schep’s protocols are relevant across technologies, but I am interested in their application to and
potential influence on Terms of Service documents. Schep necessitates the incorporation of
privacy features at the earliest stage as a way of working against dark patterns, wherein users
give up more data than they would want. How might this, then, prompt us to consider the design
of the privacy policy? Schep calls for manufacturers and designers to anticipate and respond
quickly to problems, and to create a culture of privacy within the organization. And, to highlight
the importance of geo-location, we might note the differences in data collection in America and
in the EU, where justification must satisfy need. So, a question: If responsibility is shared among
stakeholders, what responsibility do users have to identify privacy features in a post-surveillance
culture?

Slide 9
With Schep’s first principle in mind, we notice 23andMe’s Privacy Statement, a separate
document from the Terms of Service document. Additionally, 23andMe has a Medical Record
Privacy Notice for users who seek medical advice through their services. As Wason and I argued,
having so many policies available via hyperlink can feel like a wild goose chase to find
information. The 23andMe Privacy Statement has an entire section devoted to explaining the
differences between aggregate information vs. personal information, too. This is a start to
Privacy First, but further examination of third-party relationships reveals privacy erosion. This
includes the ways in which 23andMe works with law enforcement and, as I mentioned earlier,
sells data to pharmaceutical companies like Almirall.

Slide 10
Further examination of privacy policies and data leaks as failures is needed, as are extensions
and complications to this approach. The first co-authored work from the DRPC has been accepted,
focusing on the privacy-surveillance continuum. The DRPC has grants from CPTSC and CCCC
to continue this project. Invitation: Collaborate with the DRPC (drpcollective.com).
