
Journal of the Association for Information Systems (2021) 22(3), 571-578

doi: 10.17705/1jais.00672

POLICY EDITORIAL

ISSN 1536-9323

Policy for Ethical Digital Services

Richard O. Mason¹

¹ Cox School of Business, Southern Methodist University, USA, rmason@mail.cox.smu.edu

John L. King was the accepting senior editor. This editorial was accepted on February 27, 2021.

    Our values, our institutions, democracy, human rights, in my view are being challenged right now because of the emergence of new technologies.1
    – Casper Klynge, Denmark's foreign ambassador to the technology industry

1 On Policy Making and Human Values

How does information technology affect human values? There is much contemporary discussion of this question in accounts of "surveillance capitalism" in a sociotechnical context (Zuboff, 2019),2 as well as in research on concepts of dignity regarding personal data (PD), sometimes called personal attribute data (PAD), in the process of digitalization (Leidner & Tona, 2021). This paper discusses policy for PD as an overarching framework in a sociotechnical system. Such policy makes the system more acceptable to stakeholders, some of whom carry more power and influence than others.3 Policy is political, as well as technical and analytical. Changes in science, technology, social mores, and unanticipated disruptions like the COVID-19 pandemic dislocate sociotechnical systems and sometimes require new policy. Policy must be guided by values,4 either consciously or unconsciously. Policy assumes, expresses, and helps create a system of human "values," such as "respect for persons and the maintenance of individual dignity" (Vickers, 1965, p. 29).

2 Respect for Persons or Individual Dignity

A key value for policy is respect for persons or individual dignity. This concept is emergent, gaining meaning over time. By the 19th century, it had evolved to mean that human beings have an "unconditional moral worth and should always be treated as if there is nothing of greater value than they are."5 This is from the second formulation of Immanuel Kant's categorical imperative: "act as to treat humanity, whether in thine own person or in that of any other, in every case as an end withal, never as a means only" (emphasis added, Kant, 1964, p. 96). Kant places no time limit on this imperative. It works irrespective of disruptions. People should be treated with respect and individual dignity, even after death. This seems especially relevant in an age of pandemic, when some hold economic values above respect for persons and individual dignity.6

1 The New York Times, Sept 4, 2019, B1.
2 "Sociotechnical" implies the interrelatedness of the social and the technical aspects of a system as a whole. Sociotechnical systems are interlinked, a mixture of human beings, knowledge, technology, and the environment. Each works together to reach a condition of joint optimization (see Trist & Bamforth, 1951, p. 14).
3 Stakeholders are claimants inside and outside the sociotechnical system who have a vested interest in the policy issues facing the system (see Mason & Mitroff, 1981, p. 43).
4 Values are good, desirable, and important to stakeholders. Value must be the object of a preference, or a judgment of importance for any state of a system, or thing, attitude, ideal, purpose, or goal (see Reese, 1980, p. 604).
5 https://rhchp.regis.edu/HCE/EthicsAtAGlance/RespectForPersons
6 https://www.marketwatch.com/story/how-do-you-choose-between-economic-deaths-of-despair-and-coronavirus-victims-economists-lawmakers-grapple-with-a-moral-conundrum-2020-03-26


Respect for persons and individual dignity should be applied to PD that describe a quality or feature of an individual person or an inherent part or characteristic of that person. When PD include data about attributes and behavioral data that attach to given individuals, it becomes personally identifiable information (PII).7 In online marketing, behavioral data (information produced as a result of actions, typically commercial behavior using a range of devices connected to the internet, such as a PC, tablet, or smartphone) can track sites visited, apps downloaded, and games played.8 An individual's PD might not provide exclusive identifiers like PII, but each additional element of PD can describe more about who the person is or might become. Behavioral data reflect a user's past behavior and can be used to predict and influence subsequent behavior. Issues of respect for persons and individual dignity can arise with PII or PD because they are extensions of the person. This is an emerging concept, as the case of Henrietta Lacks suggests.

3 Lessons from Henrietta Lacks

Henrietta Lacks was a poor, 30-year-old African American mother of five who moved from Clover, Virginia, "a land of wooden slave quarters, faith healings, and voodoo," to Baltimore.9 She was diagnosed at Johns Hopkins Hospital with aggressive cervical cancer and died on October 4, 1951. Hers was more than a biological case: it was a data case. After her death, physicians found that Lacks's cells carried the data to divide repeatedly and survive indefinitely in culture. Her "immortal" cells, labeled "HeLa" (an acronym comprised of the initial letters of her first and last names), were the first "immortal" human cells grown in culture. They live today and are used worldwide to study the effects of nuclear radiation, for research on cancer, and to develop techniques of in-vitro fertilization, cloning, and gene mapping. HeLa was also used to develop COVID-19 vaccines. HeLa launched a medical revolution and a multimillion-dollar industry. Yet neither Lacks nor anyone in her family benefited financially. Aside from the fact that her cells (and most cells, for that matter) can be thought of as data processing entities, Henrietta Lacks's case shows how closely PD can be connected to the person, and how our understanding of these issues is evolving.

Ethical digital service policy often has its origins in biomedicine. It is akin to IS because it involves the systematic use of information. Biomedicine also embodies respect for persons and individual dignity. A good example of this is the "common rule" that gave rise to institutional review boards (IRBs) that oversee human-subjects research (King, 2015). IRBs arose in part from the US Public Health Service (USPHS) Tuskegee syphilis study in the 1930s in which Black males in Macon County, Alabama were promised treatment for "bad blood" but in fact received no treatment even if diagnosed with syphilis. A similar USPHS study in Central America in the 1940s deliberately infected people with diseases to study them. These cases dovetailed with the horrific biomedical experiments performed during WWII that came to light during the Nuremberg trials. Some argued that data collected in such barbaric studies should never be published. Others argued that the data were contributions from the victims and should be published. There is seldom closure on such issues, but respect for persons and individual dignity guidelines help guide the discussion.

In August 2013—over 60 years after Henrietta Lacks died—the Lacks family and the National Institutes of Health reached an "understanding." Biomedical researchers could use all genome information derived from Henrietta Lacks's cells, and the family would receive accolades but no financial remuneration. The NIH announced it would begin "exploring fundamental reforms to human subject protections—actions that are driven in part by the fact that technological advances, especially in genomics and computing, have made the notion of 'de-identifying' a research participant's biological sample virtually impossible."10

For purposes of this paper, researchers used Henrietta Lacks's data (found in her cells) without consent. To the extent that they considered her a patient, not a person, respect for the person was easy to ignore. Her cells were an "object," not part of the person. But Henrietta Lacks was a person. A young lab assistant, Mary Kubicek, witnessed Lacks's autopsy. She looked at Lacks's arms and legs to avoid her lifeless eyes, but then gazed at Lacks's toenails, covered in chipped bright red polish. She said: "When I saw those toenails, I nearly fainted. I thought, Oh jeez, she's a real person. I started imagining her sitting in her bathroom painting those toenails, and it hit me for the first time that those cells, we'd been

7 Tech writer and linguist Margaret Rouse defines PII as "any data that could potentially identify a specific individual." She adds it could be any "information that can be used to distinguish one person from another and can be used for de-anonymizing anonymous data" (https://searchfinancialsecurity.techtarget.com/definition/personally-identifiable-information).
8 https://digitopoly.org/2015/05/14/behind-the-buzz-of-behavioral-data/
9 Quoted from the dust jacket of The Immortal Life of Henrietta Lacks (Skloot, 2010).
10 "NIH, Lacks family reach understanding to share genomic data of HeLa cells" (https://www.nih.gov/news-events/news-releases/nih-lacks-family-reach-understanding-share-genomic-data-hela-cells).


working with all this time and sending all over the world, they came from a live woman. I'd never thought of it that way" (Skloot, 2010, pp. 90-91).11

4 Stressing Human Values

Respect for persons and individual dignity applies to digital service platforms (DSPs)12 such as Google, Facebook, Amazon, Apple, and Microsoft. DSPs employ PD in much the same way that researchers have been using the HeLa cell line. They pay little if any respect to the originators. They use data freely to drive multimillion-dollar businesses in a huge sociotechnical system where data flow plays a central role. The term "big data"13 was coined in 2005 to describe this.14 New York Times technology reporter Steve Lohr, after attending the World Economic Forum, wrote in February 2012:

    The new megarich of Silicon Valley, first at Google and now Facebook, are masters at harnessing the data of the Web—online searches, posts and messages—with Internet advertising. At the World Economic Forum last month in Davos, Switzerland, Big Data was a marquee topic. A Forum report, "Big Data, Big Impact," declared data a new economic asset class like currency or gold.

Lohr goes on to say: "policymakers are beginning to realize the potential for channeling these torrents of data into actionable information that can be used to identify needs & provide services for the benefit of low-income populations." The report serves as a call to action for stakeholders to take "concerted action to ensure that this data helps the individuals and communities who create it" (WEF, 2012).

Big data has become an economic driver, with data characterized as "currency or gold," a kind of capital. A significant portion of data is PD. Respect for persons and individual dignity requires entities acquiring PD to handle it according to ethical principles similar to those defined in the General Data Protection Regulation (GDPR), Europe's law embodying the right to be forgotten, or the right to erasure.15 In fact, the GDPR provides a useful definition of PD.16 Under GDPR, individuals can ask organizations to delete personal data. No such regulations govern the United States as a whole, but the California Consumer Privacy Act (CCPA) tries to do something similar.17

5 A New Type of Capitalism

DSPs employ a new business model (Zuboff, 2019). The story of Henrietta Lacks informs DSPs in two ways. First, it shows that PD are an extension of the person, and thus subject to respect for persons and individual dignity. Second, it shows that the application of the concept changes over time. It is not now settled. The DSP business model needs to engage these points. This model can be summarized as five steps, of which the third, fourth, and fifth are key:

1. Offer users an online service, essentially for free.

2. Acquire and use PD to deliver the service efficiently and effectively.

3. Use powerful computers, typically at "server farms" located strategically around the world, to augment the PD with behavioral information through AI, machine learning, deep learning, etc. Within microseconds, match demographic information like name, gender, age, location, and previous behavior to reveal almost anything. Digital tracking or digital fingerprinting enables following, recording, storing, and repackaging a user's internet history. DSPs specializing in PD know what users read or skip, what videos they watch all the way through or stop watching after a few seconds, what promotional emails they read or toss in the Trash without opening, what they do on social media (e.g., "Like" on Facebook, retweet on Twitter, "heart" on Instagram). Taken together, such clues can identify users and capture behavior, correlating to deduce each user's online fingerprint and personal traits, likes and dislikes, idiosyncrasies and perhaps neuroses (Chen, 2019).

4. The DSP serves the user personalized advertisements or propaganda each time the user accesses the account.

11 Chapter 26 of this book, "Breach of Privacy," describes some of the abuses made in using HeLa cells for research and other purposes.
12 A platform refers to a coordinated group of technologies that is used as a base upon which various applications are developed, implemented, and delivered. Digital services refer to the electronic delivery, usually web-based, of information (see McAfee & Brynjolfsson, 2017).
13 Roger Mougalas from O'Reilly Media coined the term "Big Data" for the first time in 2005 (https://datafloq.com/read/big-data-history).
14 According to Forbes, in 2019 Americans used nearly 4.5 GB [gigabytes] of internet data, exchanged nearly 200 million emails and nearly 20 million text messages, and performed nearly five billion Google searches per minute (https://www.forbes.com/sites/nicolemartin1/2019/08/07/how-much-data-is-collected-every-minute-of-the-day/#62f37f353d66). In addition, International Data Corporation (IDC) predicts that the world's data will be in hundreds of zettabytes by 2025 (a zettabyte is one sextillion, 10²¹, or 2⁷⁰ bytes). The pandemic might affect this, but the trend is clear.
15 https://gdpr.eu/
16 https://gdpr-info.eu/art-4-gdpr/
17 https://en.wikipedia.org/wiki/California_Consumer_Privacy_Act


5. The DSP "monetizes" by selling user data to marketers, propagandists, etc.

Users expect Steps 1 and 2. These raise few ethical issues. Users seldom see Steps 3, 4, and 5, which raise challenging ethical issues. In Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World, security expert Bruce Schneier describes how much and how easily PD can be collected:

    Your cell phone tracks where you live and where you work. It tracks where you like to spend your weekends and evenings. It tracks how often you go to church (and which church), how much time you spend in bars, and whether you speed when you drive. Since it knows about all the other phones in your area it tracks with whom you spend your days, with whom you have lunch, and with whom you sleep. The accumulated data can probably paint a better picture of how you spend your time than you can, because it doesn't have to rely on human memory. In 2012, researchers were able to use this data to predict where people would be 24 hours later, to within 20 meters. (Schneier, 2015, emphasis in original)

Schneier (2015, p. 1) refers to this as "a very intimate form of surveillance." It describes much of the personal data that social media companies are currently collecting. In The Age of Surveillance Capitalism, social psychologist Shoshana Zuboff shows how digital service platforms have benefited from "surveillance capitalism," the process of capturing and utilizing PD. One reviewer referred to her far-reaching and thought-provoking book as "the information industry's Silent Spring."18 She writes:

    Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are declared as proprietary behavioral surplus, and fabricated into prediction products that anticipate what you will do now, soon, and later … Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are eager to lay bets on our future behavior. (Zuboff, 2019, p. 8, emphasis in original)19

The behavioral surplus is created in Steps 3, 4, and 5 noted above. Zuboff references Marx's theory of labor,20 calling this an exploitation of a user unless conducted under conditions of informed consent.

6 The Model Is Highly Successful

Several DSPs dominate. Google (a division of Alphabet that includes YouTube) helps people find information via search. Facebook (including Instagram and WhatsApp) informs and connects people via social media. Amazon helps people purchase goods via digital commerce. Apple, among its other services, allows people to listen to music via iPod and iTunes. Microsoft owns the professional networking site LinkedIn along with other services and offers public cloud and other software services. Firms like these are among those that Zuboff calls "surveillance capitalists." "Surveillance" implies some acquisition of personal data done clandestinely, and many users are not aware or fully informed about the behavioral data or PII being collected about them or how it is used.

Some DSPs enjoy power over users by controlling user PD. Many users care more about participating with a DSP than about protecting their personal data. Many don't know which of their PD are collected and how they are used. Many are not aware of the extent to which user PD fuels corporate enterprises and undergirds successful and wealthy DSPs. Some DSPs have prospered immensely, with billions of users signing on daily and interacting for work, entertainment, or education. Each user is a source of PD to be monetized by a DSP.

The use of PD has been economically effective. Writing in Digital Information World on May 8, 2019, Aqsa Rasool states:

    A few years back [in the 20th century], oil and gas companies have been ruling the list of valuable companies. However, tech companies like Google, Amazon, Facebook, Apple and Microsoft [The Big Five], took them over and have been leading in the market capitalization.21
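The "digital fingerprinting" of Step 3 can be illustrated with a toy sketch: attributes that are individually common can, in combination, single one user out of a population. The population, attribute names, and function below are invented solely for this illustration; they do not come from any real tracking system.

```python
# Illustrative sketch of digital fingerprinting (Step 3): attributes that
# are individually common can, combined, identify one user uniquely.
# All data here are invented for the example.

users = [
    {"browser": "Firefox", "timezone": "UTC-6", "language": "en-US"},
    {"browser": "Chrome",  "timezone": "UTC-6", "language": "en-US"},
    {"browser": "Firefox", "timezone": "UTC-6", "language": "es-MX"},
    {"browser": "Chrome",  "timezone": "UTC+1", "language": "en-US"},
]

def matching_users(attrs: dict) -> int:
    """Count users who share every attribute in `attrs`."""
    return sum(all(u[k] == v for k, v in attrs.items()) for u in users)

# One attribute narrows the set only a little...
print(matching_users({"timezone": "UTC-6"}))                        # 3 users
# ...but the combination is a fingerprint: exactly one user matches.
print(matching_users({"browser": "Firefox", "language": "en-US"}))  # 1 user
```

Real fingerprinting uses dozens of such signals (fonts, screen size, plugins), which is why a handful of "anonymous" attributes can suffice to re-identify a person.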

18 The review is from Chris Hoofnagle, a law professor at the University of California, Berkeley, who studies information privacy, and appears on the dust jacket of Zuboff's book. Silent Spring is a book by Rachel Carson published in 1962 that raised the public's awareness of the damage that the large-scale use of pesticides (and human activities more broadly) was doing to the environment.
19 An important precursor to Zuboff's book is Schneier's (2015) Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World.
20 "Karl Marx held that human labor was the source of economic value. The capitalist pays his workers less than the value their labor has added to the goods, usually only enough to maintain the worker at a subsistence level" (https://www.britannica.com/topic/surplus-value).
21 https://www.digitalinformationworld.com/2019/05/top-five-us-tech-companies-total-market-capitalization.html


Recent data suggest that Rasool's observation is right. In 2019, Microsoft's market capitalization was over one trillion dollars. The market capitalization of individual companies like Amazon, Apple, Alphabet (including Google and YouTube), and Facebook (including Instagram and WhatsApp) was also very large. The total market capitalization of these companies was between four and five trillion dollars.22 This was about 15% of the US economy's total market capitalization at the time, according to the Russell 3000 Index of the 3000 companies that collectively account for over 98% of the US's market capitalization ($30 trillion in January 2018).23 Penetration rates24 suggest the majority of the global human population of seven to eight billion are active or potential customers of DSPs. Mobile phone use has been estimated at over two thirds of the global population; more than half use the internet, and nearly half actively use social media. More than 40% are mobile social media users.25

Major DSP owners are wealthy. The net worth of the top ten richest Americans includes five owners of DSP companies and others who own high-tech information companies.26 The smallest fortunes among these people were around $50 billion; the highest were over $150 billion. If money is power, and, as Lord Acton remarked, power tends to corrupt, the concentration of wealth in few hands could be a threat to the respect of persons. Making money is no sin, and Acton stated only that power tends to corrupt. Nevertheless, this raises significant concerns.

7 Policy to Be Resolved

DSP wealth has accumulated in part through the uncompensated use of users' PD. The purpose of this paper is to point out the fundamental issue of respect for persons in guiding the generation of policy, not to create policy for given circumstances. However, two issues can be seen in the challenge to develop effective policy: informed consent and antitrust actions.

7.1 Informed Consent

An ethical relationship between a user and a DSP requires that the user is sufficiently aware of the personal and behavioral information that the DSP is collecting, as well as what is inferred about the use of this information. This is embodied in the concept of informed consent. Informed consent is morally required before releasing personal information to others. Informed consent enables users to decide whether to provide information to a DSP based on knowledge of the purposes, procedures, risks, benefits, and alternatives, premised on well-established ethical principles such as respect for persons, beneficence, and justice (see Beauchamp & Childress, 1989).

Bioethics has dealt extensively with informed consent. According to Faden and Beauchamp (1986), five components are required for ethical informed consent: disclosure, comprehension, voluntariness, competence, and consent. A DSP employing a user's personal data for purposes other than those required to provide a direct service meets the demanding criteria of informed consent if and only if:

1. The user receives a thorough disclosure regarding the additional applications for which personal information can be used.

2. The user comprehends the disclosure.

3. The user acts voluntarily in releasing personal information.

4. The user is competent and understands what is being agreed to if releasing or authorizing the release of the information.

5. The user consents to—i.e., gives permission for—the actions being taken involving the use of personal information. (This condition suggests that young children and those judged to be incompetent do not satisfy the requirement.)

Informed consent is key but challenging. Many companies do not know the potential uses of data when they collect them, so it is hard for individuals to give consent for what might happen. The GDPR and the CCPA address secondary use, but there are loopholes that make the concept of informed consent awkward. This might be an example of the pains of "growing into" this new era, in which it is difficult to reconcile technology benefits with respect for persons.

7.2 Antitrust Actions

On a different matter than privacy or informed consent, DSPs have been accused of employing exploitive, anticompetitive practices: there are too few competitors and substantial entry barriers. A suggested solution is to break up large DSPs into smaller, less powerful units. On July 24, 2019, the Associated Press reported that:
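The Faden and Beauchamp components form a conjunctive test: consent is valid only if every component holds, so any single failure defeats it. The sketch below is a hypothetical illustration of that logic (the class and function names are invented, not from any real system), and it assumes each component has already been established as a yes/no fact.

```python
from dataclasses import dataclass

# Hypothetical sketch: the five Faden & Beauchamp (1986) components of
# informed consent modeled as a conjunctive check. Names are illustrative.

@dataclass
class ConsentRecord:
    disclosed: bool     # 1. thorough disclosure of additional uses of PD
    comprehended: bool  # 2. the user comprehends the disclosure
    voluntary: bool     # 3. release of the information is voluntary
    competent: bool     # 4. the user is competent (e.g., not a young child)
    consented: bool     # 5. explicit permission was given

def is_informed_consent(c: ConsentRecord) -> bool:
    """All five components are required; missing any one defeats consent."""
    return all((c.disclosed, c.comprehended, c.voluntary,
                c.competent, c.consented))

# A record failing any single condition fails the whole test, e.g., a
# minor who "agrees" but lacks competence (condition 4):
minor = ConsentRecord(disclosed=True, comprehended=True,
                      voluntary=True, competent=False, consented=True)
print(is_informed_consent(minor))  # False
```

The conjunction mirrors the paper's "if and only if" phrasing: a DSP that discloses thoroughly but buries the disclosure (defeating comprehension) fails just as surely as one that never asks.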

22 https://ycharts.com/companies/MSFT/market_cap
23 https://www.barrons.com/articles/the-u-s-stock-market-is-now-worth-30-trillion-1516285704
24 Penetration rates are the percentage of a target market that a service reaches during a period of time.
25 https://datareportal.com/reports/digital-2019-global-digital-overview
26 https://en.wikipedia.org/wiki/List_of_Americans_by_net_worth


    The U.S. Justice Department has announced a major antitrust investigation into unnamed tech giants, while the House Judiciary Committee has begun an unprecedented antitrust probe into Google, Facebook, Amazon and Apple over their aggressive business practices, and promises "a top-to-bottom review of the market power held by giant tech platforms."27

Chris Hughes, a co-founder of Facebook, is a surprising proponent of breaking up Facebook and other DSPs. In a New York Times opinion piece on May 9, 2019, entitled "It's Time to Break Up Facebook," he argues that Mark Zuckerberg is "too personally powerful," and that the

    vibrant marketplace that once drove Facebook and other social media companies to compete to come up with better products has virtually disappeared. This means there's less chance of start-ups developing healthier less exploitative social media platforms. It also means less accountability on issues like privacy. (emphasis added)28

Whether or not it would change anything regarding respect for persons, breaking up is hard to do. For one thing, there are different kinds of technologies in use. DSPs use mediating technologies that link clients with the external environment, in comparison to more conventional and decomposable long-linked technologies used in mass production, where tasks can be broken down (Thompson, 1967, pp. 15-16).

8 Conclusion

Most humans hold sacrosanct knowledge and information about themselves. Respect for persons means individuals have a right to share it or not share it with others.29 If they do share it, they must decide to do so as autonomous, conscious, and uncoerced actors. An individual's dignity must be honored, penetrated only by that person's agreement through informed consent. Henrietta Lacks's dignity was violated without her consent or that of her family. That violation generated substantial economic benefits for others. Many suffer the same "predatory practices" as DSP online users or subjects of surveillance. "Predation" in biology refers to an organism feeding on prey.30 Part of antitrust law is intended to protect consumers from predatory business practices. A major policy challenge is how best to develop and implement effective policy for informed consent prior to the collection and use of personal data. Another is to employ antitrust measures to remediate the excessive power imbalance enjoyed by some DSPs.

Research issues also arise. One concerns secondary use of what originates as PD. For example, PD (or inadequately anonymized PD) might appear on a screen and be "scraped" by a screen-scraper. Researchers should consider when the use of data disrespects a person or robs a person of individual dignity. The challenge lies within. Researchers should treat the data of other persons with respect and dignity.

Acknowledgments

The author acknowledges the help of reviewers and editors with this paper.

27 https://www.usatoday.com/story/tech/2019/07/24/facebook-google-amazon-apple-face-antitrust-probe-big-tech-breakup/1814480001/
28 https://www.nytimes.com/2019/05/09/opinion/sunday/chris-hughes-facebook-zuckerberg.html
29 One philosopher who holds this position is the logical positivist A. J. Ayer (see Ayer, 1963).
30 https://www.biology-online.org/dictionary/Predator


References

Ayer, A. J. (1963). The concept of a person and other essays. Macmillan.

Beauchamp, T. L., & Childress, J. F. (1989). Principles of biomedical ethics (3rd ed.). Oxford University Press.

Chen, B. X. (2019). How to combat online "fingerprinting." The New York Times, July 4, 2019, B7.

Faden, R., & Beauchamp, T. (1986). A history of informed consent. Oxford University Press.

Kant, I. (1964). Groundwork of the metaphysics of morals (trans. H. J. Paton). Harper & Row.

King, J. L. (2015). Humans in computing: Growing responsibilities for researchers considering the role of institutional review boards in computing research. Communications of the ACM, 58(3), 31-33.

Leidner, D. E., & Tona, O. (2021). The CARE theory of dignity amid personal data digitalization. MIS Quarterly, 45(1), 343-370.

Lohr, S. (2012). The age of big data. The New York Times, February 11, 2012, Sunday Review.

Mason, R. O., & Mitroff, I. I. (1981). Challenging strategic planning assumptions. Wiley.

McAfee, A., & Brynjolfsson, E. (2017). Machine, platform, crowd. Norton.

Reese, W. L. (1980). Dictionary of philosophy and religion. Humanities Press.

Schneier, B. (2015). Data and Goliath: The hidden battles to collect your data and control your world. Norton.

Skloot, R. (2010). The immortal life of Henrietta Lacks. Crown Publishers.

Thompson, J. D. (1967). Organizations in action. McGraw-Hill.

Trist, E., & Bamforth, K. (1951). Some social and psychological consequences of the longwall method of coal-getting. Human Relations, 4(1), 3-38.

Vickers, G. (1965). The art of judgment: A study in policy making. Basic Books.

WEF (2012). Big data, big impact: New possibilities for economic development. World Economic Forum. http://www3.weforum.org/docs/WEF_TC_MFS_BigDataBigImpact_Briefing_2012.pdf

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.


About the Author


Richard Mason is the Carr P. Collins Distinguished Professor Emeritus, Edwin L. Cox School of Business, Southern
Methodist University, Dallas, TX, and was the director of SMU’s Maguire Center for Ethics and Public Responsibility.
He is an Association of Information Systems Fellow and Leo Awardee and a Foreign Fellow of the Russian Academy
of Natural Sciences in Informatics and Cybernetics. He is a community member of the Mercy Regional Medical Center
Ethics Committee in Durango, CO, and an advisor to Fort Lewis College.

Copyright © 2021 by the Association for Information Systems. Permission to make digital or hard copies of all or part
of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for
profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for
components of this work owned by others than the Association for Information Systems must be honored. Abstracting
with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists requires prior
specific permission and/or fee. Request permission to publish from: AIS Administrative Office, P.O. Box 2712 Atlanta,
GA, 30301-2712 Attn: Reprints, or via email from publications@aisnet.org.
