The Ethical Issues of Privacy, Manipulation, and Dependency Associated With Apple’s Product Siri
Student’s Name
Institutional Affiliation
Course
Instructor
Date
The Ethical Issues of Privacy, Manipulation, and Dependency Associated With Apple’s Product Siri
Artificial Intelligence (AI) is now incorporated into many modern products and is affecting every facet of life. Nowadays, viewers are inundated with advertisements on television and other media platforms highlighting products with life-changing capabilities (Siau & Wang, 2020). For example, Apple Inc. now makes products that use AI to help individuals perform day-to-day work. Apple’s Siri uses an AI technology platform to help users organize themselves, thus adding convenience to everyday life. The product has generated intense debate regarding its mixed impact on users. Although it has significantly improved many aspects of life, Siri has nonetheless raised important ethical issues, including privacy infringement, manipulation, and over-dependence.
Apple is one of the most innovative firms to have incorporated AI technology into its products and software. One of its innovative products, Siri, has positively impacted the lives of users by providing a faster, easier way to do all kinds of useful things (Apple, 2020). The technology can simultaneously perform multiple tasks, such as setting alarms, timers, and reminders, without further input from the user. It uses AI to anticipate the intentions of users, thus essentially playing the role of decision-maker. However, while Siri is a valuable addition to an individual’s life, Apple cannot easily develop this product ethically because it eavesdrops on conversations and may harm human relations. To develop the product ethically, Apple needs to limit the amount of information that Siri can collect from users. This way, Apple will not have to contend with the ethical question of illegal collection of users’ data, which may amount to spying. However, the
company has launched a campaign to plead its innocence against the allegations of spying (Nakedsecurity.com, 2018). Through visual campaigns such as the one depicted in Image 1, the tech giant has gone some way toward restoring the trust of its consumers.
Image 1: Apple has moved to allay fears that Siri is a spying app. Source: https://nakedsecurity.sophos.com/2018/08/13/siri-is-listening-to-you-but-shes-not-spying-says-apple/
According to Dedvukaj (2019), Apple’s Siri can easily be activated by “something as mundane as the sound of a zipper” (para. 1), thus leaving any conversation between individuals open to surveillance. Users may therefore lose important personal data, such as location and contact details, whether they are talking to a personal doctor or a business partner. Thus, despite the numerous benefits that come with owning the device, it can give away important information to unauthorized parties, such as those that deal
in consumer products. Although Apple has a policy that prohibits sharing of personal data with
third parties, one may fear that such information may be leaked illegally to marketers of
consumer products. In recent years, users have expressed their concern about the role of Siri as a
personal assistant that can record and store words (Kotia & Bharti, 2019, para. 4). Siri’s eavesdropping capability is so advanced that it records users’ data even when they are not actively using the device, a practice that is no different from data theft (Brey, 2012, p. 3). For its part, Apple has claimed that this information is used primarily for advertisements rather than for sharing with third parties (Apple,
2020). In fact, the company, as a matter of policy, does not share customer data with third parties such as marketers of consumer products. However, with tech giants forging strong bonds with regulators, users should be increasingly concerned about having their privacy sacrificed for corporate profits. Overcoming this ethical issue is tricky because the technology is not designed to automatically seek consent from users in the way a human would. To a significant extent, Siri thus raises a serious ethical issue because it infringes on users’ right to privacy.
Image 2: Source: https://africa.businessinsider.com/tech/an-apple-whistleblower-has-publicly-decried-the-company-for-violating-fundamental/jw92ph0
Apple’s Siri undermines important human values such as socialization and trust because users trust the technology more than friends or family members. Users reveal most of their information to the technology because of its accuracy and the assurance that it will never fail. On a positive note, the technology enables humans to organize themselves in ways that would have been unimaginable a few decades ago. However, this benefit comes at a great cost to the human value of mutual trust. For example, users may be reluctant to use the application for fear that it would divulge important information to unauthorized parties such as marketers of consumer products. As Kotia and Bharti (2019) reflect, the technology reveals the most private aspects of human character. While technology is critical in life, it should not come at the expense of important values such as the trust that binds people together. Apple, therefore, must contend with the ethical question of whether to continue producing the technology, given that it comes at a huge expense to humanity. At a minimum, the company should modify the app and make it less intrusive in order to cultivate users’ confidence and trust. According to Kotia and Bharti (2019), humans should confide in friends and families more than in “faceless technologies” (para. 3), a viewpoint that may not be lost on Apple. However, the company has argued that the technology has served as a reliable “personal assistant” for humans, making them more effective than ever before.
Image 3: An illustration of Siri’s capability of performing even the most mundane tasks. Source:
https://thenextweb.com/apple/2010/04/28/apple-purchases-virtual-assistant-app-siri/
Apple’s Siri is an invisible manipulator, as its addictive nature compels users to explore its seemingly infinite possibilities. The addiction stems from the fact that users become fascinated and obsessed by its seemingly unlimited capabilities as a ‘personal assistant’. Every day, humans are inundated with decisions of varying importance. While every decision matters, the choices that people make affect them in various ways. According to Hill and White (2020), even inconsequential decisions, such as the choice of clothes or whether to go to work, are now being made by AI technologies such as Siri, which is indicative of their growing manipulative power in decision making. However, Apple has argued that AI technology eases the burden of making
complex decisions in domains such as business, politics, and careers. Studies have shown that the human brain is self-sufficient for making the big decisions in life (Greenfield & Gillespie-Lynch, 2008). Nevertheless, technologies such as Siri manipulate users, making them make decisions they would not ordinarily make. In addition, AI has been designed to distort the essence of human experience in order to create an opportunity for profiteering (Siau & Wang, 2020). For example, Apple knows that humans will become overly dependent on Siri and, therefore, will not be able to function without the app. This dependency provides an opportunity to sell more products. Indeed, humans have increasingly relied on automated personal assistants such as Siri to manage aspects of life such as daily appointment schedules or fitness regimes. From an ethical perspective, large profit-making tech giants such as Apple can use their vast resources to make innovative products that consumers cannot resist.
From a personal perspective, Siri denies users much-needed personal space. For example, I am unable to do even the most mundane tasks, such as setting reminders for an important event, without using the app. As most users would agree, human beings are, by their very nature, social animals. Therefore, the ability to break free from dependence on other humans can be viewed as a superhuman quality. According to Lincoln (2000), society and social interactions are important because they enable people to create an identity. Regardless of technological advances, no innovation can replicate the essence of a face-to-face conversation between humans. Apple’s Siri technology threatens to undermine the value of interpersonal relationships by providing a platform on which users spend too much time at the expense of talking to their friends. From an ethical perspective, the emergence of Siri was motivated by a need to increase interactions with the app at the cost of undermining social interactions between people.
Researchers have established that people who use AI technologies such as Siri as personal organizers are less likely to interact with each other than their counterparts who do not use the devices (Brüggemeier et al., 2020). The more the technology becomes a substitute for human contact, the more it comes to resemble a replacement for real companionship. Although such a comparison may sound extreme, it is more or less accurate, especially when considered from an ethical perspective. Users develop a mindset that personal assistants imitate a real human being. However, Apple has claimed that its voice assistant’s tendency to make people dependent on machines is accidental, and that humans retain the choice of whether to use the technology (Brüggemeier et al., 2020). Nevertheless, there is no denying that Apple has capitalized on modern trends in which people have become increasingly busy, leaving them with little option but to rely on automated assistants.
In conclusion, Apple cannot ethically continue making Siri because of the technology’s implications for numerous facets of life. The technology exposes users to eavesdroppers, as their conversations are no longer private. Siri also makes users eschew interpersonal interactions, thus undermining important human values. In addition, the technology has effectively undermined humans’ ability to make their own decisions, even on the most mundane matters.
References
Brey, P. A. (2012). Anticipating ethical issues in emerging IT. Ethics and Information Technology.
Brüggemeier, B., Breiter, M., Kurz, M., & Schiwy, J. (2020). User experience of Alexa, Siri and
Dedvukaj, T. (2019, July 29). Apple’s Siri is eavesdropping on your conversations, putting users at risk. Fox Business. https://www.foxbusiness.com/technology/apples-siri-is-eavesdropping-on-your-conversations-putting-users-at-risk
Greenfield, P. M., & Gillespie-Lynch, K. (2008). Intersubjectivity evolved to fit the brain, but grammar co-evolved with the brain. Behavioral and Brain Sciences, 31(5), 523-524. https://doi.org/10.1017/s0140525x08005141
Hill, K., & White, J. (2020, December 28). Designed to deceive: Do these people look real to you? The New York Times. https://www.nytimes.com/interactive/2020/11/21/science/artificial-intelligence-fake-people-faces.html
Kotia, J., & Bharti, R. (2019, August 26). AI ethics: Personal assistants like Alexa, Siri and Google Home. Medium. https://medium.com/%C3%A9clair%C3%A9/ai-ethics-personal-assistants-like-alexa-siri-and-google-home-d54ba05dadd3
Lincoln, K. D. (2000). Social support, negative social interactions, and psychological well-being.
Nakedsecurity.com. (2018, August 13). Siri is listening to you, but she’s NOT spying, says Apple. Naked Security. https://nakedsecurity.sophos.com/2018/08/13/siri-is-listening-to-you-but-shes-not-spying-says-apple/
Siau, K., & Wang, W. (2020). Artificial intelligence (AI) ethics. Journal of Database Management.