Multimodal Essay
Victoria Tauer
Professor Ferrara
English 1001
2 May 2024
The world today has entered a digital age in which the flow of knowledge and intellectual discourse is controlled by the media. Technological advancements allow the media to constantly change and adapt to society, for it is shaped not by programmers alone but by the algorithm itself. This algorithm takes user information, preferences, and past searches to create a personalized media experience customized to each individual. While the media allows quick and easy access to information, this personalized, self-made algorithm has no ethical code. Its main objective is not to display information that is valuable but to decide which posts to prioritize to grasp user attention. It is not about what society needs to hear but what individuals want to hear based upon their preferences. In recent years, there has been growing debate over whether these algorithms pose a danger to democracy, as they prevent the varied and equitable flow of knowledge, resulting in media that portrays only one-sided information based upon preferences, in which users are exposed only to content that supports their preexisting opinions. According to my survey, 96.3% of the 27 respondents said they encounter information online that aligns with their existing beliefs or views. This media leads to “the loss of autonomy, the decrease in the epistemic quality of information, [as well as] losing the ability for effective contestation,” severely harming the diversity
of perspectives required for effective discourse (Bozdag 263-264). Society develops through access to all information. Only then, through discourse, research, and argument, can society collectively reach conclusions that benefit humanity. This notion is expressed by philosopher John Stuart Mill, who argues that all ideas have great intellectual value (Larvor 1). Even false ideas have value, for they demonstrate why the realities set in place are valid.
Having access to all information is the only way to have true intellectual freedom and understand
all of society’s errors, pointing individuals in the direction of a better world. When the media
controls information, and algorithms prevent users from seeing certain information, there is no
way a society can continue to develop. As Engin Bozdag further expressed, “Filter bubbles are a
problem for the liberal democrats especially due to restrictions on individual liberty, restrictions
on choice and the increase in unawareness…. The filter bubble, according to deliberative
democrats, hurts the civic discourse, mutual understanding and sensemaking” (254).
Due to algorithms, individuals no longer have awareness of or transparent access to all information, limiting productive discourse. There is no reality in which conversation can lead to solutions if the
same information is not available to all people. Filter bubbles undermine the fundamental process by which information becomes intellectually valuable, hindering the democratic process
that protects transparency and personal liberties. However, by creating algorithms that remove the personalized aspect of the media, the same information can be shared with all individuals, restoring an equal foundation for discourse.

Filter bubbles form as a byproduct of consumers’ search habits online. These bubbles created by consumer searches restrict
individuals within their own media of like-minded content, limiting exposure to varied
perspectives. These search habits leave users stuck with their preexisting opinions. Users’ opinions, as a result, become radicalized, for individuals search the same things repeatedly without realizing it or consenting to it, which can change how they behave online. As user search habits are shaped by algorithmic recommendations, the searches themselves reinforce the bubble. The article “Habitual Generation of Filter Bubbles” by Jernej Kaluža explains this notion through
the example of how “his radicalisation began when he typed the words ‘black on White crime’
into Google and got caught into the algorithmically induced filter bubble with racist content ‘[he
has] never been the same since that day’” (Kaluža 272). These algorithms track internet activity
and display information that is very comparable to what has been previously viewed. Eventually,
this strengthens prior views and choices, preventing the consumption of alternative
viewpoints. According to my survey, 81.48% of 27 people said they take most opinions that
resonate with their beliefs as factual. As individuals grow more fixed in their opinions, which
they take as factual, extreme beliefs go uncontested, leading to radicalism. As “Habitual Generation of Filter Bubbles” further explains, a study tracking “user activity expressed in specific search terms (such as ‘Crooked Hillary’ or ‘Trump’s sexual assault’), observed a significant effect of bubble formation: ‘due to the programmed responsiveness to past user interests and preferences, algorithms serve as a confirmatory communication partner … that reassures and reinforces users’ prior beliefs and fosters extremism’” (Kaluža 278). When there is no balance of opinion, it is
easier to develop radical ideas. The magnification of content that amplifies preexisting opinions
creates a society where radicalism or extremism is significantly more common. Radicalism has
severe effects on democracy, as it prevents productive discourse. Another solution to this issue would be to create an algorithm that prioritizes diverse opinions and broadens content to various perspectives instead of personalized “filter bubbles.” By showing users multiple viewpoints, they will be able to formulate their own opinions based upon a full range of information.

However, the use of personalized algorithms may be effective for business as well as the consumer, for they promote business through recommended advertisements that appeal to users’ tastes. According to
"Personalized Experiences: Why We Love Them + Brand Examples,” “49% of consumers say
they will likely become repeat buyers after a personalized shopping experience with a retail
brand. Businesses also report that consumers spend more when they have a personalized
experience. In fact, 80% of business leaders surveyed in Twilio’s report say that consumers
spend an average of 34% more with a personalized experience" (Bretous). Brands may develop
greater emotional ties and loyalty among customers by adapting products to specific preferences
of customers. Many consumers also find these algorithms useful; my survey includes opinions such as, “I think algorithms are useful because they help you find what you are looking for quickly” and “I think it is helpful because it shows you [what] you want to see.” Bretous also
explained how “Netflix’s algorithm is programmed to suggest shows and movies based on a
user’s watching history, including watch time and review. What you end up with is a
programming list with elements from content you’ve enjoyed in the past, making it easier to pick
something new… everything is curated just… based on… personal interests,” making things
easy and more efficient for users (Bretous). These algorithms not only promote business but
create a more efficient system that allows consumers to be quickly directed to their preferences.
However, this can also be seen as a significant invasion of privacy. My survey includes opposing opinions, with respondents stating, “I do not like them, so I turned off my search history” and “they can be seen as a complete invasion of privacy because they collect and analyze personal data without explicit consent.” Although these algorithms may be useful to promote business, the cost to user privacy cannot be ignored.

In my survey, I asked each individual to go into Google, type the letter “T,” and then
to name the first person that comes up in “recommended search.” Individualized media led to
varied results. Taylor Swift appeared in 33.3% of searches, Trump in 14.81%, and there were 13 other searches, each with a different name. Taylor Swift and Trump are two completely different figures
with polarizing ideologies; even the letter “T” creates huge divisions in searches. While this may
seem innocent, these differences in searches create a large division in society. Using Facebook to
understand these algorithms, Eli Pariser states, “I'm progressive, politically… the conservatives
had disappeared from my Facebook feed. And what it turned out was going on was that
Facebook was looking at which links I clicked on… I was clicking more on my liberal friends'
links than on my conservative friends' links. And without consulting me about it, it had edited
them out. They disappeared” (Pariser 01:04-38). These “filter bubbles” can completely erase opposing viewpoints because they do not resonate with user preferences. However, without hearing opposing viewpoints, there is no way to learn from or understand others, further
encouraging division. My survey shows, on a scale from 0 to 10 (10 being the most effort), an average effort level of 3.8 in seeking diverse opinions, and 70.37% of respondents said they believed this creates division. If the media does not show diverse opinions, and individuals are
not searching for them, what is preventing the complete polarization of society? Pariser further
states, “along came the Internet… algorithms don't yet have the kind of embedded ethics that the
editors did…. we need to make sure that they're not just keyed to relevance. We need to make
sure that they also show us things that are uncomfortable or challenging or important” (Pariser
06:03-49). There is an immediate need for an algorithm that is approved by editors or
embeds some sort of ethical code. Programmers must find a way to review these algorithms as they develop, or there will be no oversight of the information they produce. By creating a system with a clear explanation of how these algorithms work, along with an option to turn off the personalization aspect, media can become more transparent rather than simply taking user information without consent.

Throughout history, the free flow of information has shaped the nation. When people think of censorship, they think of authoritarian countries like
North Korea or Russia. Individuals do not desire to live in a world where information is hidden
from them. However, these personalized algorithms have the unrestricted privilege of making
opposing opinions disappear. In that sense, algorithms are a form of
censorship based upon user preference. These personalized algorithms prevent intellectual
freedom and create extremism, as well as a polarized society. There is an immediate need for
change, and programmers must implement solutions in algorithms that allow individuals to access
all information.
Works Cited
Bozdag, Engin, and Jeroen Van den Hoven. “Breaking the Filter Bubble: Democracy and Design.” Ethics and Information Technology, vol. 17, no. 4, 2015, pp. 249-265. ProQuest, https://libdb.fairfield.edu/login?url=https://www.proquest.com/scholarly-journals/breaking-filter-bubble-democracy-design/docview/1779399893/se-2.
Bretous. “Personalized Experiences: Why We Love Them + Brand Examples.”
Kaluža, Jernej. “Habitual Generation of Filter Bubbles: Why Is Algorithmic Personalization Problematic for the Democratic Public Sphere?” Javnost - The Public, vol. 29, no. 3, 2022, pp. 267-.
Pariser, Eli. “Beware Online ‘Filter Bubbles.’” TED, 2011.