Navigating the Digital Landscape

Titirshu Sharma (2334614), Chaitya Jain (2334529)

Department of Psychology, CHRIST (Deemed to be University)

Language and Contemporary Society: ENG184-2

Mr. Dhanesh M.

09/01/2024
INTRODUCTION
In a time when technology is advancing at an exponential rate, the digital divide has become a major issue that affects every aspect of our lives. To build a fuller understanding of what it means to traverse this divide, this essay examines five crucial concepts: algorithmic bias, privacy, autonomy, the digital divide itself, and surveillance. These concepts capture both the potential and the dangers of our digital age, in which shifting societal dynamics and rapid technological advancement have created a wide range of difficulties.
Once thought of mainly as a difference in access to information and communication technologies (ICTs), the digital divide has evolved into a complex web of inequalities with significant effects on individuals and communities alike. The digital revolution brings opportunities never seen before, but it also worsens already existing disparities, leading to an uneven distribution of benefits and drawbacks. The emergence of artificial intelligence (AI) makes this digital environment more complex still.
The digital divide is a matter of social justice as well as a technological problem. Vulnerable people are particularly affected by algorithmic bias, loss of privacy, threats to autonomy, and surveillance practices. As society moves rapidly towards an increasingly digital future, the need to address these challenges becomes ever more pressing.
Algorithmic bias
A concerning aspect of the growing digital divide is the emergence of algorithmic bias, a byproduct of automated decision-making systems. As algorithms come to influence many facets of our lives, from financial transactions to career prospects, the possibility of prejudice becomes a serious concern. The promise of efficiency and objectivity clashes with the plain truth that biased algorithms can magnify and even perpetuate social injustices, disproportionately hurting marginalized groups.
Apple Card and Credit Algorithm Bias
Apple Card, launched in 2019, promised a sleek and streamlined credit card experience with
a focus on transparency and fairness. However, the card quickly became embroiled in
controversy due to concerns that its creditworthiness algorithm discriminated against women.
Biased Data:
 The algorithm was likely trained on historical credit data, which has long reflected
gender disparities. Women often have lower credit scores than men due to factors like
unequal pay, career interruptions for childcare, and limited access to formal financial
products. This skewed data could have inadvertently baked bias into the Apple Card algorithm; a short sketch after this case study illustrates the mechanism.
Unfair Outcomes:
 Investigations and reports revealed that some women with strong credit histories were
offered significantly lower credit limits than their male counterparts with similar
financial profiles. This disparity raised concerns about potential gender discrimination
in the algorithm's decision-making process.
Real-World Impacts:
 The controversy surrounding Apple Card's credit algorithm cast a shadow over its
promise of fairness and transparency. It also highlighted the broader issue of gender
bias in financial technology and the potential for algorithms to perpetuate existing
inequalities.
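To make that mechanism concrete, the short Python sketch below shows how a naive model fitted to historical lending data that already contains a gender disparity will reproduce that disparity for new applicants with identical finances. The figures and the group-average "model" are invented purely for illustration; they are not Apple's actual data or algorithm.

# Illustrative sketch only: synthetic numbers, not the real Apple Card model.
# A model fitted to historical credit data that already reflects gender
# disparities will reproduce those disparities for new applicants.

historical_records = [
    # (income, gender, credit_limit_granted) -- synthetic training data
    (80_000, "male", 15_000),
    (80_000, "male", 14_500),
    (80_000, "female", 9_000),   # same income, historically lower limits
    (80_000, "female", 8_500),
]

def fit_group_average(records):
    """Naive 'model': average past limit per (income bracket, gender) group."""
    totals, counts = {}, {}
    for income, gender, limit in records:
        key = (income // 10_000, gender)
        totals[key] = totals.get(key, 0) + limit
        counts[key] = counts.get(key, 0) + 1
    return {key: totals[key] / counts[key] for key in totals}

model = fit_group_average(historical_records)

# Two applicants with identical financial profiles receive different offers,
# because the training data itself encoded the disparity.
print(model[(8, "male")])    # 14750.0
print(model[(8, "female")])  # 8750.0

No explicit rule says "offer women less"; the bias arrives entirely through the historical data the model learns from.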
In January 2024, the US Department of Health and Human Services (HHS) announced a plan to implement a new risk assessment algorithm to manage the distribution of COVID-19 antiviral medications. The algorithm, called PRIORITY, aims to prioritize treatment for those most at risk of severe illness or death. However, concerns have been raised about its potential for bias and its ethical implications.
Concerns about Risk Assessment Algorithms:
 Bias: Algorithms can reflect and amplify existing biases in the data they are trained on. For example, if a healthcare algorithm is trained on data that underrepresents certain populations, it may be less accurate in predicting health outcomes for those groups; a short sketch after this list illustrates the effect.
 Transparency: Algorithms can be opaque, making it difficult to understand how they reach their conclusions. This lack of transparency makes it hard to contest or correct biased outcomes.
 Ethical implications: The use of algorithms to make decisions about individuals' lives
raises ethical concerns, such as fairness, privacy, and autonomy.
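As a hedged illustration of the underrepresentation problem described above, the sketch below uses synthetic patients and invented biomarker values (it is not the actual PRIORITY algorithm). A single risk threshold is tuned for best overall accuracy on an imbalanced dataset; because one group makes up only 10% of the data and its risk appears at a different biomarker level, the chosen threshold is far less accurate for that group.

# Illustrative sketch only: synthetic data, not the real PRIORITY algorithm.
# A single risk threshold tuned for overall accuracy can be much less accurate
# for a group that is underrepresented in the training data.

# (biomarker_value, truly_high_risk, group) -- synthetic patients
patients = (
    # 90 patients from group A: high risk roughly when biomarker > 50
    [(40 + i % 20, (40 + i % 20) > 50, "A") for i in range(90)]
    # 10 patients from group B: high risk already when biomarker > 35
    + [(30 + i % 20, (30 + i % 20) > 35, "B") for i in range(10)]
)

def accuracy(threshold, rows):
    """Fraction of patients whose predicted risk (value > threshold) matches the truth."""
    return sum((value > threshold) == label for value, label, _ in rows) / len(rows)

# Pick the threshold that maximizes accuracy over the whole, imbalanced dataset.
best = max(range(30, 60), key=lambda t: accuracy(t, patients))

group_a = [row for row in patients if row[2] == "A"]
group_b = [row for row in patients if row[2] == "B"]
print("chosen threshold:", best)                          # 50
print("accuracy for group A:", accuracy(best, group_a))   # 1.0
print("accuracy for group B:", accuracy(best, group_b))   # 0.6

The overall accuracy looks excellent, which is exactly why such gaps can go unnoticed unless performance is reported separately for each group.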
Privacy
In the digital sphere, privacy, once a fundamental right valued in the analog world, is in jeopardy. The widespread use of internet-connected devices and platforms has turned personal data into a lucrative asset. The digital age offers unparalleled connectivity and convenience, but it also exposes people to frequent data breaches and invasive surveillance. The erosion of privacy undermines personal autonomy and threatens the very foundation of our democratic society.
Our personal data is continuously moving via a complicated network of gadgets, websites,
and applications in the ever-expanding digital world. Every click, swipe, and transaction
creates a digital footprint that provides governments, businesses, and even hackers with
access to a wealth of data. Because of this ongoing exposure, there is always a chance of
privacy breaches occurring, which casts a shadow of doom over our online actions.
A privacy violation can have disastrous, far-reaching effects. Consider what could happen if your bank statements or credit card information fell into the wrong hands: identity theft, financial fraud, and even blackmail become horrifying possibilities. Even data that appears innocuous, such as your browsing history and location data, can be exploited to build an intricate profile of your tastes and way of life, invading your privacy and potentially leaving you vulnerable to targeted scams or manipulation.
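A hedged sketch of how such a profile can be assembled is shown below. The event log, categories, and place names are entirely invented; the point is simply that aggregating individually harmless records yields a revealing picture.

# Illustrative sketch only: a hypothetical event log with invented details.
# Individually "innocuous" clicks, searches and location pings can be
# aggregated into a surprisingly detailed profile of tastes and routines.
from collections import Counter

event_log = [
    # (hour_of_day, kind, detail) -- hypothetical digital footprint
    (7,  "search",   "gluten free breakfast"),
    (8,  "location", "Elm Street Clinic"),
    (13, "purchase", "running shoes"),
    (22, "browse",   "loan consolidation offers"),
    (23, "browse",   "loan consolidation offers"),
]

profile = {
    "likely_interests": Counter(detail for _, kind, detail in event_log
                                if kind in ("search", "browse", "purchase")),
    "places_visited": [detail for _, kind, detail in event_log if kind == "location"],
    "active_hours": sorted({hour for hour, _, _ in event_log}),
}

# Repeated late-night visits to loan pages, a clinic stop, dietary searches:
# each item is harmless alone, but together they invite targeted scams
# and manipulation of exactly the kind described above.
print(profile)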
Example
In 2018, the hospitality world was rocked by a massive data breach that tarnished the
reputation of Marriott International and put the personal information of millions of guests at
risk. Hackers infiltrated the company's reservation system, stealing the data of up to 500
million individuals who had stayed at its Starwood-branded hotels over the previous four
years. The compromised data included a treasure trove of sensitive information, from names
and addresses to email addresses, passport numbers, and even encrypted payment card
details. This breach posed a significant threat to the privacy and security of millions of
travelers, exposing them to potential identity theft, financial fraud, and even targeted scams.
The scope of the Marriott data breach was truly staggering, making it one of the largest
personal data breaches in history. The attack highlighted the vulnerabilities inherent in storing
vast amounts of guest data and the importance of robust cybersecurity measures in the
hospitality industry. In the aftermath of the breach, Marriott faced intense scrutiny from
regulators, customers, and security experts. The company scrambled to contain the damage,
offering credit monitoring services to affected guests and investing heavily in its
cybersecurity infrastructure. While the immediate fallout subsided, the Marriott data breach
remains a stark reminder of the ever-present risks of privacy breaches in today's digital world.
Remember, your privacy is a valuable asset. Treat it with the respect it deserves.
Autonomy
The digital era presents new problems for autonomy, the cornerstone of personal agency and decision-making. The emergence of algorithms and automated systems poses a risk to individual autonomy: they shape decisions and influence behavior even as they offer individualized experiences. Navigating the digital divide without compromising personal autonomy requires striking a careful balance between customization and the preservation of individual agency.

Our autonomy is being quietly undermined by the digital world, which may appear to be a glittering sea of opportunities but also poses a real threat. Every click, swipe, and purchase we make creates a digital trail that feeds algorithms built to learn our preferences, routines, and weaknesses. These unseen puppeteers then gently guide us toward their own objectives, influencing our decisions and reshaping our online experiences.
Imagine being inundated with advertisements for things you never even "needed," selectively
filtered news feeds that reinforce your preconceived notions, and search results that gently
nudge you in the direction of predefined answers. Despite its subtle nature, this manipulation
has the potential to erode our capacity for critical thought, self-determination, and resistance
to outside influences.
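The sketch below makes that nudging concrete with a toy, purely hypothetical "engagement-maximizing" feed: ranking stories only by how often a user clicked similar stories in the past steadily buries dissenting viewpoints.

# Illustrative sketch only: a toy feed ranker with invented articles.
# Ranking purely by past clicks keeps serving more of the same viewpoint,
# the basic mechanism behind filter bubbles and echo chambers.

articles = [
    {"title": "Opinion: policy X is working",  "topic": "pro_x"},
    {"title": "Opinion: policy X is failing",  "topic": "anti_x"},
    {"title": "Explainer: what policy X does", "topic": "neutral"},
]

click_history = ["pro_x", "pro_x", "pro_x", "neutral"]  # the user's past behavior

def rank_feed(articles, history):
    """Score each article by how often the user clicked its topic before."""
    return sorted(articles, key=lambda a: history.count(a["topic"]), reverse=True)

for article in rank_feed(articles, click_history):
    print(article["title"])
# The dissenting piece sinks to the bottom of the feed every time,
# so the user's existing view is quietly reinforced rather than challenged.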
Social media sites may become echo chambers of unquestioned opinions, facial recognition
software can convert our faces into surveillance instruments, and data monopolies can use our
information for their own financial advantage. Our ability to be true to ourselves, consider
other viewpoints, and choose our own pathways may be progressively restricted in this web
of control.
But keep in mind that autonomy is being threatened, not lost. By recognizing these hazards, demanding transparency from algorithms, and actively protecting our privacy, we can take back control and navigate the digital world by our own compass. In the digital age, autonomy is a constant battle, but one worth fighting for the sake of our own freedom and the integrity of our collective future.
Digital divide
At its core, the digital divide is the gap in digital access and literacy. It separates those who enjoy easy access to knowledge, opportunities, and communication from those who are left behind, contending with scarce resources, outdated technology, and a sense of isolation. This divide affects more than internet access alone; it shapes our capacity to reach essential services, take part in the online conversations that shape our lives, and navigate a world that is becoming ever more digital.
Example
The vast and geographically rugged region of Appalachia in the eastern United States is home
to millions of people, many of whom face stark challenges in accessing essential healthcare
services. This situation is further exacerbated by a profound digital divide that leaves many
residents struggling to reap the benefits of telehealth and other technology-driven healthcare
initiatives.
The Challenges:
 Limited Broadband Access: A significant portion of Appalachia lacks access to
reliable high-speed internet, a crucial prerequisite for telehealth consultations and
online healthcare resources. This lack of connectivity disproportionately affects rural
communities, leaving residents dependent on outdated methods of accessing
healthcare.
 Digital Literacy Gap: Many residents, particularly older adults, lack the necessary
digital literacy skills to navigate online healthcare platforms and utilize telehealth
services effectively. This lack of familiarity creates further barriers to accessing care.
 Economic Disparity: Poverty and low incomes are widespread in many Appalachian communities, limiting residents' ability to afford devices, data plans, and even basic medical care, and further fueling the healthcare disparity.
The Consequences:
 Delayed or Denied Care: The digital divide hinders timely access to medical
consultations and diagnostics, potentially leading to delayed diagnoses and treatment
for health conditions. Chronic and preventable diseases may flourish due to this lack
of preventative care.
 Social Isolation and Mental Health: Limited access to healthcare resources can
contribute to feelings of isolation and exacerbate mental health issues in these
communities.
 Economic Impact: Unmanaged health conditions can negatively impact productivity
and employment opportunities, perpetuating the cycle of poverty and hindering
economic development in the region.

Surveillance
Surveillance is the hidden underbelly of the modern world's glittering sea of connectivity and information. It is a vast and intricate web, woven from every click, swipe, and online interaction. Though frequently undetectable, this constant watching raises important questions about autonomy, privacy, and the very definition of freedom in the digital age. It resembles a prison in which inmates are continuously watched while the watcher is never seen; in the digital world, that invisible observer is a complicated web of corporations, governments, and algorithms, each with its own motives for tracking our every action.
Surveillance in the Digital World Takes Many Forms:
Data Tracking: Businesses and governments can gather and examine the digital traces left by our online actions, from searches to purchases. This information can be used for profiling, targeted advertising, and even behavior prediction.
Social Media Monitoring: Platforms such as Facebook and Twitter keep tabs on our conversations and feelings, building comprehensive profiles that are used to shape our decisions and monetize our attention.
Government Surveillance: Governments use advanced technologies to monitor internet communications in the name of security, raising concerns about mass surveillance and the erosion of civil liberties.
Consequences
Privacy Erosion: We feel exposed and vulnerable in the digital world as a result of the
ongoing gathering and analysis of personal data, which erodes our right to privacy.
Manipulation and Bias: By steering our decisions, filtering out information, and sometimes reinforcing preexisting biases, algorithms can restrict both our capacity for critical thought and our access to a variety of viewpoints.
Chilling Effect on Free Speech: Fear of being watched can lead to self-censorship and restrict our freedom of expression, to the detriment of online discourse and democratic engagement.
Unequal Influence Dynamics: By controlling and analyzing our data, data monopolies and powerful institutions amass enormous influence, giving rise to worries about the abuse and misuse of this information.

REFERENCES
https://www.wired.com/story/the-apple-card-didnt-see-genderand-thats-the-problem/
https://www.nytimes.com/2019/11/10/business/Apple-credit-card-investigation.html
https://www.theverge.com/2019/11/11/20958953/apple-credit-card-gender-discrimination-algorithms-black-box-investigation
https://www.iberdrola.com/social-commitment/what-is-digital-divide
https://www.brookings.edu/articles/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/
