
In this part, we will discuss disruptive technologies through the concept of ethics. What problems do these technologies create for us, and what are current examples of these problems?

-----------------------------------------Slide-1------------------------------

In this section, I will explain what I am going to talk about.

Possible Effects of Disruptive Technologies

- Artificial intelligence cannot be as successful as humans in the ethical decision-making process.
- Artificial intelligence bias and algorithmic injustice
- Our personal information is used to manipulate us or to manage our perception.
- Our right to be forgotten has been taken away from us.

With the introduction of advanced technologies into our lives, human beings have begun to encounter certain problems. Since the concept of ethics is grounded in human judgment, we observe that artificial intelligence cannot be as successful as humans in the ethical decision-making process. Although artificial intelligence bias does not seem to be an important issue now, in the future it may result in violations of the rights of minorities. In addition, as our personal information is stored on various platforms and shared with other organizations, we witness it being used to manipulate us or to manage our perception. Moreover, our past information stored on social media platforms remains accessible to other individuals against our will. Therefore, our right to be forgotten has been taken away from us.
-----------------------------------------------Slide-2-------------------------------
Ethical Decision-Making of AI

First, the issue I want to address is that artificial intelligence is not as successful as humans in making ethical decisions. Today, we see AI deployed at very critical points, such as autonomous cars and armed drones. In the event of an accident, autonomous cars have various problems making ethical decisions the way humans do. The reason for this is that artificial intelligence only makes decisions as it has been taught, but our concept of ethics is not as simple as it may seem, and for now it appears difficult to fully teach it to artificial intelligence. Another example is the artificial intelligence used by drones in defence systems: it can identify enemies and take the necessary position to attack them, but it cannot make the attack decision on its own; that decision must be approved by a human. This is because the decisions this technology makes on its own are not considered ethical.

-------------------------------------------------Slide-3-----------------------------------------------------------------

AI Bias and Algorithmic Injustice

I will continue with this topic: the decision-making process of artificial intelligence can also be unethical. Artificial intelligence carries the ideologies and prejudices of its developers into our lives. We call this situation "AI bias". Whatever data is given to an artificial intelligence during the learning process, the way it handles a subject is shaped by that data. In short, if the developers of an artificial intelligence do not provide data showing that black people are human, the system will perceive black people as objects, not humans. The same is possible for white people. The example I want to give may seem very simple, but if self-driving cars do not perceive black people, and treat hitting them as not a big problem, this can cause huge problems in the future. As you can see from the picture, when we search for "hand", we only see white people's hands.
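
To make this mechanism concrete, below is a minimal, purely illustrative Python sketch. The data, the two groups, and the detection threshold are hypothetical assumptions of mine, not taken from any real self-driving system; the point is only that a detector trained almost entirely on one group can fail badly on an under-represented group.

# A minimal sketch (hypothetical data) of how biased training data produces a
# biased model: a simple nearest-centroid "pedestrian detector" trained almost
# entirely on examples from one group.

import random

random.seed(0)

def make_samples(mean, n):
    """Generate n one-dimensional feature values around a group-specific mean."""
    return [random.gauss(mean, 1.0) for _ in range(n)]

# Training set: 1000 examples from group A, only 5 from group B.
group_a_train = make_samples(mean=0.0, n=1000)
group_b_train = make_samples(mean=5.0, n=5)

# The "model" is just the centroid of everything labelled "pedestrian",
# so it is dominated by group A.
all_train = group_a_train + group_b_train
centroid = sum(all_train) / len(all_train)

def is_pedestrian(x, threshold=2.0):
    # Classify as pedestrian only if the feature is close to the learned centroid.
    return abs(x - centroid) < threshold

# Evaluation: the model recognises group A almost perfectly,
# but misses most of group B, because B was barely present in training.
group_a_test = make_samples(mean=0.0, n=100)
group_b_test = make_samples(mean=5.0, n=100)

recall_a = sum(is_pedestrian(x) for x in group_a_test) / 100
recall_b = sum(is_pedestrian(x) for x in group_b_test) / 100
print(f"Detected in group A: {recall_a:.0%}")   # high
print(f"Detected in group B: {recall_b:.0%}")   # near zero

In this toy example the detector recognises nearly all of group A but almost none of group B, simply because group B was barely present in the training data.
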
Joy Buolamwini, the woman you see in the last image, is the founder of the Algorithmic Justice League. So, what is the Algorithmic Justice League? Let's learn about this subject with the video prepared by this organization.

https://www.youtube.com/watch?v=lIfFxtSN358

-----------------------------------------------Slide-4----------------------------------------------------------------

Manipulation Based on Our Personal Data

Do you think the ads you see while browsing stories on Instagram or searching on Google are random? If you think so, you are wrong, because these platforms recommend products to us using our Facebook and Google search histories. Although this may seem very innocent, in fact it causes us to buy things we do not need, or, in a more serious case, it may affect our votes in future presidential elections.
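
To illustrate the idea, here is a minimal, purely hypothetical Python sketch of interest-based ad selection. The ad inventory, keyword lists, and search history are invented for the example; the point is only that the ad you see is driven by your past searches rather than chosen at random.

# A minimal sketch (hypothetical ads, keywords, and history) of interest-based
# ad selection: ads are ranked by how well they match a user's search history.

from collections import Counter

# Hypothetical ad inventory, keyed by interest category.
ADS = {
    "running": "New trail running shoes, 20% off",
    "travel":  "Weekend flights to Rome from $89",
    "gaming":  "Pre-order the latest console bundle",
}

# Hypothetical keywords that map a search query to an interest category.
KEYWORDS = {
    "running": {"marathon", "sneakers", "5k"},
    "travel":  {"flights", "hotel", "rome"},
    "gaming":  {"console", "fps", "controller"},
}

def pick_ad(search_history):
    """Return the ad whose category best matches the user's search history."""
    scores = Counter()
    for query in search_history:
        words = set(query.lower().split())
        for category, keywords in KEYWORDS.items():
            scores[category] += len(words & keywords)
    best_category, _ = scores.most_common(1)[0]
    return ADS[best_category]

# Usage: the "random" ad is in fact driven entirely by past searches.
history = ["cheap flights to rome", "rome hotel deals", "best 5k sneakers"]
print(pick_ad(history))   # -> the travel ad
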
-------------------------------------------Slide-5--------------------------------------------------------------------
Cambridge Analytica Scandal
To briefly summarize the Cambridge Analytica scandal, it all started when Cambridge Analytica obtained the data of 87 million Facebook users by exploiting a flaw in the system. They used this data to create detailed personality profiles. Using these profiles, they predicted users' gender, race, personal preferences, beliefs and, most importantly, their political affiliation. Cambridge Analytica, which worked for Trump in the 2016 election, used these profiles for detailed micro-targeting and to publish and spread personalized political ads and manipulative news.

--------------------------------------Slide-6-------------------------------------------------------------------------

"RIGHT TO BE FORGETTEN
Isn't it scary that the mistakes we made in the past are still accessible to others somewhere
in the internet universe? Forgetting and being forgotten is a human condition. Individuals
have the right to forget or to be forgotten about their past. But this is very difficult in an
environment where our personal data or actions we have taken in the past are stored and
this information can be accessed by anyone. It is the most natural right for people to want
and demand that their past actions be forgotten to turn a new page, but this is very difficult
with the internet and social media. Even if we do not have an account on social media, the
post shared by another individual causes the situation that we wanted to forget or wanted to
be forgotten in the past, accessible to everyone. Fortunately, we can request the deletion of
information thanks to the "RIGHT TO BE FORGETTEN" decision taken by the European
Court of Justice in 2014.

----------------------------------------------------Slide-7-----------------------------------
