
Introduction to AI and Data Ethics

Prof. Keeley Crockett


Professor of Computational Intelligence
Department of Computing and Mathematics
Overview

• Why is looking at AI through an ethical lens important?
• What is Ethics?
• How does AI learn from data? – a simple example
• Ethical lenses and theories
• A framework for ethical decision making
AI in the news – the last 48 hours…
Profiling to the Extreme with Big Data

Available: https://www.aclu.org/ordering-pizza


What is Artificial intelligence Ethics?

“AI ethics is a set of values, principles, and techniques that employ widely accepted standards to guide moral conduct in the development and use of AI systems.”

Source: Gov.UK

New technology = new ethical problems

Should a device, a technique, or a technology be restricted because people can use it for illegal or harmful actions as well as beneficial ones?
Potential Harms Caused by AI Systems

• Bias and Discrimination


• Denial of Individual Autonomy, Recourse, and Rights
• Non-transparent, Unexplainable, or Unjustifiable Outcomes
• Invasions of Privacy
• Isolation and Disintegration of Social Connection

• Unreliable, Unsafe, or Poor-Quality Outcomes

Source: understanding_artificial_intelligence_ethics_and_safety.pdf (turing.ac.uk)


Problems with Artificial Intelligence and Ethics (1)

• Autonomous Vehicles
• The trolley problem: should you pull the lever to divert the runaway trolley onto the
side track? [Wikipedia]
• Do you take responsibility for killing one person to save several?
• Autonomous Weapons
• If robots fight our wars, will fewer humans die?
• Can robot decisions be tempered by compassion?
• Can we program in the rules of war?
• Can we program ethics?
• Could autonomous agents of war be hacked?

Credit to Judy Goldsmith


Problems with Artificial Intelligence and Ethics (2)
• The Future of Employment
• If robots and software agents
take over manufacturing,
farming, service industries,
education, etc., what will
people do?
• How will we share the
prosperity?
• Do robots have employment
rights?
• Source: The impact of AI on jobs
(publishing.service.gov.uk)

Credit to Judy Goldsmith


Background – A simple
Example of how AI learns
Machine Learning Workflow

Source: http://nkonst.com/wp-content/uploads/2014/03/ml-eng.png

How does AI Learn? - An example problem
• Imagine that I'm trying to predict whether my neighbor is
going to drive into work, so I can ask for a ride.
• Whether she drives into work seems to depend on the
following attributes of the day:
• temperature,
• expected precipitation,
• day of the week,
• what she's wearing.

(Adapted from Leslie Kaelbling's example in the MIT courseware)


Memory
Okay. Let's say we observe our neighbor on three days:

Temp  Precip  Day  Shop  Clothes  Outcome
25    None    Sat  No    Casual   Walk
-5    Snow    Mon  Yes   Casual   Drive
15    Snow    Mon  Yes   Casual   Walk
Memory

• Now, we find ourselves on a snowy, –5 degree Monday, when the neighbor is wearing casual clothes and going shopping.
• Do you think she's going to drive?

Temp  Precip  Day  Clothes  Outcome
25    None    Sat  Casual   Walk
-5    Snow    Mon  Casual   Drive
15    Snow    Mon  Casual   Walk
-5    Snow    Mon  Casual   ?
Memory
• The standard answer in this case is "yes".
• This day is just like one of the ones we've seen before, and so it seems
like a good bet to predict "yes."
• This is about the most rudimentary form of learning, which is just to
memorize the things you've seen before.

Temp  Precip  Day  Clothes  Outcome
25    None    Sat  Casual   Walk
-5    Snow    Mon  Casual   Drive
15    Snow    Mon  Casual   Walk
-5    Snow    Mon  Casual   Drive
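This memorisation strategy is easy to sketch in code. The Python fragment below is a hypothetical illustration (not taken from the lecture): it stores the observed days in a lookup table and answers only for exact matches.

```python
# Memorised observations: (temp, precip, day, clothes) -> outcome
observations = {
    (25, "None", "Sat", "Casual"): "Walk",
    (-5, "Snow", "Mon", "Casual"): "Drive",
    (15, "Snow", "Mon", "Casual"): "Walk",
}

def predict(temp, precip, day, clothes):
    """Pure memorisation: answer only if this exact day has been seen before."""
    return observations.get((temp, precip, day, clothes), "Unknown")

print(predict(-5, "Snow", "Mon", "Casual"))  # Drive
```

A day that differs in even one attribute returns "Unknown" – memorisation cannot generalise, which is why the noisier cases on the following slides need a different strategy.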
Noisy Data
• Things aren’t always as easy as they were in the previous case. What if you
get this set of noisy data?
• Now, we are asked to predict what's going to happen.
• We have certainly seen this case before.
• But the problem is that it has had different answers. Our neighbor is not entirely reliable.

Temp  Precip  Day  Clothes  Outcome
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Drive
25    None    Sat  Casual   Drive
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Walk
25    None    Sat  Casual   ?
Averaging

• One strategy would be to predict the majority outcome.
• The neighbor walked more times than she drove in this situation, so we might predict "walk".

Temp  Precip  Day  Clothes  Outcome
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Drive
25    None    Sat  Casual   Drive
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Walk
25    None    Sat  Casual   Walk
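The majority-vote strategy can be sketched in a few lines of Python. This is a hypothetical illustration using the seven observed outcomes from the table:

```python
from collections import Counter

# The seven observed outcomes for a 25-degree, dry, casual Saturday
outcomes = ["Walk", "Walk", "Drive", "Drive", "Walk", "Walk", "Walk"]

def majority(outcomes):
    """Predict the most frequent outcome seen in this situation."""
    return Counter(outcomes).most_common(1)[0][0]

print(majority(outcomes))  # Walk (5 of the 7 observations)
```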
Generalization
What about a previously unseen case? Will she walk or drive?

Temp  Precip  Day  Clothes  Outcome
22    None    Fri  Casual   Walk
3     None    Sun  Casual   Walk
10    Rain    Wed  Casual   Walk
30    None    Mon  Casual   Drive
20    None    Sat  Formal   Drive
25    None    Sat  Casual   Drive
-5    Snow    Mon  Casual   Drive
27    None    Tue  Casual   Drive
24    Rain    Mon  Casual   ?
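One simple way to generalise to unseen cases is nearest-neighbour prediction: copy the outcome of the most similar day seen so far. The sketch below is a hypothetical illustration; the distance function is an invented, ad-hoc choice, not part of the lecture.

```python
# Training data from the table: (temp, precip, day, clothes) -> outcome
data = [
    ((22, "None", "Fri", "Casual"), "Walk"),
    ((3,  "None", "Sun", "Casual"), "Walk"),
    ((10, "Rain", "Wed", "Casual"), "Walk"),
    ((30, "None", "Mon", "Casual"), "Drive"),
    ((20, "None", "Sat", "Formal"), "Drive"),
    ((25, "None", "Sat", "Casual"), "Drive"),
    ((-5, "Snow", "Mon", "Casual"), "Drive"),
    ((27, "None", "Tue", "Casual"), "Drive"),
]

def distance(a, b):
    """Temperature gap plus a fixed penalty for each mismatched attribute."""
    return abs(a[0] - b[0]) + sum(10 for x, y in zip(a[1:], b[1:]) if x != y)

def predict(query):
    """1-nearest-neighbour: return the outcome of the closest stored day."""
    return min(data, key=lambda row: distance(row[0], query))[1]

print(predict((24, "Rain", "Mon", "Casual")))  # Drive
```

Under this arbitrary metric the unseen rainy Monday is closest to the warm, dry Monday on which she drove, so the prediction is "Drive"; a different metric, or a learned model such as a decision tree, could reasonably answer "Walk" instead.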
We now have data from two neighbours…

Neighbour 1
Temp  Precip  Day  Clothes  Outcome
22    None    Fri  Casual   Walk
3     None    Sun  Casual   Walk
10    Rain    Wed  Casual   Walk
30    None    Mon  Casual   Drive
20    None    Sat  Formal   Drive
25    None    Sat  Casual   Drive
-5    Snow    Mon  Casual   Drive
27    None    Tue  Casual   Drive
24    Rain    Mon  Casual   Walk

Neighbour 2
Temp  Precip  Day  Clothes  Outcome
22    None    Fri  Formal   Drive
3     None    Sun  Formal   Drive
10    Rain    Wed  Formal   Drive
30    None    Mon  Formal   Drive
20    None    Sat  Formal   Drive
25    None    Sat  Formal   Drive
-5    Snow    Mon  Formal   Drive
27    None    Tue  Formal   Drive

Can you see anything different between the two neighbours?

Neighbour 1 is a registered disabled person.
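The two-neighbour example hints at a real ethical hazard: if an observable attribute (here, clothing) correlates with who a person is – including a protected characteristic such as disability – a learner can seize on it as a proxy. The Python below is a hypothetical sketch with invented counts; it mimics the single split a decision tree would likely make on such data.

```python
from collections import Counter

# Invented pooled records from both neighbours: (clothes, outcome).
# One neighbour always wears formal clothes and always drives.
records = [
    ("Casual", "Walk"), ("Casual", "Walk"), ("Casual", "Walk"),
    ("Casual", "Walk"), ("Casual", "Drive"), ("Casual", "Drive"),
    ("Formal", "Drive"), ("Formal", "Drive"), ("Formal", "Drive"),
    ("Formal", "Drive"), ("Formal", "Drive"), ("Formal", "Drive"),
]

def rule_for(clothes):
    """Majority outcome per clothing type - the split a tree would find first."""
    matching = [o for c, o in records if c == clothes]
    return Counter(matching).most_common(1)[0][0]

print(rule_for("Formal"))  # Drive
print(rule_for("Casual"))  # Walk
```

The model never sees disability status, yet its clothing rule effectively encodes which neighbour it is looking at – a reminder that removing a sensitive attribute from the data does not remove its influence.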
What is Ethics?
What do these words mean?
Values – “principles or standards of behaviour; one's judgement of what is important in life”

Morals – principles of right and wrong

Ethics – the study of standards of right and wrong behaviour
What Ethics is not….

• Ethics is not the same as feelings


• Ethics is not the same as religion.
• Ethics is not the same thing as following
the law.
• Ethics is not the same as following
culturally accepted norms
• Ethics is not science

Credits: Markkula Center for Applied Ethics, A Framework for Ethical Decision Making (scu.edu)
A Doctor's Dilemma

You are a doctor at a top hospital. You have six gravely ill patients,
five of whom are in urgent need of organ transplants. You can't help
them, though, because there are no available organs that can be
used to save their lives. The sixth patient, however, will die without a
particular medicine. If s/he dies, you will be able to save the other five
patients by using the organs of patient 6, who is an organ donor.
What do you do?

Option 1 - Keep patient 6 comfortable, but do not give him/her the medical care that could save his/her life, in order to save the other five patients

Option 2 - Save patient 6 and let the other five die; it's
unfortunate, but that's not your call to make
Service Robots in the Hotel Industry

Good or Bad ?

Whose Perspective ?

Ethical Lenses and Theories

• Utilitarian Ethics
• Deontological Ethics
• Virtue Ethics
Six Ethical Lenses
• The Rights Lens – respects moral rights of those affected.
• The Justice Lens – each person has fair and equal
treatment.
• The Utilitarian Lens – emphasises the consequences of
our actions on others – do the most good and the least harm
• The Common Good Lens - mutual concern for the shared
interests of all members of a community.
• The Virtue Lens - A very ancient approach to ethics –
actions consistent with ideal virtues
• The Care Ethics Lens – Relationships - the need to listen
and respond to individuals in their specific circumstances.
DEONTOLOGICAL Ethics (Kantianism)

• This ethical lens focuses on moral rules, rights, principles, and duties.
• It requires careful ethical reflection and judgment.
• The ethical issues and concerns include:
  • Autonomy (the extent to which people can freely choose for themselves)
  • Dignity (the extent to which people are valued in themselves, not as objects with a price)
  • Transparency (honest, open, and informed conditions of social treatment/distribution)

Example of Deontological Ethical Issues in Tech Practice

In what way does a virtual banking assistant that is deliberately designed to deceive users (for example, by actively representing itself as a human) violate a moral rule or principle, such as the Kantian imperative never to treat a person as a mere means to an end? Would people be justified in feeling wronged by the bank upon discovering the deceit, even if they had not been financially harmed by the software? Does a participant in a commercial financial transaction have a moral right not to be lied to, even if a legal loophole means there is no legal right violated here?
Deontological Questions for Technologists that Illuminate the
Ethical Landscape:

• What rights of others & duties to others must we respect in a particular context?
• How might the dignity & autonomy of each stakeholder be impacted by this project?
• Does our project treat people in ways that are transparent and to which they would consent?
• Are our choices/conduct of the sort that I/we could find universally acceptable?
• Does this project involve any conflicting moral duties to others, or conflicting stakeholder rights? If so, how can we prioritize these?
• Which moral rights/duties involved in this project may be justifiably overridden by higher ethical duties, or more fundamental rights?
The Utilitarian Perspective
• Utilitarianism is attractive to many engineers because in theory it
implies the ability to quantify the ethical analysis and select for the
optimal outcome
• Requires us to consider equally the welfare of all affected
stakeholders,
• The ethical issues and concerns frequently highlighted by looking
through this ethical lens include, but are not limited to:
• Happiness (in a comprehensive sense, including such
factors as physical, mental, and other forms of well-being)
• Balancing of stakeholder interests (who is benefitting and
who is being harmed, in what ways and to what degree,
and how many)
• Prediction of consequences (some consequences can be
predicted and others cannot; still, one should account for
all reasonably foreseeable effects of this action)
The Utilitarian Perspective: Related Questions for
Technologists

• Who are all the people who are likely to be directly and indirectly affected by this project? In
what ways?
• Will the effects in aggregate likely create more good than harm, and what types of good and
harm? What are we counting as well-being, and what are we counting as harm/suffering?
• What are the most morally significant harms and benefits that this project involves? Is our view
of these concepts too narrow, or are we thinking about all relevant types of harm/benefit
(psychological, political, environmental, moral, cognitive, emotional, institutional, cultural, etc.)?
• How might future generations be affected by this project?
• Have we adequately considered ‘dual-use’ and downstream effects other than those we intend?
• Have we considered the full range of actions/resources/opportunities available to us that might
boost this project’s potential benefits and minimize its risks?
• Are we settling too easily for an ethically ‘acceptable’ design or goal (‘do no harm’), or are there
missed opportunities to set a higher ethical standard and generate even greater benefits?



Problems of Utilitarian Ethics

1. Comparing and measuring the consequences of alternative actions is very difficult.
    One problem that follows from this is that, because of these difficulties, there will be a tendency to ignore the consequences, especially the harmful consequences, to anyone other than those closest to us.
2. Do the ends justify the means?
    Are there not certain decisions that should be followed no matter what the consequences?
The Common Good Perspective
• Focuses on the impact of a practice on the health and welfare of communities or groups
of people.
• The ethical issues and concerns frequently highlighted by looking through this ethical lens
include,
• Communities (of varying scales, ranging from families to neighbourhoods, towns,
provinces, nations, and the world)
• Relationships (not only among individuals, but also relationships in a more holistic
sense of groups, including nonhuman animals and the natural world as well)
• Institutions of governance (and the ways in which these networked institutions
interact with each other)
• Economic institutions (including corporations and corporate cultures, trade
organizations, etc.)
• Other social institutions (such as religious groups, alumni associations, professional
associations, environmental groups, etc.)
Common Good Questions for Technologists that Illuminate
the Ethical Landscape

• Does this project benefit many individuals, but only at the expense of the common good?
• Does it do the opposite, by sacrificing the welfare or key
interests of individuals for the common good? Have we
considered these tradeoffs, and determined which are
ethically justifiable?
• What might this technology do for or to social institutions
such as various levels of government, schools, hospitals,
churches, infrastructure, and so on?
• What might this technology do for or to the larger
environment beyond human society, such as ecosystems,
biodiversity, sustainability, climate change, animal welfare,
etc.?
Source: Five Ways To Shape Ethical Decisions: Common Good Approach - Capsim
Virtue Ethics (character-based ethics)

• A right act is the action a virtuous person would do in the same circumstances.
• Concerns the rightness or wrongness of individual actions.
• Provides guidance as to the sort of characteristics and behaviours a good person will seek to achieve.
• Virtue ethics teaches:
  • An action is only right if it is an action that a virtuous person would carry out in the same circumstances.
  • A virtuous person is a person who acts virtuously.
  • A person acts virtuously if they "possess and live the virtues".
  • A virtue is a moral characteristic that a person needs to live well.

Good points of virtue ethics
• It centres ethics on the person and what it means to be human.
• It includes the whole of a person's life.

Bad points of virtue ethics
• It doesn't provide clear guidance on what to do in moral dilemmas; presumably, a totally virtuous person would know what to do, and we could consider them a suitable role model to guide us.
• There is no general agreement on what the virtues are (they may differ across cultures).
Virtue Ethics for Technologists that Illuminate the Ethical
Landscape
• What design habits are we regularly embodying, and are they the habits of excellent designers?
• Would we want future generations of technologists to use our practice as the example to follow?
• What habits of character will this design/project foster in users and other affected stakeholders?
• Will this design/project weaken or disincentivize any important human habits, skills, or virtues
that are central to human excellence (moral, political, or intellectual)? Will it strengthen any?
• Will this design/project incentivize any vicious habits or traits in users or other stakeholders?
• Are our choices and practices generally embodying the appropriate ‘mean’ of conduct (relative to the context)? Or are they extreme (excessive or deficient) in some ways?
• Is there anything unusual about the context of this project that requires us to reconsider or
modify the normal ‘script’ of good design practice? Are we qualified and in a position to safely
and ethically make such modifications to normal design practice, and if not, who is?
• What will this design/project say about us as people in the eyes of those who receive it?
• Will we, as individuals and as a team/organization, be proud to have our names associated with
this project one day?
Global Ethical Perspectives

• Technologists must remain vigilant and humble enough to remember that whatever ethical frameworks may be most familiar or ‘natural’ to them and their colleagues, these amount to only a tiny fraction of the ways of seeing the ethical landscape that their potential users and impacted communities may adopt.

• Technologists will design with ethical values in mind; the only question is whether they will do so in ways that are careful, reflective, explicit, humble, transparent, and responsive to stakeholder feedback, or in ways that are arrogant, opaque, and irresponsible.
Questions for Technologists that Illuminate the Global
Ethical Landscape
• Have we invited and considered the ethical perspectives of users and communities other than our
own, including those quite culturally or physically remote from us? Or have we fallen into the trap
of “designing for ourselves”?
• How might the impacts and perceptions of this design/project differ for users and communities
with very different value-systems and social norms than those local or familiar to us? If we don’t
know, how can we learn the answer?
• The vision of the ‘good life’ dominant in tech-centric cultures of the West is far from universal.
Have we considered the global reach of technology and the fact that ethical traditions beyond the
West often emphasize values such as social harmony and care, hierarchical respect, honor,
personal sacrifice, or social ritual far more than we might?
• In what cases should we refuse, for compelling ethical reasons, to honor the social norms of
another tradition, and in what cases should we incorporate and uphold others’ norms?
• How will we decide, and by what standard or process?
A Framework for Ethical Decision Making
A Framework for Ethical Decision Making

Identify the Ethical Issues

1. Could this decision or situation be damaging to someone or to some group, or unevenly beneficial to people? Does this decision involve a choice between a good and bad alternative, or perhaps between two “goods” or between two “bads”?
2. Is this issue about more than solely what is legal or what is
most efficient? If so, how?

A Framework for Ethical Decision Making

Get the Facts

3. What are the relevant facts of the case? What facts are not
known? Can I learn more about the situation? Do I know
enough to make a decision?
4. What individuals and groups have an important stake in the
outcome? Are the concerns of some of those individuals or
groups more important? Why?
5. What are the options? Have all the relevant persons and
groups been consulted? Have I identified creative options?

A Framework for Ethical Decision Making

6. Evaluate Alternative Actions – ask the following questions

• Which option best respects the rights of all who have a stake? (The
Rights Lens)
• Which option treats people fairly, giving them each what they are
due? (The Justice Lens)
• Which option will produce the most good and do the least harm for as
many stakeholders as possible? (The Utilitarian Lens)
• Which option best serves the community as a whole, not just some
members? (The Common Good Lens)
• Which option leads me to act as the sort of person I want to be? (The
Virtue Lens)
• Which option appropriately takes into account the relationships,
concerns, and feelings of all stakeholders? (The Care Ethics Lens)
A Framework for Ethical Decision Making

Choose an Option for Action and Test It

7. After an evaluation using all of these lenses, which option best addresses the situation?
8. If I told someone I respect (or a public audience) which
option I have chosen, what would they say?
9. How can my decision be implemented with the greatest care
and attention to the concerns of all stakeholders?

A Framework for Ethical Decision Making

Implement Your Decision and Reflect on the Outcome

10. How did my decision turn out, and what have I learned
from this specific situation? What (if any) follow-up actions
should I take?

Summary

• Ethical lenses and theories provide us with a way to question appropriate uses of AI.

• Lab sessions – ETHICS IN TECH PRACTICE: A Toolkit – will explore a number of tools and case studies.

• Further Reading on Moodle
