Introduction To Data and AI Ethics
• Autonomous Vehicles
  • The trolley problem: should you pull the lever to divert the runaway trolley onto the side track? [Wikipedia]
  • Do you take responsibility for killing one person to save several?
• Autonomous Weapons
  • If robots fight our wars, will fewer humans die?
  • Can robot decisions be tempered by compassion?
  • Can we program in the rules of war?
  • Can we program ethics?
  • Could autonomous agents of war be hacked?
How Does AI Learn? An Example Problem
• Imagine that I'm trying to predict whether my neighbor is going to drive in to work, so I can ask for a ride.
• Whether she drives in to work seems to depend on the following attributes of the day:
  • temperature,
  • expected precipitation,
  • day of the week,
  • what she's wearing.
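The example above is a small classification task: given attributes of the day, predict a yes/no label from past observations. As a minimal sketch (the toy data, attribute names, and the `one_rule` helper are all invented here for illustration, not part of the slides), a "one rule" learner picks whichever single attribute best predicts past behaviour:

```python
from collections import Counter, defaultdict

# Hypothetical past observations: each day is described by the four
# attributes from the slide, plus whether the neighbor drove (the label).
days = [
    {"temp": "cold", "precip": "snow", "day": "Mon", "clothes": "casual", "drove": True},
    {"temp": "warm", "precip": "none", "day": "Tue", "clothes": "casual", "drove": False},
    {"temp": "cold", "precip": "rain", "day": "Wed", "clothes": "formal", "drove": True},
    {"temp": "warm", "precip": "none", "day": "Thu", "clothes": "formal", "drove": False},
    {"temp": "cold", "precip": "none", "day": "Fri", "clothes": "casual", "drove": True},
]

def one_rule(data, attributes, label="drove"):
    """Pick the attribute whose values best predict the label
    (the classic 1R baseline learner)."""
    best_attr, best_rule, best_errors = None, None, len(data) + 1
    for attr in attributes:
        # For each value of this attribute, tally the labels seen with it.
        by_value = defaultdict(Counter)
        for row in data:
            by_value[row[attr]][row[label]] += 1
        # The rule predicts the majority label for each value.
        rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        # Errors = observations that disagree with the majority label.
        errors = sum(sum(c.values()) - max(c.values()) for c in by_value.values())
        if errors < best_errors:
            best_attr, best_rule, best_errors = attr, rule, errors
    return best_attr, best_rule

attr, rule = one_rule(days, ["temp", "precip", "day", "clothes"])
print(attr, rule)  # On this toy data: temp {'cold': True, 'warm': False}
```

Real learners (decision trees, for instance) extend this idea by splitting on several attributes in sequence, but the core mechanism is the same: generalise a rule from labelled past examples.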
What Ethics Is Not…
Credits: Markkula Center for Applied Ethics, A Framework for Ethical Decision Making (scu.edu)
A Doctor's Dilemma
You are a doctor at a top hospital. You have six gravely ill patients,
five of whom are in urgent need of organ transplants. You can't help
them, though, because there are no available organs that can be
used to save their lives. The sixth patient, however, will die without a
particular medicine. If s/he dies, you will be able to save the other five
patients by using the organs of patient 6, who is an organ donor.
What do you do?
Option 2: Save patient 6 and let the other five die; it's unfortunate, but that's not your call to make.
Service Robots in the Hotel Industry
Good or Bad?
Whose Perspective?
• Who are all the people who are likely to be directly and indirectly affected by this project? In
what ways?
• Will the effects in aggregate likely create more good than harm, and what types of good and
harm? What are we counting as well-being, and what are we counting as harm/suffering?
• What are the most morally significant harms and benefits that this project involves? Is our view
of these concepts too narrow, or are we thinking about all relevant types of harm/benefit
(psychological, political, environmental, moral, cognitive, emotional, institutional, cultural, etc.)?
• How might future generations be affected by this project?
• Have we adequately considered ‘dual-use’ and downstream effects other than those we intend?
• Have we considered the full range of actions/resources/opportunities available to us that might
boost this project’s potential benefits and minimize its risks?
• Are we settling too easily for an ethically ‘acceptable’ design or goal (‘do no harm’), or are there
missed opportunities to set a higher ethical standard and generate even greater benefits?
The Common Good Perspective
• Focuses on the impact of a practice on the health and welfare of communities or groups
of people.
• The ethical issues and concerns frequently highlighted by looking through this ethical lens include:
• Communities (of varying scales, ranging from families to neighbourhoods, towns,
provinces, nations, and the world)
• Relationships (not only among individuals, but also relationships in a more holistic
sense of groups, including nonhuman animals and the natural world as well)
• Institutions of governance (and the ways in which these networked institutions
interact with each other)
• Economic institutions (including corporations and corporate cultures, trade
organizations, etc.)
• Other social institutions (such as religious groups, alumni associations, professional
associations, environmental groups, etc.)
Common Good Questions for Technologists that Illuminate
the Ethical Landscape
A Framework for Ethical Decision Making
3. What are the relevant facts of the case? What facts are not
known? Can I learn more about the situation? Do I know
enough to make a decision?
4. What individuals and groups have an important stake in the
outcome? Are the concerns of some of those individuals or
groups more important? Why?
5. What are the options? Have all the relevant persons and
groups been consulted? Have I identified creative options?
• Which option best respects the rights of all who have a stake? (The
Rights Lens)
• Which option treats people fairly, giving them each what they are
due? (The Justice Lens)
• Which option will produce the most good and do the least harm for as
many stakeholders as possible? (The Utilitarian Lens)
• Which option best serves the community as a whole, not just some
members? (The Common Good Lens)
• Which option leads me to act as the sort of person I want to be? (The
Virtue Lens)
• Which option appropriately takes into account the relationships,
concerns, and feelings of all stakeholders? (The Care Ethics Lens)
10. How did my decision turn out, and what have I learned
from this specific situation? What (if any) follow-up actions
should I take?
Summary