
Fighting Algorithmic Bias

Before watching

1. Discuss these questions with your classmates / your teacher.

a. What are self-learning algorithms typically used for?


b. Have you ever worked with them?
c. What are some of the challenges of working with them?
d. What companies use face recognition software?
e. Have you ever encountered issues with face recognition software as a
user? If so, which ones? If not, which issues do you think people might
encounter?

2. You will watch a TED Talk in which MIT grad student Joy Buolamwini
explains how she’s fighting bias in algorithms. What do you think this
idea refers to?

3. Match the adjectives in column A with their collocations in column B.
Use a dictionary if necessary.

A B
coded experiences
algorithmic pace
massive practices
rapid bias
exclusionary gaze
discriminatory scale

While watching

4. Watch the video until 0:55 and check your answers to exercise 3. Then
answer these questions.

a. Why does she consider herself a poet of code?


b. What is the coded gaze?
c. What are some possible consequences of algorithmic bias?
5. Watch the next part until 2:25. Write True (T) or False (F) and correct
the false ones.

a. The Aspire Mirror is a system that projects digital masks on someone’s
reflection.
b. To build the Aspire Mirror, Joy used facial recognition software only
available at MIT Media Lab.
c. The system helped Joy feel better every morning.
d. She had encountered that problem before.
e. She had to get a robot to play hide-and-seek.

6. Continue watching until 3:36. Explain the incident with the social
robot in Hong Kong and the conclusion they arrived at using the words
and phrases from the box.

entrepreneurship competition - local start-ups - demo - developers -
generic - machine learning techniques - training set

7. Watch until 4:59 and match the sentence halves below.

1. Training sets don't just...
2. Algorithmic bias...
3. Across the US, police departments...
4. Georgetown Law published a report...
5. Labeling faces consistently...
6. Misidentifying a suspected criminal...

a. ...are starting to use facial recognition software in their
crime-fighting arsenal.
b. ...remains a challenge.
c. ...showing that one in two adults in the US have their faces in
facial recognition networks.
d. ...is no laughing matter.
e. ...can also lead to discriminatory practices.
f. ...materialize out of nowhere.


8. Before watching the rest of the video, in pairs / with the teacher,
write down three possible ways to make coding bias-free in the left
column of the chart below.

Our ideas Joy Buolamwini’s ideas

9. Finish watching the video and complete the chart with the ideas
suggested by Joy. How do the speaker’s ideas compare with yours?

After watching

10. Discuss these questions with your classmates / the teacher.

a. What is the purpose of the video?


b. How successful is the speaker in achieving this purpose?
c. What surprised you the most about the video?

11. Read these excerpts. Discuss with your classmates / the teacher.

“If the training sets aren't really that diverse, any face that deviates
too much from the established norm will be harder to detect, which is
what was happening to me.”

“Some judges use machine-generated risk scores to determine how long an
individual is going to spend in prison.”

“Who codes matters, how we code matters and why we code matters.”
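For teachers or students who want to see the first excerpt's point concretely, here is a minimal, hypothetical sketch. A single brightness number stands in for a face, and the "detector" only learns the average of its training faces; this is a toy illustration, not the actual software discussed in the talk.

```python
# Toy illustration (hypothetical numbers): a "detector" that learns only
# the average feature value of its training faces, then accepts a new
# face if it lies close enough to that learned norm.

def train(samples):
    """Learn the mean feature value of the training set."""
    return sum(samples) / len(samples)

def detects(model_mean, face, tolerance=0.2):
    """A face is 'detected' only if it is near the learned norm."""
    return abs(face - model_mean) <= tolerance

# A non-diverse training set: all values clustered around 0.8.
skewed_training_set = [0.78, 0.81, 0.80, 0.82, 0.79]
model = train(skewed_training_set)

print(detects(model, 0.79))  # a face near the norm is detected
print(detects(model, 0.35))  # a face far from the norm is missed
```

A face that "deviates too much from the established norm" falls outside the tolerance and is simply not detected, which mirrors what Joy describes happening to her.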
Focus on lexis

12. Read these excerpts from the video. Then, answer the questions.

a. “I'm a graduate student at the MIT Media Lab, and there I have the
opportunity to work on all sorts of whimsical projects, including the Aspire
Mirror, a project I did so I could project digital masks onto my reflection.”

Look at the use of the word "project”. Which two are nouns? Which one
is a verb?

How are they pronounced? Listen again and choose the correct option.

/ˈprɒdʒɛkt/ is the noun / verb. /prəˈdʒɛkt/ is the noun / verb.

Use "project" as a noun and a verb in examples. Read them aloud.

b. “Unfortunately, I've run into this issue before.”

What kind of issue did the speaker run into?

What kind of issues do you usually run into at work? Can you think of a
synonym of the phrasal verb “run into”?

c. "My friends and I laugh all the time when we see other people mislabeled
in our photos. But misidentifying a suspected criminal is no laughing
matter, nor is breaching civil liberties."

What does the prefix "mis" mean in both examples in bold?

Think of two more examples starting with this prefix and use them in
meaningful sentences.

___________________________________________________________________
___________________________________________________________________
___________________________________________________________________
Follow-up task

13. In pairs / small groups / with your teacher, discuss these questions.

Joy said: "In her book, 'Weapons of Math Destruction,' data scientist
Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious
and destructive algorithms that are increasingly being used to make
decisions that impact more aspects of our lives." What examples did she
mention?

Are you familiar with these three examples of AI bias? What do they
consist of?

1. Racism embedded in US healthcare


2. COMPAS (Correctional Offender Management Profiling for Alternative
Sanctions)
3. Amazon’s hiring algorithm

14. Read the article Real-life Examples of Discriminating Artificial
Intelligence and discuss the ideas.

15. Share your conclusions or write a short essay.

Can these types of AI bias, or others, be found in your country?

As Terence Shin pointed out in the article, "Discrimination undermines
equal opportunity and amplifies oppression." What are some best
practices that everyone should follow to minimize AI bias, according to
the video, the article and your experience?

Daniela Ubieta & Silvina Mascitti - ©2022
