Taking Aim at Biased Algorithms
Rachel Levy

Cathy O'Neil is a mathematician, data scientist, author, activist, and blogger (at mathbabe.org). She has worked in higher education, on Wall Street, and for various start-up companies.
Her 2016 book Weapons of Math Destruction aims to
help the public better understand the impact that algo-
rithms have on everyone’s life. Many students trained in
the mathematical sciences take jobs directly addressed
in the book, such as in data science and finance. O’Neil
raises new issues about the impact of mathematicians’
work on society. As topics in the news heighten public
awareness and concern about the power and role of
algorithms, mathematicians have an opportunity to provide new tools to foster transparency, equity, and benevolence.

[Photo: Cathy O'Neil. Credit: Adam Morganstern.]

I chatted with Cathy O'Neil in January. This interview has been edited for length and clarity.

Rachel Levy: What are the main take-home messages of Weapons of Math Destruction?

Cathy O'Neil: My book describes the way mathematics and the trust of mathematics is used against the public. The public trusts and fears mathematics. So when marketers, salespeople, or even scientists represent algorithms as mathematical and when they represent machine learning as sophisticated and opaque, people do not question them. Therefore, those automated systems remain almost entirely unaccountable, and bad things can happen. Algorithms are not mathematics. They have mathematical attributes, but they are ultimately working with human-curated data on a human-curated agenda.

We—the developers—embed our biases in the algorithms. Not to mention we have chosen what data to use to train our algorithms, and the data we have chosen is a social construction. There's a belief that algorithms are somehow dealing with objective truth, but every step along the way requires human intervention. This misconception that algorithms are somehow revealing objective truth is the most important clarification that I want to make.

In fact, it is the opposite. These algorithms are often propagating historical biases and past mistakes. It is a particular shame when the brand of mathematics is being deployed to protect something that is fundamentally immoral and corrupt.

RL: What are some examples of weapons of math destruction (WMDs)?

CO: Weapons of math destruction are algorithms that are important, opaque, and destructive. There are examples all across normal everyday life for Americans. Examples include assessments for teachers that are happening in many large cities. They are called "value-added models" and have been addressed by the American Statistical Association. Judges use predictive policing and recidivism-risk algorithms for parole, bail, and sentencing. Political microtargeting algorithms inform […] is used by predatory for-profit colleges and payday lenders to target single black mothers who want a better life for their children.

These algorithms don't affect everyone equally. The working class and working poor are affected more often by these algorithms than highly educated and well-off people. Well-off people can get through their lives fairly unscathed by the kinds of corporate and government surveillance that I worry about in the book. I urge people to look at algorithms and weapons of math destruction through the lens of class and race.

The people building and deploying the algorithms are often well intentioned but naive. They are technologists—sometimes mathematicians, computer scientists, or statisticians—and they have an arm's-length perspective on the targets of the algorithms that they build. They do not acknowledge or understand the kind of effects they are creating.

RL: Can you name some algorithms that are designed and used in ways that are relatively fair and just? What qualities make them so?

CO: One of my favorite examples is sports. The amount of sports data is blossoming. The data comes from the games, and the games are on public view, such as national television. The top radio shows serve the purpose of cleaning the data—they'll talk endlessly about whether a play should have been an error or a base hit. We have transparency.

One of the most difficult aspects of building algorithms fairly is to have a well-defined and agreed-upon definition of success. In sports, success is clearly defined: a sports team wants to win games. You could argue they want to make as much profit as possible, but there is at least a correlation between the two. If you look under the covers of many of the algorithms that are destructive or ineffective, you'll see that their definition of success is ambiguous, incorrect, or so hard to measure that unintended consequences abound.

In the book I talk at length about the U.S. News & World Report college-ranking model. For the last 30 years, colleges have been trying to get ranked higher, and the definition of success in this model is an arbitrary set of attributes that do not include important factors such as tuition cost. The model has spun off an industry of side effects, including growing administrations and unnecessary expenditures, not to mention rising tuitions. That's just one example of many that illustrates how important it is to have a good definition of success.

The algorithms that are both important and nondestructive are the ones that help the people who are the least lucky or are suffering the most. There are colleges using algorithms to find struggling students, especially struggling freshmen, and connecting them with advising support.

It is important that the people who are struggling the most and raising the largest number of red flags are not being punished. There is an example of this that I blogged about—Mount St. Mary's University. They seemed to be doing this identification and expelling students before the U.S. News survey was due.

You asked for a good algorithm, and I came up with one that is being used to do good but could easily be used to do harm. Therein lies the conundrum. It really depends on the usage. A good rule of thumb is to ask: "Is it helping or punishing the people who are the worst off?"

Algorithms make moral decisions all the time, and the people programming the algorithms should not be the ones making those moral decisions. Right now, we are conflating the job of building the algorithm with answering the moral questions.

RL: What important things have happened since the hardcover version of your book was published that we can read about in the upcoming paperback version?

CO: The run-up to the election largely happened after I finalized my book. So the biggest gaping hole is the way propaganda, fake news, hoaxes, and the Facebook algorithm have further destroyed our concept of truth and our efforts at democracy. I talk about the Facebook algorithm and political microtargeting in the hardcover version, but I don't come out and say that the Facebook