Benjamin, R. (2019). Assessing risk, automating racism. Science, 366(6464), 421–422.


Photo caption: Racial bias in cost data leads an algorithm to underestimate health care needs of Black patients.



SOCIAL SCIENCE

Assessing risk, automating racism


A health care algorithm reflects underlying racial bias in society

By Ruha Benjamin

Department of African American Studies, Princeton University, Princeton, NJ, USA. Email: ruha@princeton.edu

As more organizations and industries adopt digital tools to identify risk and allocate resources, the automation of racial discrimination is a growing concern. Social scientists have been at the forefront of studying the historical, political, economic, and ethical dimensions of such tools (1–3). But most analysts do not have access to widely used proprietary algorithms and so cannot typically identify the precise mechanisms that produce disparate outcomes. On page 447 of this issue, Obermeyer et al. (4) report one of the first studies to examine the outputs and inputs of an algorithm that predicts health risk, and influences treatment, of millions of people. They found that because the tool was designed to predict the cost of care as a proxy for health needs, Black patients with the same risk score as White patients tend to be much sicker, because providers spend much less on their care overall. This study contributes greatly to a more socially conscious approach to technology development, demonstrating how a seemingly benign choice of label (that is, health cost) initiates a process with potentially life-threatening results. Whereas in a previous era, the intention to deepen racial inequities was more explicit, today coded inequity is perpetuated precisely because those who design and adopt such tools are not thinking carefully about systemic racism.

Obermeyer et al. gained access to the training data, algorithm, and contextual data for one of the largest commercial tools used by health insurers to assess the health profiles for millions of patients. The purpose of the tool is to identify a subset of patients who require additional attention for complex health needs before the situation becomes too dire and costly. Given increased pressure by the Affordable Care Act to minimize spending, most hospital systems now utilize predictive tools to decide how to invest resources. In addition to identifying the precise mechanism that produces biased predictions, Obermeyer et al. were able to quantify the racial disparity and create alternative algorithmic predictors.

Practically speaking, their finding means that if two people have the same risk score that indicates they do not need to be enrolled in a "high-risk management program," the health of the Black patient is likely much worse than that of their White counterpart. According to Obermeyer et al., if the predictive tool were recalibrated to actual needs on the basis of the number and severity of active chronic illnesses, then twice as many Black patients would be identified for intervention. Notably, the researchers went well beyond the algorithm developers by constructing a more fine-grained measure of health outcomes, by extracting and cleaning data from electronic health records to determine the severity, not just the number, of conditions. Crucially, they found that so long as the tool remains effective at predicting costs, the outputs will continue to be racially biased by design, even as they may not explicitly attempt to take race into account. For this reason, Obermeyer et al. engage the literature on "problem formulation," which illustrates that depending on how one defines the problem to be solved—whether to lower health care costs or to increase access to care—the outcomes will vary considerably.

To grasp the broader implications of the study, consider this hypothetical: The year is 1951 and an African American mother of five, Henrietta Lacks, goes to Johns Hopkins Hospital with pain, bleeding, and a knot in her stomach. After Lacks is tested and treated with radium tubes, she is "digitally triaged" (2) using a new state-of-the-art risk assessment tool that suggests to hospital staff the next course of action. Because the tool assesses risk using the predicted cost of care, and because far less has commonly been spent on Black patients despite their actual needs, the automated system underestimates the level of attention Lacks needs. On the basis of the results, she is discharged, her health rapidly deteriorates,

and, by the time she returns, the cancer has advanced considerably, and she dies.

This fictional scenario ends in much the same way as it did in reality, as those familiar with Lacks's story know well (5–7). But rather than getting assessed by a seemingly race-neutral algorithm applied to all patients in a colorblind manner, she was admitted into the Negro wing of Johns Hopkins Hospital during a time when explicit forms of racial discrimination were sanctioned by law and custom—a system commonly known as Jim Crow. However, these are not two distinct processes, but rather Jim Crow practices feed the "New Jim Code"—automated systems that hide, speed, and deepen racial discrimination behind a veneer of technical neutrality (1).

Data used to train automated systems are typically historic and, in the context of health care, this history entails segregated hospital facilities, racist medical curricula, and unequal insurance structures, among other factors. Yet many industries and organizations well beyond health care are incorporating automated tools, from education and banking to policing and housing, with the promise that algorithmic decisions are less biased than their human counterpart. But human decisions comprise the data and shape the design of algorithms, now hidden by the promise of neutrality and with the power to unjustly discriminate at a much larger scale than biased individuals.

For example, although the Fair Housing Act of 1968 sought to protect people from discrimination when they rent or buy a home, today social media platforms allow marketers to explicitly target advertisements by race, excluding racialized groups from the housing market without penalty (8). Although the federal government brought a suit against Facebook for facilitating digital discrimination in this manner, more recently the U.S. Department of Housing and Urban Development introduced a rule that would make it harder to fight algorithmic discrimination by lenders, landlords, and others in the housing industry. And unlike the algorithm studied by Obermeyer et al., which used a proxy for race that produced a racial disparity, targeted ads allow for explicit racial exclusion, which violates Facebook's own policies. Yet investigators found that the company continued approving ads excluding "African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina and Spanish speakers," all within minutes of an ad submission (8). So, whether it is a federal law or a company policy, top-down reform does not by itself dampen discrimination.

Labels matter greatly, not only in algorithm design but also in algorithm analysis. Black patients do not "cost less," so much as they are valued less (9). It is not "something about the interactions that Black patients have with the healthcare system" that leads to poor care, but the persistence of structural and interpersonal racism. Even health care providers hold racist ideas, which are passed down to medical students despite an oath to "do no harm" (10). The trope of the "noncompliant (Black) patient" is yet another way that hospital staff stigmatize those who have reason to question medical authority (11, 12). But a "lack of trust" on the part of Black patients is not the issue; instead, it is a lack of trustworthiness on the part of the medical industry (13). The very designation "Tuskegee study" rather than the official name, U.S. Public Health Service Syphilis Study at Tuskegee, continues to hide the agents of harm. Obermeyer et al. mention some of this context, but passive and sanitized descriptions continue to hide the very social processes that make their study consequential. Labels matter.

As researchers build on this analysis, it is important that the "bias" of algorithms does not overshadow the discriminatory context that makes automated tools so important in the first place. If individuals and institutions valued Black people more, they would not "cost less," and thus this tool might work similarly for all. Beyond this case, it is vital to develop tools that move from assessing individual risk to evaluating the production of risk by institutions so that, ultimately, the public can hold them accountable for harmful outcomes.

REFERENCES AND NOTES

1. R. Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Polity Press, 2019).
2. V. Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (St. Martin's Press, 2018).
3. S. Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press, 2018).
4. Z. Obermeyer, B. Powers, C. Vogeli, S. Mullainathan, Science 366, 447 (2019).
5. K. Holloway, Private Bodies, Public Texts: Race, Gender, and a Cultural Bioethics (Duke Univ. Press, 2011).
6. H. Landecker, Sci. Context 12, 203 (1999).
7. R. Skloot, The Immortal Life of Henrietta Lacks (Broadway Books, 2011).
8. J. Angwin, A. Tobin, M. Varner, "Facebook (still) letting housing advertisers exclude users by race," ProPublica, 21 November 2017; www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin.
9. E. Glaude Jr., Democracy in Black: How Race Still Enslaves the American Soul (Crown Publishers, 2016).
10. K. M. Bridges, Reproducing Race: An Ethnography of Pregnancy as a Site of Racialization (Univ. California Press, 2011).
11. A. Nelson, Body and Soul: The Black Panther Party and the Fight Against Medical Discrimination (Univ. Minnesota Press, 2011).
12. H. Washington, Medical Apartheid: The Dark History of Medical Experimentation on Black Americans from Colonial Times to the Present (Harlem Moon, Broadway Books, 2006).
13. R. Benjamin, People's Science: Bodies and Rights on the Stem Cell Frontier (Stanford Univ. Press, 2013).

10.1126/science.aaz3873
