Algorithms and The Social Organization of Work - Paper LSE - Ifeoma Ajunwa and Rachel Schlund - Oxford Handbook of Ethics of AI
This material has been electronically reproduced with the permission of the rightsholder for
exclusive use in the LSE Ethics of AI online short course offered by the London School of
Economics and Political Science (2023). Further distribution or transmission of this material by
any means is strictly prohibited.
Algorithms and the Social Organization of Work
Print Publication Date: Jul 2020 Subject: Law, IT and Communications Law
Online Publication Date: Jul 2020 DOI: 10.1093/oxfordhb/9780190067397.013.52
This chapter argues that the proliferation of automated algorithms in the workplace raises questions as to how they might be used in service of the control of workers. In particular, scholars have noted machine learning algorithms as prompting a data-centric reorganization of the workplace and a quantification of the worker. The chapter then considers ethical issues implicated by three emergent algorithmic-driven work technologies: automated hiring platforms (AHPs), wearable workplace technologies, and customer relationship management (CRM). AHPs are “digital intermediaries that invite submission of data from one party through preset interfaces and structured protocols, process that data via proprietary algorithms, and deliver the sorted data to a second party.” The use of AHPs spans every stage of the hiring process, from the initial sourcing of candidates to the eventual selection of candidates from the applicant pool. Meanwhile, wearable workplace technologies exist in a variety of forms that vary in design and use, from wristbands used to track employee location and productivity to exoskeletons used to assist employees performing strenuous labor. Finally, CRM is an approach to managing current and potential customers’ interaction and experience with a company using technology. CRM practices typically involve the use of customer data to develop customer insight to build customer relationships.
Keywords: automated algorithms, workplace, machine learning algorithms, algorithmic-driven work technologies,
automated hiring platforms, wearable workplace technologies, customer relationship management, hiring
process, customer data, customer relationships
PRINTED FROM OXFORD HANDBOOKS ONLINE (www.oxfordhandbooks.com). © Oxford University Press, 2018. All Rights
Reserved. Under the terms of the licence agreement, an individual user may print out a PDF of a single chapter of a title in
Oxford Handbooks Online for personal use (for details see Privacy Policy and Legal Notice).
Subscriber: London School of Economics and Political Science; date: 10 May 2021
Shoshana Zuboff argues that technology can never be introduced without an effect—rather, any new technology must be parsed for both its affordances and foreclosures.4
Harry Braverman’s seminal analysis of technology, Taylorist management, and the labor process demonstrated how managerial control is exerted through technology, and as Michael Burawoy noted in his ethnography, the most visible control technology in a factory was the assembly line.5 The line coerced workers to keep pace or risk penalties both monetary and social. The proliferation of automated algorithms in the workplace raises questions as to how they might be used in the service of control and coercion. As some scholars have argued, machine learning algorithms have “prompted a data-centric reorganization of the workplace” and a quantification of the worker in a manner and to a degree previously unseen in history.6 This chapter considers ethical issues implicated by three algorithmic-driven work technologies: (1) automated hiring platforms, (2) wearable workplace technologies, and (3) customer relationship management.
During the initial sourcing stage of the hiring process, companies use AHPs to source or find attractive candidates using targeted advertising or matching technology.9 Targeted advertising involves the use of machine learning algorithms to build predictive models, based on data from job seekers and their online activity, that automatically generate a pool of job seekers with predetermined, sought-out characteristics; companies can then use this pool to target job seekers with advertisements or to exclude them from viewing advertisements.10 In the case of matching technologies, companies typically use personalized job boards to automatically generate a list of potential job candidates who match the characteristics the company is looking for in a job candidate.11
Following the initial sourcing stage of the hiring process, companies typically use AHPs to screen applicants, for example through behavioral and personality assessments.
AHPs aid during the interviewing stage of the hiring process by creating interview guides for hiring managers based on areas of concern indicated by the assessments used in the screening process.16 AHPs may also make use of video interviews—job applicants’ interviews are recorded and their responses, vocal tone, and facial expressions are analyzed using machine learning algorithms, which compare current job applicants’ responses to past interview responses from the company’s top employees.17
During the final step in the hiring process, the selection phase, employers use AHPs in making the final hiring decision by automating background checks and helping to negotiate job offer terms. Automated background checks typically assess whether a job applicant has a criminal history and is authorized to work.18 Recently, some AHP vendors have also begun to offer social media background checks, in which a job applicant’s social media and online history is deployed to predict how likely the job candidate is to engage in toxic workplace behavior such as bullying, sexual harassment, or drug use.19

Employers can also use AHPs to help negotiate job offer terms. For example, some AHPs can use companies’ data regarding previous job offers and acceptances to create predictive models that assess how likely a given job candidate is to accept a given job offer.20 Further, the predictive models can suggest ways to increase the likelihood that a given job applicant will accept a given offer, such as by increasing the salary, bonus, or other benefits.21
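The kind of offer-acceptance model described above can be sketched as a simple logistic scorer. Everything here is illustrative: the weights, the salary figures, and the two-feature form are assumptions for demonstration, not any real vendor's model, whose coefficients would instead be fit from a company's historical offer and acceptance data.

```python
# Hypothetical sketch (coefficients invented for illustration): a logistic
# model estimating how likely a candidate is to accept an offer, and how
# raising the salary or adding a bonus shifts that likelihood.

import math

def accept_probability(offered_salary, candidate_expected_salary,
                       signing_bonus=0.0,
                       w_gap=0.00008, w_bonus=0.00004, bias=0.2):
    """Logistic function of the salary gap and bonus; in a real AHP the
    weights would be fit from historical offers and acceptances."""
    gap = offered_salary - candidate_expected_salary
    z = bias + w_gap * gap + w_bonus * signing_bonus
    return 1.0 / (1.0 + math.exp(-z))

p_low  = accept_probability(90_000, 100_000)          # offer below expectation
p_high = accept_probability(105_000, 100_000, 5_000)  # raised salary plus bonus
print(round(p_low, 2), round(p_high, 2))  # prints 0.35 0.69
```

The model's "recommendation" is just the direction in which the probability rises, which is how such a system can suggest increasing salary or bonus to close a candidate.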
The use of AHPs is on the rise. According to the Society for Human Resource Management, in 2016 approximately 22 percent of organizations used automated screening to review job applicants’ resumes. A survey conducted by Deloitte University Press in 2017 revealed that approximately 56 percent of companies surveyed were in the process of redesigning their human resources (HR) programs to utilize digital and mobile AHPs. Further, 51 percent of companies surveyed in 2017 were in the process of redesigning their companies to use digital business models. Additionally, 33 percent of the HR teams included in the survey were using artificial intelligence (AI) to implement HR solutions. The survey included data from 10,447 companies across the world, including Western Europe (25 percent), Latin and South America (17 percent), Asia (15 percent), North America (14 percent), Africa (10 percent), Central and Eastern Europe (8 percent), Nordic countries (7 percent), Oceania (3 percent), and the Middle East (2 percent). The companies surveyed were also from a variety of industries, including professional services (16 percent), financial services (13 percent), consumer business (13 percent), technology, media, and telecommunications (12 percent), manufacturing (11 percent), energy and resources (7 percent), life sciences and health care (6 percent), real estate (1 percent), and other (12 percent).22 Thus, the use of AHPs is worldwide and spans a variety of industries.
Information Asymmetry
The use of AHPs results in steep information asymmetries between job applicants and employers, since job applicants must provide volumes of personal information to attain employment. Employers, however, do not reciprocate: the information and insight they attain about job applicants is not shared with the applicants, and the applicants receive no comparable information about employers.26
Privacy Concerns
Coercing consent from job applicants to take part in personality assessments and to reveal identifying information that is used to gather additional information on applicants from third-party vendors yields an enormous amount of personal information. This information is stored, analyzed, and shared among industry partners in ways unknown to the job candidate.27 The collection, distribution, and storage of job applicants’ personal information create numerous privacy concerns regarding the security and confidentiality of sensitive employee data.
Encoding of Bias
AHP designers claim that automating the hiring process will reduce hiring bias. The promise of reduced hiring bias derives from the belief that automated hiring systems are blind to protected characteristics such as gender, age, ethnicity, and sexual orientation.28 However, the AHP is not an autonomous agent. The predictive models used to select employees from applicant pools are built from training data selected by the company. Training data is typically derived from the company’s current high-sales employees’ data, in which patterns are detected to build predictive models that select similar job applicants—a process sometimes referred to as “cloning your best people.”29 The process of “cloning your best people” can result in encoding human bias, since “systematically biased data produces systemically biased analyses, regardless of the quality of those analyses.”30 For example, if women and racial minorities were underrepresented in a company’s demographics, then the training data would create a predictive model that would similarly undervalue applicants from those underrepresented groups.31
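A minimal sketch of how “cloning your best people” can reproduce bias without the model ever seeing a protected attribute. The scorer, the feature names, and the toy data are all hypothetical; the point is only that features correlated with group membership act as proxies for it.

```python
# Hypothetical illustration (not any real AHP): a naive "clone your best
# people" scorer that ranks applicants by similarity to the features of
# past top performers. If historical hiring was biased, the bias is
# reproduced even though no protected attribute appears in the data.

from collections import Counter

def feature_weights(past_top_performers):
    """Weight each feature by how often it appears among past top performers."""
    counts = Counter(f for person in past_top_performers for f in person)
    total = len(past_top_performers)
    return {f: c / total for f, c in counts.items()}

def score(applicant, weights):
    """Sum the learned weights of the applicant's features."""
    return sum(weights.get(f, 0.0) for f in applicant)

# Past "best people" drawn from a historically homogeneous workforce:
# incidental features common in that cohort become proxies for it.
past = [
    {"degree", "lacrosse", "state_college"},
    {"degree", "lacrosse"},
    {"degree", "lacrosse", "referral"},
]
w = feature_weights(past)

applicant_a = {"degree", "lacrosse"}      # resembles the old cohort
applicant_b = {"degree", "coding_award"}  # equally qualified, different proxies
print(score(applicant_a, w) > score(applicant_b, w))  # prints True
```

The scorer prefers applicant_a purely because the training cohort did, which is the mechanism the quoted passage describes: biased data in, biased analysis out.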
Discrimination
The encoding of human bias in the predictive models used by AHPs can lead to discriminatory hiring practices. During the initial sourcing stage of the hiring process, the use of targeted advertising can result in discrimination when employers use protected characteristics such as ethnicity, gender, and age to target advertisements. Targeted advertising can also result in discrimination when employers target job advertisements using variables that are correlated with protected characteristics, such as zip code.32 During the screening stage of the hiring process, discrimination can result from using biased training data to predict minimum qualifications.33 Furthermore, behavioral and personality assessments used in the screening process have been demonstrated to discriminate against people of color.34 During the interview stage of the hiring process, the use of video interviews to analyze job applicants’ responses, vocal tone, and facial expressions can lead to discriminatory outcomes for people with regional and nonnative accents, speech impediments, visible disabilities, and darker skin tones due to design flaws in speech recognition and facial analysis software.35 During the selection phase of the hiring process, the use of social media background checks can reveal information about job applicants that employers are not legally permitted to use to inform hiring decisions, such as ethnicity, sexuality, disability, or pregnancy status.
Wearable workplace technologies in the form of smart glasses exist in a variety of designs, including see-through smart glasses with enhanced visual aid, smart glasses equipped with a thermographic camera, augmented reality glasses, and smart glasses with a monocular view.38 See-through smart glasses with enhanced visual aid essentially consist of glasses with the properties of a static computer—allowing employees to consult manuals or guides, transcribe notes, or look up records. A variety of industries use see-through smart glasses with enhanced visual aid, such as construction, manufacturing, field service, and healthcare, among others.39 Smart glasses equipped with a thermographic camera are typically used in construction and manufacturing to assess machine fatigue by monitoring surface temperature and temperature distribution to predict when a machine will overheat. Augmented reality (AR) smart glasses project virtual information onto an employee’s physical surroundings, allowing the employee to distinguish between virtual and physical information. To illustrate, factory employees who fix machines
can use AR glasses to project visual markers and schematics to guide their work.40 Smart glasses with a monocular view afford employees a wider viewing angle of their surroundings and can produce high-resolution images. Many industries have implemented smart glasses with a monocular view, including telemedicine, remote assistance, and warehousing.41
Wrist- and finger-worn wearable workplace technologies also come in a variety of designs and provide different affordances. Examples of finger-worn workplace wearables include rings equipped with scanners and Bluetooth technology that allow employees to scan inventory and communicate hands-free.42 Examples of wrist-worn workplace wearables include bracelets and bands equipped with GPS to track employee location, biosensors to record and analyze employee biometric data such as heart rate, accelerometers that track employee movement and activity, and wearable computer interfaces.43 Wrist-worn workplace wearables are used across a variety of industries, with global market revenue projected to reach US$70 billion by 2020.44
Smart caps and helmets designed for the workplace can be equipped with augmented reality, Bluetooth technology, voice recognition, and even sensors to detect brain activity.45 For example, smart caps and helmets equipped with augmented reality can be used to project blueprints and instructions for employees performing tasks such as welding and construction.46 Further, smart caps and helmets equipped with sensors to detect brain activity can be used to measure an employee’s level of fatigue or alertness.47 A variety of industries have adopted smart caps and helmets, such as fieldwork, transportation, construction, manufacturing, and product design.48
Wearable workplace technologies in the form of exoskeletons are essentially wearable robotics, typically designed to assist employees performing arduous tasks by increasing their strength and agility.49 For example, exoskeletons used in construction and manufacturing can assist employees in lifting heavy tools.50 Some exoskeletons are even equipped with technology that allows employers to monitor their employees’ location, mood, and physical health.51 A variety of industries have implemented exoskeletons, such as construction, manufacturing, and geriatric care.52
Wearable technology has proliferated in the workplace. A report by Frost & Sullivan revealed that between January 2013 and December 2015 approximately 2,955 patents were registered in the United States for wearable technology designed for the workplace—illustrating the flourishing development of wearable workplace technologies.53 Further, the majority of those patents were for wrist- and finger-worn wearables, such as wristwatches, ring scanners, and wrist bands—indicating that wrist- and finger-worn wearables designed for the workplace remain a significant trend.54
The implementation of wearable workplace technologies is also on the rise. A survey conducted by Forrester in 2016 revealed that 23 percent of information workers use wearable work technologies, a 63 percent increase from 2015.55 Further, an additional Forrester survey revealed that 62 percent of telecommunications organization executives described the implementation of wearable workplace technologies as a priority for their organization, up from 52 percent in 2014.56
Privacy Concerns
The increased use of wearable workplace technologies presents numerous privacy concerns regarding employee data. Wearable technologies collect volumes of sensitive, personal data from employees, creating issues regarding the collection, use, and distribution of employee data by employers.60 Further, wearable workplace technologies typically afford employers the ability to observe and track their employees’ location and physiological activity—a practice that may violate the National Labor Relations Act.61 Moreover, wearable workplace technologies collect large amounts of sensitive employee data, which remains stored on companies’ servers—leaving sensitive employee data vulnerable to hacking by third parties.62
Discrimination
Wearable workplace technologies also present concerns regarding discrimination. Since wearables collect employee health data, such technologies could potentially reveal medical information that could be used to discriminate against employees. For example, the use of wearable technologies could facilitate discrimination against employees who have a medical condition or disability that prevents them from reaching productivity standards.63 Additionally, wearables used in corporate wellness programs collect employee data outside of the workplace, such as activity levels, weight, heart rate, and sleep quality.64 The collection and use of employee data regarding health and lifestyle outside of the workplace present further discrimination concerns, as this data could be used to determine employee benefits and compensation.65
ing paid leave for workplace injuries.66 Further, data collected by wearable workplace technology could be used to deny employee compensation claims.67 For example, if an employee causes an accident that leads to injury at work, an employer could use biometric data collected by wearable workplace technology to determine whether the employee was sleep-deprived, shifting the blame onto the employee.68
ty. For example, wearable workplace technologies such as see-through smart glasses with enhanced visual aid or virtual reality glasses can cause distractions, which could lead to injury.69 Furthermore, poorly designed exoskeletons could potentially damage muscles, tendons, and nerves.70 Injuries caused by wearable workplace technologies could lead to employee lawsuits against employers.71
Inaccurate Predictions
Data collected by wearable workplace technologies are not always accurate. For example, employees can “game” design flaws in wearable workplace technology to skew the data it records.72 The use of wearable workplace technology is also a form of surveillance, which could make some individuals nervous, thus skewing the accuracy of the data collected.73
The definition of CRM implementation and use remains debated in the academic literature.78 Pennie Frow and Adrian Payne conceptualize CRM implementation and use as five integrated processes: (1) strategy development, (2) value creation, (3) multichannel integration and customer experience, (4) information management, and (5) performance assessment. The strategy development process involves assessing a firm’s business strategy to develop a customer strategy that creates a base for implementing CRM. The value-creation process involves utilizing a firm’s business and associated customer strategy to determine what value the firm can offer customers, what value customers can offer the firm, and how to integrate the exchange of both customer and company value. The multichannel integration and customer experience process involves deciding which channels (i.e., ways in which a firm interacts with customers) to use and integrate to improve customer experience. The information management process involves the collection, analysis,
and use of customer data to develop customer insight to inform marketing practices. Finally, the performance assessment process involves evaluating the implementation and use of CRM to ensure it is achieving the intended results outlined by a firm’s business strategy.79
tion.87 Essentially, the use of CRM to favor certain groups of customers at the expense of others leads to perceptions of unfairness, which can damage a company’s reputation, leading some customers to disengage from the company and spread negative information about it.88 In effect, such targeted promotions could also be seen as a form of digital redlining that disadvantages lower-income customers, a population that has large overlaps with racial minorities.89
Dishonesty
CRM practices may sometimes entail misleading and confusing customers by hiding relevant information, which may cause customers to make buying decisions to their disadvantage. For example, companies can use CRM to generate complex pricing schemas that make it difficult for customers to compare pricing with other service providers. Additionally, companies can use CRM to generate continuous price and rate changes so that customers do not have time to accurately compare pricing alternatives from other service providers. Further, CRM performance measurement systems and employee reward structures can encourage dishonest company behavior, such as overcharging and upselling customers.97
Discrimination
Collecting and analyzing consumer data to generate consumer insight to inform marketing practices, such as targeted advertising, constitutes an essential component of CRM.98 Companies collect and analyze consumer data to predict who their most profitable customers are and then target advertisements and promotions to those customers and other customers who share similar characteristics.99 This practice of targeted advertising using predictive analytics leads certain groups of customers to receive targeted advertising and promotions while excluding other groups of customers—a practice known as “digital redlining.”100 Digital redlining can lead to discrimination when companies exclude certain groups of customers based on protected characteristics, such as ethnicity,101 religion,102 pregnancy status,103 and gender,104 or on proxies for protected characteristics.105 For example, a ProPublica investigation revealed that companies were excluding customers by ethnic group from viewing targeted advertisements on Facebook using “ethnic affinity groups,” a Facebook feature that used online customer behavior to infer customer ethnicity—a proxy for the protected characteristic of ethnicity.106
Hannak et al. investigated the effect of race and gender in online freelance marketplaces.107 Specifically, Hannak and colleagues examined whether perceived gender and race influence worker evaluations by customers, whether the language of customers’ evaluations differed for workers of different perceived genders and races, and whether workers’ perceived gender and race are significantly associated with their search result rankings in online freelance marketplaces.108 Overall, the findings demonstrate an effect of perceived race and gender—that is, bias—in online freelance marketplaces. Hannak and colleagues hypothesized that biased customer evaluations of workers might indirectly cause the biased ranking of workers: since customer evaluations of workers revealed bias as a function of perceived race and gender, and worker rankings also revealed bias as a function of perceived race and gender, biased customer evaluations of workers might also cause biased rankings of workers.
Hannak and colleagues argue that online freelance marketplace designers should actively seek to mitigate the biases present in online freelance marketplaces. They present several examples, including limiting the number of customer evaluations of workers that can be viewed, limiting customers’ ability to pick workers by automating the process, and adjusting individual workers’ ratings when those ratings are a product of systematically biased evaluation by customers.
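One way a marketplace might operationalize the last of these suggestions, adjusting ratings that reflect systematically biased evaluation, is a group-mean correction. The adjustment rule and the toy data below are illustrative assumptions of my own, not Hannak et al.'s method.

```python
# Illustrative sketch: shift each worker's rating by the gap between the
# overall mean rating and their group's mean rating, so that group-level
# evaluation bias no longer depresses individual scores. Data invented.

from statistics import mean

def debias_ratings(workers):
    """workers: list of (worker_id, group, raw_rating).
    Returns {worker_id: adjusted_rating} with group-mean offsets removed."""
    overall = mean(r for _, _, r in workers)
    groups = {}
    for _, g, r in workers:
        groups.setdefault(g, []).append(r)
    group_mean = {g: mean(rs) for g, rs in groups.items()}
    return {wid: r + (overall - group_mean[g]) for wid, g, r in workers}

workers = [
    ("w1", "A", 4.8), ("w2", "A", 4.6),  # group A systematically rated higher
    ("w3", "B", 4.2), ("w4", "B", 4.0),  # group B systematically rated lower
]
adjusted = debias_ratings(workers)
```

After the correction, w1 and w3, who occupy the same relative position within their groups, end up with the same adjusted rating; the correction assumes the between-group gap is entirely evaluation bias, which is the strong assumption such a design would need to justify.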
Furthermore, Hannak and colleagues call for future research to investigate the impact of working conditions on the propensity for women and people of color to leave the freelance workforce. The researchers also call for future research to investigate the causal mechanisms behind the effect of biased customer evaluations of workers on hiring decisions.109
Chen and co-authors conducted an audit study of three resume search engines. Notably, they conducted experimental studies to investigate whether the resume search engines directly used inferred gender to rank job seekers’ posted resumes.110 The findings support the existence of indirect gender discrimination at both the individual and group level in resume search engine rankings, and they present several key implications. Chen and colleagues argue for the adoption of ranking algorithms that account for indirect discrimination by making ranking algorithms “group fair by design.” In other words, hiring websites that use ranking algorithms should ensure that their algorithms rank both male and female job candidates at a rate proportional to the distribution of the population. Furthermore, Chen and colleagues call for future research to investigate other hiring websites and how actual hirers or recruiters use resume search engines.111
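A ranking that is “group fair by design” in the proportional sense just described can be sketched as a greedy re-ranker: at each position, pick the highest-scoring candidate from whichever group is furthest below its target share. The groups, scores, and fifty-fifty target proportions below are illustrative assumptions, not Chen and colleagues' actual algorithm.

```python
# Illustrative sketch of proportional ("group fair by design") re-ranking.
# Within each group the original score order is preserved; across groups,
# every prefix of the list tracks the target population shares.

def fair_rerank(candidates, proportions):
    """candidates: list of (name, group, score); proportions: group -> share.
    Greedily emit, at each rank, the best remaining candidate from the group
    most underrepresented relative to its target share so far."""
    pools = {g: sorted([c for c in candidates if c[1] == g],
                       key=lambda c: -c[2]) for g in proportions}
    ranking, counts = [], {g: 0 for g in proportions}
    while any(pools.values()):
        k = len(ranking) + 1
        # deficit = target share minus share achieved at this prefix length
        g = max((g for g in pools if pools[g]),
                key=lambda g: proportions[g] - counts[g] / k)
        ranking.append(pools[g].pop(0))
        counts[g] += 1
    return ranking

cands = [("a", "m", 0.9), ("b", "m", 0.8), ("c", "m", 0.7),
         ("d", "f", 0.6), ("e", "f", 0.5)]
ranked = fair_rerank(cands, {"m": 0.5, "f": 0.5})
# Under a 50/50 target the groups alternate near the top of the list,
# even though raw scores would have placed all "m" candidates first.
```

The trade-off is visible in the example: a score-only ranking would put all three "m" candidates ahead of every "f" candidate, while the proportional ranking interleaves them at the cost of sometimes placing a lower-scored candidate higher.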
Bibliography
Ajunwa, Ifeoma. “The Paradox of Automation as Anti-Bias Intervention.” Cardozo Law Review 41 (forthcoming, 2020).
Ajunwa, Ifeoma, and Daniel Greene. “Platforms at Work: Automated Hiring Platforms and
Other New Intermediaries in the Organization of Work.” Research in the Sociology of
Work 33, no. 1 (forthcoming, 2019): 1–53.
Boulding, William, Richard Staelin, Michael Ehret, and Wesley J. Johnston. “A Customer Relationship Management Roadmap: What Is Known, Potential Pitfalls, and Where to Go.” Journal of Marketing 69, no. 4 (2005): 155–166.
Greenbaum, Joan M. Windows on the Workplace: Technology, Jobs, and the Organization
of Office Work. 2d ed. New York: Monthly Review Press, 2004.
Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and
Information. Cambridge, MA: Harvard University Press, 2015.
Zuboff, Shoshana. In the Age of the Smart Machine: The Future of Work and Power. New
York: Basic Books, 1988.
Notes:
(1) David Obstfeld, “Social Networks, the Tertius Iungens Orientation, and Involvement in
Innovation,” Administrative Science Quarterly 50, no. 1 (2005): 100–130.
(2) Ronald S. Burt, Structural Holes: The Social Structure of Competition (Cambridge,
MA: Harvard University Press, 1992).
(3) Louis Hyman, Temp: How American Work, American Business, and the American Dream Became Temporary (New York: Viking, 2018); Ifeoma Ajunwa and Daniel Greene, “Platforms at Work: Automated Hiring Platforms and Other New Intermediaries in the Organization of Work,” Research in the Sociology of Work 33, no. 1 (forthcoming 2019): 1–53.
(4) Shoshana Zuboff, In the Age of the Smart Machine: The Future of Work and Power
(New York: Basic Books, 1988).
(5) Harry Braverman, Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century, 25th anniversary ed. (New York: Monthly Review Press, 1998); Michael Burawoy, “Between the Labor Process and the State: The Changing Face of Factory Regimes under Advanced Capitalism,” American Sociological Review 48, no. 5 (1983): 587–605.
(6) Ifeoma Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable
Technology as the New Data-Centric Research Agenda for Employment and Labor Law,”
St. Louis University Law Journal 63, no. 1 (forthcoming 2019).
(7) Ajunwa and Greene, “Platforms at Work: Automated Hiring Platforms and Other New
Intermediaries in the Organization of Work.”
(8) Miranda Bogen and Aaron Rieke, “Help Wanted: An Examination of Hiring Algorithms,
Equity, and Bias” (Washington, D.C.: Upturn, 2018).
(9) Id.
(10) Pauline Kim and Sharion Scott, “Discrimination in Online Employment Recruiting,”
St. Louis University Law Journal 63, no. 1 (2018): 1–29.
(11) Sirui Yao and Bert Huang, “Beyond Parity: Fairness Objectives for Collaborative Filtering,” in 31st Conference on Neural Information Processing Systems (NIPS 2017), 1–10.
(12) Bogen and Rieke, “Help Wanted: An Examination of Hiring Algorithms, Equity, and
Bias.”
(13) Ajunwa and Greene, “Platforms at Work: Automated Hiring Platforms and Other New
Intermediaries in the Organization of Work.”
(14) Mariotti and Robinson, “Society for Human Resource Management 2017 Talent Acquisition Benchmarking Report” (SHRM, 2017).
(15) Ajunwa and Greene, “Platforms at Work: Automated Hiring Platforms and Other New
Intermediaries in the Organization of Work.”
(16) Id.
(17) Josh Bersin, “Talent Trends HR Technology Disruptions for 2018: Productivity, Design, and Intelligence Reign” (Deloitte University Press, 2017).
(18) Bogen and Rieke, “Help Wanted: An Examination of Hiring Algorithms, Equity, and
Bias.”
(19) Nathan J. Ebnet, “It Can Do More Than Protect Your Credit Score: Regulating Social Media Pre-employment Screening with the Fair Credit Reporting Act,” Minnesota Law Review 97, no. 1 (2012): 306–336.
(20) Bogen and Rieke, “Help Wanted: An Examination of Hiring Algorithms, Equity, and
Bias.”
(21) Nagaraj Nadendla, “Introducing the Oracle Recruiting Cloud,” filmed 2017 at OpenWorld 2017, San Francisco, CA, video, 26:18, https://video.oracle.com/detail/videos/most-recent/video/5701490825001/openworld-2017:-introducing-the-oracle-recruiting-cloud.
(22) Jeff Schwartz et al., “Rewriting the Rules for the Digital Age: 2017 Deloitte Global Human Capital Trends” (Deloitte University Press, 2017).
(23) Ajunwa and Greene, “Platforms at Work: Automated Hiring Platforms and Other New
Intermediaries in the Organization of Work.”
(24) E. Anderson, Private Government: How Employers Rule Our Lives (and Why We Don’t
Talk about It) (Princeton, NJ: Princeton University Press, 2017).
(25) Burawoy, “Between the Labor Process and the State: The Changing Face of Factory
Regimes under Advanced Capitalism.”
(26) Ajunwa and Greene, “Platforms at Work: Automated Hiring Platforms and Other New
Intermediaries in the Organization of Work.”
(27) Id.
(28) J. Meredith, “AI Identifying Steady Workers: E-Recruiting Firm Offering Tool to Determine How Long Job Seeker Will Stay Around,” Chicago Tribune (July 16, 2001), http://articles.chicagotribune.com/2001-07-16/business/0107160013_1_unicru-neural-networks-employee.
(29) A. Overholt, “True or False: You’re Hiring the Right People,” Fast Company (February
2002), https://www.fastcompany.com/44463/true-or-false-youre-hiring-right-people.
(30) Ajunwa and Greene, “Platforms at Work: Automated Hiring Platforms and Other New
Intermediaries in the Organization of Work.”
(31) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law.”
(33) Ajunwa and Greene, “Platforms at Work: Automated Hiring Platforms and Other New
Intermediaries in the Organization of Work.”
(34) Craig Haney, “Employment Tests and Employment Discrimination: A Dissenting Psychological Opinion,” Industrial Relations Law Journal 5, no. 1 (1982): 1–86.
(35) Rachael Tatman, “Gender and Dialect Bias in YouTube’s Automatic Captions,” in Proceedings of the First ACL Workshop on Ethics in Natural Language Processing (ACL, 2017), 53–59; Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” Proceedings of the 1st Conference on Fairness, Accountability and Transparency, PMLR 81, no. 1 (2018): 77–91.
(36) Timothy L. Fort, Anjanette H. Raymond, and Scott J. Shackelford, “The Angel on Your Shoulder: Prompting Employees to Do the Right Thing through the Use of Wearables,” Northwestern Journal of Technology and Intellectual Property 14, no. 2 (2016): 139–170; Dov Greenbaum, “Ethical, Legal and Social Concerns Relating to Exoskeletons,” ACM SIGCAS Computers and Society 45, no. 3 (2016): 234–239.
(37) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law”; Frost & Sullivan, “Wearable Technologies for Industrial Applications: Smart Glasses, Wrist-Worn Devices, HUDs Improve Productivity and Efficiency in Industrial Sector” (Frost & Sullivan, June 30, 2017); Greenbaum, “Ethical, Legal and Social Concerns Relating to Exoskeletons.”
(38) Frost & Sullivan, “Wearable Technologies for Industrial Applications”; J.P. Gownder, “The Technology-Augmented Employee: How Emerging Technologies Like Artificial Intelligence Are Reshaping the Future of Work” (Forrester, 2018), https://www.forrester.com/report/The+TechnologyAugmented+Employee/-/E-RES125811; Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law.”
(42) Id.; Fort, Raymond, and Shackelford, “The Angel on Your Shoulder: Prompting Employees to Do the Right Thing through the Use of Wearables.”
(43) Frost & Sullivan, “Wearable Technologies for Industrial Applications”; Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law”; Fort, Raymond, and Shackelford, “The Angel on Your Shoulder: Prompting Employees to Do the Right Thing through the Use of Wearables.”
(46) Ajunwa, “Algorithms at Work”; Frost & Sullivan, “Wearable Technologies for Industrial Applications”; Turner, “Are Performance-Monitoring Wearables an Affront to Workers’ Rights?”
(47) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law”; Ben Coxworth, “SmartCap Monitors Workers’ Fatigue Levels by Reading Their Brain Waves.”
(50) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law”; Greenbaum, “Ethical, Legal and Social Concerns Relating to Exoskeletons.”
(51) Ana Viseu, “Simulation and Augmentation: Issues of Wearable Computers,” Ethics and Information Technology 5, no. 1 (2003): 17–26.
(54) Id.
(55) Boris Evelson and Michael Facemire, Enterprise Business Intelligence: Now Always at
Hand on Your Smartwatch (Forrester, 2016), https://www.forrester.com/report/
Enterprise+Business+Intelligence+Now+Always+At+Hand+On+Your+Smartwatch/-/E-
RES129410.
(56) J.P. Gownder, “Deliver Digital Operational Excellence and Digital Customer Experi
ence Innovation with Wearables” (Forrester, 2017), https://www.forrester.com/report/
Deliver+Digital+Operational+Excellence+And+Digital+Customer+Experience+Innovation+With+W
E-RES103381.
(58) Christy Pettey, “Wearables Hold the Key to Connected Health Monitoring” (Gartner,
2018), https://www.gartner.com/smarterwithgartner/wearables-hold-the-key-to-connected-
health-monitoring/.
(59) Id.
(60) Ifeoma Ajunwa, Kate Crawford, and Joel Ford, “Health Meets Big Data: An Ethical Framework for Health Information Collection by Corporate Wellness Programs,” Journal of Law, Medicine, and Ethics 44, no. 1 (2016): 474–480.
(61) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law”; National Labor Relations Act, 29 U.S.C. § 157 (2012).
(63) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law.”
(64) Helen Nissenbaum and Heather Patterson, “Biosensing in Context: Health Privacy in
a Connected World,” in Quantified Biosensing Technologies in Everyday Life, ed. Dawn
Nafus (Cambridge, MA: MIT Press, 2016), 79–100.
(65) Alexander H. Tran, “The Internet of Things and Potential Remedies in Privacy Tort
Law,” Columbia Journal of Law and Social Problems 50, no. 2 (2017): 263–298.
(67) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law”; Antigone Peyton, “A Litigator’s Guide to the Internet of Things,” Richmond Journal of Law & Technology 22, no. 3 (2016): 1–20.
(68) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law.”
(69) Jeremy P. Brummond and Patrick J. Thornton, “The Legal Side of Jobsite Technology,” Construction Today (June 22, 2016), http://www.construction-today.com/sections/columns/2752-the-legal-side-of-jobsite-technology.
(70) Garry Mathiason et al., “The Transformation of the Workplace through Robotics, Artificial Intelligence, and Automation: Employment and Labor Law Issues, Solutions, and the Legislative and Regulatory Response” (Littler, 2016), https://www.littler.com/publication-press/publication/transformation-workplace-through-robotics-artificial-intelligence-and.
(71) Ajunwa, “Algorithms at Work: Productivity Monitoring Platforms and Wearable Technology as the New Data-Centric Research Agenda for Employment and Labor Law.”
(72) Ifeoma Ajunwa; Kate Crawford, “When the Fitbit Is an Expert Witness,” The Atlantic (November 19, 2014), https://www.theatlantic.com/technology/archive/2014/11/when-fitbit-is-the-expert-witness/382936/.
(73) Olivia Solon, “Wearable Technology Creeps into the Workplace,” Sydney Morning Herald (August 7, 2015), https://www.smh.com.au/business/workplace/wearable-technology-creeps-into-the-workplace-20150807-gitzuh.html.
(74) Pennie Frow et al., “Customer Management and CRM: Addressing the Dark Side,”
Journal of Services Marketing 25, no. 2 (2011): 79–89.
(76) Musfiq Mannan Choudhury and Paul Harrigan, “CRM to Social CRM: The Integration of New Technologies into Customer Relationship Management,” Journal of Strategic Marketing 22, no. 2 (2014): 149–176; Satish Jayachandran et al., “The Role of Relational Information Processes and Technology Use in Customer Relationship Management,” Journal of Marketing 69, no. 4 (2005): 177–192.
(77) Jayachandran et al., “The Role of Relational Information Processes and Technology Use in Customer Relationship Management”; William Boulding et al., “A Customer Relationship Management Roadmap: What Is Known, Potential Pitfalls, and Where to Go.”
(78) Id.
(79) Adrian Payne and Pennie Frow, “A Strategic Framework for Customer Relationship
Management,” Journal of Marketing 69, no. 4 (2005): 167–176.
(80) Michael Fayerman, “Customer Relationship Management,” New Directions for Institutional Research 113, no. 1 (2002): 57–68.
(81) Kate Leggett, “The Five CRM Trends in 2019 That Will Shape Engagement, Relationships, and Revenue” (Forrester, 2019), https://www.forrester.com/report/The+Five+CRM+Trends+In+2019+That+Will+Shape+Engagement+Relationships+And+Revenue/-/E-RES148555.
(82) Id.
(83) Sumair Dutta, “Social Media and Customer Service: From Listening to
Engagement” (Aberdeen Group, 2012), http://www.oracle.com/us/products/applications/
aberdeen-social-customer-svc-1902160.pdf.
(84) David Kiron et al., “Social Business: Shifting Out of First Gear,” MIT Sloan Management Review 55, no. 1 (2013): 1–32.
(85) Boulding et al., “A Customer Relationship Management Roadmap: What Is Known, Potential Pitfalls, and Where to Go.”
(86) Bang Nguyen, Sooyeon Nikki Lee-Wingate, and Lyndon Simkin, “The Customer Relationship Management Paradox: Five Steps to Create a Fairer Organization,” Social Business 4, no. 3 (2014): 207–230; Bang Nguyen and Lyndon Simkin, “The Dark Side of CRM: Advantaged and Disadvantaged Customers,” Journal of Consumer Marketing 30, no. 1 (2013): 17–30.
(87) Lan Xia, Kent B. Monroe, and Jennifer L. Cox, “The Price Is Unfair! A Conceptual Framework of Price Fairness Perceptions,” Journal of Marketing 68, no. 1 (2004): 1–15.
(88) Nguyen and Simkin, “The Dark Side of CRM: Advantaged and Disadvantaged Customers”; Jennifer Lyn Cox, “Can Differential Prices Be Fair?,” Journal of Product & Brand Management 10, no. 5 (2001): 264–275; F.M. Feinberg, A. Krishna, and Z.J. Zhang, “Do We Care What Others Get? A Behaviourist Approach to Targeted Promotions,” Journal of Marketing Research 39, no. 3 (2002): 277–291.
(89) Margaret Hu, “Algorithmic Jim Crow,” Fordham Law Review 86, no. 2 (2017): 633–696; Cathy O’Neil, “No Safe Zone: Getting Insurance,” in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York, NY: Crown, 2016), 161–178.
(90) Satish Jayachandran et al., “The Role of Relational Information Processes and Technology Use in Customer Relationship Management”; William Boulding et al., “A Customer Relationship Management Roadmap: What Is Known, Potential Pitfalls, and Where to Go.”
(91) Jayachandran et al., “The Role of Relational Information Processes and Technology
Use in Customer Relationship Management.”
(92) Pennie Frow et al., “Customer Management and CRM: Addressing the Dark Side.”
(94) Carole Cadwalladr and Emma Graham-Harrison, “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach,” The Guardian (March 17, 2018), https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election.
(95) Id.
(96) Pennie Frow et al., “Customer Management and CRM: Addressing the Dark Side”; Joseph Turow, Lauren Feldman, and Kimberly Meltzer, “Open to Exploitation: America’s Shoppers Online and Offline” (Philadelphia: Annenberg Public Policy Center of the University of Pennsylvania, 2005); Detlev Zwick and Nikhilesh Dholakia, “Whose Identity Is It Anyway? Consumer Representation in the Age of Database Marketing,” Journal of Macromarketing 24, no. 1 (June 2004): 31–43; Lilian Edwards and Michael Veale, “Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For,” Duke Law & Technology Review 16, no. 1 (2017): 18–84; Ifeoma Ajunwa, “Workplace Wellness Programs Could Be Putting Your Health Data at Risk,” Harvard Business Review (January 19, 2017), https://hbr.org/2017/01/workplace-wellness-programs-could-be-putting-your-health-data-at-risk.
(97) Pennie Frow et al., “Customer Management and CRM: Addressing the Dark Side.”
(98) Id.; Anthony Danna and Oscar H. Gandy, “All That Glitters Is Not Gold: Digging Beneath the Surface of Data Mining,” Journal of Business Ethics 40, no. 1 (2002): 373–386.
(99) Id.
(100) Margaret Hu, “Algorithmic Jim Crow”; Cathy O’Neil, “No Safe Zone: Getting Insurance”; Marcia Stepanek, “Weblining: Companies Are Using Your Personal Data to Limit Your Choices—and Force You to Pay More for Products,” Bloomberg Business Week (April 3, 2000), https://www.bloomberg.com/news/articles/2000-04-02/weblining.
(101) Elliot Zaret and Brock N. Meeks, “Kozmo’s Digital Dividing Lines,” MSNBC (April 11, 2000), https://web.archive.org/web/20001217050000/http://www.msnbc.com/news/373212.asp?cp1=1; Latanya Sweeney, “Discrimination in Online Ad Delivery,” Communications of the ACM (May 2013); Julia Angwin and Terry Parris Jr., “Facebook Lets Advertisers Exclude Users by Race,” ProPublica (October 28, 2016), https://www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race.
(102) Samuel Gibbs, “Google Alters Search Autocomplete to Remove ‘Are Jews Evil’
Suggestion,” The Guardian (December 5, 2016), https://www.theguardian.com/technology/
2016/dec/05/google-alters-search-autocomplete-remove-are-jews-evil-suggestion.
(103) Kashmir Hill, “How Target Figured Out a Teen Girl Was Pregnant Before Her Father Did,” Forbes (February 16, 2012), https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/.
(104) Amit Datta, Michael Carl Tschantz, and Anupam Datta, “Automated Experiments on
Ad Privacy Settings: A Tale of Opacity, Choice and Discrimination,” Proceedings of the
15th Privacy Enhancing Technologies Symposium (2015): 92–112.
(105) Angwin and Parris Jr., “Facebook Lets Advertisers Exclude Users by Race.”
(106) Id.
(108) Id.
(109) Id.
(110) L. Chen, A. Hannak, R. Ma, and C. Wilson, “Investigating the Impact of Gender on
Rank in Resume Search Engines,” Proceedings of the 2018 CHI Conference on Human
Factors in Computing Systems (2018).
(111) Id.
Ifeoma Ajunwa
Ifeoma Ajunwa, Associate Professor, Labor Relations, Law, and History Department, Cornell University ILR School; Associate Faculty Member, Cornell Law School; Faculty Associate, Berkman Klein Center for Internet & Society at Harvard University
Rachel Schlund