
EUROPEAN JOURNAL OF INFORMATION SYSTEMS
https://doi.org/10.1080/0960085X.2021.1927213

REVIEW

The dark sides of people analytics: reviewing the perils for organisations and
employees
Lisa Marie Giermindl (a), Franz Strich (b), Oliver Christ (a), Ulrich Leicht-Deobald (c) and Abdullah Redzepi (a)

(a) School of Business, OST Eastern Switzerland University of Applied Sciences, St. Gallen, Switzerland; (b) Chair of Human Resource Management and Intrapreneurship, University of Bayreuth, Bayreuth, Germany; (c) Institute for Business Ethics, University of St. Gallen, St. Gallen, Switzerland

ABSTRACT
Technological advances in the field of artificial intelligence (AI) are heralding a new era of analytics and data-driven decision-making. Organisations increasingly rely on people analytics to optimise human resource management practices in areas such as recruitment, performance evaluation, personnel development, health and retention management. Recent progress in the field of AI and ever-increasing volumes of digital data have raised expectations and contributed to a very positive image of people analytics. However, transferring and applying the efficiency-driven logic of analytics to manage humans carries numerous risks, challenges, and ethical implications. Based on a theorising review, our paper analyses perils that can emerge from the use of people analytics. By disclosing the underlying assumptions of people analytics and offering a perspective on current and future technological advancements, we identify six perils and discuss their implications for organisations and employees. Then, we illustrate how these perils may aggravate with increasing analytical power of people analytics, and we suggest directions for future research. Our theorising review contributes to information systems research at the intersection of analytics, artificial intelligence, and human-algorithmic management.

ARTICLE HISTORY Received 1 May 2020; Accepted 29 April 2021

KEYWORDS People analytics; dark side; peril; analytics; artificial intelligence (AI); Information Systems (IS)

CONTACT Lisa Marie Giermindl, lisa.giermindl@ost.ch

© 2021 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.

1. Introduction

People analytics has the potential to transform the way organisations identify, develop, manage and control their workforce (Chamorro-Premuzic et al., 2017; Huselid, 2018). The term “people analytics” does not refer to a technology, but to a novel, quantitative, evidence-based, and data-driven approach to manage the workforce (Gal et al., 2020; McAfee et al., 2012). This approach aims to raise the efficiency of core human resource (HR) functions such as workforce planning, recruiting, development and training, as well as to optimise employees’ and the organisation’s performance (Bodie et al., 2016; Gal et al., 2017; Leonardi & Contractor, 2018; Tursunbayeva et al., 2018). Consequently, it is hardly surprising that organisations worldwide are increasingly deploying people analytics to analyse and link data on human behaviour, social relationships and employee characteristics to internal or external business information (Fernandez & Gallardo-Gallardo, 2020; Leonardi & Contractor, 2018).

Data-driven decision-making in this new era of people analytics has been facilitated by recent advancements of learning algorithms in the field of artificial intelligence (AI) (Faraj et al., 2018; von Krogh, 2018). The term “learning algorithms” refers to a set of technologies able to adaptively interpret and learn from large data sets to perform human-like cognitive tasks (Benbya et al., 2020; Faraj et al., 2018; Mayer et al., 2020). Learning algorithms are becoming more and more prevalent in people analytics to predict output values from given input data (Cappelli, 2019; van den Heuvel & Bondarouk, 2017). Moreover, these algorithms are increasingly able to reliably perform, and potentially even outperform humans in, a growing number of tasks (Faraj et al., 2018; Strich et al., 2021). The new analytical power of learning algorithms also enables organisations to process, combine and analyse the ever-increasing volumes of digital data, as well as to automatically identify patterns in structured and unstructured data (Brynjolfsson & Mitchell, 2017; von Krogh, 2018).

The rapid progress in the field of AI has raised organisational expectations and reinforced widely held assumptions on the power, transparency, accuracy and objectivity of learning algorithms, contributing to a very positive image of people analytics among researchers and practitioners (Gal et al., 2017; Greasley & Thomas, 2020; King, 2016; Marler & Boudreau, 2017; Tursunbayeva et al., 2019). For example, people analytics has been described as “critical to any organisation’s success” (Boudreau, 2017, p. 1), stamped as the “new kid on the block” (Baesens et al., 2017, p. 20), labelled a “game changer for the future of HR” (van der Togt & Rasmussen, 2017, p. 131), and said to assure the HR function “the potential to become one of the leaders in analytics” (Davenport, 2019, April 18, p. 1).
These high hopes tend to build on three underlying assumptions. Firstly, decision-making based on learning algorithms is often considered superior to and more reliable than human decision-making (Kryscynski et al., 2018; Leicht-Deobald et al., 2019; Martin-Rios et al., 2017). Algorithms’ high processing capacity enables rapid analysis of large data volumes to identify statistical patterns and correlations (Gal et al., 2017; Newman et al., 2020). Thereby, algorithms are thought to increase fairness, transparency and objectivity (Jabagi et al., 2020; Martin-Rios et al., 2017; Sharma & Sharma, 2017). Secondly, people analytics is believed to predict, modify, and manage current and future human behaviour, particularly through systematically analysing and exploring historical data (Chamorro-Premuzic et al., 2017; Gal et al., 2017; Isson & Harriott, 2016). The assumption is that current and future behaviour can be explained and predicted from the analysis of past actions and their consequences (Sivathanu & Pillai, 2019; N. Wang & Katsamakas, 2019). Thirdly, people analytics is considered to be able to streamline self-aware, subjective human beings, just as it is able to optimise inanimate objects, processes and material resources (Gal et al., 2017; Nocker & Sena, 2019). With humans as the immediate subject of analysis, people analytics aims to influence and optimise their behaviour through quantitatively analysing their conduct and their digital traces (Gal et al., 2020; Leicht-Deobald et al., 2019).

However, these assumptions can be problematic because they contribute to a very positive perception of people analytics without attention to potential risks, which might entail serious consequences for organisations and employees. For instance, to treat employees similarly to quantifiable data objects, rather than as genuine fully-fleshed human beings, could entail a conceptual category error (Gal et al., 2017, 2020; Leicht-Deobald et al., 2019). People analytics based on recent technological advances in the field of AI could potentially underestimate human complexity, be more invasive, and have more serious consequences for employees than other forms of business analytics, such as analysing goods, money flows, key financial figures, etc. (Gal et al., 2020; King, 2016; Leicht-Deobald et al., 2019). Consequently, the underlying assumptions of people analytics’ role, capabilities and promises among researchers and practitioners hold particular ethical and moral challenges. This can guide organisations and managers towards a strong or even unbalanced reliance on people analytics and result in severe perils.

Our paper seeks to unpack the inherent assumptions and expectations that underlie the use of people analytics. In so doing, we portray a nuanced and critical picture of its potential ramifications and implications. Further, we seek to problematise the role of increasing technological advancement in the context of people analytics and to shed light on the critical aspects of algorithm-based decision-making in this context by answering the following research question:

What are the perils of people analytics and what are their potential negative implications for organisations and employees?

This paper aims to review the potential perils emerging from largely unchallenged assumptions and organisational expectations associated with the use of learning algorithms in people analytics and to discuss the negative effects these algorithms have on decision-making, management, and human capabilities. We are expanding the limited body of knowledge in the people analytics field to include a critical analysis of the potential threats and resulting implications, particularly focussing on the pitfalls associated with using more sophisticated technologies (Bodie et al., 2016; Gal et al., 2017; Leicht-Deobald et al., 2019). By advancing academic work in this field, we intend to help researchers and practitioners adopt an informed and realistic approach to people analytics.

Our paper offers two major theoretical contributions. First, we propose a novel maturity level to account for the recent technological advancements and the emergence of learning algorithms and AI in the analytics field. By systematically reviewing the literature on people analytics and highlighting future research avenues, we extend the field of people analytics and contribute to a more nuanced and balanced understanding of people analytics. Second, we theorise the emergence of potential perils for organisations and employees and how they may aggravate with increasing technological progress. Thereby, our paper is among the first to carve out the peculiarities of people analytics and explain why overlooking these peculiarities could bear significant risks and lead to the emergence of perils for organisations and employees. This way, we contribute to the growing body of literature on the dark sides of technologies, analytics and AI.

In the remainder of the paper, we first conduct a systematic literature review to map the current field of people analytics. Subsequently, we present the themes which have emerged from the literature review by describing the opportunities, barriers, and maturity of people analytics and highlighting the associated idiosyncrasies and risks identified in prior research. In the following, we depict how the technological advancements and the nascent introduction of learning algorithms and AI have the potential to take people analytics to a new level. Concluding, we conceptualise six perils of people analytics, which can arise from the use of people analytics and the technological advancements. Thereby, we scrutinise their undesirable, unintended consequences for organisations and employees. Finally, we illustrate how and why the negative implications of these perils can escalate as the analytical power of people analytics increases.
2. Reviewing the current state of the people analytics literature

Over the last few years, the literature on people analytics has grown rapidly. Nevertheless, conceptual papers that offer typologies to categorise different people analytics practices and areas of application (e.g., Angrave et al., 2016; Dulebohn & Johnson, 2013; Ulrich & Dulebohn, 2015) as well as literature reviews that integrate the mainly practitioner-oriented and limited academic literature (e.g., Ekawati, 2019; Marler & Boudreau, 2017; Tursunbayeva et al., 2019) still dominate this nascent literature. Similarly prevalent are case studies that focus on the consequences of particular people analytics practices (e.g., Martin-Rios et al., 2017; Schiemann et al., 2018; Simón & Ferreiro, 2018). Thus, empirical research in this field is still scarce, and rigorous quantitative or qualitative research examining the consequences of people analytics is lacking (e.g., Greasley & Thomas, 2020; van den Heuvel & Bondarouk, 2017). In sum, the literature on people analytics is still in its early stages (Angrave et al., 2016; Huselid, 2018; Marler & Boudreau, 2017; Minbaeva, 2018) and it predominantly paints an optimistic portrait of how people analytics could help organisations flourish (e.g., Green, 2017; McIver et al., 2018; Nienaber & Sewdass, 2016; Simón & Ferreiro, 2018).

Prior integrative literature reviews have primarily focused on what people analytics is and how it operates (Falletta & Combs, 2020; Marler & Boudreau, 2017), how it should be implemented (Angrave et al., 2016; Boudreau & Cascio, 2017; Fernandez & Gallardo-Gallardo, 2020), the value proposition it offers (Tursunbayeva et al., 2018; van den Heuvel & Bondarouk, 2017; Werkhoven, 2017a), and how it can influence performance (Aral et al., 2012; Peeters et al., 2020; Sharma & Sharma, 2017). Most of the prior integrating literature took a functional approach to people analytics (Peeters et al., 2020; van der Togt & Rasmussen, 2017), converging on how it can help improve firms’ effectiveness, but paying less attention to this approach’s ethical challenges, possible consequences, and risks (for exceptions, see the conceptual papers by Gal et al., 2020; Leicht-Deobald et al., 2019). Therefore, we seek to advance the literature on people analytics by systematically mapping the field, identifying common themes in the literature, and consolidating what we know and do not know about the field (Paré et al., 2016; Rowe, 2014).

2.1. Methodology

We conducted this systematic literature review following Paré et al.’s (2016) guidelines. Thus, we (1) developed a review plan, (2) defined and applied a literature search strategy, (3) refined our findings in terms of contribution to our research objective, (4) assessed the remaining studies’ quality with respect to the information systems field, (5) identified relevant themes, and (6) synthesised our findings (Paré et al., 2016). Following the systematic literature review, we elaborate on the recent technological advances in the field of people analytics and carve out the resulting potential perils for organisations and employees. Figure 1 provides an overview of our search procedure.

Firstly, in developing a review plan, we were guided by the rationale of identifying critical knowledge gaps, as well as missing and neglected themes in extant research (Okoli, 2015; Paré et al., 2016; Rowe, 2014; Webster & Watson, 2002). In the review process, we combined descriptive and exploratory aspects, openly mapped the people analytics field based on the systematic literature review, and categorised emerging themes, patterns and trends found in the literature.

Secondly, we formulated a systematic literature search strategy to identify publications related to the use of people analytics (see Table 1 for our detailed inclusion criteria). In sum, we searched four databases, namely Business Source Premier, Scopus and PsychInfo, accessed via EBSCOhost, and additionally Web of Science. We chose these databases because they are among the most often used ones in systematic literature reviews (Aguinis et al., 2020; Hiebl, 2021), as well as to cover literature on people analytics from related disciplines such as Information Systems, Organisational Behaviour or Human Resource Management. Further, we searched the AISEL Electronic Library to also include premier IS conference proceedings, which are considered important indicators for emerging trends, as well as to include the most recent academic discussions and advancements. Finally, we conducted a backward and forward search to uncover relevant articles not identified in our prior literature search (Webster & Watson, 2002).

Thirdly, we defined our exclusion criteria and removed duplicates (see Table 2, E.I). Next, we manually screened abstracts and other relevant paragraphs considering their contribution to our literature search goal. We removed papers that only mentioned the search terms as an expression or in a general data analytics context without a clear focus on HR (E.II). Next, we excluded articles primarily geared towards the technical implementation of software, methods and algorithms (E.III). Further, we excluded practitioner-orientated papers, such as those published in People & Strategy, Strategic HR Review, Harvard Business Review or Sloan Management Review (E.IV).
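To make these screening steps more tangible, the following minimal sketch shows how the term matching (I.I), the deduplication (E.I) and the outlet exclusion (E.IV) could be applied to a hypothetical export of database hits. The file name and column names are invented for illustration; this is not the scripted procedure used in this review, and criteria E.II and E.III still require manual reading of abstracts.

```python
# Illustrative only: applying the Table 1 search terms and two of the Table 2
# exclusions to a hypothetical bibliographic export. The file and column names
# ("title", "abstract", "source") are assumptions, not the authors' actual setup.
import pandas as pd

SEARCH_TERMS = [
    "people analytics", "hr analytics", "personnel analytics",
    "administrative data analytics", "hcm analytics", "workforce analytics",
    "human capital analytics", "human capital workforce analytics",
    "human resource analytics", "workplace analytics", "talent analytics",
]
PRACTITIONER_OUTLETS = {
    "People & Strategy", "Strategic HR Review",
    "Harvard Business Review", "Sloan Management Review",
}

records = pd.read_csv("database_export.csv")  # hypothetical export of all database hits

# I.I: keep records whose title or abstract contains at least one search term
text = (records["title"].fillna("") + " " + records["abstract"].fillna("")).str.lower()
records = records[text.apply(lambda t: any(term in t for term in SEARCH_TERMS))]

# E.I: naive deduplication across databases via a normalised title
records = records.assign(title_norm=records["title"].str.lower().str.strip())
records = records.drop_duplicates(subset=["title_norm"])

# E.IV: drop practitioner-oriented outlets; E.II and E.III are screened manually
records = records[~records["source"].isin(PRACTITIONER_OUTLETS)]

print(f"{len(records)} records remain for manual abstract screening")
```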
Figure 1. Overview of search procedure.

Fourthly, we assessed the articles’ quality and eligibility. We included those published in ranked journals and the three most prominent IS conferences [European Conference on Information Systems (ECIS), Hawaii International Conference on System Sciences (HICSS), and International Conference on Information Systems (ICIS)]. Two of the authors independently coded the primary studies. In case of disagreement, they discussed the discrepancies until they reached consensus. Finally, we selected 66 papers for the final analysis of the study, i.e., slightly less than 20.4% of the overall number of studies selected in the first round. Overall, 55 of the 66 papers had been published in or after 2017, thus confirming the rapid proliferation of people analytics literature in the last few years. Further, the majority of the studies (54 of 66) were published in disciplines other than IS, such as Business, Management and Accounting, as well as Organisational Behaviour and Human Resource Management.

Table 1. Inclusion criteria.
I.I. Presence of terms and synonyms related to people analytics: “people analytics”, “HR analytics”, “personnel analytics”, “administrative data analytics”, “HCM analytics”, “workforce analytics”, “human capital analytics”, “human capital workforce analytics”, “human resource analytics”, “workplace analytics”, “talent analytics”.
I.II. Academic peer-reviewed papers focusing on the adoption of people analytics (i.e., any form of data-driven decision-making in human resource management).
I.III. Relevant studies published in English before October 2020.

Table 2. Exclusion criteria. We removed . . .
E.I. . . . duplicates
E.II. . . . studies with only marginal reference to people analytics
E.III. . . . studies restricted to the technical or methodological aspects of people analytics – for example, evaluation of machine learning models or data mining methods
E.IV. . . . practitioner-oriented articles primarily offering practical recommendations on how to implement people analytics

2.2. Emerging themes from the people analytics literature

Fifthly, applying a theme-centred approach, we analysed the remaining studies to identify predominant themes in the people analytics literature (Leidner, 2018; Rowe, 2014; Webster & Watson, 2002). We inductively generated the themes of this review during the process of engaging with the body of literature (Locke et al., 2020), and then quantitatively coded the studies accordingly (Webster & Watson, 2002). We found that predominantly the studies focused on one or more of the following emergent themes, which are summarised in Table 3.
Table 3. Overview of emerging themes of people analytics literature.

Opportunities of people analytics. Definition: opportunities refer to the promises, benefits and expectations of organisations regarding the use of people analytics. Key aspects: diverse areas of application, such as recruitment, development, retention; improvement of performance and efficiency; improved work experience and job satisfaction. Number of studies: 53. Sample publications: Aral et al., 2012; Kryscynski et al., 2018; Rasmussen & Ulrich, 2015; Tursunbayeva et al., 2018; van der Togt & Rasmussen, 2017.

Barriers to adopting people analytics. Definition: barriers describe the obstacles and reasons hindering or slowing the adoption of people analytics. Key aspects: lack of analytical skills; lack of an integrated data basis; lack of collaboration with other functions; technical barriers. Number of studies: 41. Sample publications: Angrave et al., 2016; Dahlbom et al., 2019; Fernandez & Gallardo-Gallardo, 2020; Marler & Boudreau, 2017; Minbaeva, 2018.

Maturity of people analytics. Definition: maturity refers to the analytical capacity of people analytics. Key aspects: maturity levels of people analytics; descriptive, predictive and prescriptive people analytics; maturity of people analytics relatively low. Number of studies: 40. Sample publications: Ben-Gal, 2019; Ekawati, 2019; King, 2016; Sivathanu & Pillai, 2019; van den Heuvel & Bondarouk, 2017.

Idiosyncrasies of people analytics. Definition: idiosyncrasies relate to the particularities and distinctive characteristics of people analytics. Key aspects: ethical and moral implications; invasiveness; consideration of human complexity; far-reaching consequences. Number of studies: 13. Sample publications: Bodie et al., 2016; Falletta & Combs, 2020; Khan & Tang, 2017; Nocker & Sena, 2019; Schafheitle et al., 2019.

Risks of people analytics. Definition: risks refer to likely sources of dangers of people analytics and their negative consequences for organisations and employees. Key aspects: privacy and data protection concerns; surveillance and constant tracking; algorithmic biases. Number of studies: 10. Sample publications: Calvard & Jeske, 2018; Gal et al., 2017; Gal et al., 2020; Leicht-Deobald et al., 2019; Newman et al., 2020.

2.2.1. Theme 1: opportunities of people analytics

Scholars and practitioners alike have suggested that people analytics offers businesses great promise (Huselid, 2018; Rasmussen & Ulrich, 2017; Tursunbayeva et al., 2018). Consequently, the largest part of scholarly attention has gone to the opportunities people analytics offers. Overall, 53 studies have highlighted such opportunities, addressing the following three sub-themes: opportunities and business-oriented benefits, employee-oriented benefits, and assumptions underpinning people analytics.

A common thread in this theme is the potential people analytics has to transform the way organisations identify, develop and evaluate their talent (Chamorro-Premuzic et al., 2017; Fernandez & Gallardo-Gallardo, 2020; Martin-Rios et al., 2017). The underlying rationale of many studies seems to be the belief that people analytics can generate actionable insights for all stages of the employee life cycle to improve operational and strategic firm performance (Levenson, 2018; Minbaeva, 2018; L. Wang & Cotton, 2018). Several studies emphasise the diverse areas of application, such as workforce planning, onboarding, personnel development, performance appraisal, diversity management and retention (Aral et al., 2012; Bekken, 2019; Frederiksen, 2017).

Other studies highlight the possible benefits of people analytics for employees, by offering health benefits and development opportunities for employees to extend their experience (e.g., Gal et al., 2020; Huselid, 2018; Tursunbayeva et al., 2018). Scholars have argued that people analytics can make “workers more efficient, more productive, happier, and more likely to be loyal to their employer” (Bodie et al., 2016, p. 23). Some studies even go as far as to state that “in ten years, no single decision within the HR domain will be made without a clear business case supported by statistical data” (van den Heuvel & Bondarouk, 2017, p. 169).

In sum, these promises seem to be fuelled by two underlying assumptions: firstly, people analytics is more objective and less error-prone than human decision-making; and secondly, based on historical data, it is able to predict future human behaviour.

Regarding the ability to induce more objective, rational and bias-free decision-making (Claus, 2019; Marler & Boudreau, 2017; Tursunbayeva et al., 2019), people analytics endeavours to replace subjective human decision-making, because intuition, gut feeling, individual experience and unformalised know-how often stem from biased beliefs and are therefore not considered valid, rigorous or trustworthy for assessing talent or creating HR policies (Gal et al., 2017; Greasley & Thomas, 2020; King, 2016). Drawing on large pools of quantitative data from a variety of sources, people analytics is said to deliver only a single, bias-free representation of the truth to decision-makers (Bodie et al., 2016; Gal et al., 2017).
Since algorithms are applied repeatedly and reliably, they can increase objectivity, transparency and fairness in the decision-making process (Gal et al., 2017; Jabagi et al., 2020; Newman et al., 2020). As such, people analytics and algorithm-based decision-making are considered superior to and less error-prone than human decision-making (Kryscynski et al., 2018; Leicht-Deobald et al., 2019), thus offering unprecedented organisational intelligence (Martin-Rios et al., 2017). Owing to their large processing capacity, algorithms can mine an immense depth and breadth of data to detect patterns and correlations, whereas human decision-makers can only handle a limited amount of input (Gal et al., 2017; Martin-Rios et al., 2017; Sharma & Sharma, 2017). People analytics’ computational power and its underlying algorithms far exceed human cognitive resources in terms of speed, efficiency, and consistency (Gal et al., 2017; Jabagi et al., 2020; Kryscynski et al., 2018). As a result, people analytics is said to increase the quality of decisions and to support organisations in making more efficient, accurate and informed HR decisions (Bodie et al., 2016; Falletta & Combs, 2020).

Regarding the second assumption, people analytics is thought to be able to predict, modify and manage human behaviour (Chamorro-Premuzic et al., 2017; Gal et al., 2017; Isson & Harriott, 2016). The central claim is that making sense of past actions and their consequences enables decision-makers to explain current behaviour, diagnose and understand the underlying reasons, and predict future trends (Sivathanu & Pillai, 2019; N. Wang & Katsamakas, 2019). Thus, people analytics can gain insight from historical or current data to understand past, current and future employee and business performance (Fernandez & Gallardo-Gallardo, 2020; King, 2016). High prediction accuracy allows managers to make rational, anticipatory decisions that were previously not possible, such as forecasting workforce resources and addressing emerging gaps in competences (Bhardwaj & Patnaik, 2019; Nienaber & Sewdass, 2016; Schiemann, 2018).

In all, a great many studies highlight the potential promises and opportunities of people analytics. Rigorous empirical studies measuring such benefits with scrutiny are, however, rare (Kryscynski et al., 2018; Newman et al., 2020), and much of the evidence is still anecdotal (King, 2016; Schiemann et al., 2018). Future research could thus investigate the largely untested relationships that past literature disclosed.

2.2.2. Theme 2: barriers to adopting people analytics

In total, 41 of the 66 studies in our data set addressed the question of possible barriers to adopting people analytics. These studies suggest manifold underlying reasons for explaining why HR practitioners have been slow in adopting people analytics practices (e.g., Angrave et al., 2016; Marler & Boudreau, 2017; Vargas et al., 2018). We summarise the relevant studies here in three sub-themes: lacking the required analytical capabilities, lacking a consistent, integrated data base and data analysis approach, and technical or software-related barriers.

One common theme emerging from these studies is that HR professionals often lack the required analytics capabilities (Angrave et al., 2016; Dahlbom et al., 2019; Kryscynski et al., 2018; Levenson, 2018). Because data do not speak for themselves, HR managers have to decide which questions to ask, how to interpret and make sense of the findings, put the results in perspective, and draw conclusions from the data (Boudreau & Cascio, 2017; Leicht-Deobald et al., 2019; Minbaeva, 2018). Traditionally, HR professionals not only lack the required analytical, technical, and methodological skills to perform such analysis; they also lack a data mindset (Andersen, 2017; Calvard & Jeske, 2018; Rasmussen & Ulrich, 2017). Thus, many HR employees are sceptical, ambivalent or reluctant, as they doubt whether humans can be reduced to numbers and wonder how they can use employees’ data in an ethically legitimate way (Angrave et al., 2016; Greasley & Thomas, 2020; Kryscynski et al., 2018). Additionally, complying with legislation and data protection regulations while keeping people’s trust makes it difficult for HR managers to process (non-anonymised) personnel data (van den Heuvel & Bondarouk, 2017).

Further, several studies stress that only a few firms have a unified, integrated and cleansed data basis and analytic approach (Lawler et al., 2004; Lismont et al., 2017; Minbaeva, 2018). This prevents merging employee data with data from other functional areas such as finance, sales, or production, and analysing it (Boudreau & Cascio, 2017; Kassick, 2019; Levenson & Fink, 2017). Silo mentalities in companies can also prevent HR professionals from combining HR data with data on other productivity and performance determinants, thus precluding HR from achieving meaningful objective performance outcomes (Bhardwaj & Patnaik, 2019; van der Togt & Rasmussen, 2017). Also, HR professionals tend to be situated in relatively peripheral positions in the organisational hierarchy (Angrave et al., 2016; Greasley & Thomas, 2020). Such a situation, alongside the micropolitical challenges HR departments face, could further hamper collaboration with other units and prevent HR from obtaining top management’s support for analytical efforts, or could limit them in implementing the people analytics project’s recommendations in practice (King, 2016; van den Heuvel & Bondarouk, 2017).

Other studies that address this theme investigate the technical or software-related barriers hindering the adoption of (more) advanced forms of people analytics (Ben-Gal, 2019; Dahlbom et al., 2019; Vargas et al., 2018).
Several studies stress that HR information systems are often limited to questions focused on operational reporting, rather than providing the capacity HR departments need to perform statistical analysis (Andersen, 2017; Fernandez & Gallardo-Gallardo, 2020).

Overall, studies subsumed under this theme focus on human, data, and technical barriers to adopting people analytics and putting a strong analytics strategy in place. Despite the wealth of research on this topic, future research could contribute by using multi-case studies or longitudinal research designs to empirically identify the barriers, determining how they evolve over time and how to overcome them.

2.2.3. Theme 3: maturity of people analytics

In all, 40 studies explicitly examined or at least commented on the current analytical maturity of people analytics. Within this topic, we identified two major sub-themes, referring to people analytics’ low maturity, and to its different maturity levels and approaches.

One of this theme’s dominant ideas is that analytical forecasting and decision support tools in this domain are still in their infancy compared to other corporate areas such as marketing, sales, production planning or finance (Chamorro-Premuzic et al., 2017; Marler & Boudreau, 2017; Schiemann, 2018). A large number of studies emphasise that the existing use of data analyses has hardly progressed beyond reactive, standardised, or demand-related historical reporting and rudimentary forecasts (Angrave et al., 2016; Lunsford & Phillips, 2018; van den Heuvel & Bondarouk, 2017). Consequently, most organisations are still struggling to move from basic operational reporting to more advanced forms of analytics (Boudreau & Cascio, 2017; Dahlbom et al., 2019; Minbaeva, 2017).

Beyond the consensus on people analytics’ low maturity, we identified several studies that apply a widely used categorisation from the business analytics literature (Davenport et al., 2007; Delen & Demirkan, 2013), distinguishing three analytics maturity levels according to their analytical focus and capacity. The three levels are descriptive analytics, predictive analytics, and prescriptive analytics (Ekawati, 2019; Fernandez & Gallardo-Gallardo, 2020; King, 2016).

Descriptive analytics seeks to examine what occurred in the past and how this influences the present, by answering the question “What happened?” (Isson & Harriott, 2016; Lunsford & Phillips, 2018). It is based on standard statistical methods, such as correlation analysis, simple regressions, mean values and percent changes (Ekawati, 2019; Leicht-Deobald et al., 2019; Sivathanu & Pillai, 2019). It helps organisations to understand past and current business performance, and to identify problem areas, needs for action, and business opportunities (Fernandez & Gallardo-Gallardo, 2020; Schafheitle et al., 2019). Descriptive people analytics is typically performed using the balanced scorecard and key performance indicators such as absenteeism rates (King, 2016; Leicht-Deobald et al., 2019).

Predictive analytics builds on identifying explanatory patterns (such as trends, relationships, preferences, clusters) from past or present events to forecast future business developments (Fernandez & Gallardo-Gallardo, 2020; Sivathanu & Pillai, 2019). The central question it answers is “What will happen and why?” (Isson & Harriott, 2016; King, 2016). To this end, it leverages statistical and mathematical methods such as advanced regression techniques, data mining, text mining, web mining, and forecast calculations (Ekawati, 2019; Sivathanu & Pillai, 2019; Ulrich & Dulebohn, 2015). Predictive analytics determines the probability that an event or possible development will occur, typically by providing a score that represents the likelihood of the event (Leicht-Deobald et al., 2019; Schafheitle et al., 2019). Thus, predictive analytics enables the forecasting of future employee behaviour and performance, such as predicting employee attrition or changes in levels of engagement. Therefore, it is often depicted as an early warning system (Kryscynski et al., 2018; Sivathanu & Pillai, 2019; van den Heuvel & Bondarouk, 2017).

Prescriptive analytics not only forecasts future development, but also recommends decisions and courses of action, based on an analysis of past data and alternative, future scenarios (Leicht-Deobald et al., 2019; Schafheitle et al., 2019). In so doing, it seeks to answer the question “What should be done?” and goes beyond merely predicting future outcomes (Isson & Harriott, 2016; Leicht-Deobald et al., 2019). In addition to the methods deployed by predictive analytics, prescriptive analytics uses solution-oriented simulation and scenario calculations as well as machine learning algorithms, with the aim of developing decision proposals or, in extreme cases, implementing these decisions (Lunsford & Phillips, 2018; Sivathanu & Pillai, 2019). Prescriptive analytics can be used to optimise the efficiency of employees’ behaviour or to model complex strategic HR decisions (Fernandez & Gallardo-Gallardo, 2020; Leicht-Deobald et al., 2019). At this stage, decision-making becomes a joint human-algorithm decision-making process (Burton et al., 2019; Schafheitle et al., 2019).
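The contrast between the three maturity levels can be made concrete with a small sketch on invented HR data: a descriptive indicator answers “What happened?”, a predictive model scores the likelihood of a future event, and a prescriptive rule turns that score into a recommended action. The column names, the attrition model and the retention rule below are hypothetical illustrations and are not drawn from the reviewed studies.

```python
# Invented HR data contrasting the three maturity levels. Column names, the
# attrition model and the retention rule are hypothetical illustrations; they are
# not taken from the reviewed studies or from any real people analytics tool.
import pandas as pd
from sklearn.linear_model import LogisticRegression

hr = pd.DataFrame({
    "absence_days":   [2, 11, 5, 0, 14, 3, 8, 1],
    "tenure_years":   [4, 1, 6, 9, 1, 3, 2, 7],
    "overtime_hours": [10, 60, 20, 5, 75, 15, 50, 8],
    "left_company":   [0, 1, 0, 0, 1, 0, 1, 0],   # historical outcome
})

# Descriptive analytics: "What happened?" - summarise past data.
print(f"Average absence days per employee: {hr['absence_days'].mean():.1f}")

# Predictive analytics: "What will happen and why?" - score the likelihood of a
# future event (here: attrition) from patterns in historical data.
features = hr[["absence_days", "tenure_years", "overtime_hours"]]
model = LogisticRegression(max_iter=1000).fit(features, hr["left_company"])
hr["attrition_risk"] = model.predict_proba(features)[:, 1]

# Prescriptive analytics: "What should be done?" - turn the score into a
# recommended course of action (a deliberately crude, invented rule).
hr["recommendation"] = hr["attrition_risk"].apply(
    lambda p: "schedule retention interview" if p > 0.5 else "no action"
)
print(hr[["attrition_risk", "recommendation"]])
```

In this toy setup the prescriptive step is still a human-readable rule; the fourth maturity level proposed later in the synthesis of the literature is only reached when such rules are derived, executed and updated by the system itself.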
Finally, several studies stress that the three types of analytics offer increasing analytical power (Frederiksen, 2017; Leicht-Deobald et al., 2019). Among the empirical studies in this theme (Aral et al., 2012; Hicks, 2018; Schiemann et al., 2018), we find that most studies only examine the use of descriptive and predictive analytics techniques (e.g., Bekken, 2019; Chalutz, 2019; Dahlbom et al., 2019). However, recent technological advances in the context of people analytics illustrate the need to expand current research and emphasise the use of more advanced forms of people analytics.
2.2.4. Theme 4: idiosyncrasies of people analytics

Compared to the other themes, research on the idiosyncrasies and challenges of applying analytics to humans is limited. Overall, only 13 out of the 66 identified studies examined or referred to idiosyncrasies and peculiarities of applying analytics to manage humans (e.g., Gal et al., 2017; Leicht-Deobald et al., 2019; Nocker & Sena, 2019; Schafheitle et al., 2019). The overall consensus of these studies is that trying to apply the logic of data-driven management from other parts of the organisation to manage the workforce entails several pitfalls, due to four main idiosyncrasies of people analytics.

Firstly, compared to analytics applications in other business domains, applying analytics and algorithms in analysing, evaluating, controlling and managing employees invokes ethical and moral challenges with far-reaching implications (Falletta & Combs, 2020; Gal et al., 2017). For instance, in other areas, using business analytics to demonstrate business processes’ inefficiency and then cutting costs on such a basis is widely accepted and not frowned upon (Appelbaum et al., 2017; Khan & Tang, 2017). By contrast, similarly using people analytics to manage humans and their behaviour in an organisation poses a range of ethical and moral questions that need careful consideration (Peeters et al., 2020; Schafheitle et al., 2019). Thus, treating employees as quantifiable data objects, rather than as cultural, sentient social human beings, is ethically questionable (Gal et al., 2017, 2020; Leicht-Deobald et al., 2019). Applying algorithms to manage humans has potentially profound organisational, practical, societal, moral and ethical implications in workplaces that have yet to be studied and suitably addressed (Calvard & Jeske, 2018; Nocker & Sena, 2019; Peeters et al., 2020).

Secondly, human engagement can be too complex to be measured, evaluated and analysed in a data-driven way similar to other parts of the organisation (Gal et al., 2020; King, 2016). Because human behaviour is much more complex and much less predictable than that of machinery or other tangible assets, reducing complex human characteristics and behaviour for representation by data points and numbers is problematic (Gal et al., 2017; King, 2016; Newman et al., 2020). Therefore, directly transferring and extending analytics to organisational areas traditionally handled on the basis of human judgment and expertise may fail to consider human complexity; moreover, significant misinterpretation, miscalculation, and costly strategic mistakes are likely (Falletta & Combs, 2020; King, 2016; Nocker & Sena, 2019).

Thirdly, compared to other forms of business analytics, people analytics is more invasive to employees, interfering with the individual’s way of working and living in several ways (Bodie et al., 2016; Gal et al., 2017; Khan & Tang, 2017). The data collected for people analytics tend by their nature to be very sensitive, granular and personal. As such, the data allow organisations deep, individually targeted insight (and inter-reference) into employees’ personal lives, as well as their psychological disposition (Falletta & Combs, 2020; Nocker & Sena, 2019; Schafheitle et al., 2019). By contrast, humans are not the immediate subject of analysis for most other forms of business analytics (e.g., in analysing goods, money flows, key financial figures, etc.); therefore, for the most part, employees are only marginally affected by analytics in other disciplines (Gal et al., 2017). Further, even compared to the wide use of personal data elsewhere, people analytics is different and more pervasive for two major reasons, related to individuals’ awareness of access to personal data and to their ability to withhold or withdraw personal information. Through extensive media coverage, most people are aware that their personal details, such as their private social media activities or their smart device usage, are stored and transformed for further use (e.g., Fox & Moreland, 2015; Mai, 2016; Mantelero, 2016). In the organisational context, however, employees are less aware of the extent to which their employer harvests and evaluates their data (Bodie et al., 2016; Isson & Harriott, 2016; Khan & Tang, 2017). Also, in their private lives, people have the possibility of opting out, actively abstaining from using an internet platform, and sharing only information on aspects of their lives that they authorise (Nocker & Sena, 2019). By contrast, in the workplace context, due to their financial dependency in the employment relationship, employees might not have the opportunity to object to their data being evaluated or to stop their data being shared with external analytics providers (Peeters et al., 2020; Schafheitle et al., 2019; Simbeck, 2019). Together, these factors – processing personal and possibly sensitive data, as well as employees’ inability to opt out and so escape these technologies’ adverse effects – account for and reinforce the intrusive nature of people analytics (Falletta & Combs, 2020; Nocker & Sena, 2019; Schafheitle et al., 2019).

Finally, people analytics is more consequential for human beings than any other form of business analytics (Gal et al., 2017; Khan & Tang, 2017). With most business analytics applications, wrong decisions or inaccurate predictions can result in financial loss, but not in life-changing situations for employees (Schafheitle et al., 2019).
For instance, using analytics for organisational fleet management comprises activities such as evaluating and optimising the utilisation of vehicles (e.g., replacing high-mileage vehicles with newer ones). Yet, the analogous use of people analytics to evaluate employees’ workloads according to their performance and then filter out underperforming or costly employees would have a potentially life-altering impact on those ranked low in the assessment (Leicht-Deobald et al., 2019). Quite obviously, the cost of errors in such a situation is significantly higher in HR management than in other business settings, as evaluation and optimisation can determine employees’ future career prospects, thus directly impacting not only their professional lives, but also their private sphere (Bodie et al., 2016; Falletta & Combs, 2020). Therefore, the risk of miscalculation and misinterpretation weighs more heavily and can lead to more fundamental and profound ethical and moral questions regarding people analytics decisions (Calvard & Jeske, 2018; Falletta & Combs, 2020; Gal et al., 2017).

In sum, the idiosyncrasies outlined above illustrate the danger of mindlessly applying logic from other business analytics areas to people analytics. Most studies in this theme outlined only one or two of these idiosyncratic aspects, and none of them empirically investigated the possible effects. By following up on this explication and empirically analysing these peculiarities of people analytics and their potential consequences, future research could make an important contribution to this theme.

2.2.5. Theme 5: risks of people analytics

Compared to the preceding categories, the risks and negative implications of people analytics have attracted the fewest scholarly discussions (Angrave et al., 2016; Chamorro-Premuzic et al., 2017; Marler & Boudreau, 2017). Only 10 of the 66 studies examined or pointed towards potential drawbacks of people analytics (e.g., Calvard & Jeske, 2018; Newman et al., 2020; Schafheitle et al., 2019). Of these, only three studies explicitly investigated risks associated with people analytics (Gal et al., 2017, 2020; Leicht-Deobald et al., 2019), whereas the remaining six studies only pointed towards potential perils in their conclusions or in designating future research.

The majority of these studies raised privacy and data protection concerns that result from increasingly intrusive actions of people analytics (Hamilton & Sodeman, 2020; Peeters et al., 2020; Simbeck, 2019). These studies stress that employees face more and more invasive information collection, processing and dissemination as the people analytics boundaries are progressively extended from employees’ work lives into their social and even physiological spaces. Such intrusion was found to evoke negative responses and resistance from employees and to hamper their commitment (Bodie et al., 2016; Khan & Tang, 2017).

Closely related to these privacy concerns, some studies explicitly addressed the issue of surveillance, constant tracking and algorithmic control of workers (Hamilton & Sodeman, 2020; Jabagi et al., 2020; Schafheitle et al., 2019). These studies stress that constantly tracking, collecting and exploiting novel types of granular and sensitive data can foster feelings of being controlled and can impede workers’ autonomy, which often results in counterproductive behaviour (Hamilton & Sodeman, 2020; Jabagi et al., 2020; Leicht-Deobald et al., 2019). In particular, extensively monitoring non-job-related behaviours could have detrimental consequences that affect employee perceptions negatively (Schafheitle et al., 2019).

Additionally, a number of studies pointed out the issue of implicit algorithmic bias resulting from poorly trained algorithms or inherent, yet subtle, human bias in the training data (e.g., Gal et al., 2017; Hamilton & Sodeman, 2020; Newman et al., 2020). Further, a handful of studies indicated that the sometimes non-transparent functioning of algorithmic decision-making in people analytics can produce a lack of algorithmic transparency (e.g., Gal et al., 2020; Schafheitle et al., 2019). Additionally, only a few studies have discussed how increasing datafication and nudging in the workplace adversely affect employees’ virtue ethics (Gal et al., 2020) or how algorithm-based decision-making can impair employees’ personal integrity (Leicht-Deobald et al., 2019). Similarly, the risks and detrimental effects of error variance and potential reductionism (Boudreau & Cascio, 2017; Gal et al., 2017) have received only limited research attention. Consequently, the papers we identified that deal with potential negative consequences of people analytics unequivocally call for more research on these risks (Calvard & Jeske, 2018; Gal et al., 2017, 2020; Tursunbayeva, 2019; Tursunbayeva et al., 2018; van den Heuvel & Bondarouk, 2017).

2.3. Synthesis of the literature

Finally, we synthesised our findings to map the field of people analytics, to identify underlying assumptions and to problematise the role of increasing technological advancements in the context of people analytics (Alvesson & Sandberg, 2011). Our literature review has demonstrated that a wealth of studies address the promises of and barriers to people analytics adoption, whereas the idiosyncrasies, the negative implications and the associated risks have received little attention. At the same time, our literature review also shows that the majority of studies in this field concern themselves with a rather low level of people analytics maturity, primarily investigating descriptive and predictive analytics.
Thereby, the current literature focuses on analytical approaches seeking to answer what has happened and what will happen, based on common statistical methods such as simple regression models (Isson & Harriott, 2016; Leicht-Deobald et al., 2019). In practice, however, organisations increasingly begin to apply more advanced people analytics technologies, using learning algorithms and AI – even surpassing what is known as prescriptive analytics (e.g., Cappelli, 2019; Davenport, 2018, 2019, April 18).

Correspondingly, recent literature highlights the need to advance the field of people analytics by considering more advanced analytical methods and emerging technologies (e.g., Dahlbom et al., 2019; Davenport, 2019, April 18; van den Heuvel & Bondarouk, 2017). For instance, Dahlbom et al. (2019) emphasise that “new types of data and different algorithms used in AI and machine learning solutions utilized in HRA [Human Resource Analytics]” (p. 123) will transform the field of people analytics. Others highlight an upcoming shift “from automation to artificial intelligence” (van den Heuvel & Bondarouk, 2017, p. 165) and that “automated learning and AI solutions will play an even greater role” (Schafheitle et al., 2019, p. 23) in the next few years. Furthermore, some researchers stress that AI and big data will revolutionise people analytics, as these “new technologies, coupled with the near-ubiquitous digitization of work and work-related behaviors, has the potential to help organizations monitor, predict, and understand employee behaviors (and thoughts) at scale, like it has never been done before” (Chamorro-Premuzic & Bailie, 2020, p. 1). Similarly, Davenport (2018) underlines: “Today, firms might incorporate predictive or prescriptive analytics into existing offerings . . . [but] . . . AI takes this activity to the next level by providing increased automation and sophistication” (p. 76). Thus, scholars highlight “the role of AI in replicating and replacing the human workforce” (Tursunbayeva et al., 2021, p. 15) owing to the ability of AI “to perform tasks that normally require human cognition, including adaptive decision making” (Tambe et al., 2019, p. 16). Overall, several scholars stress that advances in the field of AI will transform employees’ work processes, and even have the potential to substitute work processes previously inherent to human decision-making (Faraj et al., 2018; von Krogh, 2018).

Although some of the more recent literature on people analytics references the role of increasing technological advancement in the context of people analytics, the current literature lacks a systematic reflection on the effects these new technologies might have for organisations and their employees. To account for the introduction of learning algorithms in the context of workforce management and the potential aggravation of the risks of people analytics (Leicht-Deobald et al., 2019; Simbeck, 2019), we propose a fourth maturity level: autonomous analytics. This maturity level relies heavily on the use of autonomous learning algorithms, mostly based on artificial neural networks, to process structured, unstructured, and self-updating data from various sources (Margherita, 2021; Tambe et al., 2019). Thereby, autonomous analytics differs from other systems in its learning capacity, i.e., with every application autonomous analytics can improve, provide more precise estimations and evaluations, and autonomously adapt decision parameters (Davenport, 2018; Krumeich et al., 2016; Wang, 2012). These types of algorithms and models are no longer deterministic, predictable or strictly repeatable processes and calculations (Dourish, 2016). Whereas descriptive, predictive and prescriptive analytics serve as supplementing technologies and decision-support systems, analytics at this new maturity level autonomously drives decision-making to substitute human intervention (Kellog et al., 2020; Pachidi & Huysman, 2018; von Krogh, 2018). Consequently, autonomous analytics does not focus on a specific type of human-driven question to assist decision-making, as was the case with the maturity levels outlined above. Instead, the decision-making authority is transferred from humans to AI-enabled people analytics systems. Thus, unlike previous levels of analytics, technologies on the fourth maturity level can also autonomously substitute decision-making processes, including the execution of tasks and entire work processes (Mayer et al., 2020; Strich et al., 2021). Overall, by dictating an entire decision process, as well as automatically executing and even communicating the derived decisions, technologies used by autonomous analytics can substitute humans in the decision-making process (Burton et al., 2019; Strich et al., 2021).
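The defining feature of this proposed fourth level, a model that keeps adjusting its own decision parameters as new data arrive, can be illustrated with an incrementally trained classifier. The sketch below uses a generic scikit-learn estimator and an invented data stream purely as a stand-in for such a self-updating system; it is not a people analytics product and not a method endorsed by the cited authors.

```python
# Minimal sketch of the "autonomous" idea described above: a classifier whose
# decision parameters are updated with every new batch of (invented) workforce
# data instead of being fixed after a one-off analysis. SGDClassifier is used
# only as a generic stand-in, not as an actual people analytics system.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()                 # linear classifier trained incrementally
classes = np.array([0, 1])              # e.g., 1 = employee leaves within a year

def next_batch(n=50):
    """Hypothetical monthly batch: two behavioural features and an observed outcome."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

for month in range(6):
    X, y = next_batch()
    # partial_fit adjusts the existing coefficients rather than refitting from
    # scratch, so each new batch shifts the model's decision parameters.
    model.partial_fit(X, y, classes=classes)
    print(f"month {month}: coefficients = {model.coef_.round(2)}")
```

The point of the sketch is not the particular estimator but the loop: once such a loop also triggers actions automatically, the human-driven questions of the earlier maturity levels disappear from the process.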
Recent advances highlighted in the fourth maturity level have further reinforced the high expectations organisations hold regarding the implementation and use of people analytics technologies to manage their workforce. These expectations build on the assumptions that learning algorithms are superior to human decision-making (Kryscynski et al., 2018; Leicht-Deobald et al., 2019), can adequately predict future human behaviour based on historical data (Chamorro-Premuzic et al., 2017; Gal et al., 2017; Isson & Harriott, 2016), and are equally applicable in managing inanimate objects and animate objects (such as humans) (Gal et al., 2017, 2020; Leicht-Deobald et al., 2019). Such expectations are informed by practices already used in related fields such as business analytics (Holt et al., 2017; Isson & Harriott, 2016) and are associated with increased productivity, higher performance, effective management and lower personnel costs.
Yet, the peculiarities we depicted in our literature review illustrate the danger of applying the logic developed in other business analytics areas to people analytics. If the unique aspects of people analytics are not diligently considered, utilising technological practices similar to those in other fields can bring risks that have unintended consequences at an organisational as well as an individual level. Such risks might aggravate due to recent technological advancement and the nascent introduction of learning algorithms in the field of people analytics. We believe that most of these emerging methods and technologies have yet to be rigorously scrutinised in scientific research to take the field beyond the dominant paradigm that one-sidedly highlights the benefits of how organisations and employees could prosper from the use of people analytics. Consequently, we need to identify and challenge the underlying assumptions of the current literature (Alvesson & Sandberg, 2011) and illustrate how these assumptions can escalate into severe perils of people analytics.

3. The perils of people analytics for organisations and employees

The methodological approach for identifying our perils is informed by Leidner (2018) and is grounded in a broad theorising review of the organisational expectations associated with using learning algorithms in people analytics. We conceptualise perils as risks resulting from the use of people analytics and the unintended and harmful implications they have for organisations and employees. In introducing each peril, we reconstructed the promises organisations rely on, as well as the assumptions and expectations they have with a view to decision-making, management and human capabilities related to the use of people analytics (Alvesson & Sandberg, 2011). Then, we unpack the potential risks and consequences, and theorise about the tensions around people analytics’ undesired and potentially harmful implications for organisations and employees. In line with previous research (Gal et al., 2020; Pachidi & Huysman, 2018), we maintain that our perils are not necessarily intentionally induced. Further, we acknowledge the recent rapid technological advancements and highlight how future development may aggravate the risk resulting from the identified perils and potentially change current HR and management practices. We illustrate these implications with organisational examples based on current practices in the people analytics field. Overall, our phenomenon-driven approach (Leidner, 2018) provides a comprehensive theorising review to highlight the potential effects of AI in managing the workforce, raising awareness of the potential perils, and advancing the field of people analytics to meet future challenges. In the following section, we elaborate on the six identified perils of people analytics and theorise about the consequences for organisations and employees.

3.1. People analytics can bring about an illusion of control and reductionism

People analytics builds on the belief that digital data can accurately represent reality and objectively quantify the full scope of workforce activities and employees’ traits, experiences and skills (Faraj et al., 2018; Gal et al., 2017). With its ability to analyse vast amounts of data related to employees’ traits, experiences and skills, people analytics promises to deliver an accurate and objective account of employees’ performance (Tursunbayeva et al., 2018).

However, people analytics can promote a false sense of certainty regarding the data (Gal et al., 2017), which can lead to two major pitfalls. First, people analytics’ illusion of objectivity can result in an overly strong, possibly even blind, belief in the algorithms’ processes, results and capability to predict reliable outcomes correctly (Leicht-Deobald et al., 2019; Mayer et al., 2020). Such unwarranted trust and overconfidence in the ostensible objectivity and rationality of machines could create an illusion of control, in that managers and employees might overestimate their ability to influence decisions over which they actually have little control (Kellog et al., 2020; Newell & Marabelli, 2015). As people analytics’ analytical power increases, this perception of control can increase, which raises the likelihood of underestimating risks (Durand, 2003).

Second, people analytics might follow a reductionist logic, which can mislead managers into postulating cause-and-effect relationships that in fact do not exist (Bhattacharya et al., 2010; Khan & Tang, 2017; Leicht-Deobald et al., 2019). Algorithms represent a simplified model of human behaviour that is restricted to a set of measurable dimensions or proxies of such behaviour (Faraj et al., 2018; Gal et al., 2017; Hansen & Flyverbom, 2015). Such an oversimplification of complex features can misrepresent reality (Dulebohn & Johnson, 2013; Pachidi & Huysman, 2018). By putting people into boxes, people analytics systems fail to consider the complex, decisive nature of knowledge work and human interaction (Faraj et al., 2018; Gal et al., 2017).

For example, Microsoft’s tool MyAnalytics promises to make organisations’ knowledge workers more productive by analysing and optimising their individual workflows (Clohessy et al., 2018). In doing this, the tool evaluates employees’ email traffic, response time, and time spent in meetings. While Microsoft may have intended to give employees an overview of their work routines to optimise their time resources, the tool also enables HR to gain a detailed measure of employees’ behaviour (Fuller & Shikaloff, 2017; Hodgson, 2019). This, in turn, allows the firm to identify the most productive workers and to include such information in the company’s decision-making.
overview of their work routines to optimise their time resources, the tool also enables HR to gain a detailed measure of employees’ behaviour (Fuller & Shikaloff, 2017; Hodgson, 2019). This, in turn, allows the firm to identify the most productive workers and to include such information in the company’s decision-making. Nonetheless, the practices such tools capture are most likely incomplete, reductive and a potentially misleading representation of the knowledge workers’ actual work (Gal et al., 2020; Pachidi & Huysman, 2018). Consequently, this measurement can lead to costly decisions, such as promoting and dismissing the wrong people, and establishing perceptions of unfairness among employees (Khan & Tang, 2017). As this example shows, people analytics risks reducing valuable qualitative, even if tacit, aspects of employees’ performance to quantifiable metrics, thus failing to adequately consider all aspects of performance and contextualise human qualities more broadly (Newman et al., 2020). Such quantification and decontextualisation could be damaging to companies, as promoting and laying off the wrong people has detrimental consequences for both organisations and employees (Khan & Tang, 2017; Newman et al., 2020).

Hence, it is important that HR managers and employees are aware of such reductionist tendencies. Once reductionism is masked in the data, people tend to perceive constructs, such as performance, as objective realities (Cowgill & Tucker, 2020; Gal et al., 2017). Reductionism is therefore particularly harmful when it is no longer recognised and insufficiently reflected upon in the managerial decision-making process (Curchod et al., 2020; King, 2016). This can become particularly problematic if more advanced forms of people analytics, such as prescriptive or autonomous analytics, are deployed. For instance, learning algorithms aim to autonomously optimise and adapt decision-making processes, even if the underlying data may already be subject to reductionistic tendencies. Consequently, the identification of potential reductionistic tendencies will most likely become more complicated the more advanced the technologies in use are.

Put simply, people analytics might not always live up to the inflated expectations and promises of objective knowledge (Bodie et al., 2016; Leicht-Deobald et al., 2019). The illusory objectivity of people analytics algorithms can also obfuscate the assumptions and underlying reductionism built into these technologies, resulting in less accurate information (Bhattacharya et al., 2010; Gal et al., 2020; Leicht-Deobald et al., 2019; Newman et al., 2020; Pachidi & Huysman, 2018). Consequently, organisations risk creating dysfunctionalities, flawed strategic decision-making, taking misdirected actions, and undermining their employees’ development (Durand, 2003).
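To make the reductionist tendency described above more tangible, the following minimal Python sketch (not part of the original article; all employees, metrics and weights are hypothetical) shows how a MyAnalytics-style dashboard might condense knowledge work into a single “productivity score” built from a few measurable proxies. Whatever the proxies do not capture simply vanishes from the resulting ranking.

```python
# Illustrative sketch only: hypothetical employees, proxy metrics and weights.
# It mimics how a proxy-based "productivity score" reduces knowledge work
# to a handful of measurable dimensions and then ranks people by that number.

employees = [
    # name, emails sent per week, average response time (hours), meeting hours per week
    {"name": "A", "emails": 310, "response_h": 0.5, "meeting_h": 22},
    {"name": "B", "emails": 120, "response_h": 6.0, "meeting_h": 4},   # mostly deep, untracked project work
    {"name": "C", "emails": 260, "response_h": 1.0, "meeting_h": 15},
]

WEIGHTS = {"emails": 0.5, "response_h": -2.0, "meeting_h": 0.8}  # arbitrary design choices


def productivity_score(employee):
    """Weighted sum of the chosen proxies; anything not measured is ignored."""
    return sum(WEIGHTS[key] * employee[key] for key in WEIGHTS)


for employee in sorted(employees, key=productivity_score, reverse=True):
    print(f'{employee["name"]}: score = {productivity_score(employee):.1f}')

# Employee B, whose main contribution is invisible to the chosen proxies,
# ends up at the bottom of the ranking regardless of actual performance.
```

Whether such a score says anything about real performance depends entirely on which proxies and weights are chosen; this is precisely the kind of masked reductionism that the preceding paragraphs caution against.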
3.2. People analytics can lead to estimated predictions and self-fulfiling prophecies

People analytics rests on the assumption that it can explain, predict and modify human behaviour based on past events (Cappelli, 2019; Pachidi & Huysman, 2018). With a growing shortage of skilled labour, organisations are hoping that people analytics will anticipate future workforce trends, derive proactive courses of action, and optimise the recruiting of employees as a scarce strategic resource (Bodie et al., 2016; Chamorro-Premuzic et al., 2017; Huselid, 2018; Minbaeva, 2018; Peeters et al., 2020).

However, the predictions people analytics and its deployed algorithms make are merely conditional probabilities for the occurrence of an event, not the event itself (Faraj et al., 2018). This means that people analytics assesses the probability of employees showing certain behaviours rather than their actual behaviour (Brayne, 2017; Faraj et al., 2018; Mayer-Schönberger & Cukier, 2012). The upshot is that people are subject to how the algorithm categorises them (Faraj et al., 2018; Fourcade & Healy, 2013). Examples such as the epic failure of Google Flu Trends (Lazer & Kennedy, 2015), or cases from the field of crime analytics in which inmates were granted parole based on the predicted likelihood of not committing a crime, illustrate that these probabilistic predictions can result in erroneous assignments of high risk (Brayne, 2017; Faraj et al., 2018; Huysman, 2020; Israni, 2017).

This can have worrying implications in the people analytics context. For instance, people analytics could develop a pattern to detect the likelihood of future absence from work (e.g., pregnancy) or illness (e.g., mental health issues, such as burnout or depression); these probabilities could be integrated into decisions to hire or lay off (compare Boot et al., 2017; Boudreau, 2014; Duhigg, 2012; Thorstad & Wolff, 2019). Managers are often unaware of the conditional nature of estimated predictions and their respective statistical limitations (Boudreau & Cascio, 2017; Khan & Tang, 2017). This creates the risk that managers view the estimates as accurate predictions and execute the proposed decisions without due caution (Hamilton & Sodeman, 2020).

Furthermore, this can lead to serious misjudgements and detrimental outcomes for both organisations and employees when more advanced forms such as autonomous people analytics are deployed and there is no longer a human counterweight to algorithmic categorisation and decision errors (Shrestha et al., 2019). The inherent complexity of learning algorithms further complicates the recognition of potentially faulty controls. The more accurate technologies apparently become in predicting simple relationships, the greater the confidence of decision
makers to rely on these technologies in highly sensitive areas of decision making.

Further, applying people analytics can lead to self-fulfiling prophecies (Batistic & van der Laken, 2019; Herschel & Miori, 2017; Peeters et al., 2020). Following people analytics’ predictions, organisations and managers can take decisions which create conditions that ultimately realise these very predictions (Gal et al., 2017; Rainie & Anderson, 2017). For instance, firms can use people analytics to forecast the future performance, the expected retention, or return on investment in their recently hired employees (Boudreau & Cascio, 2017; Chalutz, 2019; Kellog et al., 2020; Pessach et al., 2020). Then, based on these predictions, companies might allocate training resources only to the employees they perceive as promising, which will result in selected employees receiving additional training, while others receive no training (Cowgill & Tucker, 2020; Gal et al., 2017). Such actions will produce exactly the predicted results: employees who receive training will outperform those who do not receive training (Gal et al., 2017; Simbeck, 2019). However, the difference between the two groups of employees can mistakenly be interpreted as an indicator of the validity of people analytics (Calvard & Jeske, 2018; Gal et al., 2017; Simbeck, 2019), while the performance improvement might be due to the additional training and not related to people analytics’ predictive power.
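The self-fulfilling dynamic sketched in the previous paragraph can be reproduced with a short simulation. The following Python snippet is purely illustrative (hypothetical numbers, no real people analytics system): an essentially arbitrary “potential” score determines who receives training, the training itself raises later performance, and the resulting gap then appears to confirm the original prediction.

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical workforce: everyone starts with the same true ability.
employees = [{"id": i, "ability": 50.0} for i in range(200)]

# 1. A noisy (here effectively arbitrary) "potential" score ranks employees.
for e in employees:
    e["potential"] = e["ability"] + random.gauss(0, 10)

# 2. Only the predicted top half receives training, which genuinely raises ability.
ranked = sorted(employees, key=lambda e: e["potential"], reverse=True)
top, bottom = ranked[:100], ranked[100:]
for e in top:
    e["ability"] += 10.0  # training effect

# 3. Later performance is observed with some measurement noise.
for e in employees:
    e["performance"] = e["ability"] + random.gauss(0, 5)


def mean_performance(group):
    return sum(e["performance"] for e in group) / len(group)


print(f"'high potential' group: {mean_performance(top):.1f}")
print(f"'low potential' group:  {mean_performance(bottom):.1f}")

# The gap between the groups stems from the unequal allocation of training,
# yet it is easily read as evidence that the potential score was a valid predictor.
```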
In sum, the assumption that people analytics can reliably and validly predict human behaviour based on past events can mislead managers into drawing problematic conclusions and implementing weak decisions (Boudreau & Cascio, 2017; King, 2016). Such decisions can lead to serious misallocation of organisational resources (Boyd & Crawford, 2012; Huselid, 2018).

3.3. People analytics can foster path dependencies

A central assumption of people analytics is that it enables decision-makers to predict future human behaviour based on historical data (Chamorro-Premuzic et al., 2017; N. Wang & Katsamakas, 2019). This way, people analytics is meant to help organisations solve critical problems, identify and predict future trends, facilitate strategic workforce planning, and adapt to business and market trends (Hansen & Flyverbom, 2015; Schildt, 2017; Sharma et al., 2014; Tursunbayeva et al., 2018).

However, by drawing on historical data to predict and prescribe the future, algorithms “embody a profound deference to precedent” (Barocas et al., 2013, p. 8). Consequently, based on people analytics, firms predict future events only as an extrapolation of the past, thus only focusing on actions that proved to be successful in the past while ignoring novel patterns and parameters (Pachidi & Huysman, 2018). Yet, such an approach makes companies more vulnerable to external shocks. For example, in the Covid-19 situation, employees partly changed their behaviour in ways that an algorithm relying on data from before this crisis could not have foreseen. A prediction of employees’ preferences for home offices would have been completely wrong if the data had been collected before the pandemic broke out. As such, predicting future behaviour based on historic data implies an extrapolation of the future based on past patterns. The literature on strategic forecasting is a telling example of how difficult it is to foresee future trends and developments, and how those predictions often prove to be wrong due to changes in the wider ecosystem (e.g., Appelbaum et al., 2017; Durand, 2003; Lazer & Kennedy, 2015).

Consequently, firms can encounter the risk of path dependency, which limits their capacity to innovate (Sydow et al., 2009). Thus, people analytics can result in self-perpetuating loops solidifying the direction of development regardless of its quality (Barocas et al., 2013). For instance, Amazon set out to automate its talent search by relying on a prescriptive analytics tool to identify the top candidates based on profiles of successful hires over the past 10 years. As the data showed that the overwhelming majority of successful hires had previously been (white) men, the tool predicted that male candidates were more likely to be a good fit; thereby, the algorithm inappropriately excluded women (Hamilton & Sodeman, 2020; Purdy et al., 2019; Simbeck, 2019; Vardarlier & Zafer, 2020). As a result, Amazon had to abandon the recruitment tool (Dastin, 2018; Hamilton & Sodeman, 2020; Peeters et al., 2020).

As this example underscores, hiring decisions based on historical data can result in homosocial reproduction, homophily, social bias, and discriminatory practices (Hamilton & Sodeman, 2020; Kellog et al., 2020). It also demonstrates how advances in people analytics technology can exacerbate these risks: the more decisions algorithms make automatically, and the more information complex models provide, the higher the risk of covert discrimination. Consequently, the use of and reliance on learning algorithms and AI can reinforce social and economic stratification, inequality and the social isolation of minorities (Barocas et al., 2013; Bodie et al., 2016; Kellog et al., 2020; Kleinbaum et al., 2013; Simbeck, 2019). Importantly, the increasing uniformity can, in turn, lead to innovations being nipped in the bud, as well as to diverse and creative approaches in organisations being curtailed (Gal et al., 2017).
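The mechanism behind the Amazon case above can be illustrated with a deliberately simple sketch. The Python snippet below is not a reconstruction of any real system and uses fabricated toy data; it scores new applicants by how often similar profiles were hired in the past, so a feature that merely reflects past hiring practice (here, gender) ends up dominating the recommendation.

```python
# Illustrative sketch with fabricated data: scoring applicants purely from
# historical hiring decisions reproduces the bias contained in that history.

# (gender, formally_qualified, was_hired) for 100 hypothetical past applicants;
# past practice strongly favoured men.
history = (
    [("m", True, True)] * 40 + [("m", False, True)] * 10 + [("m", True, False)] * 10
    + [("f", True, True)] * 5 + [("f", True, False)] * 25 + [("f", False, False)] * 10
)


def hire_rate(position, value):
    """Share of past applicants with a given attribute value who were hired."""
    rows = [r for r in history if r[position] == value]
    return sum(1 for r in rows if r[2]) / len(rows)


def fit_score(gender, qualified):
    """Naive 'fit' score: average of the historical hire rates for each attribute."""
    return (hire_rate(0, gender) + hire_rate(1, qualified)) / 2


print(f"qualified male applicant:   {fit_score('m', True):.2f}")
print(f"qualified female applicant: {fit_score('f', True):.2f}")

# Equally qualified applicants receive very different scores because the model
# does nothing more than extrapolate past hiring decisions, precedent included.
```

A more sophisticated learning algorithm does not remove this problem as long as the training signal is past hiring behaviour; it merely makes the deference to precedent harder to see.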
Accordingly, people analytics can also impede the Ultimately, learning algorithms can become too com­
decision-making horizon and significantly limit plex to be fully understood even by those program­
access to alternative perspectives, novel ideas and ming them (Dourish, 2016; Gal et al., 2017; Possati,
knowledge diversity (Faraj et al., 2018; Kleinbaum 2020). For example, in the context of people analytics,
et al., 2013). this implies that pivotal decisions such as on staff
Overall, the use of people analytics potentially layoffs, could be made even without the responsible
bears the risk that organisations will act backward manager exactly knowing why and how the algorithm
looking, deterministically, ego-centrically and decided to do so. As the manager will no longer be able
bounded by their very own algorithms, which can to explain the decision rationales, employees could
lead to the firms repeating past mistakes (Cappelli, perceive the decisions as arbitrary or nonsensical,
2019; Pachidi & Huysman, 2018; Simbeck, 2019). resulting in complaints of unfairness, feelings of frus­
By focusing on past events to determine their tration or disengagement, and possibly high employee
future strategy, organisations can become less turnover (Flyverbom, 2019; Gal et al., 2020; Huysman,
open, less innovative and less flexible in responding 2020; Zarsky, 2016).
to changes in their internal and external environ­ Another example, in the context of drug testing,
ment (Thrane et al., 2010). which is increasingly used by organisations in the
recruiting process (Calmes, 2016), shows how false
positive decisions can have different causes, such as
3.4. People analytics can impair transparency medication or test procedures in laboratories (Akin,
and accountability 2020; Bodie et al., 2016). Yet, such false positive results
can deprive workers of their jobs or tarnish their
By creating more transparent HR processes, people
reputations for future opportunities (Bodie et al.,
analytics is meant to answer the increasing organisa­
2016; Kellog et al., 2020). This is particularly proble­
tional demand for managers to be more accountable
matic, not only because of potential bias and high
regarding their activities and the logic of their deci­
stakes, but also because workers might not be able to
sions (Flyverbom, 2019; Giermindl et al., 2017;
appeal judgements against them or correct missing or
Leonardi & Contractor, 2018; Stohl et al., 2016). In
mistaken information (Kellog et al., 2020). Owing to
this respect, people analytics promises that employees
the data’s high opacity and the limited access to it, it
will benefit from fairer, more understandable HR deci­
can be difficult to reverse engineer the data to ensure
sions, which in turn will increase their participation
its accuracy (Bodie et al., 2016).
opportunities, as well as their commitment and work
Further, the opacity of algorithms raises the ques­
motivation (Isson & Harriott, 2016; Minbaeva, 2018;
tion of who is accountable for a managerial decision
Newman et al., 2020; van den Broek et al., 2019).
and its ethical implications (Ananny & Crawford,
With increasing analytical power, however, the
2018; Hansen & Flyverbom, 2015). Accountability
decisions made by people analytics become more and
refers to the expectation that a person can be called
more opaque, inaccessible and untraceable regarding
on to justify his or her intentions, motives and ration­
their underlying assumptions, mechanisms and pro­
alities (Gal et al., 2017). By allocating decision-making
cesses (Faraj et al., 2018; Pachidi et al., 2016; Simbeck,
authority to autonomous people analytics, organisa­
2019). When less advanced forms of people analytics,
tions could be tempted to avoid accountability
such as descriptive or predictive analytics, are used for
(Barocas et al., 2013; Martin, 2018; von Krogh, 2018).
decision-making, employees can gain insight in the
However, given the potentially life-altering nature of
logic that guided their decisions by asking their man­
algorithmic decisions in the people analytics context,
agers to disclose the reasons for their decisions (Gal
as depicted in the examples above, organisations
et al., 2020; Leicht-Deobald et al., 2019). Once
should be wary of the risks and negative consequences
advanced machine learning algorithms are deployed,
posed by relying on these systems, and they should
only a limited group of knowledge workers with highly
work to hold their people analytics algorithms accoun­
specialised skills and technical training can recon­
table (Calvard & Jeske, 2018; Diakopoulos & Friedler,
struct decisions (Dourish, 2016; Faraj et al., 2018).
2016).
Thus, Burrell (2016) emphasises:
Yet, to hold algorithmic decision-making systems
accountable for their outcomes requires more than
“When a computer learns and consequently builds its
own representation of a classification decision, it does seeing the algorithm’s code or the underlying data;
so without regard for human comprehension [. . .] rather, one needs to clearly understand how the sys­
The workings of machine learning algorithms can tem works and should be able to reconstruct the rea­
escape full understanding and interpretation by sons for the algorithmically computed decisions ex-
humans, even for those with specialized training, post facto (Ananny & Crawford, 2018; Barocas et al.,
even for computer scientists” (Burrell, 2016: 10).
2013; Bodie et al., 2016). Nonetheless, although the
deployed algorithms are inscrutable or impossible to
understand, organisations need to take responsibility intensify, with the use of increasingly advanced tech­
for their outcomes and ethical implications (Martin, nologies such as learning algorithms or AI.
2018). For instance, O’Connor (2016) reports on a Silicon
Overall, if the deployed learning algorithms and Valley-based people analytics software provider
AI function as a black box, it seems difficult to named Percolata that seeks to maximise employees’
determine who can be held accountable for serious retail performance and workers’ productivity for
mistakes, significant failures, and misconduct of a variety of retail chains. To achieve this, the software
a system (Ananny & Crawford, 2018; combines sensor-based human movement patterns
Diakopoulos & Friedler, 2016; Huysman, 2020). with data on the amount of sales per employee (i.e.,
Such opacity can create information asymmetries, sales divided by customers). Percolata provides man­
obscure power structures and inhibit oversight, as agement with detailed reports on when and with
well as negatively influence workers’ perceived which colleagues each employee performs best,
procedural fairness and organisational commit­ including a list of employees rated from lowest to
ment (Gal et al., 2020; Jabagi et al., 2020; highest by shopper yield. Based on these prescribed
Newman et al., 2020). Further, prior research sug­ actions, the company supplies a schedule with the
gests that when workers are directed by an algo­ ideal mix of workers to maximise sales for every 15-
rithm that they perceive as unfair, this can minute slot of each day, giving the worker with the
undermine their moral compass and increase highest shopper yields more hours than their lower
their willingness to engage in unethical behaviour performing colleagues. The only role left for managers
(Curchod et al., 2020; Kellog et al., 2020). is to press a button so that the schedule is automati­
Consequently, employees could lack legal certainty cally communicated and instantaneously sent to work­
and feel powerless, while managers might feel ers’ personal smartphones, thus replacing
incapacitated by such algorithms (Curchod et al., communication between employees and managers
2020; Kellog et al., 2020; Leicht-Deobald et al., (O’Connor, 2016). This example shows that people
2019). analytics software can replace direct communication
with a human manager, removing managers interact­
ing to give feedback and in-person assessment of
3.5. People analytics can reduce employees’ workers (Gray & Suri, 2019; Kellog et al., 2020).
autonomy In the context of knowledge workers, people
analytics or tools such as Microsoft’s MyAnalytics
People analytics can support the organisational need can be used similarly to the way in which the
for knowledge intensive, flexible or even new, fluid Percolata example did to identify, pressurise and
organisational forms by improving collaboration eliminate low performers (Fuller & Shikaloff,
between employees and by leveraging their innovative 2017; Hodgson, 2019). As has already become evi­
and creative skills (Holford, 2019; Minbaeva, 2018; dent in gig economy firms such as Uber, UberEats
Tursunbayeva et al., 2018). Further, it promises to and Deliveroo, this could even go further, with
increase employees’ scope for decision-making, auton­ algorithms automatically notifying underperform­
omy, and self-management competences (Bodie et al., ing workers of their dismissal (via a dashboard)
2016; Minbaeva, 2018). without any human involvement (O’Connor, 2016;
Even so, people analytics can also significantly Schildt, 2017). People analytics risks fostering simi­
curtail employees’ autonomy and their reflexive self- lar conditions in the context of knowledge workers
determination (Jabagi et al., 2020; Leicht-Deobald (Gray & Suri, 2019). Prior research has labelled this
et al., 2019; Schafheitle et al., 2019). Thus, people renewal of the Taylorist paradigm as digital
analytics can impede employees’ genuine discretion Taylorism and “involves creative and intellectual
in decision-making and working habits by replacing tasks being subject to the same process as chain
teams’ interactive processes, collaboration and coop­ work” (Holford, 2019, p. 5). This way, the algo­
eration practices with pre-defined goals and key rithms deployed by people analytics can now con­
performance indicators (KPIs) condensed in algo­ trol humans in an unprecedented and alarming way
rithms (Burton et al., 2019). In this way, people (Faraj et al., 2018; Kellog et al., 2020; Schildt, 2017).
analytics can produce reactive chains of action Thus, algorithms “[provide] a degree of control and
instead of self-reliant and self-organised behaviour oversight that even the most hardened Taylorists
according to employees’ self-determination. As could never have dreamt of” (Prassl, 2016; as cited
a result, algorithmic calculation can become more in O’Connor, 2016).
relevant than people’s self-determination, corroding Overall, the reductionist mapping of complex
the open social processes of teams (Burton et al., human abilities and characteristics according to
2019; Mayer-Schönberger & Cukier, 2012). a machine paradigm heralds in a new era of algo­
A phenomenon, that potentially continues to rithmic management and control. The examples
Table 4. Evolution of the perils of people analytics at the respective maturity levels.
Maturity Level Descriptive People Analytics Predictive People Analytics Prescriptive People Analytics Autonomous People Analytics
Decision Process Decision Support Joint Human-Algorithm Automated Decision-Making
Decision-Making
Characteristics People analytics analyses what People analytics forecasts People analytics automatically People analytics autonomously
happened in the past and future developments based prescribes decision derives complex decisions,
how this influences the on past or real-time recommendations based on executes them, and
present. observations and advanced statistics, communicates them based
probabilistic weighting of scenarios and machine on self-learning algorithms.
various scenarios. learning algorithms.
Human/Machine People analytics gains Organisations and employees Owing to its seeming Much of the decision-making
Interaction acceptance as it provides increasingly rely on people superiority, employees tend and execution process is
support by facilitating analytics for decision-making to trust the decisions blindly automated and is
managers’ drawing and question these systems and to implement the increasingly performed
conclusions from evaluated less and less. proposed recommended without human interaction,
past data to make data- measures unquestioningly. expertise, or reflection.
driven decisions.
1. ● Employees can be guided ● There is a risk, that man­ ● Employees may blindly ● The illusion of control and
People by the false expectation agers and employees jus­ trust the supposedly blind faith in the algo­
analytics can that digital data can accu­ tify their decisions based on unchangeable technologi­ rithms may increasingly
bring about an rately and objectively reductionist parameters and cal decision; although they cause employees to con­
illusion of represent reality which can increasingly rely on the illu­ may believe they are still in sign the decisions comple­
control and lead to a false sense of sion that they can control control, they have dele­ tely to people analytics
reductionism certainty. the future. gated their decision- technology; thus, it
● This bears the risk that they ● This overconfidence can making authority. becomes even more diffi­
overlook the emerging lead to insensitivity to ● The implementation of the cult to recognise reduc­
reductionism and postulate feedback and inaccurate decision made by people tionism or errors in the
non-existing cause-and- risk assessment. analytics without critical data.
effect relationships. scrutiny can lead to costly ● The result is mismanage­
poor decisions and mis­ ment, of which the causes
guided strategies. may no longer be identi­
fied, undermining the
workforce’s potential.
2. ● This peril only occurs if pre­ ● Organisations are less likely ● Managers and employees ● The executed decisions can
People dictive analytics or more to recognise that predic­ tend to regard the esti­ misjudge the estimated
analytics can advanced forms are tions are based only on mates as accurate predic­ predictions and self-
lead to applied. forecast probabilities and tions and implement the fulfiling prophecies.
estimated may fail to recognise the proposed decisions with­ ● Self-fulfiling prophecies
predictions and resulting self-fulfiling out reflection. may become increasingly
self-fulfiling prophecies. ● Self-fulfiling prophecies prevalent, as they
prophecies ● This misleads organisations may confirm the pre­ strengthen and continue in
into believing that they can viously made assumptions, a self-affirming cycle, often
predict future behaviour, which encourages resulting in the misdirec­
which can lead to wrong employees to keep trusting tion and misallocation of
conclusions and decisions. and following the pre­ organisational resources.
scribed decision
recommendations.
3. ● Due to its historical orienta­ ● Organisations predict future ● Organisations and employ­ ● There is a danger that orga­
People tion, people analytics has events only as an extrapo­ ees tend to overlook the nisations systematically
analytics can limited transformational lation of the past, which actual developments and exclude human decision-
foster path potential. makes organisations more the need for adaptation. making power and become
dependencies ● However, self-determined vulnerable to external ● The unrecognised path purely self-referential with­
human interventions can shocks. dependencies can result in out organisational
still compensate for such ● There is a risk that organisa­ self-perpetuating loops foresight.
a limitation. tions fail to recognise the solidifying the direction of ● Organisations may become
arising path dependencies development regardless of less open, less innovative,
which can systematically its quality, impeding the and less flexible in
undermine future orienta­ decision-making horizon responding to changes in
tion and innovation. and limiting access to their internal and external
alternative perspectives environment.
and novel ideas.
4. ● Only employees with ● The algorithms and justifi­ ● The algorithms may be too ● The decisions and actions
People advanced statistical knowl­ cations underlying the complex to be understood can be incomprehensible
analytics can edge can retrace the deci­ decisions are largely opa­ by employees affected by and irretraceable for
impair sion rationales. que and more and more their application as well as employees, the organisa­
transparency ● Accountability for the ana­ difficult to understand. managers who apply them. tion, and even the analytics
and lytics-supported people- ● Accountability for the data- ● Accountability can no technology provider.
accountability related decisions still lies driven and technologically longer be meaningfully ● It is hard to hold anyone
with the managers and derived decisions still lie assigned to a human at the accountable for the deci­
employees can still with the managers, but they specific level of decision- sions. This can result in
approach them to have may no longer be able to making and action, and feelings of powerlessness
their decision rationales explain and justify these in decisions may be increas­ of employees and incapa­
disclosed. a rationally comprehensible ingly perceived as arbitrary citation of managers, as
way. and nonsensical. well as in potentially creat­
ing information asymme­
tries, obscuring power
structures, and inhibiting
oversight.
(Continued)
Table 4. (Continued).
Maturity Level Descriptive People Analytics Predictive People Analytics Prescriptive People Analytics Autonomous People Analytics
5. ● People analytics can ● Predictive future events ● There is a risk that both ● Organisations relying too
People impede employee discre­ based on standardised pat­ individual and cooperative heavily on people analytics
analytics can tion in decision making by terns bear the risk to lead to decision-making processes run the risk of developing
reduce replacing teams’ interactive a mechanisation of human will become increasingly into a mechanistic neo-
employees’ processes and collabora­ thinking and behaviour. mechanised and semi- tayloristic system in which
autonomy tion practices with pre- ● The sphere of decisions and automated and less the complexity of the
defined goals and algorith­ actions as well as employee reflected by open, indivi­ human and social sphere is
mically condensed KPIs. autonomy may become dual, or cooperative cogni­ hardly recognised.
● Mechanised management increasingly restricted and tive processes. ● This can lead to alienating
procedures can inhibit can produce reactive chains ● People analytics is increas­ and dehumanising work
employees’ own perfor­ of action instead of self- ingly replacing direct com­ and to reification of inter­
mance and self-control reliant and self-organised munication with a human personal relationships,
potential. behaviour according to manager and controlling whereby workers will
employees’ self- employees, which reduces experience a loss of indivi­
determination. their autonomy, freedom, duality, autonomy, and
and self-organisation skills. reflexive self-
determination.
6. ● Due to the alleged super­ ● Human rationality is still ● There is a risk that sover­ ● Managers have less and
People iority of people analytics, involved in accepting/ eignty over decisions is less opportunities to take
analytics can managers may be tempted rejecting the automated increasingly passing over decisions, and people ana­
marginalise to trust and feel social insights: managers can still from managers to people lytics may operate without
human pressured to rely on people discern when and where analytics. human judgement to miti­
reasoning and analytics. not to incorporate the ● This can drastically under­ gate their operation.
erode ● Managers may tend to put algorithm’s judgment in mine managers’ power of ● Managers increasingly lose
managerial aside their intuitions and their own decision-making. judgment, human reason­ their ability to make
competence abilities and if in doubt, act ● Yet the value, relevance and ing, and critical reflection rational decisions human
against their own need for genuine human and can lead to devaluing reasoning, rationalising,
assessments. decision-making compe­ of the management and sense-making atrophy
tences can start to decrease function. and managers increasingly
and managers progressively become obsolete.
forfeit their sense-making
capabilities.

given above demonstrate that people analytics can 3.6. People analytics can marginalise human
revive the piecemeal system with an algorithmic reasoning and erode managerial competence
managerial span of control of which the logic is
One core argument of people analytics is that manage­
opaque (Faraj et al., 2018). While coordinating
rial decision-making should be a more rational pro­
work can become more efficient in using people
cess in which algorithms guide decision-making,
analytics, it can ultimately lead to mechanising the
because humans are subject to inherent unconscious
workplace and to working environments in which
biases, unrecognised overconfidence and irrational
individual employees have little or no direct
behaviour (Burton et al., 2019; Gal et al., 2017).
human contact with one another, thus undermin­
Owing to its large processing power, people analytics
ing the organisation’s appropriate development
promises to help managers make more effective HR
and its innovative power (Dahlander &
decisions, to complement their competences, and to
Frederiksen, 2012; Dougherty & Dunne, 2012;
promote rationality in managerial decision-making
Faraj et al., 2018). Furthermore, recent technolo­
and human sense-making (Lee, 2018; Leicht-Deobald
gical advances such as autonomous analytics can
et al., 2019; Werkhoven, 2017b).
further aggravate the alienation and dehumanisa­
However, people analytics can simultaneously
tion of work. Whereas the supervision and mon­
decrease the value, relevance and need for genuine
itoring of work processes was previously limited
human decision-making competences (Gal et al., 2017;
by personnel and thus also financial resources, this
Mayer et al., 2020). If organisations consider algorithms
can now be carried out in real time by learning
to be superior and more objective than allegedly emo­
algorithms without the need for human interac­
tional, subjective and deficient human processes, and
tion. Therefore, organisations relying on people
they find managers’ only valid source of knowledge
analytics can lead to the reification of interperso­
should be measurable, and preferably quantitative facts,
nal relationships, whereby workers will experience
human abilities and managerial competence will neces­
a loss of individuality, autonomy and individual
sarily be devalued (Gal et al., 2017; Leicht-Deobald et al.,
freedom (Curchod et al., 2020; Holford, 2019;
2019; Newell & Marabelli, 2015). The seeming super­
Kellog et al., 2020).
iority of people analytics can lead employees to trust
these technologies and systems more than either their 2004; Hodgkinson et al., 2009). Every time man­
own assessment of current circumstances or their unique agers delegate information gathering and decision-
capabilities such as their intuitive judgement, reasoning making to algorithms and limit their own thinking
and critical reflection (Burton et al., 2019; Introna, 2016; to what the algorithms deem appropriate, they risk
Leicht-Deobald et al., 2019; Mayer et al., 2020). Further, forfeiting their cognitive autonomy (Burton et al.,
employees might feel social pressured to rely on these 2019). Thus, algorithmic evaluations and decisions
people analytics tools and to follow their recommenda­ can replace, reshape and interrupt the relationship
tions to avoid the risk of being held responsible for poor between manager and employee, which de facto can
decisions (Briône, 2017; Leicht-Deobald et al., 2019; lead to devaluing the management function
Pachidi & Huysman, 2018; van den Broek et al., 2019). (Curchod et al., 2020; Gal et al., 2017; Jabagi
Consequently, human qualities such as problem solving et al., 2020; Jarrahi et al., 2019). When managers
and creativity could increasingly give way to calculation, start to merely rely on the choices prescribed by
efficient predictability and control (Ananny, 2016; people analytics rather than on their experience,
Orlikowski & Scott, 2014). they risk making themselves redundant.
For example, at the Siemens factory in Congleton, One example is a high-skilled online labour mar­
a software called Preactor was introduced to give the ket system that replaced a transparent evaluation of
workshop teams instructions on which parts to produce employees with a non-transparent automated eva­
and in what order. Rather than relying on their human luation and wage calculation. Employees no longer
judgment and experience, workers now receive understood the evaluation criteria and how they
a specific set of instructions from the software telling determined their rating and the resulting pay cuts.
them exactly in what order to perform each step. While Negotiation processes between management and
the cell managers welcomed the clarity and simplicity of employees became redundant, as did discussion
the production and action plans, production workers between employees and management about the
complained about their loss of autonomy and the deva­ meaningfulness of tasks and evaluations. An algo­
luation of their skills and knowledge. The algorithmic rithm instead of a person on the other side of the
decision templates stepped in, compromising the for­ managerial relationship reduces contact between
mer social relationships between management and pro­ managers and workers, so that employees can no
duction employees and making it impossible for longer question or challenge their directions and
managers to adequately assess the new system’s negative results (Kellog et al., 2020).
effects on employees (Briône, 2017). Overall, the seeming infallibility of people analy­
As this example already indicates, this development tics and an organisations reliance on it can disrupt
could end in a vicious circle in which the more and lower employees’ powers of judgment
employees’ decision-making is substituted by people (Holford, 2019). Further, the increasing transfer of
analytics, the less opportunities and autonomy they traditional management tasks to people analytics
have to make their own decisions, whereby their learn­ systems can marginalise decision-makers’ mental
ing capacities are also reduced (Pachidi & Huysman, representations, sense-making and rationality (Gal
2018). The technological advances in people analytics et al., 2017; Pachidi & Huysman, 2018). Ultimately,
might further exacerbate this phenomenon: the more the use of and reliance on learning algorithms and
autonomous respective algorithms are, the less need AI as used in autonomous analytics is likely to
there is for humans to monitor, review or improve aggravate the erosion of managerial competences
them. Accordingly, the need for human expertise and could even reduce their role to that of the
decreases as technologies’ autonomy advance. As machine’s stooge (Faraj et al., 2018; Schildt, 2017),
with muscles that are not trained, employees’ power and lead to the organisation replacing managers, to
of judgment and reasoning atrophies over time and the extent that many could become obsolete (Faraj
their ability to make independent decisions can gra­ et al., 2018; Gal et al., 2017; Schildt, 2017). For
dually wither away. employees, this would mean that they have no
Managers, leaders and decision-makers are par­ one to turn to in trying to understand problems
ticularly affected by marginalisation of human they need to solve, or in making a decision or
sense-making, reasoning, and rationalising seeking feedback on their work. This can lead to
(Curchod et al., 2020; Gal et al., 2017; Jabagi employees experiencing frustration, anxiety and
et al., 2020). To date, successful team leaders have stress (Gray et al., 2016; Schwartz, 2018).
often excelled in making good strategic decisions
regarding team composition. They selected, devel­
3.7. Summary and integration
oped and promoted employees based on their clo­
seness and interaction with their team, relying on The six perils discussed above reveal the potential
non-conscious forms of cognition as well as explicit dangers, tensions and risks of the use of people
reasoning processes (e.g., Hackman & Wageman, analytics. While some may still sound like a far-
fetched dystopian scenario, the perils exist today, algorithms can be equally applied in handling inani­
albeit in a less pronounced and evident form. As mate objects and managing humans (Gal et al., 2017,
mentioned, we argue that technological advance­ 2020; Leicht-Deobald et al., 2019). Based on the phe­
ments and the nascent introduction of learning algo­ nomenological review of these organisational expecta­
rithms and AI in the field of people analytics have tions (Alvesson & Sandberg, 2011) associated with the
the potential to enlarge and aggravate the risks of use of learning algorithms in people analytics, we then
people analytics and their potential profound nega­ theorised about the perils of people analytics. Finally,
tive impacts on organisations and employees. To we explained how the emerging perils evolve and how
illustrate the characteristics and evolution of these their negative impact on organisations and employees
perils, we have juxtaposed each of them with the aggravates with each increasing maturity level of peo­
four levels of maturity to form a matrix in Table 4. ple analytics.
It serves as an overview of the characteristics and
stages of development of the six identified perils and
4.1. Theoretical implications
shows how they evolve and aggravate with each
maturity level. While the opportunities for organisa­ Our research offers two major theoretical contribu­
tions increase with each stage of development due to tions to the literature on people analytics and the
the progress of the underlying technologies, at the field of information systems. Firstly, we introduce
same time the impact of the perils on organisations a fourth conceptual level of maturity, which we
and employees may be escalating. label autonomous analytics, to problematise the
increasing technological advancement’s role
(Alvesson & Sandberg, 2011). The novel maturity
4. Discussion
level is based on automated decision-making, rapid
In this paper we set out to identify potential perils of progress in the field of AI, and people analytics
people analytics and to theorise about their potential increasingly using learning algorithms. We theorise
negative consequences for organisations and employ­ about how this increased level of people analytics’
ees. We first systematically reviewed the people analy­ maturity is expanding the possibilities but could
tics literature and openly mapped the research field also further aggravate the potential perils and nega­
using a theme-centred reviewing approach (Leidner, tive implications of people analytics. We believe
2018; Rowe, 2014; Webster & Watson, 2002), categor­ that a fourth maturity level is a useful extension
ising this literature’s emerging themes, patterns and of the analytical framework for scholars and practi­
trends. We identified five emerging themes relating to tioners in that it facilitates reflection on the use of
the opportunities, barriers, maturity, idiosyncrasies analytics in various contexts and its implications
and risks of people analytics. Further, we uncovered for each level. In doing so, we advance the emer­
some underdeveloped areas in this nascent field of ging literature on people analytics, and add to the
research, such as the lack of rigorous quantitative broader stream of analytics literature. Further, we
and qualitive empirical studies on people analytics’ integrate the identified perils within the framework
outcomes, or the limited analysis of more advanced of the four levels of maturity, demonstrating how
forms of people analytics’ uses, and we suggested these perils evolve across the levels. While the
future research avenues for each theme. To account opportunities for organisations increase with each
for the recent technological advancements and the stage of development owing to technological pro­
nascent introduction of learning algorithms and AI gress, at the same time the perils’ implications for
in the field of (people) analytics, we proposed a new organisations, managers and employees may esca­
maturity level, autonomous people analytics. late. We therefore provide an outlook on how using
Thereafter, we unpacked and challenged several more sophisticated people analytics can fundamen­
underlying assumptions held in a good portion of tally change decision-making, management and
the academic and practitioner-oriented discourse on human capabilities, and what this means for the
people analytics. These assumptions evolve around the future. We also show that these effects occur not
notion that recent technological advances in people only when we use AI systems or learning algo­
analytics tend to be more objective than and superior rithms, but already at lower maturity levels.
to human decision-making (e.g., Bodie et al., 2016; Because the predominant research now focuses on
Kryscynski et al., 2018; Martin-Rios et al., 2017), that AI hazards, the potentially negative effects of exist­
people analytics can reduce bias and increase trans­ ing technologies receive less research attention.
parency (Jabagi et al., 2020; Martin-Rios et al., 2017; However, these technologies remain relevant, con­
Sharma & Sharma, 2017), that future behaviour can be tinue to be used, and have consequences with simi­
precisely extrapolated based on current or past beha­ lar kinds of risks. Thus, our study serves as
viour (e.g., Chamorro-Premuzic et al., 2017; Gal et al., a starting point to illustrate the pitfalls of less and
2017; Sivathanu & Pillai, 2019), and that learning more advanced forms of people analytics.
Secondly, we outline six potential perils of people Secondly, our paper raises organisations’ awareness
analytics and scrutinise their far-reaching implications that organisations should not simply transfer other
for organisations and employees, as well as their nega­ areas’ analytics logic to the management and control
tive impact on decision-making, management and of their employees. Avoiding due attention to people
human skills. By unpacking the underlying assump­ analytics’ idiosyncrasies raises another critical point
tions, we offer a more nuanced and critical under­ regarding the emergence of the perils. In this respect,
standing of people analytics and its potential our contribution offers added value for organisations,
ramifications and implications. Our article extends managers and employees by helping them not only to
prior systematic reviews (e.g., Fernandez & Gallardo- see the benefits of using people analytics, but also to be
Gallardo, 2020; Marler & Boudreau, 2017; sensitised in advance to the possible negative conse­
Tursunbayeva et al., 2018) by synthetising the parti­ quences of inadequate implementation and an unre­
cularities of people analytics and explaining why flective use of people analytics. Our study alerts
ignoring them could entail considerable risk, which organisations to possible pitfalls and draws attention
in turn can lead to the emergence of potentially harm­ to phenomena, such as emerging reductionism and
ful perils. Moreover, we provide a comprehensive and self-fulfiling prophecies, and their far-reaching conse­
holistic overview of potential pitfalls that can result quences for organisational processes, managers and.
from recent technological advances in people analy­ employees. Thereby, we not only raise organisations’
tics. Thereby, our article goes beyond prior conceptual and human decision-makers’ awareness, but also con­
and empirical work that studied and theorised about tribute to a more responsible and sustainable use of
only some of these potential drawbacks and did not people analytics, which in turn will ultimately lead to
examine them comprehensively. Further, we expand a more sustainable use of human resources in
critical information systems research on the dark sides organisations.
of AI by thoroughly reviewing, synthesising and inte­ Finally, our proposed matrix helps managers to
grating literature from diverse disciplines and fields, better understand the perils arising from people
such as people analytics, business analytics, human analytics and to trace how these perils can evolve
resource management, algorithmic management, and with increased levels of maturity. Additionally, it
decision-making. enables managers to attend to what is important
when they use descriptive, predictive, prescriptive
and autonomous analytics, and recognise the indi­
4.2. Practical implications
cations of the perils’ emergence. Managers have
Additionally, our paper has three practical implica­ a key role to play in decision-making processes
tions for organisations and managers dealing with and especially in technology decisions
the implementation and use of people analytics. (Orlikowski, 2000). They link organisational goals
First, we contribute to more realistic expectations on and the operational staff, and therefore should be
people analytics’ technologies at work by uncovering open, but at the same time also critically reflect
their underlying simplistic and partly misplaced the application of new and previously unknown
assumptions. The numerous promises associated technologies. Further, they should continue to
with people analytics, as well as their apparent super­ believe in their own abilities, such as their intui­
iority, which are further propagated by many technol­ tive judgment, reasoning and critical reflection
ogy vendors, can cause misguided understandings of and use people analytics to complement, not
people analytics’ possibilities and an overstatement of replace, their skills. Additionally, organisations
their overall effectiveness. This can lead organisations should (continue to) realise that their managers
and managers to relying on people analytics too heav­ and employees are their most valuable resource
ily and to use them without due care, critical reflection and recognise the need to keep the human deci­
or sufficient knowledge. Thus, we argue that the perils sion-makers in the loop. Consequently, the skills
are not a necessary evil and inevitable consequence of managers have should not be substituted by out­
using people analytics, but that they can arise from put-oriented technologies. Instead, organisations
inflated expectations and might be aggravated by an should trust their decision-makers, give them the
unreflected use of people analytics. By exposing and freedom to speak out if people analytics’ decision-
mitigating the misplaced assumptions, our paper can making recommendations go against their own
help organisations to develop a more balanced and insight, thus promoting an open culture of
holistic view on the capabilities and risks of imple­ exchange and error. In order to learn from one
menting people analytics. To generate realistic expec­ another when dealing with new data-driven
tations, it is important that companies frame these approaches and technologies (such as people ana­
assumptions in accordance with the potential risks, lytics dashboards), we recommend a broad dis­
outlining people analytics’ possibilities and limitations course and continuous exchange between
already when they start to implement the technologies. management, HR and employees.
5. Conclusion

The increasing introduction of learning algorithms is ushering in a new era of analytics. Therefore, it is not surprising that organisations also increasingly deploy people analytics for making evidence-based decisions and managing their workforce in a data-driven approach. Based on a systematic literature review of people analytics, we have explicated five major themes in the current people analytics literature. Furthermore, to account for recent technological advances such as learning algorithms, we propose a fourth maturity level: autonomous analytics. Moreover, based on a broad theorising review of the organisational expectations associated with the use of learning algorithms and AI, we have identified six perils of people analytics. Each of the perils has profound implications for organisations and employees alike. By drawing attention to the tensions in which people analytics is embedded, our research serves as a systematic starting point for future research to shed light on the negative consequences of people analytics. The question is not whether people analytics will monitor, determine and optimise an increasing portion of our working environment in the future; rather, it is how we can reap the positive rewards this process offers, while respecting the complexity of the human condition. The use of people analytics will transform the future of work and of human decision-making. It is incumbent upon us to ensure that, through dedicated research, this future will be bright.

Acknowledgement

We are grateful to the Associate Editor and four insightful Reviewers for developing the potential of this work.

Disclosure of potential conflicts of interest

No potential conflict of interest was reported by the author(s).

ORCID

Ulrich Leicht-Deobald http://orcid.org/0000-0003-4554-7192

References

Aguinis, H., Ramani, R. S., & Alabduljader, N. (2020). Best-practice recommendations for producers, evaluators, and users of methodological literature reviews. Organizational Research Methods, 1–31.
Akin, J. (2020, July 16). The truth about false positives and employment drug screens. GoodHire. https://www.goodhire.com/blog/the-truth-about-false-positives-and-employment-drug-screens
Alvesson, M., & Sandberg, J. (2011). Generating research questions through problematization. Academy of Management Review, 36(2), 247–271. https://doi.org/10.5465/amr.2009.0188
Ananny, M. (2016). Toward an ethics of algorithms: Convening, observation, probability, and timeliness. Science, Technology & Human Values, 41(1), 93–117. https://doi.org/10.1177/0162243915606523
Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989. https://doi.org/10.1177/1461444816676645
Andersen, M. K. (2017). Human capital analytics: The winding road. Journal of Organizational Effectiveness, 4(2), 133–136. https://doi.org/10.1108/JOEPP-03-2017-0024
Angrave, D., Charlwood, A., Kirkpatrick, I., Lawrence, M., & Stuart, M. (2016). HR and analytics: Why HR is set to fail the big data challenge. Human Resource Management Journal, 26(1), 1–11. https://doi.org/10.1111/1748-8583.12090
Appelbaum, D., Kogan, A., Vasarhelyi, M., & Yan, Z. (2017). Impact of business analytics and enterprise systems on managerial accounting. International Journal of Accounting Information Systems, 25, 29–44. https://doi.org/10.1016/j.accinf.2017.03.003
Aral, S., Brynjolfsson, E., & Wu, L. (2012). Three-way complementarities: Performance pay, human resource analytics, and information technology. Management Science, 58(5), 913–931. https://doi.org/10.1287/mnsc.1110.1460
Baesens, B., De Winne, S., & Sels, L. (2017). Is your company ready for HR analytics? MIT Sloan Management Review, 58(2), 19–21.
Barocas, S., Hood, S., & Ziewitz, M. (2013). Governing algorithms: A provocation piece. New York University.
Batistic, S., & van der Laken, P. (2019). History, evolution and future of big data and analytics: A bibliometric analysis of its relationship to performance in organizations. British Journal of Management, 30(2), 229–251. https://doi.org/10.1111/1467-8551.12340
Bekken, G. (2019). The algorithmic governance of data driven-processing employment: Evidence-based management practices, artificial intelligence recruiting software, and automated hiring decisions. Psychosociological Issues in Human Resource Management, 7(2), 25–30. https://doi.org/10.22381/pihrm7220194
Benbya, H., Davenport, T. H., & Pachidi, S. (2020). Special issue editorial: Artificial intelligence in organizations: Current state and future opportunities. MIS Quarterly Executive, 19(4), ix–xxi. https://doi.org/10.2139/ssrn.3741983
Ben-Gal, H. (2019). An ROI-based review of HR analytics: Practical implementation tools. Personnel Review, 48(6), 1429–1448. https://doi.org/10.1108/PR-11-2017-0362
Bhardwaj, S., & Patnaik, S. (2019). People analytics: Challenges and opportunities - A study using Delphi method. IUP Journal of Management Research, 18(1), 7–23.
Bhattacharya, S., Wang, Y., & Xu, D. (2010). Beyond Simon’s means-ends analysis: Natural creativity and the unanswered ‘why’ in the design of intelligent systems for problem-solving. Minds and Machines, 20(3), 327–347. https://doi.org/10.1007/s11023-010-9198-7
Bodie, M. T., Cherry, M. A., McCormick, M. L., & Tang, J. (2016). The law and policy of people analytics. University of Colorado Law Review, 88(4), 961–1042. http://lawreview.colorado.edu/wp-content/uploads/2017/05/10.-88.4-Bodie_Final.pdf
Boot, C. R. L., Van Drongelen, A., Wolbers, I., Hlobil, H., Van Der Beek, A. J., & Smid, T. (2017). Prediction of