
Academy of Management Review
2021, Vol. 46, No. 3, 534–551.
https://doi.org/10.5465/amr.2019.0178

THE ROLE OF ARTIFICIAL INTELLIGENCE AND DATA NETWORK EFFECTS FOR CREATING USER VALUE
ROBERT WAYNE GREGORY
University of Virginia

OLA HENFRIDSSON
University of Miami

EVGENY KAGANER
Moscow School of Management SKOLKOVO

HARRIS KYRIAKOU
ESSEC Business School

Some of the world’s most profitable firms own platforms that exhibit network effects. A platform exhibits network effects if, the more that people use it, the more valuable it becomes to each user. Theorizing about the value perceived by users of a platform that exhibits network effects has traditionally focused on direct and indirect network effects. In this paper, we theorize about a new category of network effects—data network effects—that has emerged from advances in artificial intelligence and the growing availability of data. A platform exhibits data network effects if, the more that the platform learns from the data it collects on users, the more valuable the platform becomes to each user. We argue that there is a positive direct relationship between the artificial intelligence capability of a platform and the value perceived in the platform by its users—a relationship that is moderated by platform legitimation, data stewardship, and user-centric design.

We sincerely thank associate editor Allan Afuah and three anonymous reviewers for their constructive guidance on developing the paper. We are also grateful for the opportunity to participate in the AMR Theory Development Hackathon held at IESE Business School in Barcelona in February 2019.

Network effects make crucial contributions to the value that users perceive in the products, services, or platforms of some of the world’s most valuable firms (e.g., Apple, Microsoft, Facebook). A platform or one of its products or services exhibits network effects if, the more that people use it, the more valuable it becomes to each user (Church & Gandal, 1992; Farrell & Saloner, 1985, 1986; Katz & Shapiro, 1985, 1986, 1992; Sheremata, 2004; Suarez, 2005). For example, a social network such as Facebook exhibits network effects because, the more that people use it, the more valuable it becomes to each user, since more users mean more people to interact with (Afuah, 2013; Van Alstyne, Parker, & Choudary, 2016). Because of the immense impact that network effects can have on the value that users perceive in a platform, many scholars have theorized about their nature and consequences for user value (Cennamo & Santaló, 2013; Eisenmann, Parker, & Van Alstyne, 2011; Gawer, 2009; Majumdar & Venkataraman, 1998; McIntyre & Srinivasan, 2017; McIntyre, Srinivasan, Afuah, Gawer, & Kretschmer, 2020; Parker & Van Alstyne, 2005; Priem, Butler, & Li, 2013; Rochet & Tirole, 2003).

To date, research has focused on two categories of network effects: direct network effects and indirect network effects (Clements, 2004; McIntyre & Srinivasan, 2017). In the case of “direct network effects,” the value that users derive from a network comes from users being able to interact directly with one another (Rochet & Tirole, 2003; Zhu & Iansiti, 2012). For example, network effects on social media platforms primarily stem from users interacting directly with each other. In the case of “indirect network effects,” the more people who use a product, the higher is the likelihood of increased availability and variety of complements of the product, thereby increasing the value of the product to each user (Boudreau, 2012; Church, Gandal, Krause, & Canada, 2008; Clements & Ohashi, 2005). For example, the more users who are attracted to a mobile ecosystem, the greater are the incentives for development and thus the diversity of apps, resulting in more perceived user value of products within that mobile ecosystem. In sum, extant research has effectively explored the impact of network effects on the value perceived by users in terms of both direct and indirect network effects.

However, little attention has been paid to “data network effects” as an emerging category of network effects. A platform exhibits data network effects if, the more that the platform learns from the data it collects on users, the more valuable the platform becomes to each user. For example, the more that Google learns about users and the searches that they conduct, the more it can individualize the experience, making the search engine more valuable to each user. Similarly, the more that Tesla optimizes its self-driving algorithms by feeding them with billions of miles worth of driving data it gathers from in-car sensors, cameras, and radar units, the greater is the perceived value of Tesla cars.

In this paper, we explore the role of artificial intelligence (AI) and data network effects for creating user value, especially in the context of multisided platforms. The starting point is the observation that the value each user perceives depends on the scale of data-driven learning and improvements realized with AI. Such learning and improvements typically rely on faster and better predictions through applications of machine learning grounded in data (Agrawal, Gans, & Goldfarb, 2018; Samuel, 1959). For example, music-streaming services use machine-learning techniques to continuously learn about users’ listening preferences and improve their recommendation engine, making the platform more valuable to each user.

Assuming direct connections and multisided exchange between users or user groups (e.g., Farrell & Saloner, 1985; Katz & Shapiro, 1985), prior network effects literature cannot readily explain why data-driven learning and improvements on a platform contribute to user value through data network effects. In this research, we therefore examine the role of AI and the “computer in the middle of every transaction” (Varian, 2014: 1) to address the following research question: “How is value for each user of the platform created from data with AI?”

We develop a model of data network effects that complements and extends existing network effects theory (Cennamo & Santaló, 2013; Church et al., 2008; Farrell & Saloner, 1986; Katz & Shapiro, 1986; Parker & Van Alstyne, 2005; Rochet & Tirole, 2003). Based on the premise of same-side or multisided exchange among users, network effects theory posits that a growing network of interconnected users gives rise to network externalities, where a user’s utility of a platform is a function of the total number of users (Katz & Shapiro, 1985). In this paper, we propose that “platform AI capability”—that is, the ability of a platform to learn from data to continuously improve its products or services for each user—gives rise to new platform externalities, where a user’s utility of a platform is a function of the scale of data-driven learning and improvements realized with AI. These improvements manifest in greater product functionality, platform quality, and experience for each user (Cennamo & Santaló, 2013, 2019; Himan, 2002; McIntyre & Srinivasan, 2017; Zhu & Iansiti, 2012; Zhu & Liu, 2018). Our model of data network effects explains this novel phenomenon.

CONCEPTUAL BACKGROUND

Before presenting the model, we first summarize the background information about AI and network effects that is needed to understand the causal arguments of the model.

Artificial Intelligence

Brian Arthur (2009) proposed three principles through which we can understand advanced technologies such as AI: combination, recursiveness, and phenomena. First, while AI pioneers in the 1950s such as John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon projected that “every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it” (McCorduck, 2004: 111), today’s application of AI exhibits a more modest ambition by combining technologies in particular functional domains (Raisch & Krakowski, 2020). Advances in machine learning offer a novel approach to specific decision-making tasks and business problems (Finlay, 2017). In turn, such machine learning draws on the dramatically improved performance–price ratio of computer-processing technology, data storage and management, and network technologies (Agrawal et al., 2018; Yoo, Henfridsson, & Lyytinen, 2010). In combination, these technologies make AI an important tool for enabling platforms, products, or services to generate user value. For example, navigation services leverage data collected about users to offer dynamic turn-by-turn navigation based on continuously improved predictions of traffic situations. The perceived value for each user increases as predictions benefit from the increasing processing and networking power of computing devices such as smartphones.

Second, AI applications exhibit a modular architecture (Garud & Karnøe, 2003; Schilling, 2000) making up a complex network of technologies wherein each technology is developed independently with its own set of design objectives. This creates recursiveness that influences the perceived user value of the service employing AI. AI consists of technologies, which in turn consist of technologies. As improvements in one piece of technology are accomplished, this may conflict with the design objectives of another piece of technology. For instance, consider how the improvements in the recommendation engine of a service may improve its convenience, personalization, and ease of use. However, the improvements in the engine may cause privacy concerns and require improvements in privacy protection and the cybersecurity technology used. In fact, in the wake of recursiveness, firms leveraging AI may face scrutiny from key stakeholder audiences regarding their collection and use of personal data, the lack of transparency in their decisions (e.g., automated loan decision-making), and how they deal with the errors stemming from biases oftentimes inherited when algorithms use data collected on users (Ahsen, Ayvaci, & Raghunathan, 2019).

Finally, resonating with Arthur’s (2009) third principle—that of phenomena—today’s AI has data-driven learning at its center (Meyer et al., 2014). For example, rather than programming explicit rules for recognizing a cat or a dog, neural networks (a form of machine learning algorithms) are capable of teaching themselves classification if trained with a prelabeled data set. In practice, training machine learning algorithms involves much tinkering and experimentation, iteratively learning from data to detect patterns and predict outcomes faster and more accurately (Agrawal et al., 2018). In this regard, the value of AI technologies is based on the existence of big data (McAfee & Brynjolfsson, 2012; Varian, 2014). “Big data” refers to very large volumes of data; the ability to process and transmit that data at a high velocity; the existence of an increasing variety of data sources, including social networks, mobile devices, connected things, and open data (weather, traffic, maps, etc.); and the challenge of ensuring veracity so that data sources truly represent reality (Baesens, Bapna, Marsden, Vanthienen, & Zhao, 2016). The volume, variety, and veracity of data make important contributions to predictive model development from which users will more likely benefit. In addition, trained prediction models enable data-driven products and services to continuously learn and improve on the basis of feedback data from users who share a variety of personal data at a high velocity.

Network Effects

The concept of “network effects” is predicated on the notion that network externalities give rise to value creation through direct connections or multisided exchange among individual users or different groups of users on opposite sides of the market. As outlined by Katz and Shapiro (1985), the utility that a user derives from a platform is a function of the total number of users, as the scale of the network gives rise to consumption or network externalities. Using network size as the main determinant of user value (Parker & Van Alstyne, 2018; Suarez, 2005), network effects theory examines how increases in the network size of one user group may produce a virtuous cycle with increases in the network size of either the same user group (direct network effects) or another user group, providing complements to the platform (indirect network effects) (Church & Gandal, 1992; Church et al., 2008; Katz & Shapiro, 1992; Rochet & Tirole, 2003, 2006; Schilling, 2002).

However, the broad adoption and diffusion of AI on today’s platforms warrants another look at network effects. In particular, it should be noted that “a computer in the middle of every transaction” (Varian, 2014: 1) does not merely provide connectivity and possibilities for exchange among users. It also gives rise to new data networks that platform companies explore and exploit with the help of AI. By means of automation or augmentation (Raisch & Krakowski, 2020), AI enables significant scaling of the learning from the data collected on users as they leave digital traces of interconnections with things, people, and organizations in their daily life. Whether or not these new processes of data-driven value creation and capture have a positive or negative effect on the perceived value for each user of the platform (Stucke & Ezrachi, 2016, 2018; Tucker, 2019), they give rise to new platform externalities (Himan, 2002) underlying the concept of data network effects. The utility that a user derives from a platform is then a function of the scale of data-driven learning and improvements realized with AI. The resulting data network effect, in terms of the increase in user value, may manifest in superior (inferior) functionalities of the products delivered through the platform, a more (less) personalized and meaningful experience for each user, or other aspects of platform quality (see, e.g., Cennamo & Santaló, 2013, 2019; Himan, 2002; McIntyre & Srinivasan, 2017; Zhu & Iansiti, 2012; Zhu & Liu, 2018).
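This contrast can be stated in simple notation (ours, added for illustration; it is not a formal model drawn from the network effects literature). Let $n$ denote the number of users and $\ell$ the scale of data-driven learning and improvement realized with AI. Then the utility $u_i$ that user $i$ derives from the platform shifts from

    u_i = v(n)

under classic direct and indirect network effects (Katz & Shapiro, 1985) to

    u_i = v(n, \ell), \qquad \frac{\partial u_i}{\partial \ell} > 0

under data network effects, where utility also rises with the scale of learning, holding network size constant.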

Afuah (2013) provides a good starting point for exploring these data network effects, because of the work’s departure from focusing on network size as the only determinant of user value. In particular, Afuah (2013) proposed network structure and conduct as additional factors that contribute to user value on multisided platforms. For instance, network structure may vary in terms of how it contributes to user value through the degree of transaction feasibility. The feasibility of transactions depends not only on the existence of connectivity among users but also on the availability of data and useful information on the network to achieve the best possible matches between supply and demand and to enable each individual user to make more informed decisions on entering and executing transactions (Chen & Horton, 2016). Uber, for example, uses machine learning algorithms on its platform to analyze data collected on each user in real time and to improve both the algorithmic matching as well as the information and experience offered to each user who decides to engage in exchanges among riders and drivers. As this example illustrates, the feasibility of transactions also depends on the extent to which each user is actively engaged in using the platform or its products and services. To this end, Uber is known for using techniques of behavioral nudging to inform and engage users with the help of push notifications and messages providing intelligent recommendations that adapt to changing contextual and situational circumstances (Rosenblat, 2018).

The growing influence of learning from data on the network in the era of AI also becomes evident when considering network conduct (Afuah, 2013), another factor contributing to user value. For instance, not all users of the network are necessarily rational and have identical information about one another and the possible transactions. This may result in opportunistic behavior that makes the platform less valuable, on average, to users. For example, some Uber drivers may try to game the system by going offline to avoid fulfilling passenger requests that they find less lucrative than those received on an alternative ride-hailing platform that they use in parallel. By learning from data collected on each user through the use of machine learning algorithms on the platform, Uber tries to prevent what it views as fraudulent behavior, an instance of opportunistic behavior. On the other hand, users of a platform may earn a reputation for being trustworthy, dependable, and honest, which may positively impact the contribution of network conduct to user value because this reputation serves as a signal to other users and motivates exchange. In the example of Uber, drivers and passengers rate each other. A positive five-star rating helps a driver obtain repeated jobs, while lower ratings create an important barrier for deriving value from the platform. Finally, the perception of trust in the platform, the object of exchange, or the exchange partners themselves may also play an important role in users engaging on the platform and obtaining benefits. In the case of Uber, the behavior and exchange relationship between passengers and drivers are governed by the use of machine learning algorithms on the platform (Rosenblat, 2018).

In sum, the underlying mechanism of how data network effects contribute to user value influences the network by increasing the scale of learning from the data collected on users through the use of AI. To substantiate this claim, we develop a framework addressing our research question of how the value for each user of the platform is created from data through AI.

A FRAMEWORK FOR EXPLORING THE ROLE OF AI AND DATA NETWORK EFFECTS FOR CREATING USER VALUE

Figure 1 shows our framework for explaining the role of AI and data network effects for creating “user value,” defined as the value that users perceive in the platform (e.g., Facebook) or its products and services (e.g., News Feed, Pages). The data network effects themselves are manifested in the positive direct relationship between the AI capability of a platform and the value of the platform as perceived by its users—a relationship that is moderated by platform legitimation, data stewardship, and user-centric design.

The framework is based on the following set of assumptions:

(1) The “computer in the middle of every transaction” (Varian, 2014: 1) turns AI-enabled platforms into flexible infrastructures that are capable of learning (Assumption 1). For example, in addition to employing plenty of human labor, social media sites such as Facebook and Inke, one of the largest Chinese live-streaming companies, use machine learning algorithms to help moderate (e.g., find and remove) toxic content, including spam, hate speech, nudity, violence, and terrorist propaganda.

FIGURE 1
Model of Data Network Effects

[Diagram: “Artificial intelligence capability of a platform” (speed of prediction, accuracy of prediction) has a direct positive arrow to “Perceived user value.” Three moderators act on this relationship: “Data stewardship” (data quantity, data quality), “Platform legitimation” (personal data use, prediction explainability), and “User-centric design” (performance expectancy, effort expectancy).]

(2) The strategic role of machine learning in today’s platforms highlights data as a key input into learning and value creation, turning data into a valuable asset (Assumption 2). For example, “Facebook, Uber, and Spotify operate technology platforms where their entire value lies in the relationships they create and the information they hold” (Birkinshaw, 2018: 204, emphasis added).

(3) Consumerization (Gabriel, Korczynski, & Rieder, 2015) has blurred the line between consumption and production, turning users into prosumers who cocreate value (Assumption 3). For example, content creators on social media platforms such as YouTube simultaneously consume and produce marketing content, effectively cocreating value with brands and other YouTubers.

(4) The fact that a few large platform firms (e.g., Facebook, Google) dominate the information economy by capturing a disproportionate and growing share of the value (Iansiti & Lakhani, 2017) has given rise to concerns about the firms’ massive influence. Indeed, platform AI capability alters the behaviors, attitudes, expectations, and emotions of people participating in elections, protests, education, and so forth, affecting the interests of a wide range of stakeholders often in conflicting ways. This suggests that, for long-term success, platform owners must balance diverse stakeholder interests (Assumption 4).

Drawing on this set of assumptions, we explain our framework in the following sections (see also Figure 1).

Platform AI Capability

We suggest that the engine driving data network effects is “platform AI capability,” defined as the ability of a platform to learn from data to continuously improve its products and services for each user (Assumption 1) (Figure 1). The main mechanism through which platform AI capability may enhance perceived user value is by improving prediction (Meinhart, 1966). “Prediction” describes the ability of a system to draw upon existing data about the past and present to generate information about the future (Churchman, 1961). This information can help forecast future events or provide recommendations for action (Agrawal et al., 2018). For example, a creditworthiness decision made by a lending platform involves predicting the likelihood that someone will pay back a loan, drawing upon existing data on users and past transactions. Another example is the detection of fraudulent credit card transactions, which increasingly relies on machine learning algorithms trained by data scientists and domain experts.

Prediction enabled by machine learning works through what Herbert A. Simon (1995: 110) called “learning from examples”: “A number of systems have been constructed that learn from their own problem-solving efforts, or from the successful problem-solving efforts of others in the form of worked-out examples of problem solutions.” For example, to develop a reliable fraud detection model, as in the example given above, a balanced training data set with past fraudulent and nonfraudulent examples of credit card transactions must be created and fed into the machine learning algorithm during training.
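As an illustrative sketch of this “learning from examples” logic (a minimal example of ours, not the procedure of any platform discussed here; scikit-learn is assumed, and the file and feature names are hypothetical):

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # Hypothetical balanced training set: equal numbers of fraudulent (is_fraud=1)
    # and nonfraudulent (is_fraud=0) credit card transactions, as the text prescribes.
    data = pd.read_csv("transactions_balanced.csv")
    X = data[["amount", "merchant_risk", "hour_of_day", "txns_last_24h"]]
    y = data["is_fraud"]

    # Hold out a test split so reported performance reflects unseen cases.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)  # the model "learns from examples"
    print(classification_report(y_test, model.predict(X_test)))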

Under certain circumstances, which include training the machine learning algorithms with adequate data sets, machine-generated predictions can help avoid human cognitive biases in making assessments and forming judgments. For instance, in discussing how to deal with the known overconfidence bias, in which a person’s subjective confidence in their judgments is greater than the objective accuracy of those judgments, Kahneman and Tversky (1977: 4–7) stated the following: “The most radical suggestion is to replace such assessments by computation.”

Effectively, computations enabled by a platform AI capability can result in higher speed and accuracy of prediction (Agrawal et al., 2018). Both types of improvements and their effect on the characteristics of network structure and conduct (Afuah, 2013) have to be taken into account to understand how platform AI capability impacts perceived user value.

Speed of prediction. Users participating in the platform’s network are free agents, empowered by the use of products and services offered by the platform. For example, a Facebook user autonomously decides what to post and when, and an Uber driver decides when, where, and how long to drive and whether to accept or reject ride requests. As a result of the users’ autonomy, exchange relationships involving interactions among users are typically bounded in time and affected by a myriad of actions taken at that very moment by other users. For example, Twitter users may retweet messages within seconds, directly influencing other users to engage or disengage in further information exchange on the platform. Such actions may lead to a rapid reconfiguration of the network’s structure, which can potentially impede new interactions or manifest in opportunistic behaviors by actors pursuing information asymmetries (Afuah, 2013) and misinformation campaigns (O’Connor & Weatherall, 2019).

A platform AI capability offering a greater speed of prediction helps offset such value-destroying dynamics and foster value-enhancing interactions among users by minimizing the time between when a salient change in the network structure or conduct occurs and when the platform detects this change and generates user action recommendations to influence the network. Indeed, in an ideal scenario, the platform makes instantaneous predictions and anticipates any network dynamics that destroy user value based on the state of the network of users at the exact moment a given transaction is being carried out. For example, Uber tries to prevent fraudulent behavior such as prearranged trips between riders and drivers that limit open competition by letting its algorithms monitor signs of fake trips (e.g., requesting, accepting, and completing trips on the same device or with the same payment profile, excessive promotional trips, excessive cancellations) in real time for faster prediction and action recommendations or sanctions to enforce rules more quickly. Similarly, Facebook tries to detect misinformation more quickly to prevent false news from spreading by employing machine learning algorithms that help identify faster what stories might be false or which accounts will more likely post false news before letting human fact-checkers do their work to moderate content and increase the perceived value of the platform.
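A minimal sketch of how such real-time signal monitoring could look (our illustration; the field names and thresholds are hypothetical, and a production system would rely on trained models rather than fixed rules):

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Trip:
        driver_device: str
        rider_device: str
        driver_payment: str
        rider_payment: str
        promo_applied: bool
        cancelled: bool

    def fake_trip_score(trip: Trip, recent: List[Trip]) -> int:
        """Count, for one incoming trip, the fake-trip signals named in the text."""
        score = 0
        # Trip requested and completed on the same device...
        if trip.rider_device == trip.driver_device:
            score += 1
        # ...or with the same payment profile.
        if trip.rider_payment == trip.driver_payment:
            score += 1
        # Excessive promotional trips in the account's recent history.
        if trip.promo_applied and sum(t.promo_applied for t in recent) >= 5:
            score += 1
        # Excessive cancellations in the recent history.
        if sum(t.cancelled for t in recent) >= 10:
            score += 1
        return score  # above some threshold, flag for review or sanction in real time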
Proposition 1a. The greater the speed of prediction, the higher the perceived user value is likely to be.

Accuracy of prediction. As illustrated by the examples given above, the learning enabled by platform AI capability not only occurs on the basis of data collected from the network but also influences the network by shaping interactions among users. This influence occurs by wrapping trained prediction models and machine learning features into the products and services offered by the platform, allowing them to function in a smarter and more adaptive way. The resulting agency of the platform exerts a strong influence on key network characteristics, including the perception of trust among network users and transaction feasibility (Afuah, 2013). As an example of the latter, the feasibility of transactions on a platform such as Uber depends not only on the ubiquitous availability and constant connectivity of the Internet and smartphones with installed apps but also on the availability of information generated through prediction (e.g., pushed information about available ride requests on the driver’s way home after a platform work shift). As an example of the influence of prediction on trust among interacting users, the information filtering, curation, and ranking performed by algorithms on Facebook has at times generated a greater perception of trust and, at other times, a weaker perception of trust in the network, depending on the accuracy of the prediction.

As these examples illustrate, a platform AI capability ensuring greater accuracy of prediction helps reduce deviations from what has been forecasted or recommended to what events or outcomes have actually occurred or what users truly want, increasing transaction feasibility and bolstering the perception of trust among network users. For example, when the Uber platform indicates an estimated arrival time of three minutes, but it takes the car 10 minutes to pick up a customer, the value of the platform to this particular user decreases. Similarly, inaccurate forecasts and misplaced action recommendations due to algorithmic biases (Lambrecht & Tucker, 2019), for example, may fuel malevolent behaviors and lead to a deterioration of trust in the network. Continued difficulties of Facebook algorithms to detect fake news stories offer a good illustration of this latter point (Bucher, 2016).

Proposition 1b. The greater the accuracy of prediction, the higher the perceived user value is likely to be.

Data Stewardship

Data are oftentimes referred to as the oil fueling the information economy (McAfee & Brynjolfsson, 2012; Perrons & Jensen, 2015; Varian, 2014). This suggests that data are a valuable asset (Assumption 2 within our framework), especially when they are used to nurture platform AI capability and help ensure value creation for each user. When supplied with sufficient quality and quantity of oil, the engine may provide much more value to its users. Similarly, we suggest that the effect of platform AI capability on perceived user value is moderated by data quantity and data quality. To ensure this strengthening effect, a firm must refine and extract value from data by means of “data stewardship,” defined as the enterprise-wide holistic management of a firm’s data assets to help ensure adequate data quantity and quality (Baesens et al., 2016; Cooper, Watson, Wixom, & Goodhue, 2000; Kitchens, Dobolyi, Li, & Abbasi, 2018; Otto, 2011; Ross, Weill, & Robertson, 2006; Wixom & Watson, 2001; Woerner & Wixom, 2015). Data stewardship acts as a mechanism of data network effects by helping fuel the engine, making the platform more valuable to each user through increased speed and accuracy of prediction (Agrawal et al., 2018).

To understand the moderating effect of data quantity and quality on the relationship between platform AI capability and perceived user value, consider the role of data in machine learning, as discussed above. Machine learning algorithms are fed with training data to iteratively adjust predictive models until they produce more accurate and relevant results for the users (Agrawal et al., 2018). Training machine learning algorithms with greater amounts of data leads to better prediction models (Simon, 1995, 1996) from which users will ultimately benefit. However, there are many examples in which machine learning algorithms trained on large data sets produce inaccurate prediction results (Khoury & Ioannidis, 2014). For example, IBM’s efforts to train machine learning algorithms to diagnose cancer and recommend treatment options, including their probabilities of success, have been greatly complicated by handwritten notes and local acronyms. Accordingly, we suggest that both data quantity and quality need to be considered as factors moderating the impact of platform AI capability on perceived user value.

Data quantity. Increased accuracy and speed of prediction—the main mechanisms through which platform AI capability positively impacts perceived user value—depend on the quantity of data used as an input to train and calibrate machine learning models. In their study of human prediction, Kahneman and Tversky (1977) distinguished between “singular information” (i.e., data consisting of evidence about a particular case) and “distributional information” (i.e., base-rate data describing the distribution of outcomes across many cases of the same class). A common reason for inaccurate predictions by a person is the tendency to rely too much on singular information, typically coming from a single case that the person is closely familiar with, and to underweight or ignore distributional information. This is called an “internal approach to prediction” (Kahneman & Tversky, 1977). To avoid this common bias in prediction, the particular case at hand needs to be compared with the distribution of cases of the same class, thus helping avoid biases in the interpretation of data. This is called an “external approach to prediction” (Kahneman & Tversky, 1977).

While computers do not suffer from motivational factors or limited cognitive information processing capacities that would make them attached to a particular case (Simon, 1991), the internal approach to prediction may still be present in machine learning if the training data set is not large enough and does not contain a sufficient range of cases of the same class. This will likely lead to misinterpretation of new cases that the algorithm is confronted with during usage, preventing the fast identification of emerging patterns and accurate predictions (Agrawal et al., 2018). The larger the volume of data about past cases, the greater the ability to build and train machine learning algorithms on a strong distributional data set that facilitates an external approach to prediction, thereby increasing the accuracy and speed of prediction.
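The moderating role of data quantity can be made concrete with a learning curve: holding the algorithm fixed, out-of-sample accuracy typically rises as the training set grows. A minimal sketch (our illustration; scikit-learn is assumed, with synthetic data standing in for data collected on users):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import learning_curve

    # Synthetic stand-in for labeled cases collected on a platform.
    X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)

    # Cross-validated test accuracy of the same algorithm trained on 1% ... 100% of the data.
    sizes, _, test_scores = learning_curve(
        LogisticRegression(max_iter=1000), X, y,
        train_sizes=np.linspace(0.01, 1.0, 5), cv=5)

    for n, score in zip(sizes, test_scores.mean(axis=1)):
        print(f"trained on {n:>5} cases -> mean accuracy {score:.3f}")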

For example, DeepMind’s AlphaGo system, which beat the former champion Lee Sedol in the board game called Go (a strategy game similar to chess in which each player seeks to enclose more territory on the board than their opponent), was trained using vast quantities of examples taken from a large number of games played by the best human Go players, allowing the machine to optimize its prediction and decision-making capabilities to the extent that its speed and accuracy outperformed the best Go player in the world. The limitations of relatively small quantities of data for training machine learning algorithms become apparent in another example. Hedge fund investors such as AQR Capital Management are increasingly relying on algorithmic trading—using a large variety and volume of data, from credit card records to satellite images of inventories to flight charters for private jets—to make more accurate predictions and more profitable investment decisions. Yet, the overall size of the data relative to the complexity of the events that these hedge fund managers are trying to forecast is still not large enough, highlighting again the importance of data quantity as an important moderator of the relationship between platform AI capability and perceived user value.

Proposition 2a. The greater the quantity of data for the training of machine learning algorithms on the platform, the stronger will be the relationship between platform AI capability and perceived user value.

Data quality. Increased accuracy and speed of prediction also depend on the quality of data used as input for training and calibrating machine learning models. Kahneman and Tversky (1977) explained that human predictors typically suffer from an overconfidence bias, whereby their certitude concerning a given estimate tends to be higher than that justified by the available evidence. This happens because of people’s tendency to form judgments that are consistent with their preferences and experience, as well as being due to the adoption of unverified assumptions and owing to cognitive anchoring, whereby an individual depends too heavily on an initial piece of information offered when making decisions (Kahneman & Tversky, 1977). While these cognitive limitations, in principle, can be overcome by computation and machine learning, avoiding the overconfidence bias in prediction requires the use of a data set that is complete, reliable, and appropriate for the task at hand (Kahneman & Tversky, 1977). In other words, the data must be of sufficient quality.

“Data quality” includes aspects of truthfulness (the degree of conformity between the recorded value and the actual value), completeness (the extent to which the recorded values exist for all observations), consistency (the degree to which the data have been measured in the same manner across cases), and timeliness (the speed by which data observations are updated in the event of change) (Ballou & Pazer, 1985; Constantiou & Kallinikos, 2015; Markus, 2015; McAfee & Brynjolfsson, 2012; Woerner & Wixom, 2015; Yoo, 2015). The better the quality of data, the greater is the likelihood of reducing or eliminating the prevalent overconfidence bias in prediction (Kahneman & Tversky, 1977), thereby strengthening the impact of platform AI capability on perceived user value.

For example, popular fare aggregators and travel metasearch engines such as Kayak.com offer several alternative routes alongside their prices for users to choose from to reach their desired destination. Making such recommendations, or prescriptions, requires not only generating a prediction for how long a flight sequence might take but also offering an indication of the degree of confidence for the recommendation to buy an airfare ticket for the given destination or to wait until better rates become available on the platform. The more truthful, complete, consistent, and timely the data set that the aggregator platform draws on—achieved, for example, through better integration with the reservation systems of airlines and travel agencies—the faster and better are the predictions and, thus, also the recommendations offered by the platform to each user.

Proposition 2b. The higher the quality of data for the training of machine learning algorithms on the platform, the stronger will be the relationship between platform AI capability and perceived user value.
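As a sketch of how these four dimensions could be operationalized as automated checks on a training table (our illustration; pandas is assumed and all column names are hypothetical):

    import pandas as pd

    def data_quality_report(df: pd.DataFrame, now: pd.Timestamp,
                            reference_prices: pd.Series) -> dict:
        """Rough proxies for the four data quality dimensions in the text."""
        return {
            # Truthfulness: conformity between recorded and actual values, checked
            # here against an external reference (e.g., an airline reservation system).
            "truthfulness": (df["price"] == reference_prices).mean(),
            # Completeness: the extent to which recorded values exist for all observations.
            "completeness": 1.0 - df.isna().mean().mean(),
            # Consistency: share of records measured in the same manner (here, one currency).
            "consistency": (df["currency"] == "USD").mean(),
            # Timeliness: how recently observations were updated.
            "median_staleness_days": float((now - df["updated_at"]).dt.days.median()),
        }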

User-centric design. The perceived value of a fueled engine is likely to be only as strong as the design of the car in which the engine is installed, because the design shapes the experience of the driver. Similarly, platform AI capability trained with adequate quantities and quality of data may help create AI models that provide greater speed and accuracy of prediction from which users may perceive value; the better that these trained AI prediction models are wrapped into well-designed products and services through which users can directly experience the benefits of platform AI capability, the stronger the perceived value of the platform AI capability to each user is likely to be. We argue that, to create value for users, firms designing products and services in the era of AI must adapt to “consumerization,” a process involving the widespread adoption and diffusion of consumer digital technologies by people across society (Gregory, Kaganer, Henfridsson, & Ruch, 2018). By empowering users to cocreate value with their personal data, consumerization blurs the line between consumption and production (Gabriel et al., 2015), effectively turning users into prosumers (Assumption 3 in our framework). Firms adapt to consumerization by adopting “user-centric design,” defined as becoming closer to users and better understanding their needs to help increase the performance and effort expectancy of the products and services.

User-centric design involves applying “design to get closer to users and better understand their needs” (Verganti, 2008: 436). By better understanding real user needs and designing the platform’s products and services in a way that closely meets their expectations, habits, whims, and desires (Gabriel et al., 2015), user-centric design empowers and engages users to cocreate value by contributing with their feedback and personal data to the ongoing improvement and tuning of AI models and features of the platform. Therefore, user-centric design acts as another key mechanism of data network effects by helping users experience the supplied engine and making the increased speed and accuracy of prediction afforded by platform AI capability more accessible and beneficial to each user. To achieve this outcome, user-centric design must foster user engagement.

One way to conceptualize user engagement on platforms is to consider the intensity with which users interact with the platform’s products and services, ranging from complete avoidance to skilled and committed use (Klein & Sorra, 1996). Platform businesses typically capture user engagement by reporting the number of daily and monthly active users, where “active” corresponds to a certain threshold of committed use with regard to a particular product or service. A high level of committed use across a broad range of a platform’s products and services makes AI-enabled predictions more accurate because it increases the availability of user feedback about the outcome (e.g., user chooses option A) following each instance of forecast or prescription (e.g., user is presented with options A, B, and C). Every user interaction with a platform offers an opportunity to test certain features of a product or service, and, therefore, to improve the prediction models created by machine learning algorithms to make the user experience more personalized and tailored to the unique identity of each user (Adler & Kwon, 2002). For example, the video streaming service Netflix runs 50 concurrent experiments per user at any given point in time aimed at driving better personalization and continuously developing the feature set of its user applications (Gomez-Uribe & Hunt, 2016). We suggest that both the performance expectancy and effort expectancy of designed products and services need to be considered as factors moderating the impact of platform AI capability on perceived user value by influencing committed use and driving user engagement.
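The feedback loop described here, in which each prescription together with the user’s observed choice becomes a new labeled example, can be sketched with an online learner (our illustration; scikit-learn is assumed and the helper names are hypothetical):

    from sklearn.linear_model import SGDClassifier

    # An online learner supports incremental updates as user feedback arrives.
    model = SGDClassifier(loss="log_loss")
    model.partial_fit([[0.0, 0.0]], [0], classes=[0, 1])  # bootstrap with a dummy example

    def record_feedback(shown_options, chosen_option, features_of):
        """Each interaction yields labeled examples: chosen=1, not chosen=0."""
        X, y = [], []
        for option in shown_options:       # e.g., options A, B, and C
            X.append(features_of(option))  # hypothetical feature extractor (2-dim here)
            y.append(1 if option == chosen_option else 0)
        model.partial_fit(X, y)  # committed use -> more feedback -> better predictions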

Performance expectancy. To ensure user engagement, the design of the platform’s products and services needs to incorporate considerations of performance expectancy. “Performance expectancy” is defined as the degree to which an individual believes that using the system will help them attain gains in job performance (Venkatesh, Morris, Davis, & Davis, 2003). Based on Assumption 3 of our framework, we view the “job,” in the context of platforms, as a series of tasks that the user carries out in a given context by using the platform’s products and services (Christensen & Raynor, 2003). The performance will likely be evaluated by users through assessing the extent to which they believe that the adoption of the platform’s products and services will help them satisfy their needs and meet their expectations while performing their job. Performance expectancy is the strongest predictor of the user intention to use the system in both voluntary and mandatory settings (Venkatesh et al., 2003) and is therefore a key determinant of committed use (Klein & Sorra, 1996), which is the basis for iterative improvements in predictive models created by machine learning algorithms based on user feedback (Agrawal et al., 2018). Accordingly, performance expectancy is likely to strengthen the impact of platform AI capability on perceived user value.

For example, consider a scenario in which a customer needs to travel in a car from point A to point B in the fastest and most convenient way possible under the condition of intense city traffic. Waze, a turn-by-turn navigation app, is enabled by a feature called “floating car data” that determines the traffic speed on the road network based on a collection of local data, speed, direction of travel, and time information from mobile phones in vehicles on the road. As this feature is wrapped into the app through user-centric design that induces users to use Waze every day even though they may know the way to their destination, the app continuously supplies the underlying platform AI capability with new crowdsourced feedback data, helping it improve its predictions on an ongoing basis. As a result, Waze is able to increase the perceived value of the platform for each user through faster and more accurate rerouting based on changing traffic flows.

Proposition 3a. The higher the performance expectancy of the platform’s products and services, the stronger will be the relationship between platform AI capability and perceived user value.

Effort expectancy. The level of user engagement with a product or service on the platform is also a function of “effort expectancy,” defined as the degree to which an individual user believes that using the system will be free of effort (Venkatesh et al., 2003). Similar to performance expectancy, effort expectancy also shapes the user intention to adopt the system, and, by extension, the level of committed use. The easier it is to use a product or service on the platform, the more likely it is that users will adopt and use it in a committed way, allowing for further data-driven improvement of the underlying AI models and features based on user feedback and creating more value for each user. Thus, user beliefs reflecting higher effort expectancy will likely increase the level of committed use of the platform’s products and services, prompting user feedback that is necessary to make the AI-enabled predictions more accurate.

For example, voice assistants such as Apple’s Siri, Google Now, and Microsoft’s Cortana found a way to combine complex machine learning technology—deep neural networks, hybrid emotional inference models, as well as natural language processing and generation—with highly accessible user interface designs that rely on voice interaction as a more natural and intuitive way for humans to interact with the machines and use products and services on the respective platforms of each voice assistant service (e.g., Apple’s iOS). As a result, the perceived effort expectancy of these voice assistant services is very high, contributing to their widespread adoption and engaged use, which in turn helps continuously improve predictions and behavior on the basis of user feedback data that increases the perceived value of the platform.

Proposition 3b. The higher the effort expectancy of the platform’s products and services, the stronger will be the relationship between platform AI capability and perceived user value.

Platform Legitimation

A car may be nicely designed and powered by a good engine supplied with sufficient quantities of high-quality oil, but people will still only want to use the car if they also consider it safe and secure and the perceived risk of an accident is low. Drawing on this analogy, platform owners must balance diverse stakeholder interests (Assumption 4 in our framework) to mitigate the perceived risks related to data privacy and security (Cavoukian & Chanliau, 2013; Kroener & Wright, 2014) as well as the interpretability and explainability of AI (Coglianese & Lehr, 2019). Building upon this assumption, we introduce the third key mechanism of data network effects. We argue that actions, including the responsible use of data and ensuring the explainability of AI features, must be considered strategic, as they may play an important role in strengthening the relationship between platform AI capability and perceived user value by avoiding accidents such as data security breaches, data privacy violations, and unintended consequences of unexplainable machine behavior. To capture this category of actions geared toward balancing diverse stakeholder interests and mitigating the perceived risks of the use of big data and AI in platform contexts, we introduce the concept of “platform legitimation,” defined as actions that the platform owner takes to ensure positive legitimacy evaluations of the platform by key stakeholder audiences. In what follows, we explain the moderating role of platform legitimation in our model.

“Legitimacy,” defined as “a generalized perception or assumption that the actions of an entity are desirable, proper, or appropriate within some socially constructed system of norms, beliefs, and definitions” (Suchman, 1995: 574), acts as a key determinant of a social entity’s ability to acquire resources from the environment (Garud, Schildt, & Lant, 2014; Zimmerman & Zeitz, 2002). The crucial resources in the case of platforms in today’s era of AI include the personal data, financial means, and technological capabilities needed to set up machine learning algorithms, train models, and develop new platform features. Accordingly, regulators overseeing the use of personal data, platform investors, and technology partners all represent key stakeholder groups whose legitimacy judgments must be considered in understanding the functioning of data network effects and ultimately the perceived value of the platform by users.

Satisfying the needs and interests of these key stakeholder groups is important for platform owners because they provide critical resources (e.g., sustained access to personal data protected by appropriate laws and rules) upon which the continued development and use of their platform AI capability depends. The key characteristics of the platform that attract legitimation scrutiny from resource-granting stakeholders, and that therefore must be proactively addressed as part of platform legitimation, include (a) how the platform is designed and governed to collect, store, and use personal data and (b) how the platform is designed and governed to apply machine learning transparently and make predictions explainable.

Personal data use. A critical aspect of platform legitimation concerns the extent to which the platform’s approach to collecting, storing, and sharing personal user data is adjudged by the stakeholder audiences to be “the right thing to do” (Suchman, 1995: 579). This assessment goes beyond self-interested calculations concerning the utility of platform transactions for an individual user and involves considerations of moral desirability entertained by a wide range of stakeholders across society (Bitektine, 2011). To this end, the platform firm must demonstrate that its policies and procedures for data collection and use—typically communicated through user privacy policies (Bélanger & Crossler, 2011; Hong & Thong, 2013; Pavlou, 2011; Smith, Dinev, & Xu, 2011) and information security compliance documents (Anderson & Moore, 2006; Barlow, Warkentin, Ormond, & Dennis, 2018)—meet morally desirable principles, such as privacy by design and security by design (Cavoukian & Chanliau, 2013; Kroener & Wright, 2014). These by-design principles call for data privacy and security to be taken into account throughout the entire engineering and development process and for them to be reflected in the design of the platform or specific products and services on the platform. The declared policies, procedures, and design choices can then be compared by the resource-granting stakeholder audiences with the actual platform outcomes to uncover inconsistencies or malfeasance in how the management applies the norms in practice. In case the platform design and outcomes are deemed incoherent, the regulators, investors, and partners may choose to withhold legitimacy and, by extension, resources, forcing the platform to alter or altogether eliminate certain AI features or models.

For example, Uber’s expansion into Europe resulted in a backlash against the company’s alleged noncompliance with the regional personal data protection regulations, as well as, more broadly, against Uber’s practices in using city transportation data. Facebook, too, has repeatedly attracted legitimation scrutiny from key stakeholder audiences—including regulators, investors, and partners—over its repeated failures to ensure the privacy and security of user data (e.g., the Cambridge Analytica scandal, wherein the company was able to harvest personally identifiable information from the Facebook platform through an app that exploited the Facebook Graph API), pointing to limitations in the design and governance of its platform. As a sign of platform legitimation and effort to secure support from key stakeholder groups to sustain its scalable business model around the use of platform AI capability to create value for billions of users and attract advertisers, Facebook has started to endorse data privacy protection rules and to work with regulators to secure positive legitimacy evaluations in the future.

Proposition 4a. The higher the moral desirability of the use of personal data by the platform, the stronger will be the relationship between platform AI capability and perceived user value.

Prediction explainability. Another critical aspect of platform legitimation concerns “explainability”—or interpretability of functioning and coherence in understanding—of the predictions made by AI models and features on the platform. AI-made predictions not only influence core market-related processes on the platform, including how the platform matches different user groups, but also have a profound effect on the behavior and emotions of users. As resource-granting stakeholders seek an understanding of how and why people are being influenced and affected by these AI-made predictions and the resulting machine behavior or decision-making, the stakeholders make an assessment as to whether they are meaningful in the context of the prevalent beliefs, logics, and categories (Suchman, 1995). Considering the “black box” nature of many AI models, which makes it difficult, if not impossible, for humans to understand exactly how machine learning algorithms make predictions and arrive at certain decisions, recommendations, or behaviors (Coglianese & Lehr, 2019), making such predictions explainable is extremely difficult in some cases (Mayenberger, 2019; Preece, 2018). However, only if the explainability of AI-made predictions is achieved can stakeholders assess the meaningfulness of these predictions and renew their trust and commitment to grant the critical resources that help ensure a strong relationship between platform AI capability and perceived user value (Rossi, 2018).

For example, it is becoming increasingly common for banks and lenders to use machine learning algorithms to predict credit risk and make creditworthiness assessments. The resulting loan decisions may have a strong impact on the lives of consumers, yet disappointed users typically lack an explanation for being denied credit. To increase the perceived user value of AI-enabled loan decision-making, credit-granting institutions can educate their customers. For instance, Bank of America offers all customers their FICO (Fair Isaac Corporation) score and explains the important components of the score that are calculated by the algorithms. Thus, fostering explainability of predictions made by machine learning algorithms on the platform is likely to strengthen the relationship between platform AI capability and perceived user value.
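As a sketch of one simple route to such explainability (our illustration, assuming a linear logistic regression credit model; genuinely black-box models would instead require post-hoc explanation tools), a lender could report each feature’s contribution to a decision:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical creditworthiness features for a previously trained model `clf`.
    FEATURES = ["income", "debt_ratio", "late_payments", "account_age"]

    def explain_decision(clf: LogisticRegression, x: np.ndarray) -> None:
        """For a linear model, the log-odds of approval decompose additively,
        so each (standardized) feature's contribution can be shown to the user."""
        contributions = clf.coef_[0] * x  # per-feature contribution to the log-odds
        for name, c in sorted(zip(FEATURES, contributions), key=lambda p: -abs(p[1])):
            print(f"{name:>14}: {c:+.2f}")
        print(f"{'intercept':>14}: {clf.intercept_[0]:+.2f}")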
Proposition 4b. The higher the explainability of pre- the underlying algorithms and experience of each
dictions made by machine learning algorithms on
user as it learns about users and their search queries.
the platform, the stronger will be the relationship be-
Similarly, Netflix leverages data network effects as it
tween platform AI capability and perceived user
value. collects and analyzes data about how its platform is
used and then draws on the learning outcomes to
continuously improve its content and user interface
DISCUSSION

Our research contributes to the literature on network effects (Afuah, 2013; Cennamo & Santaló, 2013; Fuentelsaz, Maicas, & Polo, 2012; Gallaugher & Wang, 2002; Liu, Gal-Or, Kemerer, & Smith, 2011; Parker, Van Alstyne, & Jiang, 2017; Shankar & Bayus, 2003; Sheremata, 2004; Singh, Tan, & Mookerjee, 2011; Suarez, 2005) by explaining the role of AI and data network effects for creating user value, especially in the context of multisided platforms (Hagiu & Wright, 2015; McIntyre et al., 2020). Data network effects exhibit a positive direct relationship between the AI capability of a platform and the value perceived in the platform by its users—a relationship that is moderated by platform legitimation, data stewardship, and user-centric design. This highlights new platform externalities, wherein a user's utility of a platform is a function of the scale of data-driven learning and improvements realized with AI, complementing user value rooted in network externalities deriving from the scale of the network. Integrating our model of data network effects with the extant network effects literature, we argue that the utility that a user derives from a platform is increasingly a function of both the scale of the network and the data-driven learning and improvements realized with AI. This highlights the need to examine interactions between network effects and data network effects.
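One stylized way to express this idea, offered here purely as an illustration rather than as a formal result of our theorizing, is to write the utility that user i derives from the platform as

    u_i = v(N) + w(L(D(N)))

where N denotes the scale of the network, D(N) the data the platform collects on its users, L(.) the learning realized by the platform AI capability, and v and w increasing functions capturing conventional network effects and data network effects, respectively. The interaction worth examining arises because both terms grow with N: adding users raises v(N) directly and raises w(L(D(N))) indirectly through the data channel.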
Our explanation of user value creation in the era of AI offers a novel set of insights. First, the research describes data network effects as a new category of network effects focused on the impact of data-driven learning and improvements, enabled by platform AI capability, on perceived user value. Under certain conditions, data network effects play an influential role in the value that users perceive in a platform, product, or service. We surmise that data network effects largely influence perceived user value in the context of platforms facilitating the production and exchange of information or experience goods (Shapiro & Varian, 1999; Varian, 2014). In such contexts, the user experience is heavily shaped by the scale of learning from data collected on users. For example, Google Search is an online service powered by a platform AI capability; it enjoys high popularity among users because it continuously improves its underlying algorithms and each user's experience as it learns about users and their search queries. Similarly, Netflix leverages data network effects as it collects and analyzes data about how its platform is used and then draws on the learning outcomes to continuously improve its content and user interface, thereby increasing the perceived value of the streaming service offered through its platform. Notably, data network effects are even more significant when learning capabilities are an important determinant of platform, product, or service quality (McIntyre & Srinivasan, 2017; Zhu & Iansiti, 2012). For example, the user value of a Tesla car's Autopilot functionality is influenced by the firm's ability to use AI and learn from the data collected from sensors, cameras, and radar units in cars to continuously improve the self-driving algorithms and Autopilot functionality.
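The Google and Netflix examples instantiate the same closed loop. The following schematic sketch (our simplification; every object and method name is an invented placeholder, not either firm's actual pipeline) makes that loop explicit:

    # Schematic of a data network effect loop. All names are placeholder
    # assumptions for exposition, not a real platform API.
    def data_network_effect_loop(model, users, platform):
        while True:
            interactions = platform.collect_usage_data(users)  # more users -> more data
            model = model.update(interactions)                 # more data -> more learning
            platform.deploy(model)                             # more learning -> better service
            users = platform.active_users()                    # better service -> more users

The point of the sketch is that each pass through the loop improves the service for every user, which is why the effect scales with learning from data rather than with network size per se.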
Second, the value that users perceive in a platform, product, or service may depend on combinations of data network effects and direct network effects. Direct network effects describe the value that users derive from a network, which comes from users being able to interact directly with one another (Katz & Shapiro, 1985; Rochet & Tirole, 2003). In view of the "computer in the middle of every transaction" (Varian, 2014: 1), these direct exchanges among users are increasingly mediated by interactive processes of learning from data collected on each user participating in the exchange relationship, highlighting the combined contribution of data network effects and direct network effects to user value.
For example, Facebook has historically benefitted from strong direct network effects, whereby the value that users derive from the network primarily stems from the opportunities of users to interact directly with each other. More recently, however, the "self-reinforcing process whereby growth begets growth" (Boudreau & Jeppesen, 2015: 1774) seems to have slowed down, and Facebook has struggled to sustain high-quality interactions among users in increasingly crowded social networks. To deal with this challenge and sustain the perceived user value of the platform, Facebook has activated and started leveraging data network effects on top of direct network effects by collecting and learning from the vast amounts of personal data from its large "N" of users on the network (Farrell & Saloner, 1986; Gandal, 1994; Katz & Shapiro, 1985). By applying machine learning techniques and rolling out AI models and features, Facebook has tried to influence the network in a desirable direction to increase perceived user value. The increase in perceived user value stemming from data network effects feeds back into direct network effects, as it increases the number of daily active users, offering more opportunities for users to interact directly with each other.
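This feedback between the two categories of network effects can be made concrete with a toy difference-equation model (our illustrative construction; the functional forms and parameters are assumptions, not estimates):

    # Toy dynamics: perceived value combines direct network effects v(N)
    # with data network effects w(L); user growth responds to perceived value.
    import math

    N, L = 1_000.0, 0.0            # users and cumulative learning (arbitrary units)
    alpha, beta, delta = 1.0, 1.5, 0.05
    for t in range(10):
        L += 0.01 * N              # learning grows with the data N users generate
        value = alpha * math.log(1 + N) + beta * math.log(1 + L)
        N *= 1 + delta * value / 10  # higher perceived value attracts more users
        print(f"t={t}: users={N:,.0f}, learning={L:,.1f}, value={value:.2f}")

In this toy model, the learning term raises perceived value, which enlarges the user base, which in turn accelerates learning, mirroring the mutually reinforcing dynamic described above.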
Third, the value that users perceive in a platform or one of its products or services may depend on combinations of data network effects and indirect network effects. Indirect network effects focus on the phenomenon that, the more people who use a product, the greater is the variety and availability of the complements of the product, thereby increasing perceived user value (Boudreau, 2012; Church et al., 2008; Clements & Ohashi, 2005). This phenomenon of the demand for a product and the supply of complements for that product affecting each other (Stremersch, Tellis, Franses, & Binken, 2007) may be influenced by data network effects if the developers of complements can use the platform AI capability to learn from data collected on users of the product to improve the quality of their complements. For example, Apple rolled out an AI model framework for iOS developers (called Core ML), bringing machine learning to smartphone apps in its mobile ecosystem. While each user of Apple's mobile ecosystem benefitted before from indirect network effects that resulted in a greater diversity and number of complements of iPhones and other iOS devices, these indirect network effects are now strengthened by data network effects, as Apple provides developers with a platform AI capability that helps them improve their apps by performing fast and accurate predictions, potentially increasing the perceived value of the complements and the overall mobile ecosystem for each user. Examples of such improvements include real-time image recognition, face detection, text prediction, and speaker identification. As an increasing number of these kinds of features enabled by Apple's platform AI capability are incorporated into complements of iPhones and other iOS devices, perceived user value is increasingly becoming a function of the combination of data network effects and indirect network effects.
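The division of labor underlying this dynamic, in which the platform supplies the learned model and the complementor supplies the application logic, can be sketched as follows (written in Python for consistency with the other sketches, even though Core ML itself is consumed from Swift; the class and method names are invented for illustration):

    # Sketch of a complementor app calling a platform-provided, on-device model.
    # `PlatformVision` is a hypothetical stand-in for a vendor ML framework.
    class PlatformVision:
        """Hypothetical platform AI capability exposed to third-party developers."""
        def classify(self, image_bytes: bytes) -> str:
            # In reality the platform trains, updates, and ships this model.
            return "golden_retriever"  # stubbed prediction

    def tag_photo(image_bytes: bytes, vision: PlatformVision) -> dict:
        # The complementor improves its app by delegating prediction to the
        # platform AI capability instead of training its own model.
        return {"label": vision.classify(image_bytes), "source": "platform_ai"}

    print(tag_photo(b"...image bytes...", PlatformVision()))

Because the platform keeps improving the shared model as it learns from data across the ecosystem, every complement that delegates prediction in this way improves with it.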
Finally, our research indicates the significance of extending the scope of network effects research beyond the economics view of platforms (see also Gawer, 2014). Data network effects relate to the technical architecture of the platform, indicating that network effects theory needs to go beyond viewing platforms as mere markets (McIntyre et al., 2020) to effectively study how network effects interact with data network effects. The research reported in this paper indicates some of the factors that need to be incorporated into our understanding of how network effects are empowered by AI technologies (Raisch & Krakowski, 2020).

For future research, we suggest empirically examining data network effects in the context of direct or indirect network effects. In doing so, it makes sense to distinguish between positive and negative data network effects, similar to the common distinction between positive and negative direct or indirect network effects (Parker, Van Alstyne, & Choudary, 2016). The relevance of making this distinction is highlighted by the nature of machine learning. Machine learning algorithms at work in data network effects learn on the basis of data, yielding unique models of prediction and decision making. This phenomenon is also referred to as "self-programming," in contrast with knowledge-based systems that are explicitly programmed (Meinhart, 1966; Samuel, 1959). Such self-programming also has disadvantages, including the possibility of algorithmic biases (Lambrecht & Tucker, 2019). Negative data network effects, wherein the perceived value of the platform for users decreases, may particularly be activated in the absence of high data quality and quantity, as well as during breaches of data privacy and security (see model in Figure 1). As an example of the former scenario, consider Microsoft's AI-powered chatbot Tay, a Twitter bot that was supposed to learn to engage people through casual and playful conversations on social media. Tay rapidly picked up racist and highly abusive language from Twitter users, causing a rapid deterioration of perceived user value. As this example illustrates, embedding platform AI capabilities in exchange relationships and user networks on multisided platforms poses considerable risks (Russell, Hauert, Altman, & Veloso, 2015), highlighting the need to consider both the intended and unintended consequences of data network effects in future research.
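The Tay episode suggests a simple intuition: the same learning loop that generates positive data network effects can run in reverse when the data it ingests are of low quality. A toy simulation (our own construction, with an assumed update rule) illustrates the contrast:

    # Toy illustration: one learning loop, two regimes. Positive data network
    # effects with curated data; negative ones with low-quality data.
    def simulate(data_quality: float, steps: int = 5) -> float:
        """data_quality in [0, 1]; returns model quality after learning."""
        model_quality = 0.5
        for _ in range(steps):
            # Learning pulls model quality toward the quality of ingested data.
            model_quality += 0.4 * (data_quality - model_quality)
        return model_quality

    print(f"curated data:  {simulate(0.9):.2f}")   # rises toward 0.9: positive effect
    print(f"poisoned data: {simulate(0.1):.2f}")   # collapses toward 0.1: negative effect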
Future work could also explore the impact of AI and data network effects on the value of the platform to the platform owner. By expanding the focus of explanation from perceived user value (our model) to value creation and capture, future work could effectively explore the linkages between data network effects and competitive advantage. For example, how and why do "superior" data-driven AI processes in a firm erode the traditional isolating mechanisms that incumbent leaders might have built into an industry? One of the most significant isolating mechanisms discussed in prior strategic management literature is the firm's idiosyncratic capacity to learn and diversify at the same time (Kor & Mahoney, 2004). In platform AI settings, learning processes are data driven, while diversification oftentimes involves diversifying the platform in a way that stimulates new related application areas (Cennamo & Santaló, 2019; Ghazawneh & Henfridsson, 2013). The platform sponsor's idiosyncratic capacity to learn from data is enabled by complementarities between the platform AI capability and various management, governance, and design capabilities of the platform organization (see our model); these complementarities, when paired with the relatedness of platform-based products and services, may lead to a generative diversification of the platform that becomes increasingly difficult to imitate over time. The extent to which this unique isolating mechanism in platform AI settings erodes the traditional isolating mechanisms in established industries is likely to depend on the relative role of learning from data and information compared to other critical success factors of competition.

Finally, future research could also explore the interaction between artificial and collective intelligence. This may involve studying the usefulness of AI in evaluating solutions from crowds, which is particularly relevant when there are many solutions and no one knows what the best solution should be (Afuah & Tucci, 2012; Piezunka & Dahlander, 2015). Furthermore, understanding how to best organize work by iteratively leveraging artificial and collective intelligence, as well as combining them, can help in expediting search processes, completing modular and complex work, and identifying optimal solutions (Afuah & Tucci, 2012; Baldwin & Clark, 2000; Yu & Nickerson, 2011). On the one hand, AI can augment distributed problem solving and production models (Kyriakou, Nickerson, & Sabnis, 2017), while, on the other hand, crowds can support AI systems by providing skills that these systems currently lack (Kittur et al., 2013; von Ahn & Dabbish, 2004).

Implications for Managers

Managers are well aware that data are the new oil fueling the information economy and that data should be treated as a strategic asset (McAfee & Brynjolfsson, 2012; Perrons & Jensen, 2015; Varian, 2014). Reaping the strategic benefits from data assets requires, in addition to the development or acquisition of a superior platform AI capability, careful attention to three key mechanisms of data network effects: (1) data stewardship, (2) user-centric design, and (3) platform legitimation. As regards data stewardship, this means ensuring that machine learning algorithms on the platform are fed with data of appropriate quantity and quality, and employing an enterprise-wide approach to the holistic governance of the firm's data assets. In terms of user-centric design, this means embracing consumerization during development to create user-centric designs of products and services on the platform that increase performance expectancy (e.g., greater personalization) and effort expectancy (e.g., greater ease of use). With respect to platform legitimation, this means using personal data collected from users responsibly, by implementing principles of privacy by design and security by design and by ensuring the explainability of predictions generated by AI on the platform. When all these mechanisms are successfully activated, users will likely perceive sustainable value in the platform AI capability, which may then become a source of competitive advantage.
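Operationally, the data stewardship mechanism can begin with something as simple as gating model (re)training on basic quantity and quality checks, as in this minimal sketch (the required fields and thresholds are illustrative assumptions):

    # Minimal data stewardship gate: refuse to (re)train on inadequate data.
    REQUIRED_FIELDS = {"user_id", "timestamp", "event"}
    MIN_ROWS = 10_000
    MAX_MISSING_RATE = 0.05

    def fit_for_training(rows: list[dict]) -> bool:
        if len(rows) < MIN_ROWS:  # quantity check
            return False
        flawed = sum(1 for r in rows
                     if not REQUIRED_FIELDS <= r.keys() or None in r.values())
        return flawed / len(rows) <= MAX_MISSING_RATE  # quality check

Such gates do not substitute for enterprise-wide data governance, but they make the quantity and quality requirements described above testable at the point of model training.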
REFERENCES

Adler, P. S., & Kwon, S.-W. 2002. Social capital: Prospects for a new concept. Academy of Management Review, 27: 17–40.

Afuah, A. 2013. Are network effects really all about size? The role of structure and conduct. Strategic Management Journal, 34: 257–273.

Afuah, A., & Tucci, C. L. 2012. Crowdsourcing as a solution to distant search. Academy of Management Review, 37: 355–375.

Agrawal, A., Gans, J., & Goldfarb, A. 2018. Prediction machines: The simple economics of artificial intelligence. Boston, MA: Harvard Business Review Press.
Ahsen, M. E., Ayvaci, M. U. S., & Raghunathan, S. 2019. When algorithmic predictions use human-generated data: A bias-aware classification algorithm for breast cancer diagnosis. Information Systems Research, 30: 97–116.

Anderson, R., & Moore, T. 2006. The economics of information security. Science, 314: 610–613.

Arthur, W. B. 2009. The nature of technology: What it is and how it evolves. New York, NY: Free Press.

Baesens, B., Bapna, R., Marsden, J. R., Vanthienen, J., & Zhao, J. L. 2016. Transformational issues of big data and analytics in networked business. Management Information Systems Quarterly, 40: 807–818.

Baldwin, C. Y., & Clark, K. B. 2000. Design rules: The power of modularity. Cambridge, MA: MIT Press.

Ballou, D. P., & Pazer, H. L. 1985. Modeling data and process quality in multi-input, multi-output information systems. Management Science, 31: 150–162.

Barlow, J. B., Warkentin, M., Ormond, D., & Dennis, A. R. 2018. Don't even think about it! The effects of antineutralization, informational, and normative communication on information security compliance. Journal of the Association for Information Systems, 19: 689–715.

Bélanger, F. C., & Crossler, R. E. 2011. Privacy in the digital age: A review of information privacy research in information systems. Management Information Systems Quarterly, 35: 1017–1041.

Birkinshaw, J. 2018. How is technological change affecting the nature of the corporation? Journal of the British Academy, 6: 185–214.

Bitektine, A. 2011. Toward a theory of social judgments of organizations: The case of legitimacy, reputation, and status. Academy of Management Review, 36: 151–179.

Boudreau, K. J. 2012. Let a thousand flowers bloom? An early look at large numbers of software app developers and patterns of innovation. Organization Science, 23: 1409–1427.

Boudreau, K. J., & Jeppesen, L. B. 2015. Unpaid crowd complementors: The platform network effect mirage. Strategic Management Journal, 36: 1761–1777.

Bucher, T. 2016. The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20: 30–44.

Cavoukian, A., & Chanliau, M. 2013. Privacy and security by design: A convergence of paradigms. Toronto, Canada: Information and Privacy Commissioner.

Cennamo, C., & Santaló, J. 2013. Platform competition: Strategic trade-offs in platform markets. Strategic Management Journal, 34: 1331–1350.

Cennamo, C., & Santaló, J. 2019. Generativity tension and value creation in platform-based technology ecosystems. Organization Science, 30: 617–641.

Chen, D. L., & Horton, J. J. 2016. Research note—Are online labor markets spot markets for tasks? A field experiment on the behavioral response to wage cuts. Information Systems Research, 27: 403–423.

Christensen, C. M., & Raynor, M. E. 2003. The innovator's solution: Creating and sustaining successful growth. Cambridge, MA: Harvard Business School Press.

Church, J., & Gandal, N. 1992. Network effects, software provision, and standardization. Journal of Industrial Economics, 40: 85–103.

Church, J., Gandal, N., Krause, D., & Canada, B. 2008. Indirect network effects and adoption externalities. Review of Network Economics, 7: 337–358.

Churchman, C. W. 1961. Prediction and optimal decision. Englewood Cliffs, NJ: Prentice Hall.

Clements, M., & Ohashi, H. 2005. Indirect network effects and the product cycle: Video games in the U.S., 1994–2002. Journal of Industrial Economics, 53: 515–542.

Clements, M. T. 2004. Direct and indirect network effects: Are they equivalent? International Journal of Industrial Organization, 22: 633–645.

Coglianese, C., & Lehr, D. 2019. Transparency and algorithmic governance. Administrative Law Review, 71: 1–56.

Constantiou, I. D., & Kallinikos, J. 2015. New games, new rules: Big data and the changing context of strategy. Journal of Information Technology, 30: 44–57.

Cooper, B. L., Watson, H. J., Wixom, B. H., & Goodhue, D. L. 2000. Data warehousing supports corporate strategy at First American Corporation. Management Information Systems Quarterly, 24: 547–567.

Eisenmann, T., Parker, G., & Van Alstyne, M. 2011. Platform envelopment. Strategic Management Journal, 32: 1270–1285.

Farrell, J., & Saloner, G. 1985. Standardization, compatibility, and innovation. RAND Journal of Economics, 16: 70–83.

Farrell, J., & Saloner, G. 1986. Installed base and compatibility: Innovation, product pre-announcement, and predation. American Economic Review, 76: 940–955.

Finlay, S. 2017. Artificial intelligence and machine learning for business. Preston, U.K.: Relativistic Books.

Fuentelsaz, L., Maicas, J. P., & Polo, Y. 2012. Switching costs, network effects, and competition in the European mobile telecommunications industry. Information Systems Research, 23: 93–108.
Gabriel, Y., Korczynski, M., & Rieder, K. 2015. Organizations and their consumers: Bridging work and consumption. Organization, 22: 629–643.

Gallaugher, J. M., & Wang, Y.-M. 2002. Understanding network effects in software markets: Evidence from web server pricing. Management Information Systems Quarterly, 26: 303–327.

Gandal, N. 1994. Hedonic price indexes for spreadsheets and an empirical test for network externalities. RAND Journal of Economics, 25: 160–170.

Garud, R., & Karnøe, P. 2003. Bricolage versus breakthrough: Distributed and embedded agency in technology entrepreneurship. Research Policy, 32: 277–300.

Garud, R., Schildt, H. A., & Lant, T. K. 2014. Entrepreneurial storytelling, future expectations, and the paradox of legitimacy. Organization Science, 25: 1479–1492.

Gawer, A. 2009. Platforms, markets, and innovation. Cheltenham, U.K.: Edward Elgar.

Gawer, A. 2014. Bridging differing perspectives on technological platforms: Toward an integrative framework. Research Policy, 43: 1239–1249.

Ghazawneh, A., & Henfridsson, O. 2013. Balancing platform control and external contribution in third-party development: The boundary resources model. Information Systems Journal, 23: 173–192.

Gomez-Uribe, C. A., & Hunt, N. 2016. The Netflix recommender system: Algorithms, business value, and innovation. ACM Transactions on Management Information Systems, 6: Article 13. Published online. doi: 10.1145/2843948

Gregory, R. W., Kaganer, E., Henfridsson, O., & Ruch, T. J. 2018. IT consumerization and the transformation of IT governance. Management Information Systems Quarterly, 42: 1225–1253.

Hagiu, A., & Wright, J. 2015. Multi-sided platforms. International Journal of Industrial Organization, 43: 162–174.

Himan, N. B. 2002. Platform externalities and the antitrust case against Microsoft. Antitrust Bulletin, 47: 641–660.

Hong, W. L. T., & Thong, J. Y. L. 2013. Internet privacy concerns: An integrated conceptualization and four empirical studies. Management Information Systems Quarterly, 37: 275–298.

Iansiti, M., & Lakhani, K. R. 2017. Managing our hub economy. Harvard Business Review, 95: 84–92.

Kahneman, D., & Tversky, A. 1977, June. Intuitive prediction: Biases and corrective procedures. Retrieved from https://apps.dtic.mil/dtic/tr/fulltext/u2/a047747.pdf

Katz, M. L., & Shapiro, C. 1985. Network externalities, competition, and compatibility. American Economic Review, 75: 424–440.

Katz, M. L., & Shapiro, C. 1986. Technology adoption in the presence of network externalities. Journal of Political Economy, 94: 822–841.

Katz, M. L., & Shapiro, C. 1992. Product introduction with network externalities. Journal of Industrial Economics, 40: 55–83.

Khoury, M. J., & Ioannidis, J. P. A. 2014. Big data meets public health. Science, 346: 1054–1055.

Kitchens, B., Dobolyi, D., Li, J., & Abbasi, A. 2018. Advanced customer analytics: Strategic value through integration of relationship-oriented big data. Journal of Management Information Systems, 35: 540–574.

Kittur, A., Nickerson, J. V., Bernstein, M., Gerber, E., Shaw, A., Zimmerman, J., Lease, M., & Horton, J. 2013. The future of crowd work. In A. Bruckman & S. Counts (Chairs), Proceedings of the 2013 conference on computer supported cooperative work: 1301–1318. New York, NY: Association for Computing Machinery.

Klein, K. J., & Sorra, J. S. 1996. The challenge of innovation implementation. Academy of Management Review, 21: 1055–1080.

Kor, Y. Y., & Mahoney, J. T. 2004. Edith Penrose's (1959) contributions to the resource-based view of strategic management. Journal of Management Studies, 41: 183–191.

Kroener, I., & Wright, D. 2014. A strategy for operationalizing privacy by design. Information Society, 30: 355–365.

Kyriakou, H., Nickerson, J. V., & Sabnis, G. 2017. Knowledge reuse for customization: Metamodels in an open design community for 3D printing. Management Information Systems Quarterly, 41: 315–322.

Lambrecht, A., & Tucker, C. 2019. Algorithmic bias? An empirical study of apparent gender-based discrimination in the display of STEM career ads. Management Science, 65: 2966–2981.

Liu, C. Z., Gal-Or, E., Kemerer, C., & Smith, M. D. 2011. Compatibility and proprietary standards: The impact of conversion technologies in IT markets with network effects. Information Systems Research, 22: 188–207.

Majumdar, S. K., & Venkataraman, S. 1998. Network effects and the adoption of new technology: Evidence from the U.S. telecommunications industry. Strategic Management Journal, 19: 1045–1062.

Markus, M. L. 2015. New games, new rules, new scoreboards: The potential consequences of big data. Journal of Information Technology, 30: 58–59.
Mayenberger, D. 2019. How to overcome modelling and model risk management challenges with artificial intelligence and machine learning. Journal of Risk Management in Financial Institutions, 12: 241–255.

McAfee, A., & Brynjolfsson, E. 2012. Big data: The management revolution. Harvard Business Review, 90(10): 60–68.

McCorduck, P. 2004. Machines who think: A personal inquiry into the history and prospect of artificial intelligence (2nd ed.). Natick, MA: AK Peters.

McIntyre, D. P., & Srinivasan, A. 2017. Networks, platforms, and strategy: Emerging views and next steps. Strategic Management Journal, 38: 141–160.

McIntyre, D. P., Srinivasan, A., Afuah, A., Gawer, A., & Kretschmer, T. 2020. Multi-sided platforms as new organizational forms: A dynamic perspective on strategy, scope and business models. Academy of Management Perspectives. Published online in advance. doi: 10.5465/amp.2018.0018

Meinhart, W. A. 1966. Artificial intelligence, computer simulation of human cognitive and social processes, and management thought. Academy of Management Journal, 9: 294–307.

Meyer, G., Adomavicius, G., Johnson, P. E., Elidrisi, M., Rush, W. A., Sperl-Hillen, J. A. M., & O'Connor, P. J. 2014. A machine learning approach to improving dynamic decision making. Information Systems Research, 25: 239–263.

O'Connor, C., & Weatherall, J. O. 2019. The misinformation age: How false beliefs spread. New Haven, CT: Yale University Press.

Otto, B. 2011. Data governance. Business & Information Systems Engineering, 3: 241–244.

Parker, G. G., & Van Alstyne, M. 2005. Two-sided network effects: A theory of information product design. Management Science, 51: 1494–1504.

Parker, G., & Van Alstyne, M. 2018. Innovation, openness, and platform control. Management Science, 64: 3015–3032.

Parker, G. G., Van Alstyne, M. W., & Choudary, S. P. 2016. Platform revolution: How networked markets are transforming the economy—and how to make them work for you [Kindle version]. Retrieved from https://wwnorton.com/books/Platform-Revolution

Parker, G., Van Alstyne, M., & Jiang, X. 2017. Platform ecosystems: How developers invert the firm. Management Information Systems Quarterly, 41: 255–266.

Pavlou, P. 2011. State of the information privacy literature: Where are we now and where should we go? Management Information Systems Quarterly, 35: 977–988.

Perrons, R. K., & Jensen, J. W. 2015. Data as an asset: What the oil and gas sector can learn from other industries about "big data." Energy Policy, 81: 117–121.

Piezunka, H., & Dahlander, L. 2015. Distant search, narrow attention: How crowding alters organizations' filtering of suggestions in crowdsourcing. Academy of Management Journal, 58: 856–880.

Preece, A. 2018. Asking "why" in AI: Explainability of intelligent systems—perspectives and challenges. Intelligent Systems in Accounting, Finance & Management, 25: 63–72.

Priem, R. L., Butler, J. E., & Li, S. 2013. Toward reimagining strategy research: Retrospection and prospection on the 2011 AMR Decade Award article. Academy of Management Review, 38: 471–489.

Raisch, S., & Krakowski, S. 2020. Artificial intelligence and management: The automation–augmentation paradox. Academy of Management Review. Published online in advance. doi: 10.5465/amr.2018.0072

Rochet, J.-C., & Tirole, J. 2003. Platform competition in two-sided markets. Journal of the European Economic Association, 1: 990–1029.

Rochet, J.-C., & Tirole, J. 2006. Two-sided markets: A progress report. RAND Journal of Economics, 37: 645–667.

Rosenblat, A. 2018. Uberland: How algorithms are rewriting the rules of work. Berkeley, CA: University of California Press.

Ross, J. W., Weill, P., & Robertson, D. 2006. Enterprise architecture as strategy: Creating a foundation for business execution. Boston, MA: Harvard Business Press.

Rossi, F. 2018. Building trust in artificial intelligence. Journal of International Affairs, 72: 127–133.

Russell, S., Hauert, S., Altman, R., & Veloso, M. 2015. Ethics of artificial intelligence. Nature, 521: 415–416.

Samuel, A. L. 1959. Some studies in machine learning using the game of checkers. IBM Journal of Research and Development, 3: 210–229.

Schilling, M. A. 2000. Toward a general modular systems theory and its application to interfirm product modularity. Academy of Management Review, 25: 312–334.

Schilling, M. A. 2002. Technology success and failure in winner-take-all markets: The impact of learning orientation, timing, and network externalities. Academy of Management Journal, 45: 387–398.

Shankar, V., & Bayus, B. L. 2003. Network effects and competition: An empirical analysis of the home video game industry. Strategic Management Journal, 24: 375–384.

Shapiro, C., & Varian, H. R. 1999. Information rules. Cambridge, MA: Harvard Business School Press.

Sheremata, W. A. 2004. Competing through innovation in network markets: Strategies for challengers. Academy of Management Review, 29: 359–377.

Simon, H. A. 1991. Bounded rationality and organizational learning. Organization Science, 2: 125–134.
Simon, H. A. 1995. Artificial intelligence: An empirical science. Artificial Intelligence, 77: 95–127.

Simon, H. A. 1996. The sciences of the artificial (3rd ed.). Cambridge, MA: MIT Press.

Singh, P. V., Tan, Y., & Mookerjee, V. 2011. Network effects: The influence of structural capital on open source project success. Management Information Systems Quarterly, 35: 813–817.

Smith, H. J., Dinev, T., & Xu, H. 2011. Information privacy research: An interdisciplinary review. Management Information Systems Quarterly, 35: 980–1015.

Stremersch, S., Tellis, G. J., Franses, P. H., & Binken, J. L. G. 2007. Indirect network effects in new product growth. Journal of Marketing, 71: 52–74.

Stucke, M. E., & Ezrachi, A. 2016. When competition fails to optimize quality: A look at search engines. Yale Journal of Law and Technology, 18: 70–110.

Stucke, M. E., & Ezrachi, A. 2018. Alexa et al., what are you doing with my data? Critical Analysis of Law, 5. Retrieved from https://cal.library.utoronto.ca/index.php/cal/article/view/29509/21994

Suarez, F. F. 2005. Network effects revisited: The role of strong ties in technology selection. Academy of Management Journal, 48: 710–720.

Suchman, M. C. 1995. Managing legitimacy: Strategic and institutional approaches. Academy of Management Review, 20: 571–610.

Tucker, C. 2019. Digital data, platforms and the usual [antitrust] suspects: Network effects, switching costs, essential facility. Review of Industrial Organization, 54: 683–694.

Van Alstyne, M. W., Parker, G. G., & Choudary, S. P. 2016. Pipelines, platforms, and the new rules of strategy. Harvard Business Review, 94(4): 54–60, 62.

Varian, H. R. 2014. Beyond big data. Business Economics, 49: 27–31.

Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. 2003. User acceptance of information technology: Toward a unified view. Management Information Systems Quarterly, 27: 425–478.

Verganti, R. 2008. Design, meanings, and radical innovation: A metamodel and a research agenda. Journal of Product Innovation Management, 25: 436–456.

von Ahn, L., & Dabbish, L. 2004. Labeling images with a computer game. In E. Dykstra-Erickson (Ed.), Proceedings of the conference on human factors in computing systems: 319–326. New York, NY: Association for Computing Machinery.

Wixom, B. H., & Watson, H. J. 2001. An empirical investigation of the factors affecting data warehousing success. Management Information Systems Quarterly, 25: 17–41.

Woerner, S. L., & Wixom, B. H. 2015. Big data: Extending the business strategy toolbox. Journal of Information Technology, 30: 60–62.

Yoo, Y. 2015. It is not about size: A further thought on big data. Journal of Information Technology, 30: 63–65.

Yoo, Y., Henfridsson, O., & Lyytinen, K. 2010. Research commentary—The new organizing logic of digital innovation: An agenda for information systems research. Information Systems Research, 21: 724–735.

Yu, L., & Nickerson, J. V. 2011. Cooks or cobblers? Crowd creativity through combination. In D. S. Tan & G. Fitzpatrick (Eds.), Proceedings of the conference on human factors in computing systems: 1393–1402. New York, NY: Association for Computing Machinery.

Zhu, F., & Iansiti, M. 2012. Entry into platform-based markets. Strategic Management Journal, 33: 88–106.

Zhu, F., & Liu, Q. 2018. Competing with complementors: An empirical look at Amazon.com. Strategic Management Journal, 39: 2618–2642.

Zimmerman, M. A., & Zeitz, G. J. 2002. Beyond survival: Achieving new venture growth by building legitimacy. Academy of Management Review, 27: 414–431.

Robert Wayne Gregory (rg7cv@virginia.edu) is associate professor at the McIntire School of Commerce, University of Virginia. He received his PhD in management information systems from Goethe University Frankfurt. His research interests include digital innovation and strategy, platforms, and technology-driven change.

Ola Henfridsson (ohenfridsson@miami.edu) is professor of business technology at the Miami Herbert Business School, University of Miami. He is also a WBS distinguished research environment professor at Warwick Business School in the United Kingdom and a KIN Fellow at VU Amsterdam. His research interests relate to digital innovation, platforms, and technology management.

Evgeny Kaganer (evgeny_kaganer@skolkovo.ru) is professor and dean for academic affairs at the Moscow School of Management SKOLKOVO. He is also a visiting professor at IESE Business School, University of Navarra. He received his PhD from Louisiana State University. His research interests focus on digital strategy and organizational transformation.

Harris Kyriakou (kyriakou@essec.edu) is associate professor of information systems at ESSEC Business School and a member of the European Commission's expert group on digitization. He received his PhD from Stevens Institute of Technology. His research focuses on the intersection between collective and artificial intelligence.