Journal of Information Technology
Manuscript ID: JIN-20-0198.R1
Manuscript Type: Scholarly Review
DOI: 10.1177/02683962211016000
https://mc.manuscriptcentral.com/jin

Behavioral Economics in Information Systems Research: Critical Analysis and Research Strategies

Abstract

Theories of decision-making have long been important foundations for information systems (IS) research and much of IS is concerned with information processing for decision making. The discipline of behavioral economics (BE) provides the dominant contemporary approach for understanding human decision-making. Therefore, it is logical that IS research that involves decision making should consider BE as foundation or reference theory. Surprisingly, and despite calls for greater use of BE in IS research, it seems that IS has been slow to adopt contemporary BE as reference theory. This paper reports a critical analysis of BE in all fields of IS based on an intensive investigation of quality IS research using bibliometric content analysis. The analysis shows that IS researchers have a general understanding of BE, but their use of the theories has an ad hoc feel, where only a narrow range of BE concepts and theories tend to form the foundation of IS research. The factors constraining the adoption of BE theories in IS are discussed and strategies for the use of this influential foundation theory are proposed. Guidance is provided on how BE could be used in various aspects of IS. The paper concludes with the view that BE reference theory has the potential to transform significant areas of IS research.

Keywords: behavioral economics, information systems, bounded rationality, dual process theory, heuristics and biases, prospect theory, nudges.
1. Introduction
Theories of decision-making have long been important to information systems (IS) research, especially in the fields of decision support systems, intelligent systems, IT governance, systems development, business intelligence, and business analytics. As Goes (2013, p. iii) suggested, "Much of what the IS field is about relates to information processing for decision making." Decision theory can be broadly divided into economic decision-making and behavioral decision theory. Economic decision-making is largely prescriptive in nature and focuses on maximizing the expected utility of a decision outcome subject to constraints (typically budget and resource constraints), normally to reach an equilibrium in some form of a market. Economists have long believed that maximizing expected utilities is the basis of an effective descriptive theory of decision making, in addition to being the prescriptive ideal. Behavioral decision theory, on the other hand, is primarily descriptive in nature and has a focus on understanding how and why people make decisions. Browne and Parsons (2012) argued that behavioral decision theory has had a profound effect on IS scholarship. Currently, the dominant form of behavioral decision theory is behavioral economics (Fox, 2015). Behavioral economics (BE) is a collection of theories of decision making in economic situations that explains how humans actually make decisions. BE relaxes some of the core assumptions of neoclassical economics, including the assumptions that decision makers always make rational decisions, have perfect knowledge of a decision and its context, have unlimited information processing abilities, lack any emotional involvement with a decision, and have no time constraints in a decision process. That is, BE relaxes a core belief of neoclassical economics: the belief that decision making is perfectly rational.
BE is founded on the principle of bounded rationality (Simon, 1955), which holds that humans have cognitive limitations that prevent them from making perfectly rational economic decisions. When humans make decisions, they use two cognitive systems, System 1 and System 2 (Sloman, 1996), which are respectively dominated by fast, intuitive thinking and rule-based deliberate reasoning. System 1, in particular, is driven by general heuristics that can be subject to many cognitive biases that diminish decision effectiveness (Tversky & Kahneman, 1974). Behavioral economists have combined various aspects of BE to create important theories like prospect theory (Kahneman & Tversky, 1979a) and intervention methods like decision nudges (Thaler & Sunstein, 2009). BE is not a single reference theory; it is a complex web of interacting theories, phenomena, methods, and processes.
In the area of the IS discipline most concerned with decision making, decision support systems (DSS), there have consistently been calls for the greater use of contemporary BE in IS research (Sage, 1981; Elam, Jarvenpaa, & Schkade, 1992; Angehern & Jelassi, 1994; Arnott, 2006; Bhandari, Hassanein, & Deaves, 2008; Huang, Hsu, & Ku, 2012; Arnott & Pervan, 2014). Goes (2013), in an MIS Quarterly editorial, echoed these calls. Two reviews of BE in IS research indicated that BE has low use as a reference discipline (Fleischmann et al., 2014; Odnor & Oinas-Kukkonen, 2017). Most recently, Arnott and Gao (2019) provided an overview of BE and suggested 10 areas where BE could be used more widely in DSS projects.
This paper adds to this previous work and aims to increase our understanding of how BE is used, and could be used, in general IS research. It approaches this task through a bibliometric content analysis of high-quality recent IS research. The literature analysis forms the foundation for considering how IS projects should and could use BE reference theory. Ultimately, the paper addresses the questions:

1. How has behavioral economics been used in IS research?
2. Why don't we use behavioral economics in IS research projects?
3. How could we use behavioral economics in IS research?
The paper is organized as follows: the next section provides an overview of mainstream BE. Previous literature reviews of the use of BE reference theory in IS research are then discussed. The design of this paper's literature analysis is then presented, followed by the results of the review. The paper then addresses the three research questions mentioned above.
2. Behavioral Economics
For much of the 20th century the dominant approach to decision making, both descriptive and prescriptive, was neoclassical economics. The microeconomic approach to decision making was articulated by Marshall (1890) and has been taught to every generation of economists ever since (for example, Samuelson & Nordhaus, 2009). Neoclassical economics is based on two major foundations: rationality in decision making and optimization leading to equilibrium in markets. Economists assume that a rational decision involves consistent preferences, logical reasoning, perfect information about all aspects of a decision, and unlimited processing capacity. In making a decision under these conditions, decision makers effectively ration scarce resources by maximizing their utility or satisfaction in either consumption or profits from production of goods and services. The idea of utility maximization in markets can be traced to Adam Smith (Smith, 1776) and was expressed in its current form of expected utility (EU) theory by Von Neumann and Morgenstern (1944). Under expected utility theory, decision makers maximize the expected values of their choices under strict normative conditions of mathematical logic. With market participants acting in this almost superhuman, rationally utility-maximizing manner, markets attain an optimum equilibrium between consumption and production. Prior to the development of BE there were a number of critiques of using the maximization of expected utility as an effective descriptive theory of decision making, including from mathematical economists (for example, Arrow, 1950), statistical decision theorists (for example, Allais, 1953; Savage, 1954; Ellsberg, 1961), post-war microeconomists (for example, Akerlof, 1970; Shiller, 1978), and from institutional economists (for example, Galbraith, 1952). These challengers attacked one or more of the rigorous assumptions of neoclassical economics. Despite these critiques, expected utility maximization largely survived as the accepted descriptive economic theory of decision making until the rise and acceptance of behavioral economics. It remains the preeminent prescriptive, or normative, approach to decision making.
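The expected utility calculus described above can be sketched as follows; this is a standard illustrative rendering of von Neumann-Morgenstern expected utility, not notation taken from the paper itself:

```latex
% Expected utility of a risky prospect A whose outcomes x_i
% occur with probabilities p_i, under a utility function u:
\[
  EU(A) \;=\; \sum_{i} p_i \, u(x_i), \qquad \sum_{i} p_i = 1 .
\]
% The rational decision maker of neoclassical theory selects, from
% the set of available prospects \mathcal{A}, the prospect that
% maximizes expected utility:
\[
  A^{*} \;=\; \arg\max_{A \in \mathcal{A}} \; EU(A) .
\]
```

Bounded rationality, introduced below, denies precisely that real decision makers can enumerate and evaluate every prospect in this way.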


Nobel Prize for Economics winner¹ Herbert Simon laid the foundation of BE² in the seminal papers "A behavioral model of rational choice" in the Quarterly Journal of Economics (Simon, 1955) and "Theories of decision-making in economics and behavioral science" in The American Economic Review (Simon, 1959). For Simon, the ideal view of economic decision rationality was at odds with his studies of administrative managers in the 1940s. He argued in Simon (1997/1945) and Simon (1955) that decision makers could not have perfect knowledge of a decision situation and, further, that they are significantly limited in their cognitive and information processing abilities. In addition, they are normally subject to time pressure, which means that perfect computations of utility and other functions are not possible. Simon's key insight was that decision makers' rationality is bounded rather than perfect. Under bounded rationality, decision makers use heuristics or rules of thumb rather than the optimization processes that underpin neoclassical economics.

¹ http://www.nobelprize.org/nobel_prizes/economics/laureates/1978/
² We date the emergence of BE to Herbert Simon's work despite many of the principles he enunciated being known in the psychology literature (for example, Dewey, 1933; Edwards, 1954, 1961). Unlike the psychology scholars, Simon published his seminal research in top-ranked economics journals.
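Simon's contrast between optimizing and rule-of-thumb decision making can be sketched in a few lines of code. This is an illustrative toy only; the option list, payoff function, and aspiration level are invented for the example and are not from the paper:

```python
def optimize(options, utility):
    """Neoclassical ideal: evaluate every option and pick the best."""
    return max(options, key=utility)

def satisfice(options, utility, aspiration):
    """A boundedly rational rule of thumb: take the first option
    that is good enough (meets the aspiration level)."""
    for option in options:
        if utility(option) >= aspiration:
            return option  # search stops here
    return None  # no option met the aspiration level

# Hypothetical (label, payoff) pairs, examined in the order encountered.
offers = [("A", 52), ("B", 70), ("C", 95)]
payoff = lambda offer: offer[1]

optimize(offers, payoff)                  # ("C", 95): exhaustive comparison
satisfice(offers, payoff, aspiration=60)  # ("B", 70): search ends early
```

The optimizer must examine every option, while the rule of thumb bounds the search effort by an aspiration level, which is the behavioral point Simon was making.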
Following bounded rationality, the major aspect of Simon's decision-making theory is the phase model of decision-making, a descriptive and explanatory model of the process of decision-making under bounded rationality, where the phases of intelligence, design, and choice define a decision process (Simon, 1960). The phases can be executed in different sequences and are iterative and recursive. A well-accepted extension of Simon's phase model is the five-stage model of Mintzberg, Raisinghani, and Theoret (1976). The phase model led to the concept of decision structuredness. Simon offered the following definition: "Problems are well structured when the goal tests are clear and easily applied, and when there is a well-defined set of generators for synthesizing potential solutions. Problems are ill structured to the extent that they lack these characteristics." (Simon, 1997/1945, p. 128). A totally structured decision is one where a computer program can be written to make the decision; a totally unstructured decision is one where no aspect of the decision process can be articulated. For Simon, the decisions people make are largely unstructured, whereas the decisions of economic man are, by definition, completely structured. The phase model has, unfortunately, largely failed in empirical validation (Lipschitz & Bar-Ilan, 1996) but bounded rationality remains the accepted foundation of all BE theory.
Chapman and Pike (2013), in a significant review of the BE literature, argued that Simon and colleagues typify the "old" BE, while Kahneman, Tversky, Thaler, and their colleagues typify the "new". This paper uses the terms "early" and "contemporary" BE as they are less value-loaded and more accurate. Theories and concepts from the early BE of Simon and colleagues are shown in Table 1. In Tables 1 through 5, classic, seminal, and well-cited BE reference articles are included in the Indicative References column. In addition, IS articles that have used these aspects of BE are shown in italics; this italicization of IS references is continued in Tables 2 through 5.

Table 1. Early Behavioral Economics.

Concept: Bounded Rationality
Definition: The seminal theory of BE and still the foundation of contemporary BE. Posits that humans are incapable of economically rational decisions; their rationality is bounded inter alia by information processing ability, available information, and time and cost constraints.
Indicative References*: Agosto (2002); Choudhury & Sampler (1997); Gigerenzer & Selten (2001); Kahneman (2003); Ross (2014); Simon (1955)

Concept: Phase model of decision making
Definition: A descriptive model of decision making that posits that decisions are staged, iterative, and recursive. The stages are intelligence, design, and choice. Also called the phase theorem and the stage model. Has little evidence for its validity.
Indicative References*: Hall & Paradice (2005); Lipschitz & Bar-Ilan (1996); Mintzberg, Raisinghani, & Theoret (1976); Simon (1960); Xue, Liang, & Boulton (2008)

Concept: Decision Structuredness
Definition: A corollary of the phase model. A decision is fully structured when all its phases can be documented and is unstructured when no aspect or phase can be described. Most decisions are semi-structured. Also called programmedness.
Indicative References*: Abdolmohammadi (1987); Gorry & Scott Morton (1971); Simon (1960, 1973); Smith (1988)

* IS research is shown in italics.
The psychologists Daniel Kahneman and Amos Tversky took Simon's challenge to the rationality of economic man into the laboratory and developed the heuristics and biases research area (Tversky & Kahneman, 1974). The first economist to publish in this "new" area was Richard Thaler (Thaler, 1980). Kahneman won the Nobel Prize for Economics in 2002³ and Thaler in 2017⁴. This paper takes the view that mainstream contemporary BE is typified by the research of Kahneman, Tversky, and Thaler and has become the scientific orthodoxy in the understanding of human decision-making. In a commentary in the Harvard Business Review about the impact of the various streams of BE on business and management, Justin Fox concluded: "The Kahneman-Tversky heuristics-and-biases approach has the upper hand right now, both in academia and in the public mind." (Fox, 2015, p. 84). Fox reported that 90% of current BE is in the Kahneman and Tversky camp. Thinking, Fast and Slow (Kahneman, 2011) provides a definitive summary of contemporary mainstream BE, while Thaler's Misbehaving: The Making of Behavioral Economics provides a chronicle of the history and an overview of the discipline (Thaler, 2015), as do Camerer and Loewenstein (2003) and Corr and Plagnol (2019).

³ http://www.nobelprize.org/nobel_prizes/economics/laureates/2002/kahneman.html
⁴ https://www.nobelprize.org/nobel_prizes/economic-sciences/laureates/2017/

There are other significant branches of descriptive decision-making study arising from Simon's early BE, most notably the research associated with Gary Klein and Gerd Gigerenzer. These two quite different research groupings share two propositions: decision-making is best studied in the field with real decision makers, and expert intuition, through the action of heuristics, is a source of decision effectiveness rather than a flaw in decision making. Klein is best known for his landmark study of fire commanders in the field (Klein, Calderwood, & Clinton-Cirocco, 2010). His resulting naturalistic decision-making approach and the recognition-primed decision model are used in military tactical and strategic decision-making (Klein, 1999, 2004). Klein also collaborated with Kahneman to investigate under what circumstances expert intuition is preferable to analytic decision-making (Kahneman & Klein, 2009). Gigerenzer's program of research is deeply critical of the heuristics and biases approach, even arguing that biases may be experimental artifacts (Gigerenzer, 2000, ch. 12). His core proposition is that heuristics are a source of effectiveness in decision-making, and his approach is summarized in Gigerenzer and Selten (2001), Gigerenzer and Brighton (2009), and Gigerenzer (2014).
Contemporary mainstream BE has two main foundations: the first is the dual process theory of decision-making cognition; the second is a set of heuristics and cognitive biases that, in some circumstances, systematically prejudice decision quality. Kahneman (2011, Part 1) placed dual process theory firmly at the foundation of contemporary BE. This theory holds that decision-making occurs within and between two cognitive processes or systems. Kahneman and Frederick (2002) typified them as two families of cognitive operations. In an influential paper, Stanovich and West (2000) termed them System 1 and System 2 in order to avoid descriptive labeling. System 1 is fast, automatic, effortless, and intuitive. When facing a decision, System 1 is the first in action. It operates through innate, instinctive behavior. System 2 is slow, deliberate, and requires significant cognitive effort. System 2's abilities are not innate and must be formed through education, both formally in training courses, schools, and universities, and less formally in families and social interaction. Over time, System 2 tasks can be converted to System 1 through exposure and experience. While described as discrete systems, System 1 and System 2 can operate at the same time and can interact. Evans (2003) described the situation as like two minds in the same body. Kahneman and Frederick (2002, p. 51) relate: "System 1 quickly proposes intuitive answers to judgment problems as they arise, and System 2 monitors the quality of these proposals, which it may endorse, correct, or override." Aspects of dual process theory are shown in Table 2.
Table 2. The Dual Process Theory of Decision Cognition.

Concept: Dual Process Theory
Definition: Posits that decision making occurs within and between two cognitive systems, normally termed System 1 and System 2.
Indicative References*: Arnott, Lizama, & Song (2017); Bhattacherjee & Sanford (2006); Gwebu et al. (2014); Chaiken (1980); Reyna (2004)

Concept: System 1
Definition: Fast, unconscious, automatic, effortless, and intuitive. It represents innate, instinctive behavior. Formed by biology and experience.
Indicative References*: Evans (2008); Kahneman (2011); Klein (2004); Stanovich & West (2000)

Concept: System 2
Definition: Slow, conscious, deliberate, rule-based, and requires significant cognitive effort. Formed by formal and cultural education.
Indicative References*: Evans (2008); Kahneman (2011); Stanovich & West (2000)

* IS research is shown in italics.
The second major foundation of mainstream contemporary BE, heuristics and biases, is essentially devoted to understanding System 1 decision processes: how they are effective and how they fail. The seminal publication is Tversky and Kahneman (1974). Heuristics and biases are best thought of as a large collection of phenomena and effects, where each phenomenon describes a particular aspect of human decision-making. Heuristics and biases can be explained by theories from psychology, sociology, and BE itself. Further, some theories explain aspects of multiple biases. Unfortunately, this collection of disparate effects makes it difficult to have a coherent overall view of the field. As Fleischmann et al. (2014, p. 5) relate, "… there is no standardized, generally accepted and scientifically grounded framework for cognitive biases".
Tversky and Kahneman (1974) built on Simon's work to identify three general and innate heuristics that guide decision making. Being general and innate means that all humans have these heuristics as a fundamental part of their brain's function. The action of these general heuristics means that decision makers can quickly and effortlessly arrive at a decision. Tversky and Kahneman's original general heuristics are availability, representativeness, and adjustment and anchoring. Since the publication of Tversky and Kahneman (1974) there has been considerable research effort in identifying other general heuristics. These proposed heuristics include the effort heuristic (Kruger et al., 2004) and the recognition heuristic (Goldstein & Gigerenzer, 2002). The most important development in general heuristic research is the affect heuristic (Slovic et al., 2002). In considering the role of emotion in decision making, Slovic et al. noted that in the 1970s, when the heuristics and biases research stream was established, psychology had a strong emphasis on cognitive rather than emotional or motivational factors. The various judgment heuristics are defined in Table 3.
Table 3. General Judgment Heuristics.

Concept: Availability heuristic
Definition: Decision makers assess the probability of an event by the degree to which instances are available in memory.
Indicative References*: Barfar, Padmanabhan, & Hevner (2017); Chen & Lee (2003); Schwarz & Vaughn (2002); Tversky & Kahneman (1973)

Concept: Representativeness heuristic
Definition: Decision makers assess the likelihood of an occurrence by the similarity of that occurrence to the stereotype of a set of occurrences.
Indicative References*: Kahneman & Frederick (2002); Lim & Benbasat (1997); Nisbett & Ross (1980)

Concept: Anchoring and Adjustment heuristic
Definition: Decision makers make assessments by starting from an initial value and adjusting this value to arrive at the final decision. Now regarded as a cognitive bias (see anchoring bias below).
Indicative References*: Epley & Gilovich (2006); Kahneman & Frederick (2002); Remus & Kottemann (1995); Tversky & Kahneman (1974)

Concept: Affect heuristic
Definition: Operates because decision makers tag memories and representations of things in their minds with previous assessments of affect (feeling and emotion). Affect can serve as a fast, effortless cue in decision making.
Indicative References*: Ariely (2013); Slovic, Finucane, Peters, & MacGregor (2002); Winter (2014)

Concept: Effort heuristic
Definition: Decision makers judge the quality or value of an object, process, or event by the amount of effort that has gone into its development.
Indicative References*: Kruger, Wirtz, Boven, & Altermatt (2004); Schrift, Kivetz, & Netzer (2016)

Concept: Recognition heuristic
Definition: Allows decision makers to quickly judge the value of something based on their ease of recognition. Also called the less-is-more effect.
Indicative References*: Goldstein & Gigerenzer (2002)

* IS research is shown in italics.
While general heuristics are a source of effectiveness in human decision-making, they are subject to cognitive processes that can lead to poor decisions and, in rare cases, catastrophic failure. These processes are the second part of the heuristics and biases component of BE. Cognitive biases are cognitions or mental behaviors that prejudice decision quality in a significant number of decisions for a significant number of people; they are inherent in human reasoning. In the literature, these effects are often termed decision or judgment biases, illusions, or effects. Tversky and Kahneman (1974, p. 1130) viewed biases as failures of general heuristics: "…they (heuristics) occasionally lead to errors in prediction or estimation." An important word in the quotation is "occasionally" and the key issue is knowing when a particular bias is likely to adversely affect a particular decision. All else being equal, biases are likely to affect complex decisions with time pressure and decisions that are new to the decision maker. Dozens of biases have been discovered in psychology and BE experiments and this body of work is summarized in Sage (1981), Kahneman, Slovic, and Tversky (1982), Kahneman and Tversky (2000), Gilovich, Griffin, and Kahneman (2002), and Kahneman (2011). The discovery of biases has become somewhat of an academic industry, and a confusing aspect of the area is that researchers use various and confusing names for the effects (Sage, 1981). One strategy to aid the overall understanding of the range of cognitive biases has been to develop classifications and typologies of biases. Tversky and Kahneman (1974) classified biases by their three general judgment heuristics. Bazerman and Moore (2013) followed a similar approach and argued for the primacy of overconfidence biases. Bazerman and Moore's typology is aimed at managers and senior professionals. Hogarth's (1987) typology is ordered according to his general model of decision making. The Lovallo and Sibony (2010) typology is from a McKinsey & Co report and is aimed at management consulting. Finally, the typology of Arnott (2006) was developed specifically for systems analysis in DSS development projects. Using Bazerman and Moore as an overall structure, these typologies have been combined to form the typology presented in Table 4. In the table each bias is defined, and references are provided to foundation and popular articles as well as to their use in IS research (in italics). Further, the table mentions alternative names for the various biases.
36
v

37
38
M
iew

39 Table 4. Biases in Human Decision-Making.


40
41
an

Concept Definition Indicative References*


42 Overconfidence These lead decision makers to have
43 Biases unwarranted confidence in their decision-
44 making performance.
45
us

Completeness bias The perception of an apparently complete Fischhoff, Slovic, &


46 or logical data presentation can stop the Lichtenstein (1978);
47 search for omissions. Also called McKenzie, Liersch, &
48 overprecision. Yaniv (2008)
cr

49 Complexity bias Time pressure, information overload and Maule & Edland
50 other environmental factors can increase (1997); Ordonez &
51 the perceived complexity of a task. Benson (1997)
ip

52 Desire bias The likelihood of desired outcomes is Olsen (1997); Hastie &
53 often assessed as being greater than Dawes (2001)
54 they actually are.
t

55 Illusion of control A poor decision may lead to a good Dudezert & Leidner
56 outcome inducing a false feeling of (2011); Hastie &
57 control over the decision-making Dawes (2001);
58 situation. Kottemann, Davis, &
59 Remus (1994)
60

12
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Page 13 of 75 Journal of Information Technology

1
2
  Overconfidence: The ability to solve difficult or novel problems is often overestimated, leading to significant overconfidence in decision making. Arguably one of the strongest biases. [Brenner et al. (1996); Chen & Koufaris (2015); Moore & Healy (2008); Tan, Tan, & Teo (2012)]
  Overplacement: Decision makers can believe that they perform better than others when this is not the case. Also called the better-than-average effect. [Koellinger, Minniti, & Schade (2007); Moore & Cain (2007)]
  Planning fallacy: The time to undertake a project is often significantly underestimated. Also called the optimism bias. Related to the conjunctive and disjunctive events bias. The “yes” bias is a special case of the planning fallacy. [Buehler, Griffin, & Ross (2002); Lovallo & Kahneman (2003); Shmueli, Pliskin, & Fink (2016b)]
  Redundancy bias: The more redundant and voluminous the data, the more confidence may be expressed in its accuracy and importance. [Arkes, Hackett, & Boehm (1989)]
  Self enhancement: People tend to overestimate their decision-making abilities and view themselves more positively than is warranted. Also called the positivity bias. [Dunning, Heath, & Suls (2004); Sedikides & Gregg (2008)]
  Test bias: Some aspects and outcomes of decision making cannot be tested, leading to unrealistic confidence in decision making. [Christensen-Szalanski & Bushyhead (1981)]

Availability Biases. These biases have to do with how memory affects decision making.
  Ease of recall: Events or occurrences can be overweighted in decision making if they can be easily recalled. Also called the salience, vividness, recency, and latency biases or effects. [Tversky & Kahneman (1981); Ma, Kim, & Kim (2014)]
  Retrievability bias: How memory is structured can affect how easily it can be recalled or searched. Also called the search bias. [Tversky & Kahneman (1974); Bazerman & Moore (2013)]
  Testimony bias: The inability to recall details of an event may lead to seemingly logical reconstructions that are inaccurate. [Briggs & Krantz (1992); Ricchiute (1997)]

Representativeness Biases. These biases affect how decision makers assess information relative to stereotypes or assumed representations.
  Base rate insensitivity: Decision makers can ignore relevant, known background information. [Kahneman & Tversky (1972); Moore, Oesch, & Zietsma (2007); Roy & Lerch (1996)]
  Chance misconception: Random events can be misconceived as the essential nature of a process. Also called the gambler’s fallacy. [Ayton, Hunt, & Wright (1989); Oppenheimer & Monin (2009); Tversky & Kahneman (1971)]
  Illusory correlation: The probability of two events occurring together can be overestimated if they have co-occurred in the past. The halo effect is a special case. [Alloy & Tabachnik (1984); Moody & Galletta (2015); Nisbett & Wilson (1977); Tversky & Kahneman (1973)]
  Regression to mean: Events regressing toward the mean on subsequent trials is a persistent pattern that is often ignored in decision making. [Jorgenson, Indahl, & Sjoberg (2003); Tversky & Kahneman (1974)]
  Sample size insensitivity: The nature and size of a sample can be ignored or downplayed in judging a sample’s representativeness. [Sedlmeier & Gigerenzer (1997); Tversky & Kahneman (1971)]
  Similarity bias: The likelihood of an event may be judged by the degree of its similarity with the class it is perceived to belong to. Also called stereotypical thinking. The misattribution effect is a special case of the similarity bias. [Horton & Mills (1984); Joram & Read (1996); Kahneman & Frederick (2002)]
  Subset bias: A conjunction or subset is often judged more probable than its set. Also called the conjunction fallacy. [Thuring & Jungermann (1990); Briggs & Krantz (1992)]

Confirmation Biases. These biases cause people to ignore or downplay information that is contrary to their current position.
  Anchoring bias: Adjustments from an initial position are usually insufficient; the anchor value overwhelms judgment. Also called the first impression bias. [Allen & Parsons (2010); Chapman & Johnson (1994); Critcher & Gilovich (2008); Ni, Arnott, & Gao (2019); Furnham & Boo (2011); Lim, Benbasat, & Ward (2000)]
  Confirmation trap: Decision makers tend to seek confirmatory evidence and do not search for, or downplay, disconfirming information. [Evans (1989); Huang, Hsu, & Ku (2012); Russo, Medvec, & Meloy (1996)]
  Conjunctive and disjunctive events bias: Outcome likelihoods are often overestimated in compound conjunctive problems and underestimated in compound disjunctive problems. [Bar-Hillel (1973); Teigen, Martinussen, & Lund (1996)]
  Endowment effect: Decision makers place excessive value on assets that they currently own. Related in some way to the effort heuristic. [Kahneman, Knetsch, & Thaler (1990, 1991); Rafaeli & Raban (2003)]
  Hindsight bias: In retrospect, the degree to which an event could have been predicted is often overestimated. Also called the curse of knowledge, the I-knew-it-all-along effect, and creeping determinism. [Arkes, Faust, Guilmette, & Hart (1988); Fischhoff (2007); Hertwig, Fenselow, & Hoffrage (2003)]
  Status quo bias: Decision makers can irrationally prefer a current situation over a preferable possible future. Also called conservatism, habit, and inertia. [Kahneman, Knetsch, & Thaler (1991); Polites & Karahanna (2012)]

Bounded Awareness Biases. These biases involve systematic and predictable failures to notice critical information.
  Change blindness: Decision makers may not notice obvious change, especially when it occurs over time in small, frequent episodes. [Mitroff, Simons, & Franconeri (2002); Tenbrunsel & Messick (2004)]
  Focalism: Decision makers can decide by focusing on a subset of available information and overweight this information in judgement. Also called the focusing illusion and the durability bias. [Schade & Kahneman (1998); Wilson et al. (2000)]
  Inattentional blindness: Decision makers may see only the item or information they are looking for, even when other important information is displayed. [Bazerman & Moore (2013); Mack (2003)]

Framing Biases. Concerned with how information is presented to a decision maker relative to a reference point.
  Framing bias: Events framed as either losses or gains may be perceived differently. See prospect theory below. The zero-cost bias is a special case of framing. [Davidson (2002); Kuo, Hsu, & Day (2009); Tversky & Kahneman (1981)]
  Linear bias: Decision makers are often unable to extrapolate a non-linear growth process. [Arnott & O’Donnell (2008); Mackinnon & Wearing (1991)]
  Presentation bias: The mode and mixture of presentation, including scales and order of presentation, can influence the perceived value of data. [Dusenbury & Fennma (1996); Hogarth & Einhorn (1992); Piramuthu, Kapoor, Zhou, & Mauw (2012); Remus (1984); Ricketts (1990)]
  Pseudocertainty: Decision makers can prefer the possibility of absolute certainty in a decision situation, even when this certainty is perceived rather than actual. [Kahneman & Tversky (1979a); Slovic, Fischoff, & Lichtenstein (1982); Tversky & Kahneman (1981)]

Escalation Biases. These biases involve the non-rational commitment to an action. Also called sunk-cost biases.
  Unilateral escalation: Decision makers may commit to follow or escalate a previous unsatisfactory course of action. [Keil, Mann, & Rai (2000); Staw (1981)]
  Competitive escalation: In addition to individual-based unilateral escalation, non-rational escalation can also be triggered by competition, especially between companies. [Bazerman & Moore (2013); Teger (1980)]

Other Effects. These are behaviors that arise from a cocktail of BE and social behaviors. They often involve more than an individual decision maker.
  Default trust bias: People assume that others are trustworthy and assess their statements and judgments as accurate when they may not be sound. Also called the implicit trust bias. [Gilbert (1991); Jaswal et al. (2010)]
  Herding effect: People act together in groups and adopt the group’s judgments uncritically. Similar to peer pressure. [Cialdini (2009)]
  Diversification bias: People tend to choose a more diverse set of options or actively seek an increased set of options in decision making. [Read & Lowenstein (1995); Read et al. (2001)]

* IS research is shown in italics.
Various aspects of BE, dual processes and heuristics and biases, can be combined to create theories of human behavior in decision-making situations as well as methods to improve decision making. Some of these are shown in Table 5. The most important of these theories is prospect theory (Kahneman & Tversky, 1979a).
Table 5. Derived Concepts, Theories, and Methods.

Concept: Definition. [Indicative references*]

Prospect Theory: A descriptive theory of decision making in risky situations. Posits that decision makers are influenced by perceived gains and losses in significantly different ways. [Adjerid, Peer, & Acquisti (2018); Kahneman & Tversky (1979a); Rose, Rose, & Norman (2004); Tversky & Kahneman (1992)]
Nudges: A modest intervention that changes behavior in a predictable way with the explicit goal of improving a process. [Halpern (2015); Kretzer & Maedche (2018); Thaler & Sunstein (2009)]
Debiasing: A process whereby the negative consequences of a cognitive bias are reduced or mitigated. [Arnott (2006); Bhandari, Hassanein, & Deaves (2008); Cheng & Wu (2010); Fischhoff (1982); Keren (1990)]

*IS research is shown in italics.
Prospect theory is a descriptive theory of decision making in risky and uncertain situations. As such it complements rather than replaces the prescriptive theories of decision making of neoclassical economics. Kahneman and Tversky developed prospect theory because they did not believe that expected utility theory adequately explained how humans actually make decisions, especially where there is a possibility of significant losses. Figure 1 shows prospect theory’s value function. Each individual will have a different shape to their value function, but Figure 1 portrays the most common shape based on empirical studies (Kahneman & Tversky, 1979a). The horizontal axis in Figure 1 measures the monetary gains or losses that follow a decision relative to a reference or starting point. The vertical axis measures the psychological value that a gain or a loss yields to an individual. In Figure 1, segments 2 and 4 represent risk-seeking behavior and segments 1 and 3, risk avoidance. On the dashed line in Figure 1 decision makers treat the value of losses and gains equally; an assumption underlying the economic theory of expected utility.
Figure 1. A Value Function from Prospect Theory.
Three cognitive features, all operating characteristics of System 1, are the foundation of prospect theory (Kahneman, 2011, pp. 281-282). They are: (1) evaluation is relative to a neutral reference point, which defines what is a gain and what is a loss; (2) there is diminishing sensitivity to both increasing values and increasing gains or losses; and (3) losses loom larger than gains in decision makers’ minds. Risk seeking is the dominant behavior when assessing losses (segment 4) while risk avoidance is the dominant pattern when assessing gains (segment 1). Prospect theory shows that humans do not behave as neoclassical economists assume, especially when risk aversion is in play. As Camerer and Loewenstein (2003, p. 20) related, “Prospect theory … explains experimental choices more accurately than EU because it gets the psychophysics of judgment and choice right.” Thaler (2015, p. 353) observed, “Prospect theory is, of course, the seminal evidence-based theory in behavioral economics.”
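The qualitative shape in Figure 1 is commonly formalized with the parametric value function that Tversky and Kahneman (1992) estimated for cumulative prospect theory. The form below is included for reference; the parameter values are Tversky and Kahneman’s median estimates, not results of this paper:

```latex
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \geq 0 \quad \text{(gains: concave, risk averse)} \\
-\lambda(-x)^{\beta} & \text{if } x < 0 \quad \text{(losses: convex, risk seeking)}
\end{cases}
```

With the median estimates $\alpha \approx \beta \approx 0.88$ and loss-aversion coefficient $\lambda \approx 2.25$, losses are weighted roughly twice as heavily as equivalent gains, producing the S-shaped curve of Figure 1 that is steeper in the loss domain.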
Nudging is not a theory of decision making but rather a strategy and process for using BE in an organizational setting. Nudges are essentially field experiments for organizational rather than academic ends. They are based on the action of dual process theory, especially where the action of System 1 overwhelms System 2 thinking in negative ways. Nudges are argued to be needed to lead people to superior decision outcomes because of the inherent flaws of human decision making, that is, the negative consequences of cognitive biases. The concept of nudges was developed by Thaler and Sunstein (2009), who defined a nudge as “any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives” (p. 6). Thaler, Sunstein, and Balz (2014) provide a discussion of the six principles of an effective choice architecture. Perhaps the most famous example of a nudge is changing the level of people’s pension savings in a positive way by altering the default on employment contracts from opt-in to opt-out. An opt-in default requires effort to commence or change pension saving plans. This effort, added to decision inertia, means that people often do not take the effort to opt in and thereby ensure a comfortable retirement. Nudging, by changing the default to opt-out, changes the effort frame such that many more people save for their retirement and reap the rewards of long-term compound interest. The explicit use of the nudge strategy has been most prominent in government. In 2009 Sunstein (of Thaler and Sunstein) was appointed to head the White House Office of Information and Regulatory Affairs (OIRA) by President Obama. This office was charged with using a BE-informed approach to government action. In 2010, on the other side of politics, a conservative UK government established the Behavioural Insights Team (BIT) in 10 Downing Street with a similar brief to OIRA. BIT is credited with saving many millions of pounds of government revenue and significantly changing citizen behavior towards better personal outcomes (Halpern, 2015).
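The pension example turns on simple compound-interest arithmetic. The sketch below illustrates the cost of opt-in inertia; the contribution, return, and delay figures are illustrative assumptions, not data from the paper or from BIT reports:

```python
# Illustrative only: contribution, return, and delay figures are invented
# assumptions for demonstration, not data from the paper or from BIT/OIRA.

def future_value(annual_contribution, annual_rate, years):
    """Future value of a stream of end-of-year contributions (ordinary annuity)."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_rate) + annual_contribution
    return balance

# Under an opt-in default, suppose inertia delays enrollment by 10 years
# of a 40-year career; under opt-out, saving starts from year one.
late_start = future_value(5_000, 0.05, 30)  # opt-in, enrolls 10 years late
default_in = future_value(5_000, 0.05, 40)  # opt-out, auto-enrolled

print(f"Opt-in (10-year delay): {late_start:,.0f}")
print(f"Opt-out (auto-enrol):   {default_in:,.0f}")
print(f"Cost of inertia:        {default_in - late_start:,.0f}")
```

Even with identical contributions and returns, the ten delayed years cost well over a third of the final balance, which is why the default change is such an effective nudge.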
It is important to note that not all actions of cognitive biases are negative. It is perhaps unfortunate that the term “bias” implies a universal negative effect as often the action of heuristics and biases can be a source of decision effectiveness (Gigerenzer, 2000). However, as documented above, their negative consequences can lead to serious errors in decision processes. Debiasing is a method for mitigating the negative consequences of biases. Strangely, the research on debiasing is small compared to the overall bias literature. The most influential works on debiasing are Fischhoff (1982) and Keren (1990). Keren proposed a debiasing method based on clinical psychology practice: identify the bias, consider alternatives for bias reduction, implement the remedy, and monitor the outcome. Fischhoff presented a number of debiasing approaches based on the decision context. The contexts he identified are, first, that the decision maker is the problem; second, that the task needs redesign; and third, that there is a mismatch between the decision maker and the task.
3. Previous Reviews of Behavioral Economics in IS Research
Tables 1 through 5 include a number of citations to IS articles that have used one or more aspects of BE as a major part of their work. While helpful, these examples do not show the depth and breadth of the use of BE in IS research in a rigorous and systematic way. To further this understanding there have been two literature reviews of BE in general IS research. Fleischmann et al. (2014) conducted the most detailed review to date of cognitive biases in IS research. They used a sample of the Basket of Eight (Bof8) set of journals, as defined by the AIS Senior Scholars,5 augmented by two journals and two conferences. Through keyword searches using 120 terms they identified 84 articles from 1992 to 2013 that used biases in their research. They then coded and analyzed their sample using content analysis and found that the use of biases in IS research has been steadily increasing over time. Their analysis showed that framing and anchoring were the most popular biases studied in their sample and that prospect theory (13/84 articles) was used to develop research models in IS research. Most of the IS articles that Fleischmann et al. (2014) reviewed used laboratory experiments or field experiments as their research strategy. The second literature review, Odnor and Oinas-Kukkonen (2017), analyzed a wider range of aspects of BE than Fleischmann et al. in the Bof8 from 2006 to 2014. However, their article selection criterion of experiments and experiment-like surveys was quite restrictive and yielded a BE-using sample of only 15 articles. They found significant diversity in the small sample, with the main focus being e-commerce, and recommender systems in particular. They argued that BE could have an impact on studies of systems design and use. In a sub-field of IS, Arnott and Pervan (2014), in their general analysis of the DSS discipline, found that the change of foundation theory from the early to contemporary BE was progressing slowly and citations to the early BE still dominated. They argued for a swap to contemporary BE reference theory.

5 http://aisnet.org/?SeniorScholarBasket
While these past descriptive literature reviews are valuable, they do have some restrictions and constraints. As a result, it is desirable and timely to conduct a critical rather than a descriptive literature analysis of the current use of BE in IS. The next sections of this paper are devoted to this task.
4. Research Method and Design

4.1 General Approach
This paper’s research involved the bibliometric content analysis (Weber, 1990) of representative IS research that used BE as foundation or reference theory. Bibliometric content analysis involves the coding and analysis of a sample of research articles. In this approach, data capture is driven by a protocol that has predominantly quantitative questions but can also include some qualitative responses. This form of data capture is labor intensive in that each article in the sample has to be read in full, sometimes multiple times. The approach has the advantage that it can illuminate the deep structure of the field in a way that is impossible to achieve with other literature analysis approaches.
As will be discussed in the next sub-section, the article sample covers five years of recent IS research. As a result, no analysis of trends over time is possible with this sample. In this way the research in this paper is best thought of as cross-sectional in nature rather than the longitudinal studies discussed in Section 3. Studying IS research from 2014 to 2018 provides a snapshot to help answer the first research question, “How has BE been used in IS research?” Where possible, reference is made to previous literature analyses to identify some temporal trends.
4.2 The Article Sample
To understand how BE is used in IS research, a representative sample of research articles that used behavioral economics was required. In determining the time-period of the sample, 2014 was chosen as the starting year. This date marks the end of the samples of previous relevant literature reviews (Fleischmann et al., 2014; Odnor & Oinas-Kukkonen, 2017). Further, 2014 represents a likely time for the influence of Thinking, Fast and Slow (Kahneman, 2011) to affect journal publication. Thinking, Fast and Slow led to widespread exposure of BE to researchers, business professionals, and the public. The end year of the sample is 2018 as this was the latest complete year of journal publication available to the researchers at the time of coding. The sample period is therefore 2014 to 2018.
The Basket of Eight set of journals was chosen as the journal sample as the Bof8 arguably represents the pinnacle of scientific IS publishing. The Bof8 comprises the European Journal of Information Systems (EJIS), Information Systems Journal (ISJ), Information Systems Research (ISR), Journal of the Association for Information Systems (JAIS), Journal of Information Technology (JIT), Journal of Management Information Systems (JMIS), Journal of Strategic Information Systems (JSIS), and MIS Quarterly (MISQ). The Bof8 provides a good mix of IS research traditions, epistemologies, and methods and is equally split between USA and European journals. The literature review sample strategy of using the Bof8 journals is common in IS theory and review papers (for example, Haddara & Hetlevik, 2016; Prat, Comyn-Wattiau, & Akoka, 2015; Tarafdar & Davison, 2018; and Ozdemir, Smith, & Benamati, 2017).
In identifying potential articles for coding, only original research was included. This means that editorials, short commentaries on other papers, and responses to critiques are not part of the sample; however, research notes and essays were included. The selection of potential articles for the sample could not be made by automated search. This is because the classification of articles in the various journals is not consistent. Further, some editorials, responses, and commentaries are not clearly identified as such and can appear in a search as a research article. The process adopted by this research was for the second author to carefully read each issue of each journal to identify the articles to consider for the sample. This process maximized the sampling reliability and yielded a set of 1,348 articles. The overall sampling process is shown in Figure 2.
Figure 2. The Article Sampling Process.
The next step was to identify those articles of the 1,348 that used BE in some way. Various search strategies to identify articles that use BE in this large set were considered and abandoned. Searching keywords was not appropriate as not all articles that used an aspect of BE nominated an explicit keyword to identify this use. Further, searching on BE terms yielded extraneous results. For example, searching for “bias” yielded a number of non-BE concepts such as sampling bias and common method bias. The search was further complicated as many BE-using articles term a cognitive bias an effect, trap, or illusion. They also used different names for a particular effect. This means that any set of search terms needs to be very large to cover all aspects of BE. As a result of these issues, the tactic used to identify BE-using articles in the 1,348-article set was to manually inspect the reference list of every article and identify any BE reference theory citations. This approach was time consuming but was more rigorous and reliable than computer-based search using a large number of terms. To further increase reliability, this exercise was conducted by the first author. Care was taken to identify only BE reference theory, that is, non-IS articles that are BE in nature. BE reference theory arises mainly from cognitive and social psychology, but more recent foundation work is explicitly identified as BE. A list of the 410 BE reference theory citations in the IS articles of the sample is available from the authors. This step of the process yielded a potential sample of 249 articles. During coding 33 articles were removed from the sample as their foundations were not judged to be truly BE in nature. The final sample that emerged from these various vetting processes comprises 216 articles; its distribution is shown in Table 6. A reference list of the sample is also available from the authors. The 47 IS articles that exhibited a major use of BE are shown later in Table B1 in Appendix B.
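The contrast between naive keyword search and the reference-list tactic can be sketched as follows. The keyword, foundation-work list, and article records below are invented for illustration; they are not the study’s actual search instruments or data:

```python
# Illustrative sketch only: the keyword, BE works, and article records are
# invented examples, not the instruments or data used in the study.

BE_KEYWORD = "bias"  # a naive keyword search term

# A tiny hypothetical list of BE foundation works to match in reference lists.
BE_FOUNDATION_WORKS = {
    "tversky & kahneman (1974)",
    "kahneman & tversky (1979)",
    "thaler & sunstein (2009)",
}

articles = [
    {"title": "Common method bias in survey research",
     "references": ["podsakoff et al. (2003)"]},
    {"title": "Anchoring effects in recommender agents",
     "references": ["tversky & kahneman (1974)"]},
]

def keyword_hit(article):
    """Keyword search: flags any article mentioning 'bias', BE-related or not."""
    return BE_KEYWORD in article["title"].lower()

def reference_hit(article):
    """Reference-list tactic: flags only articles citing BE foundation works."""
    return any(ref in BE_FOUNDATION_WORKS for ref in article["references"])

for a in articles:
    print(a["title"], "| keyword:", keyword_hit(a), "| references:", reference_hit(a))
```

The first article is a false positive for keyword search (method bias, not a cognitive bias) and the second a false negative (a BE study that never uses the word “bias”), while reference-list inspection classifies both correctly.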
Table 6. Overview of the Article Sample.

Journal | 2018 Impact Factor | Origin | Articles 2014-2018 | Articles citing BE: 2014 | 2015 | 2016 | 2017 | 2018 | Total | % of journal with BE use
EJIS | 3.917 | Europe | 165 | 2 | 4 | 3 | 4 | 5 | 18 | 10.9
ISJ | 4.267 | Europe | 133 | 3 | 5 | 4 | 4 | 2 | 18 | 13.5
ISR | 2.301 | USA | 235 | 12 | 6 | 8 | 5 | 9 | 40 | 17.0
JAIS | 2.839 | USA | 156 | 5 | 7 | 6 | 7 | 9 | 34 | 21.8
JIT | 4.535 | Europe | 102 | 5 | 3 | 2 | 3 | 6 | 19 | 18.6
JMIS | 2.744 | USA | 205 | 5 | 12 | 5 | 7 | 6 | 35 | 17.1
JSIS | 4.313 | Europe | 86 | 2 | 3 | 3 | 2 | 4 | 14 | 16.3
MISQ | 5.430 | USA | 266 | 6 | 5 | 1 | 15 | 11 | 38 | 14.3
Total | | | 1,348 | 40 | 45 | 32 | 47 | 52 | 216 | 16.0
4.3 Coding Protocol and Procedures
In bibliometric content analysis, data is collected using a structured protocol that mainly requires quantitative responses. This approach has been used in IS literature reviews (for example, Arnott & Pervan, 2014; Sivarajah et al., 2017; Trieu, 2017). The protocol was designed to minimize interpretive and subjective assessments and it is believed that other researchers could apply the protocol and code similar data. One way of assuring inter-coder reliability is for all coders to code all articles and then evaluate their reliability using a statistic like Cohen’s kappa. However, with this particular sample such an approach was impractical. The sample contains 216 articles and coding each article requires careful reading and coding decisions. Difficult or complex articles took up to four hours to code and “normal” papers at least an hour. As a result, the two authors shared the coding of the sample and a different approach to ensuring inter-coder reliability was followed. A codebook was developed that specified how each item should be coded and included definitions of the constructs and concepts that were being assessed. The authors then coded a pilot set of two articles that prima facie had a significant use of BE. They then compared and discussed their coding, and the protocol and codebook were amended. This cycle was repeated three times, after which the coders were well calibrated and the protocol required no further change. Based on the experience with this process it was decided that all difficult articles would be coded by both authors; the coding of around 50 articles was debated by the researchers. A condensed version of the article coding protocol appears in Appendix A. The full protocol and codebook are available from the authors.
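For projects where full double-coding is feasible, Cohen’s kappa compares observed agreement with the agreement expected by chance from each coder’s marginal label frequencies. A minimal computation follows; the two coders’ labels are invented for illustration, not the study’s coding data:

```python
# Minimal Cohen's kappa for two coders; the example labels are invented, not study data.
from collections import Counter

def cohens_kappa(coder1, coder2):
    """kappa = (p_o - p_e) / (1 - p_e): observed vs chance-expected agreement."""
    assert len(coder1) == len(coder2) and coder1
    n = len(coder1)
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n  # observed agreement
    freq1, freq2 = Counter(coder1), Counter(coder2)
    # Chance agreement: product of the coders' marginal probabilities per label.
    p_e = sum((freq1[lab] / n) * (freq2[lab] / n)
              for lab in freq1.keys() | freq2.keys())
    return (p_o - p_e) / (1 - p_e)

# Two coders rating six articles' degree of BE use:
c1 = ["major", "major", "minor", "moderate", "minor", "moderate"]
c2 = ["major", "moderate", "minor", "moderate", "minor", "moderate"]
print(round(cohens_kappa(c1, c2), 3))  # prints 0.75
```

Here the coders agree on five of six articles (p_o = 0.833) against a chance expectation of one third, giving kappa = 0.75; values above roughly 0.6 to 0.8 are conventionally read as substantial agreement.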
The protocol has three main sections: general research questions, general BE questions, and detailed BE questions. The general research questions addressed the article type by research method, epistemology, unit of analysis, the IS field, research domain, the task addressed, and the organizational nature of the task. The general BE questions were used to assess the overall nature of BE citations in the paper and how important the BE citations were to the article’s research. The citation count for all BE reference articles was recorded, including where in the article each was cited. This section of the protocol contained an important item that required subjective judgement by the coder. This item, the degree of use of BE, was judged as major, moderate, or minor. Where there was uncertainty with an assessment, the higher score or code was assigned. This is called coding with a generosity bias. The section of the protocol with detailed BE questions addressed the particular aspects of BE that were used and the specific heuristics and biases that may have been involved. Cognitive biases were coded using the typology and definitions from Table 4. Anchoring and adjustment was a special coding case and was coded either as a bias or a heuristic depending on the meaning of the effect that was used in an article. In addition to “pure” BE reference theory, economics research that has founded some BE research was identified and coded. An example of this foundational economics work is Akerlof (1970). The relationship of this foundational economics to BE is similar to that of the theory of reasoned action (Ajzen & Fishbein, 1973) to the popular technology acceptance model in IS (Davis, 1989).
4.4 Data Entry and Analysis Procedures
The 216 coded protocols were entered into SPSS for analysis. A research assistant entered all data and reviewed all coding during the act of data entry using the codebook as a reference. If any issues were identified during data entry, the second author reviewed the coding and resolved the issue. The second author then validated all SPSS data to ensure data entry accuracy and completeness. A number of cross tabulations and frequency analyses were conducted to ensure data integrity before the data analysis was conducted. Because of the nature of the protocol and the size of the sample, the analysis focuses on descriptive statistics and cross tabulation. This analysis tactic has been used by other IS literature reviews (for example, Fleischmann et al., 2014; Chiu et al., 2019; Tummers, Kassahun, & Tekinerdogan, 2019).
47
48
cr

49
50
51
ip

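The frequency and cross-tabulation analyses described above were run in SPSS; as a minimal sketch of the same tactic, the following Python/pandas fragment computes counts, sample percentages, and a cross tabulation. The data and the column names (`degree_of_use`, `research_domain`) are hypothetical stand-ins, not the study's actual codebook fields.

```python
import pandas as pd

# Hypothetical coded protocols; the real study coded 216 articles in SPSS.
articles = pd.DataFrame({
    "degree_of_use": ["major", "minor", "minor", "moderate", "major", "minor"],
    "research_domain": ["e-commerce", "DSS", "e-commerce", "security",
                        "finance", "e-commerce"],
})

# Frequency analysis: counts and sample percentages for one protocol item.
freq = articles["degree_of_use"].value_counts()
pct = (freq / len(articles) * 100).round(1)

# Cross tabulation of two protocol items, the form used for matrices like Table 23.
xtab = pd.crosstab(articles["research_domain"], articles["degree_of_use"])

print(freq)
print(pct)
print(xtab)
```

With descriptive statistics of this kind, data-integrity checks reduce to inspecting whether every cross-tabulation cell and marginal total is plausible before any substantive analysis.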
5. RESULTS

This section involves the examination of BE as foundation theory in IS research and is based on the analysis of two main data sets: the 216-article sample of BE-using IS articles and the 47 articles that used BE as a major part of their research. An article was coded as a major user of BE if the use was integral to the project, particularly to its design and analysis.
Earlier, Table 6 showed the distribution of the overall sample of BE-using articles over time and over journals. The table shows that there are no significant temporal trends of overall publication in the sample period. There was a drop in BE-using publication in 2016 but this was offset by a rise in 2018. As a result, the tables in this Results section are not displayed by year; they feature the frequencies and sample percentages of the various aspects, effects, and constructs being analyzed for the five-year sample period. As discussed above in Section 4.1, the literature analysis in this paper is cross-sectional in nature. In the tables, data has been rounded to one decimal place. As a result, some totals in percentage columns are either slightly higher or slightly lower than 100%. Most of the tables report coding where multiple codes per item in the protocol were allowed. These tables are identified throughout this section.
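The rounding effect on percentage totals can be seen with a small calculation. Using the 72-of-216 proportion that appears in Table 8's Early row purely as an arithmetic illustration: three equal categories of 72 articles each are 33.333…%, which rounds to 33.3% apiece.

```python
# Three equal categories out of a sample of 216 articles.
counts = [72, 72, 72]
total = sum(counts)

# Round each category's percentage to one decimal place, as in the tables.
pcts = [round(c / total * 100, 1) for c in counts]

print(pcts)                 # [33.3, 33.3, 33.3]
print(round(sum(pcts), 1))  # 99.9 -- the column does not total exactly 100%
```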
Table 7 adds the display of the 47-article major BE-use sample to the data on the overall sample from Table 6. USA-published journals dominate the major BE-use sample: 29.8% of articles are in European journals, 70.2% in USA journals. The journal origin publication pattern could be an artifact of the dominant research methods in the sample (see Table 14); this issue will be returned to in the Discussion. What stands out in Table 7 is the superficial importance of BE to contemporary IS research. In the Bof8 journals, 16.0% of publication involves BE in some way and all journals have over 10% of their articles involving some BE citation. However, when considering the major BE-use articles, only 3.5% of IS research uses BE in a fundamental way. Of the major-use articles, 2.2% used cognitive biases, an increase over the 1992-2012 cognitive bias sample of Fleischmann et al. (2014), who found that less than 1% of IS articles used biases.
Table 7. BE-using IS Articles by Journal.

Journal | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
EJIS | 18 | 8.3 | 8 | 17.0
ISJ | 18 | 8.3 | 5 | 10.6
ISR | 40 | 18.5 | 6 | 12.8
JAIS | 34 | 15.7 | 11 | 23.4
JIT | 19 | 8.8 | 0 | 0.0
JMIS | 35 | 16.2 | 8 | 17.0
JSIS | 14 | 6.5 | 1 | 2.1
MISQ | 38 | 17.6 | 8 | 17.0
Total | 216 | | 47 |
The 47 articles in the major BE-use sample are shown in Table B1 in Appendix B. Unlike other areas of IS (Taylor, Dillon, & Van Wingen, 2010), there were no dominant authors in the sample. A total of 129 authors were responsible for the 47-article sample and only nine authors contributed to more than one article. Geographically, in terms of the AIS chapters, the Americas featured in 55.4% of author affiliations; Europe, the Middle East, and Africa in 24.3%; and the Asia Pacific in 20.3%. In terms of institutions, Georgia State University and HEC Montreal had the highest number of author affiliations, with Indiana University, University of British Columbia, University of Minnesota, and the National University of Singapore the only other institutions with more than three affiliations.
Arnott and Pervan (2014), in a literature analysis of 21 years of DSS research (1,466 articles), found that citations to early BE dominated DSS research, although the use of contemporary BE was increasing. In their last analysis period early BE still accounted for 59.7% of citations. This was surprising, as DSS is the area of IS most concerned with decision making and it was assumed that the field would naturally use the most recent foundation theory. The more recent article samples in this paper indicate a very different situation. Table 8 shows that contemporary BE dominates the use of BE in both samples; it represents almost 90% of the major use sample. The major use sample has only one article with an economics foundational citation.
Table 8. Generation of BE Used in IS Research.

Generation | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Early | 72 | 33.3 | 18 | 38.3
Contemporary | 155 | 71.8 | 42 | 89.4
Foundational | 22 | 10.2 | 1 | 2.1
Total | 249* | | 61* |
*multiple codes per article allowed
Table 9 shows greater detail about how IS researchers use BE. In the coding, major use was defined as integral to the project reported in the article; moderate use was where an important aspect of the article used BE; and minor use was where BE was cited to support an argument or position. Minor use can involve only one or two BE cites. In the overall BE-using IS article sample minor use dominates. The table shows that in the major BE-use sample only one article used foundational economics theory, and then only in a minor way.
Table 9. Degree of Use of BE in IS Research.

Generation | Degree of Theory Use | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Early BE | Minor | 49 | 22.7 | 5 | 10.6
Early BE | Moderate | 14 | 6.5 | 4 | 8.5
Early BE | Major | 9 | 4.2 | 9 | 19.2
Early BE | None | 144 | 66.7 | 29 | 61.7
Contemporary BE | Minor | 77 | 35.7 | 0 | 0.0
Contemporary BE | Moderate | 38 | 17.6 | 2 | 4.3
Contemporary BE | Major | 40 | 18.5 | 40 | 85.1
Contemporary BE | None | 61 | 28.2 | 5 | 10.6
Foundational | Minor | 21 | 9.7 | 1 | 2.1
Foundational | Moderate | 1 | 0.5 | 0 | 0.0
Foundational | Major | 0 | 0.0 | 0 | 0.0
Foundational | None | 194 | 89.8 | 46 | 97.9
Table 10 shows BE references that received more than ten citations in the overall article sample. The table is ordered by citation count. The table reinforces the analysis in Tables 8 and 9 that showed a dominance of contemporary BE use in IS research. In the table there are 454 cites to contemporary BE and only 77 cites to early BE. The table shows that Akerlof (1970) is the only economics foundational reference of note. Kahneman and Tversky (1979a), the seminal article on prospect theory, is by far the most cited BE work in IS. In terms of reference disciplines, the various branches of psychology are overwhelmingly featured in the table. Understandably, economics and business administration are also important.
Table 10. Most Popular Behavioral Economics References in IS Research.

Foundation Article | BE Era | Topic | Author Discipline | No. of Cites
Kahneman & Tversky (1979a) | Contemporary | Prospect theory | Psychology | 76
Tversky & Kahneman (1974) | Contemporary | Heuristics, biases | Psychology | 37
Cyert & March (1963) | Early | Overall view | Economics, Organization science | 33
Kahneman (1973) | Contemporary | Heuristics | Psychology | 28
Kahneman (2011) | Contemporary | Overall view | Psychology | 28
Payne et al. (1993) | Contemporary | Overall view | Psychology, Business administration | 28
Samuelson & Zeckhauser (1988) | Contemporary | Biases | Economics | 26
Kahneman (2003) | Contemporary | Overall view | Psychology | 22
Tversky & Kahneman (1981) | Contemporary | Prospect theory | Psychology | 22
Harinck et al. (2007) | Contemporary | Prospect theory | Psychology | 18
March (1994) | Early | Overall view | Organization science | 18
Kahneman & Tversky (1979b) | Contemporary | Prospect theory | Psychology | 17
Gigerenzer & Todd (1999) | Contemporary | Heuristics | Psychology | 16
Chaiken (1980) | Contemporary | Dual processes | Psychology | 15
Lovallo & Kahneman (2003) | Contemporary | Biases | Business administration, Psychology | 15
Payne (1976) | Contemporary | Heuristics | Psychology | 15
Simon (1955) | Early | Bounded rationality | Psychology, Computer science | 15
Akerlof (1970) | Foundational | Non-rational pricing | Economics | 14
Chaiken & Maheswaran (1994) | Contemporary | Heuristics, biases | Psychology, Marketing | 14
Kahneman & Lovallo (1993) | Contemporary | Prospect theory | Psychology, Business administration | 14
Payne (1982) | Contemporary | Context | Psychology | 14
Langer (1975) | Contemporary | Biases | Psychology | 13
Thaler & Sunstein (2008) | Contemporary | Nudges | Economics, Law | 13
Bazerman & Moore (2009) | Contemporary | Overall view | Business administration | 12
Ariely & Simonson (2003) | Contemporary | Biases | Psychology, Marketing | 11
Simon (1956) | Early | Bounded rationality | Psychology, Computer science | 11
Table 11 shows where in the articles BE reference theory was used. The table shows that, in general, BE is most used in the Introduction and Background sections of IS articles. Of interest is that in the major BE-use sample significant BE citation is maintained throughout an article.
Table 11. Location of BE Use in IS Articles.

Location of BE Citation | Any Use of BE (n=216) | Minor Use of BE (n=138) | Moderate Use of BE (n=51) | Major Use of BE (n=47)
Introduction and background | 67.6% | 55.1% | 84.3% | 97.9%
Method, formulating models and hypotheses | 55.6% | 46.4% | 64.7% | 85.1%
Interpreting results and discussion | 44.4% | 34.8% | 49.0% | 78.7%
Cells are the percentage of articles in each sample with a BE citation in that location.
Taylor et al. (2010), using a longitudinal author co-citation analysis from 1986 to 2005, found that IS research has consistently centered on three major themes: development, implementation, and use of systems in various application domains; IS strategy and business outcomes; and group work and decision support. An analysis of these themes in the BE-using IS samples is shown in Table 12. We expected the use of BE foundations to be skewed towards the DSS category as this is the area of IS most concerned with decision making. Arnott and Pervan (2014, p. 277) found that DSS articles comprise 18.7% of all IS publication. Table 12 shows that this is almost identical to the share of major BE-using articles in the DSS area.
Table 12. Use of BE in IS Research Themes.

Research Theme | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Development, implementation, and use of systems | 154 | 71.3 | 33 | 70.2
IS strategy and business outcomes | 45 | 20.8 | 7 | 14.9
Group work and decision support | 26 | 12.0 | 9 | 19.1
Total | 225* | | 49* |
*multiple codes per article allowed
The analysis now turns to the nature of inquiry in the two samples (overall use of BE in IS and major use of BE). Chen and Hirschheim (2004), in their analysis of 1,873 articles of a similar journal set to this paper, found that 81% of IS research was positivist in nature and 19% interpretivist. They suggested that US journals were the oldest publications in the discipline and at the time of their analysis were the largest IS forums. As a result, they argued that US journals have created and maintained the dominant positivist philosophy of science in IS. Chen and Hirschheim further argued that the rise of European journals meant that IS could "… embrace a more non-traditional, non-positivist nature and attract IS researchers with non-traditional orientations" (p. 204). Eighteen years after Chen and Hirschheim's sample, the Bof8 comprises an equal split of US and European journals, although US journals publish more articles per year than European; in our sample period, 486 European to 862 US articles (see Table 6). Table 13 shows that BE-using IS research is more positivist in nature than Chen and Hirschheim predicted for the future of the IS discipline. The major BE-use sample has 95.7% of articles coded as positivist.
Table 13. Epistemology of BE-using IS Articles.

Epistemology | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Positivist | 193 | 89.4 | 45 | 95.7
Interpretivist | 18 | 8.3 | 1 | 2.1
Unclear | 5 | 2.3 | 1 | 2.1
Total | 216 | | 47 |
Table 14 takes the analysis of inquiry in BE-using IS research further by examining the research methods used in the two samples. Chen and Hirschheim (2004) found that in general IS research surveys were the most popular method at 41% of articles, then case studies at 36%, laboratory and field experiments at 20%, and finally action research at 3% of IS articles. The high case study percentage in their sample was mainly due to major European journals; these case studies were both positivist and interpretive in nature. Table 14 shows a significantly different pattern to general IS: in the major BE-use sample, case studies were the lowest fraction of methods in the articles, while experiments and field studies comprised almost half the research methods in the sample. This is a similar finding to Fleischmann et al.'s (2014) review of bias-using IS research. Combining experiments with surveys and the use of secondary data, Table 14 shows that 80.9% of the major use sample used typical North American business school methods. There were no descriptive research articles or literature reviews in the major use sample.
Table 14. BE-using IS Articles by Research Approach.

Approach | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Experiments & field studies | 85 | 39.4 | 24 | 51.1
Conceptual studies, descriptive research, & literature reviews | 23 | 10.6 | 5 | 10.6
Case studies | 25 | 11.6 | 3 | 6.4
Surveys | 35 | 16.2 | 8 | 17.0
Secondary data | 38 | 17.6 | 6 | 12.8
Design science & action research | 21 | 9.7 | 4 | 8.5
Total | 227* | | 50* |
*multiple codes per article allowed
The unit of analysis in the BE-using samples was coded as free text. This coding was then analyzed and classified using descriptive coding (Saldana, 2013, p. 88) to yield the results in Table 15. The unit of analysis for an article's research was specified by the authors or inferred by the coders in 71.3% of the any use sample and in 85.1% of articles in the major use sample. The remaining articles were coded as unclear or not applicable, mainly because some article types, such as conceptual studies, used research methods that do not always specify a unit of analysis. Table 15 shows an overwhelming use of the individual as the unit of analysis in BE-using IS research. This is understandable as it is likely that the articles are studying some aspect of human cognition – an individual concern. This is similar to the findings of Fleischmann et al. (2014).
Table 15. Unit of Analysis in BE-using IS Articles.

Unit of Analysis | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Individual | 106 | 68.4 | 29 | 72.5
Group | 8 | 5.2 | 1 | 2.5
Organization | 4 | 2.6 | 1 | 2.5
Process | 17 | 11.0 | 5 | 12.5
IT artifact or outcome | 18 | 11.6 | 3 | 7.5
Other | 2 | 1.3 | 1 | 2.5
Total | 155* | 100.1 | 40 | 100.0
*multiple codes per article allowed
The research domains in Table 16 followed a similar coding strategy to the unit of analysis coding. Research domains were collected as free text and this was analyzed using descriptive coding (Saldana, 2013, p. 88) to yield the categories in Table 16. The table shows that e-commerce is the domain most engaged by IS researchers who use BE reference theory, followed by social and crowd computing in the overall sample and finance/investment in the major use sample. E-commerce was also the most prominent research domain from 1992 to 2012 in the literature analyses of Fleischmann et al. (2014) and Odnor and Oinas-Kukkonen (2017). Further to the analysis in Table 12, DSS is, unexpectedly, one of the smallest BE-using research domains in the samples.
Table 16. Research Domain of BE-using IS Articles.

IS Research Domain | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
e-commerce | 59 | 27.3 | 11 | 23.4
Mobile | 5 | 2.3 | 2 | 4.3
Social & Crowd Computing | 36 | 16.7 | 5 | 10.6
Virtual | 9 | 4.2 | 5 | 10.6
IS Theory & Research | 21 | 9.7 | 4 | 8.5
IS Use/Adoption | 27 | 12.5 | 6 | 12.8
IS Analysis & Design & Development | 20 | 9.3 | 3 | 6.4
Business Function & Industry | 24 | 11.1 | 5 | 10.6
Finance/Investment | 16 | 7.4 | 7 | 14.9
Security/Privacy | 23 | 10.7 | 5 | 10.6
IT Management & Governance | 21 | 9.7 | 3 | 6.4
DSS | 10 | 4.6 | 2 | 4.3
Knowledge Management | 6 | 2.8 | 3 | 6.4
Operational IT | 6 | 2.8 | 0 | 0.0
Total | 283* | | 61* |
*multiple codes per article allowed
The analysis now turns to the nature of the tasks addressed by the BE-using research in the two samples. The coding of this aspect of IS research (Questions 7 and 8 in Appendix A) was largely made according to the coders' interpretation of the articles' tasks. Table 17 shows the analysis of tasks in the articles using the classifications of Anthony (1965). The distributions in the two samples are relatively similar; operational tasks are dominant in BE-using IS research, perhaps because of the individual nature of e-commerce consumer decision-making tasks. The distributions in the table may mirror the distribution of tasks in organizations.
Table 17. Organizational Nature of Tasks in the Sample.

Task Nature | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Strategic | 22 | 10.2 | 5 | 10.6
Tactical | 75 | 34.7 | 15 | 31.9
Operational | 139 | 64.4 | 35 | 74.5
Total | 236* | | 55* |
*multiple codes per article allowed
Table 18 shows an alternative to Anthony's task classification and uses BE's dual process theory to classify tasks. In coding the articles, it was sometimes difficult to classify a task as the cognitive nature of decisions can vary between individuals. Some articles do not address any specific organizational, personal, or social tasks or processes. Further, conceptual studies, descriptive research, and literature reviews are not necessarily task related and were often coded as unclear. As a result, "unclear" is the largest task category in the overall BE-using sample. Fortunately, the articles in the major BE-using sample were more definite about the cognitive nature of the tasks that their research addressed. The definitions from Table 2 were used in the coding of these tasks. The "dominant" part of the classification in the table is important as no task is addressed by using only one cognitive system.
Table 18. Type of Tasks Addressed by BE-using Research.

Type of Task | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
System 1 dominant | 59 | 27.3 | 15 | 31.9
System 2 dominant | 20 | 9.3 | 3 | 6.4
Strong S1/S2 interaction | 64 | 29.6 | 23 | 48.9
Unclear | 74 | 34.3 | 7 | 14.9
Total | 217* | | 48* |
*multiple codes per article allowed
The analysis now turns to the actual BE theories and concepts that are used in IS research. In terms of early BE, Table 19 shows that bounded rationality, the foundation of all BE theory, remains a popular organizing construct in both samples. The phase model of decision making continues to be used in some way in 19.1% of the major use articles; the phase model is still the basis of more IS studies than modern theories like nudges. There is more on this point in the Discussion. Structuredness and problem decomposition from early BE do not feature in the major use sample. Table 19 shows a strong move to contemporary BE in IS research. In the major BE-use sample cognitive biases dominate the BE theories used, at 61.7% of the sample, followed by dual process theory and heuristics, each used by 42.6% of articles. Table 19 does show that IS researchers have a wide awareness of contemporary BE. In Tables 19 through 23 the cells with a null or zero coding result have been shaded to enable these results to be easily identified.
Table 19. Aspects of BE in BE-using IS Research.

BE Theory | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Bounded rationality | 72 | 33.3 | 18 | 38.3
Phase model | 31 | 14.4 | 9 | 19.1
Decision structuredness | 8 | 3.7 | 0 | 0.0
Dual process theory | 33 | 15.3 | 20 | 42.6
Heuristics | 39 | 18.1 | 20 | 42.6
Cognitive biases | 80 | 37.0 | 29 | 61.7
Prospect theory | 44 | 20.4 | 15 | 31.9
Nudges | 10 | 4.6 | 6 | 12.8
Debiasing | 8 | 3.7 | 6 | 12.8
Decomposition of complex problem | 2 | 0.9 | 0 | 0.0
General BE use | 5 | 2.3 | 0 | 0.0
Total | 332* | | 123* |
*multiple codes per article allowed
The next three tables provide more detail about the aspects of BE used in IS research. Table 20 shows that there is a relatively low use of judgment heuristics as a core BE reference theory in IS research. Apart from the "general mention" of heuristics, the recently defined affect heuristic is the most cited, but by only 6.0% of the overall BE-using sample. In the major use sample general mention is the highest category along with the affect heuristic. Only five articles in the major use sample cited the original Kahneman and Tversky heuristics, and one of these involved anchoring and adjustment, which is no longer considered to be a general heuristic. When the articles were inspected, only seven in the major use sample used heuristics as a core foundation for their work. This means that only 3.2% of all BE-using IS research used heuristics in a major way (0.5% of all IS research). Further, all of the use of the Kahneman and Tversky heuristics in the major use sample occurred in 2014; from 2015 only the affect heuristic formed the basis of any IS research.
Table 20. Judgment Heuristics in BE-using IS Research.

Heuristic | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Availability | 7 | 3.2 | 3 | 6.4
Representativeness | 1 | 0.5 | 1 | 2.1
Anchoring and adjustment | 3 | 1.4 | 1 | 2.1
Affect | 13 | 6.0 | 9 | 19.1
Effort | 0 | 0.0 | 0 | 0.0
Recognition | 0 | 0.0 | 0 | 0.0
General mention | 18 | 8.3 | 9 | 19.1
Total | 42* | | 23* |
*multiple codes per article allowed
The next part of the analysis focuses on cognitive biases. Table 21 concerns the highest level of the typology in Table 4, which grouped biases into eight categories, while Table 22 addresses the lower layer of the typology, which identifies 38 individual biases and effects. It should be noted that the totals for Tables 21 and 22 are different because articles coded as "general mention" in Table 21 do not have a lower level of coding. Further, some articles address multiple biases within a higher category.
Table 21 shows the coding of the high level of the bias typology presented in Table 4. All mentions of biases in the samples were coded at the higher level. In both samples the most frequently addressed categories in IS research were confirmation, framing, and overconfidence respectively. Of note is that no article addressed or mentioned bounded awareness biases, even in a very minor way.
Table 21. Cognitive Biases in BE-using IS Research – High Level Analysis.

Bias Category | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Overconfidence biases | 15 | 6.9 | 8 | 17.0
Availability biases | 9 | 4.2 | 3 | 6.4
Representativeness biases | 4 | 1.9 | 1 | 2.1
Confirmation biases | 29 | 13.4 | 13 | 27.7
Bounded awareness biases | 0 | 0.0 | 0 | 0.0
Framing biases | 20 | 9.3 | 10 | 21.3
Escalation biases | 7 | 3.2 | 5 | 10.6
General mention | 14 | 6.5 | 4 | 8.5
Other | 6 | 2.8 | 1 | 2.1
Total | 104* | | 45* |
*multiple codes per article allowed
Table 22 shows that there is wide but thin use of biases in IS research. In the major use sample 19 biases were used in IS research but another 19 biases were not. Framing, anchoring, and confirmation were the most used biases in the major use sample, but the frequencies are too low for statistical analysis. This is a similar finding to the analysis of BE-using DSS research by Arnott and Gao (2019) and the use of biases in IS research by Fleischmann et al. (2014).
Table 22. Cognitive Biases in BE-using IS Research – Bias Level Analysis.

Bias | Frequency (Any Use of BE) | % of Sample (Any Use) | Frequency (Major Use of BE) | % of Sample (Major Use)
Completeness bias | 0 | 0.0 | 0 | 0.0
Complexity bias | 1 | 0.5 | 1 | 2.1
Desire bias | 0 | 0.0 | 0 | 0.0
Illusion of control | 4 | 1.9 | 2 | 4.3
Overconfidence | 7 | 3.2 | 3 | 6.4
Overplacement | 1 | 0.5 | 1 | 2.1
Planning fallacy | 3 | 1.4 | 3 | 6.4
Redundancy bias | 1 | 0.5 | 0 | 0.0
Self enhancement | 5 | 2.3 | 3 | 6.4
Test bias | 0 | 0.0 | 0 | 0.0
Ease of recall | 6 | 2.8 | 2 | 4.3
Retrievability bias | 1 | 0.5 | 1 | 2.1
Testimony bias | 0 | 0.0 | 0 | 0.0
Base rate insensitivity | 1 | 0.5 | 0 | 0.0
Chance misconception | 0 | 0.0 | 0 | 0.0
Illusory correlation | 1 | 0.5 | 0 | 0.0
Regression to mean | 0 | 0.0 | 0 | 0.0
Sample size insensitivity | 0 | 0.0 | 0 | 0.0
Similarity bias | 2 | 0.9 | 1 | 2.1
Subset bias | 0 | 0.0 | 0 | 0.0
Anchoring effect | 12 | 5.6 | 5 | 10.6
Confirmation trap | 7 | 3.2 | 5 | 10.6
Conjunctive and disjunctive events bias | 0 | 0.0 | 0 | 0.0
Endowment effect | 6 | 2.8 | 4 | 8.5
Hindsight bias | 0 | 0.0 | 0 | 0.0
Status quo bias | 7 | 3.2 | 3 | 6.4
Change blindness | 0 | 0.0 | 0 | 0.0
Focalism | 0 | 0.0 | 0 | 0.0
Inattentional blindness | 0 | 0.0 | 0 | 0.0
Framing | 16 | 7.4 | 9 | 19.1
Linear bias | 0 | 0.0 | 0 | 0.0
Presentation bias | 3 | 1.4 | 1 | 2.1
Pseudocertainty | 0 | 0.0 | 0 | 0.0
Unilateral escalation | 2 | 0.9 | 2 | 4.3
Competitive escalation | 1 | 0.5 | 1 | 2.1
Default trust bias | 5 | 2.3 | 1 | 2.1
Herding effect | 1 | 0.5 | 1 | 2.1
Diversification bias | 1 | 0.5 | 0 | 0.0
Total | 94* | | 49* |
*multiple codes per article allowed
Finally, Table 23 shows a cross-tabulation of the high-level bias categories with the IS research domains. The table reinforces the previous analyses and provides more detail of the classes of cognitive biases used by researchers in various domains. The use of framing biases in e-commerce research stands out, as do confirmation biases in security and privacy research and in IS use and adoption research, and overconfidence biases in finance and investment research. The area of IS most concerned with decision making, DSS, was studied only using overconfidence effects, and by only one article in a major way.
Table 23. Cognitive Bias Category – IS Research Domain Matrix*.

                                  Overconf.  Avail.   Represent.  Confirm.  Framing   Escalation  General  Other   Total
                                  Biases     Biases   Biases      Biases    Biases    Biases      Mention
e-commerce                        0          2 (2)    2 (0)       5 (4)     9 (5)     1 (1)       2 (1)    2 (0)   23 (13)
Mobile                            0          0        0           1 (1)     0         0           0        1 (0)   2 (1)
Social & Crowd Computing          4 (0)      2 (1)    0           6 (1)     2 (1)     0           3 (0)    1 (0)   18 (3)
Virtual                           1 (1)      1 (0)    0           3 (1)     0         0           1 (0)    0       6 (2)
IS Theory & Research              0          0        0           4 (2)     2 (0)     0           2 (1)    0       8 (3)
IS Use/Adoption                   1 (1)      1 (0)    1 (0)       6 (3)     2 (1)     1 (1)       2 (1)    0       14 (7)
IS Analysis & Design &
  Development                     1 (1)      0        0           4 (0)     0         1 (1)       3 (1)    0       9 (3)
Business Function & Industry      2 (1)      2 (0)    1 (0)       1 (0)     2 (0)     1 (0)       0        0       9 (1)
Finance/Investment                6 (3)      0        0           2 (1)     0         0           2 (0)    0       10 (4)
Security/Privacy                  3 (2)      1 (0)    1 (1)       7 (3)     4 (2)     0           0        3 (1)   19 (9)
IT Management & Governance        0          0        0           3 (1)     2 (1)     3 (2)       1 (0)    0       9 (4)
DSS                               2 (1)      0        0           0         0         0           1 (0)    0       3 (1)
Operational IT                    0          0        0           0         1 (0)     1 (0)       0        0       2 (0)
Total                             20 (10)    9 (3)    5 (1)       42 (17)   24 (10)   8 (5)       17 (4)   7 (1)   132 (51)

*Frequencies from the major-use sample are in parentheses.
6. Discussion

This section is structured around the questions articulated in the Introduction:

1. How has behavioral economics been used in IS research?
2. Why don't we use behavioral economics in IS research projects?
3. How could we use behavioral economics in IS research?

The discussion begins with issues relating to the literature analysis findings and then moves to more speculative discussion. In this sense, addressing Question 1 involves descriptive analysis with critical aspects, Question 2 critically examines issues with IS research, and Question 3 involves theorizing about the future use of BE in IS research. Answering Question 3 provides guidance on which aspects of BE are relevant to particular areas of IS.
6.1 How has behavioral economics been used in IS research?

The results of the literature analysis in Section 5 show a complex relationship between the BE and IS disciplines. They show that BE seems to be important to IS research, but when only the major use of BE is considered, the use of BE in IS is relatively small for such a major behavioral theory: only 3.5% of IS research. The analysis shows that when using BE, IS researchers do have a broad awareness of contemporary BE and, unlike the findings of earlier studies, they predominantly use or cite contemporary rather than early BE in their work. Further, comparing the articles that use BE in a major way with those that cite BE in some way shows that although researchers are aware of BE, only a relatively small number of IS studies use BE as a main or important reference theory.

In the major BE use sample, e-commerce was the most researched area (23.4% of the sample). Adding our findings to those of Fleischmann et al. (2014) and Odnor and Oinas-Kukkonen (2017) shows that e-commerce has been the most prominent BE-using IS area since 1992. Finance/investment was the next most popular, but DSS was the least researched area. This is a very surprising result as, by definition, DSS is the area that is most concerned with decision making, the very subject of BE. DSS occupies a much larger fraction of all IS research than the 4.3% of the major use sample (Arnott & Pervan, 2014). The reasons for the low take-up of BE in DSS research are not clear, but research methods, as discussed below, could be one factor. A detailed reading of the articles in the major use sample showed that although e-commerce had the highest frequency of use, the security and privacy articles were the most intensive users of BE research; that is, they cited the most BE articles and often used more than one aspect of BE in a major way in their projects.

One reason why e-commerce is particularly attractive for BE reference theory use could be e-commerce's general focus on IS use by an individual. The literature analysis shows an overwhelming focus (72.5%) on an individual unit of analysis in BE-major use articles. As e-commerce research in the sample only involved operational tasks performed by a potential customer, the individual nature of general BE research, especially cognitive bias research, is a logical fit (72.7% of e-commerce articles). The next most popular unit of analysis, a process (12.5% of the major use sample), involves projects that focus on a decision-making process, either the phase model or dual process theory. This is the highest level of abstraction in decision-making research.
In terms of the tasks involved in major BE-using IS articles, operational tasks dominated the sample (74.5%). This is understandable and could represent a reasonable distribution of tasks in organizations, although the strategic task share of around 10% is probably higher than the organizational average. Around half of the tasks researched using a major BE foundation involved a strong interaction between System 1 and System 2 decision making. This implies an overall cognitive process that tends towards equal use of the two systems. The low level of System 2 dominant tasks in both samples was a surprise, as it was expected that IS's main focus would be on automating System 2 style tasks, the operational tasks identified in Table 17. This represents a different story to IS research in general, and even the DSS sub-field. This result could also be a further consequence of the dominance of e-commerce in the BE-using samples.

One theme that emerges from the analysis in Section 5 is the strongly positivist nature of BE-using IS research. The discipline of BE is itself almost entirely positivist in nature, and it could be that BE reference theory is especially appealing to positivist IS researchers. Further, BE research is strongly experimental, and the analysis in Table 14 shows that experiments and field studies are used in the majority of IS studies that use BE in a major way. It is clear that IS researchers who are skilled in experimental research are attracted to reference theories that are dominated by experimental methods. One aspect of this positivist orientation is that important BE articles can be difficult to follow without considerable expertise in experimentation; they are often difficult reading. This may contribute to the low use of case studies and design science by BE-using IS projects. The literature analysis shows that the distribution of methods in BE-using IS research more closely mirrors the methods of the reference discipline than the methods used in IS in general. It appears that research method, rather than research area, is a determinant of BE use in IS, and this could explain why use of BE in DSS research is lower than expected; DSS is dominated by design science methods, and experiments are declining in popularity in the area where BE should have the greatest impact (Arnott & Pervan, 2014).
The use of BE as a major reference theory in the literature analysis has an ad hoc feel. In general, researchers seem to adopt a specific aspect of BE that they believe is relevant to their research and only cite articles related to that focal aspect. In the article sample, there was little or no consideration of why a particular aspect of BE was the most relevant for the research, and there was little explicit justification of these choices. There was also little critique of the aspects of BE that were used. This suggests that some IS researchers may not be aware of the controversies and limitations of the BE they are using. On the other hand, some articles in the major use sample show both a wide and deep understanding of BE (for example, Dinev, McConnel, & Smith, 2015; Wang, Li, & Rao, 2016; Adjerid, Peer, & Acquisti, 2018).
Turning to the aspects of BE that have been used in IS research, as mentioned above, the articles with a major use of BE overwhelmingly use contemporary BE. Early theories like problem decomposition and task structuredness, which have long been important to systems analysis practice, were not used at all. The most used BE aspects in the major use sample were cognitive bias (61.7% of articles), heuristics (42.6%), and dual process theory (42.6%). Bounded rationality, the original foundation of all BE theory, was mentioned in 38.3% of major use articles, but the use tended to provide context for the research rather than acting as foundation theory in its own right. Dual process theory is the foundation of contemporary BE (Kahneman, 2011, Part 1) and should be considered key to any IS research that involves decision making.
The Results section showed that although heuristics are superficially important for IS research, a closer examination of the articles revealed that only seven IS articles of the 1,348 published in the journal sample actually used heuristics as a major reference theory. Further, the analysis showed that only the affect heuristic has been used since 2014. This is in line with a greater acknowledgement of the role of emotions in human behavior, a trend that is evident in IS scholarship. In a sense, this trend is a correction away from an overemphasis on cognitive factors in social research.
The derived BE theories, concepts, and methods of prospect theory (used in 31.9% of the major use sample), nudges (12.8%), and debiasing (12.8%) show increasing popularity over time in IS research. Further, when they are mentioned in an article, they tend to form a major part of the foundation and execution of the research. This major use should continue to increase over time.
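The core claims of prospect theory (reference dependence, diminishing sensitivity, and loss aversion) can be made concrete with its value function. The following is a minimal sketch, not drawn from any article in the sample, using the parameter values Tversky and Kahneman estimated in their cumulative prospect theory work (alpha = beta = 0.88, lambda = 2.25); the function name is illustrative:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect theory value function: outcomes are valued relative to a
    reference point, concavely for gains and convexly (and more steeply,
    reflecting loss aversion) for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# Loss aversion: a loss of 100 is felt more strongly than a gain of 100.
gain = prospect_value(100)     # roughly 57.5
loss = prospect_value(-100)    # roughly -129.5
print(abs(loss) / gain)        # the loss looms about 2.25 times larger
```

In an IS study, a function of this kind might be used, for instance, to model why users respond more strongly to a loss-framed privacy warning than to an equivalent gain-framed one.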
Cognitive biases form the most granular level of abstraction in BE and, as mentioned a number of times above, they form the bulk of BE foundation theory in IS research. This level of abstraction fits the positivist paradigm, where a research task is reduced in scope, complexity, and level of abstraction in order to be amenable to experimentation. In terms of bias categories, the literature analysis found that the confirmation, framing, and overconfidence categories were the most popular in IS research. The analysis showed that no bounded awareness biases were mentioned in any way in the article sample. This surprising result will be addressed in more detail in Section 6.3. The Results section also showed that at the individual cognitive bias level, half of the biases from the BE literature were not used in the articles in the samples. This builds on the previous observation that BE use in IS has an ad hoc feel. When using biases, IS researchers seem to focus on a small set of biases and typically focus each study on only one bias. One reason for this focus could be that some biases are much more difficult to research than others, especially in IS field studies. While biases can be effectively studied in a psychology lab, studying IS phenomena in the field can require significantly more effort and time in data collection and analysis. For example, the conjunctive and disjunctive events bias (Teigen et al., 1996), where outcome likelihoods are often overestimated in compound conjunctive problems and underestimated in compound disjunctive problems, could affect IS management of large-scale enterprise systems projects. Studying the influence of conjunctive and disjunctive effects on large-scale IS project management could significantly improve practice.
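The arithmetic behind the conjunctive and disjunctive events bias is easy to demonstrate. In the sketch below the probabilities and task counts are illustrative, not taken from the literature analysis: people tend to anchor on the high per-task probability when judging the conjunction and on the low per-risk probability when judging the disjunction.

```python
# Conjunctive problem: a project finishes on time only if ALL subtasks do.
p_task = 0.9           # illustrative per-task success probability
n_tasks = 10
p_all = p_task ** n_tasks              # about 0.35, far below the 0.9 anchor

# Disjunctive problem: a project is disrupted if ANY ONE rare risk fires.
p_risk = 0.05          # illustrative per-risk probability
n_risks = 12
p_any = 1 - (1 - p_risk) ** n_risks    # about 0.46, far above the 0.05 anchor

print(round(p_all, 2), round(p_any, 2))  # 0.35 0.46
```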
Following on from the bias category analysis, the most researched individual biases were framing (19.1% of major use articles), anchoring (10.6%), confirmation (10.6%), and the endowment effect (8.5%). These four biases are among the easiest to understand and have been written about in popular non-academic magazines and blogs. They are quite amenable to experimental IS studies and are an obvious entry into the use of BE. While the 38 cognitive biases are in no way equally important in their negative effects and relevance to IS research, many of the unused biases are of equal or greater relevance and importance than those that were used in the samples. These important biases include completeness, hindsight, pseudocertainty, and the bounded awareness biases. This situation is presented in Table 24 below.
39 6.2 Why don’t we use behavioral economics in IS research projects?


40
41
an

As discussed in Section 2, BE is a complex set of theories, phenomena, methods, and


42
43
44
processes with overlapping concepts and effects and suffers from significant confusion
45
us

46 that is caused by aspects of BE being presented in the literature under different names
47
48 or titles. The complexity, even messiness, of BE could be a major impediment to BE
cr

49
50 use. It may also be a reason why most IS articles that use BE concentrate on only one
51
ip

52 aspect of BE; it can take a long time to learn about each particular aspect of BE, so to
53
54 master a number of BE aspects may involve time that some researchers are not willing,
t

55
56 or able, to commit to.
57
58
59
60

47
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Journal of Information Technology Page 48 of 75

1
2
A more likely reason for the slow take-up of BE reference theory in IS could be the conservative nature of academic activity. This conservatism can lead to researchers studying areas and using theories long after their relevance and utility have waned. Researchers devote significant time and effort to learning theories, designing and executing a research project, and publishing in a scientific journal. They can become resistant to changing the foundations that they have mastered, even when there is little or no evidence for the efficacy of the theory. Kahneman termed this situation theory-induced blindness: "… once you have accepted a theory and use it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing" (Kahneman, 2011, p. 277). IS researchers have detected this theory-induced blindness. For example, Taylor et al. (2010, p. 664) found that their analysis of the overall IS discipline "…suggests that researchers are increasingly limiting their attention to prior work on their topics within their own paradigm, ignoring contributions within other paradigms." Street and Ward (2019), in a critique of IS journal reviews, identified an ideological bias where "well-known theories and orthodoxies are favored over less well-known ones" (p. 55). It can be difficult for new theories to displace reference theory that has been used for many years.
The use of rational economic models as descriptive theories in IS research is an example of theory-induced blindness. Prescriptive models should only be used as descriptive theory when there is significant empirical support. Rational decision models from neoclassical economics do not have such support. The same validity criteria also apply to descriptive theories of decision making. The phase model of decision making from early BE is a product of its time and is a theory that has not stood up to empirical validation. Even though it lacks descriptive validity, it continues to be popular in many IS studies. The use of this model should be augmented with the models, theories, and processes of contemporary BE.
The Discussion now turns to the most speculative of the three research questions.
6.3 How could we use behavioral economics in IS research projects?

Despite justifiable concerns about the use of theory in IS research (Avison & Malaurent, 2014; Hirschheim, 2019), BE provides a well-researched and validated theory foundation for much of IS, especially in the areas concerned with the management and governance of IT, IT development and use, and directly supporting decision making in organizations. IS researchers should use the most appropriate reference theory available, and BE currently provides the best descriptive foundation theories of decision making, as well as some prescriptive methods for interventions in organizations. The literature analysis in this paper shows that BE can be a valuable and effective reference theory for many areas of IS research.
One strategy for overcoming the constraints on using "new" theory in IS is to increase awareness of the BE theory base among IS researchers. Section 2 of this paper, and in particular Tables 1 through 5, and Tables 23 and 24 below, can be used to locate a concept or effect that could be relevant to a particular IS research project. The classic, well-cited BE publications in the tables should provide a start to understanding how each concept could be used in IS research, as can the identification of IS areas that can use each BE aspect. In this way, the tables can be used to guide further reading and to assess whether a particular aspect of BE is relevant to a particular research project.
There are two major ways of considering how BE could be used in future IS research: first, by identifying areas of BE that have not been used in IS research, or not used to their potential; and second, by considering those areas of IS that could benefit from a greater use of BE. Surprisingly, dual process theory is a BE area that could feature more in IS research. The literature analysis found that dual process theory was mostly used in the samples to provide context for the research. The lack of use of dual process theory as a major foundation theory could be a consequence of the positivist orientation of the samples, where research designs favor the lowest levels of theoretical abstraction. Yet an understanding of the roles, strengths, and interactions of the two cognitive systems could potentially change the nature and content of IS development methods and IT management processes.
The most obvious way of expanding BE use in IS research is to include the set of 19 biases that were not mentioned in the samples. For example, the literature analysis found that none of the bounded awareness biases featured in the article samples. Bounded awareness biases involve systematic and persistent failures to notice critical information. As such, they affect all kinds of IS usage and are an obvious target for future IS research. One of the bounded awareness biases is change blindness (Mitroff et al., 2002; Tenbrunsel & Messick, 2004). Under this bias, decision makers may not notice obvious change, especially if it occurs over time in small, frequent episodes. This bias is especially important for business intelligence reporting and data analytics. Developers of these systems need to be aware of the potential for reduced decision-making effectiveness through the frequent presentation of information to users. Another bias that was absent in the Basket of Eight sample is the completeness bias, which is sometimes termed overprecision. The action of the completeness bias is to create a perception of an apparently complete or logical data presentation, which can stop the search for omissions or errors (McKenzie et al., 2008). This bias can also adversely affect reporting and analytics systems.
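As an illustration of how developers might counter change blindness, a reporting system could explicitly flag slow cumulative drift that period-over-period views tend to hide. The sketch below is a hypothetical debiasing aid, not a method from the reviewed articles; the function name, metric, and threshold are all illustrative:

```python
def flag_cumulative_drift(values, threshold=0.10):
    """Flag a metric whose total drift from the first reported value exceeds
    `threshold` while no single period-over-period step does; this is the
    slow, small-step change that change blindness hides from report users."""
    drift = abs(values[-1] - values[0]) / values[0]
    steps = [abs(b - a) / a for a, b in zip(values, values[1:])]
    return drift > threshold and all(s <= threshold for s in steps)

# Monthly churn creeps up roughly 2% per month: no single step looks
# alarming, but the cumulative change is 13%.
churn = [100, 102, 104, 106, 108, 111, 113]
print(flag_cumulative_drift(churn))  # True
```

A large one-off jump would not be flagged by this check, deliberately: a dashboard reader sees abrupt changes without help; it is the gradual ones that need surfacing.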
Another bias that was not used in the article samples was the hindsight bias (Hertwig et al., 2003), which affects our ability to interpret and learn from past decisions. This bias is one of the strongest in its effect and is particularly relevant to project management. IT projects have a persistent reputation for exceeding timelines and budgets. The hindsight bias can lead IT managers to overestimate their ability to judge the time, personnel, and other resources required for a project. As a result, failures in judgement about a completed project do not always lead to improved decision making in the next one, because of the action of the hindsight bias. Escalation biases have been successfully used by IT management researchers, and other areas of IS could benefit from studying these effects.
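One debiasing technique from the BE literature that targets such overoptimistic project judgements is reference-class forecasting, which replaces the "inside view" of a single project with the distribution of outcomes from comparable past projects. The sketch below is a minimal illustration under assumed inputs (historical ratios of actual to estimated duration); the function name and data are illustrative:

```python
def outside_view_estimate(inside_estimate, past_ratios, percentile=0.8):
    """Reference-class forecasting: adjust an 'inside view' estimate using
    the distribution of actual/estimated ratios observed in comparable past
    projects, budgeting to a chosen percentile of that distribution."""
    ratios = sorted(past_ratios)
    idx = min(int(percentile * len(ratios)), len(ratios) - 1)
    return inside_estimate * ratios[idx]

# Comparable past projects ran 1.0x to 2.0x their original estimates;
# budget a 12-month estimate to the 80th percentile of that history.
history = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.7, 1.8, 1.9, 2.0]
print(round(outside_view_estimate(12, history), 1))  # 22.8
```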
Turning to other areas of IS: in the general DSS discipline, and in particular executive information systems and business intelligence, research has had great difficulty in studying top management decision making (Adam & Murphy, 1995; Arnott, 2010). In practice, IT departments, consultants, and vendors have had a similar lack of success in supporting the most senior managers in organizations. A key to these difficulties is an inadequate understanding of senior executive information behaviors. We have known for over 40 years that senior executives face radically different decision tasks and environments than other managers and professionals (Mintzberg, 1973; Kotter, 1982; Carpenter, Gelatkanycz & Saunders, 2004; Mintzberg, 2009). Yet organizational IT departments, consultants, and vendors mostly use philosophies, methods, and technologies related to large-scale operational enterprise systems to support these most senior users. BE theories could be used to provide a better understanding of how these senior executives make decisions and how to support them. Dual process theory is particularly important for this understanding, as System 1 decision making is common for top management. There are many possible consequent biases that impact this decision making, and debiasing them could be a rich area for IS research.
One of the largest areas of IS research over the last 25 years has been how people adopt and use various types of IS (Taylor et al., 2010). IS as a discipline has relied on theories like the technology acceptance model (TAM), the unified theory of acceptance and use of technology (UTAUT), and the task-technology fit model (TTF) to investigate how users can effectively adopt and use IT systems. Venkatesh, Davis, and Morris (2010, p. 279), in considering the history and status of these theories in IS adoption research, argued that "an excessive focus on replication and minor 'tweaking' of existing models can hinder progress both in the area of technology adoption and in information systems in general." This focus can add to the conservatism of IS research described in the previous sub-section. BE research has much to offer IS in this area. Dual process theory can provide a context for decisions on when or whether to use a certain system, as can prospect theory. Many of the cognitive biases, especially the overconfidence, confirmation, and bounded awareness biases, may help explain use and adoption behaviors in more nuanced and fundamental ways than the current IS theory set. For example, in the major BE-use sample, Hoefnagel et al. (2014) expanded the understanding of UTAUT studies by using the anchoring and adjustment effect from BE; Chen and Konfaris (2015) used the overconfidence bias to investigate why normally beneficial DSS features can lead to negative outcomes; and Schmueli, Pliskin, and Fink (2016a) used BE to examine an area of software development that has resisted explanation by previous studies.
The literature analysis's finding of the dominance of positivist experimental and quasi-experimental methods in the article samples hints at further possibilities for BE-founded IS research. Other methods, especially case studies, action research, and design science, could use BE to investigate virtually all aspects of IS. Case studies can illuminate the nature of IS/IT phenomena in ways that are not possible with experiments and surveys. The use of descriptive BE theory in IS case studies, particularly exploratory case studies that aim for a deep understanding of a development or management process or method, could lead to a significant increase in our knowledge of IS phenomena. Design science is an increasingly important method for IS research. A difficulty with some design science research is the lack of an organizing construct or method to guide an intervention. Nudges are one aspect of BE that could provide this overall direction; debiasing is another. Both these aspects of BE could also help with the convincing evaluation of design science artifacts and action research interventions.
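To make the nudge idea concrete in a design science context, consider a default-choice nudge: preselecting one option steers users who never visit a setting, without restricting anyone's choice. The sketch below is hypothetical; the setting names and data structure are illustrative, not drawn from any reviewed artifact:

```python
# Two possible default configurations for the same hypothetical setting.
DEFAULTS_OPT_OUT = {"share_usage_data": True}    # user must opt out
DEFAULTS_PRIVATE = {"share_usage_data": False}   # user must opt in

def effective_setting(user_choices, defaults, key):
    """Return the user's explicit choice if one was made, else the default.
    The designer's choice of default is the nudge: it decides the outcome
    for every user who never touches the setting."""
    return user_choices.get(key, defaults[key])

# A user who never opens the settings page inherits the chosen default.
print(effective_setting({}, DEFAULTS_PRIVATE, "share_usage_data"))  # False
print(effective_setting({}, DEFAULTS_OPT_OUT, "share_usage_data"))  # True
```

The same structure could organize a design science evaluation: compare observed behavior under the two default configurations while leaving the available choices identical.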
It is important to note that IS is not always an importer of reference theory. IS research that uses BE as foundation theory can in turn contribute to BE knowledge. In this way, the use of BE in IS research need not be just a one-way transfer of knowledge. For example, in the major BE-use sample, Zheng et al. (2018) added hedonic principles to prospect theory to explain behavior in crowdsourcing contests. This is a significant contribution to IS theory, but also a contribution to BE. Another IS contribution to BE knowledge could come in the debiasing area. As discussed in Section 2, debiasing has had relatively little attention from social and cognitive psychology and BE researchers. These researchers have focused on identifying and understanding the causes and nature of the biases rather than on developing processes to mitigate the biases' negative effects. IS has a lot to offer to debiasing efforts, as its systems analysis and design methods can be used or modified to debias decision making. Examples of this IS contribution include George, Duffy, and Ahuja (2000), Arnott (2006), Bhandari et al. (2008), and Cheng and Wu (2010).
To conclude this section, Table 24 summarizes major aspects of the Discussion. The table shows the actual use of BE research as analyzed from the major use sample. It presents this use in two dimensions: BE aspect and IS area. The table does not show the availability, representativeness, effort, and recognition heuristics; these aspects are summarized under their respective biases. In addition to this paper's empirical findings, the last column of Table 24 presents those IS areas that are judged to have potential for a greater use of BE. Where appropriate, these potential areas for BE use are identified in greater detail than in the previous coding.
Table 24. Actual and Potential Use of BE in IS Research.

BE Aspect | IS Areas in the Major Use Sample | Other IS Areas with Potential for Greater BE Use

General aspects
Dual process theory | e-commerce, IS Analysis & Design & Development, IS Use/Adoption, IS Theory & Research, IT Management & Governance, Knowledge Management, Mobile, Security/Privacy, Social & Crowd Computing, Virtual | All IS areas
Prospect theory | e-commerce, IT Management & Governance, Security/Privacy, Social & Crowd Computing | All IS areas
Nudges | e-commerce, IS Use/Adoption, Social & Crowd Computing | DSS, IT Management & Governance
Debiasing | IS Analysis & Design & Development, IS Theory & Research, Security/Privacy, Virtual | DSS
Affect heuristic | e-commerce, IS Use/Adoption, IS Theory & Research, Knowledge Management, Mobile, Security/Privacy, Social & Crowd Computing, Virtual | All IS areas

Overconfidence Biases
Completeness bias | None | DSS (reporting, analytics), e-commerce (recommender systems)
Complexity bias | Virtual | DSS, IS Use/Adoption
Desire bias | None | e-commerce (marketing), IT Management & Governance (project management)
Illusion of control | DSS, IS Use/Adoption | IT Management & Governance (project management)
Overconfidence | DSS, Security/Privacy | All IS areas
Overplacement | DSS | No other areas
Planning fallacy | IS Analysis & Design & Development, Security/Privacy | IT Management & Governance
Redundancy bias | None | DSS
Self enhancement | Security/Privacy | IT Management & Governance
Test bias | None | e-commerce, IS Analysis & Design & Development, IT Management & Governance, Security/Privacy

Availability Biases
Ease of recall | e-commerce, Social & Crowd Computing | IS Analysis & Design & Development
Retrievability bias | e-commerce | IS Analysis & Design & Development
Testimony bias | None | IS Analysis & Design & Development

Representativeness Biases
Base rate insensitivity | None | DSS (analytics)
Chance misconception | None | DSS (analytics)
Illusory correlation | None | DSS (analytics)
Regression to mean | None | DSS (analytics)
Sample size insensitivity | None | DSS (analytics)
Similarity bias | Security/Privacy | All IS areas
Subset bias | None | DSS (analytics)

Confirmation Biases
Anchoring effect | e-commerce, IS Use/Adoption, Security/Privacy, Social & Crowd Computing | All IS areas
Confirmation trap | e-commerce, IS Theory & Research, IT Management & Governance, Security/Privacy, Virtual | All IS areas
Conjunctive and disjunctive events bias | None | IT Management & Governance (project management)
Endowment effect | e-commerce, IS Theory & Research, Security/Privacy | IS Analysis & Design & Development, IT Management & Governance
Hindsight bias | None | IT Management & Governance (project management)
Status quo bias | IS Use/Adoption, Mobile, Social & Crowd Computing | DSS, e-commerce, Security/Privacy

Bounded Awareness Biases
Change blindness | None | DSS (BI, analytics)
Focalism | None | DSS (BI, analytics)
Inattentional blindness | None | DSS (BI, analytics)

Framing Biases
Framing | e-commerce, IS Use/Adoption, IT Management & Governance, Security/Privacy, Social & Crowd Computing | All IS areas
Linear bias | None | DSS (BI, analytics)
Presentation bias | e-commerce | DSS (BI, analytics), IS Use/Adoption
Pseudocertainty | None | DSS (BI, analytics)

Escalation Biases
Unilateral escalation | e-commerce, IS Analysis & Design & Development, IS Use/Adoption, IT Management & Governance | No other areas
Competitive escalation | e-commerce, IS Analysis & Design & Development, IT Management & Governance | No other areas

Other Biases
Default trust bias | Security/Privacy | e-commerce (recommender systems), IS Use/Adoption
Herding effect | Security/Privacy | Social & Crowd Computing, Virtual
Diversification bias | None | DSS, e-commerce
7. Concluding Comments

Much of IS research concerns human behavior, especially human decision making. The current orthodoxy for understanding this behavior is behavioral economics. IS researchers should use the highest quality and the most relevant reference theory available for their work, so it follows that IS research that addresses human behavior in some way should consider using BE as part of its theory foundation. This does not mean that BE should feature in all these projects, only that it should be seriously considered.
This paper adds to previous work by Sage (1981), Browne and Parsons (2012), Goes (2013), Fleischmann et al. (2014), and Arnott and Gao (2019) that urged IS scholars to make significant use of BE foundation theory. The paper has provided an overview of the BE discipline, critically analyzed how BE is used in IS research, discussed impediments to the adoption of BE as reference theory, and posited possible directions for BE use in IS research. Section 2 of this paper, and in particular Tables 1 through 5 and Table 24, can be used to locate a concept or effect that could be relevant to a particular IS research project; these tables also provide links to further details of these aspects of BE. This guide to the detailed use of BE in IS research is a major contribution of the paper.
Before summarizing and concluding the paper, it is appropriate to consider the limitations of the empirical work and to make an observation about the review method. The first limitation relates to sampling; no sample is a perfect representation of its population. This research used a small number of high-quality journals to represent general IS publication. However, important uses of BE in IS research could have been published outside the Basket of 8. Further, some IS articles use BE ideas but do not cite any BE articles; these would have escaped our sampling procedure. As was discussed in Section 4.2, arriving at a sample of BE-using articles in a reliable way was very time-consuming, and attempting to rigorously sample BE use in all IS-related journals may be impossible. For the reasons given in Section 4.2, we believe our sample is a reasonable representation of all IS publication. The second limitation relates to the subjective coding of some items in the protocol used for the literature analysis. The most important of these items is the degree of use of BE in an article; this item was used to determine the second BE-using sample – those articles with a major use of BE as reference theory – so the quality of that sample depends on the judgment exercised in coding it. Some other items in the protocol involved free-text coding that was later analyzed and classified. The greatest care was taken with the coding of these items, and we believe that other researchers could generate similar data using the project's protocol and codebook.
In addition to the limitations of this research, it is appropriate to comment on the efficacy of the bibliometric content analysis approach that was the empirical method used in this research. In Table 10 the most cited BE theory, by some margin, was prospect theory. As a result, a citation analysis alone would have reported that prospect theory was the most used aspect of BE in IS research. However, when the actual content of the articles was examined (Table 19), prospect theory, although highly cited, was meaningfully used by only 20.4% of BE-using IS articles (31.9% of major-use articles). From the content analysis it emerged that cognitive biases are in fact the most significant aspect of BE for IS research (61.7% of major-use articles). Citation of a theory does not always mean the cited theory is integral to the research, and considering the actual use of a citation provides a richer and more accurate understanding. Nonetheless, the analysis of citations as an adjunct and guide to the content analysis proved useful in this research.
In conclusion, the empirical part of this paper found that many IS researchers do have a broad understanding of contemporary behavioral economics theory, but the fraction of IS articles using the theory is relatively small for such a major behavioral theory. The use of BE as major reference theory in the analyzed literature has an ad hoc feel: researchers tend to adopt a specific aspect of BE that they believe is relevant to their research, often with little explicit justification and little critique. E-commerce was the IS area most researched with BE reference theory, while, surprisingly, DSS was the least researched. Operational tasks dominated BE-using IS research, which is strongly positivist in nature, and experiments and field studies were used in the majority of studies. Cognitive bias is the BE aspect most used in IS research. The confirmation, framing, and overconfidence bias categories were the most popular, and no bounded awareness biases were used in the analyzed research. The derived BE theories, concepts, and methods of prospect theory, nudges, and debiasing show an increasing popularity over time.
In the more speculative part of the paper it was argued that dual process theory is a BE area that should feature more fundamentally in IS research, and that an understanding of the roles, strengths, and interactions of the two cognitive systems could change the nature and content of IS development methods, IT management processes, and support for senior management decision making. The most obvious aspect of BE that can be expanded in IS research is the set of 19 biases that were not mentioned in the samples. Many of the cognitive biases, especially the overconfidence and bounded awareness biases, may help explain use and adoption behaviors in more nuanced and fundamental ways than the current IS theory set. Research methods other than experiments and field studies, especially case studies and design science, could effectively use BE to investigate virtually any aspect of IS. This could lead to a more detailed understanding of key IS phenomena. A major IS contribution to BE knowledge could come in the debiasing area. Finally, we believe that BE reference theory has the potential to positively transform significant areas of IS research.
Acknowledgements

The contribution of the authors was equal.

References

References marked * comprise the 47-article major-use-of-BE sample.
11
Au

12 Abdolmohammadi, M. (1987). Decision support and expert systems in auditing: A


13 review and research directions. Accounting and Business Research, 17(66),
14 173-185.
15 Adam F., & Murphy, C. (1995). Information flows amongst executives: Their
th

16 implications for systems development. Journal of Strategic Information


17 Systems, 4(4), 341-355.
18 *Adjerid, I., Peer, E., & Acquisti, A. (2018). Beyond the privacy paradox: Objective
or

19 versus relative risk in privacy decision making. MIS Quarterly, 42(2), 465-488.
20 *Adomavicius, G., Bockstedt, J. C., Curley, S. P., & Zhang, J. (2018). Effects of
21 online recommendations on consumers’ willingness to pay. Information
Fo c

22 Systems Research, 29(1), 84-102.


Ac

23 Agosto, D.E. (2002). Bounded rationality and satisficing in young people's Web-
24 based decision making. Journal of the Association for Information Science and
r e

25 Technology, 53(1), 16-27.


26 Ajzen, I., & Fishbein, M. (1973). Attitudinal and normative variables as predictors of
Pe p

27
specific behavior. Journal of Personality and Social Psychology, 27(1), 41-57.
28
Akerlof, G.A. (1970). The market for lemons: Quality uncertainty and the market
29
mechanism. Quarterly Journal of Economics, 84(3), 488-500.
erte

30
31 Alias, M. (1953). Le comportement de l’homme rationnel devant le risqué: Critique
32 des postulats et axioms de l’ecole americaine. Econometrica, 21(4), 503-546.
33 Allen, G., & Parsons, J. (2010). Is query reuse potentially harmful? Anchoring and
Red

34 adjustment in adapting existing database queries. Information Systems


35 Research, 21(1), 56-77.
36 Alloy, L.B., & Tabachnik, N. (1984). Assessment of covariation by humans and
v

37 animals: Joint influence of prior expectations and current situational


38 information. Psychological Review, 91, 112–149.
M
iew

39 Angehrn, A.A., & Jelassi, T. (1994). DSS research and practice in perspective.
40 Decision Support Systems, 12, 257–275.
41 Anthony, R. N. (1965). Planning and control systems: A framework for analysis.
an

42 Boston: Harvard University Press.


43 *Arazy, O., Kopak, R., & Hadar, I. (2017). Heuristic principles and differential
44 judgments in the assessment of information quality. Journal of the Association
45
us

for Information Systems, 18(5), 403-432.


46
Ariely, D. (2013). The (honest) truth about dishonesty. New York: Harper Perennial.
47
Ariely, D., & Simonson, I. (2003). Buying, bidding, playing, or competing? Value
48
assessment and decision dynamics in online auctions. Journal of Consumer
cr

49
50
Psychology, 13(1-2), 113-123.
51 Arkes, H., Faust, D., Guilmette, T. J., & Hart, K. (1988). Eliminating the hindsight
bias. Journal of Applied Psychology, 73(2), 305–307.
ip

52
53 Arkes, H.R., Hackett, C., & Boehm, L. (1989). The generality of the relation between
54 familiarity and judged validity. Journal of Behavioural Decision Making, 2, 81-
t

55 94.
56 Arnott, D. (2006). Cognitive biases and decision support systems development: A
57 design science approach. Information Systems Journal, 16(1), 55-78.
58 Arnott, D. (2010). Senior executive information behaviors and decision support.
59 Journal of Decision Systems, 19 (4), 465-480.
60

59
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Journal of Information Technology Page 60 of 75

1
2
3 Arnott, D., & Gao, S. (2019). Behavioral economics for decision support systems.
4 Decision Support Systems, 122, 113063.
5 Arnott, D., Lizama, F., & Song, Y. (2017). Patterns of business intelligence systems
6
use in organizations. Decision Support Systems, 97, 58-68.
7
Arnott, D., & O’Donnell, P. (2008). A note on an experimental study of DSS and
8
9
forecasting exponential growth. Decision Support Systems, 45(1), 180-186.
10 Arnott, D., & Pervan, G. (2014). A critical analysis of decision support systems
11 research revisited: The rise of design science. Journal of Information
Au

12 Technology, 29, 269-293.


13 Arrow, K.J. (1950). A difficulty in the concept of social welfare. Journal of Political
14 Economy, 58(4), 328-346.
15 Avison, D., & Malaurent, J. (2014). Is theory king?: Questioning the theory fetish in
th

16 information systems. Information Systems Journal, 29(4), 358-361.


17 Ayton, P., Hunt, A.J., & Wright, G. (1989). Psychological conceptions of randomness.
18 Journal of Behavioural Decision Making, 2 (4), 221-238.
or

19 Barfar, A., Padmanabhan, B., & Hevner, A. (2017). Applying behavioral economics in
20 predictive analytics for B2B churn. Decision Support Systems, 101, 115-127.
21 Bar-Hillel, M. (1973). On the subjective probability of compound events.
Fo c

22 Organizational Behavior and Human Performance, 9, 396-406.


Ac

23 *Bartelt, V. L., & Dennis, A. R. (2014). Nature and nurture: The impact of automaticity
24 and the structuration of communication on virtual team behavior and
r e

25
performance. MIS Quarterly, 38(2), 521-538.
26
Bazerman, M., & Moore, D. (2009). Judgment in managerial decision making (7th
Pe p

27
28
ed.). Hoboken, NJ: John Wiley & Sons.
29 Bazerman, M., & Moore, D. (2013). Judgment in managerial decision making (8th
ed.). Hoboken, NJ: John Wiley & Sons.
erte

30
31 Bhandari, G., Hassanein, K., & Deaves, R. (2008). Debiasing investors with decision
32 support systems: An experimental investigation. Decision Support Systems, 46,
33 399-410.
Red

34 Bhattacherjee, A., & Sanford, C. (2006). Influence processes for information


35 technology acceptance: An elaboration likelihood model. MIS Quarterly, 30(4),
36 805-825.
v

37 *Bockstedt, J.C., & Goh, K.H. (2014). Customized bundling and consumption variety
38 of digital information goods. Journal of Management Information Systems, 31(2),
M
iew

39 105-132.
40 Brenner, L.A., Koehler, D.J., Liberman, V., & Tversky, A. (1996) Overconfidence in
41
an

probability and frequency judgments: A critical examination. Organizational


42 Behavior & Human Decision Processes, 65(3), 212-219.
43 Briggs, L.K., & Krantz, D.H. (1992). Judging the strength of designated evidence.
44 Journal of Behavioral Decision Making, 5, 77-106.
45
us

Browne, G.J., & Parsons, J. (2012). More enduring questions in cognitive IS


46
research. Journal of the Association for Information Systems, 13(12), 1000-
47
48
1011.
Buehler, R., Griffin, D., & Ross, M. (2002). Inside the planning fallacy: The causes
cr

49
50 and consequences of optimistic time predictions. In T. Gilovich, D. Griffin, & D.
51 Kahneman (Eds.), Heuristics and Biases: The psychology of intuitive judgment
(pp. 250-270). New York: Cambridge University Press.
ip

52
53 Camerer, C. F., & Loewenstein, G. (2003). Behavioral economics: Past, present,
54 future. In C.F. Camerer, G. Loewenstein, & M. Rabin (Eds.), Advances in
t

55 behavioral economics (pp. 3-51). Princeton, NJ: Princeton University Press.


56 Carpenter M.A., Gelatkanycz M.A., & Saunders W. G. (2004). Upper echelons
57 research revisited: Antecedents, elements, and consequences of top
58 management team composition. Journal of Management, 30, 749-778.
59 Chaiken, S. (1980). Heuristic versus systematic information processing and the use
60 of source versus message cues in persuasion. Journal of Personality and Social

60
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Page 61 of 75 Journal of Information Technology

1
2
3 Psychology, 39(5), 752-766.
4 Chaiken, S., & Maheswaran, D. (1994). Heuristic processing can bias systematic
5 processing: Effects of source credibility, argument ambiguity, and task
6
importance on attitude judgment. Journal of Personality and Social Psychology,
7
66(3), 460-473.
8
9
Chapman, G.B., & Johnson E.J. (1994). The limits of anchoring. Journal of
10 Behavioral Decision Making, 7, 223-242.
11 Chapman, K., & Pike, L.E. (2013). Literature on behavioral economics, Part 1:
Au

12 Introduction and books. Behavioral & Social Sciences Librarian, 32(4), 205-223.
13 *Chen, C-W., & Koufaris, M. (2015). The impact of decision support system features
14 on user overconfidence and risky behavior. European Journal of Information
15 Systems, 24, 607-623.
th

16 Chen, J.Q., & Lee, S.M. (2003). An exploratory cognitive DSS for strategic decision
17 making. Decision Support Systems, 36 (2), 147-160.
18 Chen, W-S., & Hirschheim, R. (2004). A pragmatic and methodological examination of
or

19 information systems research from 1991 to 2001. Information Systems Journal,


20 14, 197-235.
21 Cheng, F., & Wu, C. (2010). Debiasing the framing effect: The effect of warning and
Fo c

22 involvement. Decision Support Systems, 49(3), 328-334.


Ac

23 *Chiu, C-M., Wang, E.T.G., Fang, Y-H., & Huang, H-Y. (2014). Understanding
24 customers' repeat purchase intentions in B2C e-commerce: The roles of
r e

25
utilitarian value, hedonic value and perceived risk. Information Systems Journal,
26
24(1), 85-114.
Pe p

27
28
Chiu, V., Liu, Q., Muehlmann, B., & Baldwin, A.A. (2019). A bibliometric analysis of
29 accounting information systems journals and their emerging technologies
contributions. International Journal of Accounting Information Systems, 32,
erte

30
31 24-43.
32 Choudhury, V., & Sampler, J.L. (1997). Information specificity and environmental
33 scanning: An economic perspective. MIS Quarterly, 21(1), 25-53.
Red

34 Christensen-Szalanski, J.J., & Bushyhead, J.B. (1981). Physicians use of


35 probabilistic judgement in a real clinical setting. Journal of Experimental
36 Psychology: Human Perception and Performance, 7, 928-935.
v

37 Cialdini, R.B. (2009). Influence: Science and practice. Boston: Pearson Education.
38 *Constantiou, I. D., Lehrer, C., & Hess, T. (2014). Changing information retrieval
M
iew

39 behaviours: An empirical investigation of users' cognitive processes in the choice


40 of location-based services. European Journal of Information Systems, 23(5),
41
an

513-528.
42 Corr, P., & Plagnol, A. (2019). Behavioral economics: The basics. Oxford, UK:
43 Routledge.
44 Critcher, C. R., & Gilovich, T. (2008). Incidental environmental anchors. Journal of
45
us

Behavioral Decision Making, 21(3), pp. 241-251.


46
Cyert, R, & March, J.G. (1963). A behavioral theory of the firm. New Jersey:
47
48
Prentice-Hall.
Davidson, E. (2002). Technology frames and faming: A socio-cognitive investigation
cr

49
50 of requirements determination. MIS Quarterly, 26(4), 329-358.
51 Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user
acceptance of information technology. MIS Quarterly, 13(3), 319-339.
ip

52
53 Dewey, J. (1933). How we think. New York: Health. (Originally published 1910).
54 *Dinev, T., McConnell, A. R., & Smith, H. J. (2015). Research commentary–Informing
t

55 privacy research through information systems, psychology, and behavioral


56 economics: Thinking outside the “APCO” box. Information Systems Research,
57 26(4), 639-655.
58 Dudezert, A., & Leidner, D.E. (2011). Illusions of control and social domination
59 strategies in knowledge mapping system use. European Journal of Information
60 Systems, 20(5), 574-588.

61
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Journal of Information Technology Page 62 of 75

1
2
3 Dunning, D., Heath, C., & Suls, J.M. (2004). Flawed self-assessment: Implications for
4 health, education, and the workplace. Psychological Science in the Public
5 Interest, 5(3), 69-106.
6
Dusenbury, R., & Fennma, M.G. (1996). Linguistic-numeric presentation mode
7
effects on risky option preferences. Organizational Behavior and Human
8
9
Decision Processes, 68, 109-122.
10 Edwards, W. (1954). The theory of decision making. Psychological Review, 51(4),
11 380-417.
Au

12 Edwards, W. (1961). Behavioral decision theory. Annual Review of Psychology, 12,


13 473-498.
14 Elam, J.J., Jarvenpaa, S.L., & Schkade, D.A. (1992). Behavioral decision theory and
15 DSS: New opportunities for collaborative research. In E.A. Stohr, & B.R.
th

16 Konsynski (Eds.). Information Systems and Decision Processes (pp. 51–74). Los
17 Alamitos, CA: IEEE Computer Society Press.
18 Ellsberg, D. (1961). Risk, ambiguity, and the Savage axioms. Quarterly Journal of
or

19 Economics, 75(4), 643-669.


20 Epley, N., & Gilovich, T. (2006). The anchoring-and-adjustment heuristic: Why the
21 adjustments are insufficiient. Psychological Science, 17(4), 311-318.
Fo c

22 Evans, J.St.B.T. (1989). Bias in human reasoning: Causes and consequences.


Ac

23 London: Lawrence-Erlbaum.
24 Evans, J.StB.T. (2003). In two minds: Dual-process accounts of reasoning. Trends in
r e

25
Cognitive Sciences, 7(10), 454-459.
26
Evans, J.StB.T. (2008). Dual-processing accounts of reasoning, judgment, and social
Pe p

27
28
cognition. Annual Review of Psychology, 59, 255-278.
29 *Fadel, K. J., Meservy, T. O., & Jensen, M. L. (2015). Exploring knowledge filtering
processes in electronic networks of practice. Journal of Management
erte

30
31 Information Systems, 31(4), 158-181.
32 *Fehrenbacher, D. D. (2017). Affect infusion and detection through faces in
33 computer-mediated knowledge-sharing decisions. Journal of the Association
Red

34 for Information Systems, 18(10), 703-726.


35 *Ferratt, T. W., Prasad, J., & Dunne, E. J. (2018). Fast and flow processes
36 underlying theories of information technology use. Journal of the Association
v

37 for Information Systems, 19(1), 1-22.


38 Fischhoff, B. (1982). Debiasing. In D. Kahneman, P. Slovic, & A. Tversky (Eds.),
M
iew

39 Judgement under Uncertainty: Heuristics and Biases (pp. 422-444). New


40 York: Cambridge University Press.
41
an

Fischhoff, B. (2007). An early history of hindsight research. Social Cognition, 25, 10–
42 13.
43 Fischhoff, B., Slovic, P., & Lichtenstein, S. (1978). Fault trees: Sensitivity of
44 estimated failure probabilities to problem representation. Journal of
45
us

Experimental Psychology: Human Perception and Performance, 4, 330-344.


46
Fleischmann, M., Amirpur, M., Benlian, A., & Hess, T. (2014). Cognitive biases in
47
48
information systems research: A scientometric analysis. In Proceedings of the
European Conference on Information Systems (ECIS 2014). Tel Aviv, Israel,
cr

49
50 June 9-11, 2014.
51 Fox, J. (2015). A short history of modern decision making: From “economic man” to
behavioral economics. Harvard Business Review, May, 79-85.
ip

52
53 *Frisk, J. E., Lindgren, R., & Mathiassen, L. (2014). Design matters for decision
54 makers: Discovering IT investment alternatives. European Journal of
t

55 Information Systems, 23(4), 442-461.


56 Furnham, A., & Boo, H. C. (2011). A literature review of the anchoring effect. The
57 Journal of Socio-Economics, 40(1), 35-42.
58 Galbraith, J.K. (1952). American capitalism: The concept of countervailing power.
59 Boston, MA: Houghton Mifflin.
60

62
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Page 63 of 75 Journal of Information Technology

1
2
3 George, J.F., Duffy, K., & Ahuja, M. (2000). Countering the anchoring and
4 adjustment bias with decision support systems. Decision Support Systems,
5 29(2), 195-206.
6
Gigerenzer, G. (2000). Adaptive thinking: Rationality in the real world. New York:
7
Oxford University Press.
8
9
Gigerenzer, G. (2014). Risk savvy: How to make good decisions. London: Allen Lane.
10 Gigerenzer, G., & Brighton, H. (2009). Homo heuristicus: Why biased minds make
11 better inferences. Topics in Cognitive Science, 1, 107-143.
Au

12 Gigerenzer, G., & Selten, R. (Eds.) (2001). Bounded rationality: The adaptive
13 toolbox. Cambridge, MA: MIT Press.
14 Gigerenzer, G., & Todd, P. M. (1999). Fast and frugal heuristics: The adaptive
15 toolbox. In G. Gigerenzer, P. M. Todd, & The ABC Research Group, Evolution
th

16 and cognition. Simple heuristics that make us smart (pp. 3-34). New York:
17 Oxford University Press.
18 Gilbert, D. (1991). How mental systems believe. American Psychologist, 46(2), 107-
or

19 119.
20 Gilovich, T., Griffin, D., & Kahneman, D. (Eds.) (2002). Heuristics and biases: The
21 psychology of intuitive judgment. New York: Cambridge University Press.
Fo c

22 *Goel, S., Williams, K., & Dincelli, E. (2017). Got phished? Internet security and
Ac

23 human vulnerability. Journal of the Association for Information Systems,


24 18(1), 22-44.
r e

25
Goes, P.B. (2013). Editor’s Comments: Information systems research and behavioral
26
economics. MIS Quarterly, 37(3), iii-viii.
Pe p

27
28
Goldstein, D.G., & Gigerenzer, G. (2002). Models of ecological rationality: The
29 recognition heuristic. Psychological Review, 109(1), 75-90.
Gorry, G. A., & Scott Morton, M. S. (1971). A framework for management information
erte

30
31 systems. Sloan Management Review, 13(1), 55-70.
32 *Grange, C., & Benbasat, I. (2018). Opinion seeking in a social network-enabled
33 product review website: A study of word-of-mouth in the era of digital social
Red

34 networks. European Journal of Information Systems, 27(6), 629-653.


35 *Gregory, R. W., & Muntermann, J. (2014). Research note–Heuristic theorizing:
36 Proactively generating design theories. Information Systems Research, 25(3),
v

37 639-653.
38 *Gupta, A., Kannan, K., & Sanyal, P. (2018). Economic experiments in information
M
iew

39 systems. MIS Quarterly, 42(2), 595-606.


40 Gwebu, K., Wang, J., & Guo, L. (2014). Continued usage intention of multifunctional
41
an

friend networking services: A test of a dual-process model using Facebook.


42 Decision Support Systems, 67, 66-77.
43 Haddara, M., & Hetlevik, T. (2016). Investigating the effectiveness of traditional support
44 structures & self-organizing entities within the ERP shakedown phase. Procedia
45
us

Computer Science, 100, 507-516.


46
Hall, D. J., & Paradice, D. (2005). Philosophical foundations for a learning-oriented
47
48
knowledge management system for decision support. Decision Support
Systems, 39(3), 445-461.
cr

49
50 Halpern, D. (2015). Inside the Nudge Unit. London: WH Allen.
51 Harinck, F., Van Dijk, E., Van Beest, I., & Mersmann, P. (2007). When gains loom
larger than losses: Reversed loss aversion for small amounts of money.
ip

52
53 Psychological Science, 18(12), 1099-1105.
54 Hastie, R., & Dawes, R.M. (2001). Rational choice in an uncertain world. Thousand
t

55 Oaks, CA: SAGE Publications.


56 Hertwig, R., Fenselow, C., & Hoffrage, U. (2003). Hindsight bias: Knowledge and
57 heuristics affect our reconstruction of the Past. Memory, 11(4–5), 357–377.
58 Hirschheim, R. (2019). Against theory: With apologies to Feyerabend. Journal of the
59 Association for Information Systems, 20(9), 1338-1355.
60

63
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Journal of Information Technology Page 64 of 75

1
2
3 *Hoefnagel, R., Oerlemans, L., & Goedee, J. (2014). How anchoring and adjusting
4 influence citizens’ acceptance of video-mediated crime reporting: A narrative
5 approach. Journal of Strategic Information Systems, 23(4), 305-322.
6
Hogarth, R. (1987). Judgement and choice: The psychology of decision (2nd
7
ed.). Chichester, UK: Wiley.
8
9
Hogarth, R.M., & Einhorn, H.J. (1992). Order effects in belief updating: The
10 belief-adjustment model. Cognitive Psychology, 24(1), 1-55.
11 Horton, D.L., & Mills, C.B. (1984). Human learning and memory. Annual Review of
Au

12 Psychology, 35, 361-394.


13 Huang, H-H., Hsu, J.S-C., & Ku, C-Y. (2012). Understanding the role of computer-
14 mediated counterargument in countering confirmation bias. Decision Support
15 Systems, 53, 438-447.
th

16 *Huang, N., Hong, Y., & Burtch, G. (2017). Social network integration and user
17 content generation: Evidence from natural experiments. MIS Quarterly, 41(4),
18 1035-1058.
or

19 Jaswal, V.K., Croft, A.C., Setia, A.R., & Cole, C.A. (2010). Young children have a
20 specific, highly robust bias to trust testimony. Psychological Science, 21(10),
21 1541-1547.
Fo c

22 Joram, E., & Reed, D. (1996). Two faces of representativeness: The effects of
Ac

23 response format on beliefs about random sampling. Journal of Behavioral


24 Decision Making, 9(4), 249 – 264.
r e

25
Jorgenson, M., Indahl, U., & Sjoberg, D. (2003). Software effort estimation by
26
analogy and regression toward the mean. Journal of System Software, 68,
Pe p

27
28
253–262.
29 Kahneman, D. (1973). Attention and effort. Englewood Cliffs, NJ: Prentice-Hall.
Kahneman, D. (2003). Maps of bounded rationality: A perspective on intuitive
erte

30
31 judgment and choice. In T. Frangsmyr [Nobel Foundation], (Ed.), Les Prix
32 Nobel: The Nobel Prizes 2002 (pp. 449-489). Stockholm, SE: The Nobel
33 Foundation.
Red

34 Kahneman, D. (2011). Thinking fast and slow. New York: Farrar, Straus and Giroux.
35 Kahneman, D., & Frederick, S. (2002). Representativeness revisited: Attribute
36 substitution in intuitive judgment. In T. Gilovich, D. Griffin, & D. Kahneman
v

37 (Eds.), Heuristics and Biases: The psychology of intuitive judgment (pp. 49–
38 81). New York: Cambridge University Press.
M
iew

39 Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to
40 disagree. American Psychologist, 64, 515-526.
41
an

Kahneman, D., Knetsch, J.L., & Thaler, R.H. (1990). Experimental tests of the
42 endowment effect and the Coase theorem. Journal of Political
43 Economy, 98(6): 1325–1348
44 Kahneman, D., Knetsch, J. L., & Thaler, R. H. (1991). Anomalies: The endowment
45
us

effect, loss aversion, and status quo bias. Journal of Economic


46
Perspectives, 5(1), 193–206.
47
48
Kahneman, D., & Lovallo, D. (1993). Timid choices and bold forecasts: A cognitive
perspective on risk taking. Management Science, 39, 17-31.
cr

49
50 Kahneman, D., Slovic, P., & Tversky, A. (Eds.) (1982). Judgment under uncertainty:
51 Heuristics and biases. Cambridge, UK: Cambridge University Press.
Kahneman, D. & Tversky, A. (1972). Subjective probability: A judgment of
ip

52
53 representativeness. Cognitive Psychology, 3(3), 430-454.
54 Kahneman, D. & Tversky, A. (1979a). Prospect theory: An analysis of decision under
t

55 risk. Econometrica, 47, 263-291.


56 Kahneman, D., & Tversky, A. (1979b). Intuitive prediction: Biases and corrective
57 procedures. TIMS Studies in Management Science, 12, 313-327.
58 Kahneman, D. & Tversky, A. (Eds.) (2000). Choices, values, and frames. Cambridge,
59 UK: Cambridge University Press.
60

64
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Page 65 of 75 Journal of Information Technology

1
2
*Kehr, F., Kowatsch, T., Wentzel, D., & Fleisch, E. (2015). Blissfully ignorant: The effects of general privacy concerns, general institutional trust, and affect in the privacy calculus. Information Systems Journal, 25(6), 607-635.
Keil, M., Mann, J., & Rai, A. (2000). Why software projects escalate: An empirical analysis and test of four theoretical models. MIS Quarterly, 24(4), 631-664.
Keren, G. (1990). Cognitive aids and debiasing methods: Can cognitive pills cure cognitive ills? In J.P. Caverni, J.M. Fabre, & M. Gonzalez (Eds.), Cognitive biases (pp. 523-555). Amsterdam: North-Holland.
*Khan, S. S., Kumar, R., Zhao, K., & Stylianou, A. (2017). Examining real options exercise decisions in information technology investments. Journal of the Association for Information Systems, 18(5), 372-402.
Klein, G. (1999). Sources of power: How people make decisions. Cambridge, MA: MIT Press.
Klein, G. (2004). The power of intuition: How to use your gut feelings to make better decisions at work. New York: Currency-Doubleday.
Klein, G., Calderwood, R., & Clinton-Cirocco, A. (2010). Rapid decision making on the fire ground: The original study plus a postscript. Journal of Cognitive Engineering and Decision Making, 4(3), 186-209.
*Klör, B., Monhof, M., Beverungen, D., & Bräuer, S. (2018). Design and evaluation of a model-driven decision support system for repurposing electric vehicle batteries. European Journal of Information Systems, 27(2), 171-188.
Koellinger, P., Minniti, M., & Schade, C. (2007). "I think I can, I think I can": Overconfidence and entrepreneurial behavior. Journal of Economic Psychology, 28, 502-527.
Kotter, J.P. (1982). The general managers. New York: The Free Press.
Kottemann, J.E., Davis, F.D., & Remus, W.E. (1994). Computer-assisted decision making: Performance, beliefs, and the illusion of control. Organizational Behavior and Human Decision Processes, 57(1), 26-37.
*Kretzer, M., & Maedche, A. (2018). Designing social nudges for enterprise recommendation agents: An investigation in the business intelligence systems context. Journal of the Association for Information Systems, 19(12), 1145-1186.
Kruger, J., Wirtz, D., Van Boven, L., & Altermatt, T. (2004). The effort heuristic. Journal of Experimental Social Psychology, 40, 91-98.
*Kuan, K. K. Y., Hui, K.-L., Prasarnphanich, P., & Lai, H.-Y. (2015). What makes a review voted? An empirical investigation of review voting in online review systems. Journal of the Association for Information Systems, 16(1), 48-71.
Kuo, F., Hsu, C., & Day, R. (2009). An exploratory study of cognitive effort involved in decision under framing—An application of the eye-tracking technology. Decision Support Systems, 48(1), 81-91.
Langer, E. J. (1975). The illusion of control. Journal of Personality and Social Psychology, 32(2), 311-328.
*Lee, H. K., Lee, J. S., & Keil, M. (2018). Using perspective-taking to de-escalate launch date commitment for products with known software defects. Journal of Management Information Systems, 35(4), 1251-1276.
*Lee, J. S., & Keil, M. (2018). The effects of relative and criticism-based performance appraisals on task-level escalation in an IT project: A laboratory experiment. European Journal of Information Systems, 27(5), 551-569.
*Lee, K., & Joshi, K. (2017). Examining the use of status quo bias perspective in IS research: Need for reconceptualizing and incorporating biases. Information Systems Journal, 27(6), 733-752.
*Legoux, R., Leger, P.-M., Robert, J., & Boyer, M. (2014). Confirmation biases in the financial analysis of IT investments. Journal of the Association for Information Systems, 15(1), 33-52.
*Li, M., Tan, C.-H., Wei, K.-K., & Wang, K. (2017). Sequentiality of product review information provision: An information foraging perspective. MIS Quarterly, 41(3), 867-892.
*Liang, H., Xue, Y., & Zhang, Z. (2017). Understanding online health information use: The case of people with physical disabilities. Journal of the Association for Information Systems, 18(6), 433-460.
Lim, K.H., Benbasat, I., & Ward, L.M. (2000). The role of multimedia in changing first impression bias. Information Systems Research, 11(2), 115-136.
Lim, L-H., & Benbasat, I. (1997). The debiasing role of group support systems: An experimental investigation of the representativeness bias. International Journal of Human-Computer Studies, 47, 453-471.
Lipshitz, R., & Bar-Ilan, O. (1996). How problems are solved: Reconsidering the phase theorem. Organizational Behavior & Human Decision Processes, 65(1), 48-60.
*Liu, Q. B., & Karahanna, E. (2017). The dark side of reviews: The swaying effects of online product reviews on attribute preference construction. MIS Quarterly, 41(2), 427-448.
Lovallo, D., & Kahneman, D. (2003). Delusions of success: How optimism undermines executives' decisions. Harvard Business Review, May, 56-63.
Lovallo, D., & Sibony, O. (2010). The case for behavioral strategy. McKinsey Quarterly, March.
*Ma, X., Kim, S.H., & Kim, S.S. (2014). Online gambling behavior: The impacts of cumulative outcomes, recent outcomes, and prior use. Information Systems Research, 25(3), 511-527.
Mack, A. (2003). Inattentional blindness: Looking without seeing. Current Directions in Psychological Science, 12(5), 180-184.
Mackinnon, A.J., & Wearing, A.J. (1991). Feedback and the forecasting of exponential change. Acta Psychologica, 76, 177-191.
March, J.G. (1994). A primer on decision making: How decisions happen. New York: The Free Press.
Marshall, A. (1890). Principles of economics. London: Macmillan.
Maule, A.J., & Edland, A.C. (1997). The effects of time pressure on human judgement and decision making. In R. Ranyard, W.R. Crozier, & O. Svenson (Eds.), Decision making: Cognitive models and explanations (pp. 189-204). London: Routledge.
McKenzie, C.R.M., Liersch, M.J., & Yaniv, I. (2008). Overconfidence in interval estimates: What does expertise buy you? Organizational Behavior & Human Decision Processes, 107, 179-191.
*Meservy, T. O., Jensen, M. L., & Fadel, K. J. (2014). Evaluation of competing candidate solutions in electronic networks of practice. Information Systems Research, 25(1), 15-34.
*Minas, R. K., Potter, R. F., Dennis, A. R., Bartelt, V., & Bae, S. (2014). Putting on the thinking cap: Using NeuroIS to understand information processing biases in virtual teams. Journal of Management Information Systems, 30(4), 49-82.
Mintzberg, H. (1973). The nature of managerial work. Englewood Cliffs, NJ: Prentice-Hall.
Mintzberg, H. (2009). Managing. San Francisco, CA: Berrett-Koehler Publishers.
Mintzberg, H., Raisinghani, D., & Theoret, A. (1976). The structure of 'unstructured' decision processes. Administrative Science Quarterly, 21(2), 246-275.
Mitroff, S.R., Simons, D.J., & Franconeri, S.L. (2002). The siren song of implicit change detection. Journal of Experimental Psychology: Human Perception & Performance, 28, 798-815.
Moody, G.D., & Galletta, D.F. (2015). Lost in cyberspace: The impact of information scent and time constraints on stress, performance, and attitudes online. Journal of Management Information Systems, 32(1), 192-244.

Moore, D.A., & Cain, D.M. (2007). Overconfidence and underconfidence: When and why people underestimate (and overestimate) the competition. Organizational Behavior & Human Decision Processes, 103, 197-213.
Moore, D.A., & Healy, P.J. (2008). The trouble with overconfidence. Psychological Review, 115(2), 502-517.
Moore, D.A., Oesch, J.M., & Zietsma, C. (2007). What competition? Myopic self-interest in market entry decisions. Organization Science, 18(3), 440-454.
Ni, F., Arnott, D., & Gao, S. (2019). The anchoring effect in business intelligence supported decision-making. Journal of Decision Systems, 28, 67-81.
*Niculescu, M. F., & Wu, D. J. (2014). Economics of free under perpetual licensing: Implications for the software industry. Information Systems Research, 25(1), 173-199.
Nisbett, R., & Ross, L. (1980). Human inference: Strategies and shortcomings of social judgment. Englewood Cliffs, NJ: Prentice-Hall.
Nisbett, R.E., & Wilson, T.D. (1977). The halo effect: Evidence for unconscious alteration of judgments. Journal of Personality and Social Psychology, 35(4), 250-256.
*Nuijten, A., Keil, M., & Commandeur, H. (2016). Collaborative partner or opponent: How the messenger influences the deaf effect in IT projects. European Journal of Information Systems, 25(6), 534-552.
Oduor, M., & Oinas-Kukkonen, H. (2017). Behavioral economics in information systems research: A persuasion context analysis. In Proceedings of the 19th International Conference on Enterprise Information Systems (ICEIS 2017) (Vol. 3, pp. 17-28). SCITEPRESS.
Olsen, R.A. (1997). Desirability bias among professional investment managers: Some evidence from experts. Journal of Behavioral Decision Making, 10, 65-72.
Oppenheimer, D.M., & Monin, B. (2009). The retrospective gambler's fallacy: Unlikely events, constructing the past, and multiple universes. Judgment and Decision Making, 4(5), 326-334.
Ordonez, L., & Benson, L., III. (1997). Decisions under time pressure: How time constraint affects risky decision making. Organizational Behavior & Human Decision Processes, 71, 121-140.
Ozdemir, Z.D., Smith, H.J., & Benamati, J.H. (2017). Antecedents and outcomes of information privacy concerns in a peer context: An exploratory study. European Journal of Information Systems, 26(6), 642-660.
Pare, G., Trudel, M-C., Jaana, M., & Kitsiou, S. (2015). Synthesizing information systems knowledge: A typology of literature reviews. Information & Management, 52, 183-199.
*Park, E. H., Ramesh, B., & Cao, L. (2016). Emotion in IT investment decision making with a real options perspective: The intertwining of cognition and regret. Journal of Management Information Systems, 33(3), 652-683.
*Park, S. C., Keil, M., Bock, G.-W., & Kim, J. U. (2016). Winner's regret in online C2C auctions: An automatic thinking perspective. Information Systems Journal, 26(6), 613-640.
Payne, J. W. (1976). Task complexity and contingent processing in decision making: An information search and protocol analysis. Organizational Behavior and Human Performance, 16(2), 366-387.
Payne, J. W. (1982). Contingent decision behavior. Psychological Bulletin, 92(2), 382-402.
Payne, J. W., Bettman, J. R., & Johnson, E. J. (1993). The adaptive decision maker. Cambridge, UK: Cambridge University Press.
Piramuthu, S., Kapoor, G., Zhou, W., & Mauw, S. (2012). Input online review data and related bias in recommender systems. Decision Support Systems, 53, 418-424.

Polites, G.L., & Karahanna, E. (2012). Shackled to the status quo: The inhibiting effects of incumbent system habit, switching costs, and inertia on new system acceptance. MIS Quarterly, 36(1), 21-42.
Prat, N., Comyn-Wattiau, I., & Akoka, J. (2015). A taxonomy of evaluation methods for information systems artifacts. Journal of Management Information Systems, 32(3), 229-267.
Rafaeli, S., & Raban, D. (2003). Experimental investigation of the subjective value of information in trading. Journal of the Association for Information Systems, 4(5), 119-139.
Read, D., & Loewenstein, G. (1995). Diversification bias: Explaining the discrepancy in variety seeking between combined and separated choices. Journal of Experimental Psychology: Applied, 1, 34-49.
Read, D., Antonides, G., Van den Ouden, L., & Trienekens, H. (2001). Which is better: Simultaneous or sequential choice? Organizational Behavior and Human Decision Processes, 84, 54-70.
Remus, W.E. (1984). An empirical investigation of the impact of graphical and tabular data presentations on decision making. Management Science, 30, 533-542.
Remus, W., & Kottemann, J. (1995). Anchor-and-adjustment behavior in a dynamic decision environment. Decision Support Systems, 15, 63-74.
Reyna, V.F. (2004). How people make decisions that involve risk: A dual-processes approach. Current Directions in Psychological Science, 13(2), 60-66.
Ricchiute, D.N. (1997). Effects of judgement on memory: Experiments in recognition bias and process dissociation in a professional judgement task. Organizational Behavior & Human Decision Processes, 70, 27-39.
Ricketts, J.A. (1990). Powers-of-ten information biases. MIS Quarterly, 14, 63-77.
Rose, J.M., Rose, A.M., & Norman, C.S. (2004). The evaluation of risky information technology investment decisions. Journal of Information Systems, 18(1), 53-66.
Ross, D. (2014). Psychological versus economic models of bounded rationality. Journal of Economic Methodology, 21(4), 411-427.
Roy, M.C., & Lerch, F.J. (1996). Overcoming ineffective mental representations in base-rate problems. Information Systems Research, 7(2), 233-247.
Russo, J.E., Medvec, V.H., & Meloy, M.G. (1996). The distortion of information during decisions. Organizational Behavior & Human Decision Processes, 66, 102-110.
Sage, A.P. (1981). Behavioral and organizational considerations in the design of information systems and processes for planning and decision support. IEEE Transactions on Systems, Man, and Cybernetics, SMC-11(9), 640-678.
Saldana, J. (2013). The coding manual for qualitative researchers (2nd ed.). Thousand Oaks, CA: Sage.
*Salge, T. O., Kohli, R., & Barrett, M. (2015). Investing in information systems: On the behavioral and institutional search mechanisms underpinning hospitals' IS investment decisions. MIS Quarterly, 39(1), 61-89.
Samuelson, P., & Nordhaus, W. (2009). Economics (19th ed.). New York: McGraw-Hill Education.
Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1(1), 7-59.
Savage, L.J. (1954). The foundations of statistics. New York: Wiley.
Schkade, D.A., & Kahneman, D. (1998). Does living in California make people happy? A focusing illusion in judgments of life satisfaction. Psychological Science, 9(5), 340-346.
Schrift, R.Y., Kivetz, R., & Netzer, O. (2016). Complicating decisions: The work heuristic and the construction of effortful decisions. Journal of Experimental Psychology: General, 145(7), 807-829.
Schwarz, N., & Vaughn, L.A. (2002). The availability heuristic revisited: Ease of recall and content of recall as distinct sources of information. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 103-119). New York: Cambridge University Press.
Sedikides, C., & Gregg, A.P. (2008). Self-enhancement: Food for thought. Perspectives on Psychological Science, 3(2), 102-116.
Sedlmeier, P., & Gigerenzer, G. (1997). Intuitions about sample size: The empirical law of large numbers. Journal of Behavioral Decision Making, 10, 33-51.
Shiller, R.J. (1978). Rational expectations and the dynamic structure of macroeconomic models: A critical review. Journal of Monetary Economics, 4, 1-44.
*Shmueli, O., Pliskin, N., & Fink, L. (2016a). Can the outside-view approach improve planning decisions in software development projects? Information Systems Journal, 26(4), 395-418.
Shmueli, O., Pliskin, N., & Fink, L. (2016b). Explaining over-requirement in software development projects: An experimental investigation of behavioral effects. International Journal of Project Management, 33(2), 380-394.
Simon, H.A. (1955). A behavioral model of rational choice. Quarterly Journal of Economics, 69, 99-118.
Simon, H.A. (1956). Rational choice and the structure of the environment. Psychological Review, 63(2), 129-138.
Simon, H.A. (1959). Theories of decision-making in economics and behavioral science. The American Economic Review, 49(3), 253-283.
Simon, H.A. (1960). The new science of management decision. New York: Harper.
Simon, H.A. (1973). The structure of ill-structured problems. Artificial Intelligence, 4, 181-201.
Simon, H.A. (1997). Administrative behavior (4th ed.). New York: The Free Press. (First published 1947.)
*Singh, H., Aggarwal, R., & Cojuharenco, I. (2015). Strike a happy medium: The effect of IT knowledge on venture capitalists' overconfidence in IT investments. MIS Quarterly, 39(4), 887-908.
Sivarajah, U., Kamal, M.M., Irani, Z., & Weerakkody, V. (2017). Critical analysis of big data challenges and analytical methods. Journal of Business Research, 70, 263-286.
Sloman, S.A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119(1), 3-22.
Slovic, P., Finucane, M., Peters, E., & MacGregor, D.G. (2002). The affect heuristic. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 397-420). New York: Cambridge University Press.
Slovic, P., Fischhoff, B., & Lichtenstein, S. (1982). Response mode, framing, and information-processing effects in risk assessment. In R.M. Hogarth (Ed.), New directions for methodology of social and behavioral science: Question framing and response consistency. San Francisco: Jossey-Bass.
Smith, A. (1776). An inquiry into the nature and causes of the wealth of nations. London: W. Strahan and T. Cadell.
Smith, G.F. (1988). Towards a heuristic theory of problem structuring. Management Science, 34(12), 1489-1506.
Stanovich, K.E., & West, R.F. (2000). Individual differences in reasoning: Implications for the rationality debate. Behavioral and Brain Sciences, 23(5), 645-665.
Staw, B.M. (1981). The escalation of commitment to a course of action. Academy of Management Review, 6(4), 577-587.
Street, C., & Ward, K.W. (2019). Cognitive bias in the peer review process: Understanding a source of friction between reviewers and researchers. The DATA BASE for Advances in Information Systems, 50(4), 52-70.
Tan, W-K., Tan, C-H., & Teo, H-H. (2012). Consumer-based decision aid that explains which to buy: Decision confirmation or overconfidence bias? Decision Support Systems, 53, 127-141.
Tarafdar, M., & Davison, R.M. (2018). Research in information systems: Intra-disciplinary and inter-disciplinary approaches. Journal of the Association for Information Systems, 19(6), 523-551.
Taylor, H., Dillon, S., & Van Wingen, M. (2010). Focus and diversity in information systems research: Meeting the dual demands of a healthy applied discipline. MIS Quarterly, 34(4), 647-668.
Teger, A. (1980). Too much invested to quit. New York: Pergamon.
Teigen, K.H., Martinussen, M., & Lund, T. (1996). Linda versus World Cup: Conjunctive probabilities in three-event fictional and real-life predictions. Journal of Behavioral Decision Making, 9, 77-93.
Tenbrunsel, A.E., & Messick, D.M. (2004). Ethical fading: The role of self-deception in unethical behaviour. Social Justice Research, 17(2), 223-236.
*Teubner, T., Adam, M., & Riordan, R. (2015). The impact of computerized agents on immediate emotions, overall arousal and bidding behavior in electronic auctions. Journal of the Association for Information Systems, 16(10), 838-879.
Thaler, R.H. (1980). Toward a positive theory of consumer choice. Journal of Economic Behavior and Organization, 1(1), 39-60.
Thaler, R.H. (2015). Misbehaving: The making of behavioral economics. New York: W.W. Norton & Company.
Thaler, R.H., & Sunstein, C.R. (2008). Nudge: Improving decisions about health, wealth, and happiness. London: Penguin Books.
Thaler, R.H., & Sunstein, C.R. (2009). Nudge: Improving decisions about health, wealth, and happiness (rev. ed.). London: Penguin Books.
Thaler, R.H., Sunstein, C.R., & Balz, J.P. (2014). Choice architecture. Retrieved February 4, 2019, from https://www.sas.upenn.edu/~baron/475/choice.architecture.pdf
Thuring, M., & Jungermann, H. (1990). The conjunction fallacy: Causality vs. event probability. Journal of Behavioral Decision Making, 3(1), 51-74.
Trieu, V-H. (2017). Getting value from business intelligence systems: A review and research agenda. Decision Support Systems, 93, 111-124.
Tummers, J., Kassahun, A., & Tekinerdogan, B. (2019). Obstacles and features of farm management information systems: A systematic literature review. Computers and Electronics in Agriculture, 157, 189-204.
*Turel, O., & Qahri-Saremi, H. (2016). Problematic use of social networking sites: Antecedents and consequence from a dual-system theory perspective. Journal of Management Information Systems, 33(4), 1087-1116.
Tversky, A., & Kahneman, D. (1971). Belief in the law of small numbers. Psychological Bulletin, 76(2), 105-111.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207-232.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.
Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211, 453-458.
Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5(4), 297-323.
Venkatesh, V., Davis, F.D., & Morris, M.G. (2010). Dead or alive? The development, trajectory and future of technology adoption research. Journal of the Association for Information Systems, 8(4), 267-286.
Von Neumann, J., & Morgenstern, O. (1944). Theory of games and economic behavior. Princeton, NJ: Princeton University Press.
*Wang, J., Li, Y., & Rao, H. R. (2016). Overconfidence in phishing email detection. Journal of the Association for Information Systems, 17(11), 759-783.
Weber, R.P. (1990). Basic content analysis (2nd ed.). Newbury Park, CA: Sage Publications.
Wilson, T.D., Wheatley, T., Meyers, J.M., Gilbert, D.T., & Axsom, D. (2000). Focalism: A source of durability bias in affective forecasting. Journal of Personality and Social Psychology, 78(5), 821-836.
Winter, E. (2014). Feeling smart: Why our emotions are more rational than we think. New York: Public Affairs.
*Xu, D. J., Benbasat, I., & Cenfetelli, R. T. (2017). A two-stage model of generating product advice: Proposing and testing the complementarity principle. Journal of Management Information Systems, 34(3), 826-862.
Xue, Y., Liang, H., & Boulton, W.R. (2008). Information technology governance in information technology investment decision processes: The impact of investment characteristics, external environment, and internal context. MIS Quarterly, 32(1), 67-96.
*Yu, J., Hu, P. J.-H., & Cheng, T.-H. (2015). Role of affect in self-disclosure on social network websites: A test of two competing models. Journal of Management Information Systems, 32(2), 239-277.
*Zheng, H., Xu, B., Hao, L., & Lin, Z. (2018). Reversed loss aversion in crowdsourcing contest. European Journal of Information Systems, 27(4), 434-448.
APPENDIX A: Condensed Protocol for the Literature Analysis

General Research Questions
1. Epistemology: 1 Positivist 2 Interpretivist 3 Critical 4 Unclear
2. Article Type: (Multiple codes allowed)
    1 Conceptual study 2 Descriptive research 3 Experimental 4 Field study
    5 Case study 6 Survey 7 Literature review
    8 Secondary data 9 Action research 10 Design science
3. Unit of analysis: 1 Unclear 2 Specified 3 Inferred 4 N/A Text answer.
5. IS Theme: (Multiple codes allowed)
    1 Development, implementation, & use of systems
    2 IS strategy & business outcomes
    3 Group work & decision support
6. Research domain: Text answer.
7. Task organizational nature: 1 Strategic 2 Tactical 3 Operational (Multiple codes allowed)
8. Type of task addressed: (Multiple codes allowed)
    1 System 1 dominant 2 System 2 dominant 3 Strong S1/S2 interaction 4 Unclear

General BE Questions
9. Is BE reference research used for?
    9a. Introduction & background 1 Cited 2 No
    9b. Method, formulating models & hypotheses 1 Cited 2 No
    9c. Interpreting results & discussion 1 Cited 2 No
10. BE reference theories cited? (author/date citations) Count Citations
    10a. Introduction & background
    10b. Method, formulating models & hypotheses
    10c. Interpreting results & discussion
11. What behavioral economics generation is used? (Multiple codes allowed)
    1 Early 2 Contemporary 3 Foundational
12. Degree of use of BE theory:
    12a. Early BE 1 minor 2 moderate 3 major 4 none
    12b. Contemporary BE 1 minor 2 moderate 3 major 4 none
    12c. Foundational 1 minor 2 moderate 3 major 4 none

Detailed BE Questions
13. BE theory use: (Multiple codes allowed) Definitions appear in Tables 1, 2 and 5.
    1 Bounded rationality 2 Phase model 3 Decision structuredness
    4 Dual process theory 5 Heuristics (Go to Q14) 6 Cognitive biases (Go to Q15)
    7 Prospect theory 8 Nudges 9 Debiasing 10 Other:
14. Use of heuristics: (Multiple codes allowed) Definitions appear in Table 3.
    1 Availability 2 Representativeness 3 Anchoring and Adjustment
    4 Affect 5 Effort 6 Recognition 7 Other:
15. Use of cognitive biases: (Multiple codes allowed)
    Detailed biases were identified and coded using the typology in Table 4.
    1 Overconfidence Biases 2 Availability Biases
    3 Representativeness Biases 4 Confirmation Biases
    5 Bounded Awareness Biases 6 Framing Biases
    7 Escalation Biases 8 Other:
APPENDIX B: Articles with a Major Use of Behavioral
5 Economics, AIS Basket of 8 Journals, 2014-18.
6
7 Table B1. IS Articles with a Major use of Behavioral Economics.
8
9 Org. Other BE
Research Type of Early BE Heuristics Cognitive Biases
Article Research Domain Nature of Concepts
10 Approach
Tasks
Task Used Used Used
used
11
Au

Confirmation
Affect,
Adjerid et al. System 1 Bounded (endowment effect), Prospect
12 (2018)
Security/Privacy Experiment Operational
dominant rationality
General
Framing (framing Theory
13 mention
bias)
14 Strong
Dual
Adomavicius et e-commerce, IS Confirmation Process
15 Experiment Operational S1/S2
th

al. (2018) Use/Adoption (anchoring effect) Theory,


interaction
16 Nudges
17 IS Analysis & Strong Dual
Arazy et al. Design Tactical, General
Design & S1/S2 General mention Process
18 (2017) science Operational mention
or

Development interaction Theory


19 Bartelt &
Strong Complexity effect Dual
Virtual Experiment Tactical S1/S2 (time Process
20 Dennis (2014)
interaction pressure/stress) Theory
21 Experiment
Fo c

22 Bockstedt &
e-commerce
,
Operational
System 1
Availability
Framing (framing Prospect
Ac

Goh (2014) Secondary dominant bias) Theory


23 data
24 Overconfidence
Strong
r e

25 Chen & DSS, Finance -


Experiment Operational S1/S2
(illusion of control,
Koufaris (2015) Investment overconfidence,
26 interaction
overplacement)
Pe p

27 Chiu et al.
e-commerce Survey Operational
System 1 Prospect
28 (2014) dominant Theory
Availability,
29 Constantiou et Mobile, IS
Case
System 1 Representa Confirmation (status
Dual
study, Operational Process
erte

30 al. (2014) Use/Adoption


Survey
dominant tiveness, quo bias)
Theory
Affect
31 Overconfidence
32 (planning fallacy,
33 self enhancement),
Red

Representativeness
34 (similarity bias),
Dual
Operational Process
35 Dinev et al.
Security/Privacy
Conceptual
, Tactical,
System 1 Bounded
Affect
Confirmation
Theory,
(2015) study dominant rationality (anchoring effect,
36 Strategic
endowment effect),
Prospect
v

37 theory
Framing (framing
38 bias), Default
M
iew

(implicit) trust bias,


39 Herding effect
40 Fadel et al.
Virtual, Strong
Phase
Dual
41 Knowledge Field study Operational S1/S2 Process
an

(2015) model
Management interaction Theory
42 Virtual, Strong
Fehrenbacher
43 (2017)
Knowledge Experiment Operational S1/S2 Affect
Management interaction
44
Dual
45
us

Ferratt et al. Conceptual System 2 Bounded General Process


IS Use/Adoption Operational General mention
46 (2018) study dominant rationality mention Theory,
Nudges
47 Bounded
48 Strong
Frisk et al. Finance - Action rationality,
Tactical S1/S2
cr

49 (2014) Investment research


interaction
Phase
model
50 Dual
51 Goel et al.
Strong
General Confirmation
Process
ip

52 Security/Privacy Experiment Operational S1/S2 Theory,


(2017) mention (confirmation trap)
interaction Prospect
53 theory
54 Grange & e-commerce, Strong
Bounded
t

rationality, General
55 Benbasat Social & Crowd Experiment Operational S1/S2
Phase mention
56 (2018) Computing interaction
model
57 Gregory & Strong
Bounded
IS Theory & Design rationality, General
58 Muntermann
Research science
Tactical S1/S2
Phase mention
(2014) interaction
59 model
60

73
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
| Gupta et al. (2018) | IS Theory & Research | Conceptual study | Operational, Tactical, Strategic | Unclear | General mention | Bounded rationality, Prospect theory, Nudges |
| Hoefnagel et al. (2014) | IS Use/Adoption, Business Function & Industry | Case study | Operational | System 1 dominant | Anchoring & Adjustment | — |
| Huang et al. (2017) | Social & Crowd Computing | Secondary data | Operational | Unclear | Availability (ease of recall), Confirmation (anchoring effect, status quo bias) | Dual Process Theory, Nudges |
| Kehr et al. (2015) | Mobile, Security/Privacy | Experiment | Operational | System 1 dominant | Affect | Bounded rationality |
| Khan et al. (2017) | Finance - Investment | Survey | Tactical | Strong S1/S2 interaction | — | Prospect theory |
| Kloör et al. (2018) | DSS | Design science | Operational | Strong S1/S2 interaction | — | Bounded rationality, Phase model |
| Kretzer & Maedche (2018) | e-commerce | Experiment | Operational | System 2 dominant | General mention | Nudges, Phase model |
| Kuan et al. (2015) | e-commerce | Secondary data | Operational | Unclear | Availability (ease of recall), Framing (framing bias) | Dual Process Theory |
| Lee et al. (2018) | IS Analysis & Design & Development | Experiment | Strategic, Tactical | Strong S1/S2 interaction | Escalation | Debiasing |
| Lee & Keil (2018) | IT Management & Governance | Experiment | Operational | Strong S1/S2 interaction | Escalation (unilateral escalation, competitive escalation) | Prospect theory |
| Lee & Joshi (2017) | IS Use/Adoption | Conceptual study | Operational, Tactical, Strategic | Unclear | Overconfidence (illusion of control), Confirmation (anchoring effect, status quo bias), Framing (framing bias), Escalation (unilateral escalation) | Prospect theory |
| Legoux et al. (2014) | Finance - Investment | Experiment | Tactical | Strong S1/S2 interaction | Overconfidence (planning fallacy, self enhancement), Confirmation (confirmation trap) | Debiasing |
| Li et al. (2017) | e-commerce | Field study | Operational | System 2 dominant | Availability (retrievability bias), Framing (presentation bias) | Bounded rationality, Dual Process Theory |
| Liang et al. (2017) | Business Function & Industry | Survey | Operational | Unclear | — | Bounded rationality, Prospect Theory |
| Liu & Karahanna (2017) | e-commerce | Experiment | Operational | Strong S1/S2 interaction | Confirmation (anchoring effect, confirmation trap), Framing (framing bias) | Bounded rationality, Dual Process Theory, Prospect Theory, Nudges |
| Ma et al. (2014) | Business Function & Industry | Secondary data | Operational | System 1 dominant | Availability | Prospect Theory |
| Meservy et al. (2014) | Knowledge management, Virtual | Field study | Operational | Strong S1/S2 interaction | — | Bounded rationality, Dual Process Theory, Phase model (major) |
| Minas et al. (2014) | Virtual, IS Theory & Research | Experiment | Operational | Strong S1/S2 interaction | Confirmation (Confirmation trap) | Debiasing |
| Niculescu & Wu (2014) | IT Management & Governance | Conceptual study | Strategic | Unclear | General mention, Confirmation (Confirmation trap) | Bounded rationality |
| Nuijten et al. (2016) | IT Management & Governance | Experiment | Tactical | System 1 dominant | Framing (framing bias), Escalation | Dual Process Theory, Prospect theory |

74
https://mc.manuscriptcentral.com/jin

DOI: 10.1177/02683962211016000
Page 75 of 75 Journal of Information Technology

| Park, Ramesh & Cao (2016) | Finance - Investment | Case study | Tactical | Unclear | Affect (emotion-regret) | Bounded rationality, Dual Process Theory |
| Park et al. (2016) | e-commerce | Survey | Operational | System 1 dominant | Affect, Confirmation (endowment effect), Framing (framing bias), Escalation | Dual Process Theory, Prospect theory |
| Salge et al. (2015) | Finance – Investment, Business Function & Industry | Secondary data | Tactical | Strong S1/S2 interaction | — | Bounded rationality, Phase model |
| Shmueli et al. (2016a) | IS Analysis & Design & Development | Experiment | Tactical | Strong S1/S2 interaction | Overconfidence (planning fallacy) | Debiasing |
| Singh et al. (2015) | Business Function & Industry, Finance - Investment | Secondary data, Survey | Tactical | Strong S1/S2 interaction | Overconfidence (overconfidence, self enhancement) | Dual Process Theory, Debiasing |
| Teubner et al. (2015) | e-commerce, IS Theory & Research | Experiment | Operational | System 1 dominant | Affect, Confirmation (endowment effect) | Dual Process Theory |
| Turel & Qahri-Saremi (2016) | Social & Crowd Computing, IS Use/Adoption | Survey | Operational | System 1 dominant | — | Dual Process Theory |
| Wang et al. (2016) | Security/Privacy | Experiment | Operational | System 1 dominant | General mention, Overconfidence (overconfidence) | Debiasing |
| Xu et al. (2017) | e-commerce | Experiment | Operational | Strong S1/S2 interaction | — | Bounded rationality, Phase model |
| Yu et al. (2015) | Social & Crowd Computing | Survey | Operational | System 1 dominant | Affect | Bounded rationality, Dual Process Theory |
| Zheng et al. (2018) | Social & Crowd Computing | Experiment | Operational | System 1 dominant | Framing (framing bias) | Prospect theory |