Computers in Human Behavior Reports 3 (2021) 100052


Digital nudging with recommender systems: Survey and future directions


Mathias Jesse *, Dietmar Jannach
University of Klagenfurt, Universitaetsstrasse 65-67, A-9020, Klagenfurt, Austria

Keywords: Digital nudging; Recommender systems; Survey; Decision making

Abstract

Recommender systems are nowadays a pervasive part of our online user experience, where they either serve as information filters or provide us with suggestions for additionally relevant content. These systems thereby influence which information is easily accessible to us and thus affect our decision-making processes through the automated selection and ranking of the presented content. Automated recommendations can therefore be seen as digital nudges, because they determine different aspects of the choice architecture for users. In this work, we examine the relationship between digital nudging and recommender systems, topics that so far were mostly investigated in isolation. Through a systematic literature search, we first identified 87 nudging mechanisms, which we categorize in a novel taxonomy. A subsequent analysis then shows that only a small part of these nudging mechanisms was previously investigated in the context of recommender systems. This indicates that there is a huge potential to develop future recommender systems that leverage the power of digital nudging in order to influence the decision-making of users. In this work, we therefore outline potential ways of integrating nudging mechanisms into recommender systems.

1. Introduction

Automated recommendations are a common feature in today's online world. In online shops, we constantly receive recommendations regarding additional shopping items, media platforms like Spotify suggest new tracks or artists to us, and even the content we see on the news feed of social media sites is tailored to our preferences in a personalized way. Such recommendations are usually designed to help users find relevant items of interest and to avoid situations of choice overload for users. At the same time, these systems support organizational goals of the providers, such as increased sales or user engagement. The broad adoption and success of recommendation technology in practice provides ample evidence that modern recommender systems are effective and can have a significant impact on the choices and decisions that consumers make. As a result, these systems can create substantial business value (Adomavicius et al., 2018; Gomez-Uribe & Hunt, 2015; Jannach & Jugovac, 2019; Lee & Hosanagar, 2019).

Viewed from the individual decision-making perspective, recommendations can be seen as what is called a nudge in behavioral economics (Leonard et al., 2008). The concept of nudging refers to tools that help people make better decisions "without forcing certain outcomes upon anyone" (Thaler et al., 2012, pp. 428–439). Originally, nudges were investigated in offline decision scenarios, mostly targeting decisions related to personal health or wealth. One underlying idea of nudges is that there are certain psychological phenomena that have an influence on how people make decisions. Knowing such phenomena, a choice architect can then construct a choice environment that subtly guides people to what is considered a decision that is beneficial for them. A typical example in the literature refers to the arrangement of food in a school cafeteria. In this scenario, the goal might be to find an order that leads to the effect that students eat more healthy food.

Soon after nudging became popular, different authors used the term digital nudging (Mirsch et al., 2017; Schneider et al., 2018; Weinmann et al., 2016) when they transferred the concept to the online world. In a common definition, digital nudging is related to user interface (UI) elements of software applications that affect the choices of users (Weinmann et al., 2016). Again, the usual goal is to help users make decisions that are better for them, i.e., guide them to a predefined, desired option without restricting the choice space.

* Corresponding author.
E-mail addresses: mathias.jesse@aau.at (M. Jesse), dietmar.jannach@aau.at (D. Jannach).

https://doi.org/10.1016/j.chbr.2020.100052
Received 6 November 2020; Received in revised form 23 December 2020; Accepted 23 December 2020
Available online 13 January 2021
2451-9588/© 2020 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

A recent review of digital nudging techniques in Human Computer Interaction (HCI) research can be found in (Caraban et al., 2019). Interestingly, in this work, recommender systems were not considered as means to nudge users. This is to some extent surprising because a typical goal of providing recommendations is to help users make better decisions in the first place. Moreover, the presentation of selected recommendations, e.g., in the form of an ordered list, can be seen as a nudge, as it implements, at least implicitly, different mechanisms that were identified in the literature. For example, almost all recommender systems in practice only present a specific subset of the available choices, which can be interpreted as a hiding nudge according to the categorization of Caraban et al. (2019). Likewise, the particular ranking of the choices corresponds to a positioning nudge. When considering alternative categorizations, the provision of recommendations could be seen as a nudge that is based on increasing the salience of certain options (Johnson et al., 2012) or as an approach that increases the ease and convenience of choosing by making some options more visible and accessible (Sunstein, 2014).

However, there are also opportunities to combine personalized recommendations with "explicit" digital nudges, an area which is largely unexplored so far in the research literature. The use of such nudges can be particularly helpful when there is more than one goal that one would like to achieve with the recommendations. Remember that the main goal of most recommender systems is to point users to choices (items) they will probably like. However, an additional goal might be to point users to items that the system thinks the users should explore.

For example, a recommender system for food recipes might have learned that a user likes certain types of unhealthy food. In such a situation, a "nudging-enhanced" recommender might first of all truthfully recommend items that are similar to those that the user liked in the past. At the same time, however, it might make use of a visible nudge to highlight individual options in the recommendation list that are healthier than the top-ranked item(s), which are probably unhealthy. Similar considerations can be made in other domains, e.g., in music recommendation. Here, a recommender might predominantly recommend tracks, genres, and artists the user liked in the past. At the same time, it might however highlight those elements that supposedly help the users to broaden their musical interests.¹
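To make this idea more concrete, the following minimal sketch illustrates one possible way to realize such a nudging-enhanced list for the recipe scenario above. It is only an assumption about how this could be implemented: the Recipe fields (a relevance score from an underlying recommender and a health_score attribute), the threshold, and the function names are hypothetical and not taken from any of the systems surveyed in this paper.

```python
from dataclasses import dataclass

@dataclass
class Recipe:
    title: str
    relevance: float     # predicted preference score from the underlying recommender
    health_score: float  # hypothetical healthiness attribute, higher = healthier

def nudge_enhanced_list(recommendations, min_health_gain=0.2):
    """Keep the accuracy-based ranking, but flag one healthier alternative
    for visual highlighting (a salience nudge) instead of re-ordering."""
    ranked = sorted(recommendations, key=lambda r: r.relevance, reverse=True)
    top = ranked[0]
    # Pick the most relevant item that is clearly healthier than the top-ranked one.
    healthier = [r for r in ranked[1:] if r.health_score >= top.health_score + min_health_gain]
    highlighted = healthier[0] if healthier else None
    # Return the unchanged ranking plus a highlight flag per item.
    return [(r, r is highlighted) for r in ranked]

if __name__ == "__main__":
    items = [Recipe("Bacon cheeseburger", 0.92, 0.2),
             Recipe("Grilled vegetable bowl", 0.81, 0.9),
             Recipe("Creamy pasta", 0.78, 0.4)]
    for recipe, highlight in nudge_enhanced_list(items):
        print(f"{recipe.title:25} {'<- highlighted as healthier choice' if highlight else ''}")
```

Note that the underlying ranking is left untouched; only the presentation changes, which is what keeps the intervention a nudge rather than a restriction of choice.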
Generally, these relationships between recommender systems and digital nudging have not been explored in depth so far. With this work, we aim to close this research gap. The contributions and the structure of this paper are as follows.

• We conduct a systematic review to collect a set of nudging mechanisms (i.e., practical ways how to nudge) that were previously identified in the research literature.
• We organize the identified 87 nudging mechanisms into a novel taxonomy, and we describe the underlying psychological phenomena that are related to the nudging mechanisms.
• We provide a survey of research works in which nudging mechanisms were implemented in the context of recommender systems.
• We finally highlight future perspectives of combining digital nudging approaches with recommender systems.

The paper is organized as follows. Next, in Section 2, we introduce the terminology used in the paper and provide details about the methodology used in the literature review. Afterwards, in Section 3, we present our extended taxonomy of nudging mechanisms; the catalog of individual mechanisms is listed in Section 3.2. The underlying psychological phenomena of the mechanisms are briefly discussed in Section 4. In Section 5, we finally review which mechanisms are typically present in recommender systems, and we then discuss future opportunities to add nudges to such systems.

2. Concepts, terminology, and methodology

2.1. Digital nudging

The concept of nudging as a mechanism of behavioral economics was popularized by a book by Thaler and Sunstein from 2008 (Leonard et al., 2008). It is based on two principles, choice architecture and libertarian paternalism. A choice architecture is "the environment in which people make decisions" (Thaler et al., 2012, pp. 428–439). The characteristics (or: features) of such an architecture that have an influence on which decisions people make are called nudges. In a software application, setting a default option for a choice would be an example of a nudge. It is important to note here that, according to Thaler and Sunstein, nudges must not force people to make a certain choice. Setting default options is therefore admissible, because users are free to make a different choice. The concept of libertarian paternalism finally expresses that a nudge should be aimed at helping people make better decisions than they probably would if the nudge were not there. Nudging, by this definition, is therefore always done to the benefit of people.

Digital nudging is defined by Weinmann et al. as "the use of user-interface design elements to guide people's behavior in digital choice environments" (Schneider et al., 2018). Mirsch et al. adopt this definition and furthermore consider digital nudges as "relatively minor changes to decision environments". In the online world, this decision environment is commonly represented by the user interface of an application. Meske and Potthoff (2017) also based their definition on that of Weinmann et al. but extend it to also consider the aspect of free decision-making. They furthermore refrain from limiting nudging to changes in the UI, but also consider choices regarding the form and content of information as potential nudges. They define digital nudging as a "subtle form of using design, information and interaction elements to guide user behavior in digital environments, without restricting the individual's freedom of choice."

2.2. Ethical considerations and persuasion

An important observation by Weinmann et al. is that, in practice, digital nudges are not always used for the benefit of the users. They give the common example of airline websites that use nudging to sell non-essential options, which can be seen as a form of manipulation.² Similarly, Schneider et al. observe that nudging is frequently used by marketers to sell more products and give everyday examples (Schneider et al., 2018). The use of digital nudging therefore soon leads to ethical questions. A general problem here is also that there is no entirely neutral way of presenting options in a user interface (Schneider et al., 2018), because designers always have to choose a certain visual representation that results in a certain order of the options.

Comparable ethical questions also arise in the context of recommender systems. In the best case, recommender systems create value both for consumers and providers, and eventually also for additional stakeholders (Abdollahpouri et al., 2020; Jannach & Adomavicius, 2016). Academic research mostly focuses on the consumer perspective, e.g., by trying to develop algorithms and systems that help users find the most relevant items for them. One underlying assumption is that this, at least in the long term, is also beneficial for the recommendation provider, e.g., because the consumers develop trust over time. In reality, however, recommendation providers may consider not only the consumer value, but also try to maximize the organizational value, e.g., by building price- and profit-aware systems that balance the potentially competing goals (Jannach & Adomavicius, 2017). Furthermore, some recommender systems might entirely filter out some options for users in a personalized way, e.g., because they are not considered relevant for them.

¹ Today, some e-commerce sites like Amazon use visual highlighting of individual items in their recommendation lists, e.g., to mark items as being currently discounted or being "Amazon's Choice". In such a context, it is however not always clear if the visual highlighting represents a nudge according to the original definition, i.e., if the highlighted option is the most beneficial one for the user.
² In a strict interpretation of Thaler and Sunstein's definition, such mechanisms would therefore not be called nudges.

In such situations, the principle of freedom of choice in Meske and Potthoff's definition would be violated.

In the context of our present work, which is on the relation of digital nudging and recommender systems, we however do not primarily focus on ethical questions of digital nudging. In general, choice architects are advised to consider the implications of potentially unethical nudges (Schneider et al., 2018; Sunstein, 2015). According to Meske and Potthoff, the discussion regarding ethical considerations has however not yet reached a consensus (Meske & Potthoff, 2017).

The consideration of ethical questions is sometimes regarded as a difference between nudging and the concept of persuasion. Karlsen and Andersen (2019) contrast nudging with other types of behavioral interventions, including persuasion. When considering recommendations as nudges, Karlsen and Andersen consider the removal of options as an aspect that is more common (and allowed) in persuasive approaches. However, note that according to a common definition of persuasive technology (Fogg, 2002), the use of coercion or deception is not acceptable.

A discussion of commonalities and differences between nudging and persuasion that goes beyond ethical aspects can be found in Meske and Potthoff (2017). In the research literature, recommender systems are often considered as persuasive technology, see e.g., Yoo et al. (2013). In the context of our work, we however recognize that it can be difficult to always draw a clear line between nudging and persuasion. In the end, while the concepts were developed in different disciplines, they share the same goal and aim at influencing the behavior and decisions of people (Meske & Potthoff, 2017). In our present work, we therefore do not adopt a dogmatic approach, but accept that the concepts of nudging and persuasion are overlapping. We therefore may also include nudges in our survey that some might call "persuasive cues" elsewhere (Yoo et al., 2013).

2.3. Methodology – literature search

To identify relevant research literature, we first conducted a systematic search on major electronic libraries. The aim was to find papers that were concerned with the application of nudges in the field of recommender systems. Based on the initial findings, citation snowballing was conducted to find relevant and related works. Specifically, we issued queries of the form ("Nudge" OR "Nudging" OR "Digital Nudge" OR "Digital Nudging") AND ("Recommender" OR "Recommender Systems") in the following libraries: Web of Science, SpringerLink, IEEE Xplore, Directory of Open Access Journals, Emerald Insight, ERIC, MEDLINE/PubMed, Sage Journals, and INFORMS Journals. We then manually scanned the titles and abstracts of the returned works to determine if they were relevant for our survey. Papers were deemed relevant when (a) they were published after the year 2008,³ (b) they described a nudge, and (c) they linked it to recommendations in some way. With the help of snowballing, we subsequently identified additional works on digital nudging which were not necessarily focusing on recommender systems. This entire procedure resulted in 111 papers. In the end, after removing duplicates and checking for relevancy, we were left with 63 papers that we further considered in our survey.
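Purely as an illustration of the screening step, the three relevance criteria can be written down as a simple predicate over paper metadata. The Paper structure, field names, and keyword lists below are hypothetical; they sketch the logic of criteria (a) to (c) rather than the exact procedure that was used.

```python
from dataclasses import dataclass

@dataclass
class Paper:
    title: str
    abstract: str
    year: int

NUDGE_TERMS = ("nudge", "nudging", "digital nudge", "digital nudging")
RECSYS_TERMS = ("recommender", "recommender system", "recommendation")

def passes_screening(paper: Paper) -> bool:
    """Apply the three relevance criteria: (a) published after 2008,
    (b) describes a nudge, (c) links it to recommendations in some way."""
    text = f"{paper.title} {paper.abstract}".lower()
    published_after_2008 = paper.year > 2008
    describes_nudge = any(term in text for term in NUDGE_TERMS)
    linked_to_recommendations = any(term in text for term in RECSYS_TERMS)
    return published_after_2008 and describes_nudge and linked_to_recommendations
```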
3. Taxonomy and catalog of nudging mechanisms

In this section, we present the taxonomy which we used to categorize the different nudging mechanisms that we found in the literature. The developed taxonomy builds upon and refines previous categorizations found, e.g., in Caraban et al. (2019); Münscher et al. (2016); Hummel and Maedche (2019); Dolan et al. (2010). We will discuss the rationale for our adaptations below.

In this context, note that the literature that proposes such taxonomies is not always consistent in terms of terminology. Some authors for example use the term choice architecture tool (Johnson et al., 2012; Thaler et al., 2012, pp. 428–439), choice architecture technique, or intervention technique (Münscher et al., 2016) when they refer to different ways of nudging, e.g., by setting a default. Others use the terms nudging mechanism (Caraban et al., 2019), nudging technique (Djurica & Figl, 2017), nudge principle (Schneider et al., 2018), nudging element (Meske & Potthoff, 2017) or simply (digital) nudge (Schär & Stanoevska-Slabeva, 2019) when referring to more or less the same thing. In an earlier work (Dolan et al., 2010) the term influences on behavior is used. Furthermore, note that in existing research works we find categorizations that mix nudging mechanisms like setting a default with psychological phenomena like the anchoring or the framing effect, e.g., (Meske & Potthoff, 2017).

In our work, we will consistently use the term nudging mechanism in the remainder of the paper, thereby adopting the terminology of Caraban et al. (2019) and others. In addition, we will differentiate between the mechanisms and the underlying psychological effect. We provide a novel mapping of nudging mechanisms and psychological phenomena in Section 4.

3.1. Taxonomy of nudging mechanisms

Fig. 1 shows the different categories of our taxonomy of nudging mechanisms. At the highest level, we identified four main categories of nudging mechanisms: Decision Information, Decision Structure, Decision Assistance, and Social Decision Appeal.

The first three categories were proposed previously in Münscher et al. (2016). Based on our catalog of nudging mechanisms (see Section 3.2), we added a fourth category named social decision appeal to better reflect the characteristics of the mechanisms in our taxonomy.⁴

• Decision Information: This category of nudging mechanisms is based on changing the information that is shown to the decision maker without changing the options themselves. Such changes can, for example, be based on increasing the salience of the provided information or on reframing the description of the options.
• Decision Structure: By altering the structure of a decision, a choice architect can steer the behavior of decision makers. Nudging mechanisms in this category focus on the arrangement of options. Examples for this are setting defaults, changing the effort, or ordering the options.
• Decision Assistance: As the name suggests, this category encompasses mechanisms that support decision makers in accomplishing their goals. Choice architects can, for example, do this by reminding users of preferred options, or by helping users carry out plans that they previously mentioned.
• Social Decision Appeal: Similar to the nudging mechanisms mentioned in the category of decision information, the ones in this category are also based on changing the information that is presented to users. What sets these two apart is that social decision appeal focuses on the emotional and social implications of the change. The underlying assumption is that decision makers are inclined towards what other individuals did in the same situation and are influenced by their decisions.

At the second level of the taxonomy, we suggest 13 subcategories based on the analyzed papers. Nine of the categories are based on the proposal by Münscher et al. (2016). Again, we suggest additional categories, as this allows for a fine-grained classification. These additional subcategories are Increase Salience and Increase Reputation of Messenger, inspired by Dolan et al. (2010), Use Insights from Human Behavior, to cover mechanisms that are directly related to a single psychological phenomenon, and Instigate Empathy, inspired by Caraban et al. (2019); Sunstein (2016).

³ The term "nudge" was popularized in the year 2008.
⁴ The categorization in Hummel and Maedche (2019) is based on only two higher-level categories, Structuring the choice task and Describe choice options.

[Figure 1: tree of nudging mechanisms with the four main categories, (1) Decision Information, (2) Decision Structure, (3) Decision Assistance, and (4) Social Decision Appeal, each broken down into its second-level subcategories.]
Fig. 1. Taxonomy of nudging mechanisms.

Overall, with the proposed taxonomy our goal is to avoid certain limitations of previous proposals. In particular, existing taxonomies sometimes use a comparably coarse structure, which makes it difficult to classify the variety of nudging mechanisms that we found through our literature survey in an appropriate manner. For instance, we found it challenging to classify social or emotion-based mechanisms (e.g., moral suasion) in existing taxonomies in a way that allows us to consider the finer details of the different types of nudging mechanisms.

3.2. Catalog of nudging mechanisms

Next, we present our detailed catalog of nudging mechanisms. We organize the discussion according to the categories of the presented taxonomy. An overview of each category is given in Figs. 2–5. Details for each nudging mechanism are provided in Tables 6–9 in the Appendix.

• Decision Information: About half of the identified nudging mechanisms fall into this category, which is based on providing or emphasizing certain types of information to guide the decision maker.
• Decision Structure: In this category, we found 23 nudging mechanisms in four subcategories.
• Decision Assistance: We categorized eight nudging mechanisms under two subcategories of Decision Assistance.
• Social Decision Appeal: Finally, there are 11 nudging mechanisms in the new category labeled Social Decision Appeal, grouped into three subcategories.

[Figure 2: the nudging mechanisms in the category Decision Information, grouped into the subcategories translate information, increase salience of information, make information visible, and phrasing of information; the individual mechanisms are described in Table 6 in the Appendix.]

Fig. 2. Overview of nudging mechanisms in the category Decision Information.

[Figure 3: the nudging mechanisms in the category Decision Structure, grouped into the subcategories change range or composition of options, change choice defaults, change option-related effort, and change consequences.]

Fig. 3. Overview of nudging mechanisms in the category Decision Structure.

[Figure 4: the nudging mechanisms in the category Decision Assistance, grouped into the subcategories provide reminders and facilitate commitment. Figure 5: the nudging mechanisms in the category Social Decision Appeal, grouped into the subcategories provide social reference point, increase reputation of messenger, and instigate empathy.]

Fig. 4. Overview of nudging mechanisms in the category Decision Assistance.
Fig. 5. Overview of nudging mechanisms in the category Social Decision Appeal.

4. Underlying psychological phenomena

The principle of nudging, as proposed by Leonard et al. (2008), is based on the existence of a number of psychological phenomena that can have an effect on how we make decisions. In the original work on nudging, a number of decision heuristics and biases are provided as examples, including the anchoring effect or the availability or similarity heuristic, as proposed in Tversky and Kahneman (1974). Our literature review revealed that researchers refer to a variety of such underlying phenomena when characterizing the principles of nudging. In our literature review, we identified 58 psychological phenomena that were at least mentioned in the papers. We list them in Table 1.

However, when individual nudging mechanisms are described in the literature, it is often not stated explicitly upon which theoretical considerations these mechanisms are assumed to rely. Table 2 shows those cases where the underlying phenomena were explicitly mentioned in the literature. As can be seen from the table, one fourth (22 of 87) of the nudging mechanisms in the literature are explicitly based on underlying decision-related phenomena. Our review therefore points to an open research gap, as in many cases nudging mechanisms may rather be defined based on researcher experience than on underlying psychological theories or known phenomena in the context of human decision-making. However, to make nudges as effective as possible, it is important to understand what "makes them work" based on a solid understanding of the underlying theories.

As a side note, when reviewing the list of psychological effects in the literature, it became evident that these effects are sometimes used interchangeably with a corresponding nudging mechanism. Examples are mechanisms from the literature like anchoring and adjustment or framing, which directly refer to underlying psychological phenomena. As a result, we face the problem that sometimes nudging mechanisms are proposed without strong backing theory or empirical evidence. Furthermore, the used terminology can also lead to confusion.

Table 1. Overview of underlying psychological phenomena.
• Accent fallacy (Gena et al., 2019)
• Affect heuristic (Caraban et al., 2019)
• Ambiguity aversion (Theocharous et al., 2019)
• Anchoring & adjustment (Felfernig et al., 2008; Leitner, 2015; Meske & Potthoff, 2017; Mirsch et al., 2017, 2018a, 2018b; Schneider et al., 2018; Theocharous et al., 2019; Tversky & Kahneman, 1974; Weinmann et al., 2016; Zhang et al., 2015)
• Appeal to the majority (Argumentum Ad Populum) (Gena et al., 2019)
• Attentional bias (Mirsch et al., 2017)
• Authority bias (Gena et al., 2019)
• Availability heuristic (Caraban et al., 2019; Mirsch et al., 2017; Schneider et al., 2018; Thaler, 2018; Tversky & Kahneman, 1974; Weinmann et al., 2016)
• Choice aversion (Münscher et al., 2016)
• Choice supportive bias (Theocharous et al., 2019)
• Commitment bias (Caraban et al., 2019)
• Confirmation bias (Caraban et al., 2019; Loewenstein & Chater, 2017; Theocharous et al., 2019)
• Conjunction fallacy (Wang et al., 2019)
• Contrast effect (Theocharous et al., 2019)
• Correspondence bias (Wang et al., 2019)
• Decision fatigue (Theocharous et al., 2019)
• Decision inertia (Jung et al., 2018)
• Decoupling (Mirsch et al., 2017, 2018a)
• Decoy effect (asymmetric dominance) (Caraban et al., 2019; Felfernig et al., 2008; Lee et al., 2011; Leitner, 2015; Schneider et al., 2018; Theocharous et al., 2019)
• Default effect (Johnson et al., 2012; Loewenstein et al., 2015; Meske & Potthoff, 2017; Weinmann et al., 2016)
• Denomination effect (Münscher et al., 2016)
• Distinction bias (Theocharous et al., 2019)
• Diversification bias (Münscher et al., 2016)
• Endowment effect (Loewenstein et al., 2015; Mirsch et al., 2017; Thaler, 2018)
• Framing (Felfernig et al., 2008; Leitner, 2015; Meske & Potthoff, 2017; Miller, 2019; Mirsch et al., 2017; Theocharous et al., 2019)
• Gambler's fallacy (Theocharous et al., 2019)
• Group Ad Populum (Gena et al., 2019)
• Halo effect (Gena et al., 2019)
• Herd instinct bias (Caraban et al., 2019)
• Hindsight bias (Thaler, 2018)
• Hyperbolic discounting (Mirsch et al., 2017; Theocharous et al., 2019; Johnson et al., 2012; Thaler, 2018; Mirsch, Jung, et al., 2018a, 2018b; Lee et al., 2011)
• Image motivation (Mirsch et al., 2017)
• Information bias (Johnson et al., 2012)
• Intertemporal choice (Mirsch et al., 2017)
• Loss aversion (Gena et al., 2019; Miller, 2019; Mirsch et al., 2017, 2018a, 2018b; Thaler, 2018; Theocharous et al., 2019)
• Mental accounting (Mirsch et al., 2017; Münscher et al., 2016; Thaler, 2018)
• Mere exposure effect (Caraban et al., 2019)
• Messenger effect (Mirsch et al., 2017)
• Middle-option bias (Schneider et al., 2018)
• Optimism and overconfidence bias (Mirsch et al., 2017; Sunstein, 2016)
• Peak-end rule (Caraban et al., 2019)
• Placebo effect (Caraban et al., 2019)
• Present bias (Loewenstein & Chater, 2017; Sunstein, 2016)
• Primacy and recency effect (Schneider et al., 2018; Leitner, 2015; Felfernig et al., 2007, 2008)
• Priming (Bauer & Schedl, 2017; Mirsch et al., 2017)
• Procrastination (Sunstein, 2016)
• Reciprocity bias (Caraban et al., 2019)
• Regression to the mean (Kahneman, 2012)
• Representativeness and stereotypes (Mirsch et al., 2017; Schneider et al., 2018; Tversky & Kahneman, 1974; Weinmann et al., 2016)
• Risk aversion (Theocharous et al., 2019)
• Salience bias (Caraban et al., 2019; Johnson et al., 2012)
• Scarcity bias (Caraban et al., 2019; Schneider et al., 2018)
• Selective perception (Theocharous et al., 2019)
• Simulation heuristic (Wang et al., 2019)
• Social desirability bias (Schneider et al., 2018; Mirsch et al., 2017, 2018a, 2018b)
• Spotlight effect (Caraban et al., 2019; Mirsch et al., 2017)
• Status quo bias (Mirsch et al., 2017, 2018a, 2018b; Caraban et al., 2019; Jung et al., 2018; Lee et al., 2011; Schneider et al., 2018)
• Sunk-cost fallacy (Mirsch et al., 2017; Thaler, 2018)

Table 2. Explicit statements in the literature about underlying psychological phenomena (nudging mechanism: psychological effect).
• Anchoring and adjustment: Anchoring and adjustment (Jung et al., 2018)
• Argentum Ad Populum: Argentum Ad Populum (Gena et al., 2019)
• Availability: Availability heuristic (Leonard et al., 2008)
• Biasing the memory of experiences: Peak-end rule (Caraban et al., 2019)
• Deceptive visualizations: Salience bias (Caraban et al., 2019)
• Framing: Framing (Meske & Potthoff, 2017)
• Group Ad Populum: Group Ad Populum (Gena et al., 2019)
• Increase salience: Salience bias (Caraban et al., 2019)
• Invoking feelings of reciprocity: Reciprocity bias (Caraban et al., 2019)
• Limited time window: Scarcity effect (Johnson et al., 2012)
• Loss aversion: Loss aversion (Mirsch, Lehrer, & Jung, 2018)
• Make resources scarce: Scarcity effect (Caraban et al., 2019)
• Optimism and overconfidence: Optimism and overconfidence bias (Sunstein, 2016)
• Partition of options/categories: Mental accounting, diversification bias and denomination effect (Münscher et al., 2016)
• Placebos: Placebo effect (Caraban et al., 2019)
• Priming: Mere exposure effect (Caraban et al., 2019)
• Providing an explanation: Information bias (Johnson et al., 2012)
• Referring to opinion leader: Herd instinct bias (Caraban et al., 2019)
• Representativeness: Representativeness and stereotypes (Leonard et al., 2008)
• Setting defaults (status quo bias): Status quo bias, loss aversion, decision inertia (Johnson et al., 2012; Meske & Potthoff, 2017; Weinmann et al., 2016)
• Social influence (comparison): Herd instinct bias (Caraban et al., 2019)
• Spotlight effect: Spotlight effect (Caraban et al., 2019)
• Support public commitment: Commitment bias (Caraban et al., 2019)

In order to understand how nudging mechanisms work in recommender systems, it is necessary to understand their underlying effects. Each mechanism can be based on one or more psychological effects. The implementation of mechanisms can vary and thus create a unique set of interactions between their effects. Furthermore, demographic characteristics can alter the perception. For example, the work of Esposito et al. (2017) found that the effects of age are directly related to nudges. In their example, this meant that older participants were less likely to be affected by nudges.

To assess these inter-relationships between underlying psychological phenomena and the specific implementation of a nudging mechanism, the system needs to be designed and evaluated as a whole. Existing research suggests that the resulting effects cannot be directly connected to the mechanisms, as too many variables have to be observed at once (Dolan et al., 2012; Hummel & Maedche, 2019; Münscher et al., 2016). This can be attributed to the facts that (i) multiple ways exist of implementing the same nudging mechanism and (ii) each mechanism can leverage more than one underlying psychological phenomenon. Overall, further research is needed to understand the relationships between nudging mechanisms and underlying psychological phenomena.
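When designing or documenting nudges, it can be useful to keep such mechanism-to-phenomenon mappings in machine-readable form, e.g., to record which effect a planned nudge is assumed to rely on. The sketch below encodes a few of the explicit mappings from Table 2 as a simple lookup; the representation itself is an assumption made for illustration and is not part of the surveyed works.

```python
# A few mechanism -> underlying phenomena pairs, taken from Table 2.
MECHANISM_TO_PHENOMENA = {
    "setting defaults": ["status quo bias", "loss aversion", "decision inertia"],
    "limited time window": ["scarcity effect"],
    "priming": ["mere exposure effect"],
    "framing": ["framing"],
    "support public commitment": ["commitment bias"],
}

def underlying_phenomena(mechanism: str) -> list:
    """Return the documented psychological phenomena for a nudging mechanism,
    or an empty list if no explicit statement was found in the literature."""
    return MECHANISM_TO_PHENOMENA.get(mechanism.lower(), [])

print(underlying_phenomena("Setting defaults"))
# ['status quo bias', 'loss aversion', 'decision inertia']
```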

5. Nudging with recommender systems

In this section, we investigate the relationship between typical recommender systems and digital nudging in more depth. First, in Section 5.1, we argue that online recommendations can be viewed as digital nudges and why they implicitly implement more than one of the nudging mechanisms of our catalog. Afterwards, in Section 5.2, we report cases from the literature where recommendations were explicitly combined with digital nudging.

5.1. Recommendation viewed as nudging

Recommendations, as provided in the context of various online services, have proven effects on the behavior of consumers.⁵ As such, they can be seen as a form of behavioral intervention. In the following, we consider the case of a typical recommendation system as found on Amazon.com or at Netflix, i.e., a system that filters and ranks the available options for the consumers. Such a filtering and ranking approach inherently implements several principles of the nudging mechanisms of our catalog. We list them in Table 3.

⁵ See Jannach and Jugovac (2019) for an overview on the effectiveness of recommender systems in business environments.

Table 3. Nudging mechanisms inherent to recommender systems: change ease and convenience; change physical effort; customized information; following the herd; hiding; increase salience of attributes; increase salience of incentives; informing; ordering effects; partition of options and categories; simplification; social influence; structure complex choices; suggesting alternatives.

These inherent mechanisms manifest themselves as follows. Typical recommender systems aim to provide recommendations which are tailored towards the needs of each user (customized information) and therefore reduce the physical effort of searching and finding a fitting item (change physical effort). Another aim is to simplify the process of finding fitting items (simplification). Furthermore, they enable users to make more complex choices (structure complex choices). In a typical recommendation process, rarely only one recommendation is provided by the system. Based on the needs of the specific user, various options are presented for them to choose from (suggesting alternatives). Moreover, the recommendations are presented in a list or some other visual arrangement. This puts additional emphasis on certain items and hides unwanted options (ordering effects and hiding). Even more so, these are arranged in lists or on different pages, which alters their perception (partition of options and categories). Items provided by the system are usually accompanied by information like a title, tags, or a price (informing). One such attribute that helps users compare their behavior to that of others is the majority of user actions; usually this is expressed through what has been liked or purchased by others (following the herd and social influence). Each attribute and incentive can further be emphasized (increase salience of attributes and incentives), which helps increase the ease and convenience of using the recommender system (change ease and convenience).
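As an illustration, the following sketch shows how even a generic top-N pipeline already enacts several of the inherent mechanisms of Table 3. The score function, the catalog items, and their details and popularity fields are placeholders; this is a hedged sketch of a typical filter-and-rank step, not the implementation of any particular system. Explicit nudges, such as a pre-selected default or a warning badge, could be layered on top of the returned list (see Section 5.3).

```python
def recommend(user, catalog, score, n=10):
    """A generic top-N recommendation step, annotated with the nudging
    mechanisms from Table 3 that it implicitly implements."""
    # Personalized scoring: customized information / change physical effort
    scored = [(item, score(user, item)) for item in catalog]
    # Keeping only the n best items conceals all other options: hiding
    top_n = sorted(scored, key=lambda pair: pair[1], reverse=True)[:n]
    # The presentation order emphasizes the first positions: ordering effects
    return [
        {
            "item": item,
            # Title, price, or tags shown with the item: informing
            "details": getattr(item, "details", None),
            # "Liked by N users" style cues: following the herd / social influence
            "popularity": getattr(item, "popularity", None),
        }
        for item, _ in top_n
    ]
```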
5.2. Existing studies on nudging with recommender systems

Our systematic literature review surfaced 16 papers which reported on having implemented and analyzed nudges in recommender systems. Three additional papers proposed an implementation and described how, in the future, they would measure the effects (Brown, 2019; Jung et al., 2018; Kissmer et al., 2018). When excluding these three papers, 18 nudging mechanisms were studied in the literature, as shown in Table 4.

Table 4. Studies on nudging mechanisms in recommender systems (nudging mechanism: explored in).
• Argentum-Ad Populum: Gena et al. (2019)
• Decoy effect: Lee et al. (2011); Teppan and Felfernig (2009)
• Framing: Gena et al. (2019); Guers et al. (2013)
• Group-Ad Populum: Gena et al. (2019)
• Increase salience of attributes: Elsweiler et al. (2017)
• Increase salience of incentives: Turland et al. (2015)
• Informing: Esposito et al. (2017); Lee et al. (2011)
• Make external information visible: Cheng and Langbort (2016); Guers et al. (2013); Verma et al. (2018)
• Order effects: Kammerer and Gerjets (2014); Turland et al. (2015)
• Priming: Bauer and Schedl (2017)
• Providing feedback: Bothos et al. (2015)
• Remind of goal: Bothos et al. (2016)
• Setting defaults: Bothos et al. (2015, 2016); Bauer and Schedl (2017); Lee et al. (2011)
• Structure complex choice: Bothos et al. (2015)
• Suggesting alternatives: Forwood et al. (2015)
• Support public commitment: Brown (2019)
• Using visuals to increase salience: Elsweiler et al. (2017); Esposito et al. (2017); Gena et al. (2019); Karvonen et al. (2010)
• Warning: Esposito et al. (2017)

Comparing the list of all identified nudging mechanisms (87) and the ones implemented in recommender systems (18) reveals that the use of 69 nudging mechanisms proposed in the literature has not been explored yet.⁶ Only a handful of nudging mechanisms were analyzed in more than one implementation. As can be seen in Table 4, the most researched mechanisms are "Setting defaults" and "Using visuals to increase salience".

⁶ Note that in some works the authors did not explicitly use the term nudging when they explored the effects of a psychological phenomenon in the context of recommender systems, e.g., in Teppan and Felfernig (2009). We therefore cannot rule out that other works exist that investigate nudging mechanisms with recommender systems, as they were not surfaced through our literature search.

Looking closer at the works listed in Table 4, we can make the following observations regarding the applied research methodology.

• Application Domains: The papers analyzed in our study focused on nine different application domains. Among these domains, route planning, e-commerce, internet service, and social media were found more than once, with route planning and e-commerce being the most popular ones.
• Types of Experiments: Various experimental settings were explored to study the nudging mechanisms, including surveys, interviews, laboratory studies, or field tests. The latter two, laboratory studies and field tests, were most commonly used, sometimes paired with a pre-test before the study or a survey afterwards.
• Study Sizes: The study sizes in the analyzed papers range from eight (Verma et al., 2018) to around 20,933 users (Gena et al., 2019), the latter being a clear outlier. Without two outliers, the mean number of participants is at 223 and the median is 100.
• Study Duration: Only in a few papers is the overall duration of the studies documented (Bothos et al., 2015, 2016; Gena et al., 2019; Forwood et al., 2015; Esposito et al., 2017). In these works, the duration was either one month or two months.
• Dependent Variables: Eleven studies measured if the implemented mechanisms had any impact on the behavior of users. The remaining papers stated that nudging mechanisms were part of their setup, but did not investigate them systematically. One such paper focused on the usability of the recommender system (Karvonen et al., 2010). Others did not conduct a user study but focused on providing conceptual insights. Such works for example analyzed a mathematical model (Cheng & Langbort, 2016; Guers et al., 2013) or proposed a conceptual model (Bauer & Schedl, 2017; Knijnenburg, 2014).
• Effectiveness: The majority of the papers that included user studies, 8 out of 11, report that at least one nudging mechanism used in their implementation had an impact on the behavior of users. The remaining three works could not report significant effects of nudging (Bothos et al., 2016; Forwood et al., 2015; Lee et al., 2011). The observed effects or non-effects of the tested nudging mechanisms are summarized in Table 5.

Table 5. Studies on the effects of nudging mechanisms on user behavior in the context of recommender systems.

Positive effects:
• Bothos et al. (2015). Nudging goal: to nudge users towards following routes that are more environmentally friendly with the help of messages. Finding: suggestions were frequently accepted and left the users more satisfied.
• Elsweiler et al. (2017). Nudging goal: to make users choose healthier (low fat) recipes in contrast to other unhealthy recipes by using visual nudges. Finding: through the use of nudges, 62.2% of decisions resulted in healthier recipes being chosen.
• Esposito et al. (2017). Nudging goal: to reduce incompatible purchases by providing additional information and warnings on items. Finding: the basic warning nudge does not change the behavior of users, but it worked when phrased in an emotional way. Furthermore, while the use of logos did not decrease the incompatible purchases, providing additional information did.
• Gena et al. (2019). Nudging goal: to promote specific news articles and thereby capture the interest of users. Finding: Ad Populum and Group-Ad Populum nudges were not effective in the context of news recommendation. Negative framing, on the other hand, worked.
• Kammerer and Gerjets (2014). Nudging goal: to get people to click on the first element of a recommendation. Finding: ordering effects in a list had an impact on the chosen websites and the time spent on them. However, using a grid layout substantially reduced these effects.
• Teppan and Felfernig (2009). Nudging goal: to steer the user to select a certain option more often by using decoy options. Finding: decoy effects were observed and had an impact on the behavior of users.
• Turland et al. (2015). Nudging goal: to increase the use of more secure networks by recommending and highlighting more secure alternatives. Finding: nudges had a significant impact on the selection of networks. Increasing the salience of options with colors had a significant impact, whereas ordering effects had not. The combination of both created the strongest behavior changes.
• Verma et al. (2018). Nudging goal: to make users consume more relevant information by removing unsuitable content from their Twitter feeds. Finding: the reduction of noise in timelines of social media had a positive impact on users. Also, the use of such a nudge was perceived to be useful.

No effects:
• Bothos et al. (2016). Nudging goal: to nudge users towards following routes that are more environmentally friendly with the help of messages. Finding: no changes in the users' behavior could be detected or attributed to the presented messages.
• Forwood et al. (2015). Nudging goal: to decrease the energy density (calories per gram of food) of a purchase by providing alternative food choices. Finding: providing food alternatives did not lead to a significant change in the overall energy density of what participants bought.
• Lee et al. (2011). Nudging goal: to promote healthy snack choices by recommending suitable alternatives. Finding: the implemented decoy-based nudging mechanisms did not significantly change the snack choices of users.

Discussion. Our analysis indicates that the effects of nudging mechanisms were explored in a variety of domains, underlining their broad applicability in practice. Regarding the study sizes, the numbers are generally encouraging. A few studies are however based on very small samples, which makes it difficult to judge how reliable and generalizable the obtained results are. Study sizes are often mentioned as a limitation of the conducted studies (Bothos et al., 2016; Kammerer & Gerjets, 2014; Verma et al., 2018).

Regarding the durations of these studies, only two papers (Bothos et al., 2015, 2016), both authored by the same researchers, investigated long-term effects of nudges. All other research focused on the immediate effects of the nudging mechanisms. As a result, it remains difficult to say what the long-term effects of nudges in recommender systems really look like. As a consequence, more research is needed that concentrates on observing the changes in behavior over a longer time frame.

With regard to the effectiveness of nudging mechanisms in the context of recommender systems, the results are also encouraging, as in the majority of the studies a significant impact on user behavior could be observed. The non-success of some mechanisms however also indicates that their effectiveness can depend on specific circumstantial situations, the application domain, or the chosen way of implementing the nudge. The investigation of the effectiveness in additional domains was for example mentioned as a direction for future work in Kammerer and Gerjets (2014) and Forwood et al. (2015).

Finally, given the insights from Köcher et al. (2019) about multi-attribute anchoring effects, it might be advisable to not only measure if a nudging mechanism led to the selection of one particular item, i.e., the nudged one. The study in Köcher et al. (2019) showed that highlighting one of several possible options as a recommendation did not lead to the effect that this option was chosen more often. However, the recommendation had a significant effect on the choices of the users. In their studies, the recommendation of a particularly large backpack for example led to the effect that users on average selected a larger backpack from the set of available choices. Such indirect and inspirational effects of recommendations were also observed in Kamehkhosh et al. (2019) for the music domain.

5.3. Nudging mechanisms in recommender systems – a research agenda

Our study revealed that 69 nudging mechanisms were not yet explored in the context of recommender systems. We therefore analyzed which of these mechanisms could be embedded in or combined with a typical recommender system in an intuitive way. In the following, we sketch a few examples of such nudging opportunities. In most cases we assume application scenarios where the original ordering of the recommendations, as returned by an underlying algorithm, is maintained when the nudging mechanisms are applied. A corresponding strategy is applied, e.g., by Amazon, where certain items in the list have additional annotations such as "Topseller" or "Amazon's Choice". While such mechanisms are already used in practice, little is known about their effectiveness in the academic literature.

• Visuals to increase salience: The provision and the quality of images for a certain option (item) can significantly influence the user's choices, as demonstrated in recent visually-aware recommender systems (He & McAuley, 2016). Therefore, using certain imagery can help to make one option look more attractive than another one.
• Defaults: Recommendations are usually ordered according to the assumed utility or relevance for the users. Nonetheless, a pre-selection of a certain option, possibly paired with additional information, seems promising to explore, in particular given the known effectiveness of the default nudge.
• Warnings: In some domains, e.g., food recommendation, some choices are less healthy than others. Warnings that are, e.g., expressed whenever a user makes an unhealthy selection, might help to nudge users towards a healthier eating behavior.
• Micro-Incentives: This mechanism consists of providing an additional (financial) incentive for certain options, e.g., reduced delivery costs. It is important, however, that the incentive does not change the rational perception of the product itself.
• Just-in-time, situated suggestions: Commonly, recommendations are displayed when an online user accesses a certain service. There is, however, substantial unexplored potential in the use of proactive recommendation as well (Rook et al., 2020). Such proactive recommendations, e.g., in the form of push notifications, can be seen as nudging mechanisms by themselves. However, more research is required to understand questions related to the timing of these nudges, which can be central for their effectiveness. Factors that influence the timing can, e.g., include questions of the user's situational context or their current decision stage.
• Social influence: As shown by Huhn et al. (2018), there is a direct correlation between messages coming from subject matter experts or celebrities and the acceptance of this information. Thus, these experts can influence the attitude of other users in their intention to buy products. Consequently, this could increase the sales through recommender systems and needs to be further considered in research.
• Instigate empathy and messenger effects: Finally, in some situations the goal of applying nudging in recommender systems might not be to promote a particular choice, but to increase the effectiveness of a recommender as a nudging system as a whole. In these cases, the use of impersonated agents (avatars), as done previously, e.g., in Jannach (2004), could be one viable option to increase the effectiveness of the recommender.

In addition to this selection of yet unexplored nudges, we see further potential for future work to validate those nudges that were already explored previously, as discussed in Section 5.2, and to explore alternative implementations, also in different application settings. Moreover, there are some behavioral intervention mechanisms that were investigated in related fields, but not in the context of nudging. For example, in the work by Johnson et al. (2012), the system created a feeling of urgency or scarcity, which puts users in a situation in which they have to act quickly and make decisions that potentially overemphasize short-term rewards. Such approaches resemble the ideas of the nudging mechanisms limited time window, making resources scarce, and temptation. Such ideas are broadly used as persuasive tools in marketing and sales. In the future, these mechanisms could however also be explored in a positive way as nudges in the context of recommender systems.

On a more general level, note that providing suitable information (e.g., mapping, tutorials or simplification) can make choices easier for users in general. Furthermore, other nudging mechanisms (e.g., messenger effect or expecting error) might also help to increase the users' trust in the recommender system, thus potentially leading to a higher adoption rate of the provided recommendations and higher decision satisfaction.

There is also untapped potential in personalizing the nudging mechanisms, both their choice and their implementation, to individual users and their contextual situation. The implementation of such "smart nudges" however requires that additional information about the user's current situation is available to create the highest added value for the users (Karlsen & Andersen, 2019).

Finally, regarding the analyzed effects, it is worth noting that no paper investigated if the participants felt manipulated or coerced by the proposed nudge. Following the original definition of nudging, it is however advisable to always check for this aspect in order to avoid any unwanted manipulation of the users.

6. Conclusion

Both recommender systems and digital nudging mechanisms are proven means to influence the behavior of online users in a desired way. In this paper, we have elaborated on the relationship between these concepts, and we have emphasized that recommender systems inherently implement various nudging mechanisms. Moreover, we have identified a number of research gaps as well as future ways of incorporating additional nudging mechanisms into recommender systems to further increase their effectiveness.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Appendix A. List of Categorized Nudging Mechanisms

6.1. Decision Information Nudging Mechanisms

Table 6
Nudging Mechanisms in the Category Decision Information

Mechanism Description

Translate information
Decrease vagueness and ambiguity (Sunstein, 2016) By reducing ambiguity and making information easily understandable, it becomes
clearer what should be expressed and thus catches the attention of users more easily.
Explicitly mapping (Caraban et al., 2019; Johnson et al., 2012; Münscher et al., 2016) When users are presented with an option it is beneficial to explain the selection in regard
of its possible outcomes (e.g., costs and benefits).
Simplification (Alslaity & Tran, 2019; Hummel & Maedche, 2019; Meske & Potthoff, 2017; Reduce the cognitive effort which is necessary to understand certain information or tasks
Münscher et al., 2016; Sunstein, 2014, 2016) by making it shorter or phrasing it more simply.
Understanding mapping (Sunstein, 2016; Thaler et al., 2012, pp. 428–439; Weinmann et al., Explain information, which is difficult to understand, to concepts or information that are
2016) more familiar. Examples for this are an analogy or using visual aids.

Increase salience of information


Attracting/Reducing attention (Sunstein, 2016) A mechanism which tries to draw attention to certain options or information with the
use of highlighting.
Hiding information (Caraban et al., 2019) Make undesirable options or information harder to see.
Increase salience of attribute (Johnson et al., 2012; Sunstein, 2016) The act of making an attribute more salient (e.g., weight, price or color)
Increase salience of incentives (Johnson et al., 2012; Weinmann et al., 2016) Make incentives more salient or visible, so they are more effective and prominent.
Using visuals to deceit (Caraban et al., 2019) Through the use of optical illusions the perception and judgment of options is altered, so
they appear more salient than they actually are.

Using visuals to increase salience (Gena et al., 2019; Karvonen et al., 2010; Sunstein, 2016): The salience of information is increased with the use of visual effects (e.g., colors, pictures, signs or fonts).
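To illustrate how a recommender system could operationalize the salience mechanisms above, the following minimal Python sketch is provided. It is illustrative only and not taken from any of the surveyed systems; the function name, attributes, and example data are assumptions, and it assumes that lower values are better for the attributes used. The idea is to flag, for each recommended item, the attribute on which it compares most favorably to the category average so that the user interface can render that attribute more salient (e.g., with a bold font or a colored badge).

from typing import Dict

def attribute_to_highlight(item: Dict[str, float],
                           category_avg: Dict[str, float]) -> str:
    # Pick the attribute on which the item beats the category average by the
    # largest relative margin (lower values assumed better here); the UI can
    # then render that attribute more salient.
    margins = {attr: (category_avg[attr] - value) / category_avg[attr]
               for attr, value in item.items() if attr in category_avg}
    return max(margins, key=margins.get)

laptop = {"price": 799.0, "weight_kg": 1.9}
category_average = {"price": 950.0, "weight_kg": 1.8}
print(attribute_to_highlight(laptop, category_average))  # prints "price"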

Make information visible

Checklist (Sunstein, 2016): The use of checklists helps users to understand their progress and what further needs to be done for the decision to be completed.
Customized information (Johnson et al., 2012; Meske & Potthoff, 2017): Tailor the provided information to the needs of users and thus reduce cognitive overload.
Disclosure (Hummel & Maedche, 2019; Sunstein, 2014, 2016): Disclose information related and relevant to a certain option (e.g., the full costs of a credit card).
Give comparative information (Sunstein, 2016): Provide users with comparative information (i.e., information relative to some point of reference), which allows them to make better-informed decisions.
Informing (Johnson et al., 2012; Lee et al., 2011; Loewenstein & Chater, 2017; Meske & Potthoff, 2017): A mechanism which aims to simply provide information to the users.
Make external information visible (Cheng & Langbort, 2016; Guers et al., 2013; Münscher et al., 2016; Szaszi et al., 2018): Provide information that a third party has created (e.g., information about hygiene in a coffee shop).
Providing an explanation (Johnson et al., 2012): Provide additional information on the situation the users are in to make it more understandable.
Providing feedback (Caraban et al., 2019; Hummel & Maedche, 2019; Leonard et al., 2008; Thaler et al., 2012, pp. 428–439; Weinmann et al., 2016): Users are given feedback on how they are performing (e.g., doing tasks right or making mistakes).
Providing multiple viewpoints (Caraban et al., 2019): Provide an unbiased overview of a certain decision so that users can assess the problem at hand. For example, a smartphone can be viewed from different perspectives that focus on certain usages like entertainment, office-related tasks or ease of use.
Reduce the distance (Caraban et al., 2019): If a situation is too far in the future or uncertain to happen, users do not feel the urgency to act. Therefore, it is possible to present the users with similar situations to reduce the psychological distance (e.g., heavy rain sounds played in the background when acquiring a flood insurance).
Suggesting alternatives (Caraban et al., 2019; Forwood et al., 2015): Provide the users with alternatives that they might not have considered at this point (e.g., a cheaper camera with the same resolution); a short implementation sketch follows this group.
Visible goals (Castmo & Persson, 2018; Cheng & Langbort, 2016): Allow users to easily assess their progress and performance against a goal state (e.g., finishing a registration).
Warning (Hummel & Maedche, 2019; Loewenstein & Chater, 2017; Meske & Potthoff, 2017; Sunstein, 2014, 2016): Warn the users with the help of visuals or other means that emphasize the problem at hand.
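As an illustration of the suggesting alternatives mechanism, the following minimal Python sketch shows how a recommender could propose a cheaper item of comparable quality to the one a user is currently inspecting. The item structure, the rating-based quality proxy, and the threshold are assumptions for illustration and are not taken from any of the surveyed systems.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Item:
    name: str
    price: float
    rating: float  # proxy for quality, e.g., average user rating

def suggest_cheaper_alternative(current: Item, catalog: List[Item],
                                max_rating_drop: float = 0.2) -> Optional[Item]:
    # Return the cheapest catalog item that is cheaper than the inspected one
    # and whose rating is close to it; None if no such alternative exists.
    candidates = [i for i in catalog
                  if i.name != current.name
                  and i.price < current.price
                  and current.rating - i.rating <= max_rating_drop]
    return min(candidates, key=lambda i: i.price) if candidates else None

catalog = [Item("Camera A", 499.0, 4.6), Item("Camera B", 389.0, 4.5),
           Item("Camera C", 299.0, 3.9)]
print(suggest_cheaper_alternative(catalog[0], catalog))
# Item(name='Camera B', price=389.0, rating=4.5)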

Phrasing of information

Anchoring and adjusting (Jung et al., 2018; Leonard et al., 2008; Meske & Potthoff, 2017; Mirsch, Lehrer, & Jung, 2018; Schär & Stanoevska-Slabeva, 2019; Sunstein, 2016): Set an anchor in a decision to influence users' judgement of the whole decision. An anchor can be understood as a starting point from which users adjust their estimates of information provided later (e.g., showing the price of a product before a reduction).
Attentional collapse (Schär & Stanoevska-Slabeva, 2019): The perception of users is affected by selective factors of their attention. This means users are not always aware of every piece of information they are presented with.
Availability (Leonard et al., 2008; Schär & Stanoevska-Slabeva, 2019): Users assess the likelihood of a personal situation by how easy it is for them to remember similar situations (e.g., car accidents).
Biasing the memory of experiences (Caraban et al., 2019): By changing the ending of an event, the memory of that event can be altered (e.g., increasing the speed of a progress bar towards the end).
Decoy effect (Caraban et al., 2019; Lee et al., 2011; Teppan & Felfernig, 2009): This mechanism can be facilitated by adding an inferior alternative to a set of options to make the others seem higher in value; a short implementation sketch follows this table.
Endowment effect (Caraban et al., 2019; Schär & Stanoevska-Slabeva, 2019): Users are more likely to keep an item if they already own it than they are to acquire it if they do not own it.
Framing (Caraban et al., 2019; Gena et al., 2019; Guers et al., 2013; Jung et al., 2018; Knijnenburg, 2014; Loewenstein & Chater, 2017; Meske & Potthoff, 2017; Münscher et al., 2016; Schär & Stanoevska-Slabeva, 2019; Sunstein, 2016): It is possible to describe or frame a good option in a bad way and vice versa.
Hyperbolic discounting (Lee et al., 2011; Mirsch, Lehrer, & Jung, 2018; Schär & Stanoevska-Slabeva, 2019): Decoupling the purchase from the payment lowers the purchase barrier and shortens the decision-making process (e.g., buying products with a credit card).
Image motivation (Schär & Stanoevska-Slabeva, 2019): Favorable actions are attributed to ourselves, whereas things that take a bad turn are considered the fault of others.
Limited time window (Johnson et al., 2012; Meske & Potthoff, 2017; Zhang et al., 2015): When an option is presented as only being available for a certain amount of time, it is perceived as more important and scarce.
Loss aversion (Leonard et al., 2008; Mirsch, Lehrer, & Jung, 2018; Schär & Stanoevska-Slabeva, 2019; Sunstein, 2016): Describes the phenomenon that users weight losses higher than gains (e.g., the loss of €100 is felt more strongly than winning €100).
Make resources scarce (Caraban et al., 2019): Announcing limited availability of an option increases the probability of users committing to choosing it.
Mental accounting (Schär & Stanoevska-Slabeva, 2019): Users group payments in fictitious accounts (e.g., presents, vacation or stocks). This helps track incomes and expenses but also leads to misconceptions about how money is actually spent.
Optimism and overconfidence (Leonard et al., 2008; Schär & Stanoevska-Slabeva, 2019; Sunstein, 2016): Users tend to underestimate challenges, costs and timescales but overestimate the rewards.
Placebos (Caraban et al., 2019): The presentation of an element that has no effect on the individual's condition can change the behavior and well-being of a user due to its perceived effect (e.g., pills made of sugar).
Priming (Bauer & Schedl, 2017; Caraban et al., 2019; Leonard et al., 2008; Meske & Potthoff, 2017; Schär & Stanoevska-Slabeva, 2019; Sunstein, 2016): Users are influenced by a certain stimulus (e.g., the smell of pizza) and are then more inclined to behave predictably in a subsequent process (e.g., baking pizza for dinner).
Representativeness (Leonard et al., 2008; Schär & Stanoevska-Slabeva, 2019): When shown a member of a group, users expect the member to have certain characteristics without having actual information about it (e.g., stereotypes about a group of people).
Spotlight effect (Caraban et al., 2019; Leonard et al., 2008; Schär & Stanoevska-Slabeva, 2019): Users tend to overestimate the significance of their own actions.
Temptation (Leonard et al., 2008): Options that produce a quick reward are more appealing to users than those that do so over a long period of time.
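To make the decoy effect more concrete in a recommendation setting, the following minimal Python sketch selects, from a candidate pool, an option that is asymmetrically dominated by a target recommendation (worse or equal on every attribute and strictly worse on at least one), so that presenting it alongside the target makes the target look more attractive. The data, attribute names, and selection rule are illustrative assumptions and not taken from the surveyed papers.

from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class Option:
    name: str
    price: float    # lower is better
    quality: float  # higher is better

def pick_decoy(target: Option, pool: List[Option]) -> Optional[Option]:
    # An option qualifies as a decoy if the target dominates it on every
    # attribute; among those, prefer the one closest to the target so the
    # dominance is easy for users to perceive.
    decoys = [o for o in pool
              if o.price >= target.price and o.quality <= target.quality
              and (o.price > target.price or o.quality < target.quality)]
    return min(decoys, key=lambda o: (o.price - target.price)
               + (target.quality - o.quality)) if decoys else None

target = Option("Plan B", price=12.0, quality=8.0)
pool = [Option("Plan A", 8.0, 6.0), Option("Plan B'", 13.0, 7.5), Option("Plan C", 15.0, 9.0)]
print(pick_decoy(target, pool))  # Option(name="Plan B'", price=13.0, quality=7.5)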

6.2. Decision Structure Nudging Mechanisms

Table 7
Nudging Mechanisms in the Category Decision Structure

Change choice defaults

Automatic enrollment (Caraban et al., 2019; Sunstein, 2016): Automatically enroll users, so they have to take action to get out of a certain condition (e.g., being registered as an organ donor).
Enhancing or influencing active choosing (Sunstein, 2016): This mechanism combines the nudge of prompting the users to actively choose with another nudge (e.g., order effects or loss aversion) to influence the choice.
Prompted choice (Münscher et al., 2016; Sunstein, 2016): By making users actively choose an option, they are more invested in the decision and have to actively think about the options.
Setting defaults (Bauer & Schedl, 2017; Caraban et al., 2019; Hummel & Maedche, 2019; Johnson et al., 2012; Jung et al., 2018; Knijnenburg, 2014; Lee et al., 2011; Loewenstein & Chater, 2017; Loewenstein et al., 2015; Meske & Potthoff, 2017; Mirsch, Lehrer, & Jung, 2018; Münscher et al., 2016; Schär & Stanoevska-Slabeva, 2019; Sunstein, 2014, 2016; Szaszi et al., 2018; Thaler, 2018; Weinmann et al., 2016): When options are preselected in a decision, users are more inclined to stay with the status quo (e.g., an installation wizard); a short implementation sketch follows this group.
Simplifying active choosing (Sunstein, 2016): Making active choosing simpler increases the chances of users thinking about their own decision and the consequences entailed by it.
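The setting defaults mechanism can be illustrated with a minimal Python sketch in which the recommender's top-ranked option is pre-selected in a choice form, so that users who do not actively change the selection end up with it. The function name and option labels are hypothetical and only serve as an illustration under these assumptions.

from typing import Dict, List

def render_choice_form(recommendations: List[str]) -> List[Dict[str, object]]:
    # Build form options in which the top-ranked recommendation is
    # pre-selected; users who take no action stay with this default.
    return [{"label": item, "preselected": rank == 0}
            for rank, item in enumerate(recommendations)]

options = render_choice_form(["Eco tariff", "Standard tariff", "Premium tariff"])
for opt in options:
    marker = "(x)" if opt["preselected"] else "( )"
    print(marker, opt["label"])
# (x) Eco tariff
# ( ) Standard tariff
# ( ) Premium tariff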

Change option-related effort

Change ease and convenience (Sunstein, 2014, 2016): The main focus of this mechanism lies on the visibility and accessibility of options (e.g., make low-cost options or healthy foods visible).
Change financial effort (Hummel & Maedche, 2019; Münscher et al., 2016): Choice architects can intervene in the perception of financial effort. Examples include postponing costs to the future without changing the actual final costs.
Change physical effort (Hummel & Maedche, 2019; Münscher et al., 2016): This mechanism changes the physical effort necessary for certain options in order to make them more attractive or unattractive. For example, an apple could be positioned so that it is easier to grasp than less healthy alternatives.
Create friction (Caraban et al., 2019): Attempt to minimize intrusiveness while maintaining the capacity to change users' behavior (e.g., dropping the bike keys when the car key is selected).
Navigability of contexts (Sunstein, 2016): Make contexts or policies easily navigable for users (e.g., pointers and guides).
Reduce paperwork (Sunstein, 2016): Reduce the paperwork involved in completing a task.
Speed bumps (Sunstein, 2016): Provide literal or figurative speed bumps or cooling-off periods for users to slow down.
Throttle mindless activity (Caraban et al., 2019): Reduce the number of mindless actions users can take or give them ways of correcting their mistakes (e.g., a time buffer to reverse an action).

Change range or composition of options

Change scale (Johnson et al., 2012): By expanding the denominator, numerators appear larger, and this in turn makes the differences between alternatives seem larger (e.g., €1 and €2 versus 100 cents and 200 cents).
Decision staging (Johnson et al., 2012; Meske & Potthoff, 2017): Break down a complex decision into smaller ones that each focus on a small subset of attributes that can be compared more easily (e.g., quality of the camera, weight, and price); a short implementation sketch follows this group.
Order effects (Caraban et al., 2019; Sunstein, 2016): What users see first on a website or in a room makes a difference, as items perceived first are intuitively considered more important than others seen later.
Partition of options/categories (Johnson et al., 2012; Münscher et al., 2016): Options or categories can be arranged with their respective alternatives to create a tactical representation (e.g., segregating healthy options into more diverse categories).
Structure complex choices (Johnson et al., 2012; Sunstein, 2016; Weinmann et al., 2016): The aim of this mechanism is to make a complicated choice easier to decide by restructuring it into smaller, more manageable pieces. This could be done by guiding users through a process.
Structure of evaluation (Sunstein, 2016): The way users are presented with results can influence their behavior. Assessing a single item at a time, in comparison to assessing all items at once, will result in different decisions.
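Decision staging can be sketched in a few lines of Python: the candidate set is narrowed stage by stage, with each stage applying a filter on only one attribute, so that users never have to compare more than a small subset of attributes at once. The data, attribute names, and thresholds below are illustrative assumptions, not part of any surveyed system.

from typing import Callable, Dict, List

Item = Dict[str, float]

def stage_decision(items: List[Item],
                   stages: List[Callable[[Item], bool]]) -> List[List[Item]]:
    # Apply one simple attribute filter per stage and record the shrinking
    # candidate set after each stage.
    remaining, history = items, []
    for keep in stages:
        remaining = [i for i in remaining if keep(i)]
        history.append(remaining)
    return history

cameras = [
    {"id": 1, "megapixels": 24, "weight_g": 650, "price": 900},
    {"id": 2, "megapixels": 20, "weight_g": 450, "price": 700},
    {"id": 3, "megapixels": 24, "weight_g": 480, "price": 1100},
]
stages = [
    lambda i: i["megapixels"] >= 24,  # stage 1: image quality only
    lambda i: i["weight_g"] <= 500,   # stage 2: weight only
    lambda i: i["price"] <= 1200,     # stage 3: budget only
]
for step, left in enumerate(stage_decision(cameras, stages), start=1):
    print(f"after stage {step}: {[i['id'] for i in left]}")
# after stage 1: [1, 3]
# after stage 2: [3]
# after stage 3: [3]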

Change option consequences

Change social consequences (Münscher et al., 2016): Actions taken by users are linked to social outcomes (e.g., friends know about the decision/action).
Connect decision to benefit/cost (Münscher et al., 2016): Connecting small benefits to behavior that should be pursued, or small costs to behavior that should be discouraged, can change the occurrence of outcomes. These benefits or costs must be so small that they do not change the economic incentives of each option (e.g., an emotional incentive).
Micro-Incentives (Hummel & Maedche, 2019): Micro-incentives are changes to the consequences of an option that are insignificant from a rational-choice perspective (e.g., a free shopping bag that would usually cost three cents).


6.3. Decision Assistance Nudging Mechanisms

Table 8
Nudging Mechanisms in the Category Decision Assistance.

Provide reminders

Remind of goal (Brown, 2019; Caraban et al., 2019; Hummel & Maedche, 2019; Meske & Potthoff, 2017; Münscher et al., 2016; Sunstein, 2014, 2016; Szaszi et al., 2018): Tell the users what the result of finishing a process is and remind them of why they should finish it.
Remind of socially desirable concepts (Münscher et al., 2016): Influence users by stating the social expectation in a specific situation (e.g., the need to vote).
Situated (Brown, 2019): Make suggestions to the users where the practice is applicable and needed.
Tutorial (Brown, 2019): Show users how a certain procedure works, so they can do it on their own. This could go as far as providing help and further instructions.

Facilitate commitment

Apprehensive (Brown, 2019): Provide multiple locations where the same decision can be performed by the users (e.g., different routes lead to the same result).
Automated (Brown, 2019): Perform actions for the users and present the results without them having to do anything.
Eliciting implementation intentions (Hummel & Maedche, 2019; Sunstein, 2014, 2016): Users are more likely to continue an action if someone elicits their implementation intentions (e.g., "Do you plan to vote?").
Just-in-time (Brown, 2019; Caraban et al., 2019): Make recommendations when an action is appropriate (e.g., when the user is actively searching for a product, ask if they need assistance); a short implementation sketch follows this table.
Nature and consequences of own past choices (Sunstein, 2014, 2016): With existing historical data it is possible to explain to users why they took a certain course of action in the past and why they should do it again.
Precommitment strategy (Hummel & Maedche, 2019; Meske & Potthoff, 2017; Münscher et al., 2016; Sunstein, 2014, 2016): Let users decide on a certain course of action or let them define a goal to work towards.
Self-control strategies (Leonard et al., 2008): As the name suggests, the goal of this mechanism is to support users in fulfilling their plan without falling victim to their own weaknesses (e.g., heuristics or biases).
Support public commitment (Brown, 2019; Caraban et al., 2019; Münscher et al., 2016): Publicly committing to a behavior (working out) or to refraining from a behavior (smoking) creates external pressure. This leads to fewer people acting against their commitments.
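As an illustration of the just-in-time mechanism, the following minimal Python sketch triggers an assistance prompt only when recent user activity suggests an active search context. The event names, the threshold, and the prompt text are assumptions made for this sketch and do not stem from the surveyed systems.

from typing import List

def should_offer_assistance(recent_events: List[str],
                            min_searches: int = 3) -> bool:
    # Trigger the nudge only when the user appears to be actively searching
    # and has not just completed a purchase, so the prompt stays relevant.
    searches = sum(1 for e in recent_events if e == "search")
    return searches >= min_searches and recent_events[-1] != "purchase"

events = ["view", "search", "search", "view", "search"]
if should_offer_assistance(events):
    print("Need help finding the right product? Here are three suggestions.")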

6.4. Social Decision Appeal Nudging Mechanisms

Table 9
Nudging Mechanisms in the Category Social Decision Appeal.

Increase reputation of messenger

Expecting error (Johnson et al., 2012; Weinmann et al., 2016): Users make errors; being as forgiving of these errors as possible improves the reputation of the messenger.
Messenger effect (Schär & Stanoevska-Slabeva, 2019): Every piece of information is delivered by a messenger (e.g., a human, a computer, a recommender system). This messenger creates a certain picture in the mind of users and therefore affects the perception of the information.

Provide social reference point

Argumentum ad populum (Gena et al., 2019): Accepting a certain thesis based on the mere fact that the majority of people accept it.
Following the herd (Norms) (Leonard et al., 2008; Mirsch, Lehrer, & Jung, 2018; Schär & Stanoevska-Slabeva, 2019; Sunstein, 2016): As users do not want to stick out, they have a tendency to do what the majority of people do.
Group ad populum (Gena et al., 2019): Accepting a certain thesis based on the mere fact that a specific group of people accepts it.
Referring to opinion leader (Münscher et al., 2016): Provide a highly valued or respected messenger to recommend items in order to influence the opinions and behavior of users.
Social influence (Comparison) (Hummel & Maedche, 2019; Meske & Potthoff, 2017; Münscher et al., 2016; Sunstein, 2014): Allow users to compare their actions to those of other users; a short implementation sketch follows this table.

Instigate empathy

Instigate empathy with characters (Caraban et al., 2019): Empathy can be created in multiple ways, one of which is to make use of characters. This is done by showing an avatar or character that changes its state based on the behavior of a user.
Invoking feelings of reciprocity (Caraban et al., 2019): If something good happens to users, they try to reproduce this feeling towards the benefactor.
Moral suasion (Sunstein, 2016): Moral suasion is used to increase fun or to create a sense of responsibility which the users might act on.
Positive reinforcement (Brown, 2019): Commending users when they align with the expected behavior increases empathy and further motivates them.
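Finally, the social comparison mechanisms above can be sketched as a simple annotation step in which each recommended item is labeled with the share of users who chose it, providing a social reference point. The figures below are made up for illustration; in a deployed system such statistics would of course have to be truthful, and the function name and data structures are assumptions of this sketch.

from typing import Dict, List

def add_social_proof(recommendations: List[str],
                     choice_counts: Dict[str, int]) -> List[str]:
    # Annotate each recommended item with the share of users who chose it.
    total = sum(choice_counts.values()) or 1
    return [f"{item}: chosen by {100 * choice_counts.get(item, 0) // total}% of users"
            for item in recommendations]

counts = {"Eco tariff": 620, "Standard tariff": 300, "Premium tariff": 80}
for line in add_social_proof(["Eco tariff", "Standard tariff"], counts):
    print(line)
# Eco tariff: chosen by 62% of users
# Standard tariff: chosen by 30% of users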

References

Abdollahpouri, H., Adomavicius, G., Burke, R., Guy, I., Jannach, D., Kamishima, T., … Pizzato, L. (2020). Multistakeholder recommendation: Survey and research directions. User Modeling and User-Adapted Interaction, 30, 127–158. https://doi.org/10.1007/s11257-019-09256-1
Adomavicius, G., Bockstedt, J. C., Curley, S. P., & Zhang, J. (2018). Effects of online recommendations on consumers' willingness to pay. Information Systems Research, 29, 84–102. https://doi.org/10.1287/isre.2017.0703
Alslaity, A., & Tran, T. (2019). Towards persuasive recommender systems. In 2019 IEEE 2nd international conference on information and computer technologies (ICICT) (pp. 143–148). https://doi.org/10.1109/INFOCT.2019.8711416

Bauer, C., & Schedl, M. (2017). Introducing surprise and opposition by design in recommender systems. In Proceedings of the 25th conference on user modeling, adaptation and personalization (pp. 350–353). https://doi.org/10.1145/3099023.3099099
Bothos, E., Apostolou, D., & Mentzas, G. (2015). Recommender systems for nudging commuters towards eco-friendly decisions. Intelligent Decision Technologies, 9, 295–306. https://doi.org/10.3233/IDT-140223
Bothos, E., Apostolou, D., & Mentzas, G. (2016). A recommender for persuasive messages in route planning applications. In 2016 7th international conference on information, intelligence, systems and applications (IISA). https://doi.org/10.1109/IISA.2016.7785399
Brown, C. (2019). Digital nudges for encouraging developer actions. In IEEE/ACM 41st international conference on software engineering: Companion proceedings (ICSE-Companion) (pp. 202–205). https://doi.org/10.1109/ICSE-Companion.2019.00082
Caraban, A., Karapanos, E., Gonçalves, D., & Campos, P. (2019). 23 ways to nudge: A review of technology-mediated nudging in human-computer interaction. In Proceedings of the 2019 CHI conference on human factors in computing systems (CHI '19). https://doi.org/10.1145/3290605.3300733
Castmo, M., & Persson, R. (2018). The alliance of digital nudging & persuasive design: The complementary nature of the design strategies. Master thesis.
Cheng, Y., & Langbort, C. (2016). A model of informational nudging in transportation networks. In 2016 IEEE 55th conference on decision and control (CDC) (pp. 7598–7604). https://doi.org/10.1109/CDC.2016.7799443
Djurica, D., & Figl, K. (2017). The effect of digital nudging techniques on customers' product choice and attitudes towards e-commerce sites. In Proceedings of the 23rd americas conference on information systems (AMCIS).
Dolan, P., Hallsworth, M., Halpern, D., King, D., Metcalfe, R., & Vlaev, I. (2012). Influencing behaviour: The mindspace way. Journal of Economic Psychology, 33, 264–277. https://doi.org/10.1016/j.joep.2011.10.009
Dolan, P., Hallsworth, M., Halpern, D., King, D., & Vlaev, I. (2010). Mindspace: Influencing behaviour for public policy.
Elsweiler, D., Trattner, C., & Harvey, M. (2017). Exploiting food choice biases for healthier recipe recommendation. In Proceedings of the 40th international ACM SIGIR conference on research and development in information retrieval (pp. 575–584). https://doi.org/10.1145/3077136.3080826
Esposito, G., Hernández, P., van Bavel, R., & Vila, J. (2017). Nudging to prevent the purchase of incompatible digital products online: An experimental study. PloS One, 12. https://doi.org/10.1371/journal.pone.0173333
Felfernig, A., Friedrich, G., Gula, B., Hitz, M., Kruggel, T., Leitner, G., & Teppan, E. (2007). Persuasive recommendation: Serial position effects in knowledge-based recommender systems. In Persuasive technology (pp. 283–294). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-77006-0_34
Felfernig, A., Gula, B., Leitner, G., Maier, M., Melcher, R., & Teppan, E. (2008). Persuasion in knowledge-based recommendation. In Persuasive technology, lecture notes in computer science (pp. 71–82). https://doi.org/10.1007/978-3-540-68504-3_7
Fogg, B. (2002). Persuasive technology: Using computers to change what we think and do. Ubiquity, 2002. https://doi.org/10.1145/763955.763957
Forwood, S. E., Ahern, A. L., Marteau, T. M., & Jebb, S. A. (2015). Offering within-category food swaps to reduce energy density of food purchases: A study using an experimental online supermarket. International Journal of Behavioral Nutrition and Physical Activity, 12. https://doi.org/10.1186/s12966-015-0241-1
Gena, C., Grillo, P., Lieto, A., Mattutino, C., & Vernero, F. (2019). When personalization is not an option: An in-the-wild study on persuasive news recommendation. Information, 10. https://doi.org/10.3390/info10100300
Gomez-Uribe, C. A., & Hunt, N. (2015). The Netflix recommender system: Algorithms, business value, and innovation. ACM Transactions on Management Information Systems, 6. https://doi.org/10.1145/2843948
Guers, R., Langbort, C., & Work, D. (2013). On informational nudging and control of payoff-based learning. IFAC Proceedings Volumes, 46, 69–74. https://doi.org/10.3182/20130925-2-DE-4044.00037
He, R., & McAuley, J. (2016). VBPR: Visual Bayesian personalized ranking from implicit feedback. In Thirtieth AAAI conference on artificial intelligence (pp. 144–150). AAAI Press.
Huhn, R., Ferreira, J., Freitas, A., & Leo, F. (2018). The effects of social media opinion leaders' recommendations on followers' intention to buy. Review of Business Management, 20, 57–73. https://doi.org/10.7819/rbgn.v20i1.3678
Hummel, D., & Maedche, A. (2019). How effective is nudging? A quantitative review on the effect sizes and limits of empirical nudging studies. Journal of Behavioral and Experimental Economics, 80, 47–58. https://doi.org/10.1016/j.socec.2019.03.005
Jannach, D. (2004). Advisor suite - a knowledge-based sales advisory system. In Proceedings of the 16th european conference on artificial intelligence (ECAI 2004) (pp. 720–724). https://doi.org/10.1007/978-3-540-27834-4_34
Jannach, D., & Adomavicius, G. (2016). Recommendations with a purpose. In Proceedings of the 10th ACM conference on recommender systems (RecSys '16) (pp. 7–10). https://doi.org/10.1145/2959100.2959186
Jannach, D., & Adomavicius, G. (2017). Price and profit awareness in recommender systems. In Proceedings of the ACM RecSys 2017 workshop on value-aware and multi-stakeholder recommendation. http://arxiv.org/abs/1707.08029
Jannach, D., & Jugovac, M. (2019). Measuring the business value of recommender systems. ACM Transactions on Management Information Systems, 10. https://doi.org/10.1145/3370082
Johnson, E., Shu, S., Dellaert, B., Fox, C., Goldstein, D., Häubl, G., … Weber, E. (2012). Beyond nudges: Tools of a choice architecture. Marketing Letters, 23, 487–504. https://doi.org/10.1007/s11002-012-9186-1
Jung, D., Erdfelder, E., & Glaser, F. (2018). Nudged to win: Designing robo-advisory to overcome decision inertia. In Proceedings of the 26th European conference on information systems (ECIS 2018).
Kahneman, D. (2012). Thinking, fast and slow. London: Penguin Books.
Kamehkhosh, I., Bonnin, G., & Jannach, D. (2019). Effects of recommendations on the playlist creation behavior of users. User Modeling and User-Adapted Interaction, 30, 285–322. https://doi.org/10.1007/s11257-019-09237-4
Kammerer, Y., & Gerjets, P. (2014). The role of search result position and source trustworthiness in the selection of web search results when using a list or a grid interface. International Journal of Human-Computer Interaction, 30, 177–191. https://doi.org/10.1080/10447318.2013.846790
Karlsen, R., & Andersen, A. (2019). Recommendations with a nudge. Technologies, 7. https://doi.org/10.3390/technologies7020045
Karvonen, K., Shibasaki, S., Nunes, S., Kaur, P., & Immonen, O. (2010). Visual nudges for enhancing the use and produce of reputation information. In Proceedings of the workshop on user-centric evaluation of recommender systems and their interfaces at the 4th conference on recommender systems (RecSys10).
Kissmer, T., Kroll, T., & Stieglitz, S. (2018). Enterprise digital nudging: Between adoption gain and unintended rejection. In Proceedings of the 24th americas conference on information systems (AMCIS).
Knijnenburg, B. P. (2014). Information disclosure profiles for segmentation and recommendation. In SOUPS2014 workshop on privacy personas and segmentation (p. 4).
Köcher, S., Jugovac, M., Jannach, D., & Holzmüller, H. (2019). New hidden persuaders: An investigation of attribute-level anchoring effects of product recommendations. Journal of Retailing, 95, 24–41. https://doi.org/10.1016/j.jretai.2018.10.004
Lee, D., & Hosanagar, K. (2019). How do recommender systems affect sales diversity? A cross-category investigation via randomized field experiment. Information Systems Research, 30, 239–259. https://doi.org/10.1287/isre.2018.0800
Lee, M. K., Kiesler, S., & Forlizzi, J. (2011). Mining behavioral economics to design persuasive technology for healthy choices. In Conference on human factors in computing systems - proceedings (pp. 325–334). https://doi.org/10.1145/1978942.1978989
Leitner, G. (2015). Psyrec: Psychological concepts to enhance the interaction with recommender systems. In Proceedings of the 1st international workshop on personalization & recommender systems in financial services (pp. 37–44). Volume 1349 of CEUR Workshop Proceedings.
Leonard, T. C., Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. Constitutional Political Economy, 19, 356–360. https://doi.org/10.1007/s10602-008-9056-2
Loewenstein, G., Bryce, C., Hagmann, D., & Rajpal, S. (2015). Warning: You are about to be nudged. Behavioral Science & Policy, 1, 35–42. https://doi.org/10.2139/ssrn.2417383
Loewenstein, G., & Chater, N. (2017). Putting nudges in perspective. Behavioural Public Policy, 1, 26–53. https://doi.org/10.1017/bpp.2016.7
Meske, C., & Potthoff, T. (2017). The DINU-Model - a process model for the design of nudges. In 25th European conference on information systems (ECIS 2017) (pp. 2587–2597). https://aisel.aisnet.org/ecis2017_rip/11
Miller, T. (2019). Explanation in artificial intelligence: Insights from the social sciences. Artificial Intelligence, 267. https://doi.org/10.1016/j.artint.2018.07.007
Mirsch, T., Jung, R., Rieder, A., & Lehrer, C. (2018). Mit Digital Nudging Nutzererlebnisse verbessern und den Unternehmenserfolg steigern. Controlling, 30, 12–18. https://doi.org/10.15358/0935-0381-2018-5-12
Mirsch, T., Lehrer, C., & Jung, R. (2017). Digital nudging: Altering user behavior in digital environments. In Towards thought leadership in digital transformation: 13. Internationale Tagung Wirtschaftsinformatik (pp. 634–648).
Mirsch, T., Lehrer, C., & Jung, R. (2018). Making digital nudging applicable: The digital nudge design method. In Proceedings of the 39th international conference on information systems (ICIS).
Münscher, R., Vetter, M., & Scheuerle, T. (2016). A review and taxonomy of choice architecture techniques. Journal of Behavioral Decision Making, 29, 511–524. https://doi.org/10.1002/bdm.1897
Rook, L., Sabic, A., & Zanker, M. (2020). Engagement in proactive recommendations. Journal of Intelligent Information Systems, 54, 79–100. https://doi.org/10.1007/s10844-018-0529-0
Schär, A., & Stanoevska-Slabeva, K. (2019). Application of digital nudging in customer journeys? A systematic literature review. In Proceedings of the 25th americas conference on information systems (AMCIS) (p. 10).
Schneider, C., Weinmann, M., & vom Brocke, J. (2018). Digital nudging: Guiding online user choices through interface design. Communications of the ACM, 61, 67–73. https://doi.org/10.1007/s12599-016-0453-1
Sunstein, C. R. (2014). Nudging: A very short guide. Journal of Consumer Policy, 37, 583–588. https://doi.org/10.1057/s11369-018-00104-5
Sunstein, C. R. (2015). Nudging and choice architecture: Ethical considerations. Yale Journal on Regulation, 413–450.
Sunstein, C. R. (2016). The council of psychological advisers. Annual Review of Psychology, 67, 713–737. https://doi.org/10.1146/annurev-psych-081914-124745
Szaszi, B., Palinkas, A., Palfi, B., Szollosi, A., & Aczel, B. (2018). A systematic scoping review of the choice architecture movement: Toward understanding when and why nudges work. Journal of Behavioral Decision Making, 31, 355–366. https://doi.org/10.1002/bdm.2035
Teppan, E. C., & Felfernig, A. (2009). Asymmetric dominance- and compromise effects in the financial services domain. In 2009 IEEE conference on commerce and enterprise computing (pp. 57–64). https://doi.org/10.1109/CEC.2009.69
Thaler, R. H. (2018). From cashews to nudges: The evolution of behavioral economics. The American Economic Review, 108, 1265–1287. https://doi.org/10.1257/aer.108.6.1265

Thaler, R. H., Sunstein, C. R., & Balz, J. P. (2012). Choice architecture. The Behavioral Foundations of Public Policy. https://doi.org/10.2139/ssrn.1583509
Theocharous, G., Healey, J., Mahadevan, S., & Saad, M. (2019). Personalizing with human cognitive biases. In Proceedings of the 27th ACM conference on user modeling, adaptation and personalization (pp. 13–17). https://doi.org/10.1145/3314183.3323453
Turland, J., Coventry, L., Jeske, D., Briggs, P., & van Moorsel, A. (2015). Nudging towards security: Developing an application for wireless network selection for android phones. In Proceedings of the 2015 British HCI conference (pp. 193–201). https://doi.org/10.1145/2783446.2783588
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131. https://doi.org/10.1126/science.185.4157.1124
Verma, A., Wadhwa, A., Singh, N., Beniwal, S., Kaushal, R., & Kumaraguru, P. (2018). Followee management: Helping users follow the right users on online social media. In IEEE/ACM international conference on advances in social networks analysis and mining (ASONAM) (pp. 1286–1290).
Wang, X., Shi, W., Kim, R., Oh, Y., Yang, S., Zhang, J., & Yu, Z. (2019). Persuasion for good: Towards a personalized persuasive dialogue system for social good. In Proceedings of the 57th annual meeting of the association for computational linguistics (pp. 5635–5649). https://doi.org/10.18653/v1/P19-1566
Weinmann, M., Schneider, C., & vom Brocke, J. (2016). Digital nudging. Business & Information Systems Engineering, 58, 433–436.
Yoo, K.-H., Gretzel, U., & Zanker, M. (2013). Persuasive recommender systems – conceptual background and implications. Springer Science & Business Media. https://doi.org/10.1007/978-1-4614-4702-3
Zhang, N., Kannan, K. N., & Shanthikumar, J. G. (2015). A recommender system to nudge customers in a capacity constrained supply chain. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2588523
