
DOI: 10.1177/0022242921996646

Author Accepted Manuscript

An Emerging Theory of Avatar Marketing

Journal: Journal of Marketing

Manuscript ID JM.19.0801.R3
Manuscript Type: Special Issue Revised Submission

Research Topics: Customer Relationship Management, Relationship Marketing, Marketing Strategy, Marketing Communications

Methods: Conceptual/Theoretical

Abstract

Avatars are becoming increasingly popular in contemporary marketing strategies, but their effectiveness for achieving performance outcomes (e.g., purchase likelihood) varies widely in practice. Related academic literature is fragmented, lacking both definitional consistency and conceptual clarity. This article makes three main contributions to avatar theory and managerial practice. First, to address ambiguity with respect to its definition, this study identifies and critically evaluates key conceptual elements of the term avatar, offers a definition derived from this analysis, and provides a typology of avatars’ design elements. Second, the proposed 2 × 2 avatar taxonomy suggests that the alignment of an avatar’s form realism and behavioral realism, across different contingencies, provides a parsimonious explanation for avatar effectiveness. Third, the authors develop an emerging theory of avatar marketing by triangulating insights from fundamental elements of avatars, a synthesis of extant research, and business practices. This framework integrates key theoretical insights, research propositions, and important managerial implications for this expanding area of marketing strategy. Lastly, the authors outline a research program both to test the propositions and insights and to advance future research.

Keywords: avatar, chatbot, form realism, behavioral realism, artificial intelligence, human–computer interaction
Advances in computer technology have supported the proliferation of virtual characters, broadly known as avatars, which we define as digital entities with anthropomorphic appearance, controlled by a human or software, that have an ability to interact. Companies are investing heavily in avatars to engage and serve their customers better, and the use of avatars is predicted to increase by 241% in the travel and hospitality industry and by 187% for consumer goods (Sweezey 2019). In the banking industry, 87% of companies already use some form of an avatar or plan to implement one within two years (Torresin 2019).

Even as the use of avatars continues to rise, their effectiveness varies significantly across firms. For example, Progressive Insurance’s avatar Flo serves many customers successfully on Facebook Messenger (Briggs 2018), but IKEA withdrew its avatar Anna from its website following its unsatisfactory performance (Brandtzaeg and Følstad 2018). No guidelines exist for the effective design or use of avatars (Bradbury 2018), and academic research often lags behind business practice. In addition, it is difficult to integrate extant research on avatars to establish a strong foundation, because the literature in this domain is highly fragmented and lacks definitional and conceptual precision, as is typical for an emerging research area. With this article, we aim to provide an integrated theoretical framework of avatars by establishing definitional and conceptual clarity, synthesizing academic research and business practices, and offering propositions that include both managerial insights and future research opportunities.

This article seeks to make three main contributions to avatar theory and managerial practice. First, because extant research remains ambiguous with respect to defining and creating a taxonomy of avatars, it remains difficult for researchers to compare empirical results and draw meaningful conclusions across studies. We offer an overview in which we summarize the various ways the term avatar has been defined, identify and critically evaluate key conceptual elements of these definitions, and propose a definition on this basis. Then, based on this definition, we present a typology of avatar design to isolate elements that academics and managers can leverage to ensure avatars’ effectiveness for achieving specific goals (e.g., providing standard vs. personalized solutions). This typology represents an overall organizing framework for thinking about the uses of avatars, making relevant design/implementation decisions, and identifying research gaps.

Second, applying our proposed avatar typology, we synthesize findings from prior literature and business practices to produce a 2 × 2 taxonomy comprising two dimensions: avatars’ form realism and behavioral realism. This taxonomy enables us to generate specific research propositions that advance our understanding of avatars and improve their effectiveness in marketing. The level of alignment between an avatar’s form and behavioral realism across different contingencies provides a parsimonious explanation for when an avatar is most effective, which we highlight with examples of successful and failed avatars in business practice. In particular, avatars with high form realism but low behavioral realism, which we call superficial avatars, can hinder customer experiences in high-risk transactions (e.g., stock purchases), because the misalignment between an anthropomorphic appearance and low intelligence leads to negative disconfirmations of expectations (e.g., Nordnet’s Amelia exhibits a realistic human appearance but does not offer good stock tips). In contrast, an intelligent unrealistic avatar (i.e., low form realism, high behavioral realism) may produce positive disconfirmations in socially complex interactions that require exchanges of sensitive personal information (e.g., Ellie, an online avatar, helps assess people’s depression and PTSD symptoms); because the avatar’s unrealistic human appearance cannot be mistaken for a real human, people are more likely to provide responses that are free from social desirability bias.
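This alignment logic can be summarized as a small lookup over the two dimensions. The sketch below is only illustrative: the labels "superficial avatar" and "intelligent unrealistic avatar" follow the text, while the names for the remaining two quadrants are hypothetical placeholders, not the authors' terminology.

```python
# Illustrative sketch of the 2 x 2 taxonomy of form realism and
# behavioral realism. "superficial avatar" and "intelligent
# unrealistic avatar" follow the text; the other two quadrant
# labels are placeholders, not the authors' terminology.

QUADRANTS = {
    ("high", "low"): "superficial avatar",              # e.g., Nordnet's Amelia
    ("low", "high"): "intelligent unrealistic avatar",  # e.g., Ellie
    ("low", "low"): "simple avatar (placeholder label)",
    ("high", "high"): "realistic intelligent avatar (placeholder label)",
}

def classify_avatar(form_realism: str, behavioral_realism: str) -> str:
    """Map an avatar's (form, behavioral) realism levels to a quadrant."""
    return QUADRANTS[(form_realism, behavioral_realism)]

print(classify_avatar("high", "low"))  # superficial avatar
```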

Third, by triangulating insights from the avatar fundamentals and our synthesis of extant research and business practices, we develop an integrative framework of avatar performance and offer key theoretical insights, research propositions, and managerial implications for this expanding area of marketing strategy. This framework sheds new light on the underlying mechanisms of effective avatar design and implementation strategies, as well as potential contingencies of these effects. In turn, we offer managerial guidelines regarding avatar deployment, specific decision rules, and insights into how and why specific avatar strategies may be effective. This foundation for a contemporary theory of avatars may spur future research; the identified academic–practice gaps and propositions in particular point to promising research directions.

Avatar Fundamentals

The popularity of avatars is fueled by two macro-environmental factors. First, advancements in computer/digital technologies (e.g., artificial intelligence) have enabled the development of more complex avatars, which often appear in three-dimensional forms, imbued with seemingly distinctive personalities, appearances, and behavioral patterns, and are overall more appealing than previous, simple versions (Ahn, Fox, and Bailenson 2012; Garnier and Poncin 2013). Second, the increase in the use of avatars reflects the growing importance of online service experiences, such as education, gaming, banking, and shopping (Garnier and Poncin 2013; Kim, Chen, and Zhang 2016), which firms seek to make as convenient and hassle-free for customers as possible (Kohler et al. 2011). For example, online customers often express frustration when they cannot find relevant information on a website quickly and easily; avatars can effectively and efficiently provide a solution to this issue. Today’s customers also expect faster communication from companies, but immediate responses tend to be difficult and expensive for firms to achieve through traditional channels (e.g., face-to-face, telephone) (Kilens 2019). Shopping in the online environment also reduces customers’ sense of social interaction and personal consultation, a concern that avatars can help address (Holzwarth, Janiszewski, and Neumann 2006). Finally, recent technology developments enable avatars to offer expanded benefits to firms, in that “avatars capable of having complex and interactive conversations with customers will exponentially increase the amount of data businesses can access. Avatars can potentially know if [customers] were bored or happy in real time and know the exact moment someone became disengaged” (Frank 2019).

Defining Avatars

Although we can readily list these benefits of avatars, no strong consensus exists regarding their precise definition (Table 1). Furthermore, academics have used multiple terms interchangeably to refer to avatars, such as automated shopping assistants (Al-Natour, Benbasat, and Cenfetelli 2011), chatbots (Ho, Hancock, and Miner 2018), virtual customer service agents (Verhagen et al. 2014), embodied conversational agents (Bickmore, Pfeifer, and Jack 2009; Lee and Choi 2017; Schuetzler et al. 2018), or virtual/digital assistants (Chattaraman et al. 2019; Freeman and Beaver 2018). The ambiguity surrounding the definition of avatars makes it difficult for researchers to compare empirical results or draw meaningful conclusions across studies (Nowak and Fox 2018). To advance scientific knowledge, we need a precise definition that clearly delineates the boundaries of the construct. In this section, we review the various ways avatars have been defined, identify and critically evaluate key definitional elements, and offer a new definition derived from this analysis.

---Insert Table 1 about here---
Anthropomorphic appearance. One aspect on which there is no consensus when it comes to defining avatars is whether avatars need to have an anthropomorphic appearance (Nowak and Fox 2018). Anthropomorphism refers to “the extent to which an image looks human” (Nowak and Rauh 2006, p. 154). In prior academic research, 70% of articles identify an anthropomorphic or humanlike appearance as a necessary condition of a conceptual definition of an avatar. This element is important because the degree to which an avatar is anthropomorphic provides cues of its social presence (Nass and Moon 2000). Research shows that the more anthropomorphic an avatar is perceived to be, the more credible and competent it seems (Westerman, Tamborini, and Bowman 2015), such that “A person may be represented by a highly accurate and lifelike avatar of a fir tree. Although this avatar is realistic, other users may be less likely to attribute social potential to it—and less likely to communicate with it—because it is not anthropomorphic” (Nowak and Fox 2018, p. 37). Research also shows that how anthropomorphic we perceive something to be affects our expectations of certain behaviors and our willingness to interact; people treat something with a human appearance differently than they do inanimate objects (Fox et al. 2015). For example, Neytiri in the film Avatar is not a human, but because she has an anthropomorphic appearance, other characters interact with her the same way they would with a human.

Knowledge about how to deal with other humans generally is learned early in life and is more detailed and readily accessible in people’s memory than knowledge about how to interact with inanimate objects (Epley, Waytz, and Cacioppo 2007). According to the computers as social actors (CASA) paradigm (Moon 2000; Nass et al. 1995; Reeves and Nass 1996), people tend to treat computer technology that exhibits humanlike characteristics as social actors and apply the same social rules to them during interactions, despite being fully aware that they are dealing with machines (Holzwarth, Janiszewski, and Neumann 2006). The presence of an anthropomorphic appearance triggers people’s simplistic social scripts (e.g., politeness, reciprocity), which in turn induce cognitive, affective, and social responses during interactions with technology (Wang et al. 2007). Thus, we regard an anthropomorphic appearance as an important, required element of the conceptual definition of digital avatars, because people interact differently with something they perceive as more “human.” This requirement thus excludes inanimate objects and brands, as well as voice-only digital assistants that lack an anthropomorphic appearance.

Interactivity. Interactivity refers to “the extent to which individuals perceive that the communication allows them to feel in control as if they can communicate synchronously and reciprocally with the communicator” (Chattaraman et al. 2019, p. 317). In defining interactivity as another critical requirement for digital avatars, we refer specifically to the ability to engage in two-way interactions, which may be verbal (voice) or nonverbal (text, animation). Prior research established three dimensions of interactivity: users’ active control, or the ability to participate in and influence communication; bilateral interactions; and synchronicity (Etemad-Sajadi 2016; Liu and Shrum 2002). About 78% of the papers we reviewed include interactivity as one of the elements of the conceptual definition of avatars. For example, in defining avatars as “virtual characters that can be used as company representatives in online stores,” Liew, Tan, and Ismail (2017, p. 2) allow them to be non-interactive and capable of only one-way conversation, such as welcoming users, introducing the company, or describing available products. Some other researchers similarly do not consider interactivity a necessary element of digital avatars (e.g., Jin 2009). Yet most researchers focus on interactive avatars and find that they can increase customers’ satisfaction with a website or product, credibility, or patronage intentions (Chattaraman, Kwon, and Gilbert 2012; Holzwarth, Janiszewski, and Neumann 2006).

However, designing a truly interactive avatar that can engage in synchronous communication is not an easy task: “natural language dialogue in chatbots suggests a low threshold for users to access data and services. However, whereas conversational interfaces are truly intuitive when applied to interactions between people, conversations between humans and automated conversational agents are more challenging” (Brandtzaeg and Følstad 2018, p. 41). The interactivity requirement excludes entities such as “self-avatars” in clothing stores, which do not offer bidirectional communication, as well as any instances of asynchronous content, such as a lecture delivered as a pre-recorded video of the instructor or a standard greeting from a chatbot that cannot offer personalized interactions with each user. However, when there is true bidirectional interactivity, it can satisfy customers’ hedonic needs (e.g., having fun while shopping on a website) and utilitarian needs (e.g., efficiently finding a solution to a problem on a website) (Liew, Tan, and Ismail 2017). Thus, we include it as a necessary requirement for digital avatars.

Controlling entity. Researchers also have different perspectives on the controlling entity, which refers to whether control over an avatar involves a human operator or an automated computer program (Nowak and Fox 2018). Some researchers make this distinction explicit and refer to anything controlled by technology as an agent or bot, while referring to anything controlled by humans as an avatar (Nowak and Fox 2018). However, in business practice, due to cost considerations, digital avatars appear almost exclusively enabled by AI (e.g., Sophie, Air New Zealand’s customer service rep). Yet we have no theoretical reason to limit the conceptual definition of avatars to only those enabled by AI, because consumers appear to want a perception of an avatar having some level of intelligence but often cannot tell precisely who or what controls it (Kim and Sundar 2012), as there are typically no solid clues available. According to the modality–agency–interactivity–navigability model and its applications in virtual environments, though, if agency cues are present in an interface, they influence users’ perceptions by prompting their cognitive heuristics about the nature and content of the interaction (Sundar 2008). Users’ perceptions and behaviors thus differ if they learn they are interacting with an AI-backed avatar versus one controlled by a person, reflecting the different heuristics evoked by machine versus human counterparts (Fox et al. 2015; Go and Sundar 2019). Thus, “identity cues suggesting that the user is chatting with a human agent or machine agent can trigger human or machine heuristics respectively and accordingly affect the criteria by which they evaluate the quality of the interaction” (Go and Sundar 2019, p. 305).

In summary, we define avatars as digital entities with anthropomorphic appearance, controlled by a human or software, that have an ability to interact. Among the academic papers we reviewed, about half (51%) include all of these elements in their conceptual definitions of avatars; the inclusion rates for each specific definitional element vary between 70% and 90%.
Typology of Avatar Design

Based on this derived definition of avatars, we propose a typology of avatar design. This typology allows academics and managers to isolate elements that make an avatar more or less effective for specific goals, such as providing product information, answering customers’ process questions, and so forth. Furthermore, this typology provides an overall organizing framework for thinking about, making design/implementation decisions regarding, and researching avatars.

Different design elements cause avatars to vary in their visual appearances and behaviors during interactions with humans. All of the design elements affect avatars’ form realism and behavioral realism. Form realism refers to the extent to which the avatar’s shape appears human, while behavioral realism captures the degree to which it behaves as a human would in the physical world (Bailenson et al. 2008; Blascovich et al. 2002; Fox et al. 2015). Some researchers argue that behavioral realism is more important than form realism (Blascovich et al. 2002), but both form and behavioral realism are associated with greater avatar usefulness in most contexts (Garau et al. 2003; Kang, Watt, and Ala 2008; Yee, Bailenson, and Rickertsen 2007). Figure 1 provides an overview of all the design elements, with examples, that managers can use to understand and influence the degree of avatars’ form and behavioral realism.

--- Insert Figure 1 about here ---

Form Realism

Higher form realism may lead users to develop social expectations for their subsequent interactions with avatars (Nowak and Biocca 2003). Managers can affect the degree of form realism of an avatar through design elements such as spatial dimension (2D vs. 3D avatars), the ability to move the face or body (visually static vs. dynamic avatars), and other characteristics that enhance the perceived “humanness” of avatars, such as signals of gender, race, age, or names.

Spatial dimension. Avatars can be two- or three-dimensional. In the sample of articles we reviewed, 52% focused on 2D avatars and 48% on 3D avatars. Research indicates that 3D avatars are perceived as more compelling and impactful relative to 2D versions (Bailenson et al. 2008; Fox et al. 2015; Persky and Blascovich 2007).

Movement. Both technological advancements and customer expectations have prompted the development of more realistic, visually dynamic avatars that can move their bodies and faces (Yun, Deng, and Hiscock 2009). In our sample of reviewed studies, 38% examined static avatars, while 62% considered visually dynamic avatars. For example, Amelia is a visually dynamic avatar that is capable of facial expressions and movement and has been used, with modifications, in industries such as banking, insurance, and healthcare (Ipsoft 2017). Visually dynamic avatars capable of facial expressions can convey emotions, which is especially helpful for customers from different cultural backgrounds: “Avatars with high intensity expression and dynamics allow both the local and global audiences to achieve approximately equal levels in subject identification and emotion perception” (Yun, Deng, and Hiscock 2009, p. 21). Thus, visually dynamic avatars may be more effective for global corporations. Additional evidence indicates that the greater the ability of an avatar to exhibit facial expressions, the less perceived human agency is required to exert social influence (Bailenson et al. 2008).

Human characteristics. To enhance form realism, avatars can be designed to include additional “human” elements, such as an identifiable name, gender, race, and age. In the studies we reviewed, gender and age are the most commonly studied of these characteristics, followed by name and race. For example, the Air New Zealand virtual customer service avatar is female, named Sophie. Research shows that characteristics such as gender can increase the effectiveness of avatars (Nass and Yen 2010).

Behavioral Realism

Behavioral realism of avatars can facilitate more natural interactions with users (Blascovich et al. 2002), and managers can manipulate the degree of avatars’ behavioral realism using design elements associated with avatars’ interactivity and controlling entity. Specifically, the design elements that managers can use to affect avatar interactivity are communication modality (the ability of avatars to communicate verbally, nonverbally, or through a combination of both), response type (whether avatars’ responses are scripted or natural), and the presence of social content (whether avatars can engage in interactions about social and personal matters in addition to task-oriented communications). In terms of controlling entity, avatars can be controlled by a computer program or algorithm or by a human, with the latter, predictably, increasing avatars’ behavioral realism.
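As a compact summary, the design elements can be grouped by the realism dimension they influence. The grouping below follows the typology discussed here; the data structure itself is only an illustrative sketch, not part of the framework.

```python
# Design elements grouped by the realism dimension they influence,
# following the typology in the text; the structure is illustrative.
DESIGN_ELEMENTS = {
    "form_realism": {
        "spatial_dimension": ["2D", "3D"],
        "movement": ["visually static", "visually dynamic"],
        "human_characteristics": ["name", "gender", "race", "age"],
    },
    "behavioral_realism": {
        "communication_modality": ["nonverbal", "verbal", "combined"],
        "response_type": ["scripted", "natural"],
        "social_content": ["task-oriented only", "social and task-oriented"],
        "controlling_entity": ["software", "human"],
    },
}

def elements_for(dimension: str) -> list:
    """List the design elements a manager can adjust for a given dimension."""
    return sorted(DESIGN_ELEMENTS[dimension])

print(elements_for("form_realism"))
# ['human_characteristics', 'movement', 'spatial_dimension']
```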
14
15 Communication modality. Avatars differ in the modalities of communication they use.
16
17 Nonverbal avatar communication can be represented by text (speech-to-text avatars), gestures, or
Pe
18
19
facial expressions; verbal avatars communicate via speech; whereas nonverbal and verbal avatars
20
e

21
can communicate using a combination of these modalities. The latter category would be the
rR
22
23
24 highest in behavioral realism from a communication modality perspective. Research that
25
ev

26 investigated the communication modality of avatars has focused primarily on avatars that are
27
28
iew

29 capable of both verbal and nonverbal interactions, accounting for 68% of reviewed articles,
30
31 followed by research on nonverbal avatars in 26% of articles, with verbal avatars attracting the
32
33 least academic attention (6%).
Ve

34
35
36
In addition, managers might increase behavioral realism by enabling avatars to recognize
rsi

37
38 the nonverbal behaviors of users, such as their facial expressions, prompting more appropriate
39
on

40 responses. For example, Microsoft’s XiaoIce can interpret users’ photos and make relevant
41
42
inferences and comments (Dormehl 2018). Even with significant advances in AI though, creating
43
44
45 an avatar capable of correctly identifying and responding to users’ various emotions and contexts
46
47 remains a challenge, because “large, interpersonal variability exists in how people express
48
49 emotions. Humans also have diverse preferences for how an agent [avatar] responds to them”
50
51
52 (McDuff and Czerwinski 2018, p. 76).
53
54
55
56
57
58 12
59
60 Journal of Marketing
Page 13 of 62

1
2 Author Accepted Manuscript
3 Response type. Managers can design avatars with an ability to converse in a way that
4
5
6 feels natural to users. Previous research has mostly (60% of articles) focused on avatars that are
7
8 capable of merely selecting a response from a set of preexisting, predetermined, scripted
9
10 responses. For example, HSBC's virtual assistant Amy currently is able to select and provide
11
12
13
users with only pre-determined responses about a limited number of the bank's products. Avatars
14
15 with a capability for natural responses instead can have a “relatively free-flowing conversation,
16
17 using accepted vocabulary and grammar, and with the ability to track the context of the
Pe
18
19
conversation and make appropriate responses" (Burden and Savin-Baden 2019, p. 9). For
20
e

21
example, the skincare company SK-II’s YUMI understands users expressing themselves in their
rR
22
23
24 own words and responds in an organic, conversational manner. The ability to have a
25
ev

26 conversation that feels natural is also highly correlated with perceived agency type; avatars
27
28
iew

29 controlled by humans would have a natural response, whereas software-controlled avatars tend to
30
31 rely on scripted responses.
32
33 Social content. Another design element that can increase avatars’ interactivity is their
Ve

34
35
36
ability to provide some social content during interactions with users, as opposed to purely task-
rsi

37
38 oriented communication (e.g., providing product information). For example, Microsoft’s
39
on

40 XiaoIce, is an AI assistant who also attempts to function like a friend, checking on users after a
41
42
relationship breakup or asking about the physical recovery of a user who posted a photo of a
43
44
45 bruised leg. Since its launch in China in 2014, XiaoIce has gained great popularity, due to its
46
47 emotional intelligence: “The real key takeaway is that we’ve focused on emotional
48
49 intelligence…. We call this an empathetic computing framework, [designed to] have
50
51
52 conversations with humans naturally, which can build a social and emotional connection. It’s a
53
54 good friend. As a result, they can better participate and help out in human society” (Dormehl
55
56
57
58 13
59
60 Journal of Marketing
Page 14 of 62

1
2 Author Accepted Manuscript
3 2018). In the research we reviewed, 34% of avatars could offer some social content during their
4
5
6 interactions.
7
8 Controlling entity. Research shows that consumers interacting with an avatar they
9
10 perceive to be controlled by a human behave differently from customers who believe the avatar
11
12
13
is controlled by software. According to a meta-analysis, avatars controlled by humans elicit more
14
15 presence and a stronger social influence than do computer-controlled avatars (Fox et al. 2015).
16
17 Therefore, for firms that rely on software-controlled avatars (the majority of avatars in practice),
Pe
18
19
reinforcing “human” elements can be very effective. They can leverage the design elements we
20
e

21
have discussed, such as more natural speech programmed for software-controlled avatars,
rR
22
23
24 nonverbal behaviors and emotions, and the ability to provide social content. However,
25
ev

26 perceptions of human agency might not be desirable in all settings as research shows that people
27
28
iew

29 perform worse on certain tasks when they recognize they are interacting with a human-controlled
30
31 avatar rather than software, due to social inhibition, social desirability bias, and perceptions of
32
33 reduced autonomy (Kim, Chen, and Zhang 2016; Yokotani, Takagi, and Wakashima 2018). For
Ve

34
35
36
example, avatars have become popular in healthcare, and when it comes to disclosing sensitive
rsi

37
38 information such as drinking habits, users are more comfortable revealing information if they
39
on

40 perceive less human agency (Schuetzler et al. 2018).


41
42
43
44
45 Avatar Academic Research and Business Practices
46
47 In the proposed typology, all of the design elements serve to increase or decrease the form and
48
49
50 behavioral realism of avatars. Most prior research investigated only a narrow set of design
51
52 elements in piecemeal fashion, independent of other elements, or at an aggregate level, without
53
54 considering the granularity or interactions of specific elements. In the following sections, we
55
56
57
58 14
59
60 Journal of Marketing
Page 15 of 62

1
2 Author Accepted Manuscript
3 synthesize academic literature and business practices related to avatars to provide insights and
4
5
6 identify research gaps. We also derive propositions to help advance theory, inform business
7
8 practices, and guide future research.
9
10 Academic Literature
11
12
13
Because research on avatars is diverse and cross-disciplinary in nature, our goal is to provide a representative rather than exhaustive literature review, covering a variety of research disciplines and empirical settings. Using avatar and related terms1 as keywords, we searched article titles and abstracts in electronic databases (Academic Search Complete, Business Source Complete, Science Direct, and Google Scholar) to find studies published during the 1990–2020 period. To identify research involving avatars consistent with our definition, we first excluded articles that did not provide detailed information about avatar definitional elements or scope. In addition, we excluded research on topics that fall outside our definitional boundary (e.g., consumers’ self-avatars). We also removed purely technical articles (e.g., programming of avatars) and studies that do not address real-time consumer–avatar interactions. To strike a good balance between research diversity and quality, we sampled only reputable journals across various disciplines (i.e., impact factor of at least 3).2 In total, we compiled 98 empirical research articles (the full list of reviewed research is available in the Web Appendix). Table 2 provides a summary of select illustrative research.

---Insert Table 2 about here---

1 Related terms included animated agent, anthropomorphic agent, artificial agent, chatbot, conversational agent, digital assistant, electronic shopping agent, embodied agent, embodied virtual agent, nonhuman agent, spokes-avatar, spokes-character, virtual agent, virtual assistant, and virtual character.
2 The academic disciplines included were (1) marketing (e.g., Journal of Marketing), (2) computer science (e.g., Computers in Human Behavior), (3) information systems (e.g., Journal of Management Information Systems), (4) communications (e.g., Human Communication Research), (5) education (e.g., Computers & Education), (6) healthcare (e.g., Journal of Medical Internet Research), and (7) general business (e.g., Journal of Business Research).
Form realism. The overarching theoretical framework that guides empirical studies of human–avatar interactions is social response theory, sometimes referred to as the computers as social actors (CASA) paradigm (Moon 2000; Nass et al. 1995; Reeves and Nass 1996). It suggests that anthropomorphic characteristics of avatars elicit consumers’ socialness perceptions, often through an automatic, spontaneous, mindless process that induces varying degrees of cognitive, affective, and social responses to avatars (Al-Natour, Benbasat, and Cenfetelli 2011; Holzwarth, Janiszewski, and Neumann 2006; Verhagen et al. 2014; Wang et al. 2007). Although the theory suggests that an anthropomorphic appearance positively affects customer outcomes, empirical results are mixed. In some situations, lower or higher levels of form realism appeared more effective, but in other cases, no differences emerged. For example, visually static, cartoonish avatars with very low form realism increased satisfaction with a retailer, attitude toward products, and purchase intentions in some studies (Etemad-Sajadi 2014; Holzwarth, Janiszewski, and Neumann 2006). However, Qiu and Benbasat (2009) find that avatars with a more realistic, humanlike appearance increased customers’ perceptions of social presence, leading to higher usage intentions. Verhagen et al. (2014) found no significant differences in service satisfaction between avatars low or high in form realism. Similarly, Schuetzler et al. (2018) reported that a more anthropomorphic avatar appearance had no effect on participants’ disclosure of information about sensitive topics, such as drinking behaviors.
Two factors may help explain these inconsistent effects. First, prior studies have not investigated all of the design elements identified in our typology that help establish avatars’ form realism (e.g., visually static vs. dynamic avatars, avatars’ age, gender) or how these underlying elements might induce specific effects. By focusing on only a subset of visual characteristics, and thus failing to account for the totality of the elements that establish an avatar’s form realism, these studies may have produced biased estimates. Second, we propose that avatars’ form and behavior must be considered simultaneously, because form realism is meaningful only in the context of behavioral realism (Bailenson et al. 2008), but few studies have done so.
Behavioral realism. Extant research consistently shows positive effects of greater behavioral realism. For example, Wang et al. (2007) report that an avatar’s scripted text or spoken communications can enhance customers’ hedonic and utilitarian benefits when shopping online, as well as increase their patronage intentions. Similarly, Lee and Choi (2017) find that when users interact with an avatar that has a high degree of behavioral realism, trust between the parties is higher. Other studies offer similar results, noting that avatars’ behaviors, such as decision-making style (Al-Natour, Benbasat, and Cenfetelli 2011) and socially oriented communication (Verhagen et al. 2014), significantly affect avatar trustworthiness and the overall customer experience (Brave, Nass, and Hutchinson 2005; Chattaraman et al. 2019).

Advancements in AI technology have allowed avatars to exhibit higher levels of cognitive and emotional intelligence. For example, they can engage in autonomous conversations by analyzing and responding to customers’ requests in real time, thereby significantly increasing customers’ trust in them (McDuff and Czerwinski 2018). Using avatar interviewers equipped with video and audio sensors and advanced analytical software, Nunamaker et al. (2011) demonstrate how intelligent avatars can detect, interpret, and respond to human interviewees’ emotions, cognitive effort, and potential deceptions. Pütten et al. (2010) examine the effects of an avatar that can collect and analyze a human’s voice and upper-body movement and coordinate its own responses accordingly. Results show that the avatar’s intelligent behavior led to positive evaluations of the avatar, regardless of whether it was controlled by a human or by software.
Even in light of these consistent findings related to avatar behavioral realism, some important research issues remain unresolved. First, few studies have investigated the underlying behavioral realism elements identified in our typology of avatar design (e.g., communication modality, social content, response type) to determine which are most critical or how they might interact with other form or behavioral realism elements. For example, Bickmore, Pfeifer, and Jack (2009) found that an avatar nurse incorporating social content in her scripted conversations produced better patient experiences, but Schuetzler et al. (2018) reported that a scripted, task-focused avatar interviewer elicits more socially biased responses. Second, a few studies revealed some unexpected negative effects of behavioral realism (e.g., Schuetzler et al. 2018), but research has yet to identify the conditions in which detrimental effects are more likely or design strategies to address them.
Integrated perspective on form and behavioral realism. Our review of extant literature thus reveals a key limitation: a lack of consideration of the alignment between avatars’ form and behavioral realism. If the levels of form and behavioral realism are mismatched, the consequences for avatars’ effectiveness may be profound, and such mismatches can help explain inconsistent past findings. Yet some misaligned avatars (e.g., the REA avatar is high in behavioral realism but low in form realism) seem equally as effective as well-aligned avatars (e.g., SK-II’s skincare advisor YUMI is very high in both behavioral and form realism). However, other misaligned avatars have failed (e.g., Nordnet’s Amelia, with high form realism but low behavioral realism). A systematic analysis of avatar effectiveness thus seems warranted and requires, as a first step, identifying different categories of avatars along the form and behavioral realism dimensions.

We suggest that avatars can be parsimoniously grouped into a 2 × 2 taxonomy, according to their form and behavioral realism (Figure 2). This taxonomy provides a foundation for predicting the success or failure of avatars in business practices and can inform avatar design strategies. We identify four distinct categories of avatars: simplistic, superficial, intelligent unrealistic, and digital human avatars. A simplistic avatar has an unrealistic human appearance (e.g., a 2D, visually static, cartoonish image) and engages in low-intelligence behaviors (e.g., scripted, only task-specific communication). For example, in the Netherlands, ING Bank uses a 2D, cartoonish-looking avatar, Inge, to provide responses to simple customer inquiries with a set of predetermined answers. In contrast, a superficial avatar has a realistic anthropomorphic appearance (e.g., a 3D, visually dynamic, photorealistic image), such as Natwest Bank’s Cora, but low behavioral realism, in that it can offer only preprogrammed answers to specific questions. An intelligent unrealistic avatar (e.g., REA) is characterized by humanlike cognitive and emotional intelligence but exhibits an unrealistic (e.g., cartoonish) human image. These avatars can engage customers in real-time, complex transactions without being mistaken for real human agents. Finally, a digital human avatar such as SK-II’s YUMI is the most advanced category of avatars, characterized by both a highly realistic anthropomorphic appearance and humanlike cognitive and emotional intelligence, and is designed to provide the highest degree of realism during interactions with human users.
---Insert Figure 2 about here---
Insights and Propositions Derived from Academic Literature
To advance extant literature, we propose the need to consider the interrelationship of form and behavioral realism. Visual information (i.e., what an avatar looks like) often gets processed automatically and almost immediately, requiring minimal cognitive resources (McGloin et al. 2009). This visual appearance then becomes the basis for probabilistic consistency inferences, whereby consumers form an expectation of some unknown attribute based on a known attribute with which it is believed to be correlated (Dick, Chakravarti, and Biehal 1990). For example, consumers often make inferences about an unfamiliar brand’s quality by using price as a signal of quality, in the belief that the two are correlated. Similarly, when an avatar looks more like a human, consumers may expect it to also behave like a human. Thus, the visual characteristics of an avatar may influence consumers’ judgments of its behavioral competence, even before an interaction takes place (Nowak and Biocca 2003). A more realistic anthropomorphic appearance suggests a higher level of behavioral realism, leading to a greater expectation that the avatar will behave as a real human might.

Proposition 1: As the form realism of an avatar increases, so do customers’ expectations for its behavioral realism.
Customers will use an avatar’s form realism as a frame of reference for forming initial expectations about its behavioral realism. The expected level of behavioral realism will then serve as a benchmark against which consumers form comparative judgments of their subsequent experience. According to expectation confirmation theory (Oliver 1980), when the actual outcome is worse than expected, consumers experience a negative disconfirmation, leading to decreased overall satisfaction. A better-than-expected outcome instead results in a positive disconfirmation, which increases customers’ overall satisfaction (Evangelidis and Van Osselaer 2018; White and Schneider 2000).

Consistent with this theory, if an avatar’s behavioral realism exceeds the consumer’s initial expectations, which were based on the avatar’s form realism, a positive disconfirmation likely occurs, and the consumer should perceive the avatar as more credible and attractive, as well as feel increased trust or confidence in it (Afifi and Burgoon 2000). Ellie, an avatar that assesses depression and PTSD symptoms in veterans, serves as a good illustration of a positive disconfirmation: her cartoonish appearance paired with highly intelligent, humanlike behavior has proven highly effective with vulnerable individuals (Robinson 2015). Conversely, if the avatar’s behavioral competency falls short of the expectations that users formed on the basis of the avatar’s form realism, a negative disconfirmation may result and dampen customers’ satisfaction (McGloin et al. 2009). Nordnet’s Amelia exhibited minimal competence in providing customized advice for high-risk transactions (e.g., stock purchases), which proved disappointing. Amelia’s realistic anthropomorphic appearance might have led customers to develop high behavioral expectations, which Amelia could not deliver, giving rise to a negative disconfirmation. Therefore, we expect asymmetric effects of misaligned avatar form and behavioral realism.

Proposition 2: Differences between an avatar’s form and behavioral realism have asymmetric effects, such that customers experience positive (negative) disconfirmation when an avatar’s behavioral realism is greater (less) than its form realism.
Mediation mechanisms of avatar effects on performance. Integrating evidence across multiple research streams suggests that avatars affect performance outcomes, such as customers’ likelihood to purchase a product, indirectly through customers’ cognitive, affective, and social responses, depending on the context (Holzwarth, Janiszewski, and Neumann 2006; Lee and Choi 2017). Customers form cognitive responses to avatars according to the avatars’ informativeness or competence in helping them make well-informed decisions (Holzwarth, Janiszewski, and Neumann 2006; Wang et al. 2007). Cognitive trust, or the willingness to rely on another entity’s help to achieve goals in uncertain situations, is another key dimension of customers’ cognitive response (Keeling, McGoldrick, and Beatty 2010; Martin, Borah, and Palmatier 2017).

Interactions with avatars can also evoke affective responses in customers, such as by providing them with unique entertainment experiences (Holzwarth, Janiszewski, and Neumann 2006). Avatars can deliver pleasurable experiences independent of their ability to facilitate a specific functional goal, such as a shopping task, by offering entertainment and emotional value during the shopping process (Wang et al. 2007). Human–avatar interactions are also social in nature. Avatars can enhance customers’ perceived social presence (i.e., the feeling of being with another) and create feelings of human contact or connection (Chattaraman et al. 2019; Qiu and Benbasat 2009). Moreover, the use of an avatar can provide a sense of personalization, such that customers receive information that appears tailor-made to their specific needs (Verhagen et al. 2014; Wang et al. 2007). The CASA framework argues that avatars can also induce feelings of reciprocity in human–computer interactions, which can strengthen perceived rapport with the avatar and enhance users’ social experience (Chattaraman et al. 2019; Lee and Choi 2017).

When the levels of an avatar’s form realism and behavioral realism are aligned, customers’ behavioral expectations tend to be confirmed. This simple confirmation, together with high initial behavioral expectations induced by form realism, may have a strong, positive, additive effect on performance outcomes, such as customers’ purchase likelihood (Oliver 1980; Stayman, Alden, and Smith 1992). We expect that avatars aligned in their form and behavioral realism can affect customer performance outcomes through all three types of mediating responses: cognitive, affective, and social. But when the levels of form and behavioral realism are misaligned, the outcomes might be mediated through different responses. Consider the misalignment that occurs when form realism exceeds behavioral realism. In this situation, customers may find the avatar especially entertaining, because its realistic anthropomorphic appearance and characteristics can serve as hedonic elements that increase perceived entertainment, which often is intrinsically enjoyable in its own right, regardless of performance outcomes (Davis, Bagozzi, and Warshaw 1992; Xu, Abdinnour, and Chaparro 2017). To the extent that perceived enjoyment creates a pleasant mood, the hedonic aspect of high form realism can improve performance outcomes such as impulsive online purchases (Parboteeah, Valacich, and Wells 2009). Yet customers also might experience negative disconfirmation, stemming from disappointment with the avatar’s cognitive and social capabilities, resulting in weakened customer performance outcomes.

Alternatively, when misalignment arises because the avatar’s behavioral realism exceeds its form realism, the positive effects on customer outcomes might be mediated primarily by cognitive and social responses. Although this avatar’s lower form realism typically is unlikely to provide much entertainment for the customer,3 the positive disconfirmation of the avatar’s behavioral competence may significantly boost customers’ confidence in the avatar’s overall ability to provide valid information, offer personalized service, or build customer rapport.

Proposition 3: When an avatar’s form realism exceeds its behavioral realism, it has a positive effect on performance outcomes (e.g., purchase likelihood) through customers’ (a) affective responses, but a negative effect on performance outcomes through customers’ (b) cognitive and (c) social responses.

Proposition 4: When an avatar’s behavioral realism exceeds its form realism, it has a positive effect on performance outcomes (e.g., purchase likelihood) through customers’ (a) cognitive and (b) social responses.
Business Practices

As we noted previously, many companies are adopting avatars, arguably to humanize their brands with a scalable “personalized human touch,” but managers lack guidance for how to design these avatars to ensure their effectiveness (Bradbury 2018; Wooler 2019). In this section, we use business examples to illustrate the avatar categories from our taxonomy (Figure 2) and thereby clarify which factors and conditions make them effective. We also generate theoretical insights and managerial implications.

3 We acknowledge there may be exceptions in which a very “cute” avatar, although low in form realism, might provide strong hedonic value. We thank an anonymous reviewer for this insight.
Simplistic avatar. An obvious benefit of using avatars is the firm’s improved efficiency and scalable customer service. For example, ING Bank’s cartoonish avatar Marie answers common debit and credit card questions with preprogrammed information and solutions (www.ing.com). In the Los Angeles Superior Court, an animated cartoon avatar, Gina, who speaks multiple languages, successfully handles 1.2 million new traffic citations a year (Llop 2016). A startup company called TwentyBN has introduced an animated cartoon sales avatar, Millie, that can understand and answer simple questions while presenting various products (Kahn 2018) and seems especially effective in promoting low-ticket items, such as eyeglasses. Simplistic avatars thus seem most effective in providing hassle-free, convenient options for completing quick, specific tasks (e.g., information inquiries), especially when relatively little risk is involved (e.g., inexpensive online purchases).
Superficial avatar. The use of superficial avatars in various industries shows more mixed results. Among the successes, HSBC Hong Kong’s Amy, a photorealistic avatar that handles routine customer inquiries similar to ING’s Marie, was well received by customers (Torresin 2019). A very realistic-looking 3D avatar, Cora of Natwest Bank in the United Kingdom, can answer 200 basic questions, such as how to open an account or complete a mortgage application (Peddie 2018). In the insurance industry, Lemonade Insurance’s avatar Maya and Progressive’s Flo, both very humanlike, are programmed to provide category-specific information and handle simple transactions, such as onboarding customers and giving online quotes (Briggs 2018; Phaneuf 2020). However, other superficial avatars have been less effective. The Swedish bank Nordnet had to discontinue its realistic-looking avatar Amelia, presumably due to her failure to provide intelligent stock purchasing advice. At IKEA, the decision to eliminate its avatar Anna stemmed from a recognition that her realistic anthropomorphic appearance led to complex customer questions, which required responses beyond the predetermined set available in the avatar’s programming (Brandtzaeg and Følstad 2018; Scott 2008). Overall, superficial avatars can entertain customers while enhancing efficiency in low-risk transactions (e.g., bank account information inquiries), but they also can produce detrimental effects for customers seeking high-risk or complex transactions (e.g., financial investments), because these avatars lack the level of intelligence that their realistic anthropomorphic appearance leads users to expect.
Intelligent unrealistic avatar. This type of avatar is relatively rare but generally successful. For example, the REA avatar has been effective in providing virtual showings of homes for sale, and Ellie, an avatar therapist, has been useful in detecting PTSD and depression symptoms in military veterans. With her humanlike intelligence, Ellie can engage in context-appropriate, autonomous conversations and build rapport with subtle, supportive, and sympathetic gestures when listening to a veteran’s sensitive story. In turn, veterans disclose significantly more PTSD symptoms to her than to a human therapist (Gonzalez 2017). Thus, intelligent unrealistic avatars seem especially effective for complex relational transactions involving sensitive personal information (e.g., finances, health), because they provide a sense of non-judgment: customers recognize that these avatars are not human but are still competent in their tasks.
Digital human avatar. With advanced digital and computing technologies, pioneering avatar companies such as Soul Machines are breaking new ground for digital human avatars in marketing applications (www.soulmachines.com). For example, skincare brand SK-II uses an incredibly realistic-looking and -behaving avatar, YUMI, whose AI-powered digital brain enables advanced cognitive and emotional intelligence. In addition, YUMI can recognize users’ gestures and features, such as eye color; communicate via speech or text; and deliver customized tips with credible, highly personalized beauty advice (Brunsman 2019). Another digital human avatar, modeled after Daniel Kalt, the chief economist of UBS investment bank, can forecast financial data and present investment advice to high-wealth customers (www.nanalyze.com). Overall, digital human avatars seem most effective for building long-term customer relationships in contexts that feature substantial complexity or risk (e.g., financial investments), where users prioritize realistic, trustworthy, and personalized service.
Insights and Propositions Derived from Business Practices
Observations from business practices suggest that avatars’ effectiveness may be highly contingent on the level of perceived uncertainty users experience during their interactions with avatars. This uncertainty might arise from contextual factors, such as functional risk, financial risk, or price (de Haan et al. 2018). Functional risk refers to the concern that the product or service may fall short of performance expectations. Financial risk refers to a possible loss of money, independent of purchase price, due to a poor decision (e.g., stock performance) (de Haan et al. 2018). Additionally, as the purchase price increases, information about the quality of products or services becomes more important for managing perceived risk (Wu and Wang 2005). Overall, we predict that when customers feel greater uncertainty, they develop heightened expectations that an avatar with a realistic, anthropomorphic appearance will also have a comparable level of behavioral realism, because they rely on the avatar’s informativeness and ability to provide personalized advice to reduce the perceived risks associated with the purchase.

Proposition 5: The positive effect of an avatar’s form realism on customers’ behavioral realism expectations is stronger when (a) functional risk is higher, (b) financial risk is higher, and (c) the product is more expensive.
We previously posited that when form realism exceeds behavioral realism, the avatar can have both positive (via customers’ affective responses) and negative (via customers’ cognitive and social responses) effects on performance outcomes. These mediated effects also might be moderated by perceived uncertainty. When perceived uncertainty is high, the avatar’s entertainment value may become less salient to the consumer, because an entertaining avatar cannot overcome the perceived risks associated with the purchase. The negative disconfirmation induced by the avatar’s lack of behavioral competence should become more salient and detrimental to consumers’ cognitive and social responses, resulting in weaker performance outcomes. For example, whereas the realistic, anthropomorphic appearance of HSBC’s Amy seems effective because she was programmed to provide basic information in response to routine customer questions that do not involve high-risk transactions, the very realistic appearance of Nordnet’s Amelia could not compensate for her lack of competence to offer stock advice (i.e., high financial risk transactions). Thus, we expect the following moderation effects:

Proposition 6: When an avatar’s form realism exceeds its behavioral realism, its positive effect on customers’ affective responses is weakened if (a) functional risk is higher, (b) financial risk is higher, and (c) the product is more expensive.

Proposition 7: When an avatar’s form realism exceeds its behavioral realism, its negative effects on customers’ cognitive and social responses are strengthened if (a) functional risk is higher, (b) financial risk is higher, and (c) the product is more expensive.
Conversely, when an avatar’s behavioral realism exceeds its form realism, it may have a stronger positive effect on customer performance outcomes under high perceived uncertainty. This is because the avatar’s behavioral competence, exceeding the expectations set by its form realism, may induce a positive disconfirmation, which can reassure customers of the usefulness of the information and personalized service provided by the avatar, thereby boosting customers’ confidence in their risky purchase decisions. Another pertinent risk in today’s world is that of privacy violations (de Haan et al. 2018). If an avatar’s behavioral realism exceeds its form realism, the avatar provides reassurance that customers are dealing with an intelligent, non-human entity that will not judge them, which should mitigate social unease and embarrassment. The PTSD therapist Ellie works with highly private matters, involving patients disclosing emotionally and psychologically sensitive information. Such situations might evoke greater concerns if patients cannot tell whether they are dealing with a real human therapist behind the screen.

Proposition 8: When an avatar’s behavioral realism exceeds its form realism, its positive effects on customers’ cognitive and social responses are stronger if (a) functional risk is higher, (b) financial risk is higher, (c) the product is more expensive, and (d) privacy risk is higher.
As avatars are also increasingly used in mobile apps (e.g., ING's Inge, Progressive's Flo, Bank ABC's Fatema), the choice of mobile or fixed devices as platforms on which the firm provides an avatar also may be pertinent. Compared with fixed devices (e.g., desktops), mobile devices are particularly entertaining, because hedonic technologies (e.g., video games and MP3 players) have made their way into these portable devices (Xu, Abdinnour, and Chaparro 2017). Moreover, consumers spend more time in online communities when they use mobile rather than fixed devices (Melumad, Inman, and Pham 2019), suggesting that mobile devices are the preferred channel for online social experiences. Thus, avatars that are entertaining and capable of establishing personalized social connections with customers may be especially effective on mobile devices.
Proposition 9: As form realism increases, relative to behavioral realism, the use of mobile devices (compared with fixed devices) will strengthen the positive effect of avatars on customers' affective responses.

Proposition 10: As behavioral realism increases, relative to form realism, the use of mobile devices (compared with fixed devices) will strengthen the positive effect of avatars on customers' social responses.
Finally, the effects of customers' cognitive, affective, and social responses on performance outcomes may depend on the consumer relationship phase, which refers to the relational trajectory of a customer–seller exchange (Dwyer, Schurr, and Oh 1987; Palmatier et al. 2013). During consumer–avatar interactions, three relationship phases are particularly important: exploration, build-up, and maturity (Jap and Ganesan 2000). During the exploration phase, a consumer is primarily concerned with the potential value and benefits of dealing with the seller, so the perceived informativeness of the avatar becomes especially critical. During this initial stage, entertainment or social engagement provided by the avatar might even detract from the consumer's task objectives and compromise customer outcomes. In this stage, avatar design should prioritize behavioral competence, so that the avatar can provide accurate, task-specific, customized information, rather than emphasize a highly attractive or socially engaging appearance. Such an approach should produce a positive disconfirmation of customers' cognitive experience. As the relationship proceeds to the build-up phase, the consumer has experienced some benefits from interactions with the firm's avatars, so socialization processes (e.g., building rapport) become more important. Engaging and fulfilling social experiences can help customers develop long-term commitment to the firm, and a positive disconfirmation about the avatar's ability to deepen social bonds might prove especially effective. Finally, during the maturity phase, a consumer satisfied with the cognitive and social benefits of interacting with an avatar may focus less on these factors and instead seek more entertainment value, prioritizing affective responses.
Proposition 11: The effects of customers' cognitive, affective, and social responses on performance outcomes are moderated by the relationship phase, such that (a) the effects of cognitive responses are amplified in the exploration phase but suppressed in the build-up and maturity phases, (b) the effects of affective responses are enhanced in the maturity phase but suppressed in the exploration and build-up phases, and (c) the effects of social responses are strengthened in the build-up phase but weakened in the exploration and maturity phases.
Integrated Framework of Avatar Performance

To synthesize these insights, we offer an integrated framework of avatar performance (Figure 3). This framework provides a visual summary of key insights from our review of extant research and business practices, as well as from our development of the avatar taxonomy and propositions. With this framework, we aim to advance theory in this emerging, contemporary marketing area by integrating three mediation mechanisms (customers' cognitive, affective, and social responses) with theory-driven moderators, supporting both theory testing and managerial application.

---Insert Figure 3 about here---
Managerial Implications and Research Directions for Avatar-Based Marketing

Our integrative analyses of academic research and business practices generate practical implications and research directions for avatar-based marketing, which we group into five key managerially relevant areas: (1) when to deploy avatars, (2) avatar form realism, (3) avatar behavioral realism, (4) form–behavioral realism alignment, and (5) avatar contingency effects (Table 3).

---Insert Table 3 about here---
In terms of when and where they should be used, avatars seem to be most effective in service-oriented industries (e.g., financial, travel, telecom services), in which the sheer scale of service requirements and customer inquiries can easily overwhelm a company's employees. Avatars can free up employees' time, so that the employees can focus on complex customer needs and offer more value-added services, with greater productivity. Avatars can provide consistent, personalized service and help build emotional connections between the firm and its customers (Corner 2018; Kannan and Bernoff 2019). For companies that serve a large portfolio of customers, avatars can also make it feasible to launch a segmented, multichannel strategy. By offering customers opportunities to engage with avatars through different channels (e.g., social media, company websites, dedicated apps), the firm ensures that it meets each customer's unique needs, at the right time and in the right place.
After confirming that an avatar should be implemented, the firm should determine the design of the avatar's appearance, with the clear recognition that its form realism is a double-edged sword. On the one hand, a more realistic, anthropomorphic appearance appeals to consumers, because it offers greater entertainment value (Parboteeah, Valacich, and Wells 2009). On the other hand, it elevates customers' expectations of the avatar's behavioral competence, which is much more difficult and costly to develop. If the avatar's behavioral competence falls short of customers' expectations, they experience a negative disconfirmation, which can decrease their satisfaction. To avoid negative performance outcomes, an avatar's anthropomorphic appearance should not exceed the level of its behavioral competence.
Having decided on the avatar's appearance, companies can design the avatar's behavioral competence. If companies lack the resources to develop high form–high behavioral realism avatars (i.e., digital human avatars), they should allocate more resources to the avatar's behavioral intelligence than to its appearance. A positive disconfirmation of the avatar's cognitive and social competence likely produces an above-average level of customer satisfaction or even customer delight (Finn 2012), which can increase firm performance.
Managers should account for form–behavioral realism alignment too. If an avatar has high levels of both form and behavioral realism, customers' high initial expectations about the avatar's behavioral performance will be confirmed. Because customer satisfaction is an additive function of (1) initial expectations of the avatar's behavioral competence and (2) subsequent confirmation or disconfirmation of this expectation (Oliver 1980), high form–high behavioral realism alignment will likely produce high levels of affective, cognitive, and social responses in consumers, as well as better outcomes overall. Additional research is needed to determine the "zone of tolerance" and the precision required when aligning the form and behavioral realism of avatars.
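Oliver's (1980) additive logic can be written compactly as follows; this formalization is a stylized sketch for exposition, with the notation and linear form chosen by us rather than taken from the original model:

$$ S = \beta_1 E + \beta_2 (P - E), \qquad \beta_1, \beta_2 > 0, $$

where $S$ denotes satisfaction with the avatar, $E$ the customer's initial expectation of behavioral competence (raised by form realism), $P$ the perceived behavioral performance, and $(P - E)$ the (dis)confirmation term. Under high form–high behavioral realism, both $E$ and $P$ are high, so both components contribute positively to $S$; under high form–low behavioral realism, the elevated $E$ inflates the first term but drives the second term negative, so the net effect on satisfaction can be small or even adverse.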
Finally, in addition to considering uncertainty factors and media channel choice, design efforts should take the customer relationship phase into account, because the relative effects of customers' cognitive, affective, and social responses differ across relationship stages. For example, during the exploration phase, a positive confirmation regarding the avatar's behavioral realism can ensure good cognitive experiences (e.g., cognitive trust), but during the maturity phase, a more entertaining avatar (e.g., funny, attractive appearance) may prove more effective for sustaining the relationship. Future research that takes a lifecycle approach to avatar design and use could determine avatar effectiveness at each stage and the strategies needed to adapt to customers' dynamic needs.
Research Directions

Our analysis of avatar design strategies indicates some promising research opportunities. First, the propositions derived from our conceptual framework provide opportunities for empirical research and can be tested using different methods. For example, researchers might collaborate with an avatar design company to manipulate form and behavioral realism in a 2 × 2 full-factorial experiment, consistent with our taxonomy in Figure 2. These results would provide evidence of the effects of form realism on expectations for behavioral realism (P1), as well as of the (dis)confirmations induced by any (mis)alignment (P2). Moderation tests also might be carried out with lab experiments that allow for manipulations of uncertainty factors (e.g., risk, price, privacy) or channels (e.g., mobile app vs. desktop) (P5–P10). To test the mediation effects, researchers might use a difference-in-differences field experiment to examine the aggregate main effect. Using a low form–low behavioral realism avatar as a baseline (control group), researchers could increase form (treatment 1 group) and behavioral (treatment 2 group) realism. Any significant differences in daily sales between the treatment groups and the control group, between the pre- and posttreatment periods, would indicate the external validity of the asymmetric effects of form–behavioral realism misalignment (P3, P4). To confirm the distinct mediation effects, customers also could be surveyed regarding their cognitive, affective, and social responses shortly after the treatments, followed by tests of the effects on performance outcomes (e.g., purchase intentions or word of mouth). Alternatively, a lab experiment could be used for differential mediation tests. To test the relationship phase moderation (P11), a cross-sectional survey posted on a retail website that uses an avatar might enable researchers to conduct subgroup analyses or moderated regressions to detect differential effects of mediators across relational phases. A more demanding and rigorous approach would secure panel data from a collaborating retailer that uses avatars and agrees to let the researchers track customers' relational trajectories through longitudinal surveys, while also granting them access to objective customer sales data.
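As an illustration, the difference-in-differences comparison described above can be sketched in a few lines of code; the group labels, sample sizes, and simulated daily sales figures here are hypothetical, chosen only to show the computation:

```python
import numpy as np

rng = np.random.default_rng(0)

def did_estimate(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Difference-in-differences: the pre-to-post change in the treatment
    group, net of the pre-to-post change in the control group."""
    return (post_treat.mean() - pre_treat.mean()) - (post_ctrl.mean() - pre_ctrl.mean())

# Hypothetical daily sales (units) for 60 pre- and 60 post-treatment days.
# Control keeps the low form-low behavioral realism avatar; treatment 1
# raises form realism; treatment 2 raises behavioral realism.
ctrl_pre, ctrl_post = rng.normal(100, 10, 60), rng.normal(102, 10, 60)  # seasonal drift only
t1_pre, t1_post = rng.normal(100, 10, 60), rng.normal(105, 10, 60)      # form-realism lift
t2_pre, t2_post = rng.normal(100, 10, 60), rng.normal(112, 10, 60)      # behavioral-realism lift

lift_form = did_estimate(t1_pre, t1_post, ctrl_pre, ctrl_post)
lift_behav = did_estimate(t2_pre, t2_post, ctrl_pre, ctrl_post)
print(f"Form-realism DiD lift:       {lift_form:.1f} units/day")
print(f"Behavioral-realism DiD lift: {lift_behav:.1f} units/day")
```

In an actual field experiment, the simulated arrays would be replaced by observed daily sales, and the point estimates would be accompanied by standard errors (e.g., from a regression of sales on group, period, and their interaction).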
Research opportunities also exist in areas that our framework does not cover. Avatars are designed to enhance the productivity of company employees, rather than replace them altogether, so continued research might investigate how to optimize avatar–human collaborations, especially if problems arise. Various approaches allow customers to switch to a human representative when necessary, using avatar-initiated, employee-initiated, or customer-initiated "exit ramps." These approaches can differ in their nature (proactive vs. reactive) and timing (early vs. late in the interaction). Thus, determining how and when human intervention gets introduced could provide significant insights for achieving service recovery and ensuring customers' overall satisfaction with and commitment to the firm. In this article, we have highlighted the influence of an avatar's form realism on the overall customer experience. Additional research should uncover which specific dimensions of the avatar's anthropomorphic appearance exert the strongest impacts on customers' behavioral realism expectations, which form realism elements create the most entertaining avatar experience, or when a digital assistant without an anthropomorphic appearance (e.g., Amazon's Alexa) would perform better than an avatar. For example, if the avatar's form realism exceeds its behavioral realism, which subset of behavioral elements is most likely to lead to customers' expectation disconfirmation? What corrective actions would be effective in addressing negative disconfirmations? Research also could delve deeper into the effects of avatars' emotional intelligence, relative to their cognitive intelligence, in shaping customers' expectancy (dis)confirmations and overall experience.
Another type of avatar that is growing in popularity is the customer's self-avatar, created with virtual model technology (Smith, Johnston, and Howard 2011). To inform research into these applications, the typology we developed would need to be modified to reflect the unique characteristics of self-avatars (e.g., resemblance to self). Some designer brands (e.g., Gucci) have successfully engaged customers to dress their self-avatars in branded products, then share them on social media (Carson 2019). The limited research on self-avatars has focused primarily on enclosed online environments (e.g., a retailer's website), not social media (Cho and Schwarz 2012; Fiore, Kim, and Lee 2005), indicating the pressing need for insights into how, why, and when self-avatars could perform in social media marketing campaigns.
Brands also have turned to virtual influencers (3D, computer-generated personalities) instead of, or in addition to, human influencers for online marketing campaigns. Powered by advanced AI, these avatars can attract significant followings on social media platforms. For example, with almost 3 million Instagram followers, Lil Miquela has endorsed brands such as Prada and Calvin Klein (Abad 2019; Bezamat 2018). This avatar actively replies to social media comments, appears in publications like Vogue, and even participates in live media interviews (Chichioco 2019). While virtual influencers offer brands unique benefits, such as content control and versatility, they also create potential risks. For example, 61% of consumers assert that authentic, relatable content is the primary appeal of human influencers (Penny 2019), but only 15% of followers of virtual influencers describe them as credible (Chowdhary 2019). Moreover, because the virtual influencer avatar is not a human, the brand it endorses ultimately is held responsible for its actions. Academic research has yet to investigate the unique benefits, risks, and operational mechanisms associated with avatar-based virtual influencer marketing.
Future research might also explore avatar-based targeting strategies. Demographics, psychographics, and benefits sought are widely used customer segmentation bases (Tynan and Drayton 1987); they also might be used to predict which customers will be best served by a given type of avatar. For example, a customer's demographic traits might interact with the avatar's demographic attributes or behavioral elements to influence the customer's cognitive, affective, and social responses to the online experience. Creating a personality or decision-making style for an avatar that matches those of the customer might be an effective, psychographics-based design strategy (Al-Natour, Benbasat, and Cenfetelli 2011). Segmenting markets on the basis of in-depth analyses of the motives that lead certain people to interact with avatars could also inform benefits-based avatar deployment strategies (Brandtzaeg and Følstad 2018). Distinct mediation mechanisms could be uncovered across different customer segments to inform idiosyncratic avatar designs. Research is also needed to determine when avatars may distract from, rather than contribute to, the customer's experience, as well as to find strategies to address these challenges.
Finally, to extend beyond our focus on the relative effects of the different types of avatars established by our taxonomy, future research might compare the effects of avatars with the impacts of other digital representations, such as emoji, anthropomorphized products, brand mascots, or voice-only digital assistants. Understanding when and how avatars, versus these alternative marketing tools, are more effective in influencing online shopping experiences and performance outcomes demands further scientific inquiry.
Conclusion

Rapid increases in the use of avatars have been fueled by two main factors: advances in digital technologies and increasing reliance on online experiences among both consumers and firms. The use of avatars is projected to grow by 35% annually (Globe Newswire 2019). However, the effectiveness of avatars continues to be uncertain, so we offer an integrated theoretical framework to establish definitional and conceptual clarity, synthesize academic research and business practices, and offer insights and propositions that provide managerial implications and an agenda for future research. The proposed definition of avatars, as digital entities with anthropomorphic appearance, controlled by a human or software, with an ability to interact, helps us establish a design typology, which in turn gives academics and managers insights into how to isolate the elements that make avatars more or less effective for specific goals.
We synthesize the academic literature and business practices by offering a 2 × 2 form realism–behavioral realism taxonomy, which in turn enables us to derive propositions regarding the effectiveness of avatars in marketing. The level of alignment between an avatar's form realism and behavioral realism, across several contingencies, can provide a parsimonious explanation of when an avatar will be most effective. With insights gained from our investigation of fundamental avatar elements, extant research, and business practices, we develop an integrative framework of avatar performance that offers theoretical insights, research propositions, managerial implications, and future research directions.
References

Abad, Mario (2019), "People Are Slamming Bella Hadid Kissing a Female Influencer as 'Queer-Baiting' in Calvin Klein Ad," (accessed October 31, 2019), www.yahoo.com/lifestyle/people-slamming-bella-hadid-kissing-205745785.html.
Afifi, Walid A., and Judee K. Burgoon (2000), "The Impact of Violations on Uncertainty and the Consequences for Attractiveness," Human Communication Research, 26(2), 203-233.
Ahn, Sun Joo, Jesse Fox, and Jeremy N. Bailenson (2012), "Avatars," in Leadership in Science and Technology: A Reference Handbook, William Sims Bainbridge, ed. Thousand Oaks, CA: Sage Publications, 695-702.
Al-Natour, Sameh, Izak Benbasat, and Ron Cenfetelli (2011), "The Adoption of Online Shopping Assistants: Perceived Similarity as an Antecedent to Evaluative Beliefs," Journal of the Association for Information Systems, 12(5), 347-374.
Bailenson, Jeremy N., Nick Yee, Jim Blascovich, and Rosanna E. Guadagno (2008), "Transformed Social Interaction in Mediated Interpersonal Communication," in Mediated Interpersonal Communication, Elly A. Konijn, Sonja Utz, Martin Tanis, and Susan B. Barnes, eds. New York: Routledge, 77-99.
Bezamat, Bia (2018), "Prada Enlists Computer-Generated Influencer to Promote Fall 18 Show," Current Daily (accessed October 31, 2019), https://thecurrentdaily.com/2018/02/27/prada-enlists-computer-generated-influencer-ss18-show/.
Bickmore, Timothy W., Dina Utami, Robin Matsuyama, and Michael K. Paasche-Orlow (2016), "Improving Access to Online Health Information with Conversational Agents: A Randomized Controlled Experiment," Journal of Medical Internet Research, 18(1), e1.
Bickmore, Timothy W., Laura M. Pfeifer, and Brian W. Jack (2009), "Taking the Time to Care: Empowering Low Health Literacy Hospital Patients with Virtual Nurse Agents," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York: Association for Computing Machinery, 1265-1274.
Blascovich, Jim, Jack Loomis, Andrew C. Beall, Kimberly R. Swinth, Crystal L. Hoyt, and Jeremy N. Bailenson (2002), "Immersive Virtual Environment Technology as a Methodological Tool for Social Psychology," Psychological Inquiry, 13(2), 103-124.
Bradbury, Danny (2018), "The Changing Faces of AI," (accessed October 31, 2019), https://workflow.servicenow.com/customer-experience/build-ai-avatar/.
Brandtzaeg, Petter Bae, and Asbjørn Følstad (2018), "Chatbots: Changing User Needs and Motivations," Interactions, 25(5), 38-43.
Brave, Scott, Clifford Nass, and Kevin Hutchinson (2005), "Computers that Care: Investigating the Effects of Orientation of Emotion Exhibited by an Embodied Computer Agent," International Journal of Human-Computer Studies, 62, 161-187.
Briggs, Bill (2018), "Guess Who Wants to Talk! How Flo and Her Fellow Chatbots Engage Customers," (accessed October 31, 2019), https://news.microsoft.com/transform/flo-rise-ai-chatbots-progressive-sabre-ups/.
Brunsman, Barrett J. (2019), "P&G Introduces Virtual Ambassador 'Obsessed' over Skin Care (Video)," (accessed October 31, 2019), https://www.bizjournals.com/cincinnati/news/2019/06/26/p-g-introduces-virtualambassador-obsessed-over.html.
Burden, David, and Maggi Savin-Baden (2019), Virtual Humans: Today and Tomorrow. New York: CRC Press.
Carson, Biz (2019), "Billionaires Jim Breyer and Thomas Tull Lead $15 Million Bet that Genies' Avatars Will be Next Big Thing in Social," Forbes (accessed October 31, 2019), www.forbes.com/sites/bizcarson/2019/06/11/jim-breyer-thomas-tull-genies-funding/#7468ec1b600a.
Chattaraman, Veena, Wi-Suk Kwon, and Juan E. Gilbert (2012), "Virtual Agents in Retail Web Sites: Benefits of Simulated Social Interaction for Older Users," Computers in Human Behavior, 28(6), 2055-2066.
Chattaraman, Veena, Wi-Suk Kwon, Juan E. Gilbert, and Kassandra Ross (2019), "Should AI-Based, Conversational Digital Assistants Employ Social- or Task-Oriented Interaction Style? A Task-Competency and Reciprocity Perspective for Older Adults," Computers in Human Behavior, 90, 315-330.
Chichioco, Aaron (2019), "Virtual Influencers: The Significance of Influencer Chatbots to Your Brand Strategy," (accessed October 31, 2019), https://chatbotsmagazine.com/virtual-influencers-the-significance-of-influencer-chatbots-to-your-brand-strategy-f6206c48adea.
Cho, Hyejeung and Norbert Schwarz (2012), "I Like Your Product When I Like My Photo: Misattribution Using Interactive Virtual Mirrors," Journal of Interactive Marketing, 26(4), 235-43.
Chowdhary, Mukta (2019), "How the Humans Behind CGI Influencers Need to Adapt to Consumer Needs, Lil Miquela Isn't Making as Good of an Impression as Real People," Adweek (accessed October 31, 2019), www.adweek.com/digital/how-the-humans-behind-cgi-influencers-need-to-adapt-to-consumer-needs/.
Corner, Stuart (2018), "Vodafone to Deploy Digital Humans for Customer Service," (accessed October 2, 2019), www.computerworld.com/article/3478861/vodafone-to-deploy-digital-humans-for-customer-service.html.
Davis, Fred D., Richard P. Bagozzi, and Paul R. Warshaw (1992), "Extrinsic and Intrinsic Motivation to Use Computers in the Workplace," Journal of Applied Social Psychology, 22(14), 1111-1132.
De Haan, Evert, P.K. Kannan, Peter C. Verhoef, and Thorsten Wiesel (2018), "Device Switching in Online Purchasing: Examining the Strategic Contingencies," Journal of Marketing, 82(5), 1-19.
D'Mello, Sidney K., Art Graesser, and Brandon King (2010), "Toward Spoken Human–Computer Tutorial Dialogues," Human-Computer Interaction, 25(4), 289-323.
Derrick, Douglas C., and Gina Scott Ligon (2014), "The Affective Outcomes of Using Influence Tactics in Embodied Conversational Agents," Computers in Human Behavior, 33, 39-48.
Dick, Alan, Dipankar Chakravarti, and Gabriel Biehal (1990), "Memory-Based Inferences During Consumer Choice," Journal of Consumer Research, 17 (June), 82-93.
Dormehl, Luke (2018), "Microsoft's Friendly Xiaoice A.I Can Figure Out What You Want—Before You Ask," (accessed November 18, 2019), www.digitaltrends.com/cool-tech/xiaoice-microsoft-future-of-ai-assistants/.
Dwyer, F. Robert, Paul H. Schurr, and Sejo Oh (1987), "Developing Buyer-Seller Relationships," Journal of Marketing, 51 (April), 11-27.
Epley, Nicholas, Adam Waytz, and John T. Cacioppo (2007), "On Seeing Human: A Three-Factor Theory of Anthropomorphism," Psychological Review, 114(4), 864-886.
Etemad-Sajadi, Reza (2014), "The Influence of a Virtual Agent on Web-Users' Desire to Visit the Company," International Journal of Quality & Reliability Management, 31(4), 419-434.
Etemad-Sajadi, Reza (2016), "The Impact of Online Real-Time Interactivity on Patronage Intention: The Use of Avatars," Computers in Human Behavior, 61, 227-232.
Evangelidis, Ioannis and Stijn M. J. Van Osselaer (2018), "Points of (Dis)parity: Expectation Disconfirmation from Common Attributes in Consumer Choice," Journal of Marketing Research, 55 (February), 1-13.
Finn, Adam (2012), "Customer Delight: Distinct Construct or Zone of Nonlinear Response to Customer Satisfaction?" Journal of Service Research, 15(1), 99-110.
Fiore, Ann Marie, Jihyun Kim, and Hyun-Hwa Lee (2005), "Effect of Image Interactivity Technology on Consumer Responses toward the Online Retailer," Journal of Interactive Marketing, 19(3), 38-53.
Fox, Jesse, Sun Joo Ahn, Joris H. Janssen, Leo Yeykelis, Kathryn Y. Segovia, and Jeremy N. Bailenson (2015), "Avatars versus Agents: A Meta-Analysis Quantifying the Effect of Agency on Social Influence," Human-Computer Interaction, 30(5), 401-432.
Frank, Aaron (2019), "The Rise of a New Generation of AI Avatars," (accessed January 15, 2020), https://singularityhub.com/2019/01/15/the-rise-of-a-new-generation-of-ai-avatars/.
Freeman, C. and I. Beaver (2018), "The Effect of Response Complexity and Media on User Restatement with Multimodal Virtual Assistants," International Journal of Human-Computer Studies, 119, 12-27.
3 Garau, Maia, Mel Slater, Vinoba Vinayagamoorthy, Andrea Brogni, Anthony Steed, and M. Angela Sasse
4 (2003), “The Impact of Avatar Realism and Eye Gaze Control on Perceived Quality of
5 Communication in a Shared Immersive Virtual Environment,” Proceedings of the SIGCHI Conference
6
on Human Factors in Computing Systems. New York: Association for Computing Machinery, 529-
7
536.
8
9
Garnier, Marion and Ingrid Poncin (2013), “The Avatar in Marketing: Synthesis, Integrative Framework
10 and Perspectives,” Recherche et Applications en Marketing (English Edition), 28(1), 85-115.
11 Globe Newswire (2019), “Chatbot Market-Growth, Trends, and Forecast (2019-2024),” Research Report
12 No. 4622740, ResearchAndMarkets.com.
13 Go, Eun, and S. Shyam Sundar (2019), “Humanizing Chatbots: The Effects of Visual, Identity and
14 Conversational Cues on Humanness Perceptions,” Computers in Human Behavior, 97, 304-316.
15 Gonzalez, Robbie (2017), “Virtual Therapists Help Veteran Open up about PTSD,” Wired (accessed
16 October 17, 2019), www.wired.com/story/virtual-therapists-help-veterans-open-up-about-ptsd/.
17 Ho, Annabell, Jeff Hancock, and Adam S. Miner (2018), “Psychological, Relational, and Emotional
Pe
18 Effects of Self-Disclosure after Conversations with A Chatbot,” Journal of Communication, 68(4),
19 712-733.
20 Holzwarth, Martin, Chris Janiszewski, and Marcus M. Neumann (2006), “The Influence of Avatars on
e

21 Online Consumer Shopping Behavior,” Journal of Marketing, 70(4), 19-36.


rR
22 Ipsoft (2017), “Amelia in Action,” www.ipsoft.com/wp-content/uploads/2016/11/ Amelia_In_
23 Action_Updated_PDF.pdf
24 Jap, Sandy D. and Shankar Ganesan (2000), “Control Mechanisms and the Relationship Life Cycle:
25
ev

Implications for Safeguarding Specific Investments and Developing Commitment," Journal of Marketing Research, 37 (May), 227-245.
Jin, Seung-A. Annie (2009), "The Roles of Modality Richness and Involvement in Shopping Behavior in 3D Virtual Stores," Journal of Interactive Marketing, 23 (3), 234-246.
Kahn, Jeremy (2018), "Meet 'Millie' the Avatar. She'd Like to Sell You a Pair of Sunglasses," Bloomberg (accessed December 15, 2019), www.bloomberg.com/news/articles/2018-12-15/meet-millie-the-avatar-she-d-like-to-sell-you-a-pair-of-sunglasses.
Kang, Sin-Hwa and James H. Watt (2013), "The Impact of Avatar Realism and Anonymity on Effective Communication Via Mobile Devices," Computers in Human Behavior, 29, 1169-1181.
Kang, Sin-Hwa, James H. Watt, and Sasi Kanth Ala (2008), "Communicators' Perceptions of Social Presence as a Function of Avatar Realism in Small Display Mobile Communication Devices," in Proceedings of the 41st Annual Hawaii International Conference on System Sciences (HICSS 2008). Washington, DC: IEEE Computer Society, 147-156.
Kannan, P.V. and Josh Bernoff (2019), "Does Your Company Really Need a Chatbot?" (accessed May 21, 2020), https://hbr.org/2019/05/does-your-company-really-need-a-chatbot.
Keeling, Kathleen, Peter McGoldrick, and Susan Beatty (2010), "Avatars as Salespeople: Communication Style, Trust, and Intentions," Journal of Business Research, 63, 793-800.
Kilens, Mark (2019), "2019 State of Conversational Marketing [Free Report]," (accessed July 16, 2020), www.drift.com/blog/state-of-conversational-marketing/.
Kim, Kyoung-Min, Jin-Hyuk Hong, and Sung-Bae Cho (2007), "A Semantic Bayesian Network Approach to Retrieving Information with Intelligent Conversational Agents," Information Processing & Management, 43, 225-236.
Kim, Sara, Rocky Peng Chen, and Ke Zhang (2016), "Anthropomorphized Helpers Undermine Autonomy and Enjoyment in Computer Games," Journal of Consumer Research, 43 (2), 282-302.
Kim, Youjeong and S. Shyam Sundar (2012), "Anthropomorphism of Computers: Is It Mindful or Mindless?" Computers in Human Behavior, 28 (1), 241-250.
Köhler, Clemens F., Andrew J. Rohm, Ko de Ruyter, and Martin Wetzels (2011), "Return on Interactivity: The Impact of Online Agents on Newcomer Adjustment," Journal of Marketing, 75 (2), 93-108.
Lee, Seo Young and Junho Choi (2017), "Enhancing User Experience with Conversational Agent for Movie Recommendation: Effects of Self-Disclosure and Reciprocity," International Journal of Human-Computer Studies, 103, 95-105.
Liew, Tze Wei, Su-Mae Tan, and Hishamuddin Ismail (2017), "Exploring the Effects of a Non-Interactive Talking Avatar on Social Presence, Credibility, Trust, and Patronage Intention in an E-Commerce Website," Human-Centric Computing and Information Sciences, 7 (1), 1-21.
Liu, Yuping and Lawrence J. Shrum (2002), "What Is Interactivity and Is It Always Such a Good Thing? Implications of Definition, Person, and Situation for the Influence of Interactivity on Advertising Effectiveness," Journal of Advertising, 31 (4), 53-64.
Llop, Cristina (2016), "Gina—LA's Online Traffic Avatar Radically Changes Customer Experiences," www.srln.org/node/1186/gina-las-online-traffic-avatar-radically-changes-customer-experience-news-2016.
Martin, Kelly D., Abhishek Borah, and Robert W. Palmatier (2017), "Data Privacy: Effects on Customer and Firm Performance," Journal of Marketing, 81 (1), 36-58.
McDuff, Daniel and Mary Czerwinski (2018), "Designing Emotionally Sentient Agents," Communications of the ACM, 61 (12), 74-83.
McGloin, Rory, Kristine L. Nowak, Stephen C. Stiffano, and Gretta M. Flynn (2009), "The Effect of Avatar Perception on Attributions of Source and Text Credibility," in Proceedings of ISPR 2009, International Society for Presence Research Annual Conference. Philadelphia: Temple University Press, 1-9.
Melumad, Shiri, J. Jeffrey Inman, and Michel Tuan Pham (2019), "Selectively Emotional: How Smartphone Use Changes User-Generated Content," Journal of Marketing Research, 56 (2), 259-275.
Mimoun, Mohammed Slim Ben and Ingrid Poncin (2015), "A Valued Agent: How ECAs Affect Website Customers' Satisfaction and Behaviors," Journal of Retailing and Consumer Services, 26, 70-82.
Moon, Youngme (2000), "Intimate Exchanges: Using Computers to Elicit Self-Disclosure from Consumers," Journal of Consumer Research, 26 (4), 323-339.
Nass, Clifford and Youngme Moon (2000), "Machines and Mindlessness: Social Responses to Computers," Journal of Social Issues, 56 (1), 81-103.
Nass, Clifford, Youngme Moon, Brian Jeffrey Fogg, Byron Reeves, and D. Christopher Dryer (1995), "Can Computer Personalities Be Human Personalities?" International Journal of Human-Computer Studies, 43 (2), 223-239.
Nass, Clifford and Corina Yen (2010), The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships. New York: Current.
Nowak, Kristine L. and Frank Biocca (2003), "The Effect of the Agency and Anthropomorphism on Users' Sense of Telepresence, Copresence, and Social Presence in Virtual Environments," Presence: Teleoperators & Virtual Environments, 12 (5), 481-494.
Nowak, Kristine L. and Jesse Fox (2018), "Avatars and Computer-Mediated Communication: A Review of the Definitions, Uses, and Effects of Digital Representations," Review of Communication Research, 6, 30-53.
Nowak, Kristine L. and Christian Rauh (2006), "The Influence of the Avatar on Online Perceptions of Anthropomorphism, Androgyny, Credibility, Homophily, and Attraction," Journal of Computer-Mediated Communication, 11, 153-178.
Nunamaker, Jay F., Douglas C. Derrick, Aaron C. Elkins, Judee K. Burgoon, and Mark W. Patton (2011), "Embodied Conversational Agent-Based Kiosk for Automated Interviewing," Journal of Management Information Systems, 28 (1), 17-48.
Oliver, Richard L. (1980), "A Cognitive Model of the Antecedents and Consequences of Satisfaction Decisions," Journal of Marketing Research, 17 (November), 460-469.
Palmatier, Robert W., Mark B. Houston, Rajiv P. Dant, and Dhruv Grewal (2013), "Relationship Velocity: Toward a Theory of Relationship Dynamics," Journal of Marketing, 77 (1), 13-30.
Parboteeah, D. Veena, Joseph S. Valacich, and John D. Wells (2009), "The Influence of Website Characteristics on a Consumer's Urge to Buy Impulsively," Information Systems Research, 20 (1), 60-78.
Peddie, Bryan (2018), "Are Virtual Tellers the Future of AI in the Banking Sector?" (accessed April 13, 2019), www.ncr.com/company/blogs/financial/are-virtual-tellers-the-future-of-ai-in-the-banking-sector.
Penny, Sarah (2019), "Virtual Influencers Might Be Easier to Mould but They're Not Necessarily a Safer Option," (accessed October 31, 2019), https://phvntom.com/virtual-influencers-might-be-easier-to-mould-but-theyre-not-necessarily-a-safer-option/.
Persky, Susan and Jim Blascovich (2007), "Immersive Virtual Environments Versus Traditional Platforms: Effects of Violent and Nonviolent Video Game Play," Media Psychology, 10 (1), 135-156.
Phaneuf, Alicia (2020), "7 Real Examples of Brands and Businesses Using Chatbots to Gain an Edge," Business Insider (accessed February 13, 2020), www.businessinsider.com/business-chatbot-examples.
Pütten, Astrid M. von der, Nicole C. Krämer, Jonathan Gratch, and Sin-Hwa Kang (2010), "'It Doesn't Matter What You Are!' Explaining Social Effects of Agents and Avatars," Computers in Human Behavior, 26 (6), 1641-1650.
Qiu, Lingyun and Izak Benbasat (2009), "Evaluating Anthropomorphic Product Recommendation Agents: A Social Relationship Perspective to Designing Information Systems," Journal of Management Information Systems, 25 (4), 145-182.
Reeves, Byron and Clifford Ivar Nass (1996), The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. New York: Cambridge University Press.
Robinson, Ann (2015), "Meet Ellie, the Machine That Can Detect Depression," (accessed December 9, 2020), www.theguardian.com/sustainable-business/2015/sep/17/ellie-machine-that-can-detect-depression.
Schuetzler, Ryan, Justin Scott Giboney, G. Mark Grimes, and Jay F. Nunamaker Jr. (2018), "The Influence of Conversational Agent Embodiment and Conversational Relevance on Socially Desirable Responding," Decision Support Systems, 114, 94-102.
Scott, David M. (2008), "Anna from IKEA Is Intellectually Challenged (but She Has a Sense of Humor)," (accessed August 1, 2019), www.davidmeermanscott.com/blog/2008/08/anna-from-ikea.html.
Sivaramakrishnan, Subramanian, Fang Wan, and Zaiyong Tang (2007), "Giving an 'E-Human Touch' to E-Tailing: The Moderating Roles of Static Information Quantity and Consumption Motive in the Effectiveness of an Anthropomorphic Information Agent," Journal of Interactive Marketing, 21 (1), 60-75.
Smith, Stephen P., Robert B. Johnston, and Steve Howard (2011), "Putting Yourself in the Picture: An Evaluation of Virtual Model Technology as an Online Shopping Tool," Information Systems Research, 22 (3), 640-659.
Stayman, Douglas M., Dana L. Alden, and Karen H. Smith (1992), "Some Effects of Schematic Processing on Consumer Expectations and Disconfirmation Judgements," Journal of Consumer Research, 19 (September), 240-255.
Sundar, S. Shyam (2008), "The MAIN Model: A Heuristic Approach to Understanding Technology Effects on Credibility," in Digital Media, Youth, and Credibility, Miriam J. Metzger and Andrew J. Flanagin, eds. Cambridge, MA: The MIT Press, 73-100.
Sweezey, Mathew (2019), "Consumer Preference for Chatbots Is Challenging Brands to Think 'Bot First'," Forbes (accessed October 30, 2019), www.forbes.com/sites/forbescommunicationscouncil/2019/08/16/consumer-preference-for-chatbots-is-challenging-brands-to-think-bot-first/#4407c60c10f8.
Torresin, Veronica (2019), "How Chatbots Improve User Experience in Online Banking," (accessed October 31, 2019), https://ergomania.eu/how-chatbots-improve-user-experience-in-online-banking/.
Touré-Tillery, Maferima and Ann L. McGill (2015), "Who or What to Believe: Trust and the Differential Persuasiveness of Human and Anthropomorphized Messengers," Journal of Marketing, 79 (4), 94-110.
Tynan, A. Caroline and Jennifer Drayton (1987), "Market Segmentation," Journal of Marketing Management, 2 (3), 301-335.
Verhagen, Tibert, Jaap van Nes, Frans Feldberg, and Willemijn van Dolen (2014), "Virtual Customer Service Agents: Using Social Presence and Personalization to Shape Online Service Encounters," Journal of Computer-Mediated Communication, 19 (3), 529-545.
Wang, Liz C., Julie Baker, Judy A. Wagner, and Kirk Wakefield (2007), "Can a Retail Web Site Be Social?" Journal of Marketing, 71 (3), 143-157.
Westerman, David, Ron Tamborini, and Nicholas David Bowman (2015), "The Effects of Static Avatars on Impression Formation Across Different Contexts on Social Networking Sites," Computers in Human Behavior, 53, 111-117.
White, Susan S. and Benjamin Schneider (2000), "Climbing the Commitment Ladder: The Role of Expectations Disconfirmation on Customers' Behavioral Intentions," Journal of Service Research, 2 (3), 240-253.
Wooler, Brodie (2019), "We Need to Chat About Chatbots," (accessed May 8, 2020), www.linkedin.com/pulse/we-need-chat-chatbots-brodie-wooler/.
Wu, Jen-Her and Shu-Ching Wang (2005), "What Drives Mobile Commerce? An Empirical Evaluation of the Revised Technology Acceptance Model," Information & Management, 42 (5), 719-729.
Xu, Jingjun D., Sue Abdinnour, and Barbara Chaparro (2017), "An Integrated Temporal Model of Belief and Attitude Change: An Empirical Test with the iPad," Journal of the Association for Information Systems, 18 (2), 113-140.
Yee, Nick, Jeremy N. Bailenson, and Kathryn Rickertsen (2007), "A Meta-Analysis of the Impact of the Inclusion and Realism of Human-Like Faces on User Experiences in Interfaces," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York: Association for Computing Machinery, 1-10.
Yokotani, Kenji, Gen Takagi, and Kobun Wakashima (2018), "Advantages of Virtual Agents over Clinical Psychologists During Comprehensive Mental Health Interviews Using a Mixed Methods Design," Computers in Human Behavior, 85, 135-145.
Yun, Chang, Zhigang Deng, and Merrill Hiscock (2009), "Can Local Avatars Satisfy a Global Audience? A Case Study of High-Fidelity 3D Facial Avatar Animation in Subject Identification and Emotion Perception by US and International Groups," Computers in Entertainment, 7 (2), 1-26.
Table 1: Avatar Definitional Elements in Empirical Research

Percentage of reviewed definitions containing each element: digital, 78%; anthropomorphic appearance, 70%; interactivity, 78%; controlling entity, 90%; avatar by our definition, 51%.

Ho, Hancock, and Miner (2018). Label: chatbot. Definition: "computer programs that can simulate human-human conversation" (p. 712). Definitional elements: digital, interactivity. Controlling entity: human. Avatar by our definition: no.

Holzwarth, Janiszewski, and Neumann (2006). Label: avatar. Definition: "general graphic representations that are personified by means of computer technology" (p. 20). Definitional elements: digital, anthropomorphic appearance, interactivity. Controlling entity: software. Avatar by our definition: yes.

Jin (2009). Label: avatar. Definition: "artificial, computer-animated representations of human interlocutors" or "pictorial representations of humans in a chat environment" (p. 234). Definitional elements: digital, anthropomorphic appearance. Controlling entity: software. Avatar by our definition: no.

Kang and Watt (2013). Label: avatar. Definition: "digital models of people that either look or behave like the people they represent" (p. 1170). Definitional elements: digital, anthropomorphic appearance, interactivity. Controlling entity: human. Avatar by our definition: yes.

Kim, Chen, and Zhang (2016). Label: anthropomorphized helper/digital assistant. Definition: entities that "are often imbued with humanlike features and characteristics" (p. 283). Definitional elements: digital, anthropomorphic appearance. Controlling entity: software. Avatar by our definition: no.

Köhler, Rohm, and de Ruyter (2011). Label: socialization agent/online agent. Definition: "computer mediated personas that possess the capability to involve customers in rich interactive conversations, rather than discrete, basic exchanges, and that have the ability to apply past interaction content to current interactions" (p. 96). Definitional elements: interactivity. Controlling entity: software. Avatar by our definition: no.

Nunamaker et al. (2011). Label: embodied conversational agent. Definition: "virtual, three-dimensional human likenesses that are displayed on computer screens...and interact with people through natural speech" (p. 21). Definitional elements: digital, anthropomorphic appearance, interactivity. Controlling entity: software. Avatar by our definition: yes.

Schuetzler et al. (2018). Label: conversational agent. Definition: "systems that mimic human-to-human communication using natural language processing, machine learning, and/or artificial intelligence" (p. 94). Definitional elements: digital, anthropomorphic appearance, interactivity. Controlling entity: software. Avatar by our definition: yes.

Sivaramakrishnan, Wan, and Tang (2007). Label: anthropomorphic information agent. Definition: "a humanlike chatbot that acts as an interactive online information provider" (p. 60). Definitional elements: digital, interactivity. Controlling entity: software. Avatar by our definition: no.

Touré-Tillery and McGill (2015). Label: anthropomorphized agent (partial human). Definition: "nonhuman entities that deliver message content across a variety of media (e.g., print, online, television), are typically imbued with various combinations of human characteristics, such as human form (e.g., human-like faces, arms, and legs), and the apparent ability to speak and think" (p. 94). Definitional elements: anthropomorphic appearance. Controlling entity: software. Avatar by our definition: no.

Verhagen et al. (2014). Label: virtual customer service agent. Definition: "computer-generated characters that are able to interact with customers and simulate behavior of human company representatives through artificial intelligence" (p. 530). Definitional elements: digital, anthropomorphic appearance, interactivity. Controlling entity: software. Avatar by our definition: yes.

Wang et al. (2007). Label: virtual character. Definition: "avatar with some type of combination of four online social cues: language, human voice, interactivity, and social role" (pp. 143-144). Definitional elements: digital, anthropomorphic appearance, interactivity. Controlling entity: software. Avatar by our definition: yes.

This research. Label: avatar. Definition: digital entities with anthropomorphic appearance, controlled by a human or software, that have an ability to interact. Definitional elements: digital, anthropomorphic appearance, interactivity. Controlling entity: software or human.
Table 2: Avatars in Empirical Research

Al-Natour, Benbasat, and Cenfetelli (2011). Label: automated shopping assistant. Context: online shopping for a laptop computer. Theoretical perspective: computers as social actors (CASA) framework; similarity-attraction hypothesis. Mediators: perceived decision process similarity. Moderators: none. Key findings: Perceived decision process similarity mediates the effect of perceived personality similarity on several beliefs (enjoyment, social presence, trust, ease of use, and usefulness).

Bickmore et al. (2016). Label: embodied conversational agent. Context: cancer patients identifying and learning about clinical trials on the internet. Theoretical perspective: not specified. Mediators/moderators: none. Key findings: Patients were more satisfied with the conversational agent compared to the conventional Web form-based interface, and patients with low health literacy had a higher success rate in finding relevant trials.

Brave, Nass, and Hutchinson (2005). Label: embodied computer agent. Context: casino-style blackjack game. Theoretical perspective: CASA framework. Mediators/moderators: none. Key findings: Empathic emotion of the agent leads to greater user-rated likeability, trustworthiness, perceived caring, and perceived support.

Chattaraman et al. (2019). Label: digital assistant. Context: online purchase of athletic shoes by older consumers. Theoretical perspective: social response theory. Mediators: trust in online store; information overload; perceived self-efficacy; ease of use; usefulness. Moderators: internet competency. Key findings: Users' internet competency interacts with the digital assistant's conversational style (social- vs. task-oriented) in affecting social, functional, and behavioral intention outcomes.

Chattaraman, Kwon, and Gilbert (2012). Label: virtual agent. Context: online purchase of apparel by older consumers. Theoretical perspective: social response theory; CASA framework. Mediators: perceived social support; trust in online store; perceived risk. Moderators: none. Key findings: Virtual agents can increase older users' patronage intentions by enhancing perceived social support and trust in the online store while reducing perceived risk.

Derrick and Ligon (2014). Label: embodied conversational agent. Context: an avatar-based screening checkpoint experiment. Theoretical perspective: CASA framework. Mediators: none. Moderators: gender. Key findings: Self-promoting agents are perceived as more powerful, more trustworthy, and more expert, whereas ingratiating agents are perceived as more attractive. Ingratiation impression management techniques are viewed less (more) favorably by females (males) than self-promotion techniques.

D'Mello, Graesser, and King (2010). Label: AutoTutor. Context: computer literacy learning with a fully automated computer tutor. Theoretical perspective: social agency theory. Mediators/moderators: none. Key findings: Students who interacted with the AutoTutor through a spoken dialogue used more cognitive resources and completed more problems than students who had to type.

Go and Sundar (2019). Label: chatbot. Context: online digital camera purchase. Theoretical perspective: CASA framework. Mediators: social presence; homophily. Moderators: anthropomorphic visual cues; agency. Key findings: The chatbot's message interactivity has positive effects on customers' attitude toward the website and return intention, mediated by perceived social presence and homophily; anthropomorphic visual cues and agency moderate these effects.

Holzwarth, Janiszewski, and Neumann (2006). Label: avatar. Context: online purchase of shoes that are customizable via online consultation. Theoretical perspective: social response theory. Mediators: entertainment value; information value; likeability of avatar; credibility of avatar. Moderators: product involvement. Key findings: Use of an avatar sales agent increases satisfaction with the retailer, attitude toward the product, and purchase intentions, mediated by perceived entertainment and information value; an attractive avatar is more effective at moderate levels of product involvement, mediated by likeability of the avatar, whereas an expert avatar is more effective at high levels of product involvement, mediated by credibility of the avatar.

Keeling, McGoldrick, and Beatty (2010). Label: avatar. Context: online experiments of retail websites selling books/CDs and travel insurance. Theoretical perspective: CASA framework. Mediators: trust perception. Moderators: goods/services high in credence vs. search qualities. Key findings: The avatar's social orientation and task orientation increase customers' trust perception, which subsequently has a positive effect on patronage intention. Effects of task (social)-oriented communications are stronger for search (credence) goods/services.

Kim, Hong, and Cho (2007). Label: intelligent conversational agent. Context: online electronic product (e.g., cellular phone) information search. Theoretical perspective: not specified. Mediators/moderators: none. Key findings: Agents capable of probabilistic inference and semantic information inference show superior performance in providing suitable responses to user inquiries with only a few interactions.

Lee and Choi (2017). Label: conversational agent. Context: interactive movie recommendation system. Theoretical perspective: CASA framework; media equation theory; uncertainty reduction theory. Mediators: intimacy; trust; interactional enjoyment. Moderators: none. Key findings: Self-disclosure and reciprocity of the conversational agent have positive impacts on user satisfaction and intentions to use, mediated by intimacy, trust, and interactional enjoyment.

Mimoun and Poncin (2015). Label: embodied conversational agent. Context: furniture purchase with Anna on IKEA's website. Theoretical perspective: technology acceptance model. Mediators: utilitarian value; hedonic value. Moderators: none. Key findings: Anna increased consumers' satisfaction and behavioral intentions through utilitarian and hedonic value.

Nunamaker et al. (2011). Label: embodied conversational agent. Context: automated kiosk-based interviews. Theoretical perspective: not specified. Mediators/moderators: none. Key findings: Male embodied agents are perceived as more powerful, more trustworthy, and more expert than female ones; the latter are more likeable though. Avatars with neutral expressions are perceived as more powerful, whereas smiling avatars are more likeable.

Pütten et al. (2010). Label: embodied conversational agent. Context: interactions involving personal questions. Theoretical perspective: threshold model of social influence; Ethopoeia concept. Mediators/moderators: none. Key findings: Beliefs about whether a participant is interacting with a human-controlled or a computer-controlled agent lead to almost no differences in the evaluation of the virtual character or its behavioral reactions; higher behavioral realism affected both.

Qiu and Benbasat (2009). Label: anthropomorphic interface agent. Context: online recommendation system for complex and attribute-intensive digital cameras. Theoretical perspective: CASA framework; social agency theory. Mediators: social presence. Moderators: none. Key findings: Humanoid appearance and human voice-based communication of avatars significantly increase participants' perceived social presence, which has a positive effect on trust, perceived usefulness, perceived enjoyment, and the decision to use the avatar as a decision aid.

Schuetzler et al. (2018). Label: conversational agent. Context: responses to sensitive questions to a person vs. a conversational agent vs. an online survey. Theoretical perspective: self-disclosure; social desirability; social presence theories. Mediators/moderators: none. Key findings: Conversational agents with better conversational abilities prompt more socially desirable responses from participants, with no significant effect found for embodiment.

Verhagen et al. (2014). Label: virtual customer service agent. Context: inquiries about online mobile phone service. Theoretical perspective: social response; implicit personality; primitive emotional contagion; social interaction theories. Mediators: social presence; personalization. Moderators: communication style (socially vs. task-oriented); anthropomorphism. Key findings: Friendliness and expertise have positive effects on participants' service encounter satisfaction, mediated by social presence and personalization; the effect of friendliness on personalization is stronger for socially oriented agents than for task-oriented agents, as is the effect of expertise on social presence.

Wang et al. (2007). Label: virtual character. Context: online travel information service. Theoretical perspective: social response theory; stimulus-organism-response framework; cognitive mediation theory. Mediators: arousal; pleasure; flow; hedonic and utilitarian values. Moderators: product involvement. Key findings: Social cues from interacting with the avatar increase perceptions of website socialness, feelings of arousal, pleasure, and flow, leading to greater hedonic and utilitarian values, which then increase patronage intentions. The effect of arousal on pleasure is stronger when product involvement is high; the influence of arousal on hedonic value is stronger for women. Flow does not lead to pleasure for older consumers, and pleasure has a much weaker impact on utilitarian value for those consumers.

Yokotani, Takagi, and Wakashima (2018). Label: virtual agent. Context: mental health interviews. Theoretical perspective: threshold model of social influence. Mediators/moderators: none. Key findings: Participants revealed more sex-related symptoms to the virtual agent than to a human expert, whereas they disclosed mood and anxiety symptoms more often to the human expert than to the virtual agent.
Table 3: Managerial Implications and Research Directions for Avatar-Based Marketing

Avatar deployment
Implications:
• Avatars can be used to humanize a brand with scalable, cost-effective, responsive (24/7), humanlike interactions.
• Avatars can be used when the scale of service requirements and customer inquiries overwhelms company employees (e.g., financial, travel, telecom services). The employees can use their freed-up time for more complex issues.
• Avatars can be used to enhance customer engagement and relationship building through emotional connection, personalization, and service consistency.
• Avatars can be used to offer multichannel flexibility based on segment preferences (e.g., mobile social media, company website, dedicated apps).
Directions for future research:
• How can avatar-human collaborations be optimized?
• When the customer encounters a problem with the avatar, which type of "exit ramp" to a human employee is most effective (avatar-initiated, employee-initiated, or customer-initiated), and when during an interaction should it be deployed?
• In the event of an avatar service failure, what service recovery strategies should be employed?
• To what extent can a successful/failed service recovery experience shape customers' (dis)confirmation of the avatar's effectiveness?

Avatar form realism
Implications:
• An avatar's anthropomorphic appearance is a double-edged sword. A more humanlike appearance appeals to consumers, due to enhanced entertainment value, but it also raises consumers' expectations of the avatar's behavioral competence.
• If behavioral realism falls short of expectations, a negative disconfirmation will be produced, resulting in lower levels of customers' cognitive and social responses to the avatar.
• An avatar's form realism should not exceed the level of its behavioral competence, to avoid unfavorable overall customer experiences.
Directions for future research:
• Which dimension of the avatar's anthropomorphic appearance (i.e., spatial dimension, movement, and other human characteristics) has the greatest impact on consumers' expectations for the avatar's behavioral realism?
• When might a digital assistant without a visual representation (e.g., Amazon's Alexa) outperform an avatar with an anthropomorphic appearance?
• Which avatar form realism elements create the most entertaining avatar experience?

Avatar behavioral realism
Implications:
• An avatar's behavioral competence is a more impactful design factor than its appearance. In case of a budget constraint, more resources should be allocated to improving avatar behavioral competence than to enhancing its visual appearance.
• The higher the avatar's behavioral competence relative to its appearance, the more favorable the customer's cognitive and social experiences, due to a positive disconfirmation.
• As a caveat, high levels of behavioral competence may produce negative effects (e.g., social desirability bias) when the avatar also has a very realistic anthropomorphic appearance.
Directions for future research:
• What is the role of avatar emotional intelligence, relative to its cognitive abilities, in shaping customers' expectations and overall experience?
• What corrective actions can be taken to redress a negative disconfirmation stemming from the avatar's behavioral realism?
• How might other types of avatars (e.g., customers' self-avatars) facilitate social media-based marketing campaigns?
• Which behavioral realism elements have the greatest impact on customers' expectations?

Form-behavior realism alignment
Implications:
• High form realism induces high behavioral realism expectations, which will then be confirmed.
• The additive nature of high expected behavioral realism and its subsequent confirmation produces a satisfactory customer experience with the avatar.
• Alignment of high form and high behavioral realism results in high levels of customers' affective, cognitive, and social experiences, which subsequently increase firm performance.
Directions for future research:
• What are the unique benefits, challenges, and risks associated with using high-realism avatars in brand campaigns?
• When would avatar virtual influencers (e.g., Lil Miquela) be more effective than human endorsers in brand promotion?
• How should hyperrealistic avatars be deployed in marketing campaigns?

Avatar contingency effects
Implications:
• Consumers' expectations that the avatar's anthropomorphic appearance reflects a comparable level of behavioral competence will be more pronounced when consumers' perceived uncertainty (e.g., product's functional performance, financial risk) is high.
• Behavioral realism should account for more weight in avatar design decisions than form realism, especially when customers' perceived uncertainty is high.
• When the avatar is designed with a low level of behavioral competence, companies should manage customers' expectations by giving the avatar a less realistic, less humanlike appearance (i.e., a simplistic avatar). Refraining from a design with high form-low behavioral realism (i.e., a superficial avatar) becomes even more important when customers' perceived uncertainty is high.
• When the exchange entails privacy concerns (e.g., mental health), an avatar characterized by low form-high behavioral realism (i.e., an intelligent unrealistic avatar) may be more effective than a high form-high behavioral realism avatar (i.e., a digital human avatar), because it reassures customers that they will not be judged and promotes more honest responses.
• Use of mobile devices (e.g., smartphones), compared with fixed devices (e.g., desktops), can enhance avatars' impact on consumers' affective and social experiences.
• Avatar designs should account for the consumer relationship phase. During the exploration phase, avatar behavioral realism should focus on providing the best cognitive experience; during the build-up phase, avatar design should be directed at enhancing consumers' social experience (e.g., rapport) to promote commitment; during the maturity phase, emphasizing the entertainment value of the avatar (e.g., funny, attractive appearance) may prove more effective in sustaining the established relationship.
Directions for future research:
• How might customer segmentation strategies (e.g., psychographics, benefits sought) inform effective avatar designs?
• What form-behavioral realism elements are most relevant and impactful, given a specific customer segment profile?
• How do avatar mediation mechanisms differ across customers in different segments?
• In which circumstances will avatars likely distract from, rather than contribute to, customers' experience?
• What contextual factors determine the relative effectiveness of avatars vs. other digital entities (e.g., anthropomorphized products, brand mascots) in the online shopping experience?
Figure 1: Typology of Avatar Design
Figure 2: Form Realism Versus Behavioral Realism Taxonomy

Low Form Realism, Low Behavioral Realism: Simplistic Avatar
• Not a very anthropomorphic appearance (e.g., 2D, static, cartoon image) and low intelligence (e.g., scripted, task-specific communication).
• The not very realistic appearance of the avatar lowers consumers' expectations of its behavioral competence.
• Can provide hassle-free convenience by completing quick and specific tasks (e.g., 24/7 travel information, online content exploration).
• Most effective for low-risk transactions (e.g., basic customer inquiries, inexpensive online purchases).
Examples: ING Netherlands' Inge, TwentyBN's Millie

High Form Realism, Low Behavioral Realism: Superficial Avatar
• Realistic anthropomorphic appearance (e.g., 3D, dynamic, realistic image) but low intelligence (e.g., scripted, non-customized solutions).
• Likely results in a negative disconfirmation for customers, because the realistic anthropomorphic appearance raises customers' expectations of the avatar's intelligence.
• Effective in improving the productivity of low-risk transactions (e.g., bank account information inquiries).
• Can produce detrimental effects on customer experience for high-risk transactions (e.g., stock purchases) due to lower intelligence.
Examples: Nordnet's Amelia, Natwest Bank's Cora

Low Form Realism, High Behavioral Realism: Intelligent Unrealistic Avatar
• Intelligent (e.g., cognitive and emotional intelligence) but lacks a realistic anthropomorphic appearance (e.g., cartoon image).
• Can produce customer delight because the nonrealistic appearance lowers customers' initial expectations of avatar intelligence.
• Capable of autonomous, natural verbal and nonverbal communication that can also include social content.
• Especially effective for complex, relational transactions involving sensitive personal information (e.g., mental health), by providing reassurance that a nonhuman agent will not judge the customer.
Examples: PTSD therapist Ellie, MIT's REA

High Form Realism, High Behavioral Realism: Digital Human Avatar
• Realistic anthropomorphic appearance (e.g., 3D, dynamic, realistic image) and intelligent (e.g., cognitive and emotional intelligence).
• Alignment of realistic appearance and intelligence provides the highest levels of customer experience.
• Autonomous, natural verbal and nonverbal communication, which also includes social content, allows for complex transactions that require highly personalized service (e.g., skincare).
• Most effective for long-term relational exchange by providing the highest levels of cognitive (e.g., informativeness), affective (e.g., entertainment), and social (e.g., rapport) customer experiences.
Examples: SK-II's YUMI, UBS's Daniel Kalt
Figure 3: An Integrated Framework of Avatar Performance
Web Appendix
Avatar Definitional Elements in Empirical Research

No. | Illustrative Research | Journal | Labels/Aliases | Definition | Definitional elements (Digital; Anthropomorphic appearance; Interactivity) | Controlling entity | Avatar by our definition?

Percentage of element: Digital 78%; Anthropomorphic appearance 70%; Interactivity 78%; Controlling entity 90%; Avatar by our definition 51%

1. Al-Natour et al. (2011) | Journal of the Association for Information Systems | Automated shopping assistant | "Naturalistic 2D avatars, which are humanoid in form, but have degraded levels of detail" (p. 357) | ✓ ✓ ✓ | Software | Yes
2. Anthony et al. (2020) | Journal of Medical Internet Research | Chatbot | "Software-driven, automated mobile phone messaging robots" (p. 3) | ✓ | Software | No
3. Araujo (2018) | Computers in Human Behavior | Chatbot/Disembodied conversational agent | "Chatbots may be considered social bots as they are designed to communicate with humans, and substitute for other humans by mimicking human to human communication" (p. 184) | ✓ | Software | No
4. Behrend and Thompson (2011) | Computers in Human Behavior | Animated pedagogical agent | "Animated pedagogical agents (APAs) are a specific class of agents that are represented as a human or animal body within the virtual environment, designed to facilitate learning" (p. 1201) | ✓ ✓ | Software | No
5. Benbasat et al. (2020) | International Journal of Human-Computer Studies | Anthropomorphic recommendation agent | "Anthropomorphic recommendation agents are decision aids designed to guide consumers in product selection in online stores with an anthropomorphic (human-like) interface" (p. 56) | ✓ ✓ ✓ | Software | Yes
6. Bente et al. (2008) | Human Communication Research | Avatar | "Artificial, computer-animated representations of human interlocutors within virtual environments" (p. 288) | ✓ ✓ ✓ | Human | Yes
7. Bernard et al. (2018) | Journal of Medical Internet Research | Avatar | "Visualized representations of users in virtual reality" (p. 2) | ✓ ✓ ✓ | Human | Yes
8. Bibault et al. (2019) | Journal of Medical Internet Research | Chatbot | Computer systems that imitate human conversation by using a field of artificial intelligence (AI) known as natural language processing (p. 2) | ✓ | Software | No
9. Bickmore et al. (2016) | Journal of Medical Internet Research | Embodied conversational agent | Computer-generated animated characters that demonstrate many of the same properties as humans in face-to-face conversation, including the ability to produce and respond to verbal and nonverbal communication (p. 2) | ✓ ✓ ✓ | Software | Yes
10. Bott et al. (2019) | Journal of Medical Internet Research | Embodied conversational agent | "Technological entities that can interact with people through spoken conversation" (p. 1) | ✓ ✓ | Software | No
11. Brave, Nass, and Hutchinson (2005) | International Journal of Human-Computer Studies | Embodied computer agents | Human-computer interface programmed with the capacity for emotional expression as human-like and autonomous interaction partners (p. 162) | ✓ ✓ ✓ | Software | Yes
12. Burgoon et al. (2000) | Computers in Human Behavior | Intelligent computer agents | "Intelligent computer agents—computer interfaces that come in a variety of guises and that present and process information according to a set of predefined algorithms. Agents may be designed to appear more anthropomorphic by fitting them with distinctly human-like (virtual) features such as voice recognition, synthesized voices, and computer animation that simulates human facial expressions and gestures" (p. 554) | ✓ ✓ ✓ | Software | Yes
13. Burgoon et al. (2016) | International Journal of Human-Computer Studies | Embodied agent | "Virtual—often anthropomorphic—representations of computer interfaces that a user can socially identify with and that can be used to improve the communication, comprehension, and performance of human computer interactions" (p. 24) | ✓ ✓ ✓ | Software | Yes
14. Carlotto and Jaques (2016) | International Journal of Human-Computer Studies | Animated pedagogical agent | "Animated on-screen characters that assist learners in multimedia learning environments" (p. 15) | ✓ ✓ | Software | No
15. Chattaraman et al. (2019) | Computers in Human Behavior | Digital assistant | "AI-based digital voice assistants recognize and understand voice-based user requests and communicate using natural language to accomplish a wide variety of tasks, such as reading the news, getting weather forecasts and sports information, and ordering products from online stores" (p. 351) | ✓ ✓ ✓ | Software | Yes
16. Chattaraman, Kwon, and Gilbert (2012) | Computers in Human Behavior | Virtual agent | "Representing animated embodiments that respond to users through verbal and non-verbal communication" (p. 2055) | ✓ ✓ ✓ | Software | Yes
17. Choi et al. (2015) | International Journal of Human-Computer Studies | Embodied social agent | "Embodied agents that expressed emotion" (p. 41) | ✓ ✓ | Software | No
18. Chung et al. (2020) | Journal of Business Research | Chatbot/Virtual service agent | "An example of a virtual conversational service robot that can provide human–computer interaction" (p. 588) | ✓ | Software | No
19. Derrick and Ligon (2014) | Computers in Human Behavior | Embodied conversational agent | "Computer-generated cartoon-like characters that demonstrate many of the same properties as humans in face-to-face conversation, including the ability to produce and respond to verbal and non-verbal communication" (p. 40) | ✓ ✓ ✓ | Software | Yes
20. Dincer and Doganay (2017) | Computers & Education | Pedagogical agent | "Modules which facilitate learning, guide learners, support motivation and give feedback in education software" (p. 75) | ✓ | Software | No
21. D'Mello, Graesser, and King (2010) | Human-Computer Interaction | AutoTutor | "A fully automated computer tutor that simulates a human tutor and holds conversations with students in natural language" (p. 293) | ✓ ✓ ✓ | Software | Yes
22. Dunsworth and Atkinson (2007) | Computers & Education | Animated pedagogical agent | "Characters on the computer screen with embodied life-like behaviors such as speech, emotions, locomotion, gestures, and movements of the head, the eye, or other parts of the body" (p. 678) | ✓ ✓ | Software | No
23. Easton et al. (2019) | Journal of Medical Internet Research | Virtual agent | Artificial intelligence-based systems that have the potential to communicate with the patient (p. 2) | ✓ ✓ | Human | No
24. Ferrand et al. (2020) | Journal of Medical Internet Research | Smart assistant/Voice assistant/Chatbot/Conversational agent | "Voice assistants, a form of chatbot or conversational agent often referred to colloquially as “smart assistants,” are devices that respond to human voices and can be commanded to do a variety of tasks" (p. 2) | ✓ | Software | No
25. Freeman and Beaver (2018) | International Journal of Human-Computer Studies | Intelligent virtual assistant | Intelligent Virtual Assistant is a multimodal automated dialogue system capable of live chat (p. 13) | ✓ ✓ ✓ | Software | Yes
26. Friederichs et al. (2014) | Journal of Medical Internet Research | Avatar | Visualized representations of a computer system that have human-like cues (pp. 2-3) | ✓ ✓ ✓ | Software | Yes
27. Fryer et al. (2017) | Computers in Human Behavior | Chatbot | "Chatbots are software avatars with limited, but growing capability for conversation with human beings" (p. 461) | ✓ | Software | No
28. Fryer, Nakao, and Thompson (2019) | Computers in Human Behavior | Chatbot | "Chatbots are programs developed to engage in conversations with humans" (p. 280) | ✓ | Software | No
29. Gammoh, Jimenez, and Wergin (2018) | International Journal of Electronic Commerce | Avatar | "general graphic representations that are personified by means of computer technology" (p. 325) | ✓ ✓ | Software | No
30. Gilbert and Forney (2015) | International Journal of Human-Computer Studies | Intelligent agent/Avatar | 3-dimensional digital representations controlled by a human or machine intelligence (p. 31) | ✓ ✓ ✓ | Software/Human | Yes
31. Go and Sundar (2019) | Computers in Human Behavior | Chatbot/Chat agent | Automated dialogue system with anthropomorphic cues (pp. 304-305) | ✓ ✓ ✓ | Software | Yes
32. Groom et al. (2009) | Computers & Education | Embodied agent | "Embodied agents are digital, visual representations of an interface, often taking a human form" (p. 842) | ✓ ✓ | Software | No
33. Grynszpan, Martin, and Nadelc (2008) | International Journal of Human-Computer Studies | Animated conversational agent | Animated conversational agents are virtual characters that communicate through modalities such as speech, facial expressions and gesture that are inspired from human communication (p. 630) | ✓ ✓ ✓ | Software | Yes
34. Guadagno, Swinth, and Blascovich (2011) | Computers in Human Behavior | Avatar | "A digital representation of another actual person in real time" (p. 2380) | ✓ ✓ ✓ | Human | Yes
35. Guo et al. (2015) | Computers & Education | Affective embodied agent | "An affective embodied agent is a life-like interface character that is capable of eliciting certain affective experiences from users through multiple modalities such as speech, facial expressions and body gestures" (p. 369) | ✓ ✓ | Software | No
36. Guo and Goh (2016) | Computers & Education | Affective embodied agent | "An embodied agent (EA) therefore refers to a life-like agent, i.e., one with a face and body, and communicates with users via speech, facial expressions and body gestures. Designed with the ability of emotional expression, affective EAs are becoming an increasingly popular technique to incorporate affective elements in computer programs" (p. 60) | ✓ | Software | No
37. Hill, Ford, and Farreras (2015) | Computers in Human Behavior | Chatbot | "Machine conversation systems that interact with human users via natural conversational language" (p. 246) | ✓ | Software | No
38. Ho, Hancock, and Miner (2018) | Journal of Communication | Chatbot | "Computer programs that can simulate human-human conversation" (p. 712) | ✓ | Human | No
39. Hoffmann and Kramer (2013) | International Journal of Human-Computer Studies | Virtual character | Virtual embodiments of artificial entities (p. 1) | ✓ ✓ | Software | No
40. Holzwarth, Janiszewski, and Neumann (2006) | Journal of Marketing | Avatar | "General graphic representations that are personified by means of computer technology" (p. 20) | ✓ ✓ ✓ | Software | Yes
41. Hubal et al. (2008) | Computers in Human Behavior | Embodied conversational agent | "virtual characters rendered on a monitor or screen with whom a user converses" (p. 1105) | ✓ ✓ ✓ | Software | Yes
42. Iovine, Narducci, and Semeraro (2020) | Decision Support Systems | Digital assistant | "Systems that can interact with users via natural language, assisting them in performing daily tasks, such as memorizing appointments, booking plane tickets or, more generally, finding information" (p. 1) | ✓ | Software | No
43. Jin (2009) | Journal of Interactive Marketing | Avatar | "Artificial, computer-animated representations of human interlocutors" or "pictorial representations of humans in a chat environment" (p. 234) | ✓ ✓ | Software | No
44. Jin (2010) | Computers in Human Behavior | Virtual agent | "A non-player character (NPC) in interactive media environments and other virtual environments" (p. 444) | ✓ ✓ ✓ | Software | Yes
45. Kang and Watt (2013) | Computers in Human Behavior | Avatar | "Digital models of people that either look or behave like the people they represent" (p. 1170) | ✓ ✓ ✓ | Human | Yes
46. Keeling, McGoldrick, and Beatty (2010) | Journal of Business Research | Avatar | "Interactive representations of sales assistants on retail websites, including all types, whether realistic or cartoon humans, animals, or objects" (p. 793) | ✓ ✓ ✓ | Software | Yes
47. Keng and Liu (2013) | Computers in Human Behavior | Avatar | "A quasi-human character whose role is to assist consumers in their shopping and, specifically, to be the chief source of product information, to strengthen consumers' trust in the website's information, and to enhance consumer-shopping experiences, thereby increasing the possibility of a final purchase" (p. 792) | ✓ ✓ | Software | No
48. Kim, Chen, and Zhang (2016) | Journal of Consumer Research | Anthropomorphized helper/Digital assistant | Entities that "are often imbued with humanlike features and characteristics" (p. 283) | ✓ ✓ | Software | No
49. Kim, Hong, and Cho (2007) | Information Processing and Management | Intelligent conversational agent/Avatar | "Representative intelligent agents that are capable of responding in an intelligent way (with natural language dialogue) to requests from users" (p. 225) | ✓ ✓ ✓ | Software | Yes
50. Kleinsmith et al. (2015) | Computers in Human Behavior | Virtual patient | "Virtual character that represents a patient interact with human by speech, gestures, text, and other behaviors" (p. 152) | ✓ ✓ ✓ | Software | Yes
51. Kocaballi et al. (2020) | Journal of Medical Internet Research | Conversational agent/Chatbot | "Systems that mimic human conversations using text or spoken language" (p. 1) | ✓ | Software | No
52. Köhler et al. (2011) | Journal of Marketing | Online agent | "Computer mediated personas that posses the capability to involve customers in rich interactive conversations, rather than discrete, basic exchanges, and that have the ability to apply past interaction content to current interactions" (p. 96) | ✓ | Software | No
53. Kotlyar and Ariely (2013) | Computers in Human Behavior | Avatar | Virtual character that represents a person communicating with verbal and non-verbal cues (p. 546) | ✓ ✓ ✓ | Human | Yes
54. Krämer et al. (2013) | International Journal of Human-Computer Studies | Embodied conversational agent | Computer systems with "the ability to recognize and respond to verbal and nonverbal input, to generate verbal and nonverbal output, to deal with conversational functions (e.g., turn-taking), and to recognize the state of the conversation" (p. 335) | ✓ ✓ ✓ | Software | Yes
55. Krämer et al. (2016) | Computers & Education | Virtual agent | Autonomous virtual figure with social cues that communicates via verbal and nonverbal means (p. 4) | ✓ ✓ ✓ | Software | Yes
56. Krämer et al. (2018) | International Journal of Human-Computer Studies | Virtual agent/Embodied conversational agent | "Computer generated anthropomorphic interface agents that employ humanlike behavior within a dyadic conversation with a human user" (p. 113) | ✓ ✓ ✓ | Software | Yes
57. Lee and Choi (2017) | International Journal of Human-Computer Studies | Conversational agent | Computer systems that "establish voice recognition as the mainstream mode of human–computer interaction (HCI)" and "operate like a personal secretary or a concierge, assisting users with a wide range of functions, such as making search queries, making phone calls, scheduling tasks, reading books, and others" (p. 1) | ✓ ✓ ✓ | Human | Yes
58. Li et al. (2015) | Computers in Human Behavior | Embodied pedagogical agents | "Virtual agents – animated characters that are rendered using computer graphics software – have been used to deliver educational content as part of a vision that a virtual agent can advantageously influence learning" (p. 3) | ✓ ✓ | Software | No
59. Lin et al. (2013) | Computers & Education | Animated pedagogical agent | "An animated pedagogical agent is a lifelike character that provides instructional information through verbal and nonverbal forms of communication" (p. 239) | ✓ ✓ | Software | No
60. Lin et al. (2020) | Computers & Education | Animated pedagogical agent | "A virtual agent is a lifelike character that has some or all of the following features: (a) a human-like look, (b) locomotion, (c) goal-directed gestures, (d) facial expression, (e) gaze, and (f) a human voice." "An animated pedagogical agent is a type of virtual agent that is embedded in a computer-based learning environment to deliver instruction through verbal and nonverbal forms of communication" (p. 2) | ✓ ✓ | Software | No
61. Lin, Doong, and Eisingerich (2020) | Journal of Service Research | Avatar | "graphic representation and personification of the VS [virtual salesperson] by means of computer technology" (p. 2) | ✓ ✓ ✓ | Software | Yes
62. Lunardo et al. (2016) | Journal of Retailing and Consumer Services | Online virtual agent | "Online virtual agents (OVAs) – computer-generated characters designed to interact with users by simulating human appearance and behaviors through artificial intelligence" (p. 1) | ✓ ✓ ✓ | Software | Yes
63. Luo, Tong, Fang, and Qu (2019) | Marketing Science | Chatbot | "Computer programs that simulate human conversations through voice commands or text chats and serve as virtual assistants to users" (p. 937) | ✓ | Software | No
64. Mimoun and Poncin (2015) | Journal of Retailing and Consumer Services | Embodied conversational agent | "Graphic character designed on computer, possessing the capacity to dialogue with a user, by using not only the speech but the other nonverbal capacities such as the gesture, the glance, the intonation and the physical posture" (p. 71) | ✓ ✓ ✓ | Software | Yes
65. Miner et al. (2016) | JAMA Internal Medicine | Conversational agent | "Conversational agents, such as Siri (Apple), GoogleNow, SVoice (Samsung), and Cortana (Microsoft), are smartphone-based computer programs designed to respond to users in natural language, thereby mimicking conversations between people" (p. 620) | ✓ | Software | No
66. Morris et al. (2018) | Journal of Medical Internet Research | Conversational agent | "Conversational agents are software applications that respond to users with natural language, often with the goal of helping a user complete a task" (p. 1) | ✓ | Software | No
67. Nunamaker et al. (2011) | Journal of Management Information Systems | Embodied conversational agent | "Virtual, three-dimensional human likenesses that are displayed on computer screens" and "interact with people through natural speech" (p. 21) | ✓ ✓ ✓ | Software | Yes
68. Ozogul et al. (2013) | Computers & Education | Animated pedagogical agent | "Animated pedagogical agents (APAs) are humanlike or cartoon animated characters which are displayed within a computer-based learning environment to provide learners with pedagogical assistance" (p. 36) | ✓ ✓ | Software | No
69. Palanica et al. (2019) | Journal of Medical Internet Research | Chatbot | "Chatbots, also known as conversational agents, interactive agents, virtual agents, virtual humans, or virtual assistants, are artificial intelligence programs designed to simulate human conversation via text or speech" (p. 2) | ✓ | Software | No
70. Pot et al. (2017) | Journal of Medical Internet Research | Virtual assistant | Visualized representations of a computer system that have human-like cues (p. 2) | ✓ ✓ ✓ | Software | Yes
71. Potdevin, Saboureta, and Clavel (2020) | International Journal of Human-Computer Studies | Embodied conversational agent | "ECAs are humanoid and intelligent systems able to converse with human beings by understanding and answering in natural language" (p. 1) | ✓ ✓ ✓ | Software | Yes
72. Pütten et al. (2010) | Computers in Human Behavior | Embodied conversational agent | "A perceptible digital representation whose behaviors reflect a computational algorithm designed to accomplish a specific goal or set of goals" (p. 1641) | ✓ ✓ ✓ | Software | Yes
73. Puzakova, Rocereto, and Kwak (2013) | International Journal of Advertising | Anthropomorphized recommendation agent | A computer technology that "can filter and condense a variety of options, thereby assisting them (consumers) in product choice decisions" (p. 514) | ✓ ✓ | Software | No
74. Qiu and Benbasat (2009) | Journal of Management Information Systems | Anthropomorphic interface agent | Adding humanoid embodiment and human voice to software-based product recommendation system (p. 145) | ✓ ✓ ✓ | Software | Yes
75. Qiu and Benbasat (2010) | International Journal of Human-Computer Studies | Product recommendation agent | "A Product Recommendation Agent (PRA) is a type of decision support system that helps to alleviate consumers' cognitive load by gathering, screening, and evaluating vast amount of product information available on the web" (p. 1) | ✓ ✓ ✓ | Software | Yes
76. Rese et al. (2020) | Journal of Retailing and Consumer Services | Chatbot | "Any software application that engages in a dialog with a human using natural language" (p. 2) | ✓ | Software | No
77. Riedl et al. (2014) | Journal of Management Information Systems | Avatar | "Avatar as a label for digital representations of humans in online or virtual environments" (p. 86) | ✓ ✓ | Software | No
78. Rosenthal-von der Püetten et al. (2019) | Computers in Human Behavior | Virtual agent | The digital representation of computer systems with verbal and non-verbal cues (p. 398) | ✓ ✓ ✓ | Software | Yes
79. Schuetzler et al. (2018) | Decision Support Systems | Conversational agent | "Systems that mimic human-to-human communication using natural language processing, machine learning, and/or artificial intelligence" (p. 94) | ✓ ✓ ✓ | Software | Yes
80. Schuetzler et al. (2019) | Computers in Human Behavior | Conversational agent | "Computer systems designed to communicate in natural language with humans, as opposed to using predefined computer commands" (p. 250) | ✓ | Software | No
81. Sheehan, Jin, and Gottlieb (2020) | Journal of Business Research | Customer service chatbot | "Computer programs with natural language capabilities, which can be configured to converse with human users" (p. 14) | ✓ | Software | No
82. Shiban et al. (2015) | Computers in Human Behavior | Pedagogical agent | Computer system characterized by "the appearance of the agents, advanced communication features like gestures and emotional expression, and the dialogue itself including motivational messages" (p. 5) | ✓ ✓ ✓ | Software | Yes
83. Shin, Kim, and Biocca (2019) | Computers in Human Behavior | Avatar | "An image that represents the self in the virtual world, ranging from very simple drawings to quite detailed three-dimensional renderings of characters" (p. 101) | ✓ ✓ | Human | No
84. Shorey et al. (2019) | Journal of Medical Internet Research | Avatar | Visualized 3D representations of users "characterized by natural, nonverbal gestures to elicit more engaging connections as well as to have more life-like realism," which is used for "mimicking human conversations (both verbal and nonverbal)" (p. 4) | ✓ ✓ ✓ | Software | Yes
85. Sin and Munteanu (2020) | Human-Computer Interaction | Intelligent virtual agent | "Interactive digital characters that exhibit human-like qualities and can communicate with humans and each other using natural human modalities like facial expressions, speech and gestures" (p. 483) | ✓ ✓ ✓ | Software | Yes
86. Sivaramakrishnan, Wan, and Tang (2007) | Journal of Interactive Marketing | Anthropomorphic information agent | "A humanlike chatbot that acts as an interactive online information provider" (p. 60) | ✓ ✓ | Software | No
87. Strassmann et al. (2020) | Journal of Medical Internet Research | Virtual agent | Virtual agents are assistive technologies capable of communicating in a human-like way (p. 2) | ✓ ✓ ✓ | Software | Yes
88. Tanana et al. (2019) | Journal of Medical Internet Research | Conversational agent | "Conversational agents are computer programs that are intended to interact with a real person using language" (p. 2) | ✓ | Software | No
89. Touré-Tillery and McGill (2015) | Journal of Marketing | Anthropomorphized agent (partial human) | "Nonhuman entities that deliver message content across a variety of media (e.g., print, online, television), are typically imbued with various combinations of human characteristics, such as human form (e.g., human-like faces, arms, and legs), and the apparent ability to speak and think" (p. 94) | ✓ | Software | No
90. Van den Broeck, Zarouali, and Poels (2019) | Computers in Human Behavior | Chatbot | "A computer program, which simulates human language with the aid of a text-based dialogue system" (p. 150) | ✓ | Software | No
91. van Vugt et al. (2006) | International Journal of Human-Computer Studies | Interface character | "Anthropomorphized communication partners, or interface characters, feature in educational software, the Internet, games, and standard desktop applications" (p. 875) | ✓ ✓ | Software | No
92. van Vugt et al. (2009) | International Journal of Human-Computer Studies | Embodied agent | Humanlike representation of computer systems (p. 571) | ✓ ✓ ✓ | Software | Yes
93. Veletsianos (2010) | Computers & Education | Pedagogical agent | "Pedagogical agents are static or animated anthropomorphic interfaces employed in electronic learning environments to serve various instructional goals" (p. 577) | ✓ ✓ | Software | No
94. Verhagen et al. (2014) | Journal of Computer-Mediated Communication | Virtual customer service agent | "Computer-generated characters that are able to interact with customers and simulate behavior of human company representatives through artificial intelligence" (p. 530) | ✓ ✓ ✓ | Software | Yes
95. Wang et al. (2016) | Decision Support Systems | Recommendation agent | "Recommendation agents (RAs) are software-based decision aids designed to guide online consumers in product selection" (p. 48) | ✓ ✓ ✓ | Software | Yes
96. Wang et al. (2007) | Journal of Marketing | Virtual character | "Avatars are lifelike characters created by technology," with some type of combination of 4 online social cues: "language, human voice, interactivity, and social role" (pp. 143-144) | ✓ ✓ ✓ | Software | Yes
97. Yokotani, Takagi, and Wakashima (2018) | Computers in Human Behavior | Virtual Agent | Virtual representation of the computer systems (p. 136) | ✓ ✓ ✓ | Software | Yes
98. Zand et al. (2020) | Journal of Medical Internet Research | Chatbot | "A chatbot, or chatterbot, attempts to simulate a natural conversation with a human user" (p. 2) | ✓ | Software | No