

Chatbots and Health: Mental Health
YANG CHENG
CHENXING XIE
North Carolina State University, USA

YANDING WANG
China Medical University, China

HUA JIANG
Syracuse University, USA

History of AI-powered chatbots

Chatbots are digital tools in the form of hardware or software that "mimic humanlike behaviors, provide a task-oriented framework with evolving dialogue able to participate in conversation" (Vaidyam et al., 2019, p. 457). In 1950, Turing posed a famous question, "Can machines think?," prophesying a future in which people communicate with machines as if they were humans (Turing, 1950, p. 433). Turing (1950) also envisioned that machines might "eventually compete with men in all purely intellectual fields" (p. 460).
In 1966, Weizenbaum designed the first chatbot in the world – ELIZA, which performed pattern matching and applied decomposition rules to phrase responses, allowing it to communicate with people in natural language (Shum, He, & Li, 2018). A minimal sketch of this rule-based approach appears below.
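The following sketch, in Python, illustrates this kind of rule-based matching; the decomposition patterns and reassembly templates are invented for illustration and are not Weizenbaum's original script.

import re

# A minimal sketch of ELIZA-style pattern matching. Each rule pairs a
# decomposition pattern with a reassembly template; the rules below are
# hypothetical examples, not Weizenbaum's original script.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
]

def respond(utterance: str) -> str:
    """Return the reassembly for the first matching rule, or a fallback."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "Please go on."

print(respond("I feel anxious about exams"))  # Why do you feel anxious about exams?

The original program also transformed pronouns (e.g., "my" to "your") before reassembly, a step omitted here for brevity.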
With the development of artificial intelligence (AI) and its application in natural language processing, Siri was released by Apple in 2011, followed by Cortana from Microsoft, Google Assistant, and Alexa from Amazon (Shum et al., 2018). These chatbots are often embedded in mobile devices and accomplish myriad tasks through conversations with users (Shum et al., 2018).
The evolution of computing power has made the application of artificial intelligence feasible in healthcare. Chatbots can work as automated conversational agents to improve communication between clinicians and patients, assist with health education, and expand access to health services (Palanica et al., 2019; Vaidyam et al., 2019).

Chatbots and mental health

In the field of mental health, shortages in the clinical workforce have impeded the delivery of high-quality healthcare, and the demand for innovative technological solutions has grown steadily (Vaidyam et al., 2019).

The International Encyclopedia of Health Communication.
Evelyn Y. Ho, Carma L. Bylund, and Julia C. M. van Weert (Editors-in-Chief),
Iccha Basnyat, Nadine Bol, and Marleah Dean (Associate Editors).
© 2023 John Wiley & Sons, Inc. Published 2023 by John Wiley & Sons, Inc.
DOI: 10.1002/9781119678816.iehc0725

Table 1 Categories of mental health chatbots.

Chatbot           Category                  Modality  Platform
3D WorldBuilder   Mental illness specific   3D        Computer application
Laura             Mental illness specific   3D        Computer application
SimSensei         Mental illness specific   3D        Computer application
Joy               Mental illness specific   Text      Messaging app embedded
Tess              Mental illness specific   Text      Messaging app embedded
Shim              Mental illness specific   Text      Mobile app
Woebot            Mental illness specific   Text      Mobile app
Wysa              Mental illness specific   Text      Mobile app
Youper            Mental illness specific   Text      Mobile app
Bots4Health       Health in general         Text      Messaging app embedded
Florence          Health in general         Text      Messaging app embedded
Super Izzy        Health in general         Text      Messaging app embedded
Ada               Health in general         Text      Mobile app
Babylon Health    Health in general         Text      Mobile app
GYANT             Health in general         Text      Mobile app
Healthily         Health in general         Text      Mobile app
HealthTap         Health in general         Text      Mobile app
Mediktor          Health in general         Text      Mobile app, web
Sensely           Health in general         Text      Mobile app, web
Buoy Health       Health in general         Text      Web

With rapidly evolving artificial intelligence technology, various chatbots have been developed to screen, diagnose, and treat mental health issues. These chatbots are not only a supplement to the clinical workforce, but they are also more acceptable to patients who may be unwilling to visit clinicians because of stigma (Vaidyam et al., 2019).
Based on lists of mental health chatbots collected by Palanica and colleagues (2019) and Vaidyam and colleagues (2019), current chatbots addressing mental illness can be classified into two categories: one specifically focuses on mental health disorders, while the other deals with general health issues including mental illness (see Table 1). The modalities (Vaidyam et al., 2019) and platforms of these chatbots, also listed in Table 1, have evolved continually. Modality has shifted from plain text to embodied 3D avatars, and mental health chatbots now appear on various platforms, including mobile apps, computer applications, the web, and embedded messaging apps. For instance, Woebot, an AI-based chatbot for mental illness, was developed to monitor and mitigate college students' depression through automated conversations (Fitzpatrick, Darcy, & Vierhile, 2017).
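To make this classification concrete, the following Python sketch encodes a few Table 1 entries as simple records; the class and field names are our own invention, not part of any published schema.

from dataclasses import dataclass

# Illustrative records for a few Table 1 entries; the class and field
# names are our own and not part of any published schema.
@dataclass(frozen=True)
class HealthChatbot:
    name: str
    category: str   # "mental illness specific" or "health in general"
    modality: str   # "text" or "3D"
    platform: str   # e.g., "mobile app", "computer application", "web"

CHATBOTS = [
    HealthChatbot("Woebot", "mental illness specific", "text", "mobile app"),
    HealthChatbot("SimSensei", "mental illness specific", "3D", "computer application"),
    HealthChatbot("Ada", "health in general", "text", "mobile app"),
]

# Select the chatbots that focus specifically on mental illness.
specific = [c.name for c in CHATBOTS if c.category == "mental illness specific"]
print(specific)  # ['Woebot', 'SimSensei']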

Functions of chatbots for mental health

With the growing number of chatbots employed in health communication, scholars have conducted research on the efficacy of these tools. Compared with human mental health clinicians, chatbots can provide immediate and personalized conversations that save users' money and time (Cheng & Jiang, 2020a; Fitzpatrick et al., 2017; Inkster, Sarda, & Subramanian, 2018). Drawing on uses and gratifications theory, Cheng and Jiang (2020a) found that users' social, utilitarian, technology, and hedonic gratifications could affect their engagement with mental health chatbots. Based on these four categories of gratifications (Cheng & Jiang, 2020a), we categorize the functions of mental health chatbots into four dimensions: technological convenience, information, social companionship, and emotional support (see Figure 1).

Figure 1 Functions of chatbots for mental health.
Technological convenience. Technology gratifications refer to users' motivation for using a technology because it is easy to access and can provide immediate conversation (Cheng & Jiang, 2020a). Chatbots are capable of providing timely, interactive support to patients with mental health issues (Fitzpatrick et al., 2017). In their study of people's use of chatbots to mitigate mental health issues after mass-shooting disasters, Cheng and Jiang (2020a) found that chatbots hold a unique media appeal rooted in this technological convenience, which can significantly improve users' satisfaction.
Information. Utilitarian gratifications are achieved when the technology provides useful information for users (Cheng & Jiang, 2020a). Chatbots can supply people with necessary information and intelligent solutions to resolve mental health issues during crises. On the one hand, chatbots fulfill users' requests for information via instant messages, answering users' questions during crises; through interactive conversation, they encourage users to participate in communication (Ho, Hancock, & Miner, 2018). On the other hand, chatbots collect and evaluate users' information via constant monitoring. They can proactively provide customized feedback on individuals' health states based on personal information such as feelings and reported gender, age, education, and medical history. Chatbots can thus assess people's mental health conditions based on behavioral and mental health information, and help alleviate complications of anxiety and depression after mass-shooting disasters (Cheng & Jiang, 2020a).
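As a toy illustration of this monitoring-and-feedback process, the sketch below maps self-reported answers to canned responses. The questions, thresholds, and messages are hypothetical placeholders, not a validated clinical instrument or medical advice.

# Toy illustration of rule-based feedback from self-reported data.
# Questions, thresholds, and messages are hypothetical placeholders,
# not a validated clinical instrument or medical advice.
QUESTIONS = [
    "Over the last two weeks, how often have you felt down? (0-3)",
    "How often have you had little interest in your activities? (0-3)",
]

def feedback(answers: list[int]) -> str:
    """Map a total self-report score to a canned feedback message."""
    total = sum(answers)
    if total >= 4:
        return "Your answers suggest it may help to talk with a clinician."
    if total >= 2:
        return "Here are some self-care resources you might find useful."
    return "Thanks for checking in. I will follow up with you tomorrow."

print(feedback([2, 3]))  # escalates to a clinician suggestion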
Emotional support. Hedonic gratifications are achieved by providing users with pleasure and emotional support (Cheng & Jiang, 2020a). By offering compassionate dialogue, mental health chatbots can effectively reduce users' anxiety and depression (Fitzpatrick et al., 2017). At the same time, chatbots can make personalized adjustments during a conversation to encourage users to manage their emotions (Inkster et al., 2018). Zhou and colleagues (2020) explored how XiaoIce, an AI-powered empathetic social chatbot, builds emotional connections with users by identifying a user's emotional status, understanding the user's needs, and providing personalized responses.
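A deliberately simplified sketch of this identify-then-respond loop is given below. Production systems such as XiaoIce rely on learned models rather than keyword rules; the keywords and replies here are invented for illustration.

# Simplified sketch of the detect-emotion -> respond loop described above.
# Real systems such as XiaoIce use learned models, not keyword rules;
# these keywords and replies are invented for illustration.
EMOTION_KEYWORDS = {
    "anxious": ("worried", "nervous", "anxious"),
    "sad": ("sad", "down", "hopeless"),
}

EMPATHETIC_REPLIES = {
    "anxious": "That sounds stressful. What is weighing on you most right now?",
    "sad": "I am sorry you are feeling low. Do you want to talk about it?",
    "neutral": "I am here and listening. Tell me more.",
}

def detect_emotion(message: str) -> str:
    """Return the first emotion whose keywords appear in the message."""
    text = message.lower()
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if any(word in text for word in keywords):
            return emotion
    return "neutral"

def reply(message: str) -> str:
    return EMPATHETIC_REPLIES[detect_emotion(message)]

print(reply("I've been so worried about work lately"))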
Social companionship. Social gratifications include two dimensions: social interaction and social presence (Cheng & Jiang, 2020a). Mental health chatbots can provide social companionship for users; through long-term companionship, chatbots with advanced functions can deliver sustained social support (Zhou et al., 2020). By monitoring users' behaviors, mental health chatbots can also give users extended access to therapeutic help and reminders. By providing companionship and emotional support, mental health chatbots encourage patients to actively identify their emotions (e.g., anxiety), offer patients personalized support and assistance, and help users participate in social conversations (Zhou et al., 2020).

Challenges and the future of chatbots for mental health

To assess the efficacy and potential of chatbots for mental health, researchers have conducted studies analyzing the usability of mental health chatbots in real-world scenarios.
On the one hand, many studies show that users are satisfied with mental health chatbots and consider these tools useful for individuals who would not otherwise seek healthcare because of time, cost, or stigma (Cheng & Jiang, 2020a). On the other hand, Vaidyam and colleagues (2019) noted that some users may "grow excessively attached due to a distorted or parasocial relationship perhaps stemming from a patient's psychiatric illness itself" (p. 460). Such attachment is a potential risk of mental health chatbots.
Ethical concerns such as confidentiality and privacy have also been found to be key factors shaping the future of chatbots. According to Cheng and Jiang (2020b), perceived privacy risk is associated with chatbot use and can reduce users' satisfaction, intention to continue use, and loyalty toward a chatbot's services. Future research should emphasize ethical issues such as the protection of users' information, data privacy, and the assignment of responsibility for accidents arising from chatbot interactions.
Providing empathic conversation is another expectation for mental health chatbots. More studies should focus on how to enable mental health chatbots to listen to users empathically and provide empathic responses (Inkster et al., 2018). For instance, using human-like filler language in responses can make users feel more emotionally engaged (Vaidyam et al., 2019).
Apart from ethical concerns, the development and application of AI also have implications for the future of mental health chatbots. With the aid of AI, mental health chatbots can now accomplish far more tasks than their predecessors. As AI technologies evolve at a rapid rate, researchers should pay attention to integrating cutting-edge AI techniques into the design and construction of mental health chatbots so that they can accomplish more complex tasks, assist with clinical procedures for mental health issues, and provide users with varied interactions combining text and pictures in different scenarios. In terms of chatbot features, for instance, future research could focus on the safety, effectiveness, and privacy protection involved in chatbots for mental health. If more reliable evidence can be collected on the application of chatbots in mental health, users' trust and social acceptance may be further enhanced.
As a new research hotspot, chatbots for mental health in crises have increasingly become an important interdisciplinary issue spanning communication, medicine, psychology, computer science, and other fields. Current research mainly explores the attitudes of consumers or patients toward chatbots, but there is a lack of evidence on the perspectives of governments and institutions. A long-term goal is to explore chatbots for knowledge sharing and information monitoring in times of crisis, leverage big data to diagnose potential psychological problems, and enhance people's engagement with mental health.

SEE ALSO: Chatbots and Health: General; Digital Media Use, Impact on Well-being;
Human–Computer Interaction; New Communication Technologies.

References

Cheng, Y., & Jiang, H. (2020a). AI-powered mental health chatbots: Examining users' motivations, active communicative action, and engagement after mass-shooting disasters. Journal of Contingencies and Crisis Management, 28, 339–354. https://doi.org/10.1111/1468-5973.12319

Cheng, Y., & Jiang, H. (2020b). How do AI-driven chatbots impact user experience? Examining gratifications, perceived privacy risk, satisfaction, loyalty, and continued use. Journal of Broadcasting & Electronic Media, 65(4), 592–614. https://doi.org/10.1080/08838151.2020.1834296

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785

Ho, A., Hancock, J., & Miner, A. S. (2018). Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. Journal of Communication, 68(4), 712–733. https://doi.org/10.1093/joc/jqy026

Inkster, B., Sarda, S., & Subramanian, V. (2018). An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: Real-world data evaluation mixed-methods study. JMIR mHealth and uHealth, 6(11), e12106. https://doi.org/10.2196/12106

Palanica, A., Flaschner, P., Thommandram, A., Li, M., & Fossat, Y. (2019). Physicians' perceptions of chatbots in health care: Cross-sectional web-based survey. Journal of Medical Internet Research, 21(4), e12887. https://doi.org/10.2196/12887

Shum, H. Y., He, X. D., & Li, D. (2018). From Eliza to XiaoIce: Challenges and opportunities with social chatbots. Frontiers of Information Technology & Electronic Engineering, 19(1), 10–26. https://doi.org/10.1631/FITEE.1700826

Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460. https://doi.org/10.1093/mind/LIX.236.433

Vaidyam, A. N., Wisniewski, H., Halamka, J. D., Kashavan, M. S., & Torous, J. B. (2019). Chatbots and conversational agents in mental health: A review of the psychiatric landscape. The Canadian Journal of Psychiatry, 64(7), 456–464. https://doi.org/10.1177/0706743719828977
Zhou, L., Gao, J., Li, D., & Shum, H. Y. (2020). The design and implementation of XiaoIce, an empathetic social chatbot. Computational Linguistics, 46(1), 53–93. https://doi.org/10.1162/coli_a_00368
