Investigating Student Acceptance of an Academic Advising Chatbot
Abstract
The study explores factors affecting university students’ behavioural intentions in
adopting an academic advising chatbot. The study focuses on functional, socio-
emotional, and relational factors affecting students’ acceptance of an AI-driven aca-
demic advising chatbot. The research is based on a conceptual model derived from
several constructs of traditional technology acceptance models, TAM, UTAUT, the
latest AI-driven self-service technologies models, the Service Robot Acceptance
(sRAM) model, and the intrinsic motivation Self Determination Theory (SDT)
model. The proposed conceptual model has been tailored to an educational con-
text. A questionnaire Survey of Non-purposive sampling technique was applied to
collect data points from 207 university students from two major universities in the
UAE. Subsequently, PLS-SEM causal modelling was applied for hypothesis testing.
The results revealed that the functional elements, perceived ease of use and social
influence significantly affect behavioural intention for chatbots’ acceptance. How-
ever, perceived usefulness, autonomy, and trust did not show significant evidence
of influence on the acceptance of an advising chatbot. The study reviews chatbot
literature and presents recommendations for educational institutions to implement
AI-driven chatbots effectively for academic advising. It is one of the first studies
that assesses and examines factors that impact the willingness of higher education
students to accept AI-driven academic advising chatbots. This study presents sev-
eral theoretical contributions and practical implications for successful deployment
of service-oriented chatbots for academic advising in the educational sector.
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
Education and Information Technologies
1 Introduction
The academic advising process plays a key role in the student’s scholastic achieve-
ment, as universities offer a wide range of courses, majors, and academic opportu-
nities. Essentially, the quality of advising determines a student’s academic success
(Assiri, Al-Ghamdi & Brdesee, 2020). Academic advising has become an indispensable
part of Higher Education Institutions (HEIs), supporting students' academic
achievement and contributing towards institutional goals of maximizing student
retention and persistence (Campbell et al., 2008; Drake, 2011; Fricker, 2015), thereby
leading to overall academic excellence.
As part of the advising process, the advisor plays a multifaceted role to provide
the direction and assistance needed by students at vital stages of their academic ten-
ure (Iatrellis, Kameas & Fitsilis, 2017; Chan et al., 2019). An advisor’s role encom-
passes several functions some of which include developing personalized study plans
to fit students’ specific needs and directing students to available resources to manage
their academic standing (Campbell et al., 2008). In addition, advisors also support
students’ inquiries on institutional and academic policies and procedures, academic
progress, activities, and more. With such overwhelming responsibilities, advisors
often fail to meet student and institutional expectations of high-quality interaction
and support (Iatrellis, Kameas & Fitsilis, 2017).
In order to assist the complex and time-consuming functions of academic advis-
ing, machine learning recommendation engines, and rule-based expert systems are
increasingly being proposed as technology-based alternatives (Iatrellis, Kameas
& Fitsilis, 2017). Essentially, the goal of such systems is to facilitate prescriptive
advising by assisting students in selecting appropriate courses, classes, or majors
with minimal direct interaction (Noaman & Ahmed, 2015). However, such systems
tackle only one aspect of advising and fail to provide channels for interaction with
advisors, which is much needed by students to integrate with their environments,
feel connected, and achieve higher levels of satisfaction essential for student suc-
cess (Crookston, 1994; Campbell et al., 2008). A key component of developmen-
tal advising is the immediate engagement and exchange of personalized interaction,
which allows students to contact their advisor at any time (Noaman & Ahmed, 2015).
Advisor-student interaction has been established as a significant factor of student
success (Young-Jones et al., 2013), irrespective of the interaction modality (Junco et
al., 2016).
In this digital era, students are constantly in need of information in order to keep
up with their daily tasks and progress. The expectations of interaction and engage-
ment are being reshaped as today’s digital natives are continuously connected and
prefer immediate support for their academic queries (Robbins, 2020). On the other
hand, it has also become increasingly vital to involve all students in the advising
process to create an advising environment that provides inclusivity to all (Hu, 2020)
and not just those who do not actively seek help. To this end, technology is seen as a
positive channel for active advising and continuous engagement (Junco et al., 2016).
Moreover, during the COVID-19 era, it is expected that students have become accus-
tomed to utilizing technology for their learning and support needs (Liu & Ammigan,
2022).
2 Literature review
Chatbot capabilities for supporting academic advising needs have also been high-
lighted in extant literature (Ho et al., 2018; Bilquise & Shaalan, 2022; Kuhail et
al., 2022). An AI-based conversational agent is effective and efficient in advising
interactions as it overcomes human limitations by being available 24/7 to respond to
users’ queries (Bilquise & Shaalan, 2022). Moreover, it is anticipated to reduce the
advisors’ workload, freeing them for higher-quality interactions with their advisees
(Bilquise & Shaalan, 2022). An AI-driven chatbot powered by natural language
processing (NLP) technologies can possess anthropomorphic, emotionally intelligent
traits (Bilquise, Ibrahim & Shaalan, 2022b) and therefore emulate human-like
conversations with the advisees (Kuhail et al., 2022). Moreover, an advising chatbot is more likely to provide
inclusivity to all students and also support students’ queries in their native language
(Bilquise, Ibrahim & Shaalan, 2022a). An AI-driven chatbot also has the potential to
intervene at an early stage and support students at risk by using predictive technolo-
gies (Lim, Ho & Chai, 2021).
Perceived Ease of Use (PEU). According to Davis et al. (1989), perceived ease of use
is the extent to which a particular system may be used with minimal effort. Hamidi
and Chavoshi (2018) claim that new technologies are typically developed for ease of
use. They also ascertain that perceived ease of use is an antecedent of behavioural
intention to use technology. Specifically, Dwivedi et al. (2019) in Brachten et al.
(2021) recommend enhancing the ease of use of a system to boost the usage intention.
Pillai and Sivathanu (2020) showed that PEU influences the adoption intention of
AI-powered chatbots for travel planning, which aligns with other tourism technology
adoption studies. Furthermore, the study by Ragheb et al. (2022) revealed significant
effects of perceived ease of use on students’ behavioural intention to accept chatbot
technology for learning in a higher education institution in Egypt. In addition, Fer-
nandes and Oliveira (2021) showed that easy-to-use digital voice assistants promote
users’ acceptance of automated services.
In the context of academic advising chatbots, ease of use relates to chatbot
features, such as easy-to-use settings and a simple interface, that support seamless
interaction and spare users additional cognitive effort and time in accomplishing
a task (Ashfaq et al., 2020). Hence, if students
can complete their tasks efficiently, they are more likely to favor technology. On the
other hand, complicated steps and conversations would lead to frustration and aban-
donment of the technology (Araujo, 2018; Go & Sundar, 2019) if the effort required
to converse with the chatbot is more than reaching out to a human advisor. Thus,
we believe that students are more likely to use the chatbot if they perceive it to be
easy and effortless to accomplish their advising-related tasks. Therefore, we posit the
hypothesis:
H1. Perceived ease of use positively impacts the behavioural intention to adopt
the advising chatbot.
Perceived Usefulness (PU). The perceived usefulness construct is the degree to which
users believe adopting technology will contribute to their performance (Venkatesh,
James & Xu, 2012). It is primarily related to the system’s performance, quality, and
effectiveness (Davis, 1989). Venkatesh et al. (2012) deemed this construct the most
significant when deciding whether or not to adopt a technology. In addition, they
ascertain this construct as a strong predictor of behavioural intention. Several stud-
ies on the adoption of AI-driven chatbots in domains such as tourism (Pillai & Siv-
athanu, 2020) and customer service (Fernandes & Oliveira, 2021; Aslam et al., 2022)
have shown that perceived usefulness has a direct and positive impact on the inten-
tion to use service-oriented chatbots.
In the context of using chatbots for academic advising, the term “performance”
should concern some benefits of chatbots, such as problem-solving and time sav-
ings through real-time information (Ashfaq et al., 2020). Chocarro et al. (2021)
have shown that teachers’ intentions to use technology are positively impacted by
the perceived usefulness of the chatbot. Furthermore, performance expectation has
also been shown to motivate learners to use and accept new technology (Almaiah,
Alamri & Al-Rahmi, 2019). Therefore, we believe that students are more likely to use
a chatbot for advising if they perceive it to be useful in the advising process. Hence
the perceived usefulness of an advising chatbot is a critical factor in determining its
acceptance. We posit the hypothesis:
H2. Perceived usefulness positively impacts the behavioural intention to adopt the
advising chatbot.
Social Influence (SI). Social influence is the degree to which an individual seeks the
approval of their social circle when accepting a technology (Aslam et al., 2022). It
essentially suggests that users’ behavioural intentions are guided by their peers and social groups
(Sawang, Sun & Salim, 2014) as individuals often tend to comply with others, espe-
cially for short-term decisions such as technology acceptance (Brachten, Kissmer &
Stieglitz, 2021). The underlying rationale behind this construct is that even if people
do not intend to adopt a system, their belief that the significant people in their life
think they should is enough to persuade a change in behaviour (Lorenz & Buhtz,
2017).
Brachten et al. (2021) showed that peer influence has a more substantial impact
than managerial influence in adopting a chatbot within an organizational context.
Consistent with these findings, Fernandes and Oliveira (2021) reported that custom-
ers are influenced by other consumers who believe in the benefits of a service-ori-
ented chatbot. Gursoy et al. (2019) further showed that a strong social influence leads
to the willingness to use AI technology as consumers are persuaded by their social
groups’ belief in the usefulness and ease of the system.
In an educational context, students’ decisions are often influenced by society, and
students are likely to adopt the behaviours of their peers, teachers, friends, and family (Martin
et al., 2002). Ragheb et al. (2022) reported a significant impact of social influence
on students’ behavioural intention to use a chatbot for teaching and learning. Based
on these findings, we assume that a student would be positively influenced to use a
chatbot based on the belief of friends, family members, or colleagues. Thus, we posit
the hypothesis:
H3: Social influence positively impacts the behavioural intention to use an advis-
ing chatbot.
Perceived Trust (PT). Trust may be defined as the extent to which an individual
believes the technology is credible, reliable, and secure. Trust is a crucial construct in
a personalized automated system since a user’s belief that the chatbot is efficient and
dependable provides confidence, which signifies trust in the chatbot’s capabilities
(Wirtz et al., 2018). Furthermore, users are more optimistic when they have faith in
the technology and, thus, are more likely to accept it (Fernandes & Oliveira, 2021).
Aslam et al. (2022) showed that perceived trust plays a critical role in service chatbot
acceptance, where a higher sense of trust increases consumers’ willingness to use the
chatbot. Similarly, in the tourism domain, Pillai and Sivathanu (2020) showed that
travelers’ perceived trust leads them to use the travel planning chatbot and share per-
sonal information. Trust has also been identified as a significant factor in the intention
to use movie recommendation chatbots, with user satisfaction playing a mediating
role (Lee & Choi, 2017).
In the educational environment, trust in an advising chatbot refers to the belief that
the chatbot provides accurate information required for advising. Trust in the chatbot
could be fostered by the ability of the chatbot to comprehend the student’s query and
respond effectively and accurately. Hamidi and Chavoshi (2018) show that students
are more likely to adopt mobile technologies if they perceive them as reliable and
credible. Thus, the students’ behavioural intention to use an advising chatbot is deter-
mined by how trustworthy they perceive it to be, as they would be sharing personal
information about their academic standing, grades, and GPA and relying on accurate
responses for their advising plans. Hence, we propose the hypothesis:
H4: Perceived trust has a positive impact on advising chatbot acceptance.
Perceived Autonomy (PA). In the context of academic advising, perceived autonomy
refers to the ability of students to seek guidance and perform advising tasks
independently, without the need for a human advisor. According to Moraes (2021),
autonomy shapes support-seeking: independent students can seek guidance without
compromising their sense of control, thereby indirectly influencing students’
intentions and actions toward chatbot adoption. In
addition, Nguyen et al. (2022) based their study on the Self-Determination Theory of
motivation to show that perceived autonomy is a significant predictor of behavioural
intention to adopt and interact with a chatbot interface. Therefore, we believe that the
ability of the students to request advising-related information and make decisions on
their own would positively influence the willingness to use the chatbot. Hence, we
propose the hypothesis:
H5: Perceived autonomy has a positive impact on chatbot acceptance.
3 Methodology
In this study, a survey questionnaire was designed to collect quantitative data on the
participants’ demographic profiles and responses to construct items adapted from
existing established scales. A pilot study was conducted with five to six respondents
to test the delivery medium and language clarity of the adapted items. A description
of the constructs, their sources, and their corresponding items are presented in the
appendix. The questionnaire statements were rated on a 5-point Likert scale, with
values ranging from (strongly disagree = 1) to (strongly agree = 5).
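As an illustration, responses on such a 5-point scale can be coded numerically for analysis. This is a hypothetical sketch; the exact label wording is assumed rather than quoted from the questionnaire:

```python
# Hypothetical numeric coding of 5-point Likert labels, following the
# scale described above (strongly disagree = 1 ... strongly agree = 5).
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def encode(responses):
    """Map a list of Likert labels to their numeric scores."""
    return [LIKERT[r.lower()] for r in responses]

print(encode(["Agree", "Strongly Agree", "Neutral"]))  # [4, 5, 3]
```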
The target population of our study is students in UAE Higher Education Institutions,
including both private and public sectors. We targeted one major institution within
each sector to study students’ perceptions and willingness to adopt a chatbot for aca-
demic advising. An online survey was administered to the students to get maximum
responses. The participants were provided with a brief description of a chatbot with
images of a natural conversation with an online agent to ensure awareness of the
concept of an advising chatbot. The textual descriptions and images avoided any
potential bias and were intended to ensure that all respondents were familiar with
the subject of the study. Furthermore, careful consideration and steps were taken
to minimize non-response bias. Nonresponse bias refers to the situation where par-
ticipants chosen for a sample fail to respond, either due to refusal to participate or
inability to access the survey. This bias can lead to inconclusive research findings due
to increased variability in estimates and the sample no longer accurately representing
the entire population. To minimize nonresponse bias, we took several measures. First,
we excluded collection of sensitive information in the survey, such as names and
contact details and ensured anonymity. Second, we used a mobile friendly medium to
deliver the survey to increase response rates. Third, we also ensured the survey was
not too lengthy and would take no more than 7–8 min to complete. Finally, we com-
municated the objective of the survey and estimated time required for completion for
transparency. Moreover, to improve response rates, we sent multiple reminders to
potential respondents.
The sample size was determined based on the recommendation of Hair (2009)
of five to ten observations per parameter. Thus, the final data set of 207
responses is considered to be adequate for this study.
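The rule of thumb above reduces to simple arithmetic. The parameter count below is a hypothetical illustration, not a figure reported in the paper:

```python
# Back-of-the-envelope check of the Hair (2009) rule of thumb:
# five to ten observations per estimated parameter.
n_responses = 207
n_parameters = 24  # hypothetical, e.g. the number of indicator items

ratio = n_responses / n_parameters
print(round(ratio, 1))  # 8.6 -> inside the 5-10 band
assert 5 <= ratio <= 10
```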
3.3 Technique
Our research explores and validates the conceptual model constructs through an
Exploratory Factor Analysis (EFA) using the IBM-SPSS tool. Furthermore, the
hypotheses are evaluated using a variance-based causal model. The model was used
to perform the Confirmatory Factor Analysis (CFA), test the validity and the reliabil-
ity of the latent constructs, and evaluate the causal relationships using the Structural
Equation Modelling (SEM) (Hair Jr., Gabriel & Patel, 2014). SEM is a statistical
technique that models relations between one or more independent variables (IVs)
and one or more dependent variables (DVs), each of which may be continuous or
discrete. It provides a means for discovering and verifying relationships between
these variables.
4 Results
Table 2 (continued)

Anthropomorphism (AN) (Fernandes & Oliveira, 2021; Svikhnushina & Sciences, 2022)
  AN01  I would like the advising chatbot to be pleasant to interact with.            0.908
  AN02  I would like the advising chatbot to easily understand me.                    0.882
  AN03  I would like the advising chatbot interaction to be human-like                0.864
        (similar to communicating with a real person).
  AN04  I would like the advising chatbot to be able to express emotions.             Deleted
  AN05  I would like the advising chatbot to be empathetic.                           Deleted

Social Influence (SI) (Lu, Cai & Gursoy, 2019) (CR = 0.911, AVE = 0.72, VIF = 1.584)
  SI01  I would use an advising chatbot if many of my classmates and friends          0.762
        will use it.
  SI02  Using an advising chatbot will be a status symbol in my social networks       0.814
        (e.g., friends, family, and co-workers).
  SI03  People whose opinions I value would prefer that I use an advising chatbot     0.904
        for advising-related queries.
  SI04  People who are important to me would encourage me to use an advising          0.905
        chatbot.

Behavioural Intention to Adopt (BI) (Venkatesh, James & Xu, 2012) (CR = 0.924, AVE = 0.803)
  BI01  I intend to use the advising chatbot in the future.                           0.900
  BI02  I would always try to use the advising chatbot for my advising needs.         0.877
  BI03  I plan to use the advising chatbot frequently.                                0.911
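As a sanity check, the CR and AVE reported for Social Influence can be reproduced from its four item loadings. This sketch uses the conventional composite-reliability and AVE formulas, which are standard but not quoted in the paper:

```python
# Social Influence item loadings from Table 2
loadings = [0.762, 0.814, 0.904, 0.905]

sq = [l ** 2 for l in loadings]
ave = sum(sq) / len(sq)  # Average Variance Extracted: mean squared loading

s = sum(loadings)
# Composite Reliability: (sum of loadings)^2 over itself plus error variances
cr = s ** 2 / (s ** 2 + sum(1 - q for q in sq))

print(round(ave, 2), round(cr, 3))  # 0.72 0.911
```

Both values match those reported in the table, which is a useful cross-check when transcribing loadings.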
verify the discriminant validity of the constructs in our measurement model.
Moreover, the Variance Inflation Factor (VIF) values for all the constructs are below
3, indicating no multicollinearity between constructs (see Table 2).
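The VIF check can be illustrated with a minimal numpy sketch: for each construct score, VIF_j = 1 / (1 − R_j²), where R_j² comes from regressing that score on the remaining scores. The data below is synthetic, purely for illustration:

```python
import numpy as np

def vif(X):
    """VIF for each column: 1 / (1 - R^2) from regressing that
    column on all remaining columns (with an intercept)."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        A = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1 - ((y - A @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(0)
scores = rng.normal(size=(207, 3))      # three uncorrelated construct scores
print(all(v < 3 for v in vif(scores)))  # uncorrelated columns stay below 3
```

A value above the common cut-off of 3 would flag a construct that is largely redundant with the others.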
The six hypotheses were tested using PLS-SEM with bootstrapping of 10,000
samples. The results of the structural model, shown in Fig. 2, revealed that Social
Influence (SI), with a coefficient of 5.961 (p < 0.001), is a key factor in determining
the intentions of students to accept advising chatbots, thus supporting hypothesis 3.
Perceived Ease of Use (PEU), with a coefficient of 2.378 (p < 0.05), is another
key factor in accepting advising chatbots, supporting hypothesis 1. Furthermore,
the results show that Anthropomorphism (AN), with a coefficient of 1.61, is
marginally significant when alpha is set at 0.10; hence, we report hypothesis 6 as
marginally supported. Lastly, Perceived Usefulness (coefficient = 1.193, p > 0.05),
Perceived Autonomy (coefficient = 0.976, p > 0.05), and Perceived Trust
(coefficient = 1.119, p > 0.05) are insignificant in determining the acceptance of an
advising chatbot, thus rejecting hypotheses 2, 5, and 4, respectively. The results of
the structural model are presented in Table 6.
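The bootstrapping procedure can be sketched generically: resample the data with replacement, re-estimate a path coefficient each time, and divide the original estimate by the bootstrap standard deviation to obtain a t-statistic. The sketch below uses synthetic data and a plain OLS slope as a stand-in for one structural path; an actual PLS-SEM analysis would use dedicated software such as SmartPLS:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 207
si = rng.normal(size=n)             # synthetic "Social Influence" scores
bi = 0.5 * si + rng.normal(size=n)  # synthetic "Behavioural Intention"

def slope(x, y):
    """OLS slope of y on x (a stand-in for one structural path)."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

estimate = slope(si, bi)
boot = np.empty(2_000)              # 10,000 in the paper; fewer here for speed
for b in range(boot.size):
    idx = rng.integers(0, n, n)     # resample rows with replacement
    boot[b] = slope(si[idx], bi[idx])

t_stat = estimate / boot.std(ddof=1)  # |t| > 1.96 -> significant at p < 0.05
print(t_stat)
```

With a genuine effect built into the synthetic data, the bootstrap t-statistic comfortably exceeds the 1.96 threshold.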
The results reveal highly significant evidence that perceived ease of use impacts
the willingness of students to accept advising chatbots. Hence, advising chatbots
that allow users to accomplish their tasks effortlessly will impact the behavioural
intention to adopt the chatbot. Our results align with the findings of several studies
that have shown that ease of use is an essential factor in the technology acceptance
of AI-driven chatbots (Ashfaq et al., 2020; Pillai & Sivathanu, 2020; Aslam et al.,
2022). Therefore, the advising chatbot must provide effortless interaction features
that motivate students to use it. Since a text-based chatbot does not have complex
interaction features, ease of use in the AI-driven conversational agent may be defined
as the ease of communication in natural language. Students would utilize the chatbot
to communicate easily and enquire about their advising needs without communica-
tion complexity. Furthermore, students may expect that the advising chatbots can
understand their requests and respond to them in Arabic or English. In other words,
students may accept an advising chatbot as a replacement for in-person advisor
interaction only if the chatbot is user-friendly and makes information more accessible
without additional complexity. Students will also be willing to adopt advising
chatbots that provide accurate responses to their requests. Therefore, the advising
chatbot must enable efficient, hassle-free communication with students.
Our findings also reveal significant evidence that social influence strongly impacts
behavioural intention to accept an advising chatbot. Our findings are supported by
several studies demonstrating social influence as a critical factor in the adoption of
chatbots in various domains (Arif, Aslam & Ali, 2016; Patil and Kulkarni, 2019;
Chin-Yuan et al., 2022). The acceptance of new technologies is strongly influenced
13
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
Education and Information Technologies
13
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
Education and Information Technologies
The study makes multiple theoretical contributions. Firstly, it extends the UTAUT
model to be more appropriate for AI-based technologies. The new model integrates
constructs from the SDT and sRAM models to investigate how students adopt ser-
vice-oriented chatbots for academic advising in an educational setting. Secondly, the
study adds to the growing literature on chatbot adoption by identifying factors that
influence students’ behavioral intentions to use a chatbot for advising services. The
study introduces a new construct, "anthropomorphism," which has not been
thoroughly researched in an educational setting.
Additionally, the study addresses the lack of research on the use of service-oriented
chatbots in higher education. It tests a new research model in a multicultural environ-
ment where design constraints are significant factors in the decision-making process
towards the satisfaction and continued use of a new technology. Finally, the study
contributes not only to the IS literature but also to the student behavior literature by
exploring student perceptions of academic advising and the use of AI technologies to
support the advising process.
Despite the study’s contributions to the education advising sector, readers should
acknowledge a few limitations encountered during the research process. These
limitations create opportunities for further research. First, the data was gathered
through an online survey, and only 207 data points were collected. A larger sample
would strengthen the contributions and improve their generalizability. Several
construct items were eliminated from the model due to insufficient loadings in the
exploratory factor analysis; more data points are needed to examine the value of
these items. Finally, the study examined only six constructs
that can influence students’ willingness to accept the use of advising chatbots. As the
model presented in Fig. 1 shows, the relationships between these constructs and
behavioural intention are direct, which could be a limitation, as no mediating
constructs that might also affect these hypotheses were examined. Thus, there may
be other constructs that significantly impact students’ desire to adopt advising
chatbots. Moreover, the moderating effect of certain variables, such as age, culture,
and stickiness to human advisors, may explain the insignificance of perceived
autonomy and trust. In addition, to improve the model’s predictive ability, future
studies should investigate other antecedents and mediating constructs that impact the
willingness of students to accept the use of advising chatbots. Further research might help identify
other factors, such as hedonic motivation, that may impact advising adoption.
5 Conclusion
Using AI-based technology in the advising process can complement a human advisor.
Unconstrained by human advisors’ limitations, chatbots can provide continuous
service for a more positive higher-education experience. In addition to improving
communication and interaction with students, AI-driven chatbots can enhance
institutional performance. This research is a novel study that
examines the acceptance of academic advising chatbots in higher education. Samples
from two major universities were collected through a survey for the purpose of this
research study. A conceptual model was created by extending constructs from the
UTAUT/TAM models to predict users’ behavioural intentions to utilize an advising
chatbot. The study adopted functional, socio-emotional, and relational constructs
of technology acceptance derived from various theoretical models, and the
hypotheses were tested using PLS-SEM. The results demonstrated that social
influence and perceived ease of use are the socio-functional elements that
significantly influenced behavioural intention for chatbot acceptance. In contrast,
perceived usefulness, autonomy, and trust were not observed to influence the
acceptance of advising chatbots, while anthropomorphism was marginally significant
when the p-value threshold is set at 0.1. A major contribution of this study is that it presents various recommendations
for educational institutions to implement AI-driven chatbots effectively for academic
advising.
Funding This research did not receive any specific grant from funding agencies in the public, commercial,
or not-for-profit sectors.
Data Availability The datasets used and analysed during the current study are available from the corre-
sponding author on reasonable request.
Declarations
Conflict of interest The authors declare that they have no conflicting interests.
References
Adam, M., Wessel, M., & Benlian, A. (2021). AI-based chatbots in customer service and their effects on
user compliance. Electronic Markets, 31(2), 427–445.
Almahri, F. A. J., Bell, D., & Merhi, M. (2020). Understanding Student Acceptance and Use of Chatbots
in the United Kingdom Universities: A Structural Equation Modelling Approach. 2020 6th IEEE
International Conference on Information Management, ICIM 2020, pp. 284–288.
Almaiah, M. A., Alamri, M. M., & Al-Rahmi, W. (2019). Applying the UTAUT model to explain the
students’ acceptance of mobile learning system in higher education. IEEE Access, 7, 174673–174686.
Al Shamsi, J. H., Al-Emran, M., & Shaalan, K. (2022). Understanding key drivers affecting students’
use of artificial intelligence-based voice assistants. Education and Information Technologies, 27(6),
8071–8091. https://doi.org/10.1007/s10639-022-10947-3
Araujo, T. (2018). Living up to the chatbot hype: The influence of anthropomorphic design cues and com-
municative agency framing on conversational agent and company perceptions. Computers in Human
Behavior, 85, 183–189.
Arif, I., Aslam, W., & Ali, M. (2016). Students’ dependence on smartphones and its effect on purchasing
behavior. South Asian Journal of Global Business Research, 5(2), 285–302.
Ashfaq, M., Yun, J., Yu, S., & Loureiro, S. M. C. (2020). I, Chatbot: Modeling the determinants of users’
satisfaction and continuance intention of AI-powered service agents. Telematics and Informatics, 54, 101473.
Aslam, W., Siddiqui, D. A., Arif, I., & Farhat, K. (2022). Chatbots in the frontline: drivers of acceptance.
Kybernetes.
Assiri, A., Al-Ghamdi, A. A. M., & Brdesee, H. (2020). From traditional to intelligent academic advising:
A systematic literature review of e-academic advising. International Journal of Advanced Computer
Science and Applications, 11(4), 507–517.
Bagozzi, R. P. (2007). The legacy of the technology acceptance model and a proposal for a paradigm shift. Journal of the Association for Information Systems, 8(4), 3.
Bilquise, G., Ibrahim, S., & Shaalan, K. (2022a). Bilingual AI-driven chatbot for academic advising. International Journal of Advanced Computer Science and Applications, 13(8).
Bilquise, G., Ibrahim, S., & Shaalan, K. (2022b). Emotionally intelligent chatbots: A systematic literature review. Human Behavior and Emerging Technologies, 2022.
Bilquise, G., & Shaalan, K. (2022). AI-based academic advising framework: A knowledge management perspective. International Journal of Advanced Computer Science and Applications, 13(8).
Brachten, F., Kissmer, T., & Stieglitz, S. (2021). The acceptance of chatbots in an enterprise context – A survey study. International Journal of Information Management, 60, 102375.
Cai, D., Li, H., & Law, R. (2022). Anthropomorphism and OTA chatbot adoption: A mixed methods study. Journal of Travel & Tourism Marketing, 39(2), 228–255.
Campbell, S. M., & Nutt, C. L. (2008). Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Review, 10(1), 4–7.
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
Education and Information Technologies
Chan, Z. C. Y., Chan, H. Y., Chow, H. C. J., Choy, S. N., Ng, K. Y., Wong, K. Y., & Yu, P. K. (2019). Academic advising in undergraduate education: A systematic review. Nurse Education Today, 75, 58–74.
Chin-Yuan, H., Ming-Chin, Y., I-Ming, C., & Wen-Chang, H. (2022). Modeling consumer adoption inten-
tion of an AI-Powered Health Chatbot in Taiwan: An empirical perspective. International Journal of
Performability Engineering, 18(5), 338.
Chocarro, R., Cortiñas, M., & Marcos-Matás, G. (2021). Teachers’ attitudes towards chatbots in education: A technology acceptance model approach considering the effect of social language, bot proactiveness, and users’ characteristics. Educational Studies, 1–19.
Crookston, B. B. (1994). A Developmental View of Academic Advising as Teaching. NACADA Journal,
14(2), 5–9.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
Deci, E. L., & Ryan, R. M. (2012). Self-determination theory. Sage publications ltd.
De Keyser, A., Köcher, S., Alkire, L., Verbeeck, C., & Kandampully, J. (2019). Frontline service technology infusion: Conceptual archetypes and future research directions. Journal of Service Management.
de Visser, E. J., Monfort, S. S., McKendrick, R., Smith, M. A. B., McKnight, P. E., Krueger, F., & Para-
suraman, R. (2016). Almost human: Anthropomorphism increases trust resilience in cognitive agents.
Journal of Experimental Psychology: Applied, 22(3), 331–349.
de Vreede, T., Raghavan, M., & de Vreede, G. J. (2021). Design foundations for AI assisted decision-making: A self-determination theory approach. Proceedings of the Annual Hawaii International Conference on System Sciences, pp. 166–175.
Drake, J. K. (2011). The role of academic advising in Student Retention and Persistence. About Campus:
Enriching the Student Learning Experience, 16(3), 8–12.
Dwivedi, Y. K., Rana, N. P., Jeyaraj, A., Clement, M., & Williams, M. D. (2019). Re-examining the Unified Theory of Acceptance and Use of Technology (UTAUT): Towards a revised theoretical model. Information Systems Frontiers, 21(3), 719–734.
Fernandes, T., & Oliveira, E. (2021). Understanding consumers’ acceptance of automated technologies in
service encounters: Drivers of digital voice assistants adoption. Journal of Business Research, 122,
180–191. https://doi.org/10.1016/j.jbusres.2020.08.058.
Fiske, S. T., Cuddy, A. J. C., & Glick, P. (2007). Universal dimensions of social cognition: Warmth and competence. Trends in Cognitive Sciences, 11(2), 77–83.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.
Fricker, T. (2015). The relationship between academic advising and student success in Canadian colleges: A review of the literature. College Quarterly, 18(4).
Go, E., & Sundar, S. S. (2019). Humanizing chatbots: The effects of visual, identity and conversational cues on humanness perceptions. Computers in Human Behavior, 97, 304–316.
Grudin, J., & Jacques, R. (2019). Chatbots, humbots, and the quest for artificial general intelligence. Con-
ference on Human Factors in Computing Systems - Proceedings, pp. 1–11.
Gummerus, J., Lipkin, M., Dube, A., & Heinonen, K. (2019). Technology in use – characterizing customer
self-service devices (SSDS). Journal of Services Marketing, 33(1), 44–56.
Gupta, P., Yadav, S. (2022). A TAM-based Study on the ICT Usage by the Academicians in Higher Edu-
cational Institutions of Delhi NCR. In Congress on Intelligent Systems: Proceedings of CIS 2021,
Volume 2, 329–353.
Gursoy, D., Chi, O. H., Lu, L., & Nunkoo, R. (2019). Consumers acceptance of artificially intelligent
(AI) device use in service delivery. International Journal of Information Management, 49, 157–169.
Hair, J. F. (2009). Multivariate data analysis.
Hair, J. F., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2016). A primer on partial least squares structural equation modeling (2nd ed.). Sage Publications.
Hair, J. F., Jr., Gabriel, M. L. D. da S., & Patel, V. K. (2014). AMOS covariance-based structural equation modeling (CB-SEM): Guidelines on its applications as a marketing research tool. Revista Brasileira de Marketing, 13(2), 44–55.
Hair, J. F., Ringle, C. M., & Sarstedt, M. (2011). PLS-SEM: Indeed a silver bullet. Journal of Marketing Theory and Practice, 19(2), 139–152.
Hamidi, H., & Chavoshi, A. (2018). Analysis of the essential factors for the adoption of mobile learning in higher education: A case study of students of the University of Technology. Telematics and Informatics, 35(4), 1053–1070.
Henseler, J., Dijkstra, T. K., Sarstedt, M., Ringle, C. M., Diamantopoulos, A., Straub, D. W., Ketchen, D. J., Hair, J. F., Hult, G. T. M., & Calantone, R. J. (2014). Common beliefs and reality about PLS: Comments on Rönkkö and Evermann (2013). Organizational Research Methods, 17(2), 182–209.
Hevner, A. R., March, S. T., Park, J., & Ram, S. (2004). Design science in information systems research. MIS Quarterly, 28(1), 75–105.
Ho, C. C., Lee, H. L., Lo, W. K., & Lui, K. F. A. (2018). Developing a chatbot for college student pro-
gramme advisement. 2018 International Symposium on Educational Technology (ISET). IEEE,
pp. 52–56.
Hu, X. (2020). Building an equalized technology-mediated advising structure: Academic advising at community colleges in the post-COVID-19 era. Community College Journal of Research and Practice, 44(10–12), 914–920.
Iatrellis, O., Kameas, A., & Fitsilis, P. (2017). Academic advising systems: A systematic literature review of empirical evidence. Education Sciences, 7(4).
Jiménez-Barreto, J., Rubio, N., & Molinillo, S. (2021). “Find a flight for me, Oscar!” Motivational customer experiences with chatbots. International Journal of Contemporary Hospitality Management, 33(11), 3860–3882.
Junco, R., Mastrodicasa, J. M., Aguiar, A. V., Longnecker, E. M., & Rokkum, J. N. (2016). Impact of
technology-mediated communication on student evaluations of advising. NACADA Journal, 36(2),
54–66.
Kim, J., Merrill, K., Xu, K., & Sellnow, D. D. (2020). My teacher is a machine: Understanding students’
perceptions of AI teaching assistants in online education. International Journal of Human–Computer
Interaction, 36(20), 1902–1911. https://doi.org/10.1080/10447318.2020.1801227.
Kuberkar, S., & Singhal, T. K. (2020). Factors influencing adoption intention of AI powered chatbot for public transport services within a smart city. International Journal on Emerging Technologies, 11(3), 948–958.
Kuhail, M. A., Al Katheeri, H., Negreiros, J., Seffah, A., & Alfandi, O. (2022). Engaging students with a chatbot-based academic advising system. International Journal of Human–Computer Interaction, 1–27.
Lee, S. Y., & Choi, J. (2017). Enhancing user experience with conversational agent for movie recommendation: Effects of self-disclosure and reciprocity. International Journal of Human-Computer Studies, 103, 95–105.
Lim, M. S., Ho, S. B., & Chai, I. (2021). Design and functionality of a university academic advisor chatbot
as an early intervention to improve students’ academic performance. Computational science and
technology (pp. 167–178). Springer.
Liu, C., & Ammigan, R. (2022). Humanizing the academic advising experience with technology: An inte-
grative review. STAR Scholar Book Series, pp. 185–202.
Lorenz, G. V., & Buhtz, K. (2017). Social influence in technology adoption research: A literature review and research agenda.
Lu, L., Cai, R., & Gursoy, D. (2019). Developing and validating a service robot integration willingness
scale. International Journal of Hospitality Management, 80(January), 36–51.
Meet, R. K., Kala, D., & Al-Adwan, A. S. (2022). Exploring factors affecting the adoption of MOOC
in Generation Z using extended UTAUT2 model. Education and Information Technologies, 27(7),
10261–10283. https://doi.org/10.1007/s10639-022-11052-1
Mohamad Suhaili, S., Salim, N., & Jambli, M. N. (2021). Service chatbots: A systematic review. Expert Systems with Applications, 184, 115461.
Moraes, C. L. (2021). Chatbot as a Learning Assistant: Factors influencing adoption and recommenda-
tion. Information Management School.
Moran, M. (2022). Startup Bonsai [online]. [Accessed 29 October 2022]. Available at: https://startupbonsai.com/chatbot-statistics/
Nguyen, Q. N., Sidorova, A., & Torres, R. (2022). User interactions with chatbot interfaces vs. menu-based interfaces: An empirical study. Computers in Human Behavior, 128, 107093.
Nikou, S. A., & Economides, A. A. (2017). Mobile-based assessment: Integrating acceptance and motivational factors into a combined model of self-determination theory and technology acceptance. Computers in Human Behavior, 68, 83–95.
Noaman, A. Y., & Ahmed, F. F. (2015). A new framework for e-academic advising. Procedia Computer Science, 65, 358–367.
Okonkwo, C. W., & Ade-Ibijola, A. (2021). Chatbots applications in education: A systematic review. Com-
puters and Education: Artificial Intelligence, 2, 100033. https://doi.org/10.1016/j.caeai.2021.100033.
Patil, K., & Kulkarni, M. S. (2019). Artificial intelligence in financial services: Customer chatbot advisor
adoption. Int J Innov Technol Explor Eng, 9(1), 4296–4303.
Pedrotti, M., & Nistor, N. (2016). User motivation and technology acceptance in online learning environments. In K. Verbert, M. Sharples, & T. Klobučar (Eds.), Adaptive and Adaptable Learning (pp. 472–477). Cham: Springer International Publishing.
Pillai, R., & Sivathanu, B. (2020). Adoption of AI-based chatbots for hospitality and tourism. International
Journal of Contemporary Hospitality Management, 32(10), 3199–3226.
Ragheb, M. A., Tantawi, P., Farouk, N., & Hatata, A. (2022). Investigating the acceptance of applying chat-bot (artificial intelligence) technology among higher education students in Egypt. International Journal of Higher Education Management, 8(2), 1–14.
Rapp, A., Curti, L., & Boldi, A. (2021). The human side of human-chatbot interaction: A systematic literature review of ten years of research on text-based chatbots. International Journal of Human-Computer Studies, 151, 102630.
Rietz, T., Benke, I., & Maedche, A. (2019). The impact of anthropomorphic and functional chatbot design features in enterprise collaboration systems on user acceptance. Wirtschaftsinformatik, pp. 1642–1656.
Robbins, R. (2020). Engaging Gen Zers through academic advising. Academic Advising Today, 43(2).
Sandu, N., & Gide, E. (2019). Adoption of AI-chatbots to enhance student learning experience in higher education in India. 2019 18th International Conference on Information Technology Based Higher Education and Training (ITHET). IEEE, pp. 1–5.
Saner, R. (2018). Chatbots: Changing user needs and motivations. The Expert Negotiator, pp. 69–84.
Sawang, S., Sun, Y., & Salim, S. A. (2014). It’s not only what I think but what they think! The moderating effect of social norms. Computers & Education, 76, 182–189.
Sheehan, B., Jin, H. S., & Gottlieb, U. (2020). Customer service chatbots: Anthropomorphism and adoption. Journal of Business Research, 115, 14–24.
Sánchez-Prieto, J. C., Cruz-Benito, J., Therón Sánchez, R., & García-Peñalvo, F. J. (2020). Assessed by machines: Development of a TAM-based tool to measure AI-based assessment acceptance among students. International Journal of Interactive Multimedia and Artificial Intelligence, 6(4), 80. https://doi.org/10.9781/ijimai.2020.11.009
Solomon, M. R., Surprenant, C., Czepiel, J. A., & Gutman, E. G. (1985). A role theory perspective on dyadic interactions: The service encounter. Journal of Marketing, 49(1), 99–111.
Sørebø, Ø., Halvari, H., Gulli, V. F., & Kristiansen, R. (2009). The role of self-determination theory in explaining teachers’ motivation to continue to use e-learning technology. Computers & Education, 53(4), 1177–1187.
Svikhnushina, E., & Pu, P. (2022). PEACE: A model of key social and emotional qualities of conversational chatbots. ACM Transactions on Interactive Intelligent Systems.
van Pinxteren, M. M. E., Wetzels, R. W. H., Rüger, J., Pluymaekers, M., & Wetzels, M. (2019). Trust
in humanoid robots: Implications for services marketing. Journal of Services Marketing, 33(4),
507–518.
Venkatesh, V., Thong, J. Y. L., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly, 36(1), 157–178.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478.
Wirtz, J., Patterson, P. G., Kunz, W. H., Gruber, T., Lu, V. N., Paluch, S., & Martins, A. (2018). Brave new world: Service robots in the frontline. Journal of Service Management, 29(5), 907–931. https://doi.org/10.1108/JOSM-04-2018-0119
Young-Jones, A. D., Burt, T. D., Dixon, S., & Hawthorne, M. J. (2013). Academic advising: Does it really impact student success? Quality Assurance in Education, 21(1), 7–19.
Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps
and institutional affiliations.
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under
a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted
manuscript version of this article is solely governed by the terms of such publishing agreement and appli-
cable law.
Ghazala Bilquise
gbilquise@hct.ac.ae
Samar Ibrahim
sibrahim@adjunct.aud.edu
Sa’Ed M. Salhieh
saed.salhieh@buid.ac.ae
1 Department of Computer Information Science, Higher Colleges of Technology, Dubai, UAE
2 School of Arts and Science, American University in Dubai, Dubai, UAE
3 College of Engineering and IT, The British University in Dubai, Dubai, UAE