
User Experience on E-learning Platforms in Higher Education
Luca Giraldi (l.giraldi@unimc.it)
University of Macerata
Marta Giovannetti
University of Macerata
Elena Cedrola
University of Macerata

Article

Keywords: student modelling learning, e-learning platforms, user behaviour, user experience, digital
transformation

Posted Date: May 15th, 2023

DOI: https://doi.org/10.21203/rs.3.rs-2753702/v1

License: This work is licensed under a Creative Commons Attribution 4.0 International License.

Additional Declarations: No competing interests reported.


User Experience on E-learning Platforms in Higher
Education
Luca Giraldi1*, Marta Giovannetti1+ and Elena Cedrola1+
1 University of Macerata, Department of Economics and Law, Macerata, Italy
* l.giraldi@unimc.it
+ these authors contributed equally to this work

ABSTRACT

Even though Covid-19 facilitated the move towards e-learning, research on the user experience (UX) of e-learning platforms
has been limited, particularly regarding its cognitive and emotional outcomes. Considering this gap, this study proposes a
non-invasive method for assessing emotional effects related to e-learning platforms.
The study involved an experiment with 23 university students and compared the effectiveness of a real-time face and eye
detection methodology (MIORA) with a retrospective questionnaire (SAM) in understanding the emotional responses elicited
by the user-platform interaction. To examine the consistency between the two tools, the authors intentionally introduced
usability issues in the system to observe students' emotional reactions.
The study's results confirmed the research hypothesis that real-time non-invasive tools for assessing emotional reactions are
more comprehensive and reliable than the SAM questionnaire. Furthermore, these tools enable dynamic adaptations to the
site's usability and interface based on the student's emotional reactions, potentially improving satisfaction and learning
outcomes.
The findings inform future research on how emotional responses to e-learning platforms can impact user experience and
learning outcomes. Ultimately, this study offers a foundation for understanding the emotional outcomes of e-learning and how
they can be effectively assessed to improve online and hybrid education.

KEYWORDS: student modelling learning, e-learning platforms, user behaviour, user experience, digital transformation.

Introduction
Online education and educational platforms have grown in recent decades due to technological advances and the spread of the
Internet. The platform market has exceeded estimates (Statista, 2015): its value reached $315B in 2022 (Global Market Insight, 2022), while mobile education apps number around 600K, with 470M downloads on the App Store and 466M on Google Play 1. The Covid-19 pandemic and the ensuing lockdowns accelerated the shift from a traditional to a digital learning environment 2. On these platforms, educators and trainers upload course content that students can access or download anywhere, on any device, making learning sustainable 3.
As these platforms become relevant, it is necessary to understand the impact of User Experience (UX) on engagement and the
emotional and cognitive sphere. An e-learning system must be sufficiently usable and immersive to prevent users from
experiencing negative emotional and cognitive states while performing tasks. E-learning application developers and designers
should prioritise achieving good usability and user experience for the widest possible range of users.
Furthermore, research indicates that UX evaluation in education platforms is essential for learning success 4,5.
In the realm of user-centred design, the definition of UX encompasses the entirety of a user's interaction process with a product,
service, or website, from initial engagement to ultimate disengagement. This journey represents the user's active exploration
and evaluation of the environment as they seek to acquire specific knowledge and form impressions 6,7.
Multiple aspects of UX and different ways of designing interactive platforms can be considered: of paramount importance are
the quality of the content and technical aspects of the platform (Poma et al., 2021), attractiveness, perspicuity, efficiency,
reliability, emotional stimulation, and novelty 8. In this regard, most human interaction with the software comes from non-verbal
communication, so researchers have mainly focused on the user's emotional state during their UX 9–12. To enhance the course
content and the platform interface's intuitiveness, utilising emotion detection represents a potential avenue of exploration 13.
A proper usability assessment may lead to platform customisation, which can influence consumer behaviour and thus increase
UX 14. One of the most well-established definitions of software usability is EN ISO 9241: "Usability is the level at which a user can achieve through software their goals effectively, efficiently and with high levels of satisfaction" 15. One of the most popular methods for evaluating and optimising usability is the so-called usability test 16, in which an appropriate number of participants attempt the typical tasks a piece of software should support.
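The ISO 9241 definition above decomposes usability into three measurable dimensions. As a minimal illustration (data and field names are hypothetical, not from the study), these can be computed from the raw logs of a usability test session:

```python
# Minimal sketch: aggregating the three ISO 9241 usability dimensions from
# hypothetical usability-test logs (this is not the authors' tooling).
from statistics import mean

# Each record: (participant, task, completed, time_in_seconds, satisfaction_1_to_9)
logs = [
    ("p01", "task1", True, 118, 7),
    ("p01", "task2", False, 240, 3),
    ("p02", "task1", True, 95, 8),
]

def usability_summary(records):
    """Effectiveness, efficiency and satisfaction as simple aggregates."""
    return {
        "effectiveness": mean(1.0 if r[2] else 0.0 for r in records),  # completion rate
        "efficiency_s": mean(r[3] for r in records if r[2]),           # mean time on completed tasks
        "satisfaction": mean(r[4] for r in records),                   # mean self-reported rating
    }

print(usability_summary(logs))
```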
Various instruments, invasive or non-invasive and retrospective or real-time, can be used to assess the usability of a platform. To assess the effect of usability attributes, questionnaires that ask participants to recall their feelings, such as the SAM questionnaire 17, are commonly used. However, these retrospective instruments may lead users to remember only high-dominance emotions while omitting everything else, or to perceive an emotion less intensely than it was felt.
Several technological innovations have facilitated novel approaches for conceptualising the behaviour and decision-making of
online users, primarily via non-invasive methodologies. Non-invasive methodologies allow for identifying users' unconscious
emotions in real-time, ensuring results that are not influenced by retrospective bias and are, therefore, entirely dependable 18.
This paper investigates the emotional and cognitive state while evaluating the usability and UX of an online course platform
using the MIORA monitoring software in an experiment involving N=23 university students. Several studies have applied
artificial intelligence methods for learner tracking in e-learning platforms to enhance the acquisition of educational content
12,19,20. However, this article presents a pioneering investigation into the user experience (UX) of educational platforms. This research
employs a novel approach using Multiple Instruments for Objective and Subjective Analysis (MIORA) and the Self-Assessment
Manikin (SAM) retrospective evaluation tool. Furthermore, this study emphasises emotional outcomes over cognitive outcomes
(Lopez-Aguilar et al., 2021).
The paper follows a structured approach, discussing the theoretical background surrounding user experience (UX), emphasising
emotions in UX and UX assessment. The subsequent section introduces the research aim and design, which presents the MIORA
framework and measurement tool. This framework enables the mapping of expressions and facilitates the measurement of
emotional reactions when using educational platforms or tools. The methodology employed and the resulting data collection
outcomes are presented in sequence, followed by a comprehensive discussion of the implications for theory, education, and
practice.

Theoretical Background

• The role of emotions in UX


To better understand the role of emotions in UX, a search of the most influential literature in the field was carried out 21. Previous
research identifies a correlation between emotion and user response behaviour 22. Emotion is considered a key factor in
determining UX while using interactive technologies. Forlizzi and Battarbee 23 state that "emotion influences the way users interact with products, perceptions and outcomes surrounding such interactions" (p. 264).
Generally, reference is made to the FACS construct 24, which identifies six primary emotions: happiness, surprise, sadness, anger, fear, and disgust. These emotions are used to describe how the learner feels while using a website. Happiness refers to the degree of satisfaction and excitement, while anxiety refers to how insecure one feels. Sadness refers to how discouraged one feels, while anger describes frustration or irritation at prolonged website response times. Although the emotions are related, their relationship is not symmetrical but complex, so the presence of one neither guarantees nor excludes the presence of another 25.
To achieve UX optimisation, the design should be modified in real time according to users' emotional reactions during their interaction with the website. For this, it is crucial to identify users' emotions, experiences and skills, and to pinpoint where changes are needed 26.
Existing literature has already explored different student reactions through Artificial Intelligence techniques in online platforms.
Many research works have relied on artificial intelligence to unobtrusively assess a student's engagement by analysing signals from facial expressions 27, head posture and gaze 19,20. D'Mello, Chipman and Graesser 28 used the student's posture to discriminate between low and high involvement (boredom). Buono et al. (2022) detected student involvement by comparing two
assessment methods, an invasive one consisting of self-assessment questionnaires at the end of the test and a non-invasive
instrument for coding facial expressions. The authors posit that involvement encompasses two key factors: the perceived
difficulty level associated with the assigned task and the range of emotional responses elicited by the learning process. While
Buono et al. (2022) also acknowledged the emotional dimension of involvement concerning facial expressions, their
investigation primarily focused on cognitive involvement without delving deeply into the emotional component.
Few works use deep learning and AI to recognise students’ emotions in real-time and thus adapt the platform to the student's
feelings 29–31. Therefore, this paper proposes a non-invasive tool for emotional evaluation during an e-learning course, aiming
to make the learning and platform more satisfying.

• UX Assessment Method
With the advent of powerful machine learning technologies, the assessment of emotions while using a digital service has progressed significantly in recent years 32. One can distinguish between subjective and objective measurements, and between retrospective and real-time measurements. Self-reported methods consist of participants reporting their feelings and thoughts in a questionnaire or survey that does not require the intervention of an expert. They help gather information regarding the user experience cost-effectively.
Several standardised usability questionnaires have been developed in the literature, such as the Questionnaire for User
Interaction Satisfaction (QUIS)33, Software Usability Measurement Inventory (SUMI) 34 and System Usability Scale (SUS) 35.
The most frequent method for assessing users' emotional states is subjective evaluations in which users fill out questionnaires
such as the Self-Assessment Manikin (SAM) 36. SAM allows participants to assess their emotional states through rating scales.
One of the main advantages of the SAM questionnaire is its reproducibility, which allows for various comparisons; it is also quick and easy to apply 37.
However, UX studies have identified a critical caveat in subjective ratings of emotions 11. Scherer argued that emotions are fleeting, i.e. intense peaks of short-term experience that may be difficult to remember later 38. This issue also affects concurrent self-reported evaluations such as thinking aloud, in which participants judge the platform and UX during the website usability test; this form of evaluation can also lead to a lack of concentration during the test and to unnatural behaviour.
Retrospective self-report instruments are the easiest and cheapest way to measure emotions. However, these evaluations reflect a filtered feeling and are thus potentially compromised by the subject's consciousness and cognitive biases 39, so they cannot be considered instruments with fully reliable results.
In recent years, psychological and neuroscientific research has made notable strides in implementing biometric sensors to
provide objective, rather than subjective, evaluations of emotional states. This is achieved through the utilisation of sensors
attached to the body. Recently, the analysis of physiological signals captured by sensors, such as the Galvanic Skin Response, which measures skin sweating and correlates it with arousal, has also been considered for analysing emotional-cognitive reactions 40. Other sensors include electromyography (EMG), heart rate (HR), and electroencephalography (EEG). All these biometric sensors were initially developed for medical purposes and applied in controlled environments such as laboratories, which require the presence of experts. Moreover, these sensors are intrusive, since users have to wear them, and expensive.
Researchers have recently developed non-invasive user assessment instruments as an alternative to the previously mentioned
methods 41. These monitoring systems do not require wearable devices and have the advantage that they can also be used in unsupervised contexts, as they only require a simple video camera.
Standard methods include video-based facial expression analysis 42, emotion recognition from the human voice 3 and eye-tracking 43. Current research on recognising students' emotions from their facial expressions includes classifications into learning-centred emotions (boredom, confusion, frustration, eureka, flow/engagement) (D'Mello & Graesser, 2012); engagement, immersion, motivational (challenge) and cognitive (skills) dimensions (Buono et al., 2022); and Ekman's basic emotions (happiness, sadness, fear, anger, disgust and surprise) (Ekman, 2005).
Eye-tracking technology tracks eyeball movements and determines focus positions. With improved accuracy and simple laptop webcams, this technique is used to measure users' eye movements while interacting with system interfaces 44 and has been used in many studies of interface design 45.
Eye-tracking sensors capture eye movements and detect engagement, arousal, stress and fatigue during a user's interaction with the product. They do this by recording metrics such as time to first fixation, percentage of users who fixated, total fixation duration, fixation count, and visit count in each area of interest.
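As an illustration of how such metrics are derived, the sketch below computes time to first fixation, total fixation duration and visit count for one rectangular area of interest from a stream of timestamped gaze points; the data format and AOI are assumptions, not details of any specific eye tracker:

```python
# Hedged sketch: AOI-based fixation metrics from (timestamp, x, y) gaze samples,
# assumed to be already fixation-filtered by the eye tracker.
AOI = (100, 100, 400, 300)  # hypothetical area of interest: x, y, width, height

def in_aoi(x, y, aoi=AOI):
    ax, ay, w, h = aoi
    return ax <= x <= ax + w and ay <= y <= ay + h

def fixation_metrics(samples):
    """samples: list of (timestamp_s, x, y), ordered by time."""
    first_fixation, total_duration, visits = None, 0.0, 0
    inside_prev = False
    for (t0, x0, y0), (t1, _, _) in zip(samples, samples[1:]):
        inside = in_aoi(x0, y0)
        if inside:
            if first_fixation is None:
                first_fixation = t0            # time of the first fixation in the AOI
            total_duration += t1 - t0          # accumulate fixation time in the AOI
            if not inside_prev:
                visits += 1                    # a new visit to the AOI begins
        inside_prev = inside
    return {"time_to_first_fixation_s": first_fixation,
            "total_fixation_duration_s": total_duration,
            "visit_count": visits}
```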
It has previously been found that mapping facial expressions and tracking emotional responses can help better understand the
user experience and improve interaction with the platform and content (Lopez-Aguilar et al. 2021; Buono et al. 2022). The
experiment by Ming et al. (2021) showed that it is possible to assess the human emotional-cognitive process in online browsing through eye tracking. This technology has been used in various contexts, such as video games 46 and digital entertainment 47.

Methods
Previous literature has investigated the impact of engagement on performance and the student learning experience in e-learning contexts 12,19,20,27. The originality of the present work with respect to this literature lies in the combined use of two assessment methods, one concurrent and one ex-post. Buono et al. (2022) also combined concurrent and ex-post measurement mechanisms, but their study addressed the engagement aspects of the learning platform and investigated neither the role of the emotional dimension in engagement nor the influence of platform usability. This study starts from different theoretical and behavioural premises:
first, we investigate the emotional aspects of an e-learning site usage, affected by usability problems, to assess the influence of
the graphical/layout setting on satisfaction and, more generally, on UX. Second, we detect the learning experience and the level
of emotional and cognitive student engagement during an online lesson, helping detect problems with the course material and
adapting the lecture content accordingly. Specifically, this research proposes the application of a web-based tool for remote
usability testing of an online teaching website. The tool, called MIORA, assesses and measures the degree of usability of the
website, identifying in real-time the users' emotional reactions during interaction with the software, in a non-invasive manner,
without the use of sensors, only with the help of a webcam built into the computer. The tool records the user's face via the webcam, together with the screen. It has two main modules: emotion analysis via facial expression recognition and gaze analysis via eye tracking.
Facial expression analysis aims to recognise patterns of facial expressions using the FACS manual 48. In addition to detecting the seven primary emotions (joy, sadness, anger, fear, contempt, disgust and surprise), this module can assess the level of satisfaction derived from valence and arousal.
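MIORA itself is proprietary, but the general shape of such a webcam pipeline can be sketched as follows, using OpenCV's bundled Haar cascade for face detection and a placeholder where a trained FACS-based emotion classifier would sit:

```python
# Generic sketch of a non-invasive, webcam-only emotion pipeline (not MIORA's
# actual implementation); the emotion model is a placeholder.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_img):
    """Placeholder: a real system would run a trained FACS-based classifier here."""
    return {"neutral": 1.0}

cap = cv2.VideoCapture(0)                          # built-in webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        print(classify_emotion(gray[y:y + h, x:x + w]))  # per-frame emotion scores
    cv2.imshow("capture", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):          # press 'q' to stop
        break
cap.release()
cv2.destroyAllWindows()
```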
Upon concluding the evaluation process, the web terminal automatically generates a report evaluating the user experience (UX), which is readily accessible to the user. Additionally, researchers can manually evaluate the UX through the screen recording, analysing the specific moments of the experience where user emotions were at their lowest or highest points. These data can be compared with the report automatically generated by the system, which shows several easy-to-understand graphs and diagrams, including the recording of the user's face, the application screen recording and the UX evaluation report.
As in this previous research, student behaviour is monitored throughout the course, interspersed with self-assessment questionnaires to be completed. In the present experiment, however, the authors deliberately influenced the students' emotional curve by inserting platform usability problems and investigating the students' emotional reactions. The objectives of the study are therefore:
− To propose and test the MIORA tool for the UX analysis of educational platforms;
− To compare a simultaneous evaluation method (MIORA) with a retrospective evaluation tool (SAM);
− To show the results of an empirical analysis based on the experimental methodology on detecting and analysing
emotions to improve the platform's usability and UX evaluation dynamically.
From a synthesis of our research objectives, we formulated two hypotheses to be tested experimentally:
H1: Non-invasive methods produce more comprehensive and less biased results, rendering them more dependable than retrospective subjective self-assessment systems, which tend to overestimate outcomes due to external conditioning.
H2: Non-invasive methods allow the UX to be improved through real-time customisation of the GUI and, therefore, lead to a better UX evaluation.

• Inclusion and Exclusion Criteria


An invitation to participate in this experiment as a research sample was sent by e-mail to all students regularly enrolled in a
degree course. There were no rewards, and participation was entirely voluntary.
Informed consent was obtained from all respondents. The tests took place at the Polytechnic University of Marche (Ancona).
The study follows relevant guidelines and regulations. Respondents gave their informed consent for the publication of their
pictures.
The study began by selecting a sample of N=50 students, after which inclusion and exclusion criteria were applied to ensure the
sample was cohesive and in line with the experimental parameters.
The initial sample was asked to fill in a concise closed-ended questionnaire that investigated their demographic and attitudinal
characteristics, knowledge and familiarity with User Experience, IT skills (possession of professional certifications), and
interest in the subject. From the analysis of the responses, the authors proceeded with the selection by applying the inclusion
and exclusion criteria.
Exclusion Criteria:
− Students over 45 were excluded, as they were considered too distant from the rest of the sample, whose average age was around 24; the authors feared they would skew the results of the experiment.
− Students located more than 50 km from the test venue were excluded due to mobility issues.
− Students who stated they lacked familiarity with computers and the Internet were excluded.
− Students who declared no interest in taking the test were excluded, as this lack of interest would have led them to take the test unwillingly and thus distort the results.
− Subjects with no basic knowledge or understanding of user experience were excluded, as they could not have fully comprehended the test questions.
− The authors excluded students with visual, auditory or cognitive disabilities, which would have prevented them from completing the experiment correctly and distorted MIORA's results.
Inclusion Criteria:
− A prerequisite for participation in the experiment was use of the Italian language, as the test was conducted exclusively in Italian.
− Students who had participated in an experiment of this type no more than five years previously were included.
The screening process resulted in a final sample of N=23 participants from different engineering university courses in Italy. Details of the sample are given below (Table 1). Please refer to the link in the "Data Availability" section for complete participant data.

Participants: N=23
Age: average 24
Gender: 70% male, 30% female
University location: 100% Italy
University department: 39% Computer Engineering, 39% Electronic Engineering, 22% Biomedical Engineering
Type of degree: 74% Bachelor's degree, 26% Master's degree

Table 1. The demographic composition of university students

• Experimental Methodology Design


Before the experiment, its full procedure was precisely mapped out, as depicted in Figure 1.
Figure 1. Experimental Methodology Design

MIORA consists of two modules. The first module uses machine and deep learning algorithms to recognise students' emotional states from facial expressions and to categorise them according to Ekman's Facial Action Coding System (a percentage of emotion associated with each of seven emotional states: happiness, sadness, boredom, anger, fear, surprise, neutrality). Facial expression analysis also makes it possible to detect the percentage of valence and arousal, i.e. the pleasantness and the intensity of the emotion.
The second module uses eye-tracking to detect eye movement and rotation of the head up, down, left or right. This module makes it possible to detect the student's attention or inattention: the researchers established a maximum time a student's gaze could remain off-screen before the student is classified as inattentive.
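This inattention rule can be expressed compactly; the threshold value below is hypothetical, since the paper does not report the limit the researchers chose:

```python
# Sketch of the attention rule: flag inattention when gaze/head pose stays
# off-screen longer than a threshold. Threshold and data format are assumptions.
MAX_OFF_SCREEN_S = 3.0

def inattentive_moments(samples, max_off=MAX_OFF_SCREEN_S):
    """samples: list of (timestamp_s, on_screen: bool), one per video frame."""
    flagged, off_since = [], None
    for t, on_screen in samples:
        if on_screen:
            off_since = None                  # gaze returned to the screen
        elif off_since is None:
            off_since = t                     # gaze just left the screen
        elif t - off_since >= max_off:
            flagged.append(t)                 # classified as inattentive here
    return flagged
```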
While training the MIORA algorithm, the researchers excluded video frames with poor or inadequate lighting conditions (e.g. lack or excess of light) and low-resolution video frames. They also excluded from the analysis the parts where the video was paused. Furthermore, they activated screen recording to observe the student's movements on the page. The experimental protocol was sent to the Ethics Committee for University Research of the University of Macerata for approval.
Buono et al. (2022) used open-source software, OpenFace, to extract head pose, facial expressions and gaze, and OpenPose to extract body posture. To extract features related to facial behaviour during e-learning, they used the Action Units of the Facial Action Coding System (FACS) (Ekman, 2005). The combination and continuous training of these datasets made it possible to estimate the intensity of each affective state detected, labelling them into four states: boredom, confusion, engagement and frustration. It should be noted that the dataset used by Buono et al. (2022) is annotated in terms of engagement, not emotions. In both cases, a convolutional neural network (CNN) capable of capturing the different temporal dynamics, and thus of analysing a sequence of video frames, was used to map the student's alterations and reactions progressively.
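For illustration, a frame-wise CNN of the general kind described might look as follows; the architecture and input size are illustrative assumptions, not the networks used by MIORA or Buono et al.:

```python
# Minimal sketch of a per-frame CNN emotion classifier (illustrative only).
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    def __init__(self, n_classes: int = 7):             # seven emotional states
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 12 * 12, n_classes)  # for 48x48 greyscale faces

    def forward(self, x):                               # x: (batch, 1, 48, 48)
        return self.head(self.features(x).flatten(1))   # per-frame emotion logits

# Temporal dynamics can then be captured by smoothing or aggregating the
# per-frame predictions over the video sequence (e.g. a moving average or RNN).
logits = EmotionCNN()(torch.randn(8, 1, 48, 48))
```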
The experiment was conducted in a single day in a laboratory equipped with the necessary instruments. It was carried out simultaneously by all 23 participants and was a time trial (maximum time of one hour). The subjects were video-recorded and timed while carrying out the instructions below in the shortest possible time. Before starting the test, the researchers informed the participants of the four tasks to be performed consecutively:
1. Task 1: Search and start the lesson. Instructions: Click on the course search bar and type "Engineering Courses". Start and take the first Engineering Fundamentals lesson.
2. Task 2: Stop the video lesson. Instructions: Stop the video lesson after 30 minutes.
3. Task 3: Take the test. Instructions: End the video course and take the relevant test.
4. Task 4: Change one answer. Instructions: During the test, change the answer to a question.
The participants were unaware of the usability problems deliberately embedded in the platform. The developers tampered with certain options within the website to gauge students' reactions when faced with a poorly responsive website (a sketch of how one such fault could be injected follows the list). Specifically:
1. During the course, the page automatically reloaded three times in the first five minutes and started the video course
from the beginning
2. The buttons to 'start', 'stop' and 'end' the video lesson were delayed
3. After the video lesson, the students were asked to perform the relevant examination. This activity required access to a
toolbar menu (the App menu) that is not visible by default;
4. The attempt to change an answer in the test led to an incomprehensible error message.
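For illustration, the 'delayed button' fault could be injected server-side along these lines; the framework and route are hypothetical, as the paper does not describe how the platform was modified:

```python
# Hypothetical sketch: injecting an artificial response delay into the endpoint
# behind the 'stop' button, assuming a small Flask app serves the platform.
import time
from flask import Flask

app = Flask(__name__)

@app.route("/lesson/stop", methods=["POST"])
def stop_lesson():
    time.sleep(4)          # deliberate delay before the button responds
    return {"status": "stopped"}

if __name__ == "__main__":
    app.run()
```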
The platform to which the participants were directed was an open-access e-learning platform hosting various courses. The course consisted of teaching material in the form of slides and video presentations. Considering the sample's degree type, the authors chose an introductory engineering video course for the experiment. The participants were informed that they would be timed.
The SAM questionnaire was administered after each activity and asked participants to recall the negative or positive emotions, and their valence, dominance and arousal, felt at the end of each task conducted during the experiment. These self-reported results were compared with the real-time analyses of facial expressions monitored by MIORA to assess response consistency.
In the SAM test, participants scored each stimulus on the 9-point Likert scales detailed in 49 (Figure 2). The SAM test is also known as the PAD test, whose initial letters identify the emotions measured on three axes: 1. Pleasure (or Valence); 2. Arousal; 3. Dominance. Tsonos and Kouroupetroglou (2008) report that the emotional state of "Pleasure" is denoted by a smiling, happy manikin, whereas an unhappy, frowning manikin represents the opposite pole. Regarding the "Arousal" dimension, the highly energetic pole is represented by an animated manikin, while a relaxed, eyes-closed manikin represents the other pole. For the "Dominance" dimension, the controlled and in-control poles are represented by a small and a large manikin, respectively (p. 444).

Figure 2. SAM questionnaire Likert scales (Tsonos and Kouroupetroglou, 2008)

The participants' answers can be transformed from the 1-9 point scale into a dimensional space of (-100; +100). Data from the SAM questionnaire were converted into percentages and coded through Russell's map, commonly used to map specific emotions (Scherer, 2005). Considering that the results derived from MIORA's coding of facial expressions were mapped according to Ekman's FACS model, the authors' intervention was necessary to simplify the experiment's results and obtain a coherent emotional response. The present authors integrated Russell's proposal with Ekman's model, drawing on the contribution made by Ierache 50. Before this, the authors conducted a factor analysis of the participants' responses to determine whether similar dimensions could be extracted into a single cluster. Factor analysis is a statistical method for examining the correlations between items to validate the existence of latent variables; highly related items (emotions) are assumed to have roughly similar values and can therefore be associated with a single emotion category.
By comparing Russell's map with Ekman's map (Figure 3), the emotions can be grouped into single clusters (expressed as a lookup table after Figure 3):
− The 'anger' emotion in Ekman is equivalent to frustrated, nervous, stressed, upset and angry in Russell's map
− The 'sadness' emotion in Ekman is equivalent to depressed, miserable, gloomy and sad in Russell
− What Ekman calls 'fear' is identified in Russell as alarmed, tense and afraid
− What Ekman means by 'happiness' is identified in Russell as serene, content, satisfied, glad, pleased and happy
− The 'surprise' emotion in Ekman is equivalent to aroused and excited in Russell.

Figure 3. Comparison between the Russell and Ekman systems
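The grouping above can be expressed as a simple lookup table; the term-to-emotion assignments follow the authors' comparison, but the table itself is an illustration, not part of MIORA:

```python
# Russell-circumplex terms clustered under Ekman categories, per the list above.
RUSSELL_TO_EKMAN = {
    **dict.fromkeys(["frustrated", "nervous", "stressed", "upset", "angry"], "anger"),
    **dict.fromkeys(["depressed", "miserable", "gloomy", "sad"], "sadness"),
    **dict.fromkeys(["alarmed", "tense", "afraid"], "fear"),
    **dict.fromkeys(["serene", "content", "satisfied", "glad", "pleased", "happy"],
                    "happiness"),
    **dict.fromkeys(["aroused", "excited"], "surprise"),
}
print(RUSSELL_TO_EKMAN["tense"])   # -> fear
```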

The authors therefore identified the seven primary emotions, each comprising the distinct Russell emotional facets detailed in the authors' reworking below (Figure 4).

Figure 4. Russell model with Ekman emotions (authors' elaboration)

Results
For each task, the authors found a difference between the score generated by the SAM questionnaire and the score generated by
MIORA.
• First task: Search and start the lesson
Task 1 - Search and start the Engineering Fundamentals lesson. The participants were disoriented by the confusing graphic layout and the inappropriate colours and font sizes. Participants took an average of two minutes to start the course due to difficulties in finding what they were asked for, as confirmed by the scattered gaze heatmap in Figure 5.

Figure 5. Heatmap

Once the lesson started, it alternated scrolling slides with short video tutorials, experiments and interviews with experienced engineers. The lesson covered basic engineering notions, explained with simple concepts, illustrative examples, short trivia and immediately comprehensible exercises. However, in the initial ten minutes, when the usability problem of automatic page reloading occurred, MIORA detected the continued presence of negative emotions among the students, which caused a decrease in the level of attention.
The MIORA analysis recorded several emotional changes in the students in relation to the usability issues and the different modes of content delivery (slides vs video). Figure 6(a) shows highlights of the video analysis in which the changes in a student's emotional-cognitive reactions are visible.

Figure 6(a). Example of Student reactions measured with MIORA and the associated face in key course points

In the top-left picture, at the beginning of the first interaction, the student appeared perplexed, and the coded facial expression showed frustration at the continuous reloading of the page; MIORA recorded intense peaks of anger. As the minutes ticked by, attention and engagement also gradually declined. In contrast, when the usability problems disappeared, the student raised his head and his gaze remained fixed on the screen (first picture on the right). The change in content delivery mode gradually increased engagement and shifted the emotional curve upwards; in particular, an experiment conducted by the professor especially excited the students (Joy 10%) (Figure 6b). The present test confirms the hypothesis postulated by Buono et al. (2022) that students perceive higher levels of attention and engagement in video-lecture mode than in slide mode. This viewing mode also raised the emotional curve, leading them to express satisfaction in the SAM questionnaire.
At the end of the 30 minutes, the participants completed the SAM questionnaire and rated this first task as satisfactory (Figure 7 shows the SAM results). The SAM questionnaire indicates that students liked the topics covered in the video course.
Figure 6(b). MIORA results for the first task

Figure 7. SAM results for the first task

• Second task: Stop the video-lesson


Task 2 - Stop the video lesson after 30 minutes. The MIORA analysis revealed a small percentage of frustration, identified in the emotion 'Anger', and an even smaller percentage of despondency, identified in the emotion 'Sadness' (Figure 8). The authors further investigated this result through the data generated by face detection, gaze detection and screen recording, and identified the critical moment at which the negative emotional trends formed: the negative emotion peak occurred when the student repeatedly clicked the 'stop' button without feedback from the platform. The negative emotions experienced had low levels of arousal and dominance, so they disappeared as soon as the platform responded correctly to the request to stop the video.

Figure 8. MIORA results for the second task

In the SAM questionnaire, the participants did not report anything that influenced their feelings (Figure 9).
Figure 9. SAM results for the second task

• Third task: Take the test


Task 3 - Go to the examination section and take the test. The participants did not all follow the same route: some took longer paths, and thus more time, than others. The SAM questionnaire results indicated a constant state of tension during the execution of the entire task, motivated by the fact that time was running out and that being timed generated competitive anxiety among the participants (Figure 10).

Figure 10. SAM results for the third task

In particular, the participants who had already taken tests of this kind on similar platforms knew a quick way to reach the examination section with little effort, so it took them less time. In contrast, the subjects who took the longer and more time-consuming route experienced more intense and dominant peaks of negative emotions. Only through MIORA's real-time analysis was it possible to detect this more intense difference in the negative emotions felt by the subjects who took the longer route to the examination section (Figure 11 a, b).


Figure 11. MIORA results for the third task (Path comparison)

• Fourth task: Change an answer


In the final task, the participants experienced heightened pressure due to time constraints, as they were required to modify an answer in the quiz. The pressure adversely triggered their emotions, which heightened further upon encountering an error message while taking the quiz. MIORA identified the fear of running out of time or of being surpassed by other students in completion time, as illustrated in Figure 12. Correspondingly, the SAM questionnaire indicated that this fear was closely linked to feelings of nervousness, as illustrated in Figure 13.

Figure 12. MIORA results for the fourth task

Figure 13. SAM results for the fourth task

The tasks' results show that, in the SAM questionnaire, the subjects mostly ticked only the emotions they perceived as most dominant, whereas less intense emotions went unreported. Furthermore, some emotions confused the participants, as happened in the last task (MIORA detected fear; the SAM questionnaire, nervousness).
The following section presents a discussion of the theoretical and practical implications of the results.

Discussion
Positive UX is often related to usability metrics; thus, usability is an effective criterion for assessing the quality of UX (Agarwal
& Meyer, 2009). According to the literature, emotions are closely related to the final judgement of usability and, more generally,
to UX. The present work measures usability metrics by assessing engagement and the participants' emotional responses when faced with specific platform issues (number of errors, waiting for pages to load). This study affirms that emotional responses play a crucial role not only in shaping users' overall evaluation of the learning experience, as demonstrated by previous research conducted by Lopez-Aguilar et al. (2021) and Poma et al. (2021), but also in shaping their perceptions of usability. Previous studies by Tokkonen and Saariluoma (2013), Bruun et al. (2016), and Pengnate and Delen (2014) have all suggested that emotional responses can influence how users perceive the usability of e-learning platforms.
Several instruments, invasive or non-invasive, real-time or retrospective, have been used in the literature to assess emotions during usability tests. To the best of the authors' knowledge, this study is the first to analyse the UX aspects of subjects' emotional responses during interaction with an e-learning platform. Specifically, this study compared the emotional responses from a non-invasive, real-time analysis tool (MIORA) and a retrospective questionnaire (SAM). The MIORA tool records the user's face and gaze via webcam, along with the screen, and measures Ekman's primary emotions and gaze direction in real time. The SAM questionnaire evaluates valence, dominance and arousal on a Likert scale of 1 to 9. The findings support Hypothesis 1, which posits that non-invasive methods adhere more closely to subjects' complex emotional reactions, with reduced susceptibility to cognitive biases (Lewinski et al., 2014).
The different emotional responses confirmed a clear gap between the emotional responses experienced in real-time and those reported later in the questionnaire. During the test, MIORA detected alterations of emotions at the appearance of usability problems that were perceived less intensively, or not at all, in the SAM questionnaires. In particular, ephemeral emotions were not reported in the retrospective self-assessments; only emotions felt over a prolonged period, or the most intense ones, were. The authors attribute this gap between the SAM questionnaire and the MIORA evaluation system to the phenomenon described by Scherer (2005), whose thesis posits that emotions experienced over a brief period are unconsciously omitted during retrospective self-assessment, leading individuals to recollect only the predominant emotions experienced over an extended duration.
Thus, the first research hypothesis is confirmed: non-invasive and real-time monitoring tools such as MIORA are necessary to
validate the experiment because they offer complete and reliable results.
According to several studies, proper usability evaluation through real-time analysis of users' emotional reactions allows for
responsive and immediate platform customisation, improving learning and satisfaction 14,51.
MIORA can help reveal the more or less conscious impact that usability problems have on the student's emotional and cognitive experience, an impact not directly perceivable through the SAM questionnaire. MIORA allowed the authors to understand how negative evaluations of the UX were associated with the numerous usability problems: the emotional curve returned to neutral or positive levels during the parts of the video course without usability problems.
The second hypothesis is therefore also confirmed: adapting the GUI and platform to the user's needs results in positive emotional changes.
In terms of practical implications, practitioners and platform developers should consider UX assessment methods based on synchronous, interactive emotional reactions, which save the time and cost of an in-depth analysis, as stated in 18. These methods can improve understanding of the quality of the structure and content of platforms and of the impact of the student's involvement and emotional response. This is made possible by adopting real-time assessment methods that reveal the platforms' navigability and educational effectiveness.
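Such a pipeline could close the loop along the following lines; everything here is a hedged sketch of the adaptation logic the text envisages, not an existing API:

```python
# Sketch of a real-time adaptation loop: sustained negative emotion over a
# sliding window triggers a (hypothetical) interface simplification.
NEGATIVE = {"anger", "sadness", "fear", "disgust"}

def adapt_interface(emotion_stream, window=30, threshold=0.5):
    """emotion_stream: iterable of dominant-emotion labels, one per frame."""
    recent = []
    for label in emotion_stream:
        recent.append(label)
        recent = recent[-window:]
        negative_share = sum(l in NEGATIVE for l in recent) / len(recent)
        if len(recent) == window and negative_share >= threshold:
            yield "simplify_gui"    # e.g. larger fonts, fewer elements, clearer cues
        else:
            yield "keep_gui"
```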
Despite the novelty of this work and its findings, several limitations should be addressed. The research sample should be expanded and diversified, with additional inclusion and exclusion criteria, to strengthen the conclusions. Another observation concerns the effectiveness of course evaluation tests submitted some time after the course has been conducted: this work highlights the limitations of questionnaires completed at a distance, whose design may also be constrained by the need to pose broad questions. Future research should administer a real-time verbal self-assessment during the experiment to address the retrospective completion of the System Usability Scale (SUS) questionnaire, while conducting MIORA analyses continuously and in real time.

Conclusion
Assessing a user's emotions while interacting with a website has become crucial to ensure a successful user experience (UX).
The literature has explored several instruments for evaluating UX, categorised into subjective/self-referential tools and objective tools such as ECG and fMRI. Nevertheless, the reliability of these tools has been a subject of inquiry.
This work is among the first to hypothesise and test the efficacy of emotional and cognitive assessment by comparing non-invasive and ex-post methods. The proposed experiment compares the use of the MIORA software for real-time cognitive and emotional analysis with the ex-post SAM questionnaire while the user interfaces with an e-learning website. The results show a significant gap between the retrospective and simultaneous evaluation of emotions. This study contributes to research on learning-platform UX assessment, demonstrating that non-invasive methods adhere more closely to subjects' complex emotional reactions and are less influenced by cognitive bias. Additionally, these non-invasive methods allow the GUI and platform to adapt to the user's needs, resulting in positive emotional changes, additional engagement and better educational results. Hence, this work offers theoretical, didactic and practical reflections for the future implementation of detection and assessment methods for e-learning platforms, allowing professors, researchers, platform providers and graphic designers to comprehend the role of these aspects in the learning experience and outcomes.
Finally, this research can provide valuable contributions to extend further studies on real-time customisation and adaptation of
the graphical user interface based on users' emotional responses.

Data availability
The dataset generated by the questionnaire, in its complete version and in the version filtered with the inclusion/exclusion criteria, is freely available in the [NAME] repository [Persistent web link to datasets].
The data that support the findings of this study are available from Emoj. However, restrictions apply to the availability of these
data, which were used under license for the current study and are not publicly available. Data are, however, available from the
authors upon reasonable request and with permission of Emoj.

References

1. Khalimonchuk, K. Top Education App Design Trends in 2022: The Complete List + Best Cases.
https://fulcrum.rocks/blog/education-app-design#the-latest-trends-of-ux-app-design-1 (2022).
2. Taglietti, D., Landri, P. & Grimaldi, E. The big acceleration in digital education in Italy: The COVID-19
pandemic and the blended-school form. European Educational Research Journal 20, 423–441 (2021).
3. Zardari, B. A., Hussain, Z., Arain, A. A., Rizvi, W. H. & Vighio, M. S. Development and Validation of User
Experience-Based E-Learning Acceptance Model for Sustainable Higher Education. Sustainability 13, 6201
(2021).
4. Lopez-Aguilar, A., Bustamante-Bello, R. & Navarro-Tuch, S. A. Advanced system to measure UX in online
learning environments. in 2021 IEEE Global Engineering Education Conference (EDUCON) 774–777
(IEEE, 2021).
5. Poma, A., Rodríguez, G. & Torres, P. User Experience Evaluation in MOOC Platforms: A Hybrid
Approach. in Human-Computer Interaction: 7th Iberoamerican Workshop (HCI-COLLAB 2021) 208–224
(Springer International Publishing, 2021).
6. McNamara, N. & Kirakowski, J. Functionality, usability, and user experience. Interactions 13, 26–28
(2006).
7. Bevan, N. Classifying and selecting UX and usability measures. (2008).
8. Yuill, N. & Rogers, Y. Mechanisms for collaboration. ACM Transactions on Computer-Human Interaction
19, 1–25 (2012).
9. Champney, R. K. & Stanney, K. M. Using Emotions in Usability. Proceedings of the Human Factors and
Ergonomics Society Annual Meeting 51, 1044–1049 (2007).
10. Agarwal, A. & Meyer, A. Beyond usability. in CHI ’09 Extended Abstracts on Human Factors in Computing
Systems 2919–2930 (ACM, 2009). doi:10.1145/1520340.1520420.
11. Bruun, A., Law, E. L.-C., Heintz, M. & Eriksen, P. S. Asserting Real-Time Emotions through Cued-Recall.
in Proceedings of the 9th Nordic Conference on Human-Computer Interaction 1–10 (ACM, 2016).
doi:10.1145/2971485.2971516.
12. Buono, P., De Carolis, B., D’Errico, F., Macchiarulo, N. & Palestra, G. Assessing student engagement from
facial behavior in online learning. Multimed Tools Appl (2022) doi:10.1007/s11042-022-14048-8.
13. Tokkonen, H. & Saariluoma, P. How user experience is understood? in Science and Information Conference
(2013).
14. Pappas, I. O. User experience in personalised online shopping: a fuzzy-set analysis. Eur J Mark 52, 1679–
1703 (2018).
15. Dubey, S. K. & Rana, A. Analytical roadmap to usability definitions and decompositions. International
Journal of Engineering Science and Technology 2, 4723–4729 (2010).
16. Albert, W. & Tullis, T. Measuring the user experience: collecting, analysing, and presenting usability
metrics. (2013).
17. Xiao, L. & Wang, S. Mobile marketing interface layout attributes that affect user aesthetic preference: an
eye-tracking study. Asia Pacific Journal of Marketing and Logistics 35, 472–492 (2023).
18. Ceccacci, S., Giraldi, L. & Mengoni, M. Product Usability: Is it a Criterion to Measure “Good UX” or a
Prerequisite? in Volume 1A: 36th Computers and Information in Engineering Conference (American Society
of Mechanical Engineers, 2016). doi:10.1115/DETC2016-59500.
19. Dhall, A. Automatic emotion, engagement and cohesion prediction tasks. in 2019 International
conference on multimodal interaction 546–550 (2019).
20. Behera, A. et al. Associating Facial Expressions and Upper-Body Gestures with Learning Tasks for
Enhancing Intelligent Tutoring Systems. Int J Artif Intell Educ 30, 236–270 (2020).
21. Palmatier, R. W., Houston, M. B. & Hulland, J. Review articles: purpose, process, and structure. J Acad
Mark Sci 46, 1–5 (2018).
22. Pengnate, S. F. & Delen, D. Evaluating emotions in mobile application descriptions: Sentiment analysis
approach. in Twentieth Americas Conference on Information Systems 1–9 (2014).
23. Forlizzi, J. & Battarbee, K. Understanding experience in interactive systems. in Proceedings of the 5th
conference on Designing interactive systems: processes, practices, methods, and techniques 261–268 (ACM,
2004). doi:10.1145/1013115.1013152.
24. Ekman, P. Basic Emotions. in Handbook of Cognition and Emotion 45–60 (John Wiley & Sons, Ltd, 2005).
doi:10.1002/0470013494.ch3.
25. Chang, E.-C., Lv, Y., Chou, T.-J., He, Q. & Song, Z. Now or later: Delay’s effects on post-consumption
emotions and consumer loyalty. J Bus Res 67, 1368–1375 (2014).
26. Gilleade, K. M. & Dix, A. Using frustration in the design of adaptive videogames. in Proceedings of the
2004 ACM SIGCHI International Conference on Advances in computer entertainment technology 228–232
(ACM, 2004). doi:10.1145/1067343.1067372.
27. Nicholls, M. E. R., Loveless, K. M., Thomas, N. A., Loetscher, T. & Churches, O. Some participants may be
better than others: Sustained attention and motivation are higher early in semester. Quarterly Journal of
Experimental Psychology 68, 10–18 (2015).
28. D’Mello, S., Chipman, P. & Graesser, A. Posture as a predictor of learner’s affective engagement. in Proceedings of the 29th Annual Cognitive Science Society (eds. McNamara, D. S. & Trafton, J. G.) 905–910 (2007).
29. Klein, R. & Celik, T. The Wits Intelligent Teaching System: Detecting student engagement during lectures
using convolutional neural networks. in 2017 IEEE International Conference on Image Processing (ICIP)
2856–2860 (IEEE, 2017). doi:10.1109/ICIP.2017.8296804.
30. Zaletelj, J. & Košir, A. Predicting students’ attention in the classroom from Kinect facial and body features.
EURASIP J Image Video Process 2017, 80 (2017).
31. Burnik, U., Zaletelj, J. & Košir, A. Video-based learners’ observed attention estimates for lecture learning
gain evaluation. Multimed Tools Appl 77, 16903–16926 (2018).
32. Daily, S. B. et al. Affective Computing: Historical Foundations, Current Applications, and Future Trends. in
Emotions and Affect in Human Factors and Human-Computer Interaction 213–231 (Elsevier, 2017).
doi:10.1016/B978-0-12-801851-4.00009-4.
33. Chin, J. P., Diehl, V. A. & Norman, L. K. Development of an instrument measuring user satisfaction of the
human-computer interface. in Proceedings of the SIGCHI conference on Human factors in computing
systems - CHI ’88 213–218 (ACM Press, 1988). doi:10.1145/57167.57203.
34. Kirakowski, J. & Corbett, M. SUMI: the Software Usability Measurement Inventory. British Journal of
Educational Technology 24, 210–212 (1993).
35. Brooke, J. SUS-A quick and dirty usability scale. in Usability Evaluation in Industry (eds. Patrick W.
Jordan, Bruce Thomas, Bernard A. Weerdmeester & Ian L. McClelland) 189–194 (Taylor & Francis, 1996).
36. Bargas-Avila, J. A. & Hornbæk, K. Old wine in new bottles or novel challenges. in Proceedings of the
SIGCHI Conference on Human Factors in Computing Systems 2689–2698 (ACM, 2011).
doi:10.1145/1978942.1979336.
37. Kouroupetroglou, G., Papatheodorou, N. & Tsonos, D. Design and Development Methodology for the
Emotional State Estimation of Verbs. in Human Factors in Computing and Informatics: First International
Conference (SouthCHI) 1–15 (Springer Berlin, 2013).
38. Scherer, K. R. What are emotions? And how can they be measured? Social Science Information 44, 695–729
(2005).
39. Lewinski, P., Fransen, M. L. & Tan, E. S. H. Predicting advertising effectiveness by facial expressions in
response to amusing persuasive stimuli. J Neurosci Psychol Econ 7, 1–14 (2014).
40. Herbig, N. et al. Investigating Multimodal Measures for Cognitive Load Detection in E-Learning. in
Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization 88–97 (ACM,
2020). doi:10.1145/3340631.3394861.
41. Generosi, A., Ceccacci, S., Faggiano, S., Giraldi, L. & Mengoni, M. A Toolkit for the Automatic Analysis of
Human Behavior in HCI Applications in the Wild. Advances in Science, Technology and Engineering
Systems Journal 5, 185–192 (2020).
42. Munim, K. Md., Islam, I., Khatun, M., Karim, M. M. & Islam, M. N. Towards developing a tool for UX
evaluation using facial expression. in 2017 3rd International Conference on Electrical Information and
Communication Technology (EICT) 1–6 (IEEE, 2017). doi:10.1109/EICT.2017.8275227.
43. Qu, Q.-X., Zhang, L., Chao, W.-Y. & Duffy, V. User Experience Design Based on Eye-Tracking
Technology: A Case Study on Smartphone APPs. in 303–315 (2017). doi:10.1007/978-3-319-41627-4_27.
44. Bojko, A. & Schumacher, R. M. Eye tracking and usability testing in form layout evaluation. in Proceedings
of the 38th International Symposium of Business Forms Management Association 1–13 (2008).
45. Cooke, L. Eye tracking: How it works and how it relates to usability. Tech Commun 52, 456–463 (2005).
46. Smith, J. D. & Graham, T. N. Use of eye movements for video game control. in Proceedings of the 2006
ACM SIGCHI International Conference on Advances in Computer Entertainment Technology 20–28 (2006).
47. Usakli, A. B. & Gurkan, S. Design of a Novel Efficient Human–Computer Interface: An Electrooculagram
Based Virtual Keyboard. IEEE Trans Instrum Meas 59, 2099–2108 (2010).
48. Frank, M. G., Ekman, P. & Friesen, W. V. Behavioral markers and recognizability of the smile of
enjoyment. J Pers Soc Psychol 64, 83–93 (1993).
49. Tsonos, D. & Kouroupetroglou, G. A Methodology for the Extraction of Reader’s Emotional State Triggered
from Text Typography. in Tools in Artificial Intelligence (InTech, 2008). doi:10.5772/6071.
50. Ierache, J. S., Nervo, F., Sattolo, I. I., Ierache, R. & Chapperón, G. Proposal of a multimodal model for
emotional assessment within affective computing in gastronomic settings. in XXVI Congreso Argentino de
Ciencias de la Computación (CACIC) (2020).
51. de Kock, E., van Biljon, J. & Botha, A. User Experience of Academic Staff in the Use of a Learning
Management System Tool. in Proceedings of the Annual Conference of the South African Institute of
Computer Scientists and Information Technologists on - SAICSIT ’16 1–10 (ACM Press, 2016).
doi:10.1145/2987491.2987514.
