doi: 10.1093/sleep/zsy220
Advance Access Publication Date: 17 November 2018
Original Article
Individuals with insomnia misrecognize angry faces
City University of Hong Kong, Kowloon Tong, Hong Kong, 3Department of Psychology, Education University of Hong
Kong, Tai Po, Hong Kong and 4Centre for Psychosocial Health, Education University of Hong Kong, Tai Po, Hong Kong
*Corresponding author. Janet Hsiao, 6/F, Department of Psychology, Jockey Club Tower, Centennial Campus, The University of Hong Kong, Pok Fu Lam, Hong Kong.
Email: jhsiao@hku.hk.
Abstract
Individuals with insomnia have been found to have disturbed perception of facial expressions. Through eye movement
examinations, here we test the hypothesis that this effect is due to impaired visual attention functions for retrieving diagnostic
features in facial expression judgments. Twenty-three individuals with insomnia symptoms and 23 controls without insomnia
completed a task to categorize happy, sad, fearful, and angry facial expressions. The participants with insomnia were less accurate
in recognizing angry faces and misidentified them as fearful faces more often than the controls. A hidden Markov modeling
approach for eye movement data analysis revealed that when viewing facial expressions, more individuals with insomnia adopted
a nose–mouth eye movement pattern focusing on the vertical face midline while more controls adopted an eyes–mouth pattern
preferentially attending to lateral features, particularly the two eyes. As previous studies found that the primary diagnostic feature
for recognizing angry faces is the eyes while the diagnostic features for other facial expressions involve the mouth region, missing
the eye region may contribute to specific difficulties in recognizing angry facial expressions, consistent with our behavioral finding
in participants with insomnia symptoms. Taken together, the findings suggest that impaired information selection through visual
attention control may be related to the compromised emotion perception in individuals with insomnia.
Statement of Significance
We were the first to use an eye-tracking approach to understand the mechanism underlying impaired recognition of emotional facial
expressions in individuals with disturbed sleep. In contrast to traditional methods of eye movement data analysis that only focus on group-level analysis of fixation locations, we used a state-of-the-art, machine-learning-based approach (i.e. Eye Movement analysis with Hidden
Markov Models, EMHMM) that accounts for individual differences in both spatial and temporal information of eye movements and enables
quantitative measurements of eye movement pattern similarities. Through this analysis, we discovered for the first time in the literature
the potential role of visual attention control in accounting for disturbed emotional perception in individuals with insomnia.
Key words: insomnia; eye-tracking; hidden Markov model; facial expression; visual attention control
2 | SLEEPJ, 2019, Vol. 42, No. 2
Methods
Participants
a fixation cross for 500 ms. Once a fixation was detected at the fixation cross at the end of the 500 ms, a color picture of an Asian individual’s face with an emotional facial expression (Figure 1B; 450 × 600 pixels) was presented either above or below the center of the screen until the participants categorized it as a happy, sad, fearful, or angry face by pressing corresponding buttons. Participants were asked to respond as quickly as possible. After a 250 ms pause, they were asked to rate the emotional intensity of the facial expression on a 6-point scale, ranging from “1-not very intense” to “6-extremely intense.” All responses were made on an RB-840 Cedrus Response Pad. The faces spanned around 8° of visual angle at the viewing distance of 60 cm [29]. The participants had a practice session of 24 trials (6 trials for each expression) at the beginning of the task. Pictures were presented in a random order within each block and the order of the two blocks was counterbalanced across participants.

Data processing and analysis

Behavioral data
The behavioral performances of the two groups in the emotional face task (i.e. accuracy, response time, and intensity rating) were compared by repeated-measures ANOVAs to examine the effects of group (control vs. insomnia) and emotion (happy vs. sad vs. fearful vs. angry). Significant interactions and main effects in the repeated-measures ANOVAs were followed by post hoc independent t-tests between the two groups in each emotion or pairwise comparisons with Bonferroni–Šidák adjustments.

Eye movement data
The eye movement data were analyzed using the EMHMM (Eye Movement analysis with Hidden Markov Models, http://visal.cs.cityu.edu.hk/research/emhmm/) [18] approach. (The processing procedures and algorithms are summarized in Supplementary Figure S1.) First, each participant’s eye movements while viewing facial expressions were summarized with an HMM, including person-specific ROIs and transition probabilities among the ROIs. The optimal number of ROIs for each model was determined automatically through a variational Bayesian approach, by selecting the model with the highest marginal likelihood. Second, we clustered all 46 individual HMMs (one for each participant) into two groups according to their similarities using the variational hierarchical expectation-maximization (VHEM) algorithm [16]. Third, each group of individual HMMs was summarized into a representative eye movement pattern described using an HMM with group-specific ROIs and transition probabilities among the ROIs. We examined the frequency of individuals adopting the two representative eye movement patterns in the control and insomnia participant groups, and the correlations between the similarity of their eye movement patterns to each representative pattern (measured with log-likelihood) and their behavioral performances.

Results

Behavioral results
A 2 (group: control vs. insomnia) by 4 (emotion: happy vs. sad vs. fearful vs. angry) repeated measures ANOVA revealed a significant interaction between group and emotion on the accuracy of the facial expression judgment task, F(3, 132) = 2.68, p = .049, ηp2 = .057. Independent t-tests between the insomnia group and the control group in each emotion condition indicated that there was 8.4% higher accuracy on average to identify angry faces in the control group than in the insomnia group, t(44) = 2.12, p = .039, Cohen’s d = .63 (Figure 2A). This group difference was not found in the other emotion conditions, p’s > .05. There was also a significant main effect of emotion on the accuracy, F(3, 132) = 246.58, p < .001, ηp2 = .514. Adjusted post hoc pairwise comparisons indicated that the accuracy for recognizing emotional faces from the highest to the lowest was happy faces (M = .984, SD = .033), fearful faces (M = .923, SD = .088), sad faces (M = .871, SD = .090), and angry faces (M = .775, SD = .140), all p’s < .05 (Figure 2A). There was no significant effect of group, F(1, 44) = 2.53, p = .119. To understand what led to the difference in recognizing angry faces between the two groups, we moved on to examine the responses participants made towards angry faces (Figure 2B; for other expressions, see Supplementary Figure S2). The 2 (group: control vs. insomnia) by 4 (response: happy vs. sad vs. fearful vs. angry) repeated measures ANOVA indicated a significant interaction between group and response, […] p < .001, η2 = .921) indicated that the percentage of responses towards angry faces from the highest to the lowest was angry, sad, fearful, and happy, all p’s < .05. Note that the low percentage of fearful responses towards angry faces in the control group (2.61%) suggested that angry expressions are rarely misidentified as fearful in a healthy person. Thus, compromised identification due to insomnia, although the error rate was only increased slightly (6.74%), could lead to a large effect size and affect daily life. Another contributing factor was the selective increase in misidentifying angry faces as fearful but not as any other expression.

In the data of response time (RT) for correct responses in the expression categorization task (Figure 2C), a 2 (group) by 4 (emotion) repeated measures ANOVA indicated a main effect of emotion, F(3, 132) = 40.72, p < .001, ηp2 = .48. Post hoc pairwise […]
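The clustering step of the EMHMM pipeline assigns each participant to the representative pattern whose HMM gives the higher log-likelihood for that participant's fixation data. The authors' toolbox uses Gaussian-emission HMMs and variational Bayesian inference; as a simplified illustration only, the sketch below scores a fixation sequence (coded here as discrete ROI labels, a hypothetical simplification, with all probabilities invented) under two hand-specified HMMs via the standard forward algorithm and picks the better-fitting pattern.

```python
import math

def forward_loglik(seq, prior, trans, emit):
    """Log-likelihood of an observation sequence under a discrete HMM,
    computed with the scaled forward algorithm to avoid underflow."""
    n = len(prior)
    # initialise: alpha_i = P(state i) * P(first observation | state i)
    alpha = [prior[i] * emit[i][seq[0]] for i in range(n)]
    loglik = 0.0
    for obs in seq[1:]:
        s = sum(alpha)
        loglik += math.log(s)
        alpha = [a / s for a in alpha]  # rescale before propagating
        alpha = [sum(alpha[i] * trans[i][j] for i in range(n)) * emit[j][obs]
                 for j in range(n)]
    loglik += math.log(sum(alpha))
    return loglik

# Two toy "representative" HMMs over ROI labels (numbers are illustrative,
# not the paper's fitted values). State 0 ~ upper face, state 1 ~ lower face.
eyes_mouth = dict(
    prior=[0.5, 0.5],
    trans=[[0.9, 0.1], [0.2, 0.8]],
    emit=[{"eyes": 0.8, "nose": 0.1, "mouth": 0.1},
          {"eyes": 0.1, "nose": 0.1, "mouth": 0.8}],
)
nose_mouth = dict(
    prior=[0.5, 0.5],
    trans=[[0.9, 0.1], [0.2, 0.8]],
    emit=[{"eyes": 0.05, "nose": 0.85, "mouth": 0.1},
          {"eyes": 0.05, "nose": 0.15, "mouth": 0.8}],
)

fixations = ["eyes", "eyes", "mouth", "eyes"]  # one participant's coded trial
ll_em = forward_loglik(fixations, **eyes_mouth)
ll_nm = forward_loglik(fixations, **nose_mouth)
pattern = "eyes-mouth" if ll_em > ll_nm else "nose-mouth"
```

An eye-heavy sequence like this one fits the eyes–mouth model better, which is the same logic the VHEM clustering applies at scale.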
Figure 2. (A) The accuracy to identify happy, sad, fearful, and angry facial emotions in the control and the insomnia group (black dots represent individual data points). The insomnia group had a lower accuracy to identify angry faces than the control group. (B) Responses made while angry faces were presented. Individuals with insomnia gave more mistaken “fearful” responses than controls. (C) Response time to accurately identify emotional facial expressions. The insomnia group was marginally slower (by 350 ms) than the control group to identify angry faces. (D) Emotional intensity ratings of the four facial emotions in the two groups. (*p < .05; †.05 < p < .10; error bars: 1 s.e.m.)
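For readers checking the reported effect sizes: with an independent-samples t-test and equal group sizes, Cohen's d can be recovered from the t statistic as d = t·√(1/n₁ + 1/n₂). A quick sanity check against the angry-face accuracy comparison (t(44) = 2.12, n = 23 per group):

```python
import math

def cohens_d_from_t(t, n1, n2):
    """Convert an independent-samples t statistic to Cohen's d."""
    return t * math.sqrt(1 / n1 + 1 / n2)

# reported comparison: t(44) = 2.12 with 23 participants per group
d = cohens_d_from_t(2.12, 23, 23)  # close to the reported d = .63
```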
[…] emotion. When we examined the difference between the two groups in identifying different expressions separately, there was a trend that the insomnia group identified angry faces more slowly than the control group, t(44) = 1.768, p = .084, Cohen’s d = .53. The group by emotion repeated measures ANOVA on emotional intensity rating showed a main effect of emotion (fearful > happy, angry > sad; Figure 2D), F(3, 132) = 23.24, p < .001, ηp2 = .35. However, there was no main effect of group or group by emotion interaction.

Eye movement data
We modeled each participant’s eye movements with an HMM for viewing all facial expressions and also for viewing each type of facial expression separately. For all expressions, […] χ2(1) = 4.39, p = .036, with more controls (56.5%) adopting eyes–mouth patterns and more individuals with insomnia (73.9%) adopting nose–mouth patterns (Table 3). Note that this group difference was not readily observable using a traditional ROI analysis: when we examined the number of fixations and the total fixation duration in a set of predetermined, commonly used facial ROIs (eyes, corrugator, nose, and mouth; Supplementary Figure S3), no significant difference was found between the two groups, all p’s > .05 (Supplementary Tables S1 and S2). Of note, individuals adopting eyes–mouth patterns (accuracy: 84.5% ± .091) were more accurate in recognizing angry faces than those adopting nose–mouth patterns (accuracy: 72.6% ± .150), t(44) = 3.078, p = .004, Cohen’s d = .93, suggesting an advantage of eyes–mouth patterns for identifying angry faces. This effect was not observed in identifying happy, sad, or fearful faces. Indeed, […]
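The χ2(1) = 4.39 frequency comparison for overall face viewing can be reproduced from the implied 2 × 2 counts (56.5% of 23 controls = 13 eyes–mouth adopters; 73.9% of 23 insomnia participants = 17 nose–mouth adopters; counts inferred from the reported percentages). A minimal Pearson chi-square for a 2 × 2 table, using only the standard library (for df = 1 the p-value equals erfc(√(χ²/2))):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (no continuity correction) for the 2x2 table
    [[a, b], [c, d]], plus its p-value for df = 1."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # chi2(df=1) survival function
    return chi2, p

# rows: control, insomnia; columns: eyes-mouth, nose-mouth adopters
chi2, p = chi2_2x2(13, 10, 6, 17)  # chi2 ~ 4.39, p ~ .036 as reported
```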
Figure 3. Representative eye movement patterns during facial expression recognition discovered through HMM clustering. (A1 and A2) The eyes–mouth and nose–
mouth patterns for overall facial expression viewing. (B1 and B2) The eyes–mouth and nose–mouth patterns for viewing happy facial expressions. (C1 and C2) The
eyes–mouth and nose–mouth patterns for viewing sad facial expressions. (D1 and D2) The eyes–mouth and nose–mouth patterns for viewing fearful facial expressions.
(E1 and E2) The eyes–mouth and nose–mouth patterns for viewing angry facial expressions. There are three images and a table illustrating each representative pattern.
Three images from left to right: three elliptical ROIs in three colors, assignments of the actual fixations to the ROIs, and a heat map of the eye fixations. The tables
indicate the priors (i.e. the probability that the first fixation in each trial landed in each region) and the transition probabilities of eye movements among the ROIs.
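The priors and transition tables in Figure 3 fully specify how an ROI sequence unfolds: the first fixation's ROI is drawn from the prior, and each subsequent fixation's ROI is drawn from the transition-matrix row of the current ROI. A small generative sketch (the ROI names and probabilities below are illustrative, not the paper's fitted values, though the 93% eyes-to-eyes entry echoes the angry-face eyes–mouth pattern):

```python
import random

def sample_roi_sequence(rois, prior, trans, length, rng):
    """Sample a sequence of ROI labels from a prior distribution and a
    transition matrix (emissions collapsed onto the ROIs themselves)."""
    state = rng.choices(range(len(rois)), weights=prior)[0]
    seq = [rois[state]]
    for _ in range(length - 1):
        state = rng.choices(range(len(rois)), weights=trans[state])[0]
        seq.append(rois[state])
    return seq

rois = ["eyes", "nose", "mouth"]
prior = [0.46, 0.30, 0.24]        # illustrative first-fixation probabilities
trans = [[0.93, 0.04, 0.03],      # fixations on the eyes tend to stay there
         [0.25, 0.55, 0.20],
         [0.10, 0.20, 0.70]]
rng = random.Random(0)
trial = sample_roi_sequence(rois, prior, trans, 5, rng)
```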
Sad face viewing.
Figure 3C shows the two representative patterns while viewing sad facial expressions. In the eyes–mouth pattern (Figure 3C1, n = 21), the first fixation was most likely to be at the area of the nose tip and the mouth (green and blue ROI, 56%). It was also likely to be in the eye region (red ROI, 44%). The following fixations mostly stayed in their original area and sometimes transited between the two areas. In the nose–mouth pattern (Figure 3C2, n = 25), all three ROIs were along the face midline from the forehead to the chin (from top to bottom: green, red, and blue). Fixations in one ROI rarely transited to another ROI (≤ 3%). To summarize the two patterns for viewing sad faces, the eyes–mouth pattern mainly attended to the two eyes and the mouth, whereas the nose–mouth pattern mainly attended to the nose and the mouth. When viewing sad facial expressions, the control group and the insomnia group differed significantly in their frequencies of adopting the two representative patterns, χ2(1) = 4.29, p = .038, with more controls (60.9%) adopting the eyes–mouth pattern and more individuals with insomnia (69.6%) adopting the nose–mouth pattern (Table 3).

Fearful face viewing.
Figure 3D shows the two representative patterns while viewing fearful facial expressions. In the eyes–mouth pattern […] the control group and the insomnia group differed significantly in their frequencies of adopting the two representative patterns, χ2(1) = 7.10, p = .008, with more controls (73.9%) adopting the eyes–mouth pattern and more individuals with insomnia (62.5%) adopting the nose–mouth pattern (Table 3).

Angry face viewing.
Figure 3E shows the two representative patterns while viewing angry facial expressions. In the eyes–mouth pattern (Figure 3E1; n = 17), the first fixation was most likely to be around the nose tip and the mouth (green ROI and blue ROI, 54%), or in the horizontal elliptical area covering the two eyes (red ROI, 46%). Fixations around the nose and the mouth sometimes transferred to the eye region (from green ROI to red ROI, 25%). Fixations around the eyes were most likely to stay there (from red ROI to red ROI, 93%). In the […]
Table 3. Frequencies of participants in the control and the insomnia group adopting the eyes–mouth (analytic) and nose–mouth (holistic) patterns for overall face viewing and for viewing each emotional facial expression (happy, sad, fearful, and angry)
[…] perception of facial expressions, particularly those with diagnostic information in the eye region such as anger and fear. Consistent with our hypothesis, our results showed that individuals with insomnia symptoms were less accurate and marginally slower to identify angry faces than controls without insomnia. Furthermore, participants with insomnia symptoms misrecognized angry faces as fearful faces more often than controls. Through the EMHMM approach for eye movement data analysis [18], we discovered two common eye movement patterns among the participants when viewing emotional facial expressions: a nose–mouth pattern that primarily looked at areas along the face midline (forehead center to nose to mouth center), and an eyes–mouth pattern that fixated at more lateral features such as the two eyes and the mouth corners. Consistent with our hypothesis, significantly more controls adopted the eyes–mouth pattern and more individuals […]

The misrecognition of facial anger in individuals with insomnia corresponded to their eye movement patterns. Most of the individuals with insomnia adopted nose–mouth patterns that focused on the vertical face midline while missing the two eyes. In contrast, most controls adopted eyes–mouth patterns that looked at more lateral areas covering both the eyes and the mouth. Since participants adopting eyes–mouth patterns outperformed those adopting nose–mouth patterns in angry face recognition, the impaired recognition of angry faces in individuals with insomnia may be related to their use of nose–mouth patterns. Although the eye region has also been reported to be diagnostic for identifying fearful expressions, in the current study participants with insomnia symptoms did not show impaired fearful expression recognition when compared with controls. We speculated that this result may be because […]

[…] prefrontal cortex and these limbic structures towards emotional stimuli [6, 34]. In addition to this emotional brain network, the current study suggests that impaired attentional functioning may also play an important role in accounting for the misinterpretation of emotional information after sleep loss. Indeed, sleep loss is shown to affect visual attention control and activation in the attention brain network [9, 10]. Decreased activations in the attention brain network (e.g. the frontal eye field and intraparietal sulcus) are associated with maladaptive eye movement patterns and impaired recognition performance during face viewing [14]. Impaired visual attention functions may cause failure in selecting diagnostic information for emotional face perception, leading to biased interpretation of emotional information. Our findings are consistent with those of Cote et al. [5], which showed that impaired facial expression recognition in sleep-deprived individuals was […]

[…] Scotland, for whom information from the eyes and the mouth may be more equally used [37], Kyle and colleagues [7] did not find differences in recognition accuracy but only in emotional intensity ratings between the participants with insomnia and the controls. Future work will examine these possibilities.

Whereas our current results showed that participants with insomnia symptoms tended to overlook the eye region during facial expression identification, some recent studies have reported that individuals with insomnia had preferential attention to the eye region during passive face viewing [13, 38]. This differential effect of insomnia on eye movement patterns due to task difference suggests that task demand may be an important factor to consider in the examination of sleep loss on visuospatial attention functions. Indeed, in the literature, it has been well demonstrated that eye movement behavior can be significantly altered by task [39, 40], and […]
Funding
We are grateful to the Research Grants Council of Hong Kong (#17402814 to Hsiao and CityU110513 to Chan).

Conflict of interest statement. None declared.

References
1. Beattie L, et al. Social interactions, emotion and sleep: a systematic review and research agenda. Sleep Med Rev. 2015;24:83–100.
2. Killgore WD, et al. Sleep deprivation impairs recognition of specific emotions. Neurobiol Sleep Circadian Rhythms. 2017;3:10–16.
3. Huck NO, et al. The effects of modafinil, caffeine, and dextroamphetamine on judgments of simple versus […]
19. Coviello E, et al. Clustering hidden Markov models with variational HEM. J Mach Learn Res. 2014;15:697–747.
20. Chuk T, et al. Is having similar eye movement patterns during face learning and recognition beneficial for recognition performance? Evidence from hidden Markov modeling. Vision Res. 2017;141:204–216.
21. Chan CY, et al. Eye-movement patterns in face recognition are associated with cognitive decline in older adults. Psychon Bull Rev. 2018:1–8.
22. Smith ML, et al. Transmitting and decoding facial expressions. Psychol Sci. 2005;16(3):184–189.
23. Schyns PG, et al. Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: behavioral and brain evidence. PLoS One. 2009;4(5):e5625.
24. Espie CA, et al. The sleep condition indicator: a clinical screening tool to evaluate insomnia disorder. BMJ Open. […]