Aluja, A. (2018). Psychological Research.
https://doi.org/10.1007/s00426-018-0991-x
ORIGINAL ARTICLE
Abstract
The current research was designed to assess possible differences in the emotional content of pleasant and unpleasant face emoji using the acoustically evoked eyeblink startle reflex response. Stimuli were selected from the Emojipedia webpage. First, we assessed these stimuli in an independent sample of 190 undergraduate students (46 males and 144 females) with a mean age of 21.43 years (SD 3.89). A principal axis method was performed using the 30 selected emoji faces, extracting two factors (15 pleasant and 15 unpleasant emoji). Second, we measured acoustic startle reflex modulation in 53 young adult women [mean age 22.13 years (SD 4.3)] during the viewing of each of the 30 emoji emotional faces in the context of the theory of motivation and emotion proposed by Lang (1995), but considering only the valence dimension. We expected to find higher acoustically evoked startle responses when viewing unpleasant emoji and lower responses for pleasant ones, similar to the results obtained in studies using human faces as emotional stimuli. An ANOVA was conducted to compare acoustic startle responses associated with pleasant and unpleasant emoji. Results yielded a main effect of picture valence (λ = 0.80, F(1, 50) = 12.80, p = .001, η² = 0.20). Post-hoc t test analysis indicated significant differences in the startle response between unpleasant (50.95 ± 1.75) and pleasant (49.14 ± 2.49) emoji (t(52) = 3.59, p = .001), with a Cohen's d = 0.495. Viewing affective facial emoji expressions modulates the acoustic startle reflex response according to their emotional content.
* Anton Aluja
aluja@pip.udl.cat

1 University of Lleida, Lleida, Catalonia, Spain
2 Institute of Biomedical Research of Lleida (IRB Lleida), University of Lleida, Avda. de l'Estudi General, 4, 25001 Lleida, Catalonia, Spain
3 Autonomous University of Barcelona, Lleida, Catalonia, Spain

Introduction

Acoustic startle reflex (ASR) modulation by affective emoji is a useful paradigm for studying emotion. Eyeblink magnitude modulated by affective state has become one of the most widely used physiological measures in the study of motivation and emotion (Filion, Dawson, & Schell, 1998). Lang and collaborators proposed a theory of emotion, where appetitive and aversive systems activate in accordance with the subject's current emotional state (Bradley & Lang, 1999; Bradley & Lang, 2000a). ASR is then modulated depending on how these systems are activated. Several studies have demonstrated that acoustic startle responses are attenuated during passive viewing of affective pleasant pictures, whereas they are increased when viewing unpleasant pictures (Aluja, Blanch, & Balada, 2015; Blanch, Aluja, Blanco, & Balada, 2016; Bradley & Lang, 2000a; Cuthbert, Bradley, & Lang, 1996; Lang, Bradley, & Cuthbert, 1998; Vrana, Spence, & Lang, 1988). Affective startle modulation demonstrates that the neuronal networks underlying emotions include direct connections to the brain's primary motivational systems (Lang, Davis, & Ohman, 2000). Lang and collaborators also proposed three factors to measure affective reactions: (a) "emotional valence", indicating the hedonic value of a specific emotion as either positive or negative; (b) "emotional arousal", indicating its intensity; and (c) "dominance", indicating the feeling of being in control vs being controlled (Bradley & Lang, 1994).

Emotional expressions of human faces are relevant in non-verbal communication and can produce a significant modulation of the ASR according to their emotional valence (Anokhin & Golosheykin, 2010; Duval, Lovelace, Aarant, & Filion, 2013; Paulus, Musial, & Renn, 2014). This effect is especially potentiated when viewing angry faces (Dunning, Auriemmo, Castille, & Hajcak, 2010). Nevertheless, other studies have failed to find relationships between the valence of facial expressions and ASR responses (Alpers & Adolph, 2006; Waters, Neumann, Henry, Craske, & Ornitz, 2008). In a study by Balaban (1995), pictures of unfamiliar
adult faces with happy, neutral, and angry expressions were administered to 5-month-old infants and affected the ASR amplitude. It increased when they viewed angry faces and decreased when happy faces were presented. In another study, Springer, Rosas, McGetrick, and Bowers (2007) found a significant effect of facial expressions on startle amplitude, where angry but not fearful faces produced an increased eyeblink response compared to all other expressions.

Emoji are pictographic symbols widely used in Internet, electronic, and text communication such as instant messaging applications and social media sharing (see real-time emoji use on Twitter: http://www.emojitracker.com/). Emoji are available on different operating systems (e.g., iOS, Android…). On the Internet, web pages provide several emoji collections. The Emoji Report (2015) indicates that emoji are used by 92% of the online population and that over 60% have used emoji several times a week or several times a day. Some reasons for using emoji given by U.S. Internet users as of August 2015 were: (a) "they help me more accurately express what I am talking" (70.4%); (b) "it makes it easy for other people to understand me" (64.7%); and (c) "they help create a more personal connection with the other persons" (49.7%). Traditional emoji faces dominate the top two categories, with nearly 60% of all emoji sent. Face emoji convey their meaning through graphic resemblance to human faces (e.g., a smiling face) (Miller et al., 2016). Emoji can be considered examples of how people have created surrogates for non-verbal cues. All these data emphasize the importance of studying this form of communication and its emotional impact further.

Neuroimaging studies report that human facial expressions activate emotion-related brain circuits in the limbic system, particularly the amygdala (Breiter et al., 1996; Hariri, Tessitore, Mattay, Fera, & Weinberger, 2002; Vuilleumier, Armony, Driver, & Dolan, 2001; Whalen et al., 2001). The amygdala plays an important role in the processing of facial emotional expressions by humans and non-human primates, as demonstrated by lesion and direct electrical stimulation studies (Adolphs et al., 2005; Adolphs, Tranel, Damasio, & Damasio, 1995). However, schematic faces appear to evoke greater amygdala activation in arousal-based contrasts (negative vs neutral) than in valence-based ones (negative vs positive) (Evans et al., 2008). Emoji faces express feelings and emotions, both positive and negative, although they are more usual in a positive than in a negative context (Derks, Bos, & von Grumbkow, 2008). These authors also suggest that people seem to use emoji faces in a similar way to face-to-face communication with regard to the social context and interaction partner. Nevertheless, an event-related potential (ERP) study with emoji showed a reduced amplitude of the N170 when the emoji were inverted, the opposite of the pattern shown by faces, indicating differences in the recognition of the physiognomic features of emoji compared to human faces, but not through configural processing (Churches, Nicholls, Thiessen, Kohler, & Keage, 2014). Participants were better at categorizing face cues as fearful or neutral with upright than with inverted faces (Khalid, Horstmann, Ditye, & Ansorge, 2017). In fact, some studies have suggested that human faces and schematic faces (similar to emoji, but not in color) activate the same emotional/motivational system.

As far as we know, no study has attempted to explore the associations between emotional face emoji and ASR modulation. The main objective of this study was to assess the emotional impact of facial expression emoji via the effect of pleasant and unpleasant emoji faces on the acoustically evoked startle response. In accordance with the theory of motivation and emotion proposed by Lang, Bradley, Cuthbert, and Patrick (1993), we hypothesized that the ASR should be potentiated by emoji expressions with negative emotional valence (fearful, angry, crying face, tired face, weary face, etc.), whereas the startle response should be attenuated by faces with positive emotional valence (face with tears of joy, smiling face with smiling eyes, etc.). The current research comprised two studies: the first assessed the subjective emotional valence of the emoji, and the second assessed their emotional impact with a psychophysiological measure. It was expected that emoji with different emotional expressions would modulate the ASR in a similar way as human faces with emotional expressions.

Study 1: emoji valence assessment

The aim of this study was to validate the valence classification of the emoji previously selected from the Emojipedia descriptions.

Method

Participants In Study 1, valence scores were obtained with a sample of 190 undergraduate students (46 males and 144 females) with a mean age of 21.43 years (SD 3.89). Participation was voluntary and anonymous.
Table 1 Pleasant (15) and unpleasant (15) emoji by presentation group (startle reflex condition) and operating system

Smiling Face With Smiling Eyes | Pleasant   | Y | iOS 9.3       | Android 6.0.1 | Twemoji 2.0
Crying Face                    | Unpleasant | Y | Twemoji 2.0   | iOS 9.3       | Android 6.0.1
White Smiling Face             | Pleasant   | Y | Android 6.0.1 | Twemoji 2.0   | iOS 9.3

The 30 emoji are the same in the three groups, but ordered in different ways.
Y yes, N no; operating systems: iOS 9.3, Twemoji 2.0, Android 6.0.1
emojipedia.org). Each emoji description was selected and represented in three operating systems: iOS 9.3, Android 6.0.1, and Twemoji 2.0 (Table 1). Participants were asked to evaluate each emoji according to its affective valence, from 1 (unpleasant) to 9 (pleasant).

Results

Table 2 shows the factorial structure of the 30 emoji using principal axis analysis (PA) with oblimin rotation, extracting two factors with a procedure similar to that of Aluja and Blanch (2011). The Kaiser–Meyer–Olkin measure of sampling adequacy was 0.89, and Bartlett's test of sphericity yielded an approximate Chi-square value of 4061.992 (degrees of freedom 435; p < .001). The two factors explained 52.76% of the total variance using the structure matrix. The first factor was composed of the 15 pleasant emoji and the second factor of the 15 unpleasant ones. The correlation between factors was −0.41. Table 2 also shows the means and standard deviations of the 30 emoji subjective
assessments. Significant differences in subjective valence assessment were found between positive (7.85 ± 1.31) and negative (3.61 ± 1.30) emoji (t(190) = 26.33, p < .001).

Discussion

Following Ekman and Friesen's (1975) proposal, a previous study categorized the emotions of emoji into six categories (happy, sad, fear, surprise, angry, and disgust), but emoji used on cellular phones can also be divided into two major categories: pleasant (smiling, laughing, etc.) and unpleasant (sad, anger, fear, etc.) (Toratani & Hirayama, 2011).

The present factorial analyses confirm that all 30 emoji selected from the Emojipedia descriptions loaded onto two independent factors according to affective valence (pleasant vs unpleasant). An additional advantage of the factorial analysis is that each factor orders the items from high to low factorial loadings. All emoji factorial loadings were above 0.50 on their respective factor. The factorial structure was robust, with two clearly differentiated factors. The pleasant emoji with high loadings were represented by "smiles", while the unpleasant emoji with high loadings were represented by "crying", "tired", and "weary" faces. This classification supported the bipolar emotional content of these emoji and their use as emotional material in the ASR study.

Study 2: startle response modulation

The aim of this study was to assess differences in ASR modulation associated with pleasant or unpleasant emoji.

Method

Participants The sample comprised 53 young adult women (mean age 22.13 years; SD 4.3). Participation was also voluntary and anonymous. Both studies were authorized by the ethical committee and were in accordance with the ethical standards laid down in the 1964 Declaration of Helsinki.
Materials We composed three blocks of 10 emoji from the 30 obtained in Study 1. Each block contained five negative and five positive emoji from three different operating systems (iOS 9.3, Android 6.0.1, Twemoji 2.0). The emoji were presented in the same emotional valence sequence in each block (unpleasant, pleasant, pleasant, unpleasant, pleasant, unpleasant, unpleasant, pleasant, unpleasant, and pleasant), but alternating the operating system for each emoji. Once the stimulus sequence was determined, the operating system corresponding to each stimulus was randomized, with the constraint that the ten stimuli comprised four stimuli from one operating system and three from each of the other two. Participants were randomly assigned to three groups, varying the order of block presentation. After defining the first set of stimuli, the other two sets were designed by changing operating systems. For example, if the first block had an emoji from iOS 9.3, in the second block the same emoji was from Android 6.0.1, and in the third block from Twemoji 2.0 (Table 1).

Procedure A clinical psychologist interviewed all subjects with open questions. None of them reported a history of psychopathological disorders, current medication, auditory impairment, or any uncorrected visual problems. Participants were strongly advised to refrain from smoking and from drinking alcoholic or stimulant drinks for at least 12 h prior to the experiment. The participants signed a written consent form. The recording sessions took place in an acoustically and electromagnetically isolated room (Faraday cage). Emoji had a diameter of 15 cm and were presented on a 32-inch screen, while the subject sat at a distance of 60 cm. Prior to stimulus presentation, the physiological sensors were attached and impedances checked. To habituate participants to the acoustic probe and minimize the novelty effect, three acoustic stimuli were delivered before the emoji presentations. Emoji were presented over a black background for 6 s with an inter-trial interval (ITI) varying from 16 to 18 s; each emoji was preceded by a white cross (1 s duration). An acoustic startle probe was presented with each emoji, within 2.5 to 3.5 s after emoji onset. Each block included one of the five emoji of each emotional valence without sound. Two ITI startle probes were presented in each block to reduce the temporal predictability of the startle sound and to compare modulation of the ASR by emotional valence with a "baseline" level of reactivity. All participants in this study assessed the 30 emoji according to valence values with the same procedure used in Study 1.

…graphy (EMG) module of the Biopac system and AcqKnowledge software. The startle blink reflex was recorded electromyographically at the right orbicularis oculi muscle with two 6-mm Ag–AgCl electrodes filled with electrolyte gel; one electrode was placed below the right lower eyelid in line with the pupil in forward gaze, and the other approximately 1–2 cm from the first one (center-to-center) (Balada, Blanch, & Aluja, 2014; Blanch, Balada, & Aluja, 2013; Blumenthal et al., 2005). Impedance levels were below 5 kΩ for each participant.

EMG eyeblink responses were collected using a 10–500 Hz band-pass analog filter. The signal was then integrated and smoothed with a five-point moving-average off-line filter. Peak amplitude was measured as the highest point in the EMG response between 20 and 120 ms after probe onset, using the average of the 30 ms before probe administration as baseline. No response was scored when the peak amplitude was lower than the average baseline plus three standard deviations for this period. Responses with onset latency below 20 ms and/or rise time above 95 ms were rejected. Raw data on ASR magnitude showed high between-subjects variability. To reduce this variability, magnitude was transformed to within-subjects T-scores.

Results

A 2 × 3 ANOVA performed with group as between-subjects factor and valence (pleasant and unpleasant) as within-subjects factor yielded a main effect of emoji valence [λ = 0.80, F(1, 50) = 12.80, p = .001, η² = 0.20], but not of the valence × group interaction [λ = 0.98, F(2, 50) = 0.58, p = .56]. In addition, no significant between-subjects effect of group was observed [F(2, 50) = 0.78, p = .47].

Figure 1 shows that there were significant differences in the startle response between unpleasant (50.95 ± 1.75) and pleasant (49.14 ± 2.49) emoji [t(52) = 3.59, p = .001], with an intermediate Cohen's d = 0.495 (Cohen, 1988). No statistical differences appeared when we compared ITI startle magnitude (49.83 ± 4.47) with the startle response during viewing of positive [t(52) = −0.78, p = .44] and negative [t(52) = 1.57, p = .12] emoji. Significant differences in subjective valence assessment were found between positive (8.08 ± 1.38) and negative (3.46 ± 1.31) emoji (t(52) = 13.31, p < .001). These data did not differ from the data obtained in the first study for either positive [t(242) = 1.11, p = .27] or negative emoji [t(242) = 0.73, p = .47].
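The valence contrast above can be sketched numerically. This is a minimal illustration, not the authors' analysis code: the per-participant T-scores below are simulated placeholders, and the script assumes the repeated-measures form of Cohen's d (d = t/√n), which reproduces the reported value.

```python
import numpy as np

# Hypothetical within-subject startle T-scores for n = 53 participants,
# drawn to resemble the reported summary statistics (50.95 ± 1.75 vs 49.14 ± 2.49).
rng = np.random.default_rng(0)
n = 53
unpleasant = rng.normal(50.95, 1.75, n)
pleasant = rng.normal(49.14, 2.49, n)

# Paired t test on the within-subject differences (df = n - 1 = 52).
diff = unpleasant - pleasant
t = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))

# Repeated-measures Cohen's d computed as t / sqrt(n); with the reported
# t(52) = 3.59 this gives 3.59 / sqrt(53) ≈ 0.493, consistent with the
# d = 0.495 reported above (the small gap comes from rounding t).
d = t / np.sqrt(n)
print(round(3.59 / np.sqrt(n), 3))  # → 0.493
```

Note that the more common between-groups pooled-SD formula applied to the reported means and SDs yields a larger value, so the t/√n formulation appears to be the one that matches the published d.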
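The EMG reduction described in the Method (five-point moving-average smoothing, peak search 20–120 ms after probe onset, a 30-ms pre-probe baseline, and within-subject T-scores) can be sketched as follows. This is a simplified illustration assuming a 1 kHz sampling rate and rectified EMG input; the sampling rate, function names, and the omission of the latency/rise-time rejection rules are our assumptions, not details from the article.

```python
import numpy as np

FS = 1000  # Hz, assumed sampling rate; at 1 kHz one sample equals 1 ms


def score_trial(emg, probe_idx):
    """Return blink magnitude for one trial, or None when no response is scored."""
    # Rectify, then smooth with a five-point moving average (off-line filter).
    smoothed = np.convolve(np.abs(emg), np.ones(5) / 5, mode="same")
    baseline = smoothed[probe_idx - 30:probe_idx]        # 30 ms pre-probe baseline
    window = smoothed[probe_idx + 20:probe_idx + 120]    # 20–120 ms post-probe window
    peak = window.max()
    # Response criterion: peak must exceed baseline mean + 3 SD of the baseline.
    if peak < baseline.mean() + 3 * baseline.std(ddof=1):
        return None
    return peak - baseline.mean()


def to_t_scores(magnitudes):
    """Within-subject T-scores (mean 50, SD 10) to reduce between-subject variability."""
    m = np.asarray(magnitudes, dtype=float)
    return 50 + 10 * (m - m.mean()) / m.std(ddof=1)


print(to_t_scores([2.0, 4.0, 6.0]))  # → [40. 50. 60.]
```

The T-score transform is what allows the group means reported in the Results (e.g., 50.95 vs 49.14) to be read directly as deviations around each participant's own mean of 50.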
…participants. In addition, withdrawal effects (headache, irritability) may have occurred in regular nicotine/caffeine users, as participants could not smoke or take stimulants for 12 h before the study.

Another limitation of this study is the lack of neutral emoji. While the pleasant and unpleasant emoji seem easy to identify subjectively, affective "neutrality" in the pool of emoji was more difficult to recognize; for this reason, it was not included in this study. We included the ITI startle amplitudes in the analysis to compare the obtained differences between pleasant and unpleasant pictures with a neutral condition or "baseline" level of reactivity. If we consider the ITIs equivalent to responses to neutral pictures, the ASR magnitude should be lower than the magnitude for negative pictures and higher than for positive pictures. Our average ASR results for the ITIs were in this line, as they fell between the positive and negative pictures, although the differences were not significant. This lack of differentiation could be due to the lower valence effect produced by emoji in comparison with other emotional stimuli such as IAPS (International Affective Picture System) pictures. It would be interesting to compare emoji with other commonly used emotional stimuli (Aluja, Rossier, Blanch, Blanco, Martí-Guiu, & Balada, 2015).

In summary, this research demonstrates that emoji induce changes in the ASR similar to the findings reported for human faces, in line with Lang's theory (Lang et al., 1990). Our intention was to be cautious and first investigate the possible ASR difference depending on emoji affective valence and later, in future studies, incorporate the arousal and dominance dimensions. These results represent a starting point on which other researchers can build, using different emoji types in research involving emotions and ASR modulation or other neurophysiological techniques, and to be replicated in samples of men. The affective content of emoji should be considered an object of study for future investigations in the field of emotion.

Funding This research was supported by "Plan Nacional" grant PSI2015-63551-P, Ministerio de Ciencia e Innovación, Spain. This research was performed within the Catalonian Consolidated Research Group SGR 111 (2014–2016 period). Eduardo Blanco is a Serra Húnter Fellow.

Compliance with ethical standards

Conflict of interest The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Ethical approval All procedures performed were in accordance with the ethical standards of the institutional research committee "Comitè Ètic de l'Hospital Arnau de Vilanova" of the University of Lleida (Spain), and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent Informed consent was obtained from all individual participants included in the study.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72. https://doi.org/10.1038/nature03086.
Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. R. (1995). Fear and the human amygdala. The Journal of Neuroscience, 15(9), 5879–5891.
Alpers, G. W., & Adolph, D. (2006). Startle and autonomic nervous system modulation while viewing emotional scenes or emotional facial expressions. Psychophysiology, 43(Supplement 1), S7. https://doi.org/10.1016/j.biopsycho.2009.10.001.
Aluja, A., & Blanch, A. (2011). Neuropsychological behavioral inhibition system (BIS) and behavioral approach system (BAS) assessment: A shortened sensitivity to punishment and sensitivity to reward questionnaire version (SPSRQ–20). Journal of Personality Assessment, 93(6), 1–10. https://doi.org/10.1080/00223891.2011.608760.
Aluja, A., Blanch, A., & Balada, F. (2015). Affective modulation of the startle reflex and the reinforcement sensitivity theory of personality in females. Physiology and Behavior, 138, 332–339. https://doi.org/10.1016/j.physbeh.2014.09.009.
Aluja, A., Rossier, L., Blanch, Á., Blanco, E., Martí-Guiu, M., & Balada, F. (2015). Personality effects and sex differences on the International Affective Picture System (IAPS): A Spanish and Swiss study. Personality and Individual Differences, 77, 143–148. https://doi.org/10.1016/j.paid.2014.12.058.
Anokhin, A. P., & Golosheykin, S. (2010). Startle modulation by affective faces. Biological Psychology, 83(1), 37–40. https://doi.org/10.1016/j.biopsycho.2009.10.001.
Balaban, M. T. (1995). Affective influences on startle in five-month-old infants: Reactions to facial expressions of emotions. Child Development, 66(1), 28–36. https://doi.org/10.1111/j.1467-8624.1995.tb00853.x.
Balada, F., Blanch, A., & Aluja, A. (2014). Arousal and habituation effects (excitability) on startle responses to the International Affective Picture System (IAPS). Journal of Psychophysiology, 28, 233–241. https://doi.org/10.1027/0269-8803/a000115.
Blanch, A., Aluja, A., Blanco, E., & Balada, F. (2016). Examining habituation of the startle reflex with the reinforcement sensitivity theory of personality: Habituation and RST. Psychophysiology, 53(10), 1535–1541. https://doi.org/10.1111/psyp.12725.
Blanch, A., Balada, F., & Aluja, A. (2013). Presentation and AcqKnowledge: An application of software to study human emotions and individual differences. Computer Methods and Programs in Biomedicine, 110(1), 89–98. https://doi.org/10.1016/j.cmpb.2012.10.013.
Blumenthal, T. D., Cuthbert, B. N., Filion, D. L., Hackley, S., Lipp, O. V., & Van Boxtel, A. (2005). Committee report: Guidelines for human startle eyeblink electromyographic studies. Psychophysiology, 42(1), 1–15. https://doi.org/10.1111/j.1469-8986.2005.00271.x.
Bradley, M. M., & Lang, P. J. (1994). Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25(1), 49–59. https://doi.org/10.1016/0005-7916(94)90063-9.
Bradley, M. M., & Lang, P. J. (1999). Affective norms for English words (ANEW): Stimuli, instruction manual and affective ratings (Tech. Rep. No. C-1). Gainesville: University of Florida, Center for Research in Psychophysiology.
Bradley, M. M., & Lang, P. J. (2000a). Affective reactions to acoustic stimuli. Psychophysiology, 37(2), 204–215. https://doi.org/10.1111/1469-8986.3720204.
Bradley, M. M., & Lang, P. J. (2000b). Measuring emotion: Behavior, feeling, and physiology. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 242–276). New York: Oxford University Press.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., et al. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17(5), 875–887. https://doi.org/10.1016/S0896-6273(00)80219-6.
Churches, O., Nicholls, M., Thiessen, M., Kohler, M., & Keage, H. (2014). Emoticons in mind: An event-related potential study. Social Neuroscience, 9(2), 196–202. https://doi.org/10.1080/17470919.2013.873737.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Erlbaum.
Cuthbert, B. N., Bradley, M. M., & Lang, P. J. (1996). Probing picture perception: Activation and emotion. Psychophysiology, 33(2), 103–111. https://doi.org/10.1111/j.1469-8986.1996.tb02114.x.
Derks, D., Bos, A. E., & von Grumbkow, J. (2008). Emoticons in computer-mediated communication: Social motives and social context. Cyberpsychology & Behavior, 11(1), 99–101. https://doi.org/10.1089/cpb.2007.9926.
Detenber, B. H., & Reeves, B. (1996). A bio-informational theory of emotion: Motion and image size effects on viewers. Journal of Communication, 46(3), 66–84. https://doi.org/10.1111/j.1460-2466.1996.tb01489.x.
Dunning, J. P., Auriemmo, A., Castille, C., & Hajcak, G. (2010). In the face of anger: Startle modulation to graded facial expressions. Psychophysiology, 47(5), 874–878. https://doi.org/10.1111/j.1469-8986.2010.01007.x.
Duval, E. R., Lovelace, C. T., Aarant, J., & Filion, D. L. (2013). The time course of face processing: Startle eyeblink response modulation by face gender and expression. International Journal of Psychophysiology, 90(3), 354–357. https://doi.org/10.1016/j.ijpsycho.2013.08.006.
Ekman, P., & Friesen, W. V. (1975). Unmasking the face: A guide to recognizing emotions from facial clues. NJ: Prentice-Hall.
Evans, K. C., Wright, C. I., Wedig, M. M., Gold, A. L., Pollack, M. H., & Rauch, S. L. (2008). A functional MRI study of amygdala responses to angry schematic faces in social anxiety disorder. Depression and Anxiety, 25(6), 496–505. https://doi.org/10.1002/da.20347.
Filion, D. L., Dawson, M. E., & Schell, A. M. (1998). The psychological significance of human startle eyeblink modification: A review. Biological Psychology, 47(1), 1–43. https://doi.org/10.1016/S0301-0511(97)00020-3.
Ganster, T., Eimler, S. C., & Kramer, N. C. (2012). Same same but different!? The differential influence of smilies and emoticons on person perception. Cyberpsychology, Behavior, and Social Networking, 15(4), 226–230. https://doi.org/10.1089/cyber.2011.0179.
Guerra, P., Sanchez-Adam, A., Anllo-Vento, L., Ramirez, I., & Vila, J. (2012). Viewing loved faces inhibits defense reactions: A health-promotion mechanism? PLoS One, e41631. https://doi.org/10.1371/journal.pone.0041631.
Hariri, A. R., Tessitore, A., Mattay, V. S., Fera, F., & Weinberger, D. R. (2002). The amygdala response to emotional stimuli: A comparison of faces and scenes. Neuroimage, 17(1), 317–323. https://doi.org/10.1006/nimg.2002.1179.
Khalid, S., Horstmann, G., Ditye, T., & Ansorge, U. (2017). Measuring the emotion-specificity of rapid stimulus-driven attraction of attention to fearful faces: Evidence from emotion categorization and a comparison with disgusted faces. Psychological Research, 81(2), 508–523. https://doi.org/10.1007/s00426-016-0743-8.
Lang, P. J. (1995). The emotion probe: Studies of motivation and attention. American Psychologist, 50(5), 372–385. https://doi.org/10.1037/0003-066x.50.5.372.
Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (1990). Emotion, attention, and the startle reflex. Psychological Review, 97(3), 377–395. https://doi.org/10.1037/0033-295x.97.3.377.
Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (1998). Emotion, motivation, and anxiety: Brain mechanisms and psychophysiology. Biological Psychiatry, 44(12), 1248–1263. https://doi.org/10.1016/S0006-3223(98)00275-3.
Lang, P. J., Bradley, M. M., Cuthbert, B. N., & Patrick, C. J. (1993). Emotion and psychopathology: A startle probe analysis. Progress in Experimental Personality & Psychopathology Research, 16, 163–199.
Lang, P. J., Davis, M., & Ohman, A. (2000). Fear and anxiety: Animal models and human cognitive psychophysiology. Journal of Affective Disorders, 61(3), 137–159. https://doi.org/10.1016/S0165-0327(00)00343-8.
Miller, H., Thebault-Spieker, J., Chang, S., Johnson, I., Terveen, L., & Hecht, B. (2016). "Blissfully happy" or "ready to fight": Varying interpretations of emoji. Paper presented at the Proceedings of the International AAAI Conference on Web and Social Media (ICWSM-2016), Cologne, Germany.
Paulus, A., Musial, E., & Renn, K. (2014). Gender of the expresser moderates the effect of emotional faces on the startle reflex. Cognition and Emotion, 28(8), 1493–1501. https://doi.org/10.1080/02699931.2014.886557.
Springer, U. S., Rosas, A., McGetrick, J., & Bowers, D. (2007). Differences in startle reactivity during the perception of angry and fearful faces. Emotion, 7(3), 516–525. https://doi.org/10.1037/1528-3542.7.3.516.
Toratani, Y., & Hirayama, M. J. (2011, September). Psychological analysis of emoticons used for e-mails on cellular phones. In Mobile IT Convergence (ICMIC), 2011 International Conference on IEEE (pp. 49–53).
Vrana, S. R., Spence, E. L., & Lang, P. J. (1988). The startle probe response: A new measure of emotion? Journal of Abnormal Psychology, 97(4), 487–491. https://doi.org/10.1037/0021-843X.97.4.487.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30(3), 829–841. https://doi.org/10.1016/S0896-6273(01)00328-2.
Waters, A. M., Neumann, D. L., Henry, J., Craske, M. G., & Ornitz, E. M. (2008). Baseline and affective startle modulation by angry and neutral faces in 4–8-year-old anxious and non-anxious children. Biological Psychology, 78(1), 10–19. https://doi.org/10.1016/j.biopsycho.2007.12.005.
Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1(1), 70–83. https://doi.org/10.1037/1528-3542.1.1.70.