
www.elsevier.com/locate/ynimg

NeuroImage 31 (2006) 906–919

Facial expressions and complex IAPS pictures: Common and differential networks
Jennifer C. Britton,a,* Stephan F. Taylor,b Keith D. Sudheimer,a and Israel Liberzon b,c
a Department of Neuroscience, University of Michigan, Ann Arbor, MI 48109, USA
b Department of Psychiatry, University of Michigan, Ann Arbor, MI 48109, USA
c Psychiatry Service, Ann Arbor VAMC, Ann Arbor, MI 48105, USA

Received 20 July 2005; revised 11 December 2005; accepted 16 December 2005
Available online 17 February 2006

Neuroimaging studies investigating emotion have commonly used two different visual stimulus formats, facial expressions of emotion or emotionally evocative scenes. However, it remains an important unanswered question whether or not these different stimulus formats entail the same processes. Facial expressions of emotion may elicit more emotion recognition/perception, and evocative pictures may elicit more direct experience of emotion. In spite of these differences, common areas of activation have been reported across different studies, but little work has investigated activations in response to the two stimulus formats in the same subjects. In this fMRI study, we compared BOLD activation patterns to facial expressions of emotion and to complex emotional pictures from the International Affective Picture System (IAPS) to determine if these stimuli would activate similar or distinct brain regions. Healthy volunteers passively viewed blocks of expressive faces and IAPS pictures balanced for specific emotion (happy, sad, anger, fear, neutral), interleaved with blocks of fixation. Eye movement, reaction times, and off-line subjective ratings including discrete emotion, valence, and arousal were also recorded. Both faces and IAPS pictures activated similar structures, including the amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex. In addition, expressive faces uniquely activated the superior temporal gyrus, insula, and anterior cingulate more than IAPS pictures, despite the faces being less arousing. For the most part, these regions were activated in response to all specific emotions; however, some regions responded only to a subset.
© 2006 Elsevier Inc. All rights reserved.

* Corresponding author. Massachusetts General Hospital, Psychiatry Department, Building 149 Thirteenth Street, Charlestown, MA 02129, USA. Fax: +1 617 726 4078.
E-mail address: jbritton@nmr.mgh.harvard.edu (J.C. Britton).
Available online on ScienceDirect (www.sciencedirect.com).
1053-8119/$ - see front matter © 2006 Elsevier Inc. All rights reserved.
doi:10.1016/j.neuroimage.2005.12.050

Introduction

Emotion research utilizes different types of stimuli (e.g. expressive faces and complex evocative pictures) to probe affective processing; however, the two lines of investigation have remained relatively separate. Facial expressions are often viewed as external signals of experienced emotions that communicate information to the observer (Frank and Stennett, 2001). Facial expressions portraying specific emotions (e.g. happy, sad, anger, fear) are universally recognized (Ekman, 1992, 1994; Izard, 1994), and each expression of discrete emotion has meaning, targeting a specific response (Halberstadt and Niedenthal, 1997). Even though facial expressions are used frequently as probes of emotion recognition, some studies have shown that faces can be inducers of emotion (Hatfield et al., 1992; Wild et al., 2001). Facial expressions have also been shown to evoke physiological changes (Clark et al., 1992; Esteves and Ohman, 1993), and autonomic activity in response to facial expressions has been shown to correlate with neural activation (Williams et al., 2004). Complex pictures from the International Affective Picture System (IAPS), another common emotional probe, depict emotion-laden scenes to induce affective states. The standardized set of IAPS pictures has been rated in terms of its ability to induce valence (unpleasant/pleasant) and arousal (calm/excited) changes. These measures have also been correlated with viewers' heart rate and skin conductance changes, respectively, providing physiological validity to subjectively reported emotion induction (Lang et al., 1993). However, little work has been done to identify the discrete emotions elicited by these pictures. Although both emotional faces and IAPS pictures target emotional processing, these two stimulus sets may preferentially engage certain brain structures involved in emotion. In addition, it is not known whether facial expressions and IAPS pictures of specific emotions (happy, sad, anger, and fear) would activate similar or discrete circuits.

Studies of expressive faces and IAPS pictures suggest that a similar set of regions is involved in processing both emotional stimulus types. Expressive faces and IAPS pictures activate regions involved in emotion processing, including the amygdala (Breiter et al., 1996; Liberzon et al., 2003; Morris et al., 1996), hippocampus (Gur et al., 2002; Lane et al., 1997c), insula (Phan et al., 2004; Phillips et al., 1997), anterior cingulate (ACC; Killgore and Yurgelun-Todd, 2004; Morris et al., 1998), medial prefrontal cortex (mPFC; Kim et al., 2003; Taylor et al., 2003; Winston et al., 2003), ventromedial prefrontal cortex (vMPFC; Phan et al., 2004)/orbitofrontal cortex (OFC; Blair et al.,
1999), and visual cortex (Liberzon et al., 2003; Morris et al., 1998). Both stimulus types may recruit similar structures due to the underlying emotional processes activated within those regions (e.g. amygdala activation reflecting fear (LeDoux, 2000) or stimulus salience (Liberzon et al., 2003), insula activation reflecting somatic/visceral responses (Damasio, 1999) and disgust perception (Phillips et al., 1997), anterior cingulate activation reflecting attention and self-awareness (Lane et al., 1997a), and medial prefrontal activation reflecting emotion regulation (Davidson et al., 2000)). However, few studies have compared these stimuli directly. In a single study comparing threat-related stimuli, bilateral amygdala activation was found in response to both expressive faces and IAPS pictures (Hariri et al., 2003); however, the low Z-scores and the cognitive matching task in this study prevent any definitive conclusions regarding the common and differential emotional networks activated by these emotional stimuli.

Even though expressive faces and complex pictures may activate a similar set of regions, given the role of emotional facial expressions in transacting social behavior, emotional perception of faces is thought to be processed by a distinct circuitry (Calder et al., 2001), including superior temporal gyrus (STG) and amygdala (Adolphs et al., 2002; Winston et al., 2003). Facial expressions of emotion have characteristic profiles (e.g. protruded tongue when disgusted, contracted eyebrows when angry) (Darwin, 1998), and the STG has been shown to respond to variable aspects of facial expressions (Narumoto et al., 2001). In some studies, superior temporal gyrus has also been shown to respond preferentially to faces relative to pictures (Geday et al., 2003). Lesion and neuroimaging studies highlight the robustness of the amygdala response to faces. Amygdala lesions have been shown to impair fear recognition (Yang et al., 2002a). Neuroimaging studies have shown increased amygdala activity when viewing fear (Breiter et al., 1996; Hariri et al., 2003; Morris et al., 1996; Phillips et al., 1997; Whalen et al., 2001), angry (Whalen et al., 2001), sad (Blair et al., 1999), and happy facial expressions (Breiter et al., 1996; Dolan et al., 1996). Even though IAPS pictures also activate these regions, emotional information from facial expressions may be processed preferentially by superior temporal gyrus and amygdala.

In the current study, we aimed to examine the neural correlates of responses to expressive faces and IAPS pictures. Do these emotional probes elicit similar or distinct activation patterns? In order to effectively compare BOLD responses to expressive faces and IAPS pictures, stimulus properties (e.g. specific emotion, valence, and arousal) had to be balanced, but only a few studies have examined the emotion induction capability of facial expressions (Wild et al., 2001) or the profiles of specific emotions induced by the IAPS pictures (Davis et al., 1995). Therefore, a behavioral experiment was conducted to match stimuli based on these features. Subsequently, a block design fMRI study was conducted to examine the neural correlates of processing facial expressions and IAPS pictures, balanced on specific emotion. We hypothesized that facial expressions and IAPS pictures would activate a similar emotional network, and that some brain regions (superior temporal gyrus and amygdala) would preferentially respond to facial expressions.

Methods

Participants

Healthy volunteers were recruited from advertisements placed at local universities. Demographics are outlined in Table 1. All participants were between 18 and 30 years, right-handed, English speaking, and had normal or corrected-to-normal visual acuity. Participants did not have a current or prior history of head injury, learning disability, psychiatric illness, medical illness, or substance abuse/dependence (> 6 months). For the fMRI study, a formal screening assessment (Mini SCID) was used (Sheehan et al., 1998). After explanation of the experimental protocol, all participants gave written informed consent, as approved by the University of Michigan Institutional Review Board. Participants were paid for their participation.

Table 1
Demographics

                    Behavioral Group 1   Behavioral Group 2   fMRI
Participants        60                   60                   12
Gender (males)      30                   30                   6
Age (years)         21.6 ± 3.0 (SD)      22.1 ± 2.9           21.4 ± 2.2
Caucasian           43                   44                   9
African American    3                    2                    1
Asian               8                    6                    2
Indian              4                    4                    0
Hispanic            2                    4                    0

Experiment 1: Behavioral study

Stimuli
The image set included 150 facial expressions of specific emotions posed and evoked by actors balanced for gender and ethnicity (Gur et al., 2002) and 200 IAPS pictures (Lang et al., 1997). These images were selected to target the emotions of happiness (babies, Mickey Mouse, sporting events), sadness (funeral scenes/cemeteries, premature babies, wounded bodies), anger (human violence, guns, KKK images), and fear (snakes, spiders, sharks, medical procedures) in equal quantities. In addition, neutral or nonemotional images (mushrooms, household items) were also selected. All images were converted from color to gray scale/black and white using Photoshop 6.0 (Adobe Systems, San Jose, CA) and matched on luminance.

Procedure
Volunteers participated in separate rating-task experiments (Group 1: IAPS rating task; Group 2: Face rating task). For both, participants were seated in front of a laptop computer (Dell PC, Inspiron 2650) in a quiet experimental room.

After viewing an image for 3 s, participants were prompted to rate each image. IAPS pictures were rated only on (1) predominant emotion and (2) emotion intensity, because standardized ratings of valence and arousal for each picture have been published (Lang et al., 1997). Facial expressions were rated on (1) predominant emotion, (2) emotion intensity, and also on (3) valence and (4) arousal. The predominant emotion rating instructions were "Indicate the predominant emotion that is depicted in the image given the following options: happy, neutral, sad, anger, fear, and disgust". The emotion intensity rating instructions were "Indicate the degree/intensity of the selected emotion (1 = not at all, 2 = mildly, 3 = moderately, 4 = strongly, 5 = extremely)". The valence rating instructions were "Rate how unpleasant or pleasant the image makes you feel using a 1–9 scale (1 = very unpleasant, 5 = neutral, 9 = very pleasant)". The arousal rating instructions were "Rate how emotionally intense or
arousing the image makes you feel using a 1–9 scale (1 = calm, 5 = somewhat aroused, 9 = excited)".

Analysis
For each image, the frequencies of the two most reported emotions were compared using chi-squared analysis. For images with significant chi-squared values (P < 0.05), the image was classified according to the predominant emotion. The number of images in each specific emotional category was compiled and the percentage agreement across participants was calculated.

The valence and arousal ratings obtained for the face and standardized IAPS picture ratings were compared using t tests. A series of t tests compared valence and arousal ratings within each specific emotion category as well.

Results
Using the criteria described above (significance on a chi-square test), a proportion of the 150 facial expression stimuli (82.6%) were classified according to specific emotions (happy: 19.3%, neutral: 18.0%, sad: 17.3%, anger: 12%, fear: 16%). From the set of 200 IAPS pictures, 67.5% were classified according to specific emotions (happy: 19.5%, neutral: 16.0%, sad: 13.5%, anger: 7.5%, fear: 11.0%) (Fig. 1).

After assigning the images to a particular specific emotion category, the percent agreement was analyzed (Fig. 1A). In general, more agreement was detected in the emotional faces (83.7%) than IAPS pictures (75.2%). In addition, happy images showed most agreement (>90%).

Standardized valence and arousal ratings for the IAPS picture set (Lang et al., 1997) were compared to the ratings of facial expressions obtained from our participants (Figs. 1B–D). The IAPS pictures were rated higher on valence (i.e. more pleasant or more unpleasant) as compared to faces in each specific emotion category (post hoc pairwise t tests: P < 0.001) except anger (P > 0.241). Happy and neutral IAPS pictures were rated more positively than happy and neutral facial expressions, respectively. Sad and fear IAPS pictures were rated more negatively than sad and fear faces, respectively. The arousal rating for the faces (3.19 ± 0.06 (SEM)) was lower than for IAPS pictures (5.07 ± 0.09) for all specific emotion categories [t(334.9) = 17.56, P < 0.001; post hoc pairwise t tests: P < 0.001].

Experiment 2: fMRI study

Procedure
Volunteers were placed comfortably within the scanner. A light restraint was used to limit head movement during acquisition. While participants lay inside the scanner, stimuli were presented via MRI-compatible display goggles (VisuaStimXGA, Resonance Technology) mounted on the RF head coil and adjusted to ensure an unobstructed field of view. Stimuli were displayed

Fig. 1. Behavioral ratings. Each stimulus set (expressive faces and complex IAPS pictures) was rated on several dimensions. (A) Percentage of participants agreeing with predominant emotion assigned to each image. (B) Valence (1 = very unpleasant, 5 = neutral, 9 = very pleasant) and arousal (1 = calm, 9 = excited) ratings plotted for each image. (C) Mean and standard error of valence ratings for images within each assigned discrete emotion category. (D) Mean and standard error of arousal ratings for images within each assigned discrete emotion category.
using Eprime software (Psychology Software Tools, Inc.; Schneider et al., 2002a,b). In addition, Eprime recorded participants' subjective responses via a right-handed button-glove.

Using a block design, expressive faces and IAPS pictures were interleaved with control periods. Images for each specific emotion block (happy, neutral, sad, anger, fear) were identified using the emotional ratings and emotional intensities obtained in the behavioral experiment. Emotion block order was counterbalanced across the entire scanning session. Four emotional stimuli were presented in each face and IAPS picture block. Each image within the block was shown for 4 s with no interstimulus interval. Two gray scale fixation images were presented during control periods. The sequence of face and picture blocks was repeated eight times within each run. Eight runs were acquired. Each stimulus block was repeated in the second half of the experiment; however, while the stimuli within each block were maintained, the block order within each run was counterbalanced.

Participants passively viewed each image and responded via button-press using the right index finger to indicate when a new image appeared on the screen. The reaction time of this response was recorded and used to monitor task performance. In addition, eye movements were monitored with an infrared camera within the display goggles that sampled pupil location at 30 Hz with an accuracy of 1.0° of visual arc (ViewPoint Eyetracker, Arrington Research).

Before scanning, participants were introduced to a brief version of the task, consisting of one block of neutral expressive faces and one block of neutral complex pictures interspersed with fixation. The images displayed in this practice session were not repeated during image acquisition. Immediately following scanning, participants completed a self-paced rating task outside the scanner similar to the procedure of the behavioral experiment. Maintaining image order within each block, participants rated each image on several dimensions: predominant emotion (forced-choice selection between happy, neutral, sad, anger, fear, and disgust), associated emotional intensity (1 = not at all, 5 = extremely), valence (1 = most unpleasant, 5 = neutral, 9 = most pleasant), and arousal (1 = calm, 9 = very excited). The block order was counterbalanced between subjects.

fMRI acquisition

Scanning was performed on a 3.0 T GE Signa System (Milwaukee, WI) using a standard radio frequency coil. A T1-weighted image was acquired for landmark identification to position subsequent scans. After initial acquisition of T1 structural images, functional images were acquired. To minimize susceptibility artifact (Yang et al., 2002b), whole-brain functional scans were acquired using a T2*-weighted reverse spiral sequence with BOLD (blood oxygenation level-dependent) contrast (echo time/TE = 30 ms, repetition time/TR of 2000 ms, frequency of 64 frames, flip angle of 90°, field of view/FOV of 20 cm, 40 contiguous 3 mm oblique axial slices/TR approximately parallel to the AC–PC line). Each run began with 6 'dummy' volumes (subsequently discarded) to allow for T1 equilibration effects. After 8 functional runs were collected, a high-resolution T1 scan was also acquired to provide precise anatomical localization (3D-SPGR, TR of 27 ms, minimum TE, flip angle of 25°, FOV of 24 cm, slice thickness of 1.0 cm, 60 slices/TR). Images were reconstructed off-line using the gridding approach into a 128 × 128 display matrix with an effective spatial resolution of 3 mm isotropic voxels.

Analysis

Participants responded when a new image appeared on the screen in this passive viewing task to monitor on-task performance. To test on-task performance, the number of responses and the reaction times were examined. The number of responses to face, IAPS picture, and fixation images was examined. The reaction times were examined using a 2 (image: face, IAPS picture) × 5 (emotion: happy, neutral, sad, anger, fear) Repeated Measures ANOVA and post hoc analysis. Separate paired t tests were used to test differences between images (face, IAPS picture, and fixation). In addition, paired t tests examined differences between the reaction times during the first and last part of the experiment for each image type.

Preprocessing of eye movement occurred offline, beginning with the identification of eye blinks. Linear interpolation was then performed to correct for missing data points. The standard deviation of the eye position in horizontal and vertical directions was calculated for each stimulus block using MATLAB (Mathworks, Inc., Sherborn, MA). The eye movement data in the horizontal and vertical directions were examined using separate 2 (image: faces, IAPS pictures) × 5 (emotion: happy, neutral, sad, anger, fear) Repeated Measures ANOVAs. Paired t tests examined the differences between faces, IAPS pictures, and fixation images.

The postscan ratings (valence and arousal) were examined using separate 2 (image type: faces, IAPS pictures) × 5 (emotion: happy, neutral, sad, anger, fear) Repeated Measures ANOVAs. Post hoc analysis determined significant main effects of image type and emotion. Paired t tests examined the differences between faces and IAPS pictures in each discrete emotion category.

fMRI analysis

Images were slice-time corrected, realigned, co-registered, normalized, and smoothed according to standard methods. Scans were slice-time corrected using sinc interpolation of the eight nearest neighbors in the time series (Oppenheim and Schafer, 1989) and realigned to the first acquired volume using AIR 3.08 routines (Woods et al., 1998). Additional preprocessing and image analysis of the BOLD signal were performed with Statistical Parametric Mapping (SPM99; Wellcome Institute of Cognitive Neurology, London, UK; www.fil.ion.ucl.ac.uk/spm) implemented in MATLAB. Images were co-registered with the high-resolution SPGR T1 image. This high-resolution image was then spatially normalized to the Montreal Neurological Institute (MNI152) template brain, and the transformation parameters were then applied to the co-registered functional volumes, which were resliced and spatially smoothed with an isotropic 6 mm full-width-half-maximum (FWHM) Gaussian kernel to minimize noise and residual differences in gyral anatomy. Each normalized image set was band pass-filtered (high pass filter = 100 s) to eliminate low frequency signals (Ashburner et al., 1997; Friston et al., 1995). The data were analyzed using a general linear model with parameters corresponding to each specific emotion (happy, neutral, sad, anger, and fear) and image type (expressive faces, IAPS pictures, and fixation images), modeling each run separately. Each stimulus block was convolved with a canonical hemodynamic response function (HRF).

For each participant, parameter estimates of block-related activity were obtained at each voxel. Contrast images were calculated by applying appropriate linear contrasts to the parameter estimates of each block to produce statistical parametric maps
(SPM{t}), which were transformed to a normal distribution (SPM{Z}). Relevant linear contrasts included image type main effects (e.g. Face-Fixation, IAPS-Fixation), specific emotion main effects within each image type (e.g. Happy Face-Fixation, Happy Face – Neutral Face), and emotion × image type interaction effects (e.g. [Emotional Face – Neutral Face] – [Emotional IAPS picture – Neutral IAPS picture], [Happy Face – Neutral Face] – [Happy IAPS picture – Neutral IAPS picture]). To account for interindividual variability, an additional 6-mm smoothing was performed on the contrast images before incorporating the individual contrasts in a random effects analysis.

A second-level random effects analysis used one-sample t tests on smoothed contrast images obtained in each subject for each comparison of interest, treating subjects as a random variable (Friston, 1998). This analysis estimates the error variance for each condition of interest across subjects, rather than across scans, and therefore provides a stronger generalization to the population from which data are acquired. In this random effects analysis, resulting SPM maps (df = 11) were examined in a priori regions (amygdala/sublenticular extended amygdala, hippocampus, STG, insula, ACC, mPFC, vMPFC/OFC). Whole-brain analysis conducts comparisons in a voxel-wise manner, increasing the possibility of false positives unless an appropriate correction for multiple comparisons is used. To restrict the number of comparisons, a Small Volume Correction (SVC) was applied for all activations in a priori regions. SVC was implemented in SPM across two volumes of interest [rectangular box 1: x = 0 ± 70 mm, y = 10 ± 30 mm, z = 5 ± 25 mm; rectangular box 2: x = 0 ± 20 mm, y = 35 ± 35 mm, z = 15 ± 45 mm] defined using the Talairach atlas to isolate central regions (amygdala/SLEA, hippocampus, STG, insula) and anterior midline regions (mPFC, ACC, vMPFC/OFC). Within each SVC, a false discovery rate [FDR] of 0.005 was used to ensure that on average no more than 0.5% of activated voxels for each contrast are expected to be false positive results (Genovese et al., 2002). In addition, activation foci were required to have a cluster size/extent threshold of greater than 5 contiguous voxels. For activation foci detected between modalities (e.g. Faces > IAPS Pictures and IAPS Pictures > Faces), regions activated within each modality that fell just below the cluster threshold are also denoted in the tables.

Results

On-task performance

Participants responded via button-press to 98% of images (100% accuracy to faces, 99% accuracy to IAPS pictures, and 96% accuracy to blanks), confirming on-task performance. Reaction times differed depending on modality [F(1,11) = 17.87, P < 0.001]. Reaction times to faces (638.5 ± 91.8 ms (SEM)) were significantly faster than reaction times to IAPS pictures (859.7 ± 142.3 ms, t(11) = 4.23, P < 0.001). No main effect of specific emotion was detected (P > 0.477).

The first half of the experiment elicited slower reaction times than the second half for all image types [1st half: 807.4 ± 124.4 ms, 2nd half: 686.0 ± 114.7 ms, paired t tests t(11) = 4.91, P < 0.001].

Different lateral eye movement patterns were detected for different stimulus types [image effect: F(1,8) = 8.01, P < 0.018]. IAPS pictures (SD: 0.339 ± 0.079) elicited more eye movement in the horizontal direction compared to eye movements elicited by faces (SD: 0.235 ± 0.065, t(8) = 2.83, P < 0.018) and fixation (SD: 0.210 ± 0.058, t(8) = 1.80, P < 0.101). No differences between specific emotions were detected (P > 0.556). No difference in vertical eye movements between different images was detected (P > 0.319).

Postscan subjective ratings

The stimulus sets were examined to determine the percentage agreement with the predominant emotion standards determined by the behavioral experiment. In general, more agreement was detected in the emotional faces (83.5%) compared to the IAPS pictures (78.9%). In addition, happy images were more consistently identified by participants than any other emotion (Fig. 2).

Similar to Experiment 1, emotional IAPS pictures were rated with higher valence (i.e. more pleasant or more unpleasant) for all specific emotions [image type: F(1,11) = 12.46, P < 0.005, paired t tests: P < 0.005]. Happy and neutral IAPS pictures were rated more positively than happy and neutral faces, respectively. Sad, anger, and fear IAPS pictures were rated more negatively than the sad, anger, and fear faces, respectively [emotion: F(5,55) = 76.73, P < 0.001, emotion × image type interaction: F(5,55) = 48.62, P < 0.001].

Emotional IAPS pictures were more arousing than emotional faces for all specific emotions [image type: F(1,11) = 52.04, P < 0.001; paired t tests: P < 0.001; emotion main effect: F(5,55) = 35.94, P < 0.001, image type × emotion interaction: F(5,55) = 13.88, P < 0.001]. Arousal ratings for neutral IAPS pictures and neutral faces were not significantly different [paired t test: t(11) = 1.74, P < 0.110].

fMRI results

Effects of facial expressions and IAPS pictures
Facial expressions analyzed together (contrast: all faces-fixation) and picture stimuli analyzed together (contrast: all IAPS pictures-fixation) activated a similar network: bilateral amygdala, posterior hippocampus, ventral medial prefrontal cortex, and visual cortex (Table 2, Fig. 3). In addition, dorsomedial prefrontal cortex activated in response to expressive faces [( 3, 57, 33), Z = 3.02, k = 19].

This pattern of activation was consistently present when the different specific emotions (happy, sad, anger, fear, and neutral) were analyzed separately (contrast: specific emotion-fixation, e.g. happy face-fixation). The amygdala was activated in response to all emotional facial expressions and to sad and anger IAPS pictures. With the exception of happy facial expressions, the pattern of dorsomedial prefrontal cortex activation was similar to the amygdala. Hippocampus activated in response to all facial expressions (except happy) and also to all IAPS pictures. Ventromedial prefrontal cortex activated in response to all facial expressions (except happy) and all IAPS pictures (except fear). Visual cortex was activated in response to all facial expressions and all IAPS pictures.

Effects of emotional faces and emotional IAPS pictures
To identify and compare emotionality in these stimulus types, all facial expressions and all IAPS pictures were analyzed relative to neutral (e.g. contrast: emotional faces – neutral faces and [emotional faces – neutral faces] – [emotional IAPS pictures – neutral IAPS
pictures]). The superior temporal gyrus, insula, and anterior cingulate activated in response to emotional faces, and showed greater activity in these regions compared to emotional IAPS pictures. Visual cortex activated in response to emotional pictures, and showed greater activity compared to facial expressions. Of note, the activations to neutral stimuli did not differ in any region other than the visual cortex [IAPS pictures > faces: ( 9, 93, 3), Z = 5.36, k = 656] (Table 3).

Fig. 2. fMRI postscan ratings. Each stimulus set (expressive faces and complex IAPS pictures) was rated on several dimensions. (A) Percentage of participants agreeing with predominant emotion assigned to each image. (B) Mean and standard error of valence (1 = very unpleasant, 5 = neutral, 9 = very pleasant) ratings for images within each assigned discrete emotion category. (C) Mean and standard error of arousal (1 = calm, 9 = excited) ratings for images within each assigned discrete emotion category.

Effects of specific emotional faces and emotional IAPS pictures
To identify the effects of each specific emotion, each specific emotion (happy, sad, anger, and fear) was also analyzed separately (e.g. contrast: happy faces – neutral faces). While amygdala, hippocampus, vMPFC, and visual cortex were commonly activated among faces and pictures when all specific emotions were analyzed together, we observed differential activation in these regions in response to specific emotions, suggesting that some emotions contributed more substantially to these overall results. Amygdala activated in response to neutral stimuli; however, anger faces showed significantly greater amygdala activity than neutral faces. Similarly, while hippocampus activated in response to neutral stimuli, anger and fear stimuli showed significantly greater hippocampal activity than neutral stimuli. Ventromedial prefrontal cortex activated in response to neutral stimuli, and sad and anger faces and anger IAPS pictures showed greater vMPFC activity than neutral faces and pictures, respectively. Visual cortex activated in response to neutral stimuli, and happy, sad, and fear IAPS pictures showed greater visual cortical activity than neutral pictures. Additionally, fear and sad IAPS pictures showed greater visual activity compared to fear and sad faces.

Specific emotions (e.g. contrast: [happy faces – neutral faces] – [happy IAPS pictures – neutral IAPS pictures]) contributed to the overall differences in activation between facial expressions and IAPS pictures in STG, insula, and ACC (Fig. 4). STG was significantly activated in response to all specific emotional faces relative to neutral faces (happy, sad, anger, and fear). All these activations (except sad) were also significantly larger than the corresponding activations elicited by specific emotional IAPS pictures relative to neutral IAPS pictures. Similarly, insula was activated in response to all specific emotional faces (happy at a subthreshold level), and all these activations (except anger) were significantly larger than the corresponding activations elicited by specific emotional IAPS pictures. Anterior cingulate was significantly activated in response to fear and sad (sad at a subthreshold level) facial expressions, and these activations were greater than those elicited by the corresponding IAPS pictures. Anger and sad faces also elicited greater rostral anterior cingulate activity compared to corresponding IAPS pictures (Table 4).

Discussion

In this study, we examined whether expressive faces and IAPS pictures would activate similar brain regions. Analyzed as sets of stimuli, expressive faces and IAPS pictures activated a common pattern of brain regions including the amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex. These stimuli also activated superior temporal gyrus, insula, and anterior cingulate differentially, e.g. more activation in these regions to expressive faces than to IAPS pictures. For the most part, these regions were activated in response to each specific emotion separately; however, some regions responded only to a subset of specific emotions.

Expressive faces and IAPS pictures: common areas of activation

The amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex were activated by both expressive faces and IAPS pictures analyzed as two sets of emotional stimuli,

Table 2
Emotional faces and IAPS pictures activate a similar network relative to fixation
Entries give (x, y, z)^a, Z^b, k^c.

L. amygdala | Faces: ( 21, 6, 18), Z = 4.18, k = 36 | IAPS pictures: ( 21, 6, 15), Z = 3.81, k = 12
R. amygdala | Faces: (24, 6, 15), Z = 4.05, k = 44 | IAPS pictures: (24, 3, 15), Z = 3.06, k = 6
Hippocampus | Faces: ( 24, 30, 3), Z = 3.73, k = 30; (15, 30, 3), Z = 2.90, k = 12 | IAPS pictures: (15, 30, 6), Z = 4.57, k = 207
Ventromedial prefrontal/orbitofrontal cortex | Faces: (0, 45, 24), Z = 4.55, k = 49 | IAPS pictures: (3, 45, 21), Z = 3.65, k = 45
Visual | Faces: (30, 78, 15), Z = 5.68, k = 2559 | IAPS pictures: ( 33, 60, 15), Z = 5.79, k = 5607

^a Stereotactic coordinates from MNI atlas, left/right (x), anterior/posterior (y), and superior/inferior (z), respectively. R = right, L = left.
^b Z score, significant after Small Volume Correction (SVC thresholded using a false discovery rate [FDR] correction for multiple comparisons of 0.005).
^c Spatial extent in cluster size, threshold ≥ 6 voxels.
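The FDR-corrected thresholds cited in these table footnotes follow the false discovery rate procedure of Genovese et al. (2002), i.e. the Benjamini–Hochberg step-up rule applied to the map of voxelwise P values. As an illustrative sketch only (not the authors' actual SPM pipeline), the cutoff can be derived as follows:

```python
import numpy as np

def fdr_threshold(p_values, q=0.005):
    """Benjamini-Hochberg step-up: return the largest p-value cutoff
    such that p_(i) <= (i / m) * q for the sorted p-values p_(1..m)."""
    p = np.sort(np.asarray(p_values, dtype=float).ravel())
    m = p.size
    crit = np.arange(1, m + 1) / m * q
    below = p <= crit
    if not below.any():
        return None  # nothing survives correction
    return float(p[below].max())

# Toy "voxel" map: mostly null p-values plus a few strong effects
rng = np.random.default_rng(0)
p_map = np.concatenate([rng.uniform(size=9990), rng.uniform(high=1e-6, size=10)])
thr = fdr_threshold(p_map, q=0.005)  # voxels with p <= thr are declared active
```

Voxels at or below the returned cutoff are reported; for a small volume correction, the same procedure is simply restricted to the voxels inside the anatomical search region.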

suggesting that these regions are involved in general emotion processing (i.e. not specific to stimulus type or a particular process, recognition vs. induction). Consistent with previous findings, negative emotional faces and IAPS pictures activated the amygdala (Hariri et al., 2000, 2003). In addition, we found amygdala activation to happy emotional faces. Amygdala activation has been reported to positive and negative facial expressions (Breiter et al., 1996; Morris et al., 1996; Somerville et al., 2004) and IAPS pictures (Liberzon et al., 2003); therefore, it is unclear why positive IAPS pictures did not activate the amygdala as well. Emotional faces and IAPS pictures activated the hippocampus, in concert with previous studies (Fried et al., 1997; Lane et al., 1997c). The hippocampus has been shown to be involved in episodic memory and declarative knowledge (Bechara et al., 1995), and with its extensive connections from extrastriate visual areas including fusiform gyrus, the hippocampal activation may reflect contextual memory and visual processing triggered by our stimuli. Negative facial expressions and negative IAPS pictures, with the exception of fear, activated the ventromedial prefrontal cortex. The medial prefrontal cortex is thought to be involved in emotional self-awareness (Lane et al., 1997b) and reexperiencing the "feelings" of one's emotional past (Damasio, 1999). In concert, ventromedial

Fig. 3. Common regions of activation. SPM t map showing activated visual cortex (visual), ventromedial prefrontal cortex (vMPFC), and amygdala (Amy) to
(A) Expressive Faces and (B) IAPS pictures relative to fixation. Posterior hippocampus was also activated (not shown). Activated voxels are displayed with P <
0.005 uncorrected, [k] > 5 voxels threshold.

Table 3
Emotional faces and IAPS pictures activate a different network relative to neutral
Entries give (x, y, z)^a, Z^b, k^c per contrast.

Superior temporal gyrus | Faces: (60, 39, 6), Z = 4.17, k = 284; ( 66, 6, 9), Z = 3.50, k = 113 | Faces > IAPS pictures: (69, 21, 15), Z = 3.22, k = 134; ( 66, 9, 12), Z = 3.64, k = 91
Insular cortex | Faces: ( 39, 18, 9), Z = 3.56, k = 69 | Faces > IAPS pictures: ( 27, 6, 12), Z = 3.14, k = 37
Anterior cingulate | Faces: (0, 30, 30), Z = 4.10, k = 35 | Faces > IAPS pictures: ( 3, 30, 30), Z = 3.64, k = 43
Visual cortex | IAPS pictures: (12, 75, 12), Z = 3.04, k = 9 | IAPS pictures > faces: (36, 78, 0), Z = 3.81, k = 88; (30, 93, 15), Z = 3.24, k = 25

^a Stereotactic coordinates from MNI atlas, left/right (x), anterior/posterior (y), and superior/inferior (z), respectively.
^b Z score, significant after Small Volume Correction (SVC thresholded using a false discovery rate [FDR] correction for multiple comparisons of 0.005).
^c Spatial extent in cluster size, threshold ≥ 6 voxels.

prefrontal lesions lead to deficits in recognizing emotion from facial expressions (Hornak et al., 1996). In addition, ventromedial prefrontal cortical activation to IAPS was modulated by the extent of self-association (Phan et al., 2004); thus, the ventromedial prefrontal cortex activation may reflect personal association. Both expressive faces and IAPS pictures activated the visual cortex, which is expected given the reports of emotional content modulating visual processing. Two components of emotional processing (e.g. arousal and valence) have been shown to contribute to visual cortex activations (Mourao-Miranda et al., 2003), and increased activation in the visual cortex may also reflect the stimulus' significance (Anderson and Phelps, 2001; Pessoa et al., 2002) or increased attention (Lane et al., 1999).

The dorsomedial prefrontal cortex is thought to be involved in general emotional processing (i.e. emotional appraisal/evaluation and emotion regulation) (Phan et al., 2002). Discrete emotions in both types of stimuli activated dorsomedial prefrontal cortex, but this activation was detected in the main effect of expressive faces but not IAPS pictures. The less consistent dorsomedial prefrontal cortex activation to these emotional stimuli might have been due to our choice of a passive viewing task in this study. Including a cognitive task (e.g. rating) has been shown to increase dorsomedial prefrontal cortex activation (Taylor et al., 2003), and while the passive viewing task was chosen so as not to bias the participants towards emotion recognition or emotion induction, it is possible that subjects were labeling the emotion displayed on each face. Although the dMPFC and amygdala activation was consistent among all facial expressions, it was observed that negative pictures showed dMPFC activation when amygdala was activated in those conditions as well. Given the anatomical connections, the co-activation of these two structures has been hypothesized to reflect possible influence of cortical inhibitory control (Ongur and Price, 2000). The MPFC has been implicated in emotion regulation (Levesque et al., 2003; Ochsner et al., 2002; Taylor et al., 2003), extinguished fear (Milad and Quirk, 2002; Milad et al., 2004; Quirk et al., 2003), and cognitive-emotion interactions (Liberzon et al., 2000; Simpson et al., 2000; Taylor et al., 2003). In this study, sad and anger IAPS pictures showed medial prefrontal cortex and amygdala co-activation, suggesting that dorsomedial prefrontal cortex activation may be playing a role in reappraisal of negative emotion (Beauregard et al., 2001; Ochsner et al., 2002; Phan et al., 2005).

Expressive faces and IAPS pictures: differential areas of activation

As a group, expressive faces activated superior temporal gyrus, insula, and anterior cingulate more than IAPS pictures, despite the fact that expressive faces overall were subjectively rated to be lower on valence and arousal. Previous studies suggest that facial expressions can evoke emotion portrayed to the viewer through primitive emotion contagion (Wild et al., 2001). The subjective responses in this study indicate that IAPS pictures are even more potent than facial expressions at inducing changes in the subjective state of emotional valence and arousal. Nevertheless, expressive faces elicited greater activation than IAPS pictures in several regions. Superior temporal gyrus has been shown to be involved in processing variable components of the face such as eye gaze, eyebrows, and mouth gape (Haxby et al., 2000); therefore, it is not surprising that expressive faces would activate this region more than IAPS pictures. The insula has been shown to be involved in processing emotional expression in others (Haxby et al., 2002), and insular projections to inferior prefrontal cortex and amygdala may convey motivation and social information from these stimuli (Critchley et al., 2000). Anterior cingulate has been posited to reflect emotional awareness (Lane et al., 1997a) and cognitive-emotion interactions (Bush et al., 2000, 2002). Generally, the processing differences between emotion types detected in these regions may be partially a reflection of the fact that faces and IAPS pictures differ on novelty and complexity (Winston et al., 2003). If novelty and complexity of the stimuli do contribute, faster habituation to novelty effects in faces and slower habituation to novelty in pictures could explain the significant effect in one modality (e.g. expressive faces) and a lack of effect resulting from sustained activation in the other (e.g. IAPS pictures).

With respect to novelty, faces can be viewed as a relatively unchanging stimulus having consistent facial features (eyes, nose, mouth), despite feature changes (raised brows, gaping mouth, etc.) that depict particular emotional states; whereas each IAPS picture, with complex contextual scenes, is often more unique and novel. Decreased novelty and resulting habituation of responses to neutral facial expressions may lead to detectable activations (Fischer et al., 2003; Wright et al., 2003), whereas sustained novelty (i.e. similar levels of novelty) between emotional and nonemotional IAPS pictures would lend itself to not detecting activation. Some evidence in the literature supports this idea. Several regions, including superior temporal gyrus, insula, and anterior cingulate, have been shown to respond to novel relative to familiar stimuli (Downar et al., 2002; Tulving et al., 1994). Insula activation to fearful faces was detected during early but not later periods, reflecting initial orienting, not sustained processing (Williams et al., 2004). Even though IAPS pictures have been shown to habituate with repeated exposure (Phan et al., 2003), IAPS pictures depict emotion-laden scenes portraying a variety of contexts; thus, novelty in IAPS pictures may show reduced habituation (i.e. sustained activation) effects relative to facial expressions.

Fig. 4. Expressive faces activate anterior cingulate, insula, and superior temporal gyrus more than IAPS pictures. SPM t map showing greater BOLD activity to
expressive faces than IAPS pictures in anterior cingulate (ACC), insular cortex (Ins), and superior temporal gyrus (STG). (A) Happy relative to neutral. (B) Sad
relative to neutral. (C) Fear relative to neutral. Activated voxels are displayed with P < 0.005 uncorrected, [k] > 5 voxels threshold.

With respect to complexity, faces may be processed more automatically, whereas the complex scenes within IAPS pictures may require additional cognitive processing, leading to sustained activation in all IAPS conditions (including neutral) but not in face conditions. Neuroimaging studies involving masked face designs elicit emotional networks despite subjective experience (Whalen et al., 1998), pointing to the automatic processing of facial expressions. In addition, emotions in facial expressions are universally recognized (Ekman, 1992, 1994; Izard, 1994). In support, in our study, expressive faces showed more agreement on discrete emotion labels than IAPS pictures. This finding is consistent with studies reporting high agreement of discrete emotion in facial expressions (Carroll and Russell, 1996; Frank and Stennett, 2001). Significant activation may be more easily detected due to automatic, but relatively transient, processing of facial expressions. On the other hand, subjective reports indicate that IAPS pictures have higher valence and arousal, and increasing intensity may introduce ambiguity. The IAPS pictures have less percent agreement and longer reaction times than expressive faces, which may reflect increased cognitive demands. Processing the context in relation to past experience and acquired knowledge (memories and associations with the emotional stimulus) may require additional cognitive load. Right insula and

Table 4
Activations to specific emotions relative to neutral
Entries give (x, y, z)^a, Z^b, k^c per contrast.

Happy
  Superior temporal gyrus
    Faces: (63, 18, 9), Z = 3.51, k = 58; ( 63, 15, 6), Z = 2.89, k = 8
    Faces > IAPS pictures: (57, 18, 3), Z = 3.56, k = 160; ( 60, 15, 6), Z = 2.73, k = 10
  Insular cortex
    Faces: ( 30, 6, 9), Z = 2.62, k = 2^d
    Faces > IAPS pictures: ( 39, 3, 12), Z = 3.50, k = 161
  Visual cortex
    IAPS pictures: (54, 75, 3), Z = 4.33, k = 78; ( 45, 72, 30), Z = 3.64, k = 8

Sad
  Superior temporal gyrus
    Faces: (63, 27, 12), Z = 3.21, k = 27
  Insular cortex
    Faces: (51, 12, 3), Z = 2.60, k = 3^d; ( 42, 21, 21), Z = 3.53, k = 102
    Faces > IAPS pictures: (54, 12, 6), Z = 3.04, k = 14; ( 39, 24, 0), Z = 2.92, k = 9
  Anterior cingulate
    Faces: (0, 30, 30), Z = 2.52^d, k = 5^d
    Faces > IAPS pictures: ( 3, 36, 30), Z = 3.21, k = 24
  Rostral anterior cingulate
    Faces > IAPS pictures: (0, 30, 12), Z = 3.10, k = 9
  Ventromedial prefrontal/orbitofrontal cortex
    Faces: (9, 51, 9), Z = 3.21, k = 65
  Visual cortex
    IAPS pictures: (0, 72, 24), Z = 3.71, k = 43
    IAPS pictures > faces: (0, 90, 0), Z = 3.74, k = 211; (30, 90, 15), Z = 3.67, k = #

Anger
  R. amygdala
    Faces: (33, 3, 24), Z = 3.01, k = 8
  Hippocampus
    Faces: ( 21, 21, 12), Z = 4.14, k = 62
    IAPS pictures: ( 21, 12, 12), Z = 2.84, k = 10
  Superior temporal gyrus
    Faces: ( 63, 6, 12), Z = 3.73, k = 45; (57, 24, 12), Z = 3.43, k = 80
    Faces > IAPS pictures: ( 60, 6, 12), Z = 3.32, k = 19; (57, 15, 6), Z = 3.36, k = 163
  Insular cortex
    Faces: ( 39, 18, 9), Z = 3.86, k = 307
    IAPS pictures: ( 42, 24, 6), Z = 4.11, k = 47
  Rostral anterior cingulate
    Faces > IAPS pictures: (3, 27, 3), Z = 2.96, k = 7
  Ventromedial prefrontal/orbitofrontal cortex
    Faces: ( 15, 60, 12), Z = 2.95, k = 7
    IAPS pictures: ( 12, 69, 6), Z = 3.07, k = 8

Fear
  L. hippocampus
    Faces: ( 18, 27, 9), Z = 2.96, k = 9
    IAPS pictures: ( 18, 27, 3), Z = 3.91, k = 14
  Superior temporal gyrus
    Faces: (60, 33, 0), Z = 3.99, k = 113
    Faces > IAPS pictures: (66, 33, 9), Z = 3.01, k = 39
  Insular cortex
    Faces: ( 36, 15, 6), Z = 3.61, k = 125; ( 39, 9, 9), Z = 3.15, k = 34
    Faces > IAPS pictures: ( 45, 9, 0), Z = 3.31, k = 27
  Anterior cingulate
    Faces: (3, 18, 21), Z = 3.50, k = 7; (0, 6, 30), Z = 3.45, k = 42
    Faces > IAPS pictures: (0, 30, 30), Z = 2.86, k = 8
  Visual cortex
    IAPS pictures: (9, 90, 3), Z = 3.6, k = 40; (0, 72, 15), Z = 3.16, k = 12
    IAPS pictures > faces: (9, 93, 3), Z = 3.62, k = 28

#, part of larger cluster.
^a Stereotactic coordinates from MNI atlas, left/right (x), anterior/posterior (y), and superior/inferior (z), respectively. R = right, L = left.
^b Z score, significant after Small Volume Correction (SVC thresholded using a false discovery rate [FDR] correction for multiple comparisons of 0.005).
^c Spatial extent in cluster size, threshold ≥ 6 voxels.
^d Subthreshold activations.
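The specific-emotion comparisons in this table are interaction contrasts of the form [emotion faces – neutral faces] – [emotion IAPS pictures – neutral IAPS pictures]. In a general linear model this reduces to a single weight vector applied to the four condition estimates at each voxel; a minimal, hypothetical sketch (condition order and beta values are illustrative, not taken from the study):

```python
import numpy as np

# Condition order: [emotion_face, neutral_face, emotion_iaps, neutral_iaps]
# Interaction contrast: (emotion - neutral) for faces minus (emotion - neutral) for IAPS
c = np.array([1.0, -1.0, -1.0, 1.0])

def contrast_estimate(betas, weights):
    """Per-voxel contrast value: dot product of condition betas with weights."""
    return float(np.asarray(betas, dtype=float) @ weights)

# Toy voxel where the emotion effect is larger for faces (2.0 - 0.5)
# than for IAPS pictures (1.0 - 0.8):
effect = contrast_estimate([2.0, 0.5, 1.0, 0.8], c)  # (2.0 - 0.5) - (1.0 - 0.8) = 1.3
```

A positive value flags "faces > IAPS pictures" for the emotion effect; negating the weights gives the "IAPS pictures > faces" direction.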

anterior cingulate were activated to an explicit evaluation task, suggesting that these regions may be required to associate personal reflections and memories in order to make an evaluation of the stimulus due to increased complexity (Cunningham et al., 2004). Like the case of novelty, a significant effect in one modality (e.g. faces) and a lack of effect in another (e.g. IAPS pictures) may be due to differences in complexity (i.e. automatic vs. effortful response or innate vs. learned associations).

Specific emotions

Specific emotions influenced the subjective ratings and neuroimaging activation patterns. Subjectively, positive emotions were more easily identified. Few labels for positive emotions exist, whereas increased variability in labeling negative emotions results from increased choices. This interpretation is consistent with the fact that positive emotions may be more general, whereas negative

emotions may be more specific (Fredrickson, 2001, 2004). As seen in this study, recognition of anger and fear is often confused for one another. This variability could result from similarly high levels of valence and arousal (Carroll and Russell, 1996; Davis et al., 1995) or the inability to assign agency. Anger and fear may elicit a more intense reaction, and increasing emotional intensity may result in a more complex profile, illustrating the increased difficulty in distinguishing emotions. Variability in the reports may also result because a complementary emotion is activated rather than mimicked (i.e. an angry face makes the observer fearful).

Some specific emotions contributed more substantially to the regions activated by both expressive faces and IAPS pictures. Amygdala activity has been most reported in response to fearful stimuli (Breiter et al., 1996; Downar et al., 2001; Hariri et al., 2003; Morris et al., 1996; Phillips et al., 1997; Whalen et al., 2001). In this study, amygdala activity relative to fixation was detected to fearful faces, like all other specific emotional faces. However, relative to neutral, amygdala activity was detected in response to anger faces only and not fearful faces. In studies examining anger and fear faces only, amygdala activation was significantly greater for fearful faces compared to anger faces; however, these studies did not incorporate additional specific emotions or additional stimulus types (Whalen et al., 2001). Some studies have suggested that this discrepancy may be explained by the inclusion of other conditions that will influence activation within the amygdala (Somerville et al., 2004). Hippocampus activated in response to both types of anger and fear stimuli more significantly than neutral stimuli. Since this hippocampal activation was present in both faces and pictures, it appears that hippocampus may be more responsive to specific emotion rather than stimulus type. Ventromedial prefrontal cortex activated in response to anger stimuli and sad faces. Since no significant difference between modality was detected in this region for sad stimuli, the ventromedial prefrontal cortex, like the hippocampus, seems to respond to specific emotion.

All specific emotions contributed to activations in STG and insula in response to faces, and most specific emotions contributed to the differences in activation patterns seen between expressive faces and IAPS pictures. Happy, anger, and fear emotions contributed to differences between expressive faces and IAPS pictures in the superior temporal gyrus. This finding may reflect sensitivity to detect differences between facial expressions when presented in isolation, rather than the variability introduced by expressions within a greater context as presented in IAPS pictures. Insula activation is typically found when recognizing disgust faces (Phillips et al., 1997); however, happy, sad, and fear contributed to the insula differences between modalities in this study. This finding suggests that the insula may play a role in general emotional processing with respect to specific emotion (Phan et al., 2002), but also points to its preference for processing faces.

A subset of negative emotions contributed to the anterior cingulate activation differences between modalities. For more dorsal regions of the anterior cingulate, the specific emotions of sad and fear showed preferential processing of faces, both activating and showing a significant difference compared to IAPS pictures. In previous studies, when a rating task was compared to a perceptual matching task, ACC activation was detected (Hariri et al., 2003). Thus, this ACC activation may indicate that the negative emotions in faces are being evaluated to a greater extent compared to IAPS pictures. For more rostral regions, sad and anger faces show greater activity than IAPS pictures. While dorsal anterior cingulate showed activation to faces and neither activation nor deactivation to pictures, the rostral anterior cingulate showed a differential activity pattern. From region of interest (ROI) analysis, we noted that sad and anger faces tend to activate while sad and anger pictures tend to deactivate in the rostral anterior cingulate cortex. Even though the rACC is implicated in self-induced sadness and depression (Mayberg, 1997; Mayberg et al., 1999), and it may not be surprising to find sadness differentially activated by faces, one must take caution in interpreting these results given that the findings within each modality are nonsignificant. Overall, though, these findings suggest that an interaction between specific emotion and emotion type influences activation within anterior cingulate regions.

Several limitations should be noted when interpreting the results of this study. First, expressive faces and IAPS pictures were balanced in terms of the predominant emotion, but valence and arousal varied; expressive faces had lower valence and arousal. This result, however, offers an advantage in analyzing differential results because neural activation patterns showed greater activity to expressive faces despite lower arousal ratings. Secondly, forced choice methods to determine specific emotions may reflect response bias and demand characteristics; however, this bias is present in both IAPS and expressive faces. Even though the presence of mixed emotions was minimized, this method may not have completely eliminated this effect. Thirdly, the two stimulus sets may be unbalanced with respect to complexity, intentionality, and sociality. Despite these differences, critical comparisons attempted to "subtract out" the effects of stimulus type by comparing the emotional stimulus relative to its neutral (e.g. emotional face – neutral face and emotional IAPS – neutral IAPS), isolating the contribution of emotionality above and beyond the processing of the stimulus properties contained within each. Even though we attempted to "subtract out" effects of stimulus type, the expressive face set may be more balanced due to the cohesive properties of faces but more susceptible to habituation effects, whereas IAPS pictures may have increased variability due to contextual differences but may be more resistant to habituation effects. Future investigations are needed to tease apart these components. Additionally, the faces, both emotional and neutral, may be more inherently social than the IAPS pictures given their role in social communication. Alternatively, IAPS pictures may be characterized by more variable interpersonal interactions. Future studies need to determine how sociality influences these neural activation patterns. Next, the limited number of TR volumes collected per emotion condition may have yielded low power, resulting in a failure to detect additional differences. However, it should be noted that even at similar levels of power the differences between facial expressions and IAPS pictures were detected. Finally, this analysis assumed a canonical hemodynamic response function; however, emotions elicited by facial expressions and IAPS pictures may have different temporal dynamics (Siegle et al., 2002), and this warrants further exploration.

In summary, this study begins to elucidate the underlying functional regions that are common and different among emotional stimulus types that emphasize emotion recognition or emotion evocation, in direct comparison in the same subjects. Even though expressive faces may predominantly involve emotion recognition and IAPS pictures predominantly involve emotion evocation, both expressive faces and IAPS pictures recruit similar brain regions, reflected in a common pattern of activation which included amygdala, hippocampus, ventromedial prefrontal cortex, and visual

cortex. This common activation pattern further confirms the role these regions have in general emotional processing. Some brain regions, however, respond preferentially to a particular emotional stimulus type. In this study, a differential pattern of activation was detected in superior temporal gyrus, insula, and anterior cingulate, with more activation to expressive faces compared to IAPS pictures. Inherent properties unique to the specific emotional stimuli (e.g. novelty, complexity, sociality) may have yielded the differential pattern of brain activation. In addition, the effects of specific emotions and their interactions with stimulus types may also be contributing to these differential patterns. These findings may aid in determining the optimal stimulus selection for probing general emotion processing, emotion recognition, emotion induction, and specific emotions, although further replication using other emotional stimulus probes (e.g. Ekman faces, evocative films) is needed.

Acknowledgments

We wish to thank Ruben Gur and his colleagues at the University of Pennsylvania for graciously sharing with us their stimulus set of facial expressions, and Margaret Bradley, Peter Lang, and the NIMH Center for the Study of Emotion and Attention (CSEA) at the University of Florida for providing us with the set of IAPS pictures.

Supported by Veterans Education and Research Association of Michigan and National Institutes of Mental Health (NIMH): National Research Service Award (NRSA), F31MH069003 to JCB.

References

Adolphs, R., Baron-Cohen, S., Tranel, D., 2002. Impaired recognition of social emotions following amygdala damage. J. Cogn. Neurosci. 14 (8), 1264 – 1274.
Anderson, A.K., Phelps, E.A., 2001. Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature 411 (6835), 305 – 309.
Ashburner, J., Neelin, P., Collins, D.L., Evans, A., Friston, K., 1997. Incorporating prior knowledge into image registration. NeuroImage 6 (4), 344 – 352.
Beauregard, M., Levesque, J., Bourgouin, P., 2001. Neural correlates of conscious self-regulation of emotion. J. Neurosci. 21 (18), RC165.
Bechara, A., Tranel, D., Damasio, H., Adolphs, R., Rockland, C., Damasio, A.R., 1995. Double dissociation of conditioning and declarative knowledge relative to the amygdala and hippocampus in humans. Science 269 (5227), 1115 – 1118.
Blair, R.J., Morris, J.S., Frith, C.D., Perrett, D.I., Dolan, R.J., 1999. Dissociable neural responses to facial expressions of sadness and anger. Brain 122 (Pt. 5), 883 – 893.
Breiter, H.C., Etcoff, N.L., Whalen, P.J., Kennedy, W.A., Rauch, S.L., Buckner, R.L., et al., 1996. Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17 (5), 875 – 887.
Bush, G., Luu, P., Posner, M.I., 2000. Cognitive and emotional influences in anterior cingulate cortex. Trends Cogn. Sci. 4 (6), 215 – 222.
Bush, G., Vogt, B.A., Holmes, J., Dale, A.M., Greve, D., Jenike, M.A., et al., 2002. Dorsal anterior cingulate cortex: a role in reward-based decision making. Proc. Natl. Acad. Sci. U. S. A. 99 (1), 523 – 528.
Calder, A.J., Burton, A.M., Miller, P., Young, A.W., Akamatsu, S., 2001. A principal component analysis of facial expressions. Vision Res. 41 (9), 1179 – 1208.
Carroll, J.M., Russell, J.A., 1996. Do facial expressions signal specific emotions?: judging emotion from the face in context. J. Pers. Soc. Psychol. 70 (2), 205 – 218.
Clark, B.M., Siddle, D.A., Bond, N.W., 1992. Effects of social anxiety and facial expression on habituation of the electrodermal orienting response. Biol. Psychol. 33 (2 – 3), 211 – 223.
Critchley, H., Daly, E., Phillips, M., Brammer, M., Bullmore, E., Williams, S., et al., 2000. Explicit and implicit neural mechanisms for processing of social information from facial expressions: a functional magnetic resonance imaging study. Hum. Brain Mapp. 9 (2), 93 – 105.
Cunningham, W.A., Raye, C.L., Johnson, M.K., 2004. Implicit and explicit evaluation: FMRI correlates of valence, emotional intensity, and control in the processing of attitudes. J. Cogn. Neurosci. 16 (10), 1717 – 1729.
Damasio, A.R., 1999. The Feeling of What Happens: Body and Emotion in the Making of Consciousness, first ed. Harcourt Brace, New York.
Darwin, C., 1998. The Expression of the Emotions in Man and Animal, third ed. Oxford Univ. Press, New York.
Davidson, R.J., Jackson, D.C., Kalin, N.H., 2000. Emotion, plasticity, context, and regulation: perspectives from affective neuroscience. Psychol. Bull. 126 (6), 890 – 909.
Davis, W.J., Rahman, M.A., Smith, L.J., Burns, A., Senecal, L., McArthur, D., et al., 1995. Properties of human affect induced by static color slides (IAPS): dimensional, categorical and electromyographic analysis. Biol. Psychol. 41 (3), 229 – 253.
Dolan, R.J., Fletcher, P., Morris, J., Kapur, N., Deakin, J.F., Frith, C.D., 1996. Neural activation during covert processing of positive emotional facial expressions. NeuroImage 4 (3 Pt. 1), 194 – 200.
Downar, J., Crawley, A.P., Mikulis, D.J., Davis, K.D., 2001. The effect of task relevance on the cortical response to changes in visual and auditory stimuli: an event-related fMRI study. NeuroImage 14 (6), 1256 – 1267.
Downar, J., Crawley, A.P., Mikulis, D.J., Davis, K.D., 2002. A cortical network sensitive to stimulus salience in a neutral behavioral context across multiple sensory modalities. J. Neurophysiol. 87 (1), 615 – 620.
Ekman, P., 1992. Are there basic emotions? Psychol. Rev. 99 (3), 550 – 553.
Ekman, P., 1994. Strong evidence for universals in facial expressions: a reply to Russell's mistaken critique. Psychol. Bull. 115 (2), 268 – 287.
Esteves, F., Ohman, A., 1993. Masking the face: recognition of emotional facial expressions as a function of the parameters of backward masking. Scand. J. Psychol. 34 (1), 1 – 18.
Fischer, H., Wright, C.I., Whalen, P.J., McInerney, S.C., Shin, L.M., Rauch, S.L., 2003. Brain habituation during repeated exposure to fearful and neutral faces: a functional MRI study. Brain Res. Bull. 59 (5), 387 – 392.
Frank, M.G., Stennett, J., 2001. The forced-choice paradigm and the perception of facial expressions of emotion. J. Pers. Soc. Psychol. 80 (1), 75 – 85.
Fredrickson, B.L., 2001. The role of positive emotions in positive psychology. The broaden-and-build theory of positive emotions. Am. Psychol. 56 (3), 218 – 226.
Fredrickson, B.L., 2004. The broaden-and-build theory of positive emotions. Philos. Trans. R. Soc. London, Ser. B Biol. Sci. 359 (1449), 1367 – 1378.
Fried, I., MacDonald, K.A., Wilson, C.L., 1997. Single neuron activity in human hippocampus and amygdala during recognition of faces and objects. Neuron 18 (5), 753 – 765.
Friston, K.J., 1998. Generalisability, random effects and population inference. NeuroImage 7, S754.
Friston, K.J., Holmes, A.P., Worsley, K.J., Poline, J.B., Frith, C.D., Frackowiak, R.S., 1995. Statistical parametric maps in functional imaging: a general linear approach. Hum. Brain Mapp. 2, 189 – 210.
Geday, J., Gjedde, A., Boldsen, A.S., Kupers, R., 2003. Emotional valence modulates activity in the posterior fusiform gyrus and inferior medial prefrontal cortex in social perception. NeuroImage 18 (3), 675 – 684.
Genovese, C.R., Lazar, N.A., Nichols, T., 2002. Thresholding of statistical maps in functional neuroimaging using the false discovery rate. NeuroImage 15 (4), 870 – 878.
Gur, R.C., Schroeder, L., Turner, T., McGrath, C., Chan, R.M., Turetsky, B.I., et al., 2002. Brain activation during facial emotion processing. NeuroImage 16 (3 Pt. 1), 651 – 662.
Halberstadt, J.B., Niedenthal, P.M., 1997. Emotional state and the use

of stimulus dimensions in judgment. J. Pers. Soc. Psychol. 72 (5), 1017 – 1033.
Hariri, A.R., Bookheimer, S.Y., Mazziotta, J.C., 2000. Modulating emotional responses: effects of a neocortical network on the limbic system. NeuroReport 11 (1), 43 – 48.
Hariri, A.R., Mattay, V.S., Tessitore, A., Fera, F., Weinberger, D.R., 2003. Neocortical modulation of the amygdala response to fearful stimuli. Biol. Psychiatry 53 (6), 494 – 501.
Hatfield, E., Cacioppo, J.T., Rapson, R.L., 1992. Primitive emotional contagion. Rev. Person. Soc. Psychol. 14, 151 – 177.
Haxby, J.V., Petit, L., Ungerleider, L.G., Courtney, S.M., 2000. Distinguishing the functional roles of multiple regions in distributed neural systems for visual working memory. NeuroImage 11 (5 Pt. 1), 380 – 391.
Haxby, J.V., Hoffman, E.A., Gobbini, M.I., 2002. Human neural systems for face recognition and social communication. Biol. Psychiatry 51 (1), 59 – 67.
Hornak, J., Rolls, E.T., Wade, D., 1996. Face and voice expression identification in patients with emotional and behavioural changes following ventral frontal lobe damage. Neuropsychologia 34 (4), 247 – 261.
Milad, M.R., Quirk, G.J., 2002. Neurons in medial prefrontal cortex signal memory for fear extinction. Nature 420 (6911), 70 – 74.
Milad, M.R., Vidal-Gonzalez, I., Quirk, G.J., 2004. Electrical stimulation of medial prefrontal cortex reduces conditioned fear in a temporally specific manner. Behav. Neurosci. 118 (2), 389 – 394.
Morris, J.S., Frith, C.D., Perrett, D.I., Rowland, D., Young, A.W., Calder, A.J., 1996. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383 (6603), 812 – 815.
Morris, J.S., Friston, K.J., Buchel, C., Frith, C.D., Young, A.W., Calder, A.J., et al., 1998. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121 (Pt. 1), 47 – 57.
Mourao-Miranda, J., Volchan, E., Moll, J., de Oliveira-Souza, R., Oliveira, L., Bramati, I., et al., 2003. Contributions of stimulus valence and arousal to visual activation during emotional perception. NeuroImage 20 (4), 1955 – 1963.
Narumoto, J., Okada, T., Sadato, N., Fukui, K., Yonekura, Y., 2001. Attention to emotion modulates fMRI activity in human right superior temporal sulcus. Brain Res. Cogn. Brain Res. 12 (2), 225 – 231.
Ochsner, K.N., Bunge, S.A., Gross, J.J., Gabrieli, J.D., 2002. Rethinking feelings: an FMRI study of the cognitive regulation of emotion. J. Cogn. Neurosci. 14 (8), 1215 – 1229.
Izard, C.E., 1994. Innate and universal facial expressions: evidence
from developmental and cross-cultural research. Psychol. Bull. 115 Ongur, D., Price, J.L., 2000. The organization of networks within the orbital
(2), 288 – 299. and medial prefrontal cortex of rats, monkeys and humans. Cereb.
Killgore, W.D., Yurgelun-Todd, D.A., 2004. Activation of the amygdala Cortex 10 (3), 206 – 219.
and anterior cingulate during nonconscious processing of sad versus Oppenheim, A., Schafer, R., 1989. Discrete-Time Signal Processing.
happy faces. NeuroImage 21 (4), 1215 – 1223. Englewood Cliffs, Prentice Hall, NJ.
Kim, H., Somerville, L.H., Johnstone, T., Alexander, A.L., Whalen, P.J., Pessoa, L., Kastner, S., Ungerleider, L.G., 2002. Attentional control of the
2003. Inverse amygdala and medial prefrontal cortex responses to processing of neural and emotional stimuli. Brain Res. Cogn. Brain Res.
surprised faces. NeuroReport 14 (18), 2317 – 2322. 15 (1), 31 – 45.
Lane, R.D., Fink, G.R., Chau, P.M., Dolan, R.J., 1997a. Neural activation Phan, K.L., Wager, T., Taylor, S.F., Liberzon, I., 2002. Functional
during selective attention to subjective emotional responses. Neuro- neuroanatomy of emotion: a meta-analysis of emotion activation studies
Report 8 (18), 3969 – 3972. in PET and fMRI. NeuroImage 16 (2), 331 – 348.
Lane, R.D., Reiman, E.M., Ahern, G.L., Schwartz, G.E., Davidson, R.J., Phan, K.L., Liberzon, I., Welsh, R.C., Britton, J.C., Taylor, S.F., 2003.
1997b. Neuroanatomical correlates of happiness, sadness, and disgust. Habituation of rostral anterior cingulate cortex to repeated emotionally
Am. J. Psychiatry 154 (7), 926 – 933. salient pictures. Neuropsychopharmacology 28 (7), 1344 – 1350.
Lane, R.D., Reiman, E.M., Bradley, M.M., Lang, P.J., Ahern, G.L., Phan, K.L., Taylor, S.F., Welsh, R.C., Ho, S.H., Britton, J.C., Liberzon, I.,
Davidson, R.J., et al., 1997c. Neuroanatomical correlates of pleasant 2004. Neural correlates of individual ratings of emotional salience: a
and unpleasant emotion. Neuropsychologia 35 (11), 1437 – 1444. trial-related fMRI study. NeuroImage 21 (2), 768 – 780.
Lane, R.D., Chua, P.M., Dolan, R.J., 1999. Common effects of emotional Phan, K.L., Fitzgerald, D.A., Nathan, P.J., Moore, G.J., Uhde, T.W., Tancer,
valence, arousal and attention on neural activation during visual M.E., 2005. Neural substrates for voluntary suppression of negative
processing of pictures. Neuropsychologia 37 (9), 989 – 997. affect: a functional magnetic resonance imaging study. Biol. Psychiatry
Lang, P.J., Greenwald, M.K., Bradley, M.M., Hamm, A.O., 1993. Looking 57 (3), 210 – 219.
at pictures: affective, facial, visceral, and behavioral reactions. Phillips, M.L., Young, A.W., Senior, C., Brammer, M., Andrew, C., Calder,
Psychophysiology 30 (3), 261 – 273. A.J., et al., 1997. A specific neural substrate for perceiving facial
Lang, P.J., Bradley, M.M., Cuthbert, B.N. (1997). International Affective expressions of disgust. Nature 389 (6650), 495 – 498.
Picture System (IAPS): Technical Manual and Affective Ratings. Quirk, G.J., Likhtik, E., Pelletier, J.G., Pare, D., 2003. Stimulation of
Gainesville, FL: NIMH Center for the Study of Emotion and Attention, medial prefrontal cortex decreases the responsiveness of central
University of Florida. amygdala output neurons. J. Neurosci. 23 (25), 8800 – 8807.
LeDoux, J.E., 2000. Emotion circuits in the brain. Annu. Rev. Neurosci. 23, Schneider, W., Eschman, A., Zuccolotto, A., 2002a. E-Prime Reference
155 – 184. Guide. Psychology Software Tools, Inc., Pittsburgh.
Levesque, J., Joanette, Y., Mensour, B., Beaudoin, G., Leroux, J.M., Schneider, W., Eschman, A., Zuccolotto, A., 2002b. E-Prime User’s Guide.
Bourgouin, P., et al., 2003. Neural correlates of sad feelings in healthy Psychology Software Tools, Inc., Pittsburgh.
girls. Neuroscience 121 (3), 545 – 551. Sheehan, D., Janavs, J., Baker, R., Harnett-Sheehan, K., Knapp, E.,
Liberzon, I., Taylor, S.F., Fig, L.M., Decker, L.R., Koeppe, R.A., Sheehan, M., 1998. Mini International Neuropsychiatric Interview.
Minoshima, S., 2000. Limbic activation and psychophysiologic English Version 5.0.0, DSM-IV.
responses to aversive visual stimuli. Interaction with cognitive task. Siegle, G.J., Steinhauer, S.R., Thase, M.E., Stenger, V.A., Carter, C.S.,
Neuropsychopharmacology 23 (5), 508 – 516. 2002. Can’t shake that feeling: event-related fMRI assessment of
Liberzon, I., Phan, K.L., Decker, L.R., Taylor, S.F., 2003. Extended sustained amygdala activity in response to emotional information in
amygdala and emotional salience: a PET activation study of depressed individuals. Biol. Psychiatry 51 (9), 693 – 707.
positive and negative affect. Neuropsychopharmacology 28 (4), Simpson, J.R., Ongur, D., Akbudak, E., Conturo, T.E., Ollinger, J.M.,
726 – 733. Snyder, A.Z., et al., 2000. The emotional modulation of cognitive
Mayberg, H.S., 1997. Limbic-cortical dysregulation: a proposed model of processing: an fMRI study. J. Cogn. Neurosci. 12 (Suppl. 2),
depression. J. Neuropsychiatry Clin. Neurosci. 9 (3), 471 – 481. 157 – 170.
Mayberg, H.S., Liotti, M., Brannan, S.K., McGinnis, S., Mahurin, R.K., Somerville, L.H., Kim, H., Johnstone, T., Alexander, A.L., Whalen, P.J.,
Jerabek, P.A., et al., 1999. Reciprocal limbic-cortical function and 2004. Human amygdala responses during presentation of happy and
negative mood: converging PET findings in depression and normal neutral faces: correlations with state anxiety. Biol. Psychiatry 55 (9),
sadness. Am. J. Psychiatry 156 (5), 675 – 682. 897 – 903.
J.C. Britton et al. / NeuroImage 31 (2006) 906 – 919 919

Taylor, S.F., Phan, K.L., Decker, L.R., Liberzon, I., 2003. Subjective rating of emotionally salient stimuli modulates neural activity. NeuroImage 18 (3), 650–659.
Tulving, E., Markowitsch, H.J., Kapur, S., Habib, R., Houle, S., 1994. Novelty encoding networks in the human brain: positron emission tomography data. NeuroReport 5 (18), 2525–2528.
Whalen, P.J., Rauch, S.L., Etcoff, N.L., McInerney, S.C., Lee, M.B., Jenike, M.A., 1998. Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J. Neurosci. 18 (1), 411–418.
Whalen, P.J., Shin, L.M., McInerney, S.C., Fischer, H., Wright, C.I., Rauch, S.L., 2001. A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion 1 (1), 70–83.
Wild, B., Erb, M., Bartels, M., 2001. Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: quality, quantity, time course and gender differences. Psychiatry Res. 102 (2), 109–124.
Williams, L.M., Brown, K.J., Das, P., Boucsein, W., Sokolov, E.N., Brammer, M.J., 2004. The dynamics of cortico-amygdala and autonomic activity over the experimental time course of fear perception. Brain Res. Cogn. Brain Res. 21 (1), 114–123.
Winston, J.S., O'Doherty, J., Dolan, R.J., 2003. Common and distinct neural responses during direct and incidental processing of multiple facial emotions. NeuroImage 20 (1), 84–97.
Woods, R.P., Grafton, S.T., Watson, J.D., Sicotte, N.L., Mazziotta, J.C., 1998. Automated image registration: II. Intersubject validation of linear and nonlinear models. J. Comput. Assist. Tomogr. 22 (1), 153–165.
Wright, C.I., Martis, B., Schwartz, C.E., Shin, L.M., Fischer, H.H., McMullin, K., et al., 2003. Novelty responses and differential effects of order in the amygdala, substantia innominata, and inferior temporal cortex. NeuroImage 18 (3), 660–669.
Yang, T.T., Menon, V., Eliez, S., Blasey, C., White, C.D., Reid, A.J., et al., 2002a. Amygdalar activation associated with positive and negative facial expressions. NeuroReport 13 (14), 1737–1741.
Yang, Y., Gu, H., Zhan, W., Xu, S., Silbersweig, D.A., Stern, E., 2002b. Simultaneous perfusion and BOLD imaging using reverse spiral scanning at 3T: characterization of functional contrast and susceptibility artifacts. Magn. Reson. Med. 48 (2), 278–289.