
Affective Non-Verbal Expressions:

Affective Interaction
Nadia Berthouze

Affective behaviour

Verbal
• Lexicon: words
• Grammar: part-of-speech, dependencies
• Meaning: discourse acts, …
• Vocal prosody: volume, intonation, tempo, …

Non-verbal (individual)
• Gestures: head gestures, eye gestures, arm gestures, …
• Body language: body posture, proxemics, touch
• Eye: head gaze, gaze direction
• Facial expressions: 32 action units, smile, frowning, …

Social interactions
• Turn-taking
• Common ground
• Backchannel
• Relative distance
• Mimicry …

Macro behaviour
• Amount of social interaction (e.g., phone calls)
• Amount of functional and social activity
• Distance from known places
From taxonomy to building blocks of expression
Earlier research: build a taxonomy of affective expressions
– Identify stereotypical or full-blown expressions
• How is happiness expressed? How is anger expressed?
• What does a fake smile look like compared to a real one?
– Provide direct design directions
• E.g., expressive robots/avatars
– Capture only the tip of the affective iceberg
• Automatic expression recognition fails in real-life applications
• Robots and avatars become very repetitive
Current approaches: use findings as building blocks for understanding and designing expressions
– What parts of the body and what aspects of movement should be explored?
– What factors should be considered when designing expressions?
Future research: technology learning directly from/with humans

Facial Expressions

Facial Action Coding System (FACS)
• A system to build a taxonomy of facial movements
• 32 AUs (& more): contraction or relaxation of one or more muscles.

Breuer et al. 2017 arXiv:1705.01842v2

• Examples:
– Insincere, voluntary smile: contraction of the zygomatic major alone
– Sincere, involuntary (Duchenne) smile: contraction of the zygomatic major and the inferior part of the orbicularis oculi.

Hager, J.C., Ekman, P., Friesen, W.V. (2002). Facial Action Coding System.
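As an illustrative sketch (not from the slides), expressions can be represented as sets of active FACS Action Units, and the smile distinction above becomes a simple rule: AU12 (lip corner puller, zygomatic major) alone versus AU12 together with AU6 (cheek raiser, orbicularis oculi).

```python
# Illustrative sketch: expressions as sets of FACS Action Units.
# AU12 = lip corner puller (zygomatic major);
# AU6  = cheek raiser (orbicularis oculi, pars orbitalis).

def classify_smile(active_aus: set) -> str:
    """Label a smile from the set of active AU numbers."""
    if 12 not in active_aus:
        return "no smile"
    if 6 in active_aus:
        return "Duchenne (sincere) smile"   # zygomatic major + orbicularis oculi
    return "non-Duchenne (posed) smile"     # zygomatic major alone

print(classify_smile({12}))      # zygomatic major only
print(classify_smile({6, 12}))   # with cheek raiser
```

A real AU detector outputs graded intensities rather than binary activations, but the set representation captures how FACS serves as a building block for expression analysis.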
Limitations

• FACS ≠ emotions
– It does not capture the variability of everyday expressions
– Constructivist approach: facial expressions are affected by context, the evaluation of the context, etc.
• Emotional expressions do exist, but they are not so predefined.
• The field is moving away from standardization and embracing variability
– AUs are used as the building blocks of facial expressions
What do automatic recognition systems base their assessment on?

AU-based:
• Facial point detection
• AU estimation

Beyond AUs:
• Texture of the face – related to AUs but more (e.g., colour changes due to changes in blood flow)

Shan Li, 2018
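The AU-based route above can be sketched as a pipeline from estimated AU activations to an emotion label. The AU-to-emotion prototype table below is illustrative only (loosely following common AU combinations), not a validated mapping, and all names are my own.

```python
# Hypothetical sketch: map a set of estimated active AUs to the emotion
# prototype sharing the most AUs. The prototype table is illustrative.

EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "surprise": {1, 2, 5, 26},   # brow raisers, upper lid raiser, jaw drop
    "anger": {4, 5, 7, 23},      # brow lowerer, lid/lip tighteners
}

def infer_emotion(active_aus: set) -> str:
    """Pick the prototype with the largest AU overlap (ties -> first listed)."""
    def overlap(label):
        return len(EMOTION_PROTOTYPES[label] & active_aus)
    best = max(EMOTION_PROTOTYPES, key=overlap)
    return best if overlap(best) > 0 else "unknown"

print(infer_emotion({6, 12}))
```

The brittleness of such fixed prototype matching is exactly the limitation raised above: everyday expressions vary with context, so systems that go beyond AU templates tend to generalise better.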
Designing avatars' expressions
(towards naturalism: beyond just stereotypical smiles)

"How a virtual agent should smile?", Ochs, Niewiadomski, Pelachaud, IVA 2010

Learning to express emotions
(towards naturalism: beyond just stereotypical smiles)

• Video

Einstein robot: 30 facial muscles. As a baby would, it learns to make expressions through machine learning (Javier Movellan's team, ICDL 2009).
– How does a child learn? Rewarding, imitation, …
Take-home questions

• What types of behaviour can be used to infer people's affective states?
• What is FACS? And what is an AU?
• Why is FACS not sufficient to explain all facial expressions?
• How is FACS used to
– Inform the design of automatic emotion recognition systems? And why are datasets an issue?
– Inform the design of social avatars and robots?
Is it all about facial expressions?

Here is the answer:
The illusory facial expressions: the body counts (2)

[Figure: the same person shown as Face+Body, Body only, and Face only; W = winner – positive valence]

Did you get it right?

Aviezer, Trope, Todorov (2012). Body Cues, Not Facial Expressions, Discriminate Between Intense Positive and Negative Emotions. Science, 338, 1225–1229.
Exercise 3: Who lost the game?
What body cues give it away?

• Video

Replay the video as many times as you need.

Exercise 2: Body expressions

• How does your body express how you feel when studying
– Alone at home or in a coffee shop:
• Bored, interested/engaged, frustrated
– During a discussion with peers:
• Engaged and excited, frustrated, disengaged

• What are the typical body patterns that allow others to understand your state?
Body expression models

Laban Movement Analysis: major components
(von Laban, 1928)

• Body: which body parts are involved, where the movements are initiated, and how the movement spreads.
• Space: how large is the mover's kinesphere.
• Shape: changing forms that the body makes in space (body configuration).
• Effort: dynamic qualities of movement / inner attitude towards using energy.
• Relationship: modes of interaction with oneself, others and the environment (facing, contact, …).

Side notes:
– Upward movement as expression of positive intention (Boone et al., 1998)
– Relation between people
– Body contraction: sadness, fear
EFFORT

Weight: light – strong
– Feather movement
– Punching

Space: indirect – direct
– Waving away bugs
– Pointing to a particular spot

Time: sustained – sudden
– Stretching
– Grabbing a falling object

Flow: free – bound
– Waving widely
– Moving in slow motion

SHAPE

Horizontal: spreading – enclosing
– Opening arms to embrace
– Clasping someone in a hug

Vertical: rising – sinking
– Reaching for something on a high shelf
– Stamping the floor with indignation

Sagittal: advancing – retreating
– Reaching out to shake hands
– Avoiding a punch
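The Effort factors above are bipolar qualities, so one simple computational encoding (a sketch of my own, not a standard Laban library) is a value in [-1, 1] per factor, with the two poles at the extremes. The prototype readings for "punch" and "stroke" below are hypothetical.

```python
# Illustrative sketch: Laban Effort qualities as values in [-1, 1],
# where -1 and +1 are the two poles of each factor listed above.
from dataclasses import dataclass

@dataclass
class Effort:
    weight: float   # -1 = light     ... +1 = strong
    space: float    # -1 = indirect  ... +1 = direct
    time: float     # -1 = sustained ... +1 = sudden
    flow: float     # -1 = free      ... +1 = bound

# Hypothetical prototype readings for two of the example movements:
punch = Effort(weight=+0.9, space=+0.8, time=+0.9, flow=+0.3)
stroke = Effort(weight=-0.8, space=-0.2, time=-0.9, flow=-0.6)

def describe(e: Effort) -> list:
    """Name the nearer pole of each factor."""
    poles = [("light", "strong"), ("indirect", "direct"),
             ("sustained", "sudden"), ("free", "bound")]
    values = [e.weight, e.space, e.time, e.flow]
    return [lo if v < 0 else hi for (lo, hi), v in zip(poles, values)]

print(describe(punch))   # ['strong', 'direct', 'sudden', 'bound']
```

Continuous values rather than binary poles matter for the design uses that follow: robot and avatar expressivity is tuned by varying these qualities gradually, not by switching between extremes.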
Taxonomy of body expressions

• Mapping body features to affective states

A. Kleinsmith et al. (2013). Affective Body Expression Perception and Recognition: A Survey. IEEE Trans. on Affective Computing.
Exercise 2: Exploring robot expressive behaviour
Play the videos and analyse the movement using the Laban notation.

Geppetto: a tool for exploring robot affective behaviour parameters (Laban-related kinematics and shape parameters).

An example where we can see Laban movement attributes – relationship and effort: following the person through a flowing, slow movement creates a positive and calm atmosphere and a state of being seen.

https://www.autodesk.com/research/publications/geppetto
https://dl.acm.org/doi/abs/10.1145/3313831.3376411
Take-home questions: body expressions

• What is the Laban notation system?
– And what are its components?

• How does Laban inform the design of robots'/avatars' affective body behaviour?

• What body features can be used to build automatic affect recognition models?
– What sensors could be used to track the body?
Models of affective touch

Affective Touch
Haptic sensations

Haptic sensations (skin receptors):
• Mechanoreceptors (pressure, vibration, texture):
– Aβ fibres: discriminative receptors
– CT fibres: hedonic receptors
• Thermoreceptors (warm and cold receptors)
• Itch receptors
• Nociceptors (pain)

Kinaesthetic sensations (proprioceptive sensations):
• Kinaesthetic receptors (body position and movement): muscle spindles, tendon mechanoreceptors, …
Affective and social touch models
• Low-level dimensions (Hertenstein et al., 2006–2009)
– Touch types
– Body parts
– Kinematics: speed, duration, intensity

Emotion      Touch types most frequently used
Anger        Hit, Pat, Push, Shake, Squeeze
Fear         Contact, Lift, Press, Shake, Squeeze
Happiness    Hug, Lift, Pat, Shake, Squeeze, Swing
Sadness      Contact, Hug, Lift, Nuzzle, Rub, Squeeze, Stroke
Disgust      Contact, Kick, Lift, Push, Shake, Slap, Toss, Squeeze
Love         Contact, Hug, Lift, Pat, Press, Shake, Stroke, Tap
Gratitude    Contact, Hug, Lift, Pat, Shake
Sympathy     Contact, Hug, Pat, Rub, Stroke
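Hertenstein et al.'s emotion-to-touch-type observations above can be encoded as a simple lookup, and the reverse query makes one point concrete: many touch types are shared across emotions, so touch type alone under-determines the emotion. This is a sketch of my own, not code from the original study.

```python
# Sketch: the emotion -> touch-type observations above as a lookup table,
# plus the reverse query "which emotions use this touch type?".

TOUCH_TYPES = {
    "anger":     {"hit", "pat", "push", "shake", "squeeze"},
    "fear":      {"contact", "lift", "press", "shake", "squeeze"},
    "happiness": {"hug", "lift", "pat", "shake", "squeeze", "swing"},
    "sadness":   {"contact", "hug", "lift", "nuzzle", "rub", "squeeze", "stroke"},
    "disgust":   {"contact", "kick", "lift", "push", "shake", "slap", "toss", "squeeze"},
    "love":      {"contact", "hug", "lift", "pat", "press", "shake", "stroke", "tap"},
    "gratitude": {"contact", "hug", "lift", "pat", "shake"},
    "sympathy":  {"contact", "hug", "pat", "rub", "stroke"},
}

def emotions_using(touch: str) -> set:
    """Reverse lookup: all emotions whose frequent touch types include `touch`."""
    return {emo for emo, types in TOUCH_TYPES.items() if touch in types}

print(sorted(emotions_using("hug")))
```

Ambiguity of the touch type is why the kinematic dimensions (speed, duration, intensity) and body part are needed as additional cues.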
Remote tactile sensations: the tactile sleeve TaSST

• Sending and receiving sleeve
• Based on vibrators

https://vimeo.com/111519978 (start video at min. 3)

Automatic emotion recognition in touch games

• Video

The software captures players' finger-stroke behaviour:
• Coordinates of each point of a stroke
• Finger pressure at each point
• Stroke duration: from finger contact to finger lift.

Gao, Bianchi-Berthouze, Meng (2012). What does touch tell us about emotions in touchscreen-based gameplay? ACM Transactions on Computer-Human Interaction.
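The stroke measurements listed above can be turned into features as sketched below, from a list of (timestamp, x, y, pressure) samples between finger-down and finger-up. The feature names are illustrative, not the exact ones used by Gao et al. (2012).

```python
# Sketch: finger-stroke features from touch samples (t, x, y, pressure)
# recorded between finger contact and finger lift.
import math

def stroke_features(samples):
    """samples: [(t_seconds, x, y, pressure), ...] ordered in time."""
    ts = [s[0] for s in samples]
    ps = [s[3] for s in samples]
    duration = ts[-1] - ts[0]                       # contact to lift
    length = sum(math.dist(samples[i][1:3], samples[i + 1][1:3])
                 for i in range(len(samples) - 1))  # path length of the stroke
    return {
        "duration_s": duration,
        "path_length": length,
        "mean_pressure": sum(ps) / len(ps),
        "max_pressure": max(ps),
        "speed": length / duration if duration > 0 else 0.0,
    }
```

Fed into a classifier, such per-stroke vectors are the kind of input that lets gameplay touch behaviour be mapped to affective states.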
Touch sensors
• Pressure sensors on the object or agent to be touched
• EMG + IMU: measure the arm's muscle activity and the movement driving the touch behaviour
– Could be used to measure affective experience (e.g., the Myo EMG armband; Benalcazar et al., 2017, work on gesture recognition)

Affective applications, e.g.:
– Objects that sense how people feel about them
– Crowdsourcing material experiences for online shopping or for more objective material evaluation
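As a hypothetical sketch of how such EMG data might be summarised, simple per-channel statistics over a window of 8-channel samples (e.g., from a Myo-style armband) are common features for touch and gesture recognition. No vendor API is assumed here; the feature names are my own.

```python
# Hypothetical sketch: per-channel mean and standard deviation over a
# window of 8-channel EMG samples, as features for gesture/touch models.
import statistics

def emg_window_features(window):
    """window: list of samples, each a list of 8 channel values."""
    channels = list(zip(*window))           # transpose to per-channel series
    feats = {}
    for i, ch in enumerate(channels, start=1):
        rectified = [abs(v) for v in ch]    # EMG is usually rectified first
        feats[f"emg-{i}-mean"] = statistics.fmean(rectified)
        feats[f"emg-{i}-std"] = statistics.pstdev(rectified)
    return feats
```

Combined with IMU-derived motion features, such vectors could characterise the muscle activity driving a touch gesture, as in the fabric-touching studies that follow.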
Crowdsourcing fabric tactile experiences

• Different gestures for touching fabric
• EMG sensors to collect data on the touching experience

Image sources:
1. 2021. How To Tell If Fabric Is 100 Cotton? Silver Bobbin. Retrieved July 21, 2021 from https://silverbobbin.com/how-to-tell-if-fabric-is-100-cotton/
2. 2020. 98 Cotton 2 Elastane Fabric Roll Cotton Stretch Fabric For Pants. Alibaba.com. Retrieved July 22, 2021.
3. Ricardo O'Nascimento. 2017. Touch and electronic textiles. Medium. Retrieved July 22, 2021 from https://ricardoonascimento.medium.com/touch-and-electronic-textiles-371d30fcc239
Muscle activity driving hand gestures

• Characterisation of textile-handling gestures

Exercise 3a: Affective touch – social touch
• Choose one situation and think about how you express affect through touch
– Relationship: parent–child, partners
• Consider a scenario where they hug you or pat you on the shoulder because:
– they want to show empathy as you are very sad
– they are happy and want to share their happiness with you
– you are upset with them and they want to make it up to you
• How does the hug change across these three scenarios?
– Think about the touch itself (their pressure, their gesture, their speed)
– Think also about how you touch back (your pressure, your gesture, etc.)
• What if you were far apart? How could we use sensors to detect their messages?
Exercise 3b: Affective touch – object sensing
• Think of an object you have at home that you like to touch, and a similar one that you dislike
– E.g., a soft fur vs a normal coat
– What are the objects?
– How do you touch them?
• What gestures?
– Caressing, patting, etc.
• Do the speed, pressure and duration of your gesture change between the two objects?
• If they change, how do they change?
– Speed
– Pressure
– Duration
– Direction
Can technology sense … how it feels to you?

[Figure: Myo EMG armband readings mapped to perceived fabric properties – e.g., maximum stroking speed (Z-Speed_max) for cold–warm, and EMG channel statistics (EMG-8-Std, EMG-6-Mean) for rough–smooth. Properties: warmth, thickness, smoothness.]

Keija Wang et al., in preparation
Physical dimensions of touch

PHYSICAL DIMENSIONS (modalities)

Pressure
– Affective meaning: most affective states; modulates arousal
– Context creation: sense of presence
– Naturalness: essential – there is no touch without some pressure

Temperature
– Affective meaning: most affective states; modulates valence
– Context creation: emotional amplification; sense of skin touching; identity/presence
– Naturalness: skin temperature, expectation and awareness of each other's temperature

Vibration
– Affective meaning: increases arousal; unpleasant state
– Naturalness: unnatural, unpleasant

Sound
– Affective meaning: suggested for most affective states, according to metaphorical meaning or sound characteristics
– Naturalness: mainly unnatural, but augments human touch

DYNAMICS OF MODALITIES

Initiation (activation) of touch (foreseen or unforeseen)
– Affective meaning: unforeseen initiation may contribute to surprise

Speed of activation (incremental vs sudden)
– Affective meaning: modulates both arousal and valence of the affective state
– Context creation: sense of control
– Naturalness: should be kept within the natural range – otherwise perceived as awkward

Granularity of changes (incremental vs sudden)
– Affective meaning: critical to control emotional meaning
– Context creation: easier to respect social rules

Patterns (e.g., still, cyclic, random)
– Affective meaning: to create tactile gestures and modulate level of arousal

Localization of touch (where, amount of area touched and direction of touch)
– Affective meaning: level of intimacy
– Context creation: easier to respect social rules
– Naturalness: according to the relationship between the persons and the social context
Social tactile dimensions

Presence
– Means: sensation of bidirectional touch during interaction; sensation of touch after the interaction; levels of presence vs a presence/absence signal; uncontrolled reactions to touching and being touched
– Requirements: feeling the other person's skin and its changes in response; one's own skin's natural changes (e.g., sweat due to hand heat); control of the amount of area touched

Identity
– Means: a person's idiosyncratic touch characteristics; a person's tactile preferences; idiosyncratic tactile patterns; ways of touching
– Requirements: one's own skin temperature; relative body part size (e.g., a bigger and longer hand than the one touched); the other person's automatic reaction to touch (do they like it?)

Privacy & sense of control
– Means: having control over being touched; controlling the body area that can be touched; being able to offer a body area to be touched
– Requirements: enabling blocking from being touched; being able to send rejection or acceptance signals about being touched

Ambiguity
– Means: touch interpreted in context; receiving cues as feedback to one's interpretation; enabling building knowledge of the other
– Requirements: no predefined mapping; building messages and meaning through interaction; using feedback (automatic and construed) to reinforce understanding
Affective touch models
• Low-level dimensions (Hertenstein et al., 2006–2009)
– Touch types
– Kinematics: speed, duration, intensity
– Body parts

• Remote digital touch (Price et al., submitted)
– Low level: physical dimensions
– Middle level: social dimensions

• High-level dimensions (Ned Barker and Carey Jewitt, in press)
– Touch landscape
– Touch triggers
– Touch learning trajectory (evolving over time in a person's life)
Take-home questions: touch expressions
• What dimensions can be used to explore and understand touch in human-human communication?
– Low, middle and high levels

• How can they be used to inform the design of affective technology?
– Remote touch
– Automatic recognition of affective touch
– And what about robots? How can we design a robot's responses to affective touch?

• What sensors can be used to track people's touch types and their kinematics?
– What opportunities does this offer for affective interaction?