Affect and Personality in Interaction with Ubiquitous Systems
Professor Ruth Aylett
Vision Interactive Systems & Graphical Environments
MACS, Heriot-Watt University
www.macs.hw.ac.uk/~ruth

Summary of programme
- Introduction and overview
- Affective outputs: speech, language, gesture, facial expressions, music, colour
- Affective/Personality models and action-selection approaches
- Affective inputs
- Applications: Embodied Conversational Characters, Intelligent Virtual Agents, human-robot interaction
- Evaluation approaches

Today's topics
- Embodied Conversational Agents (ECAs)
- Laban Movement Analysis
- Gesture
- FACS and facial expressions
- Affective Embodied Conversational Agents
ECAs: Definition
- Software entities capable of autonomously communicating with users through verbal and nonverbal means
- An ECA's cognitive and expressive capabilities simulate human capabilities
- Models are based on theories from human studies, particularly in the domains of linguistics, phonetics, cognitive science, emotion, psychology, and sociology
- Autonomous: they can plan what to say, and know when to start a conversation, when to answer, and when to take the conversation turn

Greta (U Paris 8; Pelachaud, Mancini, Bevacqua et al.)
- Affective multimodal ECA
- Expressive movement and facial expression
- Produces dynamic expressive visual behaviours
- Achieves coordination of signals in multiple modalities
- Encodes behaviour variability

MAX in the Museum (University of Bielefeld; Wachsmuth, Kopp et al.)
- Located in a museum for use by all visitors
- Engages visitors in natural-language small-talk conversations
- Gives information about the museum and exhibition
- Knows about the external environment; looks up information on the Internet, e.g. weather reports
- Tells jokes, plays an animal-guessing game
- Emotions: continuously updated, always shown; influence behaviour, e.g. leaving the scene when very annoyed by rude visitor behaviour

How to make this human?
- But are we doing life or drama?
- Look for formalisms that we can reapply:
  - Expressive movement: Laban Movement Analysis
  - Facial expression: Facial Action Coding System
Laban Movement Analysis
- Originates in ballet choreography, as a notation for dance
- A textual and symbolic language of movement description
- Five major components:
  - Body: which part of the body is used
  - Space: the directions and paths of motion
  - Shape: the changing forms of the body
  - Effort: the dynamics of the body's movements
  - Relationship: modes of interaction with oneself, others, and the environment (facing, contact, ...)

EMOTE (Badler et al.)
- EMOTE: a model for Effort and Shape
- Based on Movement Observation Science (Laban Movement Analysis)
- A computational model of the Effort and Shape components

Effort
- Four motion factors: Space, Weight, Time, Flow
- Each factor ranges between two poles: reinforcing the factor vs. fighting against it
  - Space: indirect vs. direct (waving away bugs vs. pointing to a particular spot)
  - Weight: light vs. strong (a feather-light movement vs. punching)
  - Time: sustained vs. sudden (stretching vs. grabbing a falling object)
  - Flow: free vs. bound (waving wildly vs. moving in slow motion)
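The four Effort factors above can be represented computationally as a small parameter set, with each factor scaled between its two poles. The sketch below is illustrative only: the `Effort` class, the [-1, +1] scaling, and the preset numbers are assumptions for this example, not values from the EMOTE model.

```python
from dataclasses import dataclass

@dataclass
class Effort:
    """Laban Effort factors, each scaled to [-1.0, +1.0].

    -1.0 is the indulging pole, +1.0 the fighting pole:
      space:  indirect  (-1) .. direct (+1)
      weight: light     (-1) .. strong (+1)
      time:   sustained (-1) .. sudden (+1)
      flow:   free      (-1) .. bound  (+1)
    """
    space: float = 0.0
    weight: float = 0.0
    time: float = 0.0
    flow: float = 0.0

# Hypothetical presets matching the slide's pole examples; numbers are made up.
EFFORT_PRESETS = {
    "pointing":   Effort(space=+0.9, weight=+0.3, time=+0.5, flow=+0.4),
    "stretching": Effort(space=-0.2, weight=-0.4, time=-0.8, flow=-0.6),
}

def blend(a: Effort, b: Effort, t: float) -> Effort:
    """Linearly interpolate between two Effort settings (t in [0, 1])."""
    lerp = lambda x, y: x + t * (y - x)
    return Effort(lerp(a.space, b.space), lerp(a.weight, b.weight),
                  lerp(a.time, b.time), lerp(a.flow, b.flow))
```

A gesture engine could feed such parameters into arm-trajectory timing (Time), joint stiffness (Flow), and path curvature (Space), which is roughly the division of labour EMOTE describes.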
Shape
- Three distinct qualities of change in the form of movement:
  - Shape flow: the mover's attitude toward the changing relationship among body parts
  - Directional movement: the mover's intent to bridge the action to a point in the environment
  - Shaping: the mover's carving or moulding attitude toward the environment
- Three dimensions:
  - Horizontal: spreading vs. enclosing (opening the arms to embrace vs. clasping someone in a hug)
  - Vertical: rising vs. sinking (reaching for something on a high shelf vs. stamping the floor in indignation)
  - Sagittal: advancing vs. retreating (reaching out to shake hands vs. avoiding a punch)

Capturing the movement
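The three Shape dimensions map naturally onto displacements of a gesture target in the character's body frame. A minimal sketch, assuming a [-1, +1] scaling per dimension and a made-up `reach` scale (neither is specified by the slides or by EMOTE):

```python
from dataclasses import dataclass

@dataclass
class Shape:
    """Laban Shape dimensions, each scaled to [-1.0, +1.0]:
       horizontal: enclosing  (-1) .. spreading (+1)
       vertical:   sinking    (-1) .. rising    (+1)
       sagittal:   retreating (-1) .. advancing (+1)
    """
    horizontal: float = 0.0
    vertical: float = 0.0
    sagittal: float = 0.0

def shape_offset(s: Shape, reach: float = 0.3):
    """Displace a gesture's end-effector target by the Shape dimensions.

    Returns an (x, y, z) offset in the character's frame: x lateral
    (spreading), y vertical (rising), z forward (advancing).
    `reach` is a hypothetical scale in metres.
    """
    return (s.horizontal * reach, s.vertical * reach, s.sagittal * reach)
```

So a "reaching out to shake hands" gesture would carry positive sagittal Shape, pushing its keyframe targets forward, while "avoiding a punch" would carry negative sagittal Shape, pulling them back.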
GRETA example
- Greta with different emotional parameters

Facial Expression: the Facial Action Coding System

Duchenne de Boulogne
- French anatomist of the 19th century
- In the 1860s applied electric currents to a facially paralysed subject
- Showed which muscles go with which expressions
Expressions - facial muscles
- Main human muscle groups for articulation and facial expressions
- Muscles numbered in white: 1. Frontalis, 2. Corrugator, 3. Orbicularis oculi, 4. Levator palpebrae, 5. Nose and upper lip, 6. Upper lip, 7. Zygomaticus major, 8. Lip stretch, 9. Triangularis, 10. Lower lip, 11. Mentalis, 12. Orbicularis oris, 13. Jaw
- Reflector-point positions numbered in yellow

Drawing on psychology: FACS (Facial Action Coding System)
- Developed by Ekman & Friesen (1978)
- Distinguishes among 44 Action Units
- Each Action Unit (AU) describes a separate, visible, and muscle-based facial change
- Covers the primitive emotions: surprise, fear, disgust, anger, happiness, sadness

The Duchenne smile
- Associated with the expression of happiness
- AU12 (Lip Corner Puller) plus AU6 (Cheek Raiser)
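Since FACS describes any facial configuration as a combination of Action Units, an expression can be modelled as a set of AU numbers. The AU6 + AU12 combination for the Duchenne smile is from the slides; the detection helper itself is an illustrative sketch, not part of FACS.

```python
# A FACS expression as a set of active Action Unit numbers.
DUCHENNE_SMILE = {6, 12}  # AU6 (Cheek Raiser) + AU12 (Lip Corner Puller)

def is_duchenne_smile(active_aus: set[int]) -> bool:
    """True if both AUs of the Duchenne smile are among the active AUs."""
    return DUCHENNE_SMILE <= active_aus

is_duchenne_smile({12})         # lip corners only, no cheek raise -> False
is_duchenne_smile({6, 12, 25})  # cheeks + lip corners (+ more)    -> True
```

The subset test is the key idea: a coded face can activate many AUs at once, and an expression label applies whenever its defining AUs are all present.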
Mapping onto MPEG-4
- Defines FAPs (Facial Animation Parameters)
- Set of feature points
- Model calibration
- Morphology parameters
- Expression parameters

CG facial expressions
- http://mrl.nyu.edu/perlin/experiments/facedemo
- Neutral, Frightened, Kiss, Disappointed, Sleeping, Annoyed, Surprised, Happy, Arrogant, Angry, Talking (CG face by Ken Perlin, NYU)

Facial expressions simplified
- Use of texture with morphing
- Can work well with cartoon-like faces

Affective Listener (André, U Augsburg)
- Creates rapport between a virtual agent and the human user by providing the agent with emotional sensitivity
- Responds to the user's emotional voice when telling a story to the agent (E. André)
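Both FAP-driven animation and the simplified texture-morphing approach come down to interpolating face data between a neutral state and a target expression. A minimal sketch: the 2D feature points below are made-up examples, not real MPEG-4 feature-point indices or FAP values.

```python
def morph(neutral, target, t):
    """Linearly interpolate each feature point from neutral toward target.

    t = 0.0 gives the neutral face, t = 1.0 the full expression;
    intermediate values give graded expression intensity.
    """
    return [(nx + t * (tx - nx), ny + t * (ty - ny))
            for (nx, ny), (tx, ty) in zip(neutral, target)]

# Hypothetical mouth feature points (left corner, centre, right corner).
neutral_mouth = [(-1.0, 0.0), (0.0, -0.2), (1.0, 0.0)]
smile_mouth   = [(-1.2, 0.4), (0.0, -0.1), (1.2, 0.4)]  # corners pulled up/out

half_smile = morph(neutral_mouth, smile_mouth, 0.5)
```

The same interpolation applied to texture coordinates rather than geometry is what makes the cartoon-face morphing approach cheap: no muscle model is needed, only keyframe textures and a blend weight.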
Agents for training
- University of Southern California

QUESTIONS?