
GESTURE EXPRESSIVITY


Presentation Transcript


  1. GESTURE EXPRESSIVITY Norman I. Badler Center for Human Modeling and Simulation University of Pennsylvania Philadelphia, PA 19104-6389 USA http://www.cis.upenn.edu/~badler

  2. The “Realism Ceiling” for Human Models [Chart: behavioral realism and visual realism vs. time to create; labels include real-time, inanimate objects, life-like virtual humans, special effects, and agents.]

  3. What Approaches Push the Curve Toward More Realism? [Same chart, annotated with motion capture and parameterization (face, gait, gesture).]

  4. Why is Realism Still Hard? [Same chart, annotated: what are the “right” parameters? Difficult to generalize.]

  5. Goal: Real-Time Interaction between Real and Virtual People • Requires parameterized behaviors for flexible expression of virtual person’s internal state. • Requires virtual person to sense state of live participants. • What information is salient to both?

  6. Outline • Parameterized Action Representation (PAR) • EMOTE, eye movements, FacEMOTE • Character consistency • Recognizing movement qualities • The LiveActor environment

  7. Parameterized Behaviors for Embodied Agents • Use existing human behavior models as a guide. • Want to drive embodied agent from internal state: goals, emotions, culture, motivation… • External behaviors help the observer perceive these internal [hidden] agent states.

  8. Human Movement Categories • Voluntary (tasks, reach, look-at) • Dynamic (run, jump, fall) • Involuntary (breathe, balance, blink) • Subconscious • Low level motor functions (fingers, legs, lips) • Communicative acts (facial expressions, limb gestures, body posture, eye movement)

  9. Parameterized Action Representation (PAR) • Derived from BOTH Natural Language representation and animation requirements. • Lets people instruct virtual agents. • May be associated with a process-based agent cognitive model.

  10. Parameterized Action Representation (PAR) [Diagram: the virtual agent’s internal state, language input (NLP), language generation, observed actions, and synthesized actions connected through PAR and the Actionary.]

  11. PAR Examples • Virtual Reality checkpoint trainer. • Maintenance instruction validation. • The EMOTE motion qualities model.

  12. Checkpoint Virtual Environment

  13. Maintenance Instruction Validation • Example: F-22 power supply removal.

  14. Instructions • Rotate the handle at the base of the unit. • Disconnect the 4 bottom electric connectors. • Disconnect the 5 top electric connectors. • Disconnect the 2 coolant lines. • Unbolt the 8 bolts retaining the power supply to the airframe and support it accordingly, and remove it.

  15. Executing the Corresponding PARs • Instructions translated to PAR. • PAR controls actions. [Video: eye view; note attention control.]
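As a rough illustration of the instruction-to-PAR translation described on slides 13-15, the sketch below shows how one of the F-22 instructions might be captured as a PAR record. The field names loosely follow published PAR descriptions, and every concrete value (object names, preconditions, sub-actions) is hypothetical.

```python
# Hypothetical sketch of a Parameterized Action Representation (PAR) record.
# Field names loosely follow the published PAR structure; the concrete values
# for the F-22 example are illustrative only.
from dataclasses import dataclass, field

@dataclass
class PAR:
    action: str                      # verb extracted from the instruction
    agent: str                       # who performs the action
    objects: list                    # physical objects participating
    preconditions: list = field(default_factory=list)   # must hold before execution
    subactions: list = field(default_factory=list)      # finer-grained PARs
    manner: dict = field(default_factory=dict)          # adverbial / EMOTE qualities

# "Disconnect the 2 coolant lines." translated into a (hypothetical) PAR:
disconnect = PAR(
    action="disconnect",
    agent="maintainer",
    objects=["coolant_line_1", "coolant_line_2"],
    preconditions=["handle_rotated", "connectors_removed"],
    subactions=[
        PAR("reach", "maintainer", ["coolant_line_1"]),
        PAR("grasp", "maintainer", ["coolant_line_1"]),
        PAR("pull",  "maintainer", ["coolant_line_1"]),
    ],
    manner={"Time": -0.3},           # slightly Sustained, i.e. careful
)
```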

  16. EMOTE Motion Quality Model • EMOTE: a real-time motion quality model. • Based on Effort and Shape components of Laban Movement Analysis. • Defines qualities of movement with 8 parameters. • Controls numerous lower-level parameters of an articulated figure.

  17. Effort Motion Factors • Four factors range from an indulging extreme to a fighting extreme: • Space: Indirect to Direct • Weight: Light to Strong • Time: Sustained to Sudden • Flow: Free to Bound

  18. Shape Motion Factors • Four factors relating body movement to space: • Sagittal: Advancing to Retreating • Vertical: Rising to Sinking • Horizontal: Spreading to Enclosing • Flow: Growing to Shrinking

  19. EMOTE Inputs and Output [Diagram: inputs (4 Efforts, 4 Shapes, key poses, end-effector goals, frame rate) feed EMOTE, which produces output poses; related components include procedures, inverse kinematics, interpolation, and motion capture.]

  20. Applying Effort to Arm Motions • Effort parameters consist of values in [-1,+1] for Space, Weight, Time, and Flow • Translated into low-level movement parameters • Trajectory Definition • Timing Control • Flourishes
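A minimal sketch of the kind of Effort-to-low-level translation the slide describes: the four Effort values in [-1,+1] are mapped onto a handful of trajectory and timing parameters. The parameter names and formulas here are invented for illustration; the actual EMOTE rules are richer.

```python
# Hypothetical mapping from Effort values in [-1, +1] to low-level motion
# parameters; the real EMOTE model uses a richer set of empirically tuned rules.
def effort_to_low_level(space, weight, time, flow):
    """space/weight/time/flow: -1 = indulging extreme, +1 = fighting extreme."""
    assert all(-1.0 <= v <= 1.0 for v in (space, weight, time, flow))
    return {
        # Sudden (Time -> +1) compresses the movement and adds anticipation.
        "duration_scale": 1.0 - 0.5 * max(time, 0.0),
        "anticipation":   0.2 * max(time, 0.0),
        # Direct (Space -> +1) straightens the path; Indirect curves it.
        "path_curvature": 0.5 * (1.0 - space),
        # Strong (Weight -> +1) exaggerates acceleration and overshoot.
        "overshoot":      0.3 * max(weight, 0.0),
        # Bound (Flow -> +1) damps the motion; Free lets it swing through.
        "damping":        0.5 * (flow + 1.0),
    }

print(effort_to_low_level(space=-0.8, weight=0.2, time=0.9, flow=-0.5))
```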

  21. Applying Shape to Arm Motions • Shape parameters consist of values in [-1,+1] for horizontal, vertical, sagittal dimensions • For each dimension, we define an ellipse in the corresponding plane • Parameter value specifies magnitude of movement of keypoint along ellipse • Reach Space parameter moves keypoint away or toward body’s center of mass
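The keypoint-on-an-ellipse idea can be sketched for a single dimension as below; the ellipse radii and the keypoint coordinates are placeholder numbers, not EMOTE's actual reach-space geometry.

```python
import math

# Hypothetical sketch: displace an arm keypoint along an ellipse in the
# horizontal plane according to a Shape value in [-1, +1]
# (-1 = Enclosing, +1 = Spreading). Radii are illustrative.
def shape_offset_horizontal(keypoint, shape_value, rx=0.15, rz=0.10):
    x, y, z = keypoint
    # Map the parameter to an angle on the ellipse; the magnitude of the value
    # controls how far along the arc the keypoint travels.
    theta = shape_value * (math.pi / 2.0)
    return (x + rx * math.sin(theta),              # spread / enclose sideways
            y,
            z + rz * (1.0 - math.cos(theta)))      # slight forward drift along the arc

print(shape_offset_horizontal((0.3, 1.2, 0.2), shape_value=0.7))
```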

  22. EMOTE Controls Other Performance Parameters • Velocity, anticipation, overshoot • Time exponent, number-of-frames multiplier • Wrist bend multiplier, wrist extension multiplier • Hand shape • Squash, limb volume • Path curvature, displacement multiplier • Elbow and wrist twist magnitude • Elbow and wrist twist frequency

  23. Actions + EMOTE Motion Qualities are Significant • Movements with EMOTE qualities give insight into the agent’s cognitive state. • When EMOTE qualities spread from limbs to body, movements appear more sincere. [Diagram: an observer infers the agent’s internal state from its actions plus EMOTE motion qualities.]

  24. Start with Scripted Motions

  25. The Actor with Rising Efforts (Strong emotions?)

  26. The Actor with Neutral Efforts (A Politician?)

  27. Actor with Less Rising Shape (Not quite as excited?)

  28. Moving the Shapes Inward (Woody Allen?)

  29. With Light and Sustained Efforts (More solemn and serious?)

  30. Without Torso Movements (Used Car Salesman?)

  31. Application of EMOTE to Manner Variants (adverbs) in PAR: HIT • Hit the ball softly. • Hit the ball forcefully.
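One way to picture the adverb-to-manner link is a small lookup from adverbs to Effort settings that could fill a PAR's manner slot; the numeric values below are invented for illustration.

```python
# Hypothetical adverb -> Effort settings for the manner slot of a PAR;
# the values in [-1, +1] are invented for illustration.
ADVERB_TO_EFFORT = {
    "softly":     {"Weight": -0.8, "Time": -0.4, "Flow": -0.3},  # Light, Sustained, Free
    "forcefully": {"Weight": +0.9, "Time": +0.6, "Flow": +0.4},  # Strong, Sudden, Bound
}

def apply_manner(par_manner, adverb):
    """Merge the adverb's Effort settings into a PAR's manner dictionary."""
    par_manner.update(ADVERB_TO_EFFORT.get(adverb, {}))
    return par_manner

print(apply_manner({}, "softly"))
```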

  32. Define Links between Gesture Selection and Agent Model • Gesture performance cues agent state! • Normal people show a variety of EMOTE parameters during gestures. • Emotional states and some pathologies are indicated by a reduced spectrum of EMOTE parameters. • That’s why some synthetic characters appear bland and uninspired.

  33. Eye Saccade Generation to Distinguish Various Agent States • Speaking, listening, thinking. [Videos: Face2Face pre-process and run-time stages.]

  34. Eye Movements Modeled from Human Performance Data [Videos: data source; eyes fixed ahead; eyes moved by statistical model; full MPEG-4 face.]
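A minimal sketch of what a statistical gaze model of this kind can look like: saccade timing, magnitude, and direction are sampled from distributions standing in for ones fitted to recorded human data. The distributions and per-state parameters below are placeholders, not the published model.

```python
import random

# Hypothetical sketch of a statistical saccade generator: magnitudes and
# inter-saccade intervals are drawn from simple distributions standing in
# for ones fitted to recorded human gaze data.
def generate_saccades(mode="talking", n=5, seed=0):
    rng = random.Random(seed)
    # Placeholder parameters; a real model would differ per agent state.
    mean_interval = {"talking": 1.8, "listening": 2.5, "thinking": 1.2}[mode]
    events = []
    for _ in range(n):
        interval = rng.expovariate(1.0 / mean_interval)   # seconds until next saccade
        magnitude = min(abs(rng.gauss(8.0, 5.0)), 30.0)    # degrees of gaze rotation
        direction = rng.uniform(0.0, 360.0)                # polar direction in the view plane
        events.append((interval, magnitude, direction))
    return events

for ev in generate_saccades("thinking"):
    print(ev)
```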

  35. Extending EMOTE to the Face: FacEMOTE [Diagram: inputs (4 Efforts, 4 Shapes, and an MPEG-4 FAP stream) drive Action Unit modifications in FacEMOTE; the output is MPEG-4 FAPs.]

  36. FAPs Used • 47 used out of 66. • (Not visemes and expressions; nor tongue, nose, ears, pupils.) • 8 primary parameters; the rest are interpolated from these: 1. open_jaw, 2. lower_t_midlip, 3. raise_b_midlip, 4. stretch_cornerlip, 5. raise_l_cornerlip, 6. close_t_l_eyelid, 7. raise_l_i_eyebrow, 8. squeeze_l_eyebrow.

  37. Approach • Preprocess FAP stream for target face model to find maxima and minima (optional but useful to avoid caricatures). • Maintain the overall structure of muscle actions (contracting or relaxing) but: • Change path/time of the muscle trajectory.

  38. Main Steps • Threshold local maxima and minima peaks as required by EMOTE parameters. • Modulate (multiply) the primary FAPs by EMOTE parameters to adjust their strength. • Modulate the secondary FAPs with weighted blending functions reflecting the relative influence of each primary parameter on particular secondary parameters.
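The two modulation steps might look roughly like the sketch below: primary FAPs are scaled by EMOTE-derived gains, and each secondary FAP inherits a weighted blend of the gains of the primaries that influence it. The gain values and the influence table are placeholders.

```python
# Hypothetical sketch of the FAP modulation steps: primary FAPs are scaled by
# EMOTE-derived gains, and secondary FAPs inherit a weighted blend of those
# gains. The gain values and the influence table are placeholders.
PRIMARY_GAINS = {           # e.g. derived from Effort/Shape settings
    "open_jaw": 1.3,
    "raise_l_i_eyebrow": 0.7,
}

SECONDARY_INFLUENCE = {     # secondary FAP -> {primary FAP: weight}
    "raise_l_m_eyebrow": {"raise_l_i_eyebrow": 1.0},
    "lower_t_midlip":    {"open_jaw": 0.6},
}

def modulate_frame(fap_frame):
    """Scale one frame of FAP values (a dict of FAP name -> value)."""
    out = dict(fap_frame)
    for name, gain in PRIMARY_GAINS.items():
        if name in out:
            out[name] *= gain
    for name, weights in SECONDARY_INFLUENCE.items():
        if name in out:
            blended = sum(w * PRIMARY_GAINS.get(p, 1.0) for p, w in weights.items())
            out[name] *= blended / sum(weights.values())
    return out

print(modulate_frame({"open_jaw": 0.4, "lower_t_midlip": 0.2, "raise_l_m_eyebrow": 0.1}))
```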

  39. Toward Character Consistency (very much in progress…) • Apply a consistent set of parameters to: • Arms (torso) – EMOTE • Face – FacEMOTE • [Gait?] • [Speech?!] • The following examples use gesture motion capture data and MPEG-4 face data

  40. Body without Face Motion • Body by Salim Zayat

  41. Face without Body Motion • ‘Greta’ face courtesy Catherine Pelachaud

  42. Face and Body with EMOTE • Body – indirect, widening; Face – indirect

  43. Mismatched Face and Body Qualities • Body - indirect, light, free; Face - direct, heavy, bound

  44. Next Steps: • Add torso shape changes and rotations. • Fix gesture timing and synchronization (use BEAT?). • Adapt EMOTE phrasing to speech prosody. • Find a suitable speech generator. • Investigate role of body realism. • Develop an evaluation methodology.

  45. Recognizing EMOTE Qualities • Professional LMA notators can do it. • Gives a “reading” of performer state. • The manner in which a gesture is performed may be more salient to understanding action (intent) than careful classification of the gesture itself. • Labeling EMOTE qualities in real-time would inform virtual agents of human state.

  46. Recognizing EMOTE Parameters in a Live Performance • Neural Networks trained to recognize occurrence of significant EMOTE parameters from motion capture (both in video and from 3D electromagnetic sensors). • Ground truth from 2 LMA notators. • Results encouraging that EMOTE features may be detectable in everyday actions and even from a single camera view.
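As a sketch of this recognition setup, the code below trains a small neural network to label one Effort factor (Sudden vs. Sustained) from simple per-gesture motion features. The feature choices and the synthetic training data are made up; the real system was trained against LMA-notated motion capture and video.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical sketch: a small neural network labels one Effort factor
# (1 = Sudden, 0 = Sustained) from per-gesture motion features. The features
# and synthetic data are placeholders for the LMA-notated ground truth.
rng = np.random.default_rng(0)

def fake_features(sudden, n):
    # Features: [peak wrist speed, mean acceleration, path curvature]
    base = np.array([1.5, 3.0, 0.2]) if sudden else np.array([0.5, 0.8, 0.6])
    return base + 0.2 * rng.standard_normal((n, 3))

X = np.vstack([fake_features(True, 50), fake_features(False, 50)])
y = np.array([1] * 50 + [0] * 50)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```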

  47. Experimental Results – Confusion Matrices (3D Motion Capture)

  48. Experimental Results – Confusion Matrices (2 Camera Video Input)

  49. EMOTE Parameters Determined from A Single Camera View • First- and second-order features are mostly preserved. • Initial results are encouraging.

  50. LiveActor: Immersive Environment for Real + Virtual Player Interaction • Based on Ascension ReActor IR real-time motion capture system. • EON Reality stereo display wall. • Emphasis on non-verbal communication channels.
