
Machines that Recognize Human Emotion


Presentation Transcript


  1. Machines that Recognize Human Emotion Yuan Qi MIT Media Laboratory

  2. A man barges into your office when you’re busy. He doesn’t apologize, doesn’t introduce himself, and doesn’t notice you are annoyed. He offers you useless advice. You express more annoyance. He ignores it. He continues to be unhelpful. The clarity of your emotional expression escalates. He ignores it. (This goes on.) Finally you have to tell him explicitly to “go away.” He winks, and does a little dance before exiting.

  3. Recognition of three “basic” states:

  4. “Emotion recognition” • Expressions, behaviors: “Flared nostrils, tightened lips, a quick sharp gesture, skin conductivity = high; probably she is angry.” • Situation, reasoning: “That was an important goal to her and Bob just thwarted it, so she probably feels angry toward Bob.”

  5. Emotions give rise to changes that can be sensed. Distance sensing: face, voice, posture, gestures, movement, behavior. Up-close sensing: skin conductivity, pupillary dilation, respiration, heart rate, pulse, temperature, blood pressure. Internal sensing: hormones, neurotransmitters, …

  6.–9. (Progressive builds of the previous slide, revealing the distance, up-close, and internal sensing channels step by step.)

  10. Can a machine tell if a person is bored or interested? Attentive? Fidgeting? Application: Computer Learning Companion, Tutor, Mentor

  11. Can we teach a chair to recognize behaviors indicative of interest and boredom? Postures: sit upright, lean forward, slump back, side lean. (Mota and Picard)

  12. What can the sensor chair contribute toward inferring the user’s state, bored vs. interested? 9-state posture recognition: 89–97% accurate. High/low interest and taking a break: 69–83% accurate. (Results on kids not in the training data, 2002)
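
The slide reports accuracies only; as a rough illustration of the kind of pipeline involved (not the authors’ actual classifier, which is described in Mota and Picard’s work), here is a minimal nearest-centroid sketch that maps seat/back pressure maps to a posture label. The feature choices are illustrative assumptions.

```python
import numpy as np

def pressure_features(seat_map, back_map):
    """Crude features from two pressure maps (illustrative, not the published features):
    total load, center of pressure, and spread, for seat and back."""
    feats = []
    for m in (seat_map, back_map):
        total = m.sum() + 1e-9
        ys, xs = np.indices(m.shape)
        cx, cy = (xs * m).sum() / total, (ys * m).sum() / total
        spread = np.sqrt((((xs - cx) ** 2 + (ys - cy) ** 2) * m).sum() / total)
        feats += [total, cx, cy, spread]
    return np.array(feats)

class NearestCentroidPosture:
    """Toy posture classifier: one centroid per labeled posture (e.g., the nine
    posture categories named on the slides)."""
    def fit(self, feature_list, labels):
        self.classes_ = sorted(set(labels))
        self.centroids_ = np.stack(
            [np.mean([f for f, l in zip(feature_list, labels) if l == c], axis=0)
             for c in self.classes_])
        return self
    def predict(self, feats):
        distances = np.linalg.norm(self.centroids_ - feats, axis=1)
        return self.classes_[int(np.argmin(distances))]
```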

  13. Detecting, tracking, and recognizing facial expressions from video (Kapoor & Picard)

  14. Computer recognition of natural head nods and shakes Kapoor and Picard, PUI ‘01

  15. Fully automatic computer recognition of six natural facial “action units” (Kapoor and Picard). Accuracy: “expert” human 75%; our first system 67%.

  16. Can the computer sense mild frustration or distress? (e.g., for usability testing in the field?)

  17. Things to communicate frustration (Reynolds & Picard)

  18. Example: data from the pressure mouse. Forthcoming paper with Jack Dennerlein (Harvard School of Public Health) and Carson Reynolds / Rosalind Picard (MIT), International Ergonomics Association, linking frustration and physical risk factors.

  19. Can the computer sense other emotions? Stress? Pleasure?…

  20. Wearable skin conductivity communicator: sensing → processing → expression

  21. Making the light glow: • Significant thoughts • Exciting events • Exercise • Motion artifacts • Lying • Pain

  22. Audience’s “Glow” conveys excitement (Approximate Skin Conductivity Level). Communicate emotion in new ways. Picard and Scheirer, HCI 2001

  23. Cybernetic wearable camera (Healey & Picard, ISWC 98)

  24. StartleCam Filter
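
The StartleCam filter watches the wearer’s skin-conductance signal for a startle response and, when one is detected, saves the recently buffered camera frames. A minimal sketch of that idea, assuming a simple rise-over-window threshold (the threshold, window length, and buffer size are illustrative, not the published parameters):

```python
from collections import deque
import numpy as np

def startle_detected(conductance, fs, window_s=1.0, rise_threshold=0.05):
    """Flag a startle-like event: a rapid rise in skin conductance over a short window.
    `rise_threshold` is in the sensor's units and is an illustrative value."""
    n = int(window_s * fs)
    if len(conductance) < n:
        return False
    recent = np.asarray(conductance[-n:])
    return (recent[-1] - recent[0]) > rise_threshold

# Rolling buffer of the most recent camera frames; on a detected startle,
# the buffered frames are saved/transmitted.
frame_buffer = deque(maxlen=30)

def on_new_frame(frame, conductance_history, fs):
    frame_buffer.append(frame)
    if startle_detected(conductance_history, fs):
        return list(frame_buffer)   # frames to store
    return None
```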

  25. Video: StartleCam (Healey & Picard, ISWC 98)

  26. Subject intentionally expressing 8 emotions: 1. Neutral, 2. Anger, 3. Hate, 4. Grief, 5. Platonic Love, 6. Romantic Love, 7. Joy, 8. Reverence. Each emotion collected daily, for > 4 weeks. 4 physiological signals: EMG on jaw, skin conductivity, BVP, respiration. Classification accuracy: 81% on 8 emotions (person dependent). Picard et al., IEEE Trans. Pattern Analysis and Machine Intelligence, Oct 2001.
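
As a sketch of the feature side of such a system, the statistics below (per physiological signal) are in the spirit of the published approach but simplified; the classifier and day-to-day normalization are left out.

```python
import numpy as np

def signal_features(x):
    """Simple per-signal statistics commonly used for physiological affect data:
    raw mean and standard deviation, plus mean absolute first and second
    differences of the normalized signal."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / (x.std() + 1e-9)
    return np.array([
        x.mean(),
        x.std(),
        np.abs(np.diff(z)).mean(),
        np.abs(np.diff(z, n=2)).mean(),
    ])

def session_features(emg, skin_conductivity, bvp, respiration):
    """Concatenate features across the four sensed signals for one recording."""
    return np.concatenate([signal_features(s)
                           for s in (emg, skin_conductivity, bvp, respiration)])
```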

  27. Autonomic balance = LF/HF (the ratio of low-frequency to high-frequency power in the heart rate variability spectrum)
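
A common way to compute this ratio is to integrate the heart-rate-variability spectrum over the conventional low-frequency (≈0.04–0.15 Hz) and high-frequency (≈0.15–0.4 Hz) bands. A minimal sketch, assuming an RR-interval series already resampled evenly at 4 Hz (the resampling rate and band edges are standard HRV conventions, not details taken from this talk):

```python
import numpy as np
from scipy.signal import welch

def lf_hf_ratio(rr_intervals_s, fs=4.0):
    """LF/HF ratio from an RR-interval series resampled evenly at `fs` Hz."""
    rr = np.asarray(rr_intervals_s, dtype=float)
    f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=min(256, len(rr)))
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(pxx[lf_band], f[lf_band])
    hf = np.trapz(pxx[hf_band], f[hf_band])
    return lf / hf
```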

  28. Bayesian Spectrum Estimation of Unevenly Sampled Nonstationary Data (Y. Qi, T. P. Minka, and R. W. Picard, ’01) • Problem: estimating the spectrum from data that is nonstationary, unevenly sampled, and noisy. • Bayesian approach: dynamic (state-space) modeling of the time series, s_i = A_i s_{i-1} + w_i and x_i = C_i s_i + v_i, where w_i is the process noise at time t_i and v_i is the observation noise at time t_i. The filtering distribution p(s_i | x_{1:i}) can be sequentially estimated as p(s_i | x_{1:i}) ∝ p(x_i | s_i) ∫ p(s_i | s_{i-1}) p(s_{i-1} | x_{1:i-1}) ds_{i-1}. Then the spectrum at time t_i can be summarized by the posterior mean of p(s_i | x_{1:i}).
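
Under linear-Gaussian assumptions, the sequential estimate above is a Kalman filter. The toy sketch below tracks cosine/sine amplitudes at a grid of candidate frequencies, with the observation row built from each sample’s actual time stamp so uneven sampling is handled naturally; the random-walk dynamics, frequency grid, and noise variances are illustrative assumptions, not the paper’s exact model.

```python
import numpy as np

def bayesian_spectrum_track(times, x, freqs, q=1e-3, r=0.1):
    """Toy Kalman filter over sinusoid amplitudes at candidate frequencies.
    State s_i: [cos-amplitude, sin-amplitude] per frequency, random walk with
    process noise variance q; observation x_i = C_i s_i + v_i with noise variance r.
    Returns an (n_samples, n_freqs) array of amplitude estimates over time."""
    freqs = np.asarray(freqs, dtype=float)
    dim = 2 * len(freqs)
    s = np.zeros(dim)            # posterior mean of the state
    P = np.eye(dim)              # posterior covariance
    Q = q * np.eye(dim)
    amps = np.zeros((len(times), len(freqs)))
    for i, (t, xi) in enumerate(zip(times, x)):
        # Observation row built from the actual sample time (handles uneven sampling).
        C = np.empty(dim)
        C[0::2] = np.cos(2 * np.pi * freqs * t)
        C[1::2] = np.sin(2 * np.pi * freqs * t)
        # Predict step (random-walk dynamics: A = I).
        P = P + Q
        # Update step.
        S = C @ P @ C + r
        K = P @ C / S
        s = s + K * (xi - C @ s)
        P = P - np.outer(K, C @ P)
        # Instantaneous amplitude estimate per candidate frequency.
        amps[i] = np.hypot(s[0::2], s[1::2])
    return amps
```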

  29. Comparison with classical spectrum estimation algorithms (Welch, Burg, MUSIC, multitaper, and the new method). The test signal is the sum of 19, 20, and 21 Hz real sinusoids with amplitudes 0.5, 1, and 1 respectively; the variance of the additive white noise is 0.1; the signal is evenly sampled 128 times at 50 Hz.
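
To reproduce a classical baseline on this test signal, here is a quick sketch of the signal described above together with a Welch periodogram, one of the comparison methods named on the slide (the random seed is arbitrary):

```python
import numpy as np
from scipy.signal import welch

# Test signal from the slide: 19, 20, 21 Hz sinusoids (amplitudes 0.5, 1, 1),
# additive white noise with variance 0.1, 128 samples at 50 Hz.
fs, n = 50.0, 128
t = np.arange(n) / fs
rng = np.random.default_rng(0)
x = (0.5 * np.sin(2 * np.pi * 19 * t)
     + 1.0 * np.sin(2 * np.pi * 20 * t)
     + 1.0 * np.sin(2 * np.pi * 21 * t)
     + np.sqrt(0.1) * rng.standard_normal(n))

# Welch baseline: resolving three lines 1 Hz apart from only 128 samples is hard,
# which is the point of the comparison on the slide.
f, pxx = welch(x, fs=fs, nperseg=n)
```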

  30. Spectral analysis of an unevenly sampled signal: Lomb-Scargle periodogram with a window size of 200 points. The signal frequency jumps from 20 Hz to 40 Hz at sampling time -0.833 seconds, and then from 40 Hz to 60 Hz at 0.833 seconds.

  31. Spectral analysis of the same unevenly sampled signal: spectrogram by the new method, and by the new method coupled with sparsification. (Frequency jumps from 20 Hz to 40 Hz at -0.833 seconds and from 40 Hz to 60 Hz at 0.833 seconds.)
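
The Lomb-Scargle baseline on slide 30 handles uneven sampling directly. A minimal sketch using SciPy on a signal with the same frequency jumps; the irregular sampling pattern and frequency grid are illustrative, and phase continuity at the jumps is ignored for brevity.

```python
import numpy as np
from scipy.signal import lombscargle

# Unevenly sampled test signal with stepwise frequency: 20 -> 40 -> 60 Hz.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(-2.5, 2.5, 400))          # irregular sample times (illustrative)
freq_of_t = np.where(t < -0.833, 20.0, np.where(t < 0.833, 40.0, 60.0))
x = np.sin(2 * np.pi * freq_of_t * t)

# Lomb-Scargle periodogram over one 200-point window (as on the slide).
freqs_hz = np.linspace(1, 80, 400)
w = 2 * np.pi * freqs_hz                          # lombscargle expects angular frequencies
window = slice(0, 200)
pgram = lombscargle(t[window], x[window], w)
```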

  32. Simultaneously examine physiology and behavior for recognizing level of stress: up to 96% accurate, across 12 drivers. (Healey and Picard, ICPR 2000)

  33. Driver Stress Demo (work w/Jen Healey, Yuan Qi, incorporating new spectral estimation technique for assessing heart rate variability)

  34. Stress is evident for this person when: driving through the city, turning around at the toll booth, hearing a siren. New algorithm: analysis of heart-rate variability via real-time spectrum estimation with missing and irregularly sampled data (Qi and Picard, 2001)

  35. Goal: recognize stress in speech of driver, over cell phone headset.

  36. Recognizing Affect in Speech: Stress. Data: four drivers talking over a cell phone (headset). Problem: associate stress with the cognitive load of the driving/verbal task: 2 speeds of driving (~60 kph, ~120 kph) × 2 speeds of questioning (every 9 sec, every 4 sec). Features: Daubechies-4 filterbank (21 bands), Teager Energy Operator features. Models: HMM, auto-regressive HMM, factorial HMM, hidden Markov decision tree, support vector machine, neural network, mixture of HMMs. Results: 96% training / 62% testing on 4 categories of stress with mixture of HMMs; highly speaker dependent, e.g. 89–100% training, 36–96% test. Fernandez & Picard, ISCA Workshop on Speech and Emotions, Belfast 2000
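
The Teager Energy Operator mentioned above has a simple discrete form, Ψ[x(n)] = x(n)² − x(n−1)·x(n+1). A minimal sketch of computing per-band TEO features; a generic Butterworth filterbank stands in for the Daubechies-4 decomposition used in the paper, and the band edges are left to the caller.

```python
import numpy as np
from scipy.signal import butter, lfilter

def teager_energy(x):
    """Discrete Teager Energy Operator: psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

def band_teo_features(x, fs, bands):
    """Mean TEO energy per band; `bands` is a list of (low_hz, high_hz) pairs.
    The Butterworth filterbank here is a stand-in, not the paper's filterbank."""
    feats = []
    for lo, hi in bands:
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        y = lfilter(b, a, x)
        feats.append(np.mean(teager_energy(y)))
    return np.array(feats)
```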

  37. Understanding the Structure of Spoken Language for Affect Modeling: intonation (F0), tempo, syllables, rhythmicality, …; breaths, pauses, and other extralinguistic markers
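
As a sketch of one such cue, here is a toy autocorrelation-based F0 (pitch) estimate for a single voiced frame; it is a generic estimator for illustration, not the analysis used in this work.

```python
import numpy as np

def estimate_f0(frame, fs, fmin=75.0, fmax=400.0):
    """Toy autocorrelation pitch estimator for one voiced frame of speech."""
    frame = np.asarray(frame, dtype=float)
    frame = frame - frame.mean()
    # One-sided autocorrelation.
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    # Search lags corresponding to plausible pitch periods.
    lag_min, lag_max = int(fs / fmax), int(fs / fmin)
    lag = lag_min + int(np.argmax(ac[lag_min:lag_max]))
    return fs / lag
```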

  38. (Recap) Emotions give rise to changes that can be sensed. Distance sensing: face, voice, posture, gestures, movement, behavior. Up-close sensing: skin conductivity, pupillary dilation, respiration, heart rate, pulse, temperature, blood pressure. Internal sensing: hormones, neurotransmitters, …

  39. Conclusions & Challenges • Steady progress with sensors and pattern recognition • Put the desires of the user first: more visible vs. less visible signals; non-tethered, wearable, portable; psychological comfort; cognitive load/interruptions • Still to come: combining with additional context sensing and cognitive reasoning

  40. Papers and projects/details: http://www.media.mit.edu/affect and http://www.media.mit.edu/~yuanqi • Machines that “have emotion” • Emotion and consciousness • Concerns • Applications • How to sense, recognize, build • Modeling emotion • Affective wearables
