Presentation Transcript


  1. Michael Arbib: CS564 - Brain Theory and Artificial Intelligence, University of Southern California, Fall 2001
  • Lecture 10: The Mirror Neuron System Model (MNS) 1
  • Reading Assignment: Schema Design and Implementation of the Grasp-Related Mirror Neuron System, Erhan Oztop and Michael A. Arbib

  2. Visual Control of Grasping in Macaque Monkey
  A key theme of visuomotor coordination: parietal affordances (AIP) drive frontal motor schemas (F5).
  • AIP - grasp affordances in parietal cortex (Hideo Sakata)
  • F5 - grasp commands in premotor cortex (Giacomo Rizzolatti)

  3. Mirror Neurons
  Rizzolatti, Fadiga, Gallese, and Fogassi, 1995: Premotor cortex and the recognition of motor actions.
  Mirror neurons form the subset of grasp-related premotor neurons of F5 which discharge when the monkey observes meaningful hand movements made by the experimenter or another monkey.
  F5 is endowed with an observation/execution matching system.

  4. F5 Motor Neurons
  • F5 Motor Neurons include all F5 neurons whose firing is related to motor activity.
  • We focus on grasp-related behavior. Other F5 motor neurons are related to oro-facial movements.
  • F5 Mirror Neurons form the subset of grasp-related F5 motor neurons which discharge when the monkey observes meaningful hand movements.
  • F5 Canonical Neurons form the subset of grasp-related F5 motor neurons which fire when the monkey sees an object with related affordances.

  5. What is the mirror system (for grasping) for?
  Mirror neurons: the cells that selectively discharge when the monkey executes particular actions as well as when the monkey observes another individual executing the same action.
  Mirror neuron system (MNS): the mirror neurons and the brain regions involved in eliciting mirror behavior.
  Interpretations:
  • Action recognition
  • Understanding (assigning meaning to others' actions)
  • Associative memory for actions

  6. Computing the Mirror System Response
  • The FARS Model: recognize object affordances and determine the appropriate grasp.
  • The Mirror Neuron System (MNS) Model: we must add recognition of trajectory and hand preshape to recognition of object affordances, and ensure that all three are congruent.
  • There are parietal systems other than AIP adapted to this task.

  7. Further Brain Regions Involved
  • cIPS (caudal intraparietal sulcus): axis and surface orientation
  • 7a (PG, caudal part of the posterior parietal lobule): spatial coding for objects; analysis of motion during interaction of objects and self-motion
  • STS (Superior Temporal Sulcus): detection of biologically meaningful stimuli (e.g. hand actions); motion-related activity (MT/MST part)
  • 7b (PF, rostral part of the posterior parietal lobule): mainly somatosensory; mirror-like responses

  8. [Figure: cell responses showing the surface orientation selectivity of a cIPS cell (Sakata et al. 1997)]

  9. Key Criteria for Mirror Neuron Activation When Observing a Grasp
  • a) Does the preshape of the hand correspond to the grasp encoded by the mirror neuron?
  • b) Does this preshape match an affordance of the target object?
  • c) Do samples of the hand state indicate a trajectory that will bring the hand to grasp the object?
  Modeling Challenges:
  • i) To have mirror neurons self-organize to learn to recognize grasps in the monkey's motor repertoire
  • ii) To learn to activate mirror neurons from smaller and smaller samples of a trajectory

  10. Initial Hypothesis on Mirror Neuron Development
  The development of the (grasp) mirror neuron system in a healthy infant is driven by the visual stimuli generated by the actions (grasps) performed by the infant himself.
  The infant (with maturation of visual acuity) gains the ability to map other individuals' actions into his internal motor representation. [In the MNS model, the hand state provides the key representation for this transfer.]
  Then the infant acquires the ability to create (internal) representations for novel actions observed.
  Parallel to these achievements, the infant develops an action prediction capability (the recognition of an action given the prefix of the action and the target object).

  11. The Mirror Neuron System (MNS) Model

  12. Implementing the Basic Schemas of the Mirror Neuron System (MNS) Model using Artificial Neural Networks (Work of Erhan Oztop)

  13. Opposition Spaces and Virtual Fingers The goal of a successful preshape, reach and grasp is to match the opposition axis defined by the virtual fingers of the hand with the opposition axis defined by an affordance of the object (Iberall and Arbib 1990)
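
As an illustration of this axis-matching criterion (not taken from the slides), the sketch below scores how well the hand's opposition axis, approximated here by the thumb-tip to index-tip vector, aligns with a hypothetical object affordance axis:

    import numpy as np

    def opposition_alignment(thumb_tip, index_tip, affordance_axis):
        """Alignment between the hand's opposition axis (approximated by the
        thumb-tip to index-tip vector of the virtual fingers) and the opposition
        axis defined by an object affordance. Returns 1.0 for perfectly matched
        axes, 0.0 for perpendicular ones. Inputs are 3-D vectors."""
        hand_axis = np.asarray(index_tip, float) - np.asarray(thumb_tip, float)
        hand_axis /= np.linalg.norm(hand_axis)
        obj_axis = np.asarray(affordance_axis, float)
        obj_axis /= np.linalg.norm(obj_axis)
        # Absolute value: the two axes match regardless of which way they point.
        return abs(float(np.dot(hand_axis, obj_axis)))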

  14. Hand State
  • Our current representation of hand state defines a 7-dimensional trajectory F(t) with the components F(t) = (d(t), v(t), a(t), o1(t), o2(t), o3(t), o4(t)):
  • d(t): distance to target at time t
  • v(t): tangential velocity of the wrist
  • a(t): aperture of the virtual fingers involved in grasping at time t
  • o1(t): angle between the object axis and the (index finger tip - thumb tip) vector [relevant for pad and palm oppositions]
  • o2(t): angle between the object axis and the (index finger knuckle - thumb tip) vector [relevant for side oppositions]
  • o3(t), o4(t): the two angles defining how close the thumb is to the hand, measured relative to the side of the hand and to the inner surface of the palm
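
These components can be made concrete with a short sketch (mine, not the lecture's): one hand-state sample is assembled from hypothetical 3-D marker positions, with the thumb angles o3 and o4 assumed to come from a separate hand-kinematics module.

    import numpy as np

    def hand_state_sample(wrist, wrist_prev, dt, thumb_tip, index_tip,
                          index_knuckle, target_center, object_axis, o3, o4):
        """Assemble one sample of the 7-dimensional hand state F(t).
        Positions are 3-D numpy vectors; object_axis is the object's opposition
        axis; o3, o4 are thumb angles supplied by a hand-kinematics module."""
        def angle(u, v):
            u = u / np.linalg.norm(u)
            v = v / np.linalg.norm(v)
            return float(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

        d = float(np.linalg.norm(target_center - wrist))    # d(t): distance to target
        v = float(np.linalg.norm(wrist - wrist_prev)) / dt  # v(t): tangential wrist velocity
        a = float(np.linalg.norm(index_tip - thumb_tip))    # a(t): virtual-finger aperture
        o1 = angle(object_axis, index_tip - thumb_tip)      # o1(t): pad/palm opposition angle
        o2 = angle(object_axis, index_knuckle - thumb_tip)  # o2(t): side opposition angle
        return np.array([d, v, a, o1, o2, o3, o4])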

  15. Curve Recognition
  The general problem: associate N-dimensional space curves with object affordances.
  A special case: the recognition of two (or three) dimensional trajectories in physical space.
  Simplest solution: map temporal information into the spatial domain, then apply known pattern recognition techniques.
  Problem with the simplest solution: the speed of the moving point can be a problem! The spatial representation may change drastically with speed. Scaling can overcome the problem; however, the scaling must preserve the generalization ability of the pattern recognition engine.
  Solution: fit a cubic spline to the sampled values, then normalize and re-sample from the spline curve.
  Result: very good generalization - better performance than using Fourier coefficients to recognize curves.
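
A minimal sketch of that recipe in Python, assuming the trajectory arrives as an array of sampled points (the function name and tolerance are mine):

    import numpy as np
    from scipy.interpolate import CubicSpline

    def resample_trajectory(samples, n_points=30):
        """Speed-invariant representation of an N-dimensional space curve:
        fit a cubic spline to the raw samples, reparameterize by normalized
        arc length, and re-sample a fixed number of points, so the recognizer
        sees the curve's shape rather than the speed at which it was traced."""
        samples = np.asarray(samples, dtype=float)            # shape (T, N)
        seglen = np.linalg.norm(np.diff(samples, axis=0), axis=1)
        keep = np.concatenate([[True], seglen > 1e-12])       # drop repeated samples
        samples = samples[keep]
        s = np.concatenate([[0.0], np.cumsum(seglen[seglen > 1e-12])])
        s /= s[-1]                                            # normalize arc length to [0, 1]
        spline = CubicSpline(s, samples, axis=0)
        return spline(np.linspace(0.0, 1.0, n_points))        # shape (n_points, N)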

  16. Curve Recognition
  Curve recognition system demonstrated for hand-drawn numeral recognition (successful recognition examples for 2, 8 and 3).
  • Spatial resolution: 30
  • Network input size: 30
  • Hidden layer size: 15
  • Output size: 5
  • Training: back-propagation with momentum and adaptive learning rate
  [Figure legend: sampled points; points used for spline interpolation; fitted spline]
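
For concreteness, a rough modern stand-in (mine, not the original implementation) with the same layer sizes and a momentum/adaptive-learning-rate training regime could look like this:

    from sklearn.neural_network import MLPClassifier

    # Sketch only: 30 inputs (the resampled curve), 15 hidden units, 5 output
    # classes, trained by gradient descent with momentum and an adaptive
    # learning rate. X (shape [n_samples, 30]) and y (digit labels) are assumed.
    net = MLPClassifier(hidden_layer_sizes=(15,),
                        solver='sgd',
                        momentum=0.9,
                        learning_rate='adaptive',
                        learning_rate_init=0.1,
                        max_iter=2000)
    # net.fit(X, y)
    # predictions = net.predict(X_new)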

  17. STS hand shape recognition
  [Figure: pipeline from the color-coded hand image through feature extraction, 3D model matching, and hand configuration classification (e.g. precision grasp)]
  Step 1 of hand shape recognition: the system processes the color-coded hand image and generates a set of features to be used by the second step.
  Step 2: the feature vector generated by the first step is used to fit a 3D-kinematics model of the hand by the model matching module. The resulting hand configuration is sent to the classification module.

  18. STS hand shape recognition 1: Color Segmentation and Feature Extraction
  [Figure: preprocessing pipeline - color expert (network weights), NN-augmented segmentation system, feature output]
  Training phase: a color expert is generated by training a feed-forward network to approximate human perception of color.
  Actual processing: the hand image is fed to the augmented segmentation system; the color decision during segmentation is made by consulting the color expert.
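
A hedged sketch of the idea (the original color expert is specified only as a feed-forward network, so names and sizes here are assumptions): a small classifier maps each pixel's RGB value to a color label, and segmentation consults it pixel by pixel.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Hypothetical color expert: a small feed-forward net mapping RGB values to
    # color labels (e.g. one label per color-coded hand patch plus background).
    color_expert = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000)
    # color_expert.fit(training_pixels_rgb, training_labels)  # assumed training data; fit before use

    def segment_hand_image(image):
        """Consult the color expert for every pixel of an HxWx3 image and return
        an HxW map of color labels, from which hand features can be extracted."""
        h, w, _ = image.shape
        labels = color_expert.predict(image.reshape(-1, 3).astype(float))
        return labels.reshape(h, w)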

  19. STS hand shape recognition 2: 3D Hand Model Matching
  [Figure: feature vector -> error minimization -> grasp type classification; a realistic drawing of hand bones]
  The model matching algorithm minimizes the error between the extracted features and the model hand. The hand is modelled with 14 degrees of freedom as illustrated.
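
The error-minimization step can be sketched as a search over the 14 joint parameters, with a hypothetical forward-kinematics function standing in for the hand model:

    import numpy as np
    from scipy.optimize import minimize

    def match_hand_model(observed_features, forward_model, x0=None):
        """Fit the 14-degree-of-freedom hand model to the extracted features.
        `forward_model` is an assumed function mapping the 14 joint parameters
        to the same kind of feature vector produced by feature extraction; we
        search for the configuration whose predicted features match best."""
        if x0 is None:
            x0 = np.zeros(14)                        # start from a neutral hand pose
        def feature_error(params):
            diff = forward_model(params) - observed_features
            return float(np.sum(diff ** 2))          # squared feature error
        result = minimize(feature_error, x0, method='Nelder-Mead')
        return result.x                              # estimated hand configuration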

  20. Virtual Hand/Arm and Reach/Grasp Simulator
  [Figures: a precision pinch; a power grasp; a side grasp]

  21. Power grasp time series data
  [Figure legend: +: aperture; *: angle 1; x: angle 2; axisdisp1; axisdisp2; speed; distance]

  22. [Schema diagram of the MNS model: object affordances; motor program; object affordance - hand state association; integrate temporal association; F5 canonical; hand shape recognition & hand motion detection; action recognition (mirror neurons); mirror feedback; motor execution; hand-object spatial relation analysis; F5 mirror]
  [Core Mirror Circuit: object affordance; hand state; motor program (F5 canonical); mirror feedback; association (7b) neurons; mirror neuron output (F5 mirror)]
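
A minimal sketch of one way such a core circuit could be wired, under the assumption that it is a feed-forward network whose hidden (7b association) layer combines the hand-state trajectory with the object affordance and whose output units stand for grasp types; the weight matrices would have to be learned (for example during self-observed grasping, in the spirit of the developmental hypothesis above) and are simply assumed here:

    import numpy as np

    def core_mirror_circuit(hand_state_traj, affordance, W_assoc, W_mirror):
        """Forward pass of a hypothetical core mirror circuit: the (resampled)
        hand-state trajectory and the object-affordance vector feed an
        association (7b-like) hidden layer, whose activity drives mirror units
        (F5 mirror), one per grasp type. Weight matrices are assumed learned."""
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
        x = np.concatenate([np.ravel(hand_state_traj), np.asarray(affordance, float)])
        assoc = sigmoid(W_assoc @ x)          # association (7b) layer activity
        return sigmoid(W_mirror @ assoc)      # mirror-neuron (F5 mirror) activity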

  23. Connectivity pattern
  [Diagram: connectivity among object affordance (AIP), STS, 7a, 7b, F5 mirror, motor program (F5 canonical), and mirror feedback]

  24. [Figure: a single grasp trajectory viewed from three different angles, and how the network classifies the action as a power grasp]
  Empty squares: power grasp output; filled squares: precision grasp output; crosses: side grasp output.
  The wrist trajectory during the grasp is shown by square traces, with the distance between any two consecutive trace marks traveled in equal time intervals.
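
As an illustration of how such a classification unfolds along the trajectory (not a quote of the model's code), the sketch below feeds ever longer prefixes of the observed hand-state samples through a resampler and classifier and collects the grasp-type output at each step:

    import numpy as np

    def mirror_response_over_time(hand_state_samples, affordance, resample, classify):
        """Classify growing prefixes of an observed hand-state trajectory.
        `resample` maps a variable-length prefix to a fixed-length input (e.g.
        the spline resampler sketched earlier) and `classify` returns grasp-type
        activations; both are assumed callables. Returns one activation row per
        time step, i.e. the time course of the mirror response."""
        outputs = []
        for t in range(4, len(hand_state_samples) + 1):   # need a few samples first
            x = resample(hand_state_samples[:t])
            outputs.append(classify(x, affordance))
        return np.array(outputs)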

  25. Power and precision grasp resolution
  [Figure: (a) precision pinch mirror neuron; (b) power grasp mirror neuron]
  Note that the modeling yields novel predictions for the time course of activity across a population of mirror neurons.

  26. Research Plan
  • Development of the Mirror System
  • Development of Grasp Specificity in F5 Motor and Canonical Neurons
  • Visual Feedback for Grasping: A Possible Precursor of the Mirror Property
  • Recognition of Novel and Compound Actions and their Context
  • The Pliers Experiment: Extending the Visual Vocabulary
  • Recognition of Compounds of Known Movements
  • From Action Recognition to Understanding: Context and Expectation
