Background
• Neural organization of speech perception is difficult to characterize
• 1st hypothesis: speech perception is based on auditory cortex (auditory language comprehension disorders follow left STG lesions)
• Challenged by 2 discoveries
• Deficits in speech sound perception contribute minimally to the auditory comprehension deficit in Wernicke’s aphasia
• Destruction of the left STG leads to deficits in speech production, not auditory comprehension
• Deficits in syllable discrimination follow frontoparietal damage
• Syllable discrimination and word comprehension doubly dissociate (each can be impaired without the other)
Paradox and Article Goal
• First imaging studies:
  • Passive listening highlights superior temporal (ST) regions bilaterally
  • Syllable discrimination activates the left STG and left inferior frontal regions
• Paradox remains: damage to these left regions causes mainly deficits in speech production, not auditory comprehension
• In this article we will:
  • Describe and extend a dual-stream model of speech processing that resolves the paradox
  • Outline its central components
  • Discuss the model’s assumptions and relevant evidence
Task Dependence and Definitions
• Neural organization of speech processing is task dependent => must define the task of interest
• Speech perception and speech recognition doubly dissociate (can have one without the other)
• Speech processing = any task involving aurally presented speech
• Speech perception = sublexical tasks (e.g. syllable discrimination)
• Speech recognition = mapping of acoustic signals onto the mental lexicon
• Not necessarily a single route to recognition
• Perception tasks also involve executive control and working memory
Ventral stream: sound to meaning
• [Dorsal stream: sound to action]
• Parallel computations and bilateral organization
• Multi-time-resolution processing
• STS crucial to phonological-level processing
• Lexical, semantic and grammatical linkages
Dorsal stream: sound to action
• Less agreement on the role of the auditory dorsal stream
• Spatial hearing, “where” function
• Interface with the motor system
• Need for auditory-motor integration
  • Learning to speak is a motor task
  • Neural mechanism for mapping sounds onto speech gestures
  • Generation of sensory representations of new words
  • Continues to function in adults
• Evidence for a sensorimotor dorsal stream
  • Left-hemisphere lesions (conduction aphasia: phonemic errors with intact comprehension)
  • Functional imaging (next figure)
Summary and future perspectives
• The acoustic speech network must interface with
  • Conceptual systems
  • Motor-articulatory systems
• This basic concept accounts for an array of fundamental observations
• It also fits with similar proposals in the visual and somatosensory domains
• Might reflect a more general principle of sensory-system organization
• Future work
  • Specify within-stream details
  • Empirical work to test hypotheses
  • How neural models map onto linguistic and psycholinguistic models