Embodied Construction Grammar (ECG): Formalizing Cognitive Linguistics
• Community Grammar and Core Concepts
• Deep Grammatical Analysis
• Computational Implementation
• Test Grammars
• Applied Projects – Question Answering
• Map to Connectionist Models, Brain
• Models of Grammar Acquisition
Simulation specification
• The analysis process produces a simulation specification that
  • includes image-schematic, motor control, and conceptual structures
  • provides parameters for a mental simulation
Summary: ECG
• Linguistic constructions are tied to a model of simulated action and perception
• Embedded in a theory of language processing
  • Constrains the theory to be usable
  • Basis for models of grammar learning
• Precise, computationally usable formalism
  • Practical computational applications, such as machine translation (MT) and natural language understanding (NLU)
  • Testing of functionality, e.g. language learning
• A shared theory and formalism for different cognitive mechanisms
  • Constructions, metaphor, mental spaces, etc.
  • Reduction to connectionist and neural levels
Constrained Best Fit in Nature
[Figure: constrained best-fit processes across levels of nature – inanimate, animate, and society/politics (framing, compromise)]
Competition-based analyzer (Johno Bryant)
An analysis is made up of:
• A constructional tree
• A semantic specification
• A set of resolutions
[Diagram: analysis of "Bill gave Mary the book" – the A-GIVE-B-X construction with constituents subj (Ref-Exp: Bill), v (Give: gave), obj1 (Ref-Exp: Mary), obj2 (Ref-Exp: the book); the semantic specification is a Give-Action with giver = @Man (Bill), recipient = @Woman (Mary), theme = @Book (book01)]
Combined score determines best fit
• Syntactic fit: constituency relations, combined with preferences on non-local elements; conditioned on syntactic context
• Antecedent fit: ability to find referents in the context; conditioned on syntax match and feature agreement
• Semantic fit: semantic bindings for frame roles; the fillers of frame roles are scored
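A minimal sketch of how the three fit scores might combine into one best-fit score. The function name, candidate labels, and numeric scores are illustrative assumptions, not the analyzer's actual code; the point is only that the analysis with the highest combined score wins.

```python
# Hypothetical sketch: combine syntactic, antecedent, and semantic fit
# into a single score and keep the best-scoring analysis.
# All names and numbers below are illustrative, not from the ECG analyzer.

def combined_score(syntactic_fit, antecedent_fit, semantic_fit):
    """Treat the three fits as independent probabilities and multiply."""
    return syntactic_fit * antecedent_fit * semantic_fit

# Two made-up competing analyses of the same utterance
candidates = {
    "analysis-A": combined_score(0.08, 0.9, 0.7),
    "analysis-B": combined_score(0.03, 0.9, 0.4),
}

# The competition-based analyzer keeps the highest-scoring analysis
best = max(candidates, key=candidates.get)
```

Multiplying the scores means a very poor fit on any one dimension (syntax, reference, or semantics) can veto an otherwise plausible analysis.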
Example analysis: "Eve walked into the house"
Constructs (with spans):
  NPVP[0] (0,5)
    Eve[3] (0,1)
    ActiveSelfMotionPath[2] (1,5)
      WalkedVerb[57] (1,2)
      SpatialPP[56] (2,5)
        Into[174] (2,3)
        DetNoun[173] (3,5)
          The[204] (3,4)
          House[205] (4,5)
Schema instances:
  SelfMotionPathEvent[1], HouseSchema[66], WalkAction[60], Person[4], SPG[58], RD[177] ~ house, RD[5] ~ Eve
Span indices: 0 Eve 1 walked 2 into 3 the 4 house 5
Unification chains and their fillers:
• SelfMotionPathEvent[1].mover = SPG[58].trajector = WalkAction[60].walker = RD[5].resolved-ref = RD[5].category → Filler: Person[4]
• SpatialPP[56].m = Into[174].m = SelfMotionPathEvent[1].spg → Filler: SPG[58]
• SelfMotionPathEvent[1].landmark = House[205].m = RD[177].category = SPG[58].landmark → Filler: HouseSchema[66]
• WalkedVerb[57].m = WalkAction[60].routine = WalkAction[60].gait = SelfMotionPathEvent[1].motion → Filler: WalkAction[60]
Productive Argument Omission in Mandarin – Johno Bryant & Eva Mok
1. Mother (I) give you this (a toy).
2. You give auntie [the peach].
3. Oh (go on)! You give [auntie] [that].
4. [I] give [you] [some peach].
CHILDES Beijing Corpus (Tardif, 1993; Tardif, 1996)
Arguments are omitted with different probabilities
• All arguments omitted: 30.6%
• No arguments omitted: 6.1%
Analyzing "ni3 gei3 yi2" (You give auntie)
Two of the competing analyses, scored by syntactic fit:
• P(theme omitted | ditransitive cxn) = 0.65
• P(recipient omitted | ditransitive cxn) = 0.42
• P(giver omitted | ditransitive cxn) = 0.78
Analysis 1 (yi2 = recipient, theme omitted): (1 − 0.78) × (1 − 0.42) × 0.65 ≈ 0.08
Analysis 2 (yi2 = theme, recipient omitted): (1 − 0.78) × (1 − 0.65) × 0.42 ≈ 0.03
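The slide's arithmetic can be reproduced directly. The probabilities are the ones shown on the slides; the role labels and function name are my own, and the real analyzer combines this syntactic-fit score with antecedent and semantic fit.

```python
# Per-role omission probabilities for the ditransitive construction,
# as given on the slides. Role labels are illustrative.
P_OMIT = {"giver": 0.78, "recipient": 0.42, "theme": 0.65}

def syntactic_fit(omitted):
    """Score one omission pattern: each omitted role contributes
    P(omit), each overtly expressed role contributes 1 - P(omit)."""
    score = 1.0
    for role, p in P_OMIT.items():
        score *= p if role in omitted else 1.0 - p
    return score

# Analysis 1: "yi2" is the recipient, so only the theme is omitted
a1 = syntactic_fit({"theme"})        # (1-0.78) * (1-0.42) * 0.65
# Analysis 2: "yi2" is the theme, so the recipient is omitted
a2 = syntactic_fit({"recipient"})    # (1-0.78) * (1-0.65) * 0.42
```

Analysis 1 scores roughly 0.08 against roughly 0.03 for Analysis 2, so syntactic fit alone already prefers reading "yi2" as the recipient.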
Using frame and lexical information to restrict type of reference
Discourse & Situational Context
• Entities in context: child, mother, peach, auntie, table
• Antecedent fit: can the omitted argument be recovered from context?
• Semantic fit: how good a theme is a peach? How good is an aunt?
The argument omission patterns shown earlier can be covered with just ONE construction
• Each construction is annotated with probabilities of omission
• A language-specific default probability can be set
• P(omitted | cxn): giver 0.78, recipient 0.42, theme 0.65
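A single construction annotated with per-role omission probabilities induces a distribution over all 2³ surface patterns, which is what lets one construction cover every pattern. A small sketch under that assumption (illustrative names, not the project's code):

```python
from itertools import product

# Per-role omission probabilities from the slides (role labels are mine)
P_OMIT = {"giver": 0.78, "recipient": 0.42, "theme": 0.65}

roles = list(P_OMIT)
probs = {}
# Enumerate every subset of omitted roles: 2^3 = 8 surface patterns
for mask in product([True, False], repeat=len(roles)):
    omitted = frozenset(r for r, m in zip(roles, mask) if m)
    p = 1.0
    for r in roles:
        p *= P_OMIT[r] if r in omitted else 1.0 - P_OMIT[r]
    probs[omitted] = p
```

The eight pattern probabilities sum to 1, so the annotated construction is a proper distribution over omission patterns rather than a list of separate rules.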
Leverage the process to simplify representation
• The processing model is complementary to the theory of grammar
• By using a competition-based analysis process, we can:
  • Find the best-fit analysis with respect to constituency structure, context, and semantics
  • Eliminate the need to enumerate allowable patterns of argument omission in the grammar
• This is currently being applied in models of language understanding and grammar learning
Modeling context for language understanding and learning
• Linguistic structure reflects experiential structure
  • Discourse participants and entities
  • Embodied schemas: action, perception, emotion, attention, perspective
  • Semantic and pragmatic relations: spatial, social, ontological, causal
• 'Contextual bootstrapping' for grammar learning
Discourse & Situational Context
The context model tracks accessible entities, events, and utterances:
  Discourse01
    participants: Eve, Mother
    objects: Hands, ...
    discourse-history: DS01
    situational-history: Wash-Action
Each of the items in the context model has rich internal structure:
  Participants:
    Eve – category: child, gender: female, name: Eve, age: 2
    Mother – category: parent, gender: female, age: 33
  Objects:
    Hands – category: BodyPart, part-of: Eve, number: plural, accessibility: accessible
  Situational history:
    Wash-Action – washer: Eve, washee: Hands
  Discourse history:
    DS01 – speaker: Mother, addressee: Eve, attentional-focus: Hands, content: "are they clean yet?", speech-act: question
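The context-model items above could be sketched as plain data structures. This is an illustrative rendering only; the class names and fields are assumptions for exposition, not the project's actual representation.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A context-model item: a typed entity with open-ended attributes."""
    category: str
    attrs: dict = field(default_factory=dict)

@dataclass
class DiscourseSegment:
    """One utterance event, linking participants, focus, and speech act."""
    speaker: Entity
    addressee: Entity
    attentional_focus: Entity
    content: str
    speech_act: str

# Populate the model with the items from the slide
eve = Entity("child", {"gender": "female", "name": "Eve", "age": 2})
mother = Entity("parent", {"gender": "female", "age": 33})
hands = Entity("BodyPart", {"part_of": "Eve", "number": "plural",
                            "accessibility": "accessible"})
ds01 = DiscourseSegment(mother, eve, hands,
                        "are they clean yet?", "question")
```

Keeping participants, objects, and discourse segments as linked records is what lets later analysis steps resolve pronouns like "them" against the attentional focus.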
Analysis produces a semantic specification
[Diagram: the utterance "You washed them", together with Linguistic Knowledge, World Knowledge, and the Discourse & Situational Context, is fed into Analysis, which outputs a Semantic Specification: WASH-ACTION with washer: Eve, washee: Hands]
How Can Children Be So Good At Learning Language?
• Gold's Theorem: no superfinite class of languages is identifiable in the limit from positive data only
• Principles & Parameters: babies are born as blank slates but acquire language quickly (with noisy input and little correction) → language must be innate: Universal Grammar + parameter setting
• But babies aren't born as blank slates, and they do not learn language in a vacuum!
Key ideas for a Neural Theory of language acquisition (Nancy Chang and Eva Mok)
• Opulence of the Substrate: prelinguistic children already have rich sensorimotor representations and sophisticated social knowledge
• Basic Scenes: simple clause constructions are associated directly with scenes basic to human experience (Goldberg 1995, Slobin 1985)
• Verb Island Hypothesis: children learn their earliest constructions (arguments, syntactic marking) on a verb-specific basis (Tomasello 1992)
Embodiment and Grammar Learning
• A paradigm problem for Nature vs. Nurture
  • The poverty of the stimulus
  • The opulence of the substrate
• Intricate interplay of genetic and environmental (including social) factors
Two perspectives on grammar learning
Computational models:
• Grammatical induction: language identification; context-free grammars, unification grammars; statistical NLP (parsing, etc.)
• Word learning models: semantic representations, logical forms; discrete and continuous representations; statistical models
Developmental evidence:
• Prior knowledge: primitive concepts, event-based knowledge, social cognition, lexical items
• Data-driven learning: basic scenes, lexically specific patterns, usage-based learning
Key assumptions for language acquisition
• Significant prior conceptual/embodied knowledge: a rich sensorimotor/social substrate
• Incremental learning based on experience: lexically specific constructions are learned first
• Language learning tied to language use: acquisition interacts with comprehension and production, and reflects communication and experience in the world
• Statistical properties of the data affect learning
Analysis draws on constructions and context
[Diagram: form–meaning analysis of "you washed them" – on the form side, "you" before "washed" before "them"; on the meaning side, "you" maps to the Addressee (Eve in context), "washed" to a Wash-Action, and "them" to a ContextElement (Hands, the attentional focus of the current Discourse Segment); the washer and washee roles are bound through context]
Learning updates linguistic knowledge based on input utterances
[Diagram: Utterance + Linguistic Knowledge + World Knowledge + Discourse & Situational Context → Analysis → Partial SemSpec → Learning → updated Linguistic Knowledge]
Context aids understanding: incomplete grammars yield a partial SemSpec
[Diagram: the same "you washed them" form–meaning layout, but an incomplete grammar produces only partial bindings; the context model (Addressee Eve, Wash-Action, attentional focus Hands) supplies what the analysis leaves open]
Context bootstraps learning: a new construction maps form to meaning
[Diagram: the form-side ordering ("you" before "washed" before "them") is paired with the context-derived meaning (a Wash-Action with washer = Addressee, washee = ContextElement/Hands), yielding a candidate form–meaning mapping]
Context bootstraps learning: the new construction maps form to meaning
  construction YOU-WASHED-THEM
    constituents: YOU, WASHED, THEM
    form: YOU before WASHED; WASHED before THEM
    meaning: WASH-ACTION
      washer: addressee
      washee: ContextElement
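For concreteness, the learned construction can be rendered as a simple data structure. This is an illustrative sketch in which the field names follow the slide, not the ECG implementation's actual format.

```python
# Hypothetical rendering of the learned YOU-WASHED-THEM construction.
# Keys and values mirror the slide; the dict layout itself is mine.
you_washed_them = {
    "name": "YOU-WASHED-THEM",
    "constituents": ["YOU", "WASHED", "THEM"],
    # Form pole: word-order constraints between constituents
    "form": [("YOU", "before", "WASHED"),
             ("WASHED", "before", "THEM")],
    # Meaning pole: an evoked schema with role bindings learned
    # from context ("washer" resolved to the discourse addressee,
    # "washee" to an element of the situational context)
    "meaning": {
        "schema": "WASH-ACTION",
        "washer": "addressee",
        "washee": "ContextElement",
    },
}
```

The key point is that both poles are learned together: the word order comes from the utterance, while the role bindings come from the context model rather than from the words alone.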
Grammar learning: suggesting new constructions and reorganizing existing ones
[Diagram: the comprehension pipeline (Utterance + Linguistic Knowledge + World Knowledge + Discourse & Situational Context → Analysis → Partial SemSpec) feeds learning operations on the grammar: hypothesize (map form to meaning, learn contextual constraints), reorganize (merge, join, split), and reinforcement]
Challenge: How far up to generalize?
[Type hierarchy: Inanimate Object splits into Manipulable Objects and Unmovable Objects; Manipulable Objects includes Food, with Fruit (apple, watermelon) and Savory (rice); Unmovable Objects includes Furniture (Chair, Sofa)]
• Attested: "Eat rice", "Eat apple", "Eat watermelon" → generalize the object of "eat" to Food?
• Attested: "Want rice", "Want apple", "Want chair" → generalize the object of "want" all the way up to Inanimate Object?
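One way to frame the generalization step is finding the most specific type that covers all observed fillers of a slot. A sketch over the slide's hierarchy, slightly flattened; the function and type names are mine, and real learners weigh this against overgeneralization risk.

```python
# Child-to-parent links following the slide's type hierarchy
# (illustrative; Chair/Sofa collapsed to leaf items for brevity)
PARENT = {
    "apple": "Fruit", "watermelon": "Fruit", "rice": "Savory",
    "chair": "Furniture", "sofa": "Furniture",
    "Fruit": "Food", "Savory": "Food",
    "Food": "ManipulableObject", "Furniture": "UnmovableObject",
    "ManipulableObject": "InanimateObject",
    "UnmovableObject": "InanimateObject",
}

def ancestors(t):
    """Chain from a type up to the hierarchy root, inclusive."""
    chain = [t]
    while t in PARENT:
        t = PARENT[t]
        chain.append(t)
    return chain

def lowest_common_supertype(fillers):
    """Most specific type covering every observed filler."""
    common = set(ancestors(fillers[0]))
    for f in fillers[1:]:
        common &= set(ancestors(f))
    # deepest shared ancestor = longest chain back to the root
    return max(common, key=lambda t: len(ancestors(t))) if common else None
```

On the slide's data, the objects of "eat" generalize to Food, while the objects of "want" force the learner all the way up to Inanimate Object, which is exactly the "how far up" dilemma.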
Challenge: Omissible constituents
• In Mandarin, almost anything available in context can be omitted – and often is in child-directed speech
• Intuition: same context, two expressions that differ by one constituent → a general construction with that constituent marked omissible
• May require verbatim memory traces of utterances plus "relevant" context
When does the learning stop?
• Bayesian learning framework: seek the most likely grammar given the utterances and context
  [Loop: hypothesize → Schemas + Constructions → Analysis + Resolution → SemSpec → Context Fitting → reinforcement and reorganization]
• The grammar prior includes a preference for the "kind" of grammar
• In practice, take the log and minimize cost: Minimum Description Length (MDL)
Intuition for MDL
• Suppose the prior is inversely proportional to the size of the grammar (e.g. the number of rules)
• With only one noun, it's not worthwhile to make the generalization:
  Enumerated: S -> Give me NP; NP -> the book; NP -> a book
  Generalized: S -> Give me NP; NP -> DET book; DET -> the; DET -> a
Intuition for MDL (continued)
• With more nouns, the generalization pays off:
  Enumerated: S -> Give me NP; NP -> the book; NP -> a book; NP -> the pen; NP -> a pen; NP -> the pencil; NP -> a pencil; NP -> the marker; NP -> a marker
  Generalized: S -> Give me NP; NP -> DET N; DET -> the; DET -> a; N -> book; N -> pen; N -> pencil; N -> marker
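The trade-off can be made concrete by counting symbols: under a prior that penalizes grammar size, the DET generalization loses with one noun but wins with four. Symbol-counting is a crude stand-in for real description length, and the helper below is an illustrative sketch, not the learner's actual cost function.

```python
def grammar_size(rules):
    """Crude size measure: total symbol count across all rules
    (the left-hand side counts as one symbol)."""
    return sum(1 + len(rhs.split()) for _, rhs in rules)

# One noun: enumerating vs. generalizing (from the slides)
small_enum = [("S", "Give me NP"), ("NP", "the book"), ("NP", "a book")]
small_gen  = [("S", "Give me NP"), ("NP", "DET book"),
              ("DET", "the"), ("DET", "a")]

# Four nouns: the same comparison
big_enum = [("S", "Give me NP")] + [
    ("NP", f"{d} {n}") for d in ("the", "a")
    for n in ("book", "pen", "pencil", "marker")]
big_gen = [("S", "Give me NP"), ("NP", "DET N"),
           ("DET", "the"), ("DET", "a")] + [
    ("N", n) for n in ("book", "pen", "pencil", "marker")]
```

With one noun the generalized grammar is actually larger (11 symbols vs. 10), so MDL rejects it; with four nouns it is smaller (19 vs. 28), so MDL accepts it. That is the sense in which MDL tells the learner when to stop generalizing.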
Usage-based learning: comprehension and production
[Diagram: a closed loop around the constructicon – comprehension (utterance + discourse & situational context + world knowledge → analyze & resolve → analysis → simulation → response) and production (communicative intent → generate → utterance); both directions reinforce constructions through usage and through correction, and feed "hypothesize constructions & reorganize"]