
2. Syntactic Analysis: Parsing



2. Parsing

• Suppose we want the previous grammar to recognize the sentence: "The large cat eats the small rat".
• Definite Clause Grammar (a runnable sketch follows below):

    sentence --> nounPhrase, verbPhrase.
    nounPhrase --> article, adjective, noun.
    verbPhrase --> verb, nounPhrase.

• Using these rules we can determine whether a sentence is legal, and obtain its structure.
• The sentence consists of:
  • Noun Phrase: "The large cat"
  • Verb Phrase: "eats the small rat"
• The verb phrase in turn consists of:
  • verb: "eats"
  • Noun Phrase: "the small rat"
• The sentence above is therefore legal.
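The slide gives only the structural rules; to actually recognize the example sentence the DCG also needs dictionary (lexicon) entries. A minimal runnable sketch, assuming a hand-written lexicon that is not shown on the slides:

    % Structural rules from the slide.
    sentence --> nounPhrase, verbPhrase.
    nounPhrase --> article, adjective, noun.
    verbPhrase --> verb, nounPhrase.

    % Assumed lexicon entries (illustration only).
    article --> [the].
    adjective --> [large].
    adjective --> [small].
    noun --> [cat].
    noun --> [rat].
    verb --> [eats].

    % ?- phrase(sentence, [the, large, cat, eats, the, small, rat]).
    % true.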

Parsing Tree

• The structure of the sentence according to the grammar can be represented as a tree (see the sketch below).
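One way to make this tree concrete is to have each DCG rule build a Prolog term describing its own subtree as it parses. A sketch of that idea (the s/np/vp term names and the lexicon are assumptions, not from the slides):

    % Each rule builds a term describing its own subtree.
    sentence(s(NP, VP))       --> nounPhrase(NP), verbPhrase(VP).
    nounPhrase(np(A, Adj, N)) --> article(A), adjective(Adj), noun(N).
    verbPhrase(vp(V, NP))     --> verb(V), nounPhrase(NP).

    article(art(the))     --> [the].
    adjective(adj(large)) --> [large].
    adjective(adj(small)) --> [small].
    noun(n(cat))          --> [cat].
    noun(n(rat))          --> [rat].
    verb(v(eats))         --> [eats].

    % ?- phrase(sentence(Tree), [the, large, cat, eats, the, small, rat]).
    % Tree = s(np(art(the), adj(large), n(cat)),
    %          vp(v(eats), np(art(the), adj(small), n(rat)))).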

• The tree structure gives you groupings of words (e.g., "the small rat").
• These are meaningful groupings: considering these words together helps in working out what the sentence means.
• The basic approach to parsing is based on rewriting.
• To parse a sentence you must be able to "rewrite" the "start" symbol (in this case sentence) into the sequence of syntactic categories corresponding to the sentence.
• You can rewrite a symbol using one of the grammar rules if it matches the LHS of a rule; you then just replace it with the symbols on the RHS, e.g.:

    sentence
    nounPhrase verbPhrase
    article adjective noun verbPhrase
    etc.

Extended Grammar

A little more on grammars:

• The example grammar will ONLY parse sentences of a very restricted form.
• What about:
  • "John jumps"
  • "The man jumps"
  • "John jumps in the pond"
• We need to add extra rules to cover some of these cases.
• Extended Grammar:

    sentence --> nounPhrase, verbPhrase.
    nounPhrase --> article, adjective, noun.
    nounPhrase --> article, noun.
    nounPhrase --> properName.
    verbPhrase --> verb, nounPhrase.
    verbPhrase --> verb.

• (Think how you might handle "in the pond"; see the sketch below.)
• The grammar now parses: "John jumps the pond."
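One plausible way to handle "in the pond", as the slide hints, is a prepositional-phrase rule attached to the verb phrase. A possible sketch (the prepPhrase and preposition symbols and the lexicon entries are assumptions, not from the slides):

    % Extended grammar rules from the slide.
    sentence --> nounPhrase, verbPhrase.
    nounPhrase --> article, adjective, noun.
    nounPhrase --> article, noun.
    nounPhrase --> properName.
    verbPhrase --> verb, nounPhrase.
    verbPhrase --> verb.

    % Hypothetical extra rules for prepositional phrases ("in the pond").
    verbPhrase --> verb, prepPhrase.
    prepPhrase --> preposition, nounPhrase.

    % Assumed lexicon entries (not given on the slide).
    article --> [the].
    adjective --> [small].
    noun --> [pond].
    noun --> [man].
    properName --> [john].
    preposition --> [in].
    verb --> [jumps].

    % ?- phrase(sentence, [john, jumps]).                 % true
    % ?- phrase(sentence, [the, man, jumps]).             % true
    % ?- phrase(sentence, [john, jumps, in, the, pond]).  % true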

• And it fails to parse ungrammatical ones like:
  • "jumps pond John the"

NL Grammars

• A good NL grammar should:
  • cover a reasonable subset of natural language;
  • avoid parsing ungrammatical sentences (or at least, ones that are viewed as not acceptable in the target application);
  • assign plausible structures to the sentence, where meaningful bits of the sentence are grouped together.
• But the role is NOT to check that a sentence is grammatical: by excluding dodgy sentences, the grammar is more likely to get the right structure of a sentence.

• Consider the following examples:
  • "John likes."                  NOT OK
  • "John jumps."                  OK
  • "John jumps in the water."     OK
  • "The small fluffy cat jumps."  OK
  • "John like the cat."           NOT OK
  • "The cats likes John."         NOT OK

Better Grammar

• Should deal with (see the sketch below):
  • Intransitive/transitive verbs. The former are ones that don't need a following noun phrase.
  • Prepositional phrases (e.g., "in the lake"): a preposition followed by a noun phrase.
  • Series of adjectives: a recursive rule can be used.
  • Subject-verb agreement: arguments can be added to grammar rules/dictionary entries.
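A sketch of how these four points could be folded into the DCG; the rule names (adjectives, intransVerb, transVerb, prepPhrase) and the lexicon are assumptions of mine rather than anything given on the slides:

    % Subject-verb agreement via a Num argument (sing/plur).
    sentence --> nounPhrase(Num), verbPhrase(Num).

    % Series of adjectives: a recursive rule (possibly empty).
    adjectives --> [].
    adjectives --> adjective, adjectives.

    nounPhrase(Num) --> article, adjectives, noun(Num).
    nounPhrase(Num) --> properName(Num).

    % Intransitive vs transitive verbs; optional prepositional phrase.
    verbPhrase(Num) --> intransVerb(Num).
    verbPhrase(Num) --> transVerb(Num), nounPhrase(_).
    verbPhrase(Num) --> intransVerb(Num), prepPhrase.
    prepPhrase --> preposition, nounPhrase(_).

    % Assumed lexicon.
    article --> [the].
    adjective --> [small].
    adjective --> [fluffy].
    noun(sing) --> [cat].
    noun(plur) --> [cats].
    noun(sing) --> [water].
    properName(sing) --> [john].
    preposition --> [in].
    intransVerb(sing) --> [jumps].
    transVerb(sing) --> [likes].
    transVerb(plur) --> [like].

    % ?- phrase(sentence, [the, small, fluffy, cat, jumps]).  % true
    % ?- phrase(sentence, [the, cats, likes, john]).          % false (agreement)
    % ?- phrase(sentence, [john, likes]).                     % false (transitive verb needs an object)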

• For example, subject-verb agreement can be handled by adding a number argument to the rules and dictionary entries:

    sentence --> np(Num), vp(Num).
    np(Num) --> art, noun(Num).
    noun(sing) --> [cat].

3. Semantics

• Syntax: uses the grammar to structure the sentence.
• Semantics: maps this to a structured representation that can be used in inference (often referred to as the sentence meaning).
• Possible representations:
  • SQL: map "Find me all the students who are taking AI3" to the relevant SQL query.
  • Predicate logic: map "John loves anyone who is tall" onto the relevant statement in predicate logic (see the sketch below).
  • Other structured representations, e.g., a "case frame":

        action:  loves
        subject: john
        object:  mary
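As a sketch of the predicate-logic option (my rendering, not from the slides), "John loves anyone who is tall" corresponds to the quantified statement "for all X, tall(X) implies loves(john, X)", which in Prolog clause form is:

    % "John loves anyone who is tall."
    loves(john, X) :- tall(X).

    % With some assumed facts:
    tall(mary).
    tall(joe).

    % ?- loves(john, Who).
    % Who = mary ;
    % Who = joe.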

• How do we get from the parsed sentence to this kind of representation?
• In general this is rather tricky, but to illustrate the idea we will show how it could be done for "John loves Mary" by adding extra arguments to a Prolog grammar.
• We want to map that sentence to:

    loves(john, mary).

• We will cheat by assuming that the functor of a Prolog structured object can be a variable:

    Verb(Subject, Object)

Grammar with Semantics

    sentence(Verb(Subject, Object)) --> nounPhrase(Subject), verbPhrase(Verb, Object).
    nounPhrase(Subject) --> properName(Subject).
    verbPhrase(Verb, Object) --> verb(Verb), nounPhrase(Object).

• The general idea is that we can "compose" the sentence meaning by working out the "meaning" of the syntactic constituents and sticking the results together somehow (a runnable version is sketched below).
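Standard Prolog does not actually allow a variable in functor position, so a runnable version of this cheat has to build the term explicitly, for example with the built-in =.. ("univ"). A sketch, with an assumed three-word lexicon:

    % Build Verb(Subject, Object) explicitly with =.. since Prolog
    % does not allow a variable in functor position.
    sentence(Meaning) -->
        nounPhrase(Subject),
        verbPhrase(Verb, Object),
        { Meaning =.. [Verb, Subject, Object] }.

    nounPhrase(Subject) --> properName(Subject).
    verbPhrase(Verb, Object) --> verb(Verb), nounPhrase(Object).

    % Assumed lexicon: each word carries its "meaning".
    properName(john) --> [john].
    properName(mary) --> [mary].
    verb(loves) --> [loves].

    % ?- phrase(sentence(M), [john, loves, mary]).
    % M = loves(john, mary).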

4. Pragmatics

• But we can't get very far without knowing something about the world, and the context in which a sentence is uttered.
• Pragmatics deals with this.
• Example: determining the referents of pronouns etc.
  • "John likes that blue car. He buys it."
  • We need context to determine what is being referred to by "that blue car", "he" and "it" (a rough sketch follows below).
  • Then we can create the meaning: likes(john, car1) and buys(john, car1).
• Pragmatics is also about what people DO with language.
  • Making sense of, and generating, language involves mapping language to goals.
  • "Do you have the time?" -> the speaker wants to know the time.
  • "When is the last train to London?" -> the speaker probably wants to go there.
• We can apply some of our planning ideas to this problem.
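A very rough sketch of how referents might be resolved against a context of recently mentioned entities; the property facts, the resolve/3 predicate and the "most recent match wins" strategy are all my own assumptions for illustration:

    % Known properties of recently mentioned entities (assumed facts).
    property(car1, car).    property(car1, blue).
    property(john, person). property(john, male).

    % Resolve a referring expression against a context list of
    % recently mentioned entities, most recent first.
    resolve(he, Context, Ref) :-
        member(Ref, Context), property(Ref, male), !.
    resolve(it, Context, Ref) :-
        member(Ref, Context), \+ property(Ref, person), !.
    resolve(that_blue_car, Context, Ref) :-
        member(Ref, Context), property(Ref, car), property(Ref, blue), !.

    % ?- resolve(he, [car1, john], Who).
    % Who = john.
    % ?- resolve(it, [car1, john], What).
    % What = car1.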

Pragmatics and Plans

• As an example of a plan-based approach to language, consider the actions of requesting, informing and asking.
• These are referred to as "speech acts".
• We can describe them as planning operators.
• The preconditions and effects refer to the speaker's and hearer's beliefs and desires.
• We use a notation to describe these:
  • knows(Agent, Fact)
  • wants(Agent, State/Action)
  • e.g., wants(fred, kiss(fred, mary)), knows(fred, loves(mary, joe))

More speech acts

• A sketch of inform and request as planning operators (a possible Prolog encoding follows below):

    inform(Speaker, Hearer, Fact)
      pre: knows(Speaker, Fact)
           wants(Speaker, knows(Hearer, Fact))
      add: knows(Hearer, Fact)
           knows(Speaker, knows(Hearer, Fact))

• How does this oversimplify the "informing" action?

    request(Speaker, Hearer, do(Hearer, Action))
      pre: wants(Speaker, Action)
           knows(Speaker, cando(Hearer, Action))
      add: wants(Hearer, Action)

• (Note: a bit tricky to integrate with ordinary planning rules.)
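These operators could be written down as data for a simple STRIPS-style planner. The operator/3 representation (name, precondition list, add list) and the helper predicates are my own assumptions, not from the slides:

    % operator(Action, Preconditions, AddList): assumed STRIPS-like format.
    operator(inform(Speaker, Hearer, Fact),
             [ knows(Speaker, Fact),
               wants(Speaker, knows(Hearer, Fact)) ],
             [ knows(Hearer, Fact),
               knows(Speaker, knows(Hearer, Fact)) ]).

    operator(request(Speaker, Hearer, do(Hearer, Action)),
             [ wants(Speaker, Action),
               knows(Speaker, cando(Hearer, Action)) ],
             [ wants(Hearer, Action) ]).

    % An action is applicable in a state (a list of facts) if all of its
    % preconditions hold; applying it adds its add-list to the state.
    applicable(Action, State) :-
        operator(Action, Pre, _),
        subset(Pre, State).

    apply_action(Action, State, NewState) :-
        operator(Action, _, Add),
        union(Add, State, NewState).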

Putting it all together

• Given sentences like the following, spoken by John about Fred:
  • "What is the time?"
  • "He has missed the train."
• We can now:
  • parse the sentence;
  • map that to a structured representation that is good for inference;
  • use context and knowledge of goals/plans to obtain from that:
    • wants(john, know(john, time1))  (where time1 is the time at some instant)
    • believes(john, missed(fred, train2))

Language Generation

• Language processing is also about the generation of language:
  • structured representation --> NL text.
• The simplest generation method uses templates, mapping the representation straight to a text template with variables/slots to fill in (see the sketch below):
  • loves(X, Y) -> X "loves" Y
  • gives(X, Y, Z) -> X "gives the" Y "to" Z
• Mail-merge tools in word processors work similarly, extracting data from a simple database to fill slots.
• But there is much more to language generation in general; templates are very rigid.
  • Consider "John eats the cheese. John eats the apple. John sneezes. John laughs."
  • Better as "John eats the cheese and apple, then sneezes. He then laughs."
• Getting good style involves working out how to map many facts to one sentence, when to use pronouns, and when to use "connectives" like "then".
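A minimal sketch of template-based generation along these lines (the template/2 representation and generate/2 are my own assumptions):

    % template(Fact, Words): slots in the word list are shared with the fact.
    template(loves(X, Y),    [X, loves, Y]).
    template(gives(X, Y, Z), [X, gives, the, Y, to, Z]).

    % Generate a sentence (as an atom) from a structured fact.
    generate(Fact, Sentence) :-
        template(Fact, Words),
        atomic_list_concat(Words, ' ', Sentence).

    % ?- generate(loves(john, mary), S).
    % S = 'john loves mary'.
    % ?- generate(gives(john, book, mary), S).
    % S = 'john gives the book to mary'.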

• Serious language generation involves deciding:
  • what to say;
  • how to order and structure it;
  • how to break it up into sentences;
  • how to refer to objects (using pronouns, and expressions like "the cat" etc.);
  • how to express things in terms of grammatically correct sentences.
• Often the starting point is a communicative goal.

Summary

• Natural Language Processing covers understanding and generating spoken and written language, from single sentences to large texts.
• The focus here is on understanding sentences. The first step is to parse the sentence to derive its structure.
• We use grammar rules which define the constituency structure of the language.
• The parse gives a tree structure which shows how words are grouped together.
• NL processing includes:
  • syntax
  • semantics
  • pragmatics
• And involves:
  • generating language
  • understanding language
