Title: CPSC 503 Computational Linguistics
1. CPSC 503 Computational Linguistics
- Semantic Analysis
- Lecture 16
- Giuseppe Carenini
2. Semantic Analysis
[Diagram: syntax-driven semantic analysis combines the meanings of grammatical structures with the meanings of words to produce the literal meaning of a sentence; further analysis, i.e., inference over common-sense/domain knowledge, discourse structure, and context, yields the intended meaning]
3. Today 17/3
- Compositional Analysis
- Integrate semantics and parsing
- Non-compositionality
- Semantic Grammars
- Information Extraction
4. Meaning Structure of Language
- How does language convey meaning?
- Grammaticization
- Tense systems
- Conjunctions
- Quantifiers
- Indefinites (variables)
- Display a partially compositional semantics
- Display a basic predicate-argument structure
5. Compositional Analysis
- Principle of Compositionality: the meaning of a whole is derived from the meanings of the parts
- What parts? The constituents of the syntactic parse of the input
- What could it mean for a part to have a meaning?
6. Compositional Analysis Example
7. Augmented Rules
- Augment each syntactic CFG rule with a semantic formation rule
- i.e., the semantics of A can be computed from some function applied to the semantics of A's parts.
- The class of actions performed by f will be quite restricted.
8. Simple Extension of FOL: Lambda Forms
- Lambda-reduction: variables are bound by treating the lambda form as a function with formal arguments
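Lambda-reduction can be mimicked directly with Python's own lambdas; a minimal sketch (the formula strings are illustrative), treating λx.Serves(x, MEAT) as a function with one formal argument:

```python
# Sketch: lambda-reduction as ordinary function application.
# λx.Serves(x, MEAT) becomes a Python function of one formal argument x.
vp_sem = lambda x: f"Serves({x}, MEAT)"

# Applying the lambda form to a constant binds the variable (lambda-reduction)
print(vp_sem("AC"))  # Serves(AC, MEAT)
```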
9. Augmented Rules Example: assigning constants
- Attachments
- AyCaramba
- MEAT
- PropNoun -> AyCaramba
- MassNoun -> meat
10. Augmented Rules Example
- Semantics attached to one daughter is applied to the semantics of the other daughter(s).
- S -> NP VP   {VP.sem(NP.sem)}
- VP -> Verb NP   {Verb.sem(NP.sem)}
- lambda-form
11. Example
[Parse tree for "AyCaramba serves meat", each constituent annotated with its semantics: AC, MEAT]
- S -> NP VP   {VP.sem(NP.sem)}
- VP -> Verb NP   {Verb.sem(NP.sem)}
- Verb -> serves
- NP -> PropNoun   {PropNoun.sem}
- NP -> MassNoun   {MassNoun.sem}
- PropNoun -> AyCaramba   {AC}
- MassNoun -> meat   {MEAT}
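The derivation for "AyCaramba serves meat" can be sketched in Python, with each semantic attachment as a function; the curried lambda form λy.λx.Serves(x, y) for serves is assumed:

```python
# Lexical attachments
propnoun_sem = "AC"    # PropNoun -> AyCaramba {AC}
massnoun_sem = "MEAT"  # MassNoun -> meat {MEAT}
# serves: curried lambda form λy.λx.Serves(x, y)
verb_sem = lambda obj: lambda subj: f"Serves({subj}, {obj})"

# NP -> PropNoun / NP -> MassNoun: semantics passed up unchanged
np_subj_sem = propnoun_sem
np_obj_sem = massnoun_sem

# VP -> Verb NP : VP.sem = Verb.sem(NP.sem)
vp_sem = verb_sem(np_obj_sem)

# S -> NP VP : S.sem = VP.sem(NP.sem)
s_sem = vp_sem(np_subj_sem)
print(s_sem)  # Serves(AC, MEAT)
```

Each rule either passes a daughter's semantics up unchanged or applies one daughter's semantics to another's, exactly the two classes of operations discussed later in the deck.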
12. Problem: Quantified Phrases
- Consider: A restaurant serves meat.
- Assume that the semantics for "a restaurant" is:
- If we proceed as we did in the previous example, the semantics for S would be ?!?
13. Solution: Complex Terms
- Complex-Term → <Quantifier, var, body>
- Examples:
14. Convert Complex Terms Back to FOL
- P(<quantifier, var, body>)  =>  Quantifier var body Connective P(var)
- Example:
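A sketch of the rewrite, where the connective is ∧ for ∃ and ⇒ for ∀ (the helper name and string representation are invented for illustration):

```python
def convert_complex_term(pred, quantifier, var, body):
    """Rewrite P(<quantifier, var, body>) as: quantifier var (body connective P(var))."""
    # Existentials conjoin the body; universals make it an antecedent
    connective = "∧" if quantifier == "∃" else "⇒"
    return f"{quantifier}{var}.({body} {connective} {pred(var)})"

# "A restaurant serves meat": P is λy.Serves(y, MEAT),
# and the complex term is <∃, x, Restaurant(x)>
serves_meat = lambda y: f"Serves({y}, MEAT)"
print(convert_complex_term(serves_meat, "∃", "x", "Restaurant(x)"))
# ∃x.(Restaurant(x) ∧ Serves(x, MEAT))
```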
15. Problem: Quantifier Scope Ambiguity
- Consider: Every restaurant has a menu
16. Solution: Quantifier Scope Ambiguity
- Similarly to PP attachment, the number of possible interpretations is exponential in the number of complex terms
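For the two complex terms in "Every restaurant has a menu", the readings correspond to the orderings in which the quantifiers are pulled out; with n complex terms there are n! orderings, hence the blow-up. A sketch (predicate names illustrative):

```python
from itertools import permutations

# The two complex terms: <∀, x, Restaurant(x)> and <∃, y, Menu(y)>
terms = [("∀", "x", "Restaurant(x)"), ("∃", "y", "Menu(y)")]

def scoping(order, core="Has(x, y)"):
    """Wrap the core predication with the quantifiers, innermost first."""
    formula = core
    for q, var, body in reversed(order):
        conn = "∧" if q == "∃" else "⇒"
        formula = f"{q}{var}.({body} {conn} {formula})"
    return formula

readings = [scoping(list(order)) for order in permutations(terms)]
for r in readings:
    print(r)
# ∀x.(Restaurant(x) ⇒ ∃y.(Menu(y) ∧ Has(x, y)))   each restaurant has its own menu
# ∃y.(Menu(y) ∧ ∀x.(Restaurant(x) ⇒ Has(x, y)))   one menu shared by all restaurants
```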
17. Attachments for a Fragment of English
- Sentences
- Noun phrases
- Verb phrases
- Prepositional phrases
- Based on The Core Language Engine (1992)
18. Integration with a Parser
- Assume you're using a dynamic-programming style parser (Earley or CYK).
- Two basic approaches:
- Integrate semantic analysis into the parser (assign meaning representations as constituents are completed)
- Pipeline: assign meaning representations only to complete trees, after parsing finishes
19. Pros and Cons
- Integration
- Pro: can use semantic constraints to cut off parses that make no sense
- Con: assigns meaning representations to constituents that don't take part in the correct (most probable) parse
20. Non-Compositionality
- Unfortunately, there are lots of examples where the meaning of a constituent can't be derived from the meanings of the parts:
- metaphor (corporation as person)
- metonymy (??)
- idioms
- irony
- sarcasm
- indirect requests, etc.
21. English Idioms
- Lots of these constructions, where the meaning of the whole is either:
- Totally unrelated to the meanings of the parts (kick the bucket)
- Related in some opaque way (run the show)
- buy the farm
- bite the bullet
- bury the hatchet
- etc.
22. The Tip of the Iceberg
- Enron is the tip of the iceberg.
- NP -> "the tip of the iceberg"?
- the tip of an old iceberg
- the tip of a 1000-page iceberg
- the merest tip of the iceberg
- NP -> TipNP of IcebergNP, where TipNP is an NP with tip as its head and IcebergNP is an NP with iceberg as its head
23. Handling Idioms
- Mixing lexical items and grammatical constituents
- Introduction of idiom-specific constituents
- Permit semantic attachments that introduce predicates unrelated to the constituents
- NP -> TipNP of IcebergNP   {small-part(), beginning()}, where TipNP is an NP with tip as its head and IcebergNP is an NP with iceberg as its head
24. Knowledge-Formalisms Map (including probabilistic formalisms)
[Diagram mapping formalisms to levels of analysis]
- State machines (and prob. versions) (Finite State Automata, Finite State Transducers, Markov Models): Morphology; IE
- Rule systems (and prob. versions) (e.g., (Prob.) Context-Free Grammars): Syntax; SG
- Logical formalisms (First-Order Logics): Semantics
- AI planners: Pragmatics (Discourse and Dialogue)
25. Semantic Grammars
- Def: CFGs in which rules and constituents correspond directly to semantic entities and relations
26. Semantic Grammars
- Limitations:
- Almost complete lack of reuse
- Tend to grow in size (missing syntactic generalizations)
- Typically used in conversational agents in constrained domains:
- Limited vocabulary
- Limited grammatical complexity
27. Information Extraction (IE)
- Scanning newspapers, newswires for a fixed set of events of interest. E.g., ??
- Scanning websites for products, prices, reviews, etc.
- Arbitrarily complex (long) sentences
- Extended discourse
- Multiple writers
28. Back to Finite State Methods
- Apply a series of cascaded transducers to an input text
- At each stage, specific elements of syntax/semantics are extracted for use in the next level (e.g., complex phrases, semantic patterns)
- The end result is a set of relations suitable for entry into a database
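The cascade idea can be sketched with two regex "transducer" stages; both patterns are invented toy approximations, not FASTUS's actual rules:

```python
import re

text = "Bridgestone Sports Co. said it has set up a joint venture with a local concern."

# Stage 1: complex-phrase recognition (a crude company-name pattern)
stage1 = re.sub(r"([A-Z][a-z]+(?: [A-Z][a-z]+)* Co\.)",
                r"<COMPANY>\1</COMPANY>", text)

# Stage 2: a semantic pattern over the stage-1 tape; everything else is ignored
m = re.search(r"<COMPANY>(.+?)</COMPANY>.*set up a joint venture", stage1)
relation = {"relation": "JOINT-VENTURE", "entity": m.group(1)} if m else None
print(relation)  # {'relation': 'JOINT-VENTURE', 'entity': 'Bridgestone Sports Co.'}
```

Material that matches no pattern is simply not written to the next tape, which is how the cascade ignores everything outside the target relations.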
29. Complex Phrases and Semantic Patterns
- Bridgestone Sports Co. said Friday it has set up a joint venture in Taiwan with a local concern and a Japanese trading house to produce golf clubs to be shipped to Japan.
- The joint venture, Bridgestone Sports Taiwan Co., capitalized at 20 million new Taiwan dollars, will start production in January 1990 with production of 20,000 iron and metal wood clubs a...
30. FASTUS Output
31. Named Entity Recognition
- Labeling all the occurrences of named entities in a text: people, organizations, lakes, bridges, hospitals, mountains, etc.
- This can be done quite robustly, and it looks like one of the most useful tasks across a variety of applications
32. Next Time
- Lexical Semantics
- Read Chapter 16
33. Meaning Structure of Language
- The semantics of human languages
- Display a basic predicate-argument structure
- Make use of variables
- Make use of quantifiers
- Use a partially compositional semantics
34. IE Key Points
- What about the stuff we don't care about?
- Ignore it. It's not written to the next tape, so it just disappears from further processing
- It works because of the constrained nature of the problem:
- Only looking for a small set of items that can appear in a small set of roles
35. Cascades
36. Key Point
- It works because of the constrained nature of the problem:
- Only looking for a small set of items that can appear in a small set of roles
37. Next Time
- More robust approaches to semantic analysis
- Semantic grammars
- Information extraction
- Probabilistic labeling
- More on less-than-compositional constructions
- Word meanings
- So read Chapter 16
38. Predicate-Argument Semantics
- The functions/operations permitted in the semantic rules fall into two classes:
- Pass the semantics of a daughter up unchanged to the mother
- Apply (as a function) the semantics of one of the daughters of a node to the semantics of the other daughters
39. Predicate-Argument Semantics
- S -> NP VP   {VP.sem(NP.sem)}
- VP -> Verb NP   {Verb.sem(NP.sem)}
- Is it really necessary to specify these attachments?
- No: in each rule there's a daughter whose semantics is a function and one whose semantics isn't. What else is there to do?
40. Harder Example
- What makes this hard?
- What role does Harry play in all this?
41. Harder Example
- The VP rule for told is: VP -> V NP VPto
- So you do what?
- Apply the semantic function attached to VPto to the semantics of the NP; this binds Harry as the goer of the going.
- Then apply the semantics of the V to the semantics of the NP; this binds Harry as the Tellee of the Telling
- And to the result of the first application, to get the right value of the told thing.
- V.Sem(NP.Sem, VPto.Sem(NP.Sem))
42. Harder Example
- That's a little messy, and violates the notion that the grammar ought not to know much about what is going on in the semantics
- Better might be:
- V.sem(NP.Sem, VPto.Sem)
- i.e., apply the semantics of the head verb to the semantics of its arguments.
- Complicate the semantics of the verb inside VPto to figure out what's going on.
43. Two Philosophies
- Let the syntax do what syntax does well, and don't expect it to know much about meaning
- In this approach, the lexical entry's semantic attachments do the work
- Assume the syntax does know about meaning
- Here the grammar gets complicated and the lexicon simpler
44. Example
- Consider the attachments for the VPs:
- VP -> Verb NP NP (gave Mary a book)
- VP -> Verb NP PP (gave a book to Mary)
- Assume the meaning representations should be the same for both. Under the lexicon-heavy scheme the attachments are:
- VP.Sem(NP.Sem, NP.Sem)
- VP.Sem(NP.Sem, PP.Sem)
45. Example
- Under the syntax-heavy scheme we might want to do something like:
- VP -> V NP NP   {V.sem, Recip(NP1.sem), Object(NP2.sem)}
- VP -> V NP PP   {V.Sem, Recip(PP.Sem), Object(NP1.sem)}
- i.e., the verb only contributes the predicate; the grammar knows the roles.
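The contrast between the two schemes can be sketched for "gave Mary a book"; the Giving predicate and the string encoding of role assignments are illustrative, not the textbook's notation:

```python
# Lexicon-heavy: the verb's own lambda form assigns the roles
gave_sem = lambda recip, obj: f"Giving ∧ Recip({recip}) ∧ Object({obj})"
lexicon_heavy = gave_sem("Mary", "Book")   # via VP.Sem(NP.Sem, NP.Sem)

# Syntax-heavy: the verb contributes only the predicate; the rule adds roles
def vp_ditransitive(v_pred, np1_sem, np2_sem):
    return f"{v_pred} ∧ Recip({np1_sem}) ∧ Object({np2_sem})"
syntax_heavy = vp_ditransitive("Giving", "Mary", "Book")

print(lexicon_heavy == syntax_heavy)  # True: same representation either way
```

Either way the final meaning representation is identical; what differs is whether the role knowledge lives in the lexicon or in the grammar rules.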
46. Constructional Approach
- So we'll allow both:
- VP -> V NP   {V.sem(NP.sem)}
- and
- VP -> Kick-Verb the bucket   {λx Die(x)}
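A sketch of letting both rule types coexist: the compositional VP applies the verb's semantics to its object, while the idiom-specific VP ignores the parts entirely (predicate names follow the slides):

```python
# Compositional VP -> V NP : V.sem(NP.sem), with serves as λy.λx.Serves(x, y)
serves_sem = lambda obj: lambda subj: f"Serves({subj}, {obj})"
vp_compositional = serves_sem("MEAT")

# Idiom-specific VP -> Kick-Verb "the bucket" : λx.Die(x),
# introducing a predicate unrelated to the constituents' meanings
vp_idiom = lambda subj: f"Die({subj})"

print(vp_compositional("AC"))  # Serves(AC, MEAT)
print(vp_idiom("Harry"))       # Die(Harry)
```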
47. Semantic Grammars
- One problem with traditional grammars is that they don't necessarily reflect the semantics in a straightforward way
- You can deal with this by:
- Fighting with the grammar
- Complex lambdas and complex terms, etc.
- Rewriting the grammar to reflect the semantics
- And in the process giving up on some syntactic niceties
48. BERP Example
49. BERP Example
- How about a rule like the following?
- Request -> I want to go to eat FoodType Time
- some attachment
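Such a rule can be sketched as a pattern whose attachment builds the request frame directly from the matched constituents; the FoodType and Time vocabularies here are invented for illustration, not BERP's actual lexicon:

```python
import re

FOODTYPE = r"Chinese|Italian|Mexican"
TIME = r"tonight|tomorrow|at noon"

# Request -> I want to (go to) eat FoodType Time
request = re.compile(rf"I want to (?:go to )?eat (?P<food>{FOODTYPE}) (?P<time>{TIME})$")

m = request.match("I want to go to eat Italian tonight")
if m:
    # The "attachment": fill the frame from the rule's semantic constituents
    frame = f"Request(food={m.group('food')}, time={m.group('time')})"
    print(frame)  # Request(food=Italian, time=tonight)
```

Because the constituents are semantic categories, no separate semantic analysis step is needed, which is exactly the appeal (and the domain-specificity) of semantic grammars.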
50. Semantic Grammar
- The term semantic grammar refers to the motivation for the grammar rules
- The technology (plain CFG rules with a set of terminals) is the same as we've been using
- The good thing about them is that you get exactly the semantic rules you need
- The bad thing is that you need to develop a new grammar for each new domain
51. Semantic Grammars
- Typically used in conversational agents in constrained domains:
- Limited vocabulary
- Limited grammatical complexity
- Chart parsing (Earley) can often produce all that's needed for semantic interpretation, even in the face of ungrammatical input.