Title: CPSC 503 Computational Linguistics
1. CPSC 503 Computational Linguistics - Lecture 10
- Giuseppe Carenini
2. Knowledge-Formalisms Map (including probabilistic formalisms)
- State machines (and prob. versions) (Finite State Automata, Finite State Transducers, Markov Models): Morphology, Syntax
- Rule systems (and prob. versions) (e.g., (Prob.) Context-Free Grammars): Syntax
- Logical formalisms (First-Order Logics): Semantics
- AI planners: Pragmatics, Discourse and Dialogue
3. Next three classes
- What meaning is and how to represent it
- How to map sentences into their meaning
- Meaning of individual words (lexical semantics)
- Computational Lexical Semantics Tasks
- Word sense disambiguation
- Word Similarity
- Semantic Labeling
4. Today 16/10
- Semantics / Meaning / Meaning Representations
- Linguistically relevant concepts in FOPC / POL
- Semantic Analysis
5. Semantics
- Def. Semantics: the study of the meaning of words, intermediate constituents, and sentences
- Def 1. Meaning: a representation that expresses the linguistic input in terms of objects, actions, events, time, space, beliefs, attitudes... relationships
- Def 2. Meaning: a representation that links the linguistic input to knowledge of the world
- Language independent!
6. Semantic Relations involving Sentences
- Paraphrase: same truth conditions, same meaning
  - I gave the apple to John vs. I gave John the apple
  - I bought a car from you vs. You sold a car to me
  - The thief was chased by the police vs. The police chased the thief
- Entailment: implication
  - The park rangers killed the bear vs. The bear is dead
  - Nemo is a fish vs. Nemo is an animal
- Contradiction
  - I am in Vancouver vs. I am in India
7. Meaning Structure of Language
- How does language convey meaning?
  - Grammaticization
  - Words
  - Display a partially compositional semantics
  - Display a basic predicate-argument structure (e.g., verb complements)
8. Grammaticization
Concept -> Affix
- Past -> -ed
- More than one -> -s
- Again -> re-
- Negation -> in-, un-
9. Common Meaning Representations
Example: I have a car
- FOL
- Semantic Nets
- Conceptual Dependency
- Frames
10. Requirements for Meaning Representations
- Sample NLP task: giving advice about restaurants
  - Accept queries in NL
  - Generate appropriate responses by consulting a KB
- e.g.,
  - Does Maharani serve vegetarian food? -> Yes
  - What restaurants are close to the ocean? -> C and Monks
11. Verifiability (in the world?)
- Example: Does LeDog serve vegetarian food?
- Knowledge base (KB) expressing our world model (in a formal language)
- Convert the question to the KB language and verify its truth value against the KB content
- Answer: Yes / No / I do not know
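The verification step above can be sketched as a membership test against a set of fact tuples; all predicate and constant names here are illustrative, not from any real KB system:

```python
# A minimal sketch of verifiability: world knowledge is a set of fact
# tuples, and a yes/no question is answered by converting it to a fact
# and checking it against the KB. Names like Serve, Maharani, LeDog,
# and VegetarianFood are illustrative constants.

KB = {
    ("Serve", "Maharani", "VegetarianFood"),
    ("Near", "CandMonks", "Ocean"),
}

def verify(predicate, *args):
    """Return True if the fact is in the KB, else None ('I do not know').
    Under an open-world assumption, absence does not license a firm 'No'."""
    return True if (predicate, *args) in KB else None

answer1 = verify("Serve", "Maharani", "VegetarianFood")
answer2 = verify("Serve", "LeDog", "VegetarianFood")
print(answer1, answer2)  # True None
```

Whether `None` is reported as "No" or "I do not know" is the closed- vs. open-world choice mentioned above.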
12. Canonical Form
- Paraphrases should be mapped into the same representation.
  - Does LeDog have vegetarian dishes?
  - Do they have vegetarian food at LeDog?
  - Are vegetarian dishes served at LeDog?
  - Does LeDog serve vegetarian fare?
13. How to Produce a Canonical Form
- Words have different senses
  - food, dish, fare: one overlapping meaning sense
- Meanings of alternative syntactic constructions are systematically related
  - [S [NP Maharani] serves [NP vegetarian dishes]] (server, thing-being-served)
  - [S [NP vegetarian dishes] are served at [NP Maharani]] (thing-being-served, server)
14. Inference and Expressiveness
- Consider a more complex request
  - Can vegetarians eat at Maharani?
  - vs. Does Maharani serve vegetarian food?
  - Why do these result in the same answer?
- Inference: the system's ability to draw valid conclusions based on the meaning representations of inputs and its KB
  - Serve(Maharani, VegetarianFood) => CanEat(Vegetarians, At(Maharani))
- Expressiveness: the system must be able to handle a wide range of subject matter
15. Non Yes/No Questions
- Example: I'd like to find a restaurant where I can get vegetarian food.
- Indefinite reference <-> variable
  - Serve(x, VegetarianFood)
- Matching succeeds only if variable x can be replaced by a known object in the KB.
- What restaurants are close to the ocean? -> C and Monks
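The variable-matching idea can be sketched as pattern matching over KB facts; the `?x` convention for variables and all fact names are illustrative assumptions:

```python
# A minimal sketch of answering a wh-question by matching a query
# pattern with a variable against KB facts. Tokens starting with "?"
# are variables; predicate and constant names are illustrative.

KB = {
    ("Serve", "Maharani", "VegetarianFood"),
    ("Serve", "Maharani", "Curry"),
    ("Near", "CandMonks", "Ocean"),
}

def match(pattern):
    """Yield one binding dict per KB fact the pattern unifies with."""
    for fact in KB:
        if len(fact) != len(pattern):
            continue
        bindings = {}
        for p, f in zip(pattern, fact):
            if p.startswith("?"):
                if bindings.setdefault(p, f) != f:
                    break  # variable already bound to something else
            elif p != f:
                break      # constants disagree
        else:
            yield bindings

# "What restaurants serve vegetarian food?" -> Serve(?x, VegetarianFood)
answers = sorted(b["?x"] for b in match(("Serve", "?x", "VegetarianFood")))
print(answers)  # ['Maharani']
```

Matching fails (no bindings are yielded) when no known object can replace the variable, mirroring the slide's point.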
16. Meaning Structure of Language
- How does language convey meaning?
  - Grammaticization
  - Words
  - Display a partially compositional semantics
  - Display a basic predicate-argument structure (e.g., verb complements)
17. Predicate-Argument Structure
- Represents relationships among concepts
- Some words act like arguments and some words act like predicates
  - Nouns as concepts or arguments: red(ball)
  - Adj, Adv, Verbs as predicates: red(ball)
- Subcategorization frames specify the number, position, and syntactic category of arguments
  - Examples: give NP2 NP1, find NP, sneeze
18. Semantic (Thematic) Roles
- This can be extended to the realm of semantics
- Semantic roles: participants in an event
  - Agent: George hit Bill. Bill was hit by George.
  - Theme: George hit Bill. Bill was hit by George.
  - Also: Source, Goal, Instrument, Force
- Verb subcategorization allows linking arguments in surface structure with their semantic roles
  - Mary gave/sent/read a book to Ming (Agent, Theme, Goal)
  - Mary gave/sent/read Ming a book (Agent, Goal, Theme)
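The linking of surface arguments to thematic roles can be sketched as a lookup from subcategorization frames to role templates; the frame encoding and dict layout are illustrative assumptions:

```python
# A minimal sketch of linking surface positions to thematic roles via
# per-frame role templates, for the give/send/read examples above.
# The tuple encoding of frames is an illustrative assumption.

ROLE_FRAMES = {
    # "Mary gave a book to Ming": subject, object, to-PP object
    ("NP", "V", "NP", "to-NP"): ("Agent", "Theme", "Goal"),
    # "Mary gave Ming a book": double-object construction
    ("NP", "V", "NP", "NP"): ("Agent", "Goal", "Theme"),
}

def assign_roles(frame, args):
    """Pair the surface arguments with the roles the frame licenses."""
    return dict(zip(ROLE_FRAMES[frame], args))

r1 = assign_roles(("NP", "V", "NP", "to-NP"), ["Mary", "a book", "Ming"])
r2 = assign_roles(("NP", "V", "NP", "NP"), ["Mary", "Ming", "a book"])
print(r1)  # {'Agent': 'Mary', 'Theme': 'a book', 'Goal': 'Ming'}
print(r2)  # {'Agent': 'Mary', 'Goal': 'Ming', 'Theme': 'a book'}
```

Both surface orders end up with the same role assignment for Ming and the book, which is the point of the slide.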
19. Non-verbal Predicate-Argument Structures
- A Spanish restaurant under the bridge
  - Under(SpanishRestaurant, Bridge)
20. First Order Predicate Calculus (FOPC)
- FOPC provides a sound computational basis for verifiability, inference, and expressiveness
  - Supports determination of truth
  - Supports canonical form
  - Supports compositionality of meaning
  - Supports question answering (via variables)
  - Supports inference
  - Argument-predicate structure
21. Today 16/10
- Semantics / Meaning / Meaning Representations
- Linguistically relevant concepts in FOPC / POL
- Semantic Analysis
22. Linguistically Relevant Concepts in FOPC
- Categories & Events (Reification)
- Representing time
- Beliefs (optional, read if relevant to your project)
- Aspects (optional, read if relevant to your project)
- Description Logics (optional, read if relevant to your project)
23. Categories & Events
- Categories
  - VegetarianRestaurant(Joes) - relation vs. object
  - MostPopular(Joes, VegetarianRestaurant) requires reification:
    - ISA(Joes, VegetarianRestaurant)
    - AKO(VegetarianRestaurant, Restaurant)
- Events, e.g., making a reservation
  - Reservation(Speaker, Joes, Today, 8PM, 2)
- Problems
  - Determining the correct number of roles
  - Representing facts about the roles associated with an event
  - Ensuring that all and only the correct inferences can be drawn
24MUC-4 Example
INCIDENT DATE 30 OCT 89 INCIDENT
LOCATION EL SALVADOR INCIDENT TYPE ATTACK
INCIDENT STAGE OF EXECUTION ACCOMPLISHED
INCIDENT INSTRUMENT ID INCIDENT INSTRUMENT
TYPEPERP INCIDENT CATEGORY TERRORIST ACT
PERP INDIVIDUAL ID "TERRORIST" PERP
ORGANIZATION ID "THE FMLN" PERP ORG.
CONFIDENCE REPORTED "THE FMLN" PHYS TGT ID
PHYS TGT TYPEPHYS TGT NUMBERPHYS TGT
FOREIGN NATIONPHYS TGT EFFECT OF INCIDENTPHYS
TGT TOTAL NUMBERHUM TGT NAMEHUM TGT
DESCRIPTION "1 CIVILIAN"HUM TGT TYPE
CIVILIAN "1 CIVILIAN"HUM TGT NUMBER 1 "1
CIVILIAN"HUM TGT FOREIGN NATIONHUM TGT EFFECT
OF INCIDENT DEATH "1 CIVILIAN"HUM TGT TOTAL
NUMBER
25. Subcategorization Frames
- I ate
- I ate a turkey sandwich
- I ate a turkey sandwich at my desk
- I ate at my desk
- I ate lunch
- I ate a turkey sandwich for lunch
- I ate a turkey sandwich for lunch at my desk
no fixed arity!
26. Reification Again
- "I ate a turkey sandwich for lunch" ->
  ∃w Isa(w, Eating) ∧ Eater(w, Speaker) ∧ Eaten(w, TurkeySandwich) ∧ MealEaten(w, Lunch)
- Reification advantages
  - No need to specify a fixed number of arguments for a given surface predicate
  - No more roles are postulated than are mentioned in the input
  - Logical connections among related examples are specified
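A reified event can be sketched as an object that carries only the roles the input mentions; the role names follow the slide, while the dict encoding is an illustrative assumption:

```python
# A minimal sketch of reification: an event is an object (here a dict
# keyed by role name), so new roles can be attached without changing
# any predicate's arity. Role names follow the slide's example.

def eating_event(**roles):
    """Create a reified Eating event with whatever roles are known."""
    return {"Isa": "Eating", **roles}

# "I ate a turkey sandwich" -- only two roles are postulated
e1 = eating_event(Eater="Speaker", Eaten="TurkeySandwich")
# "I ate a turkey sandwich for lunch at my desk" -- more roles, same predicate
e2 = eating_event(Eater="Speaker", Eaten="TurkeySandwich",
                  MealEaten="Lunch", Location="Desk")

print(sorted(e1))  # ['Eaten', 'Eater', 'Isa']
print(len(e2))     # 5
```

Each variant of "I ate ..." from the subcategorization slide maps onto the same event type, differing only in which roles are filled.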
27. Representing Time
- Events are associated with points or intervals in time.
- We can impose an ordering on distinct events using the notion of precedes.
- Temporal logic notation: ∃w,x,t Arrive(w, x, t)
- Constraints on variable t: "I arrived in New York" ->
  ∃t Arrive(I, NewYork, t) ∧ precedes(t, Now)
28. Interval Events
- Need t_start and t_end
- "She was driving to New York until now"
- ∃t_start, t_end, e, i
  ISA(e, Drive) ∧ Driver(e, She)
  ∧ Dest(e, NewYork) ∧ IntervalOf(e, i)
  ∧ Endpoint(i, t_end) ∧ Startpoint(i, t_start)
  ∧ Precedes(t_start, Now)
  ∧ Equals(t_end, Now)
29. Relation Between Tenses and Time
- The relation between simple verb tenses and points in time is not straightforward
  - Present tense used like future: We fly from Baltimore to Boston at 10
- Complex tenses
  - Flight 1902 arrived late
  - Flight 1902 had arrived late
30. Reference Point
- Reichenbach (1947) introduced the notion of Reference point (R), separated out from Utterance time (U) and Event time (E)
- Example
  - When Mary's flight departed, I ate lunch
  - When Mary's flight departed, I had eaten lunch
- The departure event specifies the reference point.
31. Today 16/10
- Semantics / Meaning / Meaning Representations
- Linguistically relevant concepts in FOPC / POL
- Semantic Analysis
32. Semantic Analysis
- Syntax-driven semantic analysis: sentence + meanings of grammatical structures + meanings of words -> literal meaning
- Further analysis (INFERENCE): literal meaning + common-sense and domain knowledge + discourse structure + context -> intended meaning
33. Compositional Analysis
- Principle of Compositionality: the meaning of a whole is derived from the meanings of its parts
- What parts? The constituents of the syntactic parse of the input
- What could it mean for a part to have a meaning?
34. Compositional Analysis Example
35. Augmented Rules
- Augment each syntactic CFG rule with a semantic attachment: A -> a1 ... an { f(a1.sem, ..., an.sem) }
- i.e., the semantics of A can be computed from some function f applied to the semantics of its parts.
- The class of actions performed by f will be quite restricted.
36. Simple Extension of FOL: Lambda Forms
- Lambda-reduction: variables are bound by treating the lambda form as a function with formal arguments
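Lambda-reduction can be sketched with Python lambdas standing in for FOL lambda forms; the Serves predicate and the AC/MEAT constants anticipate the serving example on the later slides:

```python
# A minimal sketch of lambda forms and lambda-reduction, using Python
# lambdas as a stand-in for FOL lambda expressions. The string output
# is just a readable rendering of the resulting FOL formula.

# The meaning of "serves": λx. λy. Serves(y, x)
serves = lambda x: lambda y: f"Serves({y},{x})"

# Lambda-reduction: applying the form binds one variable at a time
vp = serves("MEAT")   # λy. Serves(y, MEAT)  -- the meaning of "serves meat"
s = vp("AC")          # Serves(AC, MEAT)     -- the meaning of the full sentence
print(s)  # Serves(AC,MEAT)
```

Each application replaces a formal argument with an actual one, exactly as lambda-reduction does in the logic.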
37. Augmented Rules: Example
- Assigning constants
- Attachments
  - PropNoun -> AyCaramba { AyCaramba }
  - MassNoun -> meat { MEAT }
38. Augmented Rules: Example
- Semantics attached to one daughter is applied to the semantics of the other daughter(s).
  - S -> NP VP { VP.sem(NP.sem) }
  - VP -> Verb NP { Verb.sem(NP.sem) }  (Verb.sem is a lambda-form)
39. Example
"AyCaramba serves meat" -> Serves(AC, MEAT)
- S -> NP VP { VP.sem(NP.sem) }
- VP -> Verb NP { Verb.sem(NP.sem) }
- Verb -> serves
- NP -> PropNoun { PropNoun.sem }
- NP -> MassNoun { MassNoun.sem }
- PropNoun -> AyCaramba { AC }
- MassNoun -> meat { MEAT }
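The derivation above can be sketched as a recursive walk over the parse tree, applying each rule's semantic attachment to its daughters' semantics; the tuple encoding of the tree is an illustrative assumption:

```python
# A minimal sketch of syntax-driven semantic analysis for
# "AyCaramba serves meat": compute each constituent's semantics
# bottom-up using the attachments from the slide.

def sem(node):
    label, *children = node
    if label == "S":          # S -> NP VP : VP.sem(NP.sem)
        np, vp = children
        return sem(vp)(sem(np))
    if label == "VP":         # VP -> Verb NP : Verb.sem(NP.sem)
        verb, np = children
        return sem(verb)(sem(np))
    if label == "Verb":       # serves : λx. λy. Serves(y, x)
        return lambda x: lambda y: f"Serves({y},{x})"
    if label == "NP":         # NP -> PropNoun | MassNoun : daughter.sem
        return sem(children[0])
    if label == "PropNoun":   # AyCaramba : AC
        return "AC"
    if label == "MassNoun":   # meat : MEAT
        return "MEAT"

tree = ("S", ("NP", ("PropNoun",)),
             ("VP", ("Verb",), ("NP", ("MassNoun",))))
result = sem(tree)
print(result)  # Serves(AC,MEAT)
```

The meaning representation falls out of the parse structure alone, with no rule knowing anything beyond its own daughters.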
40. Full Story: More Complex
- To deal properly with quantifiers:
  - Permit lambda-variables to range over predicates
  - Introduce complex terms to remain agnostic about final scoping
41. Solution: Quantifier Scope Ambiguity
- As with PP attachment, the number of possible interpretations is exponential in the number of complex terms
42. Attachments for a Fragment of English (Sect. 18.5)
- Sentences
- Noun phrases
- Verb phrases
- Prepositional phrases
Based on "The Core Language Engine" (1992)
43. Integration with a Parser
- Assume you're using a dynamic-programming style parser (Earley or CYK).
- Two basic approaches
  - Integration: do semantic analysis inside the parser (assign meaning representations as constituents are completed)
  - Pipeline: assign meaning representations only to complete trees, after parsing is done
44. Pros and Cons
- Integration
  - Pro: can use semantic constraints to cut off parses that make no sense
  - Con: may assign meaning representations to constituents that don't take part in any correct parse
45. Next Time
- Read Chp. 19 (Lexical Semantics)
46. Non-Compositionality
- Unfortunately, there are lots of examples where the meaning of a constituent can't be derived from the meanings of the parts:
  - metaphor (e.g., corporation as person)
  - metonymy
  - idioms
  - irony
  - sarcasm
  - indirect requests, etc.
47. English Idioms
- Lots of constructions where the meaning of the whole is either
  - Totally unrelated to the meanings of the parts (kick the bucket)
  - Related in some opaque way (run the show)
- buy the farm
- bite the bullet
- bury the hatchet
- etc.
48. The Tip of the Iceberg
- Enron is the tip of the iceberg.
- NP -> the tip of the iceberg
- But the construction varies:
  - the tip of an old iceberg
  - the tip of a 1000-page iceberg
  - the merest tip of the iceberg
- NP -> TipNP of IcebergNP
  - TipNP: an NP with "tip" as its head
  - IcebergNP: an NP with "iceberg" as its head
49. Handling Idioms
- Mixing lexical items and grammatical constituents
- Introduction of idiom-specific constituents
- Permit semantic attachments that introduce predicates unrelated to the constituents
- NP -> TipNP of IcebergNP { small-part(), beginning() }
  - TipNP: an NP with "tip" as its head
  - IcebergNP: an NP with "iceberg" as its head