1
Meaning Representations: Computational Semantics
2
Grammar Coverage
  • Coverage is never complete
  • Add more rules
  • All grammars leak
  • More specific rules
  • Add more features

3
General NLP System Architecture
Grammar
User Modeling
Dialogue Management
4
Big Transition
  • First we did words (morphology)
  • Then we looked at syntax
  • Now we're moving on to meaning, where some would
    say we should have started to begin with.
  • Now we look at meaning representations:
    representations that link linguistic forms to
    knowledge of the world.

5
Semantics
  • Syntax
  • how signs are related to each other
  • Semantics
  • how signs are related to things
  • Pragmatics
  • how signs are related to people

Mr. Smith is expressive
6
Compositional Semantics
  • Compositional Semantics
  • The abstract meaning of a sentence
  • (built from the meaning of its parts)
  • Situational Semantics
  • Adds context-dependent information

"Forget about it!"  World knowledge: knowledge
about the world shared between groups of people
7
Meaning
  • Language is useful and amazing because it allows
    us to encode/decode
  • Descriptions of the world
  • What we're thinking
  • What we think about what other people think
  • Don't be fooled by how natural and easy it is. In
    particular, you do not ever
  • Utter word strings that match the world
  • Say what you're thinking
  • Say what you think about what other people think

8
Computational Semantics?
  • Automating the processes of
  • mapping natural language to semantic
    representations
  • using logical representation to draw inferences
  • Patrick Blackburn & Johan Bos (Saarbrücken, 1999)
  • Representation and Inference for Natural
    Language: A First Course in Computational
    Semantics

9
Meaning Representations
  • We're going to take the same basic approach to
    meaning that we took to syntax and morphology
  • We're going to create representations of
    linguistic inputs that capture the meanings of
    those inputs.
  • But unlike parse trees and the like, these
    representations aren't primarily descriptions of
    the structure of the inputs

10
Meaning Representations
  • In most cases, they're simultaneously
    descriptions of the meanings of utterances and of
    some potential state of affairs in some world.

11
Meaning Representations
  • What could this mean?
  • representations of linguistic inputs that capture
    the meanings of those inputs
  • What are some of the linguistic concepts we want
    to capture?
  • Categories, events, time, aspect, BDI
  • How? What is most important? This means lots of
    different things to lots of different
    philosophers.
  • We're not going to go there. For us it means
  • Representations that permit or facilitate
    semantic processing

12
Semantic Processing
  • Ok, so what does that mean?
  • What we take as a meaning representation is a
    representation that serves the core practical
    purposes of a program that is doing semantic
    processing.
  • Representations that
  • Permit us to reason about their truth
    (relationship to some world)
  • Is the blue block on the red block?
  • Permit us to answer questions based on their
    content
  • What is the tallest building in the world?
  • Permit us to perform inference (answer questions
    and determine the truth of things we don't
    actually know)
  • If the blue block is on the red block, and the
    red block is in the room, then the blue block is
    in the room.

13
Linguistic Meaning
  • Translation from linguistic form to some
    language of thought
  • (linguistic form = grammatical / syntactic form)
  • Fodor
  • mental states with propositional content are
    computational
  • the mind computes a conclusion from the
    premises (beliefs, desires, etc.) on the basis of
    their structural characteristics
  • Thus beliefs, etc., must have a representational
    structure

14
Logical Forms should be
  • Disambiguated
  • alternative readings → different logical forms
  • Representing literal meanings
  • (truth conditions)
  • Vehicle for reasoning
  • Basis for generation
  • one logical form → several readings

15
Semantic Processing
  • Touchstone application is always question
    answering
  • Can I answer questions involving the meaning of
    some text or discourse?
  • What kind of representations do I need to
    mechanize that process?

16
Sample Meaning Representations
  • I have a car.
  • First-Order Predicate Calculus
  • Semantic Networks
  • Conceptual Dependency
  • Frame-based representation

17
Common Meaning Representations
  • FOPC
  • Semantic Net: a having node, with a haver link to
    the speaker and a had-thing link to the car

18
  • Conceptual Dependency Diagram:
    Car <--Poss-By-- Speaker
  • Frame:
    Having
      Haver: S
      HadThing: Car
  • All represent the linguistic meaning of "I have a
    car"
    and a state of affairs in some world
  • All consist of structures, composed of symbols
    representing objects and relations among them
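For reference, a typical FOPC rendering of "I have a car" along the lines of the frame above (my reconstruction, not taken verbatim from the slides):

    ∃x ∃y Having(x) ∧ Haver(x, Speaker) ∧ HadThing(x, y) ∧ Car(y)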

19
What requirements must meaning representations
fulfill?
  • Verifiability: The system should allow us to
    compare representations to facts in a Knowledge
    Base (KB)
  • Cat(Huey)
  • Ambiguity: The system should allow us to
    represent meanings unambiguously
  • "German teachers" has 2 representations
  • Vagueness: The system should allow us to
    represent vagueness
  • He lives somewhere in the south of France.
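To make the ambiguity point concrete, the two readings of "German teachers" can be given distinct representations, for instance (illustrative predicates of my own, not from the slides):

    ∃x Teacher(x) ∧ Nationality(x, German)     (teachers who are German)
    ∃x Teacher(x) ∧ Teaches(x, German)         (teachers of the German language)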

20
Initial Simplifying Assumptions
  • Focus on literal meaning
  • Conventional meanings of words
  • Ignore context

21
Canonical Form
  • Inputs that mean the same thing have the same
    representation.
  • Huey eats kibble.
  • Kibble, Huey will eat.
  • What Huey eats is kibble.
  • It's kibble that Huey eats.
  • Alternatives
  • Four different semantic representations
  • Store all possible meaning representations in KB

22
Canonical Form Pros and Cons
  • Advantages
  • Simplifies reasoning tasks
  • Compactness of representations: don't need to
    write inference rules for all different
    paraphrases of the same meaning
  • Disadvantages
  • Complicates task of semantic analysis

23
Inference
  • Draw valid conclusions based on the meaning
    representation of inputs and its store of
    background knowledge.
  • Does Huey eat kibble?
  • thing(kibble)
  • Eat(Huey, x) ∧ thing(x)
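A minimal Python sketch (my own illustration, not from the slides) of answering "Does Huey eat kibble?" from a tiny fact base; it assumes the rule is read as "Huey eats anything that is a thing":

    # Facts in the knowledge base, plus one rule applied on demand.
    facts = {("thing", "kibble")}

    def eats(agent, x, kb):
        """Eat(Huey, x) holds whenever thing(x) is in the KB."""
        return agent == "Huey" and ("thing", x) in kb

    print(eats("Huey", "kibble", facts))   # True -> "Yes, Huey eats kibble."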

24
Expressiveness
  • Must accommodate wide variety of meanings

25
First-Order Languages
  • Non-logical: all symbols in the vocabulary
  • Variables: x, y, z, w, ... (infinitely many)
  • Boolean operators
  • ¬ negation
  • → implication
  • ∨ disjunction
  • ∧ conjunction
  • Quantifiers
  • ∀ universal
  • ∃ existential
  • Parentheses (, ) and the comma ,

26
Beliefs
  • Acquiring a new belief
  • linguistic form → mental representation
  • Aristotle
  • Deduction and inference are based on formal
    relations
  • Circumstantial problem
  • Accessing the language of thought via the
    language of speech
  • Fundamental problem
  • Falls short of explaining what language really
    means
  • (We're just shifting the problem to another
    language.)

27
What is Missing?
  • When we speak or think, we speak or think about
    something.
  • We speak about things in the world.
  • Utterances concerning the actual world may be
    true or false.
  • The truth or falsity of an utterance depends on
  • the meaning of the expression uttered
  • the factual constitution of its subject matter.

28
First-Order Models
  • A model is a pair (D, F)
  • D: the domain
  • the set of entities
  • F: the interpretation function
  • maps symbols in the vocabulary to entities
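A small Python sketch (my own illustration; the domain and symbols are invented) of such a model: D is a set of entities and F maps vocabulary symbols to entities and relations, so atomic sentences can be checked against the model.

    # Model M = (D, F) for a tiny vocabulary.
    D = {"d1", "d2"}                          # domain: the entities
    F = {
        "VINCENT": "d1",                      # constants -> entities
        "MIA": "d2",
        "LOVES": {("d1", "d2")},              # binary predicate -> set of pairs
    }

    def holds(pred, *args):
        """Check an atomic sentence such as LOVES(VINCENT, MIA) in the model."""
        return tuple(F[a] for a in args) in F[pred]

    print(holds("LOVES", "VINCENT", "MIA"))   # True in this model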

29
Model-Theoretic Semantics (Montague)
  • Separate meaning of expressions from factual
    constitutions
  • The subject matter is represented by a model
  • Model: an abstract structure encoding factual
    information pertaining to truth values of
    sentences
  • State for each sentence S
  • in which possible models uttering S yields truth
  • in which possible models uttering S yields falsehood

30
The Meaning of Sentences (Frege)
  • Giving an account of linguistic meaning =
    describing the meanings of complete sentences
  • Explaining the meaning of a sentence S =
    explaining under which conditions S is true
  • Explaining the meanings of other units = describing
    how they contribute to S's meaning

31
Semantic Construction
  • Given a sentence of a language,
  • is there a systematic way of constructing its
    semantic representation?
  • Can we translate a syntactic structure into an
    abstract representation of its actual meaning?
  • (e.g. first-order logic)

32
Compositionality: Frege's Principle
  • Meaning ultimately flows from the lexicon
  • Meanings are combined by syntactic information
  • The meaning of the whole is a function of the
    meaning of its parts
  • (parts = the substructure given by syntax)

33
Syntactic Structure
  • Vincent loves Mia

S: LOVES(VINCENT, MIA)
  NP (Vincent): VINCENT
  VP: LOVES(?, MIA)
    V (loves): LOVES(?, ?)
    NP (Mia): MIA
34
Three Tasks
  • We Need to Specify
  • a syntax for the language fragment
  • semantic representations for the lexical items
  • the translation compositionally
  • (= specify the translation of all expressions in
    terms of the translation of their parts)
  • All in a way that is naturally implemented

35
Task 1 A Context-Free Grammar
  • s --> np, vp.
  • vp --> iv.
  • vp --> tv, np.
  • np --> pname.
  • np --> det, n.

pname --> vincent.    pname --> mia.
n --> robber.         n --> woman.
det --> a.            det --> every.
iv --> snores.        tv --> loves.
Montague: "I fail to see any great interest in
syntax except as a preliminary to semantics."
36
Incomplete / Quasi-Logical Forms
  • To build representations we need to
  • work with incomplete formulas
  • indicate where the information they lack must go

VP
LOVES(?,MIA)
37
Task 2 Semantic Lexicon
  • pname(sem(vincent)) --> vincent.
  • pname(sem(mia)) --> mia.
  • n(sem(X, robber(X))) --> robber.
  • n(sem(X, woman(X))) --> woman.
  • iv(sem(X, snore(X))) --> snores.
  • tv(sem(X, Y, love(X,Y))) --> loves.
  • Associating missing information with an explicit
    variable

38
Quantifiers / Determiners
  • Every robber snores
  • ∀x(ROBBER(x) → SNORE(x))
  • forall(X, robber(X) > snore(X))
  • A robber snores
  • ∃x(ROBBER(x) ∧ SNORE(x))
  • exists(X, robber(X) & snore(X))
  • det(X, N, VP, forall(X, N > VP)) --> every.
  • det(X, N, VP, exists(X, N & VP)) --> a.
  • Noun contribution: restriction
  • VP contribution: nuclear scope

39
Task 3 Production Rules
  • s(sem(N)) --> np(sem(X, VP, N)), vp(sem(X, VP)).
  • vp(sem(X, V)) --> iv(sem(X, V)).
  • vp(sem(X, N)) --> tv(sem(X, Y, V)), np(sem(Y, V, N)).
  • np(sem(Name, X, X)) --> pname(sem(Name)).
  • np(sem(X, VP, Det)) --> det(sem(X, N, VP, Det)),
    n(sem(X, N)).
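Read as Prolog-style DCG rules, these clauses build the sentence meaning by unification. A worked trace for "Vincent snores" (my reconstruction of how the rules compose, not shown on the original slide):

    pname(sem(vincent))            from the lexicon
    np(sem(vincent, VP, VP))       proper-name NP rule: Name = vincent, second and third slots shared
    iv(sem(X, snore(X)))           from the lexicon
    vp(sem(X, snore(X)))           intransitive VP rule
    s(sem(N))                      the S rule unifies np's sem(vincent, VP, N) with vp's sem(X, VP):
                                   X = vincent, VP = snore(vincent), and hence N = snore(vincent)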

40
How did we do?
  • It works!
  • The underlying intuition is pretty clear.
  • Much of the work is done by the rules.
  • Hard to treat the grammar in a modular way.

41
Lambda Calculus (Church)
  • Notational extension of first-order logic
  • Variable binding by an operator λ (lambda)
  • λx.MAN(x)
  • Variables bound by λ are placeholders
  • (for missing information)
  • lambda reduction performs the substitutions

42
Functional Application Lambda Reduction
  • Concatenation indicates functional application
  • (= that we wish to perform a substitution)
  • (λx.MAN(x)) VINCENT
  • λx.MAN(x): functor
  • VINCENT: argument
  • lambda reduction: perform the substitution
  • MAN(VINCENT)
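The same substitution can be mimicked with an ordinary Python lambda (an analogy of my own, not from the slides): applying the functor to the argument plugs VINCENT into the placeholder.

    man = lambda x: f"MAN({x})"    # plays the role of λx.MAN(x)
    print(man("VINCENT"))          # MAN(VINCENT)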

43
Marking more complex kinds of information
  • Representation of "a man"
  • λQ.∃x(MAN(x) ∧ Q)
  • The variable Q indicates that
  • some information is missing
  • where this information has to be plugged in

44
Every robber snores
  • Step 1
  • assign λ-expressions to the syntactic categories
  • robber: λx.ROBBER(x)
  • snores: λx.SNORES(x)
  • every: λN.λVP.∀x(N(x) → VP(x))

45
Every robber snores, cont.
  • Step 2
  • associate the NP with the application that has
    the DET as functor and the NOUN as argument

every robber (NP): (λN.λVP.∀x(N(x) → VP(x))) (λy.ROBBER(y))
every (DET): λN.λVP.∀x(N(x) → VP(x))
robber (N): λy.ROBBER(y)
46
Lambda Reduction
  • Step 3
  • Perform the demanded substitutions

every robber (NP): λVP.∀x((λy.ROBBER(y))(x) → VP(x))
every robber (NP): λVP.∀x(ROBBER(x) → VP(x))
every robber (NP): (λN.λVP.∀x(N(x) → VP(x))) (λy.ROBBER(y))
every (DET): λN.λVP.∀x(N(x) → VP(x))
robber (N): λy.ROBBER(y)
47
Every robber snores, final representation
  • Step 4
  • Add the VP

every robber snores (S): (λVP.∀x(ROBBER(x) → VP(x)))(λz.SNORES(z))
every robber snores (S): ∀x(ROBBER(x) → (λz.SNORES(z))(x))
every robber snores (S): ∀x(ROBBER(x) → SNORES(x))
snores (V): λz.SNORES(z)
every robber (NP): λVP.∀x(ROBBER(x) → VP(x))
every (DET): λN.λVP.∀x(N(x) → VP(x))
robber (N): λy.ROBBER(y)
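A minimal Python sketch (my own illustration) of the same construction, using Python closures for the λ-terms and strings for the resulting formulas:

    def every(noun):                       # every : λN.λVP.∀x(N(x) → VP(x))
        return lambda vp: f"∀x({noun('x')} → {vp('x')})"

    robber = lambda x: f"ROBBER({x})"      # robber : λx.ROBBER(x)
    snores = lambda z: f"SNORES({z})"      # snores : λz.SNORES(z)

    np = every(robber)                     # every robber : λVP.∀x(ROBBER(x) → VP(x))
    print(np(snores))                      # ∀x(ROBBER(x) → SNORES(x))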
48
Transitive Verbs
  • loves: λNP.λz.NP(λx.LOVE(z,x))
  • TV semantic representations take their object
    NP's semantic representation as argument
  • Subject NP semantic representations take the VP
    semantic representation as argument

49
Quantifying Noun Phrases: Every woman loves a man

every woman loves a man (S): (λVP.∀w(WOMAN(w) → VP(w)))(λx.∃m(MAN(m) ∧ LOVE(x,m)))
every woman loves a man (S): ∀w(WOMAN(w) → (λx.∃m(MAN(m) ∧ LOVE(x,m)))(w))
every woman loves a man (S): ∀w(WOMAN(w) → ∃m(MAN(m) ∧ LOVE(w,m)))
every woman (NP): λVP.∀w(WOMAN(w) → VP(w))
loves a man (VP): (λNP.λx.NP(λy.LOVE(x,y))) (λVP.∃m(MAN(m) ∧ VP(m)))
loves a man (VP): λx.(λVP.∃m(MAN(m) ∧ VP(m)))(λy.LOVE(x,y))
loves a man (VP): λx.∃m(MAN(m) ∧ (λy.LOVE(x,y))(m))
loves a man (VP): λx.∃m(MAN(m) ∧ LOVE(x,m))
a man (NP): λVP.∃m(MAN(m) ∧ VP(m))
loves (V): λNP.λx.NP(λy.LOVE(x,y))
50
Scope Ambiguities
  • Every woman loves a man
  • ∀w(WOMAN(w) → ∃m(MAN(m) ∧ LOVE(w,m)))
  • for each woman there is a man that she loves
  • Second reading
  • ∃m(MAN(m) ∧ ∀w(WOMAN(w) → LOVE(w,m)))
  • there is one man who is loved by all women
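Both scopings can be produced with the same toy Python machinery by choosing which quantifier is applied first (a sketch of my own, with explicit variable names to avoid accidental capture):

    def forall(var, noun):                  # every : λN.λVP.∀v(N(v) → VP(v))
        return lambda vp: f"∀{var}({noun(var)} → {vp(var)})"

    def exists(var, noun):                  # a : λN.λVP.∃v(N(v) ∧ VP(v))
        return lambda vp: f"∃{var}({noun(var)} ∧ {vp(var)})"

    woman = lambda v: f"WOMAN({v})"
    man   = lambda v: f"MAN({v})"
    love  = lambda s, o: f"LOVE({s},{o})"

    # Surface reading: "every woman" outscopes "a man".
    print(forall("w", woman)(lambda w: exists("m", man)(lambda m: love(w, m))))
    # Inverse reading: "a man" outscopes "every woman".
    print(exists("m", man)(lambda m: forall("w", woman)(lambda w: love(w, m))))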

51
Construction of Semantic Representations
  • Three basic principles
  • Lexicalization
  • try to keep semantic information lexicalized
  • Compositionality
  • pass information up compositionally from
    terminals
  • Underspecification
  • Don't make a choice unless you have to
  • (the interpretation of ambiguous parts is left
    unresolved)

52
Underspecification
  • A meaning φ of a formalism L is underspecified if
    it represents an ambiguous sentence in a more
    compact manner than by a disjunction of all
    readings
  • L is complete if L's disambiguation device
    produces all possible refinements of any φ
  • Example
  • consider a sentence with 3 quantified NPs
  • (with underspecified scoping relations)
  • L must be able to represent all 2^3! = 64
    refinements (partial and complete disambiguations)
    of the sentence
  • (3 quantified NPs allow 3! = 6 complete scopings,
    and a partial disambiguation corresponds to a
    subset of these, hence 2^6 = 64)

53
Phenomena for Underspecification
  • local ambiguities
  • e.g., lexical ambiguities, anaphoric or deictic
    use of PRO
  • global ambiguities
  • e.g., scopal ambiguities, collective-distributive
    readings
  • ambiguous or incoherent non-semantic information
  • e.g., PP-attachment, number disagreement

54
Predicate-Argument Structure
  • Represents concepts and relationships among them
  • Nouns as concepts or arguments (red(ball))
  • Adjectives, adverbs, verbs as predicates
    (red(ball))
  • Subcategorization (or, argument) frames specify
    number, position, and syntactic category of
    arguments
  • NP likes NP
  • NP likes Inf-VP
  • NP likes NP Inf-VP
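  • For instance (illustrative sentences of my own): "John likes Mary" fits
    NP likes NP; "John likes to swim" fits NP likes Inf-VP; "John likes Mary
    to sing" fits NP likes NP Inf-VP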

55
Fillmore's Theory about Universal Cases
  • Fillmore: there are a small number of semantic
    roles that an NP in a sentence may play with
    respect to the verb.
  • A major task of semantic analysis is to provide
    an appropriate mapping between the syntactic
    constituents of a parsed clause and the semantic
    roles (cases) associated with the verb.

56
Major Cases Include
  • Agent: doer of the action; entails
    intentionality
  • Experiencer: doer when there is no intentionality
  • Theme: thing being acted upon or undergoing
    change
  • Instrument: tool used to do the action
  • Beneficiary: person/thing for whom the event is
    performed
  • To/At/From Loc/Poss/Time: location, possession,
    or time representations

57
Some Sentences and their cases
  • John opened the door with a key.
  • The door was opened by John.
  • The door was opened with a key.
  • A key opened the door.
  • The door opened.
  • John gave Mary the book.
  • John gave the book to Mary.
  • Let's identify the cases in these sentences and
    notice any syntactic regularities in the case
    assignment.

58
Some Sentences and their cases
  • John opened the door with a key.
  • The door was opened by John.
  • The door was opened with a key.
  • A key opened the door.
  • The door opened.
  • John gave Mary the book.
  • John gave the book to Mary.
  • Agent: doer of action; attributes intention

59
Some Sentences and their cases
  • John opened the door with a key.
  • The door was opened by John.
  • The door was opened with a key.
  • A key opened the door.
  • The door opened.
  • John gave Mary the book.
  • John gave the book to Mary.
  • Agent: doer of action; attributes intention
  • Theme: thing being acted upon or undergoing
    change

60
Some Sentences and their cases
  • John opened the door with a key.
  • The door was opened by John.
  • The door was opened with a key.
  • A key opened the door.
  • The door opened.
  • John gave Mary the book.
  • John gave the book to Mary.
  • Agent: doer of action; attributes intention
  • Theme: thing being acted upon or undergoing
    change
  • Instrument: tool used to do the action

61
Some Sentences and their cases
  • John opened the door with a key.
  • The door was opened by John.
  • The door was opened with a key.
  • A key opened the door.
  • The door opened.
  • John gave Mary the book.
  • John gave the book to Mary.
  • Agent: doer of action; attributes intention
  • Theme: thing being acted upon or undergoing
    change
  • Instrument: tool used to do the action
  • To-Poss

62
Some Sentences and their cases
  • John opened the door with a key.
  • The door was opened by John.
  • The door was opened with a key.
  • A key opened the door.
  • The door opened.
  • John gave Mary the book.
  • John gave the book to Mary.
  • Intuition: syntactic choices are largely a
    reflection of underlying semantic relationships.

63
Semantic Analysis
  • A major task of semantic analysis is to provide
    an appropriate mapping between the syntactic
    constituents of a parsed clause and the semantic
    roles associated with the verb.

64
Factors to Complicate
  • Ability of syntactic constituents to indicate
    several different semantic roles
  • E.g., subject position: agent versus instrument
    versus theme
  • John broke the window.
  • The rock broke the window.
  • The window broke.
  • Large number of choices available for the
    syntactic expression of any particular semantic
    role
  • E.g., agent and theme in different configurations
  • John broke the window.
  • It was the window that John broke.
  • The window was broken by John.

65
Factors to Complicate (cont)
  • Prepositional ambiguities: a particular
    preposition does not always introduce the same
    role
  • E.g., the preposition "by" may indicate either
    agent or instrument
  • The door was opened by John.
  • The door was opened by a key.
  • Optionality of a given role in a sentence
  • John opened the door with a key.
  • The door was opened by John.
  • The door was opened with a key.
  • A key opened the door.
  • The door opened.

66
How bad is it?
  • It seems that semantic roles are playing musical
    chairs with the syntactic constituents. That is,
    they seem to sit down in any old syntactic
    constituent and one or more of them seem to be
    left out at times!
  • Actually, it isn't as bad as it may seem!
  • There is a great deal of regularity; consider
    the following set of rules.

67
Some Rules
  • If Agent: it becomes Subject
  • Else if Instrument: it becomes Subject
  • Else if Theme: it becomes Subject
  • Agent: preposition is BY
  • Instrument: preposition is BY if no agent, else
    WITH
  • Some Rules
  • Some verbs may have exceptions
  • No case can appear twice in the same clause
  • Only NPs of the same case can be conjoined
  • Each syntactic constituent can fill only 1 case
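A small Python sketch (my own rendering of the rules above, with invented role names) that picks the subject role and the preposition for each remaining role:

    def choose_subject(roles):
        """Pick the subject role: Agent, else Instrument, else Theme."""
        for role in ("agent", "instrument", "theme"):
            if role in roles:
                return role
        return None

    def preposition(role, roles):
        """Agent is marked BY; Instrument is BY if no agent, else WITH."""
        if role == "agent":
            return "by"
        if role == "instrument":
            return "by" if "agent" not in roles else "with"
        return None

    roles = {"agent": "John", "instrument": "key", "theme": "door"}
    print(choose_subject(roles))             # agent  (so John becomes subject)
    print(preposition("instrument", roles))  # with   ("with a key")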

68
What's missing???
  • If Agent it becomes Subject
  • Else If Instrument it becomes Subject
  • Else If Theme it becomes Subject
  • How do I know whether or not an agent exists?
    How about an instrument?
  • Selectional Restrictions: restrict the types of
    certain roles to be a certain semantic entity
  • Agents must be animate
  • Instruments are not animate
  • Theme? Its type may be dependent on the verb itself.

69
Selectional Restrictions
  • Selectional Restrictions: constraints on the types
    of arguments verbs take
  • George assassinated the senator.
  • The spider assassinated the fly.
  • assassinate: intentional (political?) killing
  • NOTE: dependence on the particular verb being
    used!
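A toy Python check (my illustration; the type labels are invented) of why "The spider assassinated the fly" is odd: the agent of assassinate is restricted to intentional (human) agents.

    RESTRICTIONS = {"assassinate": {"agent": "human"}}   # required type per role
    TYPES = {"George": "human", "spider": "animal"}

    def agent_ok(verb, agent):
        """True if the agent's semantic type satisfies the verb's restriction."""
        return TYPES.get(agent) == RESTRICTIONS[verb]["agent"]

    print(agent_ok("assassinate", "George"))   # True
    print(agent_ok("assassinate", "spider"))   # False -> selectional violation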

70
So? What about Case in General?
  • You may or may not see particular cases used in
    semantic analysis.
  • In the book, they have NOT used the specific
    cases.
  • But, note, the roles they use are derived from
    the general cases identified in Fillmore's work;
    they make them verb-specific.
  • Semantic analysis is going to take advantage of
    the syntactic regularities and selectional
    restrictions to identify the role being played by
    each constituent in a sentence!

71
Representational Schemes
  • Let's go back to the question: what kind of
    semantic representation should we derive for a
    given sentence?
  • We're going to make use of First-Order Predicate
    Calculus (FOPC) as our representational framework
  • Not because we think it's perfect
  • All the alternatives turn out to be either too
    limiting or
  • They turn out to be notational variants
  • Essentially the important parts are the same no
    matter which variant you choose!

72
FOPC
  • Allows for
  • The analysis of truth conditions
  • Allows us to answer yes/no questions
  • Supports the use of variables
  • Allows us to answer questions through the use of
    variable binding
  • Supports inference
  • Allows us to answer questions that go beyond what
    we know explicitly

73
FOPC
  • This choice isn't completely arbitrary or driven
    by the needs of practical applications
  • FOPC reflects the semantics of natural languages
    because it was designed that way by human beings
  • In particular

74
Meaning Structure of Language
  • The semantics of human languages
  • Display a basic predicate-argument structure
  • Make use of variables (e.g., indefinites)
  • Make use of quantifiers (e.g., every, some)
  • Use a partially compositional semantics (sort of)

75
Predicate-Argument Structure
  • Events, actions and relationships can be captured
    with representations that consist of predicates
    and arguments.
  • Languages display a division of labor where some
    words and constituents function as predicates and
    some as arguments.
  • E.g., predicates represent the verb, and the
    arguments (in the right order) represent the
    cases of the verb.

76
Predicate-Argument Structure
  • Predicates
  • Primarily Verbs, VPs, PPs, adjectives, Sentences
  • Sometimes Nouns and NPs
  • Arguments
  • Primarily Nouns, Nominals, NPs
  • But also everything else, as we'll see; it depends
    on the context

77
Example
  • John gave a book to Mary
  • Giving(John, Mary, Book)
  • More precisely
  • Gave conveys a three-argument predicate
  • The first argument is the giver (agent)
  • The second is the recipient (to-poss), which is
    conveyed by the NP in the PP
  • The third argument is the thing given (theme),
    conveyed by the direct object

78
More Examples
  • What about situations with missing/additional cases?
  • John gave Mary a book for Susan.
  • Giving(John, Mary, Book, Susan)
  • John gave Mary a book for Susan on Wednesday.
  • Giving(John, Mary, Book, Susan, Wednesday)
  • John gave Mary a book for Susan on Wednesday in
    class.
  • Giving(John, Mary, Book, Susan, Wednesday,
    InClass)
  • Problem: Remember, each of these predicates would
    be different because of the different number of
    arguments! Except for the suggestive names of the
    predicates and arguments, there is nothing that
    indicates the obvious logical relations among
    them.

79
Meaning Representation Problems
  • Assumes that the predicate representing the
    meaning of a verb has the same number of
    arguments as are present in the verb's syntactic
    subcategorization frame.
  • This makes it hard to
  • Determine the correct number of roles for any
    given event
  • Represent facts about the roles associated with
    the event
  • Ensure that all and only the correct inferences
    can be derived from the representation of an event

80
Better
  • Turns out this representation isn't quite as
    useful as it could be.
  • Giving(John, Mary, Book)
  • Better would be one where the roles or cases
    are separated out; e.g., consider the reified
    form sketched below.
  • Note: essentially Giver = Agent, Given = Theme,
    Givee = To-Poss
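A common way to separate the roles out, in the reified-event style mentioned under "Advantages" below (my sketch, not verbatim from the slides):

    ∃e Giving(e) ∧ Giver(e, John) ∧ Givee(e, Mary) ∧ Given(e, Book)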

81
Predicates
  • The notion of a predicate just got more
    complicated
  • In this example, think of the verb/VP providing a
    template like the following
  • The semantics of the NPs and the PPs in the
    sentence plug into the slots provided in the
    template (we'll worry about how in a bit!)
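A plausible sketch of such a template (an assumption on my part, matching the reified form above), with open slots for the NP and PP meanings to fill:

    Giving(e) ∧ Giver(e, ___) ∧ Givee(e, ___) ∧ Given(e, ___)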

82
Advantages
  • Can have a variable number of arguments associated
    with an event; events have many roles, and fillers
    can be glued on as they appear in the input.
  • Specifies categories (e.g., book) so that we can
    make assertions about categories themselves as
    well as their instances. E.g., Isa(MobyDick,
    Novel), AKO(Novel, Book).
  • Reifies events so that they can be quantified and
    related to other events and objects via sets of
    defined relations.
  • Can see logical connections between closely
    related examples without the need for meaning
    postulates.

83
Additional Material
  • The following are some aspects covered in the
    book that will likely not be covered in lecture!

84
FOPC Syntax
  • Terms: constants, functions, variables
  • Constants: objects in the world, e.g. Huey
  • Functions: concepts, e.g. sisterof(Huey)
  • Variables: x, e.g. sisterof(x)
  • Predicates: symbols that refer to relations that
    hold among objects in some domain or properties
    that hold of some object in a domain
  • likes(Huey, kibble)
  • cat(Huey)

85
  • Logical connectives permit compositionality of
    meaning
  • kibble(x) → likes(Huey, x)
  • cat(Vera) ∧ weird(Vera)
  • sleeping(Huey) ∨ eating(Huey)
  • Sentences in FOPC can be assigned truth values, T
    or F, based on whether the propositions they
    represent are T or F in the world
  • Atomic formulae are T or F based on their
    presence or absence in a DB (Closed World
    Assumption?)
  • Composed meanings are inferred from DB and
    meaning of logical connectives

86
  • cat(Huey)
  • sibling(Huey, Vera)
  • sibling(x,y) ∧ cat(x) → cat(y)
  • cat(Vera)?
  • Limitations
  • Do "and" and "or" in natural language really mean
    ∧ and ∨?
  • Mary got married and had a baby.
  • Your money or your life!
  • She was happy but ignorant.
  • Does → mean "if"?
  • I'll go if you promise to wear a tutu.

87
  • Quantifiers
  • Existential quantification: There is a unicorn in
    my garden. Some unicorn is in my garden.
  • Universal quantification: The unicorn is a
    mythical beast. Unicorns are mythical beasts.
  • Inference
  • Modus ponens
  • rich(Harry)
  • ∀x rich(x) → happy(x)
  • happy(Harry)
  • Production systems
  • Forward and backward chaining

88
Temporal Representations
  • How do we represent time and temporal
    relationships between events?
  • Last year Martha Stewart was happy but soon she
    will be sad.
  • Where do we get temporal information?
  • Verb tense
  • Temporal expressions
  • Sequence of presentation
  • Linear representations (Reichenbach, 1947)

89
  • Utterance time (U): when the utterance occurs
  • Reference time (R): the temporal point-of-view of
    the utterance
  • Event time (E): when events described in the
    utterance occur
  • George had intended to eat a sandwich.
  • -- E -- R -- U -->
  • George is eating a sandwich.
  • ---- E,R,U ---->
  • George had better eat a sandwich soon.
  • -- R,U -- E -->
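A tiny Python sketch (my own encoding, with informal tense labels) of the three orderings above among event time E, reference time R, and utterance time U:

    # Ordering of E (event), R (reference), U (utterance) for the three examples.
    ORDERINGS = {
        "had intended to eat (past perfect)": "E < R < U",
        "is eating (present progressive)":    "E = R = U",
        "had better eat soon (future-like)":  "R = U < E",
    }
    for label, order in ORDERINGS.items():
        print(f"{label:40s}{order}")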

90
Verbs and Event Types Aspect
  • Statives: states or properties of objects at a
    particular point in time
  • Mary needs sleep.
  • *Mary is needing sleep.  *Need sleep!  *Mary
    needs sleep in a week.
  • Activities: events with no clear endpoint
  • Harry drives a Porsche.  *Harry drives a Porsche
    in a week.

91
  • Accomplishments: events with durations and
    endpoints that result in some change of state
  • Marlon filled out the form.  Marlon stopped
    filling out the form (Marlon did not fill out the
    form) vs. Harry stopped driving a Porsche (Harry
    still drove a Porsche for a while)
  • Achievements: events that change state but have
    no particular duration
  • Larry reached the top.  *Larry stopped reaching
    the top.
  • *Larry reached the top for a few minutes.

92
Beliefs, Desires and Intentions
  • How do we represent internal speaker states like
    believing, knowing, wanting, assuming,
    imagining..?
  • Not well modeled by a simple DB lookup approach
  • Truth in the world vs. truth in some possible
    world
  • George imagined that he could dance.
  • George believed that he could dance.
  • Augment FOPC with special modal operators that
    take logical formulae as arguments, e.g. believe,
    know