Title: Parsers and Grammars

Slide 1: Parsers and Grammars
Colin Phillips
Slide 2: Outline
- The Standard History of Psycholinguistics
- Parsing and rewrite rules
- Initial optimism
- Disappointment and the DTC
- Emergence of independent psycholinguistics
- Reevaluating relations between competence and performance systems
Slide 3: Standard View
[Diagram: two arithmetic problems, 324 + 697 = ? and 217 x 32 = ?, grouped under "arithmetic"]
Slide 4: Standard View
[Diagram: each of the two arithmetic problems is solved by its own specialized algorithm]
Slide 5: Standard View
[Diagram: as on slide 4, with a question mark pointing to "something deeper" underlying the specialized algorithms for arithmetic]
Slide 6: Standard View
[Diagram: understanding and speaking are each handled by a specialized algorithm; both connect to grammatical knowledge (competence), a recursive characterization of the well-formed expressions of the language]
Slide 7: Standard View
[Diagram: as on slide 6; grammatical knowledge (competence) is labeled "precise, but ill-adapted to real-time operation"]
Slide 8: Standard View
[Diagram: as on slide 6; the specialized algorithms are labeled "well-adapted to real-time operation, but maybe inaccurate"]
Slide 9: Grammatical Knowledge
- How is grammatical knowledge accessed in syntactic computation for (a) grammaticality judgment, (b) understanding, (c) speaking?
- Almost no proposals under the standard view
- This presents a serious obstacle to unification at the level of syntactic computation
Slide 11: Townsend & Bever (2001, ch. 2)
- "Linguists made a firm point of insisting that, at most, a grammar was a model of competence - that is, what the speaker knows. This was contrasted with effects of performance, actual systems of language behaviors such as speaking and understanding. Part of the motive for this distinction was the observation that sentences can be intuitively grammatical while being difficult to understand, and conversely."
Slide 12: Townsend & Bever (2001, ch. 2)
- "Despite this distinction the syntactic model had great appeal as a model of the processes we carry out when we talk and listen. It was tempting to postulate that the theory of what we know is a theory of what we do, thus answering two questions simultaneously.
  1. What do we know when we know a language?
  2. What do we do when we use what we know?"
Slide 13: Townsend & Bever (2001, ch. 2)
- "It was assumed that this knowledge is linked to behavior in such a way that every syntactic operation corresponds to a psychological process. The hypothesis linking language behavior and knowledge was that they are identical."
Slide 14: Miller (1962)
1. Mary hit Mark. (K, kernel)
2. Mary did not hit Mark. (N)
3. Mark was hit by Mary. (P)
4. Did Mary hit Mark? (Q)
5. Mark was not hit by Mary. (NP)
6. Didn't Mary hit Mark? (NQ)
7. Was Mark hit by Mary? (PQ)
8. Wasn't Mark hit by Mary? (PNQ)
Slide 15: Miller (1962)
[Figure: the transformational cube, whose vertices are the eight sentence types K, N, P, Q, NP, NQ, PQ, PNQ]
Slide 16: Townsend & Bever (2001, ch. 2)
- "The initial results were breathtaking. The amount of time it takes to produce a sentence, given another variant of it, is a function of the distance between them on the sentence cube (Miller & McKean, 1964). It is hard to convey how exciting these developments were. It appeared that there was to be a continuing direct connection between linguistic and psychological research. The golden age had arrived."
Slide 17: Townsend & Bever (2001, ch. 2)
- "Alas, it soon became clear that either the linking hypothesis was wrong, or the grammar was wrong, or both."
Slide 18: Townsend & Bever (2001, ch. 2)
- "The moral of this experience is clear. Cognitive science made progress by separating the question of what people understand and say from how they understand and say it. The straightforward attempt to use the grammatical model directly as a processing model failed. The question of what humans know about language is not only distinct from how children learn it, it is distinct from how adults use it."
Slides 19-27: A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
5. V → hit
6. D → the
7. N → ball
[Animation: the tree is built top-down, one rule per slide, ending in [S [NP Bill] [VP [V hit] [NP [D the] [N ball]]]], i.e. "Bill hit the ball"]
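The top-down derivation on slides 19-27 can be replayed mechanically. Here is a minimal Python sketch (my own illustration, not code from the talk; `RULES` and `leftmost_derivation` are hypothetical names, and the NP alternatives slightly simplify the slides' rule 4) that rewrites the leftmost nonterminal at each step:

```python
# Rewrite rules from the slides, keyed by nonterminal; each nonterminal
# may have several alternative right-hand sides.
RULES = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "NP": [["Bill"], ["D", "N"]],   # subject vs. object expansion (simplified)
    "V":  [["hit"]],
    "D":  [["the"]],
    "N":  [["ball"]],
}

def leftmost_derivation(choices):
    """Replay a derivation: at each step, expand the leftmost nonterminal,
    using the given index into its list of alternatives. Returns all steps."""
    line = ["S"]                       # start with the axiom
    steps = [line[:]]
    for choice in choices:
        for i, sym in enumerate(line):
            if sym in RULES:           # found the leftmost nonterminal
                line = line[:i] + RULES[sym][choice] + line[i + 1:]
                steps.append(line[:])
                break
    return steps

# Derive "Bill hit the ball": the subject NP takes alternative 0 ("Bill"),
# the object NP takes alternative 1 ("D N").
steps = leftmost_derivation([0, 0, 0, 0, 1, 0, 0])
# steps[-1] == ['Bill', 'hit', 'the', 'ball']
```

Each entry in `steps` corresponds to one slide of the animation, from ["S"] down to the terminal string.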
Slide 28: Reverse the derivation...
Slides 29-40: A Simple Derivation (reversed)
[Animation: with the same rules 1-7, the tree for "Bill hit the ball" is now assembled bottom-up and left-to-right: "Bill" is scanned and labeled NP, then "hit" is labeled V, then "the" (D) and "ball" (N) are combined into an NP, V and NP into a VP, and finally NP and VP into S]
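"Running the grammar backwards," as on slides 29-40, amounts to bottom-up parsing. A minimal sketch (my own illustration; the talk gives no algorithm) is a naive shift-reduce recognizer that shifts words onto a stack and greedily reduces whenever the top of the stack matches a rule's right-hand side:

```python
# (left-hand side, right-hand side) pairs, including simplified lexical rules.
RULES = [
    ("S",  ("NP", "VP")),
    ("VP", ("V", "NP")),
    ("NP", ("D", "N")),
    ("NP", ("Bill",)),
    ("V",  ("hit",)),
    ("D",  ("the",)),
    ("N",  ("ball",)),
]

def parse(words):
    """Shift each word, then reduce as long as some rule's RHS matches
    the top of the stack. Returns the final stack ( ['S'] on success )."""
    stack = []
    for word in words:
        stack.append(word)                     # shift
        changed = True
        while changed:                         # reduce to a fixed point
            changed = False
            for lhs, rhs in RULES:
                n = len(rhs)
                if tuple(stack[-n:]) == rhs:
                    stack[-n:] = [lhs]         # replace RHS by LHS
                    changed = True
                    break
    return stack

print(parse(["Bill", "hit", "the", "ball"]))   # -> ['S']
```

Greedy reduction happens to work on this tiny grammar; the slides' point stands for realistic grammars, where indeterminacy (which rule to reduce by, and when) and incrementality make reversing the derivation far less straightforward.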
Slide 41: Transformations
- wh-movement
  X - wh-NP - Y
  1    2     3
  --> 2 1 0 3
Slide 42: Transformations
- VP-ellipsis
  X - VP1 - Y - VP2 - Z
  1   2    3   4     5
  --> 1 2 3 0 5
  condition: VP1 = VP2
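The structural-description / structural-change notation on slides 41-42 can be expressed compactly: terms of the factored sentence are numbered, and the change lists the term numbers in their output order, with 0 marking a deleted (empty) position. A small sketch (my own illustration of the notation; the factored example and `apply_change` are hypothetical):

```python
def apply_change(terms, change):
    """terms: the factored sentence (term 1 is terms[0], etc.).
    change: sequence of term numbers in output order; 0 means 'empty'."""
    return [terms[i - 1] if i != 0 else "_" for i in change]

# wh-movement:  X - wh-NP - Y  (1 2 3)  -->  2 1 0 3
factored = ["you think", "what", "Bill hit"]
print(apply_change(factored, [2, 1, 0, 3]))
# -> ['what', 'you think', '_', 'Bill hit']

# VP-ellipsis:  X - VP1 - Y - VP2 - Z  (1 2 3 4 5)  -->  1 2 3 0 5,
# subject to the condition VP1 = VP2.
```

The "_" positions are exactly the nulls that slide 43 identifies as a parsing problem: nothing in the surface string signals where they are.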
Slide 43: Difficulties
- How to build structure incrementally in right-branching structures
- How to recognize the output of transformations that create nulls
Slide 44: Summary
- Running the grammar backwards is not so straightforward: problems of indeterminacy and incrementality
- Disappointment in empirical tests of the Derivational Theory of Complexity
- Unable to account for processing of local ambiguities
Slide 45: Standard View
[Diagram: as on slide 6: understanding and speaking handled by specialized algorithms, drawing on grammatical knowledge (competence), a recursive characterization of the well-formed expressions of the language]
Slide 46: Grammatical Knowledge
- How is grammatical knowledge accessed in syntactic computation for (a) grammaticality judgment, (b) understanding, (c) speaking?
- Almost no proposals under the standard view
- This presents a serious obstacle to unification at the level of syntactic computation
Slide 48: Arguments for Architecture
- 1. Available grammars don't make good parsing devices
- 2. Grammaticality ≠ Parsability
- 3. Failure of DTC
- 4. Evidence for parser-specific structure
- 5. Parsing/production have distinct properties
- 6. Possibility of independent damage to parsing/production
- 7. Competence/performance distinction is necessary, right?
Slide 49: Arguments for Architecture
- 1. Available grammars don't make good parsing devices
- 2. Grammaticality ≠ Parsability
- 3. Failure of DTC
- 4. Evidence for parser-specific structure
- 5. Parsing/production have distinct properties
- 6. Possibility of independent damage to parsing/production
- 7. Competence/performance distinction is necessary, right?
Slide 50: Grammar as Parser - Problems
- Incremental structure building with PS rules (e.g. S → NP VP)
  - delay
  - prediction/guessing
- Indeterminacy (how to recover nulls created by transformations)
Slide 51: Grammar as Parser - Solutions
- Lexicalized grammars make incremental structure-building much easier (available in HPSG, minimalism, LFG, Categorial Grammar, etc.)
  Rules: VP → V PP; PP → P NP
  [Tree: [VP [V sat] [PP [P on] [NP the rug]]]]
Slide 52: Grammar as Parser - Solutions
- Lexicalized grammars make incremental structure-building much easier (available in HPSG, minimalism, LFG, Categorial Grammar, etc.)
  Lexical entries: sit [comp: __ PP], on [comp: __ NP]
  [Tree: [VP [V sat] [PP [P on] [NP the rug]]]]
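Why do lexicalized categories help incrementality? Because each head itself licenses its complement, the parser can attach every incoming word immediately instead of waiting for a whole phrase-structure rule to be confirmed. A toy sketch (my own illustration; the lexicon entries and `attach` are hypothetical, and the talk commits to no particular formalism):

```python
# Toy lexicon: each word projects a phrase and lists the complement
# categories it still needs, in order.
LEXICON = {
    "sat": ("VP", ["PP"]),   # sit selects a PP complement
    "on":  ("PP", ["NP"]),   # on selects an NP complement
    "the": ("NP", ["N"]),    # simplification: "the" projects the NP
    "rug": ("N",  []),
}

def attach(words):
    """Attach each word as it arrives. `pending` holds the open right edge
    of the structure; a snapshot of it is recorded after every word."""
    pending = []                                   # [word, remaining needs]
    trace = []
    for w in words:
        proj, needs = LEXICON[w]
        if pending and pending[-1][1] and pending[-1][1][0] == proj:
            pending[-1][1].pop(0)                  # w begins the awaited complement
        pending.append([w, list(needs)])
        while pending and not pending[-1][1]:
            pending.pop()                          # saturated heads are complete
        trace.append([p[0] for p in pending])
    return trace

print(attach(["sat", "on", "the", "rug"]))
# -> [['sat'], ['sat', 'on'], ['sat', 'on', 'the'], []]
```

Every word is integrated the moment it is heard; nothing waits for a rule like VP → V PP to be verified as a whole, which is the slides' point about lexicalized grammars.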
Slide 53: Grammar as Parser - Solutions
- Problem of seeking nulls in movement structures
Slide 54: Transformations
- wh-movement
  X - wh-NP - Y
  1    2     3
  --> 2 1 0 3
Slide 55: Transformations
- VP-ellipsis
  X - VP1 - Y - VP2 - Z
  1   2    3   4     5
  --> 1 2 3 0 5
  condition: VP1 = VP2
Slide 56: Grammar as Parser - Solutions
- The problem of seeking nulls in movement structures becomes the problem of seeking licensing features for displaced phrases, e.g. for a wh-phrase, seek a Case assigner and a thematic role assigner.
- The requirement to find licensing features is a basic component of all syntactic composition
Slide 57: Incremental Structure Building
- An investigation of the grammatical consequences of incremental, left-to-right structure building
Slides 58-68: Incremental Structure Building
[Animation: nodes A, B, C, D, E are added one at a time, left to right. At each step the newly combined material (e.g. [A B]) forms a constituent, which is destroyed by the addition of new material: adding C destroys the [A B] constituent, adding D destroys the [B C] constituent, and so on]
Slides 69-75: Incremental Structure Building
[Animation: "the cat sat on the rug" is built word by word. "sat on" is a temporary constituent, which is destroyed as soon as the NP "the rug" is added]
Slide 76: Incremental Structure Building
- Conflicting Constituency Tests
- Verb + Preposition sequences can undergo coordination:
  (1) The cat sat on and slept under the rug.
- but cannot undergo pseudogapping (Baltin & Postal, 1996):
  (2) *The cat sat on the rug and the dog did the chair.
Slides 77-79: Incremental Structure Building
[Animation: "the cat sat on and slept under..." is built incrementally; coordination applies early, before the constituent is destroyed]
Slides 80-83: Incremental Structure Building
[Animation: "the cat sat on the rug and the dog did..." is built incrementally; pseudogapping applies too late, after the constituent is destroyed]
Slide 84: Incremental Structure Building
- Constituency Problem: different diagnostics of constituency frequently yield conflicting results
- Incrementality Hypothesis:
  (a) Structures are assembled strictly incrementally
  (b) Syntactic processes see a snapshot of a derivation - they target constituents that are present when the process applies
  (c) Conflicts reflect the simple fact that different processes have different linear properties
- Applied to interactions among binding, movement, ellipsis, prosodic phrasing, clitic placement, islands, etc. (Phillips 1996, in press; Richards 1999, 2000; Guimaraes 1999; etc.)
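The snapshot idea can be made concrete with a toy model (my own illustration; the snapshot sets are hand-listed for this one example, following the incremental derivation of "the cat sat on the rug"): a process can target a string only if that string is a constituent at the moment the process applies.

```python
# Constituents present after each word of "the cat | sat | on | the rug",
# per the incremental derivation: "sat on" exists briefly, then is
# destroyed once "the rug" attaches.
SNAPSHOTS = [
    {"the cat"},
    {"the cat", "sat"},
    {"the cat", "sat on"},                     # temporary constituent
    {"the cat", "sat on the rug", "the rug"},  # "sat on" destroyed
]

def can_target(phrase, step):
    """Can a process applying after word group `step` (0-indexed) target `phrase`?"""
    return phrase in SNAPSHOTS[step]

print(can_target("sat on", 2))   # coordination applies at this point: True
print(can_target("sat on", 3))   # pseudogapping applies at this point: False
```

This reproduces the slide's resolution of the conflicting tests: coordination applies early enough to see "sat on"; pseudogapping applies after it is gone.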
Slide 85: Interim Conclusion
- Grammatical derivations look strikingly like the incremental derivations of a parsing system
- But we want to be explicit about this, so...
Slide 86: Computational Modeling
(Schneider, 1999; Schneider & Phillips, 1999)
Slide 88: Arguments for Architecture
- 1. Available grammars don't make good parsing devices
- 2. Grammaticality ≠ Parsability
- 3. Failure of DTC
- 4. Evidence for parser-specific structure
- 5. Parsing/production have distinct properties
- 6. Possibility of independent damage to parsing/production
- 7. Competence/performance distinction is necessary, right?
Slide 89: Townsend & Bever (2001, ch. 2)
- "Linguists made a firm point of insisting that, at most, a grammar was a model of competence - that is, what the speaker knows. This was contrasted with effects of performance, actual systems of language behaviors such as speaking and understanding. Part of the motive for this distinction was the observation that sentences can be intuitively grammatical while being difficult to understand, and conversely."
Slide 90: Grammaticality ≠ Parsability
- "It is straightforward enough to show that sentence parsing and grammaticality judgments are different. There are sentences which are easy to parse but ungrammatical (e.g. that-trace effects), and there are sentences which are extremely difficult to parse, but which may be judged grammatical given appropriate time for reflection (e.g. multiply center-embedded sentences). This classic argument shows that parsing and grammar are not identical, but it tells us very little about just how much they have in common." (Phillips, 1995)
Slide 91: Grammaticality ≠ Parsability
- Grammatical sentences that are hard to parse
  - The cat the dog the rat bit chased fled
  - John gave the man the dog bit a sandwich
- Ungrammatical sentences that are understandable
  - Who do you think that left?
  - The children is happy
  - The millionaire donated the museum a painting
Slide 92: Grammaticality ≠ Parsability
- Grammatical sentences that are hard to parse
  - The cat the dog the rat bit chased fled
  - John gave the man the dog bit a sandwich
- Difficulty can arise independently of the grammar
  - Resource (memory) limitations
  - Incorrect choices in ambiguity
Slide 93: (Preliminary)
- Incomplete structural dependencies have a cost (that's what yields center-embedding difficulty)
Slide 94: A Contrast (Gibson 1998)
- Relative Clause within a Sentential Complement (RC-in-SC)
  The fact [CP that the employee [RC who the manager hired] stole office supplies] worried the executive.
- Sentential Complement within a Relative Clause (SC-in-RC)
  The executive [RC who the fact [CP that the employee stole office supplies] worried] hired the manager.
- RC-in-SC is easier to process than SC-in-RC
Slides 95-96: A Contrast (Gibson 1998)
- Relative Clause within a Sentential Complement (RC-in-SC)
  [SC that the employee [RC who the manager hired] stole ...]
- Sentential Complement within a Relative Clause (SC-in-RC)
  [RC who the fact [SC that the employee stole office supplies] worried ...]
- RC-in-SC is easier to process than SC-in-RC

Slide 97: A Contrast (Gibson 1998)
- (Same examples as above)
- The contrast is motivated by off-line complexity ratings
Slide 98: Grammaticality ≠ Parsability
- Ungrammatical sentences that are understandable
  - Who do you think that left?
  - The children is happy
  - The millionaire donated the museum a painting
- The system can represent illegal combinations (e.g. categories are appropriate, but feature values are inappropriate)
- The fact that understandable errors are (i) diagnosable and (ii) nearly grammatical should not be overlooked
Slide 99: Grammaticality ≠ Parsability
- Are the parser's operations fully grammatically accurate?
Slide 100: Standard View
[Diagram: as on slide 8; the specialized algorithms for understanding and speaking are labeled "well-adapted to real-time operation, but maybe inaccurate"]
Slide 101: Grammatical Accuracy in Parsing
- The grammar looks rather like a parser
- BUT, does the parser look like a grammar? I.e., are the parser's operations fully grammatically accurate at every step, even in situations where such accuracy appears quite difficult to achieve?
(Phillips & Wong, 2000)
Slides 102-109: Self-Paced Reading
[Animation: a moving-window display in which each word of "We can measure reading time per word." is revealed in turn while all other words are masked by dashes]
(e.g. Phillips & Wong, 2000)
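The moving-window display on slides 102-109 is simple to sketch (my own illustration of the display logic only; real experiments also record the latency between key presses as the per-word reading time, which is omitted here):

```python
def moving_window(sentence):
    """Return one display frame per word: the current word is shown,
    every other word is masked by dashes of the same length."""
    words = sentence.rstrip(".").split()
    frames = []
    for i in range(len(words)):
        frame = [w if j == i else "-" * len(w) for j, w in enumerate(words)]
        frames.append(" ".join(frame) + ".")
    return frames

for f in moving_window("We can measure reading time per word."):
    print(f)
# First frame: "We --- ------- ------- ---- --- ----."
```

Masking word length but not word identity is what lets the method measure word-by-word processing cost without letting the reader look ahead.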
Slides 110-117: Grammatical Accuracy in Parsing - Wh-Questions
[Animation: "Englishmen cook wonderful dinners." The object is replaced by "what" ("Englishmen cook what"), which then moves to the front, leaving a gap: "What do Englishmen cook [gap]?" The fronted wh-phrase is linked to the gap position]
(Phillips & Wong, 2000)
Slides 118-120: Grammatical Accuracy in Parsing - Long-Distance Wh-Questions
[Animation: "Few people think that anybody realizes that Englishmen cook wonderful dinners." The object becomes "what" and moves to the front across two clause boundaries: "What do few people think that anybody realizes that Englishmen cook [gap]?"]
(Phillips & Wong, 2000)
Slides 121-124: Grammatical Accuracy in Parsing - Parasitic Gaps
[Animation: "The plan to remove the equipment ultimately destroyed the building." Labels are added in turn: "the equipment" and "the building" are direct object NPs; "the plan to remove the equipment" is the subject NP, containing an embedded clause inside the main clause]
(Phillips & Wong, 2000)
Slides 125-129: Grammatical Accuracy in Parsing - Parasitic Gaps
[Animation: extraction from the main-clause object is fine: "What did the plan to remove the equipment ultimately destroy [gap]?" Extraction from inside the subject is not: "*What did the plan to remove [gap] ultimately destroy the building?"
Island Constraint: a wh-phrase cannot be moved out of a subject]
(Phillips & Wong, 2000)
Slides 130-133: Grammatical Accuracy in Parsing - Parasitic Gaps
[Animation: with both gaps at once, the sentence improves: "What did the plan to remove [gap] ultimately destroy [gap]?" The gap inside the infinitival subject is the parasitic gap.
Generalization: the good gap rescues the bad gap]
Slide 134: Grammatical Accuracy in Parsing - Parasitic Gaps
[Finite version: "*What did the plan that removed [gap] ultimately destroy [gap]?"
Revised Generalization (informal): only mildly bad gaps can be rescued by good gaps]
Slide 135: Grammaticality Ratings
[Figure: grammaticality ratings from 50 subjects]
Slides 136-137: Grammatical Accuracy in Parsing - A Look-Ahead Problem
[Infinitival case: "What did the plan to remove [gap] ultimately destroy [gap]?"
The good gap rescues the bad gap, BUT the bad gap appears before the good gap: a look-ahead problem.
Question: when the parser reaches the embedded verb ("remove"), does it construct a dependency, even though the gap would be a bad gap?]
Slide 138: Grammatical Accuracy in Parsing - A Look-Ahead Problem
- Infinitive: "What did the plan to remove [gap] ultimately destroy [gap]?" Positing the first gap here is RISKY
- Finite: "*What did the plan that removed [gap] ultimately destroy [gap]?" Positing the first gap here is RECKLESS
Slide 139: Grammatical Accuracy in Parsing - Question
What do speakers do when they get to the verb embedded inside the subject NP?
(i) RISKY: create a gap in infinitival clauses only - violates a constraint, but may be rescued
(ii) RECKLESS: create a gap in all clause types - violates a constraint; cannot be rescued
(iii) CONSERVATIVE: do not create a gap
(Phillips & Wong, 2000)
Slides 140-142: Grammatical Accuracy in Parsing - Materials
a. what ... infinitival verb ...    (infinitive, gap OK)
b. whether ... infinitival verb ... (infinitive, no gap)
c. what ... finite verb ...         (finite, gap not OK)
d. whether ... finite verb ...      (finite, no gap)
[A gap after the embedded verb in (a) is the RISKY position; in (c) it is the RECKLESS position]
(Phillips & Wong, 2000)
Slide 143: Grammatical Accuracy in Parsing - Materials
a. The outspoken environmentalist worked to investigate what the local campaign to preserve the important habitats had actually harmed in the area that the birds once used as a place for resting while flying south. (infinitive, gap)
b. ... whether the local campaign to preserve ... (infinitive, no gap)
c. ... what the local campaign that preserved ... (finite, gap)
d. ... whether the local campaign that preserved ... (finite, no gap)
(Phillips & Wong, 2000)
Slides 144-145: Grammatical Accuracy in Parsing
[Results frames for the infinitival (RISKY) condition, "What did the plan to remove ... ultimately destroy ...", and the finite (RECKLESS) condition, "What did the plan that removed ... ultimately destroy ..."]
(Phillips & Wong, 2000)
Slide 146: Grammatical Accuracy in Parsing - Conclusion
- Structure-building is extremely grammatically accurate, even when the word order of a language is not cooperative
- Constraints on movement are violated in exactly the environments where the grammar allows the violation to be forgiven (this may help to explain discrepancies in past studies)
- Such accuracy is required if grammatical computation is to be understood as real-time, on-line computation
Slide 148: Arguments for Architecture
- 1. Available grammars don't make good parsing devices
- 2. Grammaticality ≠ Parsability
- 3. Failure of DTC
- 4. Evidence for parser-specific structure
- 5. Parsing/production have distinct properties
- 6. Possibility of independent damage to parsing/production
- 7. Competence/performance distinction is necessary, right?
Slide 149: Derivational Theory of Complexity
- "The psychological plausibility of a transformational model of the language user would be strengthened, of course, if it could be shown that our performance on tasks requiring an appreciation of the structure of transformed sentences is some function of the nature, number and complexity of the grammatical transformations involved." (Miller & Chomsky, 1963, p. 481)
Slide 150: Miller (1962)
1. Mary hit Mark. (K, kernel)
2. Mary did not hit Mark. (N)
3. Mark was hit by Mary. (P)
4. Did Mary hit Mark? (Q)
5. Mark was not hit by Mary. (NP)
6. Didn't Mary hit Mark? (NQ)
7. Was Mark hit by Mary? (PQ)
8. Wasn't Mark hit by Mary? (PNQ)
Slide 151: Miller (1962)
[Figure: the transformational cube]
Slide 152: Derivational Theory of Complexity
- Miller & McKean (1964): matching sentences with the same meaning or kernel
  - Joe warned the old woman. (K) / The old woman was warned by Joe. (P): 1.65 s
  - Joe warned the old woman. (K) / Joe didn't warn the old woman. (N): 1.40 s
  - Joe warned the old woman. (K) / The old woman wasn't warned by Joe. (PN): 3.12 s
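The DTC prediction behind the matching task can be stated as distance on the transformational cube: each sentence type is the set of transformations applied to the kernel, and matching cost should grow with how many of N(egation), P(assive), Q(uestion) the two sentences differ in. A small sketch (my own illustration; the link from distance to seconds is a correlation claimed by the data, not computed here):

```python
def cube_distance(a, b):
    """Number of transformations by which two sentence types differ,
    i.e. the symmetric difference of their transformation sets."""
    ta = set(a.replace("K", ""))   # the kernel K applies no transformations
    tb = set(b.replace("K", ""))
    return len(ta ^ tb)

print(cube_distance("K", "P"))    # one step on the cube   (observed: 1.65 s)
print(cube_distance("K", "N"))    # one step on the cube   (observed: 1.40 s)
print(cube_distance("K", "PN"))   # two steps on the cube  (observed: 3.12 s)
```

The one-step pairs pattern together and the two-step pair is slowest, which is exactly the cube-distance effect that made the early results look so compelling.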
Slide 153: McMahon (1963)
a. i. seven precedes thirteen (K, true)
   ii. thirteen precedes seven (K, false)
b. i. thirteen is preceded by seven (P, true)
   ii. seven is preceded by thirteen (P, false)
c. i. thirteen does not precede seven (N, true)
   ii. seven does not precede thirteen (N, false)
d. i. seven is not preceded by thirteen (PN, true)
   ii. thirteen is not preceded by seven (PN, false)
Slide 154: Easy Transformations
- Passive
  - The first shot the tired soldier the mosquito bit fired missed.
  - The first shot fired by the tired soldier bitten by the mosquito missed.
- Heavy NP Shift
  - I gave a complete set of the annotated works of H.H. Munro to Felix.
  - I gave to Felix a complete set of the annotated works of H.H. Munro.
- Full Passives
  - Fido was kissed (by Tom).
- Adjectives
  - The red house / house which is red is on fire.
Slide 155: Failure of DTC?
- Any DTC-like prediction is contingent on a particular theory of grammar, which may be wrong
- It's not surprising that transformations are not the only contributor to perceptual complexity
  - memory demands, which may increase or decrease difficulty
  - ambiguity, where the grammar does not help
  - difficulty of access
Slide 157: Arguments for Architecture
- 1. Available grammars don't make good parsing devices
- 2. Grammaticality ≠ Parsability
- 3. Failure of DTC
- 4. Evidence for parser-specific structure
- 5. Parsing/production have distinct properties
- 6. Possibility of independent damage to parsing/production
- 7. Competence/performance distinction is necessary, right?
Slide 158: Garden Paths & Temporary Ambiguity
- The horse raced past the barn fell.
- Weapons test scores a hit.
- John gave the man the dog bit a sandwich.
- The grammar can account for the existence of global ambiguities (e.g. "Visiting relatives can be boring"), but not local ambiguities, since the grammar does not typically assemble structure incrementally
Slide 159: Garden Paths & Temporary Ambiguity
- Ambiguity originally studied as a test of the solution to the incrementality problem
- Heuristics & Strategies (e.g. Bever, 1970)
  - NP V --> subject verb
  - V NP --> verb object
  - V NP NP --> verb object object
- Garden paths used as evidence for effects of heuristics
Slide 160: Garden Paths & Temporary Ambiguity
- Heuristics & Strategies
  - NP V --> subject verb: The horse raced past the barn fell
  - V NP --> verb object: The student knew the answer was wrong
  - V NP NP --> verb object object: John gave the man the dog bit a sandwich
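Heuristics of this kind can be sketched as pattern rules over category sequences (my own toy coding; the category labels and `first_analysis` are illustrative, not Bever's formulation): the perceiver greedily assigns roles to the first matching prefix, which is exactly what produces garden paths.

```python
# Bever-style strategies, tried longest-first: a prefix of the category
# string is mapped directly onto grammatical roles.
STRATEGIES = [
    (("NP", "V", "NP", "NP"), ("subject", "verb", "object", "object")),
    (("NP", "V", "NP"),       ("subject", "verb", "object")),
    (("NP", "V"),             ("subject", "verb")),
]

def first_analysis(categories):
    """Greedily apply the first strategy whose pattern matches a prefix."""
    for pattern, roles in STRATEGIES:
        n = len(pattern)
        if tuple(categories[:n]) == pattern:
            return list(zip(categories[:n], roles))
    return []

# "The horse raced past the barn (fell)": "raced" is initially taken as
# the main verb, so NP V is analyzed as subject-verb and the garden
# path is set.
print(first_analysis(["NP", "V", "PP"]))
# -> [('NP', 'subject'), ('V', 'verb')]
```

The sketch shows both the appeal and the weakness of heuristics: they are fast and usually right, but they commit before the evidence is in.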
Slide 161: Ambiguity Resolution
- Observation: heuristics miss a generalization about how ambiguities are preferentially resolved
- Kimball (1973): seven principles of surface structure parsing (e.g. Right Association)
- Frazier (1978), Fodor & Frazier (1978): Minimal Attachment, Late Closure
- Various others, much controversy...
Slide 162: Ambiguity Resolution
- Assumptions
  - grammatical parses are accessed (unclear how)
  - the simplest analysis of an ambiguity is chosen (uncontroversial)
  - structural complexity affects simplicity (partly controversial)
  - structural complexity determines simplicity (most controversial)
Slide 163: Ambiguity Resolution
- Relevance to the architecture of language
  - Comprehension-specific heuristics which compensate for the inadequacy of the grammar imply an independent system
  - Comprehension-specific notions of structural complexity are compatible with an independent system
  - If the grammar says nothing about ambiguity, and structural complexity is irrelevant to ambiguity resolution, as some argue, then ambiguity is irrelevant to the question of parser-grammar relations.
Slide 165: Arguments for Architecture
- 1. Available grammars don't make good parsing devices
- 2. Grammaticality ≠ Parsability
- 3. Failure of DTC
- 4. Evidence for parser-specific structure
- 5. Parsing/production have distinct properties
- 6. Possibility of independent damage to parsing/production
- 7. Competence/performance distinction is necessary, right?
Slide 166: Parsing ≠ Production
- Parsing generates meaning from form
- Production generates form from meaning
- Different bottlenecks in the two areas
  - garden paths in comprehension
  - word-category constraint in production errors
  - etc., etc.
- Lexical access: speaking and recognizing words differ, but do we assume that this reflects different systems?
- Contemporary production theories are now incremental structure-building systems, more similar to comprehension models
Slide 168: Arguments for Architecture
- 1. Available grammars don't make good parsing devices
- 2. Grammaticality ≠ Parsability
- 3. Failure of DTC
- 4. Evidence for parser-specific structure
- 5. Parsing/production have distinct properties
- 6. Possibility of independent damage to parsing/production
- 7. Competence/performance distinction is necessary, right?
Slide 169: Competence & Performance
- Different kinds of formal systems: competence systems and performance systems
- The difference between what a system can generate given unbounded resources, and what it can generate given bounded resources
- The difference between a cognitive system and its behavior
Slide 170: Competence & Performance
- (1) It's impossible to deny the distinction between cognitive states and actions, the distinction between knowledge and its deployment.
- (2) How to distinguish ungrammatical-but-comprehensible examples (e.g. "John speaks fluently English") from hard-to-parse examples.
- (3) How to distinguish garden-path sentences (e.g. "The horse raced past the barn fell") from ungrammatical sentences.
- (4) How to distinguish complexity-overload sentences (e.g. "The cat the dog the rat chased saw fled") from ungrammatical sentences.
Slide 171: Competence & Performance
- "It is straightforward enough to show that sentence parsing and grammaticality judgments are different. There are sentences which are easy to parse but ungrammatical (e.g. that-trace effects), and there are sentences which are extremely difficult to parse, but which may be judged grammatical given appropriate time for reflection (e.g. multiply center-embedded sentences). This classic argument shows that parsing and grammar are not identical, but it tells us very little about just how much they have in common." (Phillips, 1995)
- This argument is spurious!
Slide 173: Summary
- The motivation for combining learning theories with theories of adult knowledge is well understood; much more evidence is needed.
- Theories of comprehension and production were long thought to be independent of competence models. In fact, combining these is quite feasible; if so, it becomes possible to investigate linguistic knowledge in real time.