Title: The Emergent Structure of Semantic Knowledge
1. The Emergent Structure of Semantic Knowledge
- Jay McClelland
- Department of Psychology and Center for Mind, Brain, and Computation, Stanford University
2. The Parallel Distributed Processing Approach to Semantic Cognition
- Representation is a pattern of activation distributed over neurons within and across brain areas.
- Bidirectional propagation of activation underlies the ability to bring these representations to mind from given inputs.
- The knowledge underlying propagation of activation is in the connections.
- Experience affects our knowledge representations through a gradual connection-adjustment process.
3. Distributed Representations and Overlapping Patterns for Related Concepts
[Figure: distributed activation patterns for "dog", "goat", and "hammer".]
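A minimal sketch (not from the talk) of what "overlapping patterns" means computationally: related concepts such as dog and goat share many active units, so their patterns are similar, while the hammer pattern shares few. The three binary patterns below are invented for illustration.

```python
import numpy as np

# Hypothetical distributed patterns over 10 units (invented for illustration):
# the related concepts (dog, goat) share active units; hammer overlaps little.
dog    = np.array([1, 1, 1, 1, 0, 0, 1, 0, 0, 0], dtype=float)
goat   = np.array([1, 1, 1, 0, 1, 0, 1, 0, 0, 0], dtype=float)
hammer = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1], dtype=float)

def overlap(a, b):
    """Cosine similarity between two activation patterns."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(f"dog-goat:   {overlap(dog, goat):.2f}")    # high overlap
print(f"dog-hammer: {overlap(dog, hammer):.2f}")  # low overlap
```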
4. Kiani et al., J Neurophysiol 97: 4296-4309, 2007.
5. Emergence of Meaning and Metaphor
- Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks.
- Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.
6. Emergence of Meaning: Differentiation, Reorganization, and Context-Sensitivity
8. The Rumelhart Model
9. The Training Data
All propositions true of items at the bottom level of the tree, e.g., "Robin can grow, move, fly."
10. Target output for the "robin can" input
11. Forward Propagation of Activation
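A minimal NumPy sketch of forward propagation in a Rumelhart-style network: a one-hot item drives a learned representation layer, which joins a one-hot relation input at a hidden layer that drives the attribute outputs. The layer sizes and random weights here are placeholders, not the model's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n_items, n_relations = 8, 4                # e.g., robin, canary, ... x ISA/IS/CAN/HAS
n_rep, n_hidden, n_attributes = 8, 15, 36  # sizes are placeholders

# Weight matrices (randomly initialized here; learned in the model)
W_item_rep   = rng.normal(0, 0.1, (n_rep, n_items))
W_rep_hidden = rng.normal(0, 0.1, (n_hidden, n_rep))
W_rel_hidden = rng.normal(0, 0.1, (n_hidden, n_relations))
W_hidden_out = rng.normal(0, 0.1, (n_attributes, n_hidden))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(item, relation):
    """Propagate activation from a one-hot item/relation pair to the attributes."""
    rep    = sigmoid(W_item_rep @ item)                    # item representation
    hidden = sigmoid(W_rep_hidden @ rep + W_rel_hidden @ relation)
    return sigmoid(W_hidden_out @ hidden)                  # attribute activations

robin = np.eye(n_items)[0]        # one-hot "robin"
can   = np.eye(n_relations)[2]    # one-hot "can"
attributes = forward(robin, can)  # compared against targets like grow, move, fly
```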
12. Back Propagation of Error ($\delta$)

[Figure: three-layer network with activations $a_j$ (input), $a_i$ (hidden), and $a_k$ (output), connected by weights $w_{ij}$ and $w_{ki}$.]

Error signals:
- At the output layer: $\delta_k = (t_k - a_k)$
- At the prior layer: $\delta_i = \sum_k \delta_k w_{ki}$

Error-correcting learning:
- At the output layer: $\Delta w_{ki} = \epsilon \delta_k a_i$
- At the prior layer: $\Delta w_{ij} = \epsilon \delta_i a_j$
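The slide's learning rule written out as a NumPy sketch for a single input-hidden-output step. The toy sizes and random values are placeholders, and the units are kept linear so that the slide's simplified deltas are exact; with sigmoid units each delta would also be multiplied by the activation derivative.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1                                  # learning rate (epsilon on the slide)

n_in, n_hid, n_out = 4, 3, 2               # toy layer sizes (placeholders)
W_ij = rng.normal(0, 0.1, (n_hid, n_in))   # input j -> hidden i
W_ki = rng.normal(0, 0.1, (n_out, n_hid))  # hidden i -> output k

a_j = rng.random(n_in)                     # input activations
t_k = rng.random(n_out)                    # target output

# Forward propagation (linear units, so the slide's deltas are exact)
a_i = W_ij @ a_j
a_k = W_ki @ a_i

# Back propagation of error
delta_k = t_k - a_k                        # delta_k = (t_k - a_k)
delta_i = W_ki.T @ delta_k                 # delta_i = sum_k delta_k * w_ki

# Error-correcting weight updates
W_ki += eps * np.outer(delta_k, a_i)       # dW_ki = eps * delta_k * a_i
W_ij += eps * np.outer(delta_i, a_j)       # dW_ij = eps * delta_i * a_j
```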
15. [Figure: learned representations at three points in experience: Early, Later, Later Still.]
17. What Drives Progressive Differentiation?
- Waves of differentiation reflect coherent covariation of properties across items.
- Patterns of coherent covariation are reflected in the principal components of the property covariance matrix (a sketch of this analysis follows the list).
- The figure shows attribute loadings on the first three principal components:
  1. Plants vs. animals
  2. Birds vs. fish
  3. Trees vs. flowers
- Same color: features covary in the component; different color: anti-covarying features.
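A sketch of the analysis described above, assuming a binary item-by-property matrix; the toy data here is invented. Coherently covarying properties load together on the leading principal components of the property covariance matrix.

```python
import numpy as np

# Toy item-by-property matrix (rows: items, cols: properties), invented for illustration.
P = np.array([
    #  grow move fly swim roots petals
    [1, 1, 1, 0, 0, 0],   # robin
    [1, 1, 1, 0, 0, 0],   # canary
    [1, 1, 0, 1, 0, 0],   # salmon
    [1, 1, 0, 1, 0, 0],   # sunfish
    [1, 0, 0, 0, 1, 0],   # oak
    [1, 0, 0, 0, 1, 1],   # rose
], dtype=float)

cov = np.cov(P, rowvar=False)             # property covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)    # symmetric matrix, so use eigh
order = np.argsort(eigvals)[::-1]         # sort components by variance explained

# Attribute loadings on the first three principal components
loadings = eigvecs[:, order[:3]]
print(loadings.round(2))
```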
18. Sensitivity to Coherence Requires Convergence
19. Conceptual Reorganization (Carey, 1985)
- Carey demonstrated that young children discover the unity of plants and animals as living things with many shared properties only around the age of 10.
- She suggested that the coalescence of the concept of "living thing" depends on learning about diverse aspects of plants and animals, including:
  - The nature of life-sustaining processes
  - What it means to be dead vs. alive
  - Reproductive properties
- Can reorganization occur in a connectionist net?
20. Conceptual Reorganization in the Model
- Suppose superficial appearance information, which is not coherent with much else, is always available,
- and there is a pattern of coherent covariation across information that is contingently available in different contexts.
- The model forms initial representations based on superficial appearances.
- Later, it discovers the shared structure that cuts across the different contexts, reorganizing its representations.
22. Organization of Conceptual Knowledge Early and Late in Development
24. Overall Structure Extracted by a Structured Statistical Model
26. Sensitivity to Context
[Figure: a context-general representation compared with a context-sensitive representation.]
27. Relation-Specific Representations
- IS representations (top) reflect idiosyncratic appearance properties.
- HAS representations are similar to the context-general representations (middle).
- CAN representations collapse differences between plants, since there is little that plants can do.
- The fish are all the same, because there's no difference in what they can do.
28. Ongoing Work
- Can the representations learned in the distributed connectionist model capture different patterns of generalization for different kinds of properties?
- Simulations already show context-specific patterns of property generalization.
- We are currently collecting detailed data from a new data set to explore the sufficiency of the model to explain experimental data on context-specific patterns of generalization.
29. Generalization of Different Property Types
- At different points in training, the network is taught one of:
  - "Maple can queem"
  - "Maple is queem"
  - "Maple has queem"
- Only the weights from the hidden layer to the output are allowed to change (see the sketch below).
- The network is then tested to see how strongly "queem" is activated when the same relation is paired with other items.

[Figure: the network with the novel output unit "queem".]
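A sketch of the test procedure just described, under assumptions carried over from the earlier forward-propagation sketch: a new output unit for "queem" is attached to the hidden layer, only its incoming weights are trained, and the rest of the network stays frozen. The hidden-layer patterns here are random placeholders standing in for those of a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_hidden, eps = 15, 0.1                      # placeholder size / learning rate

# Hypothetical hidden-layer patterns from a trained network (invented here),
# one per (item, relation) pair, e.g. ("maple", "can").
hidden = {("maple", "can"): rng.random(n_hidden),
          ("pine",  "can"): rng.random(n_hidden),
          ("robin", "can"): rng.random(n_hidden)}

w_queem = np.zeros(n_hidden)                 # new hidden->output weights for "queem"

# Teach "Maple can queem": only the queem output weights change;
# every other weight in the network stays frozen.
for _ in range(200):
    h = hidden[("maple", "can")]
    a = sigmoid(w_queem @ h)
    delta = (1.0 - a) * a * (1 - a)          # target = 1, logistic delta
    w_queem += eps * delta * h

# Test: how strongly does "queem" activate when "can" is paired with other items?
for item in ("maple", "pine", "robin"):
    a = sigmoid(w_queem @ hidden[(item, "can")])
    print(f"{item} can queem: {a:.2f}")
```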
30. Generalization to other concepts after training with "can", "has", or "is" queem
31. Ongoing Work
- Can the representations learned in the distributed connectionist model capture different patterns of generalization for different kinds of properties?
- Our simulations already show context-specific patterns of property generalization.
- We are currently conducting new experiments to gather experimental data on context-specific patterns of generalization, which we will use to test an extended version of the model trained with a much larger training set.
32. Metaphor in Connectionist Models of Semantics
- By "metaphor" I mean the application of a relation learned in one domain to a novel situation in another.
33. Hinton's Family Tree Network
34. English Tree Recovered; Italian Tree Recovered
35. Understanding via Metaphor in the Family Trees Network
"Marco's father is Pierro. Who is James's father?"
36. Future Work: Metaphors We Live By
- In Hinton's model, neither domain is the base; each influences the other equally.
- But research suggests that some domains serve as a base that influences other domains:
  - Lakoff: physical structure as a base for the structure of an intellectual argument
  - Boroditsky: space as a base for time
- In connectionist networks, primacy and frequency both influence performance.
- This allows the models to simulate how early and pervasive experience may allow one domain to serve as the base for others experienced later or less frequently.
- Influences can still run in both directions, but to different extents.
37. Emergence of Meaning and Metaphor
- Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks.
- Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.