Transcript and Presenter's Notes

Title: Models of Word Recognition


1
Part 5
  • Models of Word Recognition

2
Writing Systems of the World
  • Not all languages have a writing system
  • Logographic system
  • each word or morpheme has a symbol
  • Syllabic system
  • each distinct syllable has a symbol
  • Alphabetic system
  • each phoneme has a symbol

3
But...
  • No language fits perfectly into one of these
  • All languages deviate somewhat
  • English is usually alphabetic, but what about
    SAVE, SHOOT, THROUGH?
  • Deep vs. shallow orthographies
  • Other useful terms
  • homophony (bear-bare), polysemy (bank-bank)
  • homography (bass-bass)

4
A Simple Model
(Diagram: visual input feeding letter features.)
5
Measuring Lexical Processes
  • Lexical decision tasks
  • deciding whether a letter string is a real word
  • HAVE, MAVE, ASRMP, XXXXX
  • Naming tasks
  • Semantic priming
  • Dyslexia
  • acquired, developmental

6
Frequency Effects
  • More frequent words are easier to process
  • typically measured in terms of word frequency
  • Kučera and Francis (1967) counted words in texts
  • Tested using lexical decision, naming tasks
  • GREAT > BLEAT
  • Small differences, but definitely there

7
Spelling-Sound Regularity
  • How do you pronounce words ending in -ave?
  • RAVE, SAVE, KNAVE, GAVE, SHAVE
  • HAVE ?????
  • English spelling is sometimes inconsistent
  • Lots of words follow the rules
  • regulars
  • Some don't
  • irregulars/exceptions

8
Irregular Spellings
  • Inconsistent with other words with similar
    spellings
  • pint (mint, stint)
  • great, sweat (beat, seat, treat)
  • Strange words
  • colonel, yacht

9
Regularity Effects
  • Regular words are usually faster to name than
    exceptions
  • but only for naming, not lexical decision (L.D.)
  • Frequency by Regularity interaction
  • naming low-frequency exceptions is much harder

10
Why Regularity Matters
  • How do we recognize words?
  • Problem with memorization: we want to be able to
    generalize to novel words
  • NUST
  • NAPSTER
  • Decoding using what we know about spelling-sound
    correspondences
  • Big question
  • how much of reading is decoding?
  • how much is recalling a word from memory?

11
The Dual-Route Model
  • Coltheart (1978): an extension of Forster's model
  • We can access the meaning of a word in several
    ways
  • orthographically
  • phonologically
  • lexical route: whole-word recognition
  • rule-based route: grapheme-phoneme correspondence
    rules (see the sketch below)
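
(A minimal Python sketch of the two routes; the LEXICON and GPC tables
are hypothetical stand-ins, and real grapheme-phoneme rules operate
over multi-letter graphemes rather than single letters as here.)

  LEXICON = {"HAVE": "hav", "PINT": "pInt"}  # whole-word (lexical) route

  GPC = {"H": "h", "A": "a", "V": "v", "E": "e", "P": "p",
         "I": "I", "N": "n", "T": "t", "U": "u", "S": "s"}

  def pronounce(word):
      if word in LEXICON:
          return LEXICON[word]                # known words, incl. exceptions
      return "".join(GPC[ch] for ch in word)  # rule route for novel strings

  print(pronounce("PINT"))  # 'pInt' via the lexicon (exception word)
  print(pronounce("NUST"))  # 'nust' via the rules (nonword)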

12
Dual Route Model
(Diagram: a printed word feeds two routes, a lexical route through how
a word is spelled, the word's meaning, and the word's sound, and a
rule route through spelling-sound correspondence rules; both converge
on a Response.)
13
The Connectionist Approach
  • Does away with rules and lists of words
  • instead uses a brain analogy
  • distributed information
  • All words are encoded as bundles of 3 types of
    information
  • spelling, sound and meaning
  • Lexical processing: completing this pattern given
    some part of it
  • reading: mapping spelling to meaning or sound
  • listening: mapping sound to meaning (sketch below)
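
(A minimal Python sketch of pattern completion over spelling, sound,
and meaning; the stored patterns are hypothetical.)

  patterns = [
      {"spelling": "BEAR", "sound": "bEr", "meaning": "animal"},
      {"spelling": "BARE", "sound": "bEr", "meaning": "uncovered"},
  ]

  def complete(**known):
      # return every stored pattern consistent with the part given
      return [p for p in patterns
              if all(p[k] == v for k, v in known.items())]

  print(complete(spelling="BEAR"))  # reading: spelling -> sound + meaning
  print(complete(sound="bEr"))      # listening: both homophones match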

14
AKA, The Triangle Model
(Diagram: the triangle model, with Orthography, Phonology, and
Semantics at the corners; orthography to semantics is reading for
meaning, orthography to phonology is reading for sound, phonology to
semantics is listening, and semantics to phonology is speaking.)
15
Implementation in SM89
(Diagram: the SM89 implementation of the triangle, showing Semantics,
Orthography, and Phonology.)
16
What Makes this Model Different
  • No real routes, since everything is
    interconnected
  • Suggests we don't ever use a single route for a
    word: division of labour
  • Distributed representations
  • naturally codes similarity
  • allows for spontaneous generalization
  • can generalize NUST by learning MUST, NUTS, NUMB,
    CUSP etc.

17
Training the model
  • 2,884 English monosyllables
  • Orthography and phonology: Wickelfeatures
  • individual units encoded triples of letters
  • make -> _ma, mak, ake, ke_
  • mek -> _me, mek, ek_
  • Input: orthography
  • Output: phonology
  • Presentation was frequency-weighted (see the
    sketch below)
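
(A minimal Python sketch of the letter-triple coding, using '_' as the
word-boundary marker; SM89 further recoded such triples as distributed
Wickelfeatures, which is omitted here.)

  def letter_triples(word):
      # pad with boundary markers, then take each overlapping triple
      padded = "_" + word + "_"
      return [padded[i:i + 3] for i in range(len(padded) - 2)]

  print(letter_triples("make"))  # ['_ma', 'mak', 'ake', 'ke_']
  print(letter_triples("mek"))   # ['_me', 'mek', 'ek_']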

18
Frequency Manipulation
  • The probability of presenting a word during
    training
  • p = K log(freq + 2)
  • K is a constant set by the frequency of THE, the
    most frequent word
  • Created a frequency distribution that was a
    compressed copy of actual English
  • Why was this necessary? (see the sketch below)
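
(A minimal Python sketch of the compression; the word counts are
illustrative, and normalizing K to the frequency of THE is an
assumption.)

  import math

  counts = {"the": 69971, "great": 665, "bleat": 2}  # illustrative
  K = 1 / math.log(counts["the"] + 2)  # assumed: p("the") = 1.0

  for word, f in counts.items():
      print(word, round(K * math.log(f + 2), 2))
  # the 1.0, great 0.58, bleat 0.12: a raw ratio of roughly
  # 35,000:1 is compressed to about 8:1.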

19
Network Performance
(Plots: People and Model performance compared.)
  • Freq x Reg interaction

20
Nonword Performance
  • Generalized well to nonwords
  • e.g., consistent vs. inconsistent neighborhood
    effects
  • nust vs. mave
  • Error rates were in line with what is observed in
    normal readers' RTs

21
The Slot Problem
  • Nonword reading in the SM89 model was weak
  • This is a symptom of a bigger problem with these
    networks
  • the slot problem
  • Consider
  • drop -> d r a p _
  • door -> _ d o r _
  • slid -> s l I d _
  • What has the network learned about d? (see the
    sketch below)
  • Can it read bold?
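
(A minimal Python sketch of position-specific slot coding; each
(phoneme, slot) pair would be a separate input unit.)

  words = {
      "drop": ["d", "r", "a", "p", "_"],
      "door": ["_", "d", "o", "r", "_"],
      "slid": ["s", "l", "I", "d", "_"],
  }

  # What is learned about d in slot 1 (drop) lives in different units
  # than d in slot 2 (door) or slot 4 (slid), so knowledge about /d/
  # does not transfer across positions.
  for word, slots in words.items():
      print(word, "-> d in slot", slots.index("d") + 1)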

22
Plaut et al. (1996)
  • Builds on SM89, and incorporates better
    phonological representations
  • eliminated wickelphones
  • feature-based phonology instead
  • Phonological attractors
  • recurrent connections on the phoneme units allow
    it to build more complex phonological
    representations

23
Plaut et al. model
  • How do you read slint? (sketch below)
  • slip, slap, slay
  • SL_ -> sl_
  • mint, bin
  • _IN -> _in
  • _INT -> _int

(Diagram: Orthography -> 100 hidden units -> Phonology.)
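
(A minimal Python sketch of reading a nonword from correspondences at
several grain sizes; the onset and rime tables are hypothetical.)

  onsets = {"SL": "sl"}               # learned from slip, slap, slay
  rimes = {"INT": "int", "IN": "in"}  # learned from mint, bin

  def read_nonword(word):
      # try every onset/rime split of the monosyllable
      for i in range(1, len(word)):
          if word[:i] in onsets and word[i:] in rimes:
              return onsets[word[:i]] + rimes[word[i:]]

  print(read_nonword("SLINT"))  # 'slint'
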
24
Encoding spelling
  • Componential attractors (Plaut et al.)
  • encoding information at different levels of grain
  • Letter-sound correspondences
  • 1-to-1
  • many-to-one
  • Exceptions
  • PINT
  • Strange words
  • YACHT, SERGEANT