Title: Speech Production, Comprehension, Reading, Writing
1. The Language system
- Speech production
- Comprehension
- Reading
- Writing
2. Contributors
- Speech production: Jenny Crinion, Mairead MacSweeney, Liz Warburton
- Auditory speech: Guillaume Thierry, Tim Griffiths, Anne-Lise Giraud
- Object recognition: Jacquie Phillips, Uta Noppeney, Glyn Humphreys
- Reading: Andrea Mechelli, Joe Devlin
Wellcome Department of Imaging Neuroscience, University College London, UK
3. The Neural systems
- Auditory speech comprehension (Thierry et al.)
- Speech production (Warburton et al.)
- Reading (Mechelli et al.)
Most of the brain!
4. My Question
Do language-specific functions emerge from language-specific brain regions? Or from language-specific connections?
5. Language-specific brain regions?
Neuropsychological and functional imaging studies, e.g. Broca's area for speech production, Wernicke's area for speech comprehension.
Language-specific functional connectivity?
Each language area is involved in many tasks (verbal and nonverbal); language function depends on which set of regions is activated and on how these regions interact with one another.
Both?
6. Structure of talk
- Auditory speech comprehension areas
- Speech production areas
- Reading areas
- Conclusions from functional imaging studies
- Consistency with neuropsychological studies
7. Auditory Speech Comprehension Areas
- Experiment 1: Semantic decisions on (A) speech and (B) environmental sounds
- Experiment 2: Listen to (A) speech and (B) reversed speech
8. Experiment 1 (Thierry, Giraud & Price, 2003)
Semantic decisions on sequences of heard words and environmental sounds (button-press responses).
Speech and sounds are matched for task, response and semantic content, but not for phonological input / lexical access.
9. Experiment 1 (Thierry, Giraud & Price, 2003)
Semantic decisions on words and environmental sounds.
No difference between words and sounds in reaction times, F(1,11) = 0.003, p = 0.954, or in accuracy, F(1,11) = 0.071, p = 0.795.
Speech and sounds are matched for task, response and semantic content, but not for phonological input / lexical access.
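A minimal sketch of how such a two-condition, within-subject comparison could be checked (the data values below are hypothetical, not the published ones); with only two conditions, the reported F(1,11) is simply the square of a paired t statistic:

```python
# Hypothetical per-subject mean reaction times (s) for 12 participants;
# the real values from Thierry, Giraud & Price (2003) are not reproduced here.
import numpy as np
from scipy import stats

rt_words  = np.array([1.02, 0.98, 1.10, 0.95, 1.05, 1.00, 0.99, 1.08, 1.01, 0.97, 1.03, 1.00])
rt_sounds = np.array([1.01, 0.99, 1.09, 0.96, 1.06, 0.99, 1.00, 1.07, 1.02, 0.98, 1.02, 1.01])

# Paired t-test across subjects; for two conditions, F(1, 11) = t(11)**2.
t, p = stats.ttest_rel(rt_words, rt_sounds)
print(f"t(11) = {t:.3f}, F(1,11) = {t**2:.3f}, p = {p:.3f}")
```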
10. Semantic decisions on heard speech > noise bursts (with finger press)
(Left and right hemisphere renderings, p < 0.001)
This contrast should include all cognitive components of speech perception.
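Every activation map in this talk comes from a subtraction contrast of this kind (condition A > condition B). The toy single-voxel GLM below is only an illustration of that logic, with made-up regressors and signal; it is not the actual SPM analysis used in these studies:

```python
# Toy single-voxel GLM: two condition regressors (speech, noise) plus a constant.
# The contrast [1, -1, 0] tests speech > noise at that voxel.
import numpy as np

rng = np.random.default_rng(0)
n_scans = 100
speech = (np.arange(n_scans) % 20 < 10).astype(float)   # toy boxcar for speech blocks
noise  = 1.0 - speech                                    # toy boxcar for noise blocks
X = np.column_stack([speech, noise, np.ones(n_scans)])   # design matrix

y = 2.0 * speech + 1.0 * noise + rng.normal(0, 1, n_scans)  # simulated voxel time course

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares parameter estimates
contrast = np.array([1.0, -1.0, 0.0])          # speech > noise
print("contrast estimate (speech - noise):", contrast @ beta)
```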
11. Speech > Environmental sounds
(Left and right hemisphere renderings)
Matched for task, response and semantic content, but not for phonological input / lexical access to semantics.
12. Speech > Environmental sounds
(Left and right hemisphere renderings)
Do these regions:
- all play a role in phonological processing or lexical access to semantics? If so, what role does each area play?
- or have functions that are not stipulated in cognitive models? If so, what are these functions?
13. Experiment 2: Listen to (A) speech (unrelated words) or (B) the same words after being digitally reversed (see the waveform-reversal sketch below)
Reversed speech:
- sounds like a foreign language
- partially controls for phonetic cues (except that the cues are reversed)
- does not contain word forms or their semantic associations
- may reduce attention to auditory stimuli
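Digitally reversed speech is simply the original waveform played backwards in time. A minimal sketch (file names hypothetical) using the soundfile package:

```python
# Reverse a recorded word in time to create the "reversed speech" condition.
import soundfile as sf  # pip install soundfile

data, sample_rate = sf.read("word.wav")                   # hypothetical input file
sf.write("word_reversed.wav", data[::-1], sample_rate)    # same samples, reversed order
```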
14. Experiment 2: Listen to speech > reversed speech
(Left and right hemisphere renderings; coronal sections at y = 8, y = -12, y = -42)
15. Summary of Experiments 1 & 2
- Speech > sounds: ? phonetic cues / lexical access to semantics
- Speech > reversed speech: ? semantics / ? phonetic cues
16. Five left temporal lobe speech areas
(Sections at y = -24, x = -62, x = -55)
- 2 regions: speech > sounds but not speech > reversed (? acoustic/phonological)
- 3 regions: speech > sounds and speech > reversed (? semantics, ? phonetic cues)
17. Are the left temporal speech areas:
- dedicated to phonological / semantic processing?
- also activated by non-verbal stimuli / tasks?
18. Non-verbal auditory activation
- Experiment 1: Semantic decisions on (A) environmental sounds and (B) noise
- Griffiths et al., 1998: Listen to (A) melodies and (B) fixed-pitch sequences
- Warren et al., 2003: Changes in (A) pitch chroma and (B) pitch height
- Vandenberghe et al., 1996: (A) semantic and (B) perceptual decisions
19. Semantic decisions on environmental sounds > noise
(Left and right hemisphere renderings: sounds > noise)
20. Semantic decisions on environmental sounds
(Left hemisphere rendering and coronal section: sounds > noise)
21. Activated by sounds but more activated by words
Is this region involved in acoustic processing?
22. Griffiths et al., 1998: Listen to (A) melodies and (B) fixed-pitch sequences, with parametric modulation of pitch strength
As pitch gets stronger, the melody gets stronger.
Melody: how the pitch changes over the course of the sound.
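Parametric modulation means adding a regressor whose amplitude tracks pitch strength across the stimulation, rather than a simple on/off condition regressor. A rough sketch of the idea with invented values (not the actual Griffiths et al. design):

```python
# Toy parametric-modulation design: a melody-condition boxcar plus a modulator
# whose height tracks (mean-centred) pitch strength during each melody block.
import numpy as np

melody_on = np.repeat([1, 0, 1, 0, 1, 0], 10).astype(float)       # toy on/off boxcar
pitch_strength = np.repeat([0.2, 0.0, 0.5, 0.0, 0.9, 0.0], 10)    # hypothetical pitch strength

modulator = melody_on * (pitch_strength - pitch_strength[melody_on == 1].mean())
X = np.column_stack([melody_on, modulator, np.ones(melody_on.size)])

# A positive effect on the 'modulator' column means activation increases as
# pitch strength (and hence melody salience) increases.
print(X.shape)  # (60, 3)
```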
23. Griffiths (1998): Activation as the strength of melody increases
(Left and right hemisphere renderings)
24. Melody / Speech
(Sections at y = -42, y = -34, z = -22, z = -14; speech > reversed)
25. Activated by words > sounds and words > reversed words, but also activated by melodies
Are these regions determining how the pitch changes over the course of the sound?
26. Warren et al., 2003: Changes in (A) pitch chroma and (B) pitch height
Pitch chroma is used in tracking changes in information (melodies); pitch height is used in the segregation of sources (male/female, cello/violin).
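Pitch chroma is the cyclic, within-octave component of pitch (note class), while pitch height is the octave component. A small sketch of that decomposition for a pure-tone frequency, assuming a 440 Hz reference (a convention chosen here, not taken from the paper):

```python
# Decompose a frequency into pitch height (octave number) and chroma (position
# within the octave), relative to a hypothetical 440 Hz reference.
import math

def pitch_height_and_chroma(freq_hz, ref_hz=440.0):
    octaves = math.log2(freq_hz / ref_hz)   # continuous pitch in octaves re: reference
    height = math.floor(octaves)            # octave component (pitch height)
    chroma = octaves - height               # cyclic component in [0, 1) (pitch chroma)
    return height, chroma

print(pitch_height_and_chroma(880.0))   # one octave up: height 1, chroma 0.0
print(pitch_height_and_chroma(660.0))   # same chroma as 330 Hz, different height
```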
27. Warren (2003): Pitch chroma vs pitch height
Pitch chroma (anterior auditory cortex)
Pitch height (posterior auditory cortex)
28. Pitch / Speech
Pitch chroma > height
Speech > sounds
29. Activated by words > sounds, but also activated by pitch chroma
30. Semantics / Speech
Semantic decisions on objects
31. Activated by words > sounds and words > reversed words, but also activated by semantic decisions on visual objects
32. Summary of non-verbal responses in auditory speech areas
(Regions labelled: pitch chroma, environmental sounds, melody, melody, semantics)
No extra regions dedicated to phonology. Does phonology emerge from increased demands on other acoustic areas? Or are there small populations of neurons that we are not detecting?
33. Structure of talk
- Auditory speech comprehension
- Speech production
- Reading
- Conclusions
- Consistency with neuropsychological studies
34. Speech Production Areas
- Object naming: word retrieval, speech production, object recognition
- Verbal fluency: word retrieval, speech production, conceptual processing
35. Segregating speech production from object recognition
Speech > finger tasks
Speech task 1: Object name > object decision
(Examples of target and non-target stimuli)
36. Segregating speech production from object recognition
Speech > finger tasks
Speech task 1: Object name > object decision
Speech task 2: Say "OK" > circle detection
(Examples of target and non-target stimuli)
37. Speech production excluding object recognition
1) Object name > decision
38. Speech production excluding object recognition
1) Object name > decision
2) Say "OK" > circle detection
Does not control for mouth movements.
39. Speech production excluding mouth movements
Visual pacing stimuli (1 3 1 3)
1) Speech production
40. Speech production excluding mouth movements
Visual pacing stimuli (1 3 1 3)
1) Speech production
2) Mouth movements
41. Speech production excluding mouth movements
Visual pacing stimuli (1 3 1 3)
1) Speech production
2) Mouth movements
3) Finger tapping
42. Speech production excluding mouth movements
Broca's and Wernicke's areas
43. Wernicke's area also responds to unfamiliar melody
Speech > mouth & finger movements
Unfamiliar melody
44. Broca's area also responds to nonverbal action retrieval
Action > size
45. Broca's area also responds to nonverbal action retrieval
Speech > mouth & finger movements
Action retrieval
46. Speech Production Areas
Mouth & finger movements
Speech > mouth & finger movements
Wernicke's area also responds to non-verbal auditory stimuli; Broca's area also responds to nonverbal action retrieval.
47. The Evolution of Language
Corballis (Behav Brain Sci., 2003): Language evolved from manual gestures, gradually incorporating vocal elements. The transition may be traced through changes in the function of Broca's area. Its homologue in monkeys has nothing to do with vocal control, but contains the so-called "mirror neurons" that code for both the production of manual reaching movements and the perception of the same movements performed by others.
Rizzolatti & Arbib (Trends in Neurosciences, 1998): An observation/execution matching system provides a necessary bridge from 'doing' to 'communicating', as the link between actor and observer becomes the link between the sender and the receiver of each message.
48. Structure of talk
- Auditory speech comprehension
- Speech production
- Reading
- Conclusions
- Consistency with neuropsychological studies
49. Reading Areas
Reading > fixation
50. Reading Areas
Reading > fixation
Pure alexia lesion (Leff et al., 2000)
51. Reading Areas
Reading > fixation
Object naming > fixation
52. Reading Areas
Reading > fixation
Object naming > fixation
Object naming > reading
53. Left occipito-temporal cortex also responds to nonverbal action retrieval
Action > size
54. Action retrieval to un-nameable objects
(Overlaid: reading; speech)
Mirror neurons: precursor of speech
55. Summary
(Diagram: verbal functions - comprehension, production, reading; non-verbal functions - melody, action)
56. Conclusion: no area is dedicated to word processing
- Broca's area: speech production; action retrieval to non-objects (?)
- Wernicke's area / anterior temporal: auditory speech perception; long-term time structure of melodies
- Left fusiform: visual word recognition
Specialisation arises from interactions between areas.
57. Structure of talk
- Auditory speech comprehension
- Speech production
- Reading
- Conclusions from functional imaging
- Consistency with neuropsychological studies
58. If there are no language-specific regions, how can we explain language-specific impairments in neuropsychological studies of brain-damaged patients?
59. E.g. if left occipito-temporal activation is observed for action retrieval on non-objects as well as for reading, how can pure alexia be explained?
(Images: reading; action retrieval to un-nameable objects; typical lesion in pure alexia)
60. Contrasting accounts of neuropsychological deficits
- Classic lesion-deficit interpretation: pure alexia results from damage to an area specialised for reading.
- Degeneracy interpretation: pure alexia results from damage to a shared region, but there are alternative pathways available for object processing.
61. Different processing in left and right OT?
- Left occipito-temporal (e.g. center-biased representations / local features)
- Right occipito-temporal (e.g. periphery-biased representations / global features)
- Both receive input from visual cortex
Normal: both left and right O-T are activated by reading and object naming.
62. Different processing in left and right OT?
- Left occipito-temporal (e.g. center-biased representations / local features)
- Right occipito-temporal (e.g. periphery-biased representations / global features)
- Both receive input from visual cortex
Both left and right O-T are activated by reading and object naming.
When left OT is damaged, the patient is dependent on right OT; right OT may sustain object naming better than reading if object naming is less error prone when recognition depends on global / periphery-biased processes.
63. Different processing in left and right OT?
- Left occipito-temporal (e.g. center-biased representations / local features)
- Right occipito-temporal (e.g. periphery-biased representations / global features)
- Both receive input from visual cortex
Both left and right O-T are activated by reading and object naming.
When left OT is damaged, the patient is dependent on right OT; right OT may sustain object naming better than reading if object naming is less error prone when recognition depends on global / periphery-biased processes.
Damage to a shared area can therefore result in a stimulus-specific deficit.
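The degeneracy argument can be made concrete with an illustrative toy model (the numbers are invented, purely to show how damage to a shared region can yield a stimulus-specific deficit):

```python
# Toy model: both left and right occipito-temporal (OT) cortex support reading
# and object naming, but with different (hypothetical) reliabilities.
support = {
    "left_OT":  {"reading": 0.95, "naming": 0.95},   # local / center-biased processing
    "right_OT": {"reading": 0.40, "naming": 0.85},   # global / periphery-biased processing
}

def performance(task, lesioned=()):
    """Best available support for a task across the intact regions."""
    intact = {r: s for r, s in support.items() if r not in lesioned}
    return max(s[task] for s in intact.values())

print("intact:         reading", performance("reading"), " naming", performance("naming"))
print("left OT lesion: reading", performance("reading", lesioned=("left_OT",)),
      " naming", performance("naming", lesioned=("left_OT",)))
# Damage to the shared left OT region leaves naming relatively preserved but
# reading severely impaired: a stimulus-specific deficit without a reading-specific region.
```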
64. Summary
- Broca's area: speech production; action retrieval to non-objects (?)
- Wernicke's area / anterior temporal: auditory speech perception; long-term time structure of melodies
- Left fusiform: visual word recognition
Specialisation arises from interactions between areas.