ORAL SESSION: LANGUAGE

Auditory and visual speech converge in auditory cortex at the phonetic level

G. A. Calvert1, E. T. Bullmore2, M. J. Brammer2, R. Campbell3, S. C. R. Williams2, P. K. McGuire2, P. W. R. Woodruff2, S. D. Iversen1 and A. S. David2
1University of Oxford, Oxford; 2Institute of Psychiatry, London; 3University College, London

Introduction
Investigating how information from distinct modalities is integrated in the brain is important for understanding the unified perception of heard and seen speech, and the McGurk illusion. In fMRI studies of hearing subjects, we compared the cerebral regions activated by silent lip-reading with those activated during heard speech in the absence of visual cues, to localise a common pathway by which information from the visual and auditory modalities is integrated during face-to-face conversation. In two further experiments, we manipulated the linguistic saliency of these visual cues to explore the level at which they may influence auditory speech perception.

Methods
All experiments employed an alternating test-versus-baseline design: 30 s ON, 30 s OFF. Five subjects were scanned whilst listening to speech (Expt 1) versus ambient noise, and whilst lip-reading the same stimuli without the accompanying sound (Expt 2) versus viewing a static face. During both experiments, subjects were asked to repeat back to themselves the stimuli they heard (Expt 1) or saw (Expt 2), so that cortical regions involved in internally generated speech were activated consistently in both conditions. Brain regions activated in Expt 1 were then superimposed on areas activated in Expt 2; see Figure. In 5 new subjects, we repeated Expt 2 and then, holding the test condition constant (silent lip-reading), changed the baseline condition to viewing closed mouth movements (Expt 3) and viewing pseudospeech (non-word phoneme strings) (Expt 4). Gradient-echo EPI data were acquired at 1.5 T. 100 T2*-weighted images depicting BOLD contrast were acquired with an in-plane resolution of 3 mm (TR = 3 s, TE = 40 ms) at each of 10 near-axial non-contiguous 5 mm thick slices positioned parallel to the AC-PC line to include visual, auditory and frontal cortices. Images were analysed by generic brain activation mapping (GBAM) (1).

Results and Conclusion
The most intriguing finding was the considerable overlap of activation in auditory cortex (BA 41, 42 & 22) during silent lip-reading in Expt 2 and auditory speech perception in Expt 1. This suggests that lip-reading activates very similar cortical substrates to those activated by heard speech, providing a possible physiological basis for the enhancing effects of visual cues on auditory speech perception and the McGurk illusion. Furthermore, these auditory cortical sites are similarly activated by visible pseudospeech (Expt 4) but not by non-linguistic closed mouth movements (Expt 3). This provides novel physiological support for psychological evidence that lip-reading modulates the auditory perception of speech at a prelexical level, potentially prior to the classification of sound into phonemes.

GBAM: white voxels show coincident temporal cortical activation by silent lip-reading and hearing speech

Reference
1. Bullmore ET, Brammer MJ et al. (1996) Magn Reson Med 35: 261-277.
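The block-design analysis described above — an ON/OFF boxcar fitted to each voxel's time series — can be sketched as follows. This is an illustrative reconstruction in Python, not the GBAM method itself; the gamma-shaped HRF, the drift term, the noise level and the simulated voxel data are all assumptions.

```python
import numpy as np

# Sketch of a block-design GLM, assuming TR = 3 s and 30 s ON / 30 s OFF
# blocks over 100 scans, as in the abstract; HRF shape is illustrative.
TR = 3.0
n_scans = 100
t = np.arange(n_scans) * TR

# Boxcar: alternating 30 s OFF / 30 s ON -> 10 scans per half-cycle
boxcar = (np.arange(n_scans) // 10) % 2 == 1

# Simple gamma-variate haemodynamic response function (illustrative shape)
ht = np.arange(0, 30, TR)
hrf = (ht / 6.0) ** 2 * np.exp(-ht / 3.0)
hrf /= hrf.sum()

regressor = np.convolve(boxcar.astype(float), hrf)[:n_scans]

# Simulated voxel: task response + slow drift + noise
rng = np.random.default_rng(0)
signal = 2.0 * regressor + 0.01 * t + rng.normal(0, 0.1, n_scans)

# Design matrix: [task regressor, linear drift, constant]
X = np.column_stack([regressor, t, np.ones(n_scans)])
beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
print(round(beta[0], 2))  # estimated task effect (true simulated value: 2.0)
```

The fitted beta for the task regressor recovers the simulated effect size; in practice a t- or F-statistic on this coefficient would be thresholded to produce the activation map.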

ORAL SESSION: LANGUAGE

Modulation of ERP correlates of the detection of an ambiguous speech segment by the context of presentation: phonetic versus auditory modes of perception?

P. Celsis, B. Doyon, K. Boulanouar, J.L. Nespoulous
INSERM U455 and Laboratoire Jacques-Lordat, Toulouse, France

Introduction
The hypothesis that speech may be perceived and processed differently from other auditory signals has been the focus of a prolonged debate. Experiments in animals (1) support the view that categorical perception is a general auditory process, while other studies (2) provide arguments for a specific mode of processing for speech sounds. A convenient way to test the hypothesis is to use acoustic patterns that can be perceived either as speech or as nonspeech, depending on the context of presentation. In this study, we recorded ERPs in a detection task where the target was an ambiguous speech segment presented either in a speech or an environmental-sound context. We hypothesized that modulation effects of context might be seen in ERP amplitudes, latencies or hemispheric distribution.

Methods
The target was the unvoiced fricative consonant /f/ pronounced by a human speaker. This phoneme is ambiguous in that, presented in isolation, it can be perceived as a nonspeech sound. Two lists of 180 stimuli were constructed in which 30 targets were presented either among the CV syllables /fi/, /vi/, /si/ and /zi/ (speech context) or among four environmental sounds: pouring water, paper crumpling, a hooter blast and glass scratching (sound context). The duration of targets and distractors was 200 ms and the inter-stimulus interval was 2 s. The order of presentation of the lists varied among subjects. Stimuli were delivered binaurally via earphones. Eleven right-handed volunteers were instructed to push a button as quickly as possible each time they heard the target, the right and left finger being alternated between subjects and lists. ERPs were recorded through a 32-channel NEUROSCAN system. Recordings were filtered, corrected for ocular artifacts and cut into one-second epochs synchronized on the stimuli. Grand-average ERPs were then calculated from the whole group for each electrode and each context of presentation. Significance of differences between contexts was tested by paired t-test. In addition, maps of ERPs interpolated across electrodes were generated to give an idea of the spatial distribution of the events.
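The ERP pipeline described above — per-subject averages in each context, grand averages, and a paired t-test on a post-stimulus window — can be sketched as follows. The 11 subjects, 30 targets and one-second epochs follow the abstract; the sampling rate, the shape of the simulated positivity and all data values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
fs = 250                            # Hz (assumed); 1 s epochs -> 250 samples
n_sub, n_trials, n_samp = 11, 30, fs
t_ms = np.arange(n_samp) / fs * 1000.0

def simulate(positivity):
    # subjects x trials x time: noise plus (optionally) a positive wave
    # peaking near 350 ms, as reported for the speech context
    wave = positivity * np.exp(-((t_ms - 350.0) / 60.0) ** 2)
    return rng.normal(0, 1, (n_sub, n_trials, n_samp)) + wave

# Per-subject average waveforms, one per context
speech = simulate(2.0).mean(axis=1)     # speech context: extra positivity
sounds = simulate(0.0).mean(axis=1)     # environmental-sound context

grand_speech = speech.mean(axis=0)      # grand averages over the group
grand_sounds = sounds.mean(axis=0)

# Paired t-test on mean amplitude in the 250-450 ms window
win = (t_ms >= 250) & (t_ms <= 450)
tval, pval = ttest_rel(speech[:, win].mean(axis=1),
                       sounds[:, win].mean(axis=1))
print(round(float(tval), 2))
```

In the real analysis the test would be applied per electrode (and corrected accordingly); this sketch shows only the single-channel case corresponding to Cz.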

Results
Fig 1 shows the grand-average ERPs at Cz for target /f/ according to the context of presentation. Whereas the waves are perfectly superimposed up to 250 ms, a clear difference then appears that lasts for about 200 ms (250-450 ms after stimulus presentation). In this time window, the ERPs obtained in the speech context show an ample positive wave that peaks at about 350 ms and is not observed in the context of environmental sounds, the difference between contexts being significant (t > 3.5, p < 0.01). Fig 2 illustrates the spatial distribution of the speech-minus-sound difference between the traces in the 250-350 ms window (vertical bars on Fig 1). The distribution is clearly asymmetric, with a predominance for speech in the left hemisphere.

Fig 1: ERPs at Cz for target /f/ according to the different contexts of presentation (Speech vs Sounds).

Fig 2: Brain mapping of the difference in ERPs for target /f/ between the speech context and the environmental-sound context.

Conclusion
This study shows a top-down, modulatory effect of context on the ERP correlates of an auditorily presented stimulus. While it does not demonstrate the existence of a speech-specific mode of perception, it suggests that different neural networks may be involved in the processing of the same stimulus depending on whether or not the stimulus bears a phonetic value.

References
1. Steinschneider, M. et al. Brain and Language 1995, 48: 326-340.
2. Mann, V.A., Liberman, A.M. Cognition 1983, 14: 211-235.


ORAL SESSION: LANGUAGE

Selective Processing of Nouns and Function Words in the Human Brain

AC Nobre1, CJ Price2, R Turner2 & K Friston2
1Department of Experimental Psychology, Oxford University, Oxford, UK; 2The Wellcome Department of Cognitive Neurology, Institute of Neurology, London, UK.

Objectives
We investigated the hypothesis that the semantic representations of words reflect basic functional specialisations of brain regions. Specifically, we contrasted the processing of concrete nouns and grammatical function words during a task which focused on word meaning.

Methods
Ten right-handed native-English speakers, 19-33 years old, were studied using fMRI. Subjects detected successive synonyms of concrete nouns or function words, or successive repetitions of consonant letter strings. Subjects completed three experimental runs in counterbalanced order. In each, two stimulus types were presented in blocks of 10 stimuli which alternated and repeated six times. Runs contained 120 stimuli and lasted four minutes. Stimuli appeared for 500 ms at an average interval of 2 s. Stimuli were equated for length, and words were equated for frequency.
The data were acquired at 2 Tesla using a Magnetom VISION (Siemens, Erlangen) whole-body MRI system equipped with a head volume coil. Contiguous multislice T2*-weighted fMRI images were obtained with a gradient-echo planar sequence using an axial slice orientation (TE = 40 ms, TR = 1.7 seconds, 64x64 voxels). Each run comprised 120 volume images with 16 slices of 3x3x6 mm voxels. The images spanned most of the cerebrum, starting at the inferior temporal lobe. Structural images were obtained at the same orientation using a T1-weighted sequence with 1x1x3 mm voxels (matrix size: 192x256x54). The data were analysed using the general linear model, in this instance multiple linear regression on condition-specific waveforms convolved with a hemodynamic response function as implemented in SPM95 (Wellcome Dept. of Cognitive Neurology). Condition-specific effects were assessed by comparing (1) nouns to function and non-words, (2) function words to nouns and non-words and (3) non-words to nouns and function words.
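The condition-specific comparisons in (1)-(3) amount to contrasts over condition regressors in a linear model. A minimal sketch, with an illustrative blocked design and simulated voxel data (not the SPM95 implementation; block ordering, effect size and noise are assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
n_scans, conds = 120, 3
# Blocked condition labels: 0 = nouns, 1 = function words, 2 = non-words
labels = np.tile(np.repeat(np.arange(conds), 10), 4)

# Design matrix: one indicator column per condition, plus a constant
X = np.zeros((n_scans, conds + 1))
X[np.arange(n_scans), labels] = 1.0
X[:, -1] = 1.0

# Simulated voxel responding preferentially to nouns
y = 1.5 * (labels == 0) + rng.normal(0, 0.3, n_scans)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Comparison (1): nouns versus the mean of function words and non-words
c = np.array([1.0, -0.5, -0.5, 0.0])
effect = c @ beta
print(round(float(effect), 2))
```

The design matrix here is rank-deficient (the indicators sum to the constant), but the contrast is estimable, so its value is unique regardless of which least-squares solution `lstsq` returns.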

Results
Distinct distributed neural systems were associated with concrete nouns and function words (Table 1). Concrete nouns elicited left-weighted activity in six brain regions in the majority of the subjects: inferior frontal cortex (IF, BA 47), posterior middle temporal cortex in the angular gyrus (AG, BA 39), medial temporal-lobe regions including hippocampus and the underlying cortex (MeT), lateral anterior temporal cortex (LAT), posterior cingulate gyrus just anterior to the parietal-occipital sulcus (PC, BA 23), and prefrontal cortex in the superior frontal gyrus/sulcus (PF, BA 9). Function words activated a different pattern of left-hemisphere structures: precentral sulcus in Broca's area of premotor cortex (PM, BA 6/44), precentral gyrus in motor cortex (M, BA 4/6), inferior frontal gyrus (IF, BA 45), superior temporal sulcus and middle temporal gyrus (MiT, BA 21), and supramarginal gyrus at the end of the Sylvian fissure (SMG, BA 40).

Conclusions
The results suggest that word-related representations are distributed according to the functional properties of words. Concrete nouns have rich semantic representations, with strong visual and mnemonic associations. Activations in the left inferior frontal cortex, angular gyrus, left anterior temporal region, and superior frontal gyrus were similar to those obtained in tasks which stress semantic processing of words and objects (1-2). Medial temporal activation may have reflected visual or multimodal associations of concrete imageable nouns. Posterior cingulate activation may also be linked to memory associations.

Function words do not have strong visual or semantic associations, and may rely more heavily on phonology and articulation. The brain regions engaged were consistent with those imaged during tasks which have emphasised phonological and articulatory processes (2,3). Unlike other word types, function words do not have any meaning per se. Instead, their meaning is derived from anchoring, linking or sequencing other items. This role is fundamental to grammatical processing and is consistent with functions of the premotor cortex. The selective activation of premotor cortex in the region of Broca's area may reflect these "premotor" attributes of function words.

noun      IF (47)         AG (39)         MeT             LAT             PC (23)         PF (9)
          -38 +34 -06     -51 -66 +20     -31 -26 -16     -56 -03 -14     -12 -54 +34     -14 +53 +25
          n = 8: 7L, 1R   n = 8: 7L, 2R   n = 7: 7L, 2R   n = 5: 5L, 0R   n = 7: 6L, 3R   n = 5: 4L, 1R

function  PM (44/6)       M (6/4)         IF (45)         PF (46/9)       MiT (21)        SMG (40)
          -57 +09 +32     -40 -02 +59     -56 +23 +15     -45 +51 +17     -59 -47 +02     -56 -45 +27
          n = 8: 8L, 1R   n = 7: 7L, 1R   n = 7: 7L, 6R   n = 6: 6L, 1R   n = 9: 9L, 6R   n = 9: 9L, 0R

Table 1: regions of peak activation in the majority of subjects (Brodmann area), average Talairach location of the peak activations in the left hemisphere across subjects, and number of subjects who showed the effect.

References
1. Vandenberghe R, Price C, Wise R, Josephs O, Frackowiak RSJ. Nature 1996, 383: 254-256.
2. Price CJ, Moore CJ, Humphreys GW, Wise RJS. J Cog Neurosci 1997, in press.
3. Paulesu E, Frith CD, Frackowiak RSJ. Nature 1993, 362: 342-345.


ORAL SESSION: LANGUAGE

Cortical Regions Associated with Perception, Naming, and Knowledge of Color

L.L. Chao, J.A. Weisberg, C.L. Wiggs, J.V. Haxby, L.G. Ungerleider, A. Martin
Laboratory of Brain and Cognition, NIMH, Bethesda, Maryland, USA

Previous PET results from this laboratory suggested that information about object color (color knowledge) is stored near, but not within, cortical regions involved in the perception of color (1). Lesion data suggest that color knowledge may be impaired in some (2) but not all (3) achromatopsic patients. Thus, it is not clear whether retrieval of information about a specific object attribute (e.g. color) requires re-activation of areas that mediate perception of that attribute. We used positron emission tomography (PET) to investigate this issue by identifying the location of the cortical regions associated with color perception, color naming, and color knowledge in the same subjects.

Methods
Twelve (6 males) neurologically normal, right-handed subjects (age range: 21 to 39) participated in the study. PET scans were obtained with a Scanditronix PC2048-15B tomograph (Milwaukee, WI). During separate scans subjects were presented with colored and equiluminant gray-scale Mondrians, colored objects embedded within colored Mondrians, and achromatic objects embedded within gray-scale Mondrians. Subjects passively viewed the Mondrians, named the colored and achromatic objects, named the color of colored objects, and generated color names associated with achromatic objects. Subjects began the task about 30 s before injection of 30 mCi of H2 15O. Scanning began when the brain radioactivity count reached a threshold value and continued for 60 seconds. Data were analyzed with SPM.

Results
Perception of color (naming colored objects relative to naming achromatic objects) activated the right collateral sulcus, while selective attention to color (naming the color of colored objects relative to naming the colored objects) activated the left lingual gyrus. The coordinates of these activations were in close agreement with those in previous imaging studies of color perception (4,5,6). Relative to naming achromatic objects, generating color names for these achromatic objects (color knowledge condition) activated left inferior temporal, frontal, and posterior parietal cortex. These results replicated previous findings from this laboratory (1). When retrieval of color knowledge was compared to the low-level gray-scale baseline, there was activation in the left fusiform/lateral occipital region. However, this activation was lateral to the site activated by color perception. Naming achromatic objects also activated this area, and other studies have shown that this region is responsive to object form as well (7).

Conclusions
Based on the regions identified in this study that responded selectively to color and not object form, the results suggest that retrieving color knowledge does not activate regions subserving color perception. Thus, in agreement with a recent finding from achromatopsic patients (3), intact color perception is not necessary for retrieving color knowledge.

References
1. Martin, A. et al., Science. 1995, 270: 102-105.
2. Ogden, J.A. Neuropsychologia. 1993, 31(6): 571-589.
3. Shuren, J.E. et al., Neuropsychologia. 1996, 34(6): 485-489.
4. Zeki, S. et al., J Neurosci. 1991, 11: 641-649.
5. Corbetta, M. et al., J Neurosci. 1991, 11: 2383-2402.
6. Clark, V.P. et al., Human Brain Mapping (in press).
7. Malach, R. et al., Proc. Natl. Acad. Sci. USA 1995, 92: 8135-8139.


ORAL SESSION: LANGUAGE

Functional Connectivity of the Angular Gyrus During Single Word Reading in Normal and Dyslexic Men

B. Horwitz1, J. M. Rumsey2, B. C. Donohue2
1National Institute on Aging and 2National Institute of Mental Health, NIH, Bethesda, MD, USA

Introduction
It has long been conjectured, based on lesion studies, that the left angular gyrus plays a central role in reading; hypotheses for the neuropathological basis of alexia center on a disconnection of the angular gyrus from occipital and temporal lobe areas (1). Positron emission tomographic (PET) and functional MRI studies of reading, however, have been inconsistent in showing activation of the angular gyrus during reading compared to viewing letter-like stimuli; rather, activation generally is found in occipital and temporal lobe regions (2,3). However, a region such as the angular gyrus may play a critical role in the neural network mediating single word reading without showing increased regional cerebral blood flow (rCBF) compared to some control condition, and within-task covariance analysis (4) can be used to investigate this by examining a region's functional connectivity. If the left angular gyrus plays a central role in reading, then it should show strong within-task, across-subjects correlations with extrastriate occipital and temporal lobe regions in control subjects during reading. In this study, we examined the functional connectivity of the angular gyrus in normal men and in a reading-impaired group of dyslexic men.

Methods
We measured rCBF using [O-15]-water and PET in 14 young males (mean age ± sd: 25 ± 5 yrs) and 17 dyslexic men (27 ± 8 yrs). Each subject performed two pronunciation tasks (orthographic: read words with irregular spellings; phonologic: read pseudowords) (3). Data were stereotactically normalized and smoothed to 20mm x 20mm x 12mm; values of rCBF for each subject were standardized by dividing rCBF for each pixel by global CBF. Because of smoothing, an individual pixel can be thought of as representing regional activity. Interregional correlations (across subjects) within each condition were calculated (4) using reference pixels in the vicinity of the right and left angular gyri where the middle temporal gyrus and the inferior parietal lobule meet [Talairach (5) coordinates (x,y,z) = ±44, -54, +24].
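The within-task, across-subjects correlation method can be sketched as follows: standardized rCBF from a reference (angular gyrus) pixel is correlated, across subjects, with rCBF at other pixels. The control-group size follows the abstract; the coupled/uncoupled pixels and all data values are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_controls = 14

# Simulated standardized rCBF (rCBF / global CBF) per subject: a reference
# angular-gyrus pixel and a fusiform pixel sharing a common signal, plus an
# uncoupled pixel elsewhere.
shared = rng.normal(0, 1, n_controls)
ang_gyrus = shared + rng.normal(0, 0.3, n_controls)
fusiform = shared + rng.normal(0, 0.3, n_controls)
unrelated = rng.normal(0, 1, n_controls)

# Functional connectivity = across-subject Pearson correlation
r_coupled = np.corrcoef(ang_gyrus, fusiform)[0, 1]
r_uncoupled = np.corrcoef(ang_gyrus, unrelated)[0, 1]
print(round(r_coupled, 2), round(r_uncoupled, 2))
```

The coupled pair yields a high correlation and the uncoupled pair does not, which is the pattern the abstract reports for controls versus dyslexic readers.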

Results
A significant (p < .001) positive correlation between standardized rCBF in the left angular gyrus and standardized rCBF in the left fusiform gyrus (Brodmann area (BA) 19/37; local maximum at Talairach coordinates -48, -64, -12) was found for orthographic pronunciation in controls but not in dyslexic men. This pattern of significant correlations was not found between rCBF in the right angular gyrus and rCBF in right occipital and temporal areas in controls. During phonological pronunciation, rCBF in the left angular gyrus was significantly correlated with rCBF in the left fusiform gyrus (BA 20; local maximum at -46, -34, -20) and with rCBF in a region in the left superior and middle temporal gyri (BA 22; local maximum at -48, -34, 8) in controls but not in dyslexic men.

Conclusions
The results of this functional connectivity study indicate that many aspects of the classical model (1) for reading, based on lesion studies, are found in normal readers during single word reading. In particular, we have found strong functional linkages between the left angular gyrus and visual association occipital and temporal areas shown to be activated by word and word-like stimuli (2,3). Furthermore, during a pseudoword pronunciation task, the left angular gyrus also was functionally linked to these areas, as well as to a region in the left superior and middle temporal gyri that may correspond to Wernicke's area. In dyslexic men, these strong functional linkages were absent, suggesting a functional disconnection of the left angular gyrus from visual areas and from Wernicke's area in these patients.

References
1. Henderson, VW. Brain Lang. 1986, 29: 119-133.
2. Price, CJ., Wise, RJS., Watson, JDG., Patterson, K., Howard, D., Frackowiak, RSJ. Brain. 1994, 117: 1255-1269.
3. Rumsey, JM., Horwitz, B., Donohue, BC., Nace, K., Maisog, JM., Andreason, P. (under review).
4. Horwitz, B. Hum. Brain Mapping. 1994, 2: 112-122.
5. Talairach, J., Tournoux, P., Co-Planar Stereotaxic Atlas of the Human Brain, Thieme, 1988.


ORAL SESSION: LANGUAGE

[Fig. 2: activity in left DLPFC over the experiment (signal vs. scan number, 100-300 scans); Fig. 3: task performance (%) by session number (6 blocks per session)]
Learning-related temporal dynamics in the prefrontal cortex

P.C. Fletcher, O. Josephs, K.J. Friston, C. Büchel and R.J. Dolan
Wellcome Department of Cognitive Neurology, Institute of Neurology, London, UK.

Introduction
When a task is unpractised, attentional and working memory demands may be greater, and qualitatively different executive processes may predominate compared to when the task is well learnt. This poses problems for interpreting functional imaging studies of learning. We addressed these problems, with respect to the dorsolateral prefrontal cortex (DLPFC), in an artificial grammar learning paradigm. Learning processes occurred within and across sessions, allowing a disambiguation of the time-dependent dynamics of DLPFC responses associated with the learning of specific items from those involved with learning more general rules.

Methods
In two subjects, 360 64-slice T2*-weighted MRI images (TR = 6.4 s) were acquired in two alternating conditions. The activation task required a grammatical/ungrammatical decision, and push-button response, to 4-letter consonant strings. The artificial grammar system was unknown to the subject and feedback was supplied after each response. The baseline task required a simple, pre-specified response to one of two strings (NNNN - push button 1; PPPP - push button 2). Stimulus onset asynchrony was 3.2 s. Study items were presented in blocks of 10 alternating with blocks of 10 control items. Each block was presented 6 times (one session) and, on completion, a new session began. Items in each successive session were generated from the same grammar system. Thus, within sessions, subjects learnt repeatedly presented stimulus-response pairings, and across sessions they learnt rules derived from those explicitly learnt pairings.
Data were analysed using multiple linear regression as implemented in SPM96 (Wellcome Department of Cognitive Neurology). We examined 2 models, one comprising a simple learning effect (learning versus control tasks) together with adaptation, i.e. condition-by-time interactions (for the learning condition alone) across and between sessions, and a second including, in addition, second-order time interactions. (An F ratio test was used to evaluate the significance of these higher-order terms. They were found to be insignificant and we present results from the simpler model.) Figures 1 and 2 show plots of activity over the experiment, for one of the subjects, for left and right DLPFC (the other subject showed qualitatively similar results). Performance data for the subject are shown in figure 3.
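The two-model comparison described in Methods — a learning-versus-control effect plus a condition-by-time interaction, with an F ratio test for higher-order time terms — can be sketched as follows. The block structure, the linearly decaying simulated response and the noise level are illustrative assumptions, not the SPM96 implementation.

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(3)
n_scans = 360
task = (np.arange(n_scans) // 10) % 2 == 1     # alternating 10-scan blocks
time = np.linspace(-1, 1, n_scans)             # centred scan time

# Simulated voxel: a task effect that decays linearly over the experiment
y = task * (1.0 - 0.5 * time) + rng.normal(0, 0.2, n_scans)

def fit(X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid), X.shape[1]

ones = np.ones(n_scans)
# Model 1: task effect + condition-by-time interaction
X1 = np.column_stack([task, task * time, ones])
# Model 2: adds a second-order (quadratic) time interaction
X2 = np.column_stack([task, task * time, task * time**2, ones])

rss1, p1 = fit(X1)
rss2, p2 = fit(X2)

# F ratio test for the extra (second-order) term
F = ((rss1 - rss2) / (p2 - p1)) / (rss2 / (n_scans - p2))
p_val = 1 - f_dist.cdf(F, p2 - p1, n_scans - p2)
print(round(F, 2))
```

Because the simulated adaptation is purely linear, the quadratic term should add little, mirroring the abstract's finding that the higher-order terms were insignificant and the simpler model sufficed.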

[Fig. 1: activity in right DLPFC over the experiment (signal vs. scan number); one session = 6 blocks learning / 6 blocks control]
Discussion
While the changes in DLPFC responses within sessions are likely to reflect paired-associate (i.e. letter string-grammaticality pairings) learning, and are most appropriately interpreted in this light, the changes across sessions (a decrease in left DLPFC and an increase in right DLPFC activations) are more likely to reflect learning of the artificial grammar itself. This was reflected behaviourally in performance improvements (in the final session, in all blocks, the subject scored 100% despite the items being previously unseen). We interpret changes in left DLPFC responses in terms of processes concerned with abstracting the grammatical rules (consistent with greater activity during earlier sessions and decreasing activity as the experiment progresses). Conversely, the increasing right DLPFC activations with time may reflect the application of the rules and item monitoring in the context of an increasing facility with the grammar system. Reduction of right DLPFC activations within sessions is likely to reflect a reducing need for these processes as items are repeatedly presented.


ORAL SESSION: LANGUAGE

Cerebral substrates of bilingualism: a 3-T fMRI study

Stanislas Dehaene1, Pierre-Francois van de Moortele2, Emmanuel Dupoux3, Stephane Lehericy4, Laurent Cohen4, Daniela Perani5, Jacques Mehler3 and Denis Le Bihan2

1INSERM U334, Orsay, France; 2SHFJ, CEA, Orsay, France; 3Laboratoire de Sciences Cognitives et Psycholinguistique, EHESS/CNRS, Paris, France; 4Hôpital de la Salpêtrière, Paris, France; 5DIPSCO, Scientific Institute H.S. Raffaele, Milan, Italy

When a polyglot suffers a brain lesion, aphasia is occasionally observed in only one of the languages originally mastered (1). This dissociation, together with evidence from electrical cortical stimulation (2), suggests that different brain areas are recruited for learning the first language (L1) and the second language (L2). Yet brain imaging studies in groups of bilingual subjects with variable levels of fluency in L2 have yielded divergent results that fail to pinpoint a specific neuronal substrate for second language acquisition (3,4). The purpose of the present study was to use functional magnetic resonance imaging (fMRI), a method applicable to single subjects, to assess inter-subject variability in the cortical representation of language comprehension processes.

Method. Eight right-handed male students, native speakers of French, aged between 21 and 25, listened to stories in their native language (L1 = French) or in a second language that they had learned at school after the age of seven, and in which they were only moderately proficient (L2 = English). In both languages, short stories were recorded by a native speaker, digitally edited and cut into three blocks of 36 seconds. These blocks were then presented in alternation with three blocks of 36 seconds of backward speech. This design subtracted out brain activity related to early auditory processing and isolated the regions specifically involved in the processing of real speech. Experiments were performed on a 3T whole-body system (Bruker, Germany). Eighteen axial contiguous slices of 5 mm thickness and 22 cm field of view were scanned every 6 seconds for 216 s (36 images) using a gradient-echo EPI sequence. After control for subject motion, activation maps were calculated based on the correlation coefficient between the MR signal time course for each pixel and the processing paradigm (5).

Results. During listening to L1, all subjects showed activity in the left temporal lobe all along the superior temporal sulcus (STS) as well as in neighboring portions of the superior and middle temporal gyri (STG, MTG), often extending anteriorly into the temporal pole (TP; 4 subjects) and posteriorly into the left angular gyrus (AG; 4 subjects). Although similar activity was occasionally found in the right temporal lobe, including the right STS (6 subjects) and TP (2 subjects), it was always weaker, highly variable from subject to subject, and never extended posteriorly into the right AG. During listening to L2, no single anatomical area was found active in more than six subjects. In the left temporal lobe, six subjects showed activity in regions globally similar to those found in L1 (STS, STG, MTG), but with considerably more dispersion and with no activity in the left TP and AG. The remaining two subjects showed a striking absence of activations in the left temporal regions during L2; only their right temporal lobe was active. Three subjects also showed left inferior frontal activations specific to L2, and four also showed anterior cingulate activity. Globally, there was a significantly greater left-hemispheric dominance for L1 than for L2.
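The correlation-based mapping used to obtain these activations (reference 5) can be sketched as follows: each pixel's time course is correlated with the speech/backward-speech alternation and the map is thresholded. The 6 s sampling and 36-volume run follow the abstract; the number of pixels, effect amplitude, threshold and all simulated data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_vols = 36
# 36 s blocks at one volume per 6 s -> 6 volumes per block, alternating
# backward speech (0) and real speech (1)
paradigm = ((np.arange(n_vols) // 6) % 2 == 1).astype(float)

n_pix = 64
data = rng.normal(0, 1, (n_pix, n_vols))
data[:8] += 3.0 * paradigm              # 8 "speech-responsive" pixels

# Pearson correlation of every pixel's time course with the paradigm
pc = paradigm - paradigm.mean()
dc = data - data.mean(axis=1, keepdims=True)
r = (dc @ pc) / (np.linalg.norm(dc, axis=1) * np.linalg.norm(pc))

active = np.flatnonzero(r > 0.5)        # threshold the correlation map
print(len(active))
```

Pixels whose time courses track the block alternation exceed the threshold; a real analysis would convert r to a significance level and apply motion correction first, as the abstract notes.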

Conclusions. The fMRI results obtained when listening to L1 replicated previous results using PET (3,6), thus confirming that L1 comprehension relies on a reproducible cerebral network, mostly in the left temporal lobe. Listening to L2, however, activated widely different areas in different subjects, suggesting that in late learners with low proficiency, the primary language network ceases to be available for acquiring a second language.

References.
1. Paradis, M. Aspects of bilingual aphasia. (Pergamon Press, Oxford, 1995).
2. Ojemann, G.A. The Behavioral and Brain Sciences 2, 189-203 (1983).
3. Perani, D., et al. NeuroReport 7, 2439-2444 (1996).
4. Klein, D., Zatorre, R.J., Milner, B., Meyer, E. & Evans, A.C. in Aspects of bilingual aphasia (ed. Paradis, M.) 23-36 (Pergamon Press, Oxford, 1995).
5. Bandettini, P.A., Jesmanowicz, A., Wong, E.C. & Hyde, J.S. Magnetic Resonance in Medicine 30, 161-173 (1993).
6. Mazoyer, B.M., et al. Journal of Cognitive Neuroscience 5, 467-479 (1993).


3. Binder, J.R., et al. Neuroimage 1996, 3: S530.
4. Binder, J.R., et al. J Neurosci 1997, 17: 353-362.


Lateralization of Seizure Focus Predicts Activation of the Left Medial Temporal Lobe During Semantic Information Encoding

P.S.F. Bellgowan, J.R. Binder, S.J. Swanson, T.A. Hammeke, J.A. Springer, J.A. Frost, W.M. Mueller, G.L. Morris

Medical College of Wisconsin, Milwaukee, WI, USA.

Introduction: Epileptic patients with left temporal lobe seizure focus are generally more impaired on verbal memory tasks than are patients with right temporal lobe seizure focus. This memory deficit is correlated with structural pathologies in the hippocampus and may reflect disruption of the verbal episodic memory system at the level of the left medial temporal lobe [1,2]. The involvement of the left medial temporal lobe, including the hippocampus (HIP) and parahippocampal gyrus (PHG), in the encoding of semantic information has been demonstrated using FMRI in normal subjects [3]. The present study was conducted to determine whether left medial temporal lobe pathology, associated with left temporal lobe seizure focus, would decrease functional activity of this region during a semantic encoding task relative to subjects with right temporal lobe seizure focus.

Methods: 26 subjects with temporal lobe epilepsy were thoroughly evaluated in preparation for temporal lobectomy and were classified as having right (RHF; N = 13) or left (LHF; N = 13) hemisphere seizure focus based on electrophysiological measures. The two groups were matched on handedness and task performance. Whole-brain imaging was performed using multislice, gradient-echo echo-planar FMRI on a 1.5T GE Signa scanner. Subjects performed a binary decision task involving categorization of binaurally presented nouns according to two semantic criteria. This task was compared to a pure-tone pitch discrimination task used to control for auditory, motor and attentional factors [4]. Within-subject t-tests were conducted comparing images from each task. These t-values were redistributed into standard stereotaxic space, smoothed with a Gaussian filter of rms radius = 4 mm, and combined using voxel-by-voxel averaging to produce group maps for both the RHF and LHF groups. Voxel values in the averaged group maps were thresholded at a significance level of p < 0.01.
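The group-map construction (per-subject t-maps smoothed with a Gaussian filter, averaged voxel-by-voxel, then thresholded) can be sketched as below. This is a toy illustration on a synthetic grid: the array sizes, smoothing sigma in voxels, and the value-based threshold are assumptions for the example, not the study's parameters (which thresholded at a significance level instead).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def group_map(t_maps, sigma_vox, threshold):
    """Smooth each subject's t-map with a Gaussian kernel, average the
    smoothed maps voxel-by-voxel, and zero out sub-threshold voxels.
    `t_maps` has shape (n_subjects, x, y, z), already in a common space."""
    smoothed = np.stack([gaussian_filter(t, sigma=sigma_vox) for t in t_maps])
    mean_map = smoothed.mean(axis=0)
    return np.where(np.abs(mean_map) > threshold, mean_map, 0.0)

# Toy data: 13 subjects (as in each group here) on a small synthetic grid,
# with a shared "activation" blob added to every subject's map.
rng = np.random.default_rng(1)
t_maps = rng.standard_normal((13, 16, 16, 8))
t_maps[:, 3:6, 3:6, 2:5] += 5.0
result = group_map(t_maps, sigma_vox=1.0, threshold=1.0)
```

Averaging after smoothing is what lets an effect shared across subjects survive while idiosyncratic noise cancels out.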

Results: Activation of the temporal lobe was lateralized to the left hemisphere in both groups (Figure). Although task performances were matched between groups, the RHF group showed much greater activation in both the HIP and PHG. In contrast, temporal activation in the LHF group was predominantly in the middle temporal gyrus.

Figure. Coronal views of areas activated more during the semantic categorization task for subjects with LEFT (upper) and RIGHT (lower) seizure focus. Active voxels at the p < 0.01 significance level are shown.

Conclusions: These results are consistent with the proposed role for the left medial temporal lobe in the encoding of verbal episodic memory. The RHF group showed activation maps similar to those of normal controls in previous studies [4]. Disease of the left medial temporal lobe produced by seizure activity resulted in decreased functional activation in the left HIP and PHG. Interestingly, this was not accompanied by activation in the right HIP or PHG, indicating that, if reorganization of memory systems occurs, this reorganization probably does not involve a shift of verbal memory functions to the right medial temporal lobe.

References:
1. Sass, K.J., et al. Neurology 1990, 40: 1694-1697.
2. Sass, K.J., et al. Neurology 1995, 45: 2154-2158.



The temporal dynamics of visual word processing: A PET study

Price CJ and Friston KJ

Wellcome Department of Cognitive Neurology, Institute of Neurology, London, UK

Introduction: Transient brain responses, evoked by single-word processing, are usually characterized using electrophysiological techniques. In this work we have used a parametric experimental design and nonlinear regression to look at these dynamics, on a time-scale of milliseconds, using PET. This novel approach revealed that hemodynamic responses to visually presented words varied in a nonlinear fashion with the duration of the stimulus.

Methods: Six subjects were scanned 10 times using PET. In each scan they viewed a succession of printed words at a rate of one per 1.5 seconds. The only variable was word duration, which ranged from 150 ms to 750 ms. The data were analyzed with SPM96 to reveal nonlinear relationships between rCBF and word duration.

Results: The effect of word duration in visual cortex (bilateral posterior fusiform gyri) suggested adaptation during sustained stimulation (see Figure 1a). This contrasted with the inverse of this effect in the anterior cingulate, right temporal and right frontal cortices, where activation fell with increasing duration (see Figure 1b). Left hemisphere word processing areas (medial extrastriate, the anterior temporal and inferior frontal cortices) responded differently again, with activation reaching a maximum at 300-500 ms durations (see Figure 1c).

Figure 1. rCBF plotted against word duration (0-800 ms): (a) bilateral fusiform gyri; (b) anterior cingulate; (c) prefrontal cortex.

Discussion
This study has revealed the temporal dynamics of visual word processing using nonlinear regression analysis of rCBF responses to changes in word duration. The results indicate that early evoked responses to visual stimuli increase monotonically with their duration. The observed departure from linearity (Figure 1a) suggests either (i) adaptation during sustained stimulation or (ii) preferential enhancement of responses to short stimuli. The possibility of attentional modulation is indicated by the fact that the anterior cingulate responds maximally at short durations. Once beyond primary and secondary cortices, even as early as medial extrastriate cortex, the response becomes more complicated, showing an inverted "U" behaviour with preferred durations of 300-500 ms. The inverted "U" behaviour suggests that after an initial period of activation, responses not only adapt but must reverse and become deactivations. In short, the presence of an inverted "U" implies a biphasic activity function.
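The inverted-"U" profile can be illustrated with a second-order polynomial fit of response against duration. The rCBF values below are invented purely for illustration, and the quadratic is a simple stand-in for the nonlinear regression actually used in the study.

```python
import numpy as np

# Hypothetical rCBF values at five word durations, invented only to show
# an inverted-U profile peaking in the 300-500 ms range reported above.
durations = np.array([150.0, 300.0, 450.0, 600.0, 750.0])
rcbf = np.array([1.0, 2.6, 3.1, 2.5, 0.9])

# A negative quadratic coefficient is the signature of an inverted "U"
# (a biphasic activity function); the vertex gives the preferred duration.
a, b, c = np.polyfit(durations, rcbf, 2)
peak = -b / (2.0 * a)
```

With these invented values the fitted coefficient `a` is negative and the vertex `peak` falls in the 400-500 ms range, i.e. inside the preferred-duration band described for the left-hemisphere word-processing areas.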



Brain plasticity induced by language training after stroke.

M. Musso, C. Weiller, S. Kiebel, M. Rijntjes, M. Jüptner, P. Bülau

1 Department of Neurology, Friedrich-Schiller University of Jena, Jena, Germany
2 Department of Neurology, University of Essen, Essen, Germany

Introduction
After injury, the brain shows some degree of spontaneous functional recovery. A meta-analysis of the efficacy of treatment of persons suffering from aphasia demonstrated the clear superiority in performance of patients receiving treatment by a speech-language pathologist versus those left untreated (1). Which brain area is involved in this specific recovery? A recent PET study stressed the importance of right hemisphere integrity for reorganisation of the brain after recovery from Wernicke's aphasia (2). Our hypothesis was that during the rehabilitation of language disturbances the improvement correlates with activation in the right hemisphere.

Subjects and Methods
Subjects of our study were 23 patients with Wernicke's aphasia (on standardised testing: AAT) caused by infarction destroying the left posterior perisylvian language zone. All were right-handed and native German speakers. At first we tested a 2-hour language comprehension training in 20 patients, consisting of sentence-level auditory comprehension and a visual reaction therapy. Improvement of language comprehension was measured by the Token Test before and after therapy. After this pilot study, we have so far tested three patients with PET. Only one condition, a language comprehension task, was studied during 12 repetitive measurements of regional cerebral blood flow. All subjects had to comprehend the semantic difference between the commands "taking" or "pointing" applied to a sample of everyday objects. Between the scans, the same language comprehension training as in the pilot study was carried out, and verbal comprehension was tested directly with a short version of the Token Test. Regional cerebral blood flow was measured after slow bolus injection of H₂¹⁵O. Analysis of the data was carried out using Statistical Parametric Mapping (SPM96) (3). After movement correction, the data were normalised into the standardised stereotactic space corresponding to the atlas of Talairach & Tournoux. The images were smoothed with a Gaussian filter of 10x10x6 mm. Statistically significant pixels were found using the general linear model and the theory of Gaussian random fields at p < 0.05.
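The rCBF-versus-performance analysis can be sketched as a per-voxel Pearson correlation between the scan-by-scan rCBF values and the Token Test score measured before each scan. This toy version on synthetic data uses an uncorrected per-voxel p-value, a deliberate simplification of the general-linear-model and Gaussian-random-field machinery of SPM96; the voxel count and signal strength are invented.

```python
import numpy as np
from scipy import stats

def performance_correlation(rcbf, scores, alpha=0.05):
    """Pearson correlation, at each voxel, between rCBF over the scans and
    the Token Test score measured before each scan. Returns a boolean mask
    of voxels with p < alpha (uncorrected -- a simplification of the
    Gaussian-random-field threshold used in the study)."""
    n_scans, n_voxels = rcbf.shape
    mask = np.zeros(n_voxels, dtype=bool)
    for v in range(n_voxels):
        _, p = stats.pearsonr(rcbf[:, v], scores)
        mask[v] = p < alpha
    return mask

# Synthetic example: 12 scans (as in the study), 50 voxels; one voxel's
# signal tracks the steadily improving Token Test scores.
rng = np.random.default_rng(2)
scores = np.linspace(20, 45, 12)
rcbf = rng.standard_normal((12, 50))
rcbf[:, 0] += 0.5 * scores
sig = performance_correlation(rcbf, scores)
```

Correlating with performance rather than with time is what distinguishes training-related activation from mere scan-order effects.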

Results
In the pilot study we observed in almost all patients (18/20) an increase in the number of correct answers on the Token Test after the specific training. In the PET investigation, correlating the increase of regional cerebral blood flow (rCBF) with time, we found activation of different areas: the left inferior frontal gyrus, cingulate gyrus and right superior temporal gyrus in the first patient; the left occipital gyrus, the cingulate gyrus and the left precentral gyrus in the second patient; the cingulate gyrus in the third patient. When we correlated the increase of rCBF with performance on the Token Test before each scan, we found activation only in the right hemisphere: in the superior temporal gyrus in the first and third patients, and in the inferior frontal gyrus in the second patient.

Conclusions
Even short-term intense training improves language comprehension, as assessed by the Token Test, in Wernicke's aphasia patients. Our PET investigation, supporting the hypothesis of a pre-existing extensive and bilateral language network, demonstrated that specific training leads to a shift from left-hemisphere to homologous right-hemisphere language zones, corresponding to a measurable improvement of lost language function.

References
1. Randall R. Robey. Brain and Language, 1994; 47: 582-608.
2. Weiller, C. et al. Annals of Neurology 1995; 37: 723-732.
3. Friston, K.J. et al. Human Brain Mapping, 1995; 2: 189-210.


