
The cognitive neuroscience of signed language

Jerker Rönnberg a,*, Birgitta Söderfeldt b, Jarl Risberg c

a Department of Behavioral Sciences, Linköping University, and The Swedish Institute for Disability Research, Linköping and Örebro Universities, Sweden

b Department of Neuroscience and Locomotion, University Hospital, Linköping University, Sweden

c Department of Psychiatry, University Hospital, Lund University, Sweden

Abstract

The present article is an assessment of the current state of knowledge in the field of cognitive neuroscience of signed language. Reviewed lesion data show that the left hemisphere is dominant for perception and production of signed language in aphasics, in a fashion similar to spoken language aphasia. Several neuropsychological dissociations support this claim: non-linguistic visuospatial functions can be dissociated from linguistic spatial functions, and general motor deficits can be dissociated from execution of signs. Reviewed imaging data corroborate the lesion data in that the importance of the left hemisphere is re-confirmed. The data also establish the role of the right hemisphere in signed language processing. Alternative hypotheses regarding which aspects of signed language processing are handled by the right hemisphere are currently being tested. The second section of the paper starts by addressing the role that early acquisition of signed and spoken language plays for the neurofunctional activation patterns in the brain. Compensatory cognitive and communicative enhancements have also been documented as a function of early sign language use, suggesting an interesting interaction between language and cognition. Recent behavioural data on sign processing in working memory – a cognitive system important for language perception and production – suggest, e.g., phonological loop effects analogous to those obtained for speech processing. Neuroimaging studies will have to address this potential commonality. © 2000 Elsevier Science B.V. All rights reserved.

Acta Psychologica 105 (2000) 237–254

www.elsevier.com/locate/actpsy

* Corresponding author. Department of Behavioral Sciences, Linköping University, 58183 Linköping, Sweden.

0001-6918/00/$ - see front matter © 2000 Elsevier Science B.V. All rights reserved.

PII: S0001-6918(00)00063-9

1. Introduction

This review article is an assessment of the cognitive neuroscience of signed language. It starts by briefly introducing signed language and the similarities it shares with spoken language. From a cognitive neuroscience point of view, signed language may prove to be a significant tool for furthering our understanding of brain function. Similarities in brain function are suggested by both the lesion and imaging data reviewed. Specifically, the lesion data suggest that visuospatial linguistic functions can be dissociated from non-linguistic visuospatial functions. The same type of dissociation holds true for motor functions. While the imaging data are compatible with left-hemisphere involvement for signed language, they also point to right-hemisphere involvement at some linguistic levels, and are generalizable across imaging techniques and sign languages. Several hypotheses compete for the explanation of the right-hemisphere effect.

The second section of the article addresses cognitive and communicative consequences of signed language. Early use of signed language can be demonstrated to contribute to improved peripheral attention, enhanced spatial cognition, better memory for faces, and also potentially better perspective-taking abilities in the child (i.e., better theories of mind), suggesting an interesting interaction between language and cognition. The similarities and differences between signed and spoken languages have just begun to be explored within the working memory paradigm. As working memory is assumed to be central to both perception and production of language, the behavioral phonological loop effects already observed for signed language have important theoretical implications. However, recent episodic memory research – comparing brain activation at retrieval of signed and spoken episodes – also suggests interesting functional differences. Further imaging research will therefore have to focus on the components of working memory for signed language. Several potentially very interesting convergences and contrasts lie ahead.

2. Signed language

With his groundbreaking studies, Stokoe (1960) claimed that signed language is a language like any other language. His and others' work resulted in a dictionary of American sign language (ASL) (Stokoe, Casterline, & Croneberg, 1965). For its expression, signed language depends on arbitrary hand shapes, palm orientations, locations, movements, spatial relations among signs, and sign order. It has all the linguistic levels represented and fulfils the criteria of a language (Lillo-Martin, 1997; Siple, 1997). It thus encompasses phonological, morphological, syntactic, semantic, and pragmatic levels. ASL is the most studied sign language, and is of the predicate classifier type, with a rich and complex morphological system. However, other sign languages have also recently been employed in cognitive neuroscience studies. More descriptive (e.g., grammatical) information regarding ASL can be found in, e.g., Lucas and Valli (1990, 1991), for British sign language (BSL) in Sutton-Spence and Woll (1998) and Woll (1990), and descriptive information about Swedish sign language (SSL) can be obtained in Bergman (1982, 1983, 1990) and Wallin (1996).

"Phonological" development in ASL – in terms of difficulty and anatomical complexity of sublexical handshapes – seems to parallel the acquisition of spoken phonemes in terms of production difficulty. Morphological development – from the use of context-insensitive, simple monomorphemic signs to the use of appropriate context-dependent execution (in time and space) of multimorphemic signs – parallels the acquisition order of spoken classifier languages (Siple, 1997). ASL syntax seems to follow the principles of universal grammar, but it also has a well-developed spatial grammar with various referential devices (Lillo-Martin, 1997). Thus, ASL acquisition unfolds in developmental sequences much in the same way as for spoken languages, and ASL has about the same proposition rate as spoken discourse (Studdert-Kennedy, 1983).

3. Signed language: a tool for understanding brain function

3.1. Initial lesion studies and main findings

Although signed language can be and has been studied in its own right, it has also proven to be scientifically useful in neuroscience contexts. With the seminal studies by the Bellugi group, signed language has become a powerful scientific tool for improving our understanding of the brain bases for language and cognition. In a set of lesion studies it has generally been shown that signed language processing – like spoken language perception and production – is dominated by left-hemisphere functions. Patients with left-hemisphere lesions typically are the ones suffering from sign language aphasia (e.g., Bellugi, Poizner, & Klima, 1983; Poizner, Klima, & Bellugi, 1990). Sign language aphasia also occurs in ways analogous to spoken language: fluent aphasia with sign recognition and sentence comprehension problems has been characteristic of patients with posterior left-hemisphere lesions. Also for signed language, anterior left-hemisphere brain damage has been associated with non-fluent production, but with no comprehension problems (Poizner et al., 1990; Poizner, Bellugi, & Klima, 1987). In a later case study, it was shown that the left frontal operculum was involved in sign language aphasia, as an ischemic infarct in this area first caused acute expressive aphasia, which – upon the patient's recovery from the infarct – was reduced, implicating an active role of classical auditory "speech-motor" areas (Hickok, Klima, & Bellugi, 1996). Thus, classical spoken language areas constitute necessary cortical sites for signed language perception and production. Broadly speaking, this would also be expected from the cross-language universals briefly mentioned above.

As signed language has the properties of a real language, the implications of sign language aphasia research are far-reaching, in that they propose that the left hemisphere is not specialized only for spoken language, but rather for language irrespective of perceptual and expressive modalities. At a general level this means that the explanation for language dominance of the left hemisphere is not necessarily to be sought in variables tied to the auditory presentation mode, such as temporal processing demands (e.g., Tallal, Merzenich, Miller, & Jenkins, 1998). A previous investigation attempting to study precisely this possibility found that congenitally deaf persons do not compensate for their hearing loss, nor do they underperform in simple psychophysical tasks that assess temporal processing skills (Poizner & Tallal, 1987). If temporal processing skills had been crucially tied to processing of speech, then deaf subjects should not perform on a par with controls. In addition, the majority of evidence pertinent to the sensory level of information processing is in line with this result, suggesting a lack of compensation. As a recent review indicates, this seems to apply equally well to both the blind and the deaf (Rönnberg, 1995). In other words, simple sensory processing skills are on the whole invariant across groups of individuals using different languages and language modalities, and exposure to speech is not necessary for the skills to occur.

Thus, as evidenced by sign language aphasia studies, classical auditory language areas are also recruited by signed language (see Neville & Bavelier, 1998). The basis for this similarity is under current debate, but the similarity seems to be independent of temporal processing skills and rather more dependent on the grammatical processing necessary for both spoken and signed languages (Poizner et al., 1990).

3.2. Linguistic and non-linguistic visuospatial dissociations

Several case studies have addressed the possibility of dissociation between sign language performance and general visual and spatial abilities. For example, Poizner et al. (1987) described one patient with a right-hemisphere lesion whose premorbid artistic non-language skills were radically impaired, whereas visuospatial linguistic (ASL) processing was unimpaired. Such a case seems to suggest that even though visuospatial functions necessary for spatial components in signed language are unimpaired, these functions in other non-linguistic domains may be selectively impaired.

A deaf patient with left visual field neglect sheds further light on this issue. While processing of visuospatial non-linguistic objects is severely impaired – as measured, e.g., by the traditional Benton Line Orientation test and the Block Design test – processing of ASL is unimpaired, as measured by the Salk Institute sign language analogue of the Boston Diagnostic Aphasia Examination (Corina, Kritchevsky, & Bellugi, 1996). Particularly striking is the performance in a task designed to assess memory for complex figures (Rey-Osterrieth). The patient's reproduction of the left half of the figure is remarkably compromised, suggesting a visual-spatial neglect. Despite this neglect, lexical sign identification is unimpaired, whereas object identification is much lower in visual bilateral stimulation conditions. In a study by Hickok, Say, Bellugi, and Klima (1996) of two deaf signers, one with left-hemisphere damage and the other with right-hemisphere damage, a similar dissociation was found for the use of non-linguistic spatial information vs. linguistic signed information. In a larger sample of left- and right-hemisphere damaged patients, Hickok et al. (1996) were able to show a generalized and distinct pattern suggesting a double dissociation between sign language ability and visuospatial ability.
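The inferential logic of such a double dissociation can be stated compactly. The sketch below is purely illustrative – the patient labels, scores, and impairment cutoff are hypothetical, not data from the studies cited:

```python
# Double dissociation: one patient is impaired on ability X but not Y,
# another is impaired on Y but not X. This pattern suggests that the two
# abilities rely on (at least partly) separable neural substrates.

CUTOFF = 70  # hypothetical impairment threshold on a normed 0-100 score


def impaired(score: int) -> bool:
    """A score below the cutoff counts as impaired performance."""
    return score < CUTOFF


# Hypothetical normed scores echoing the pattern described in the text:
left_lesion = {"sign_language": 45, "visuospatial": 88}   # sign aphasia
right_lesion = {"sign_language": 90, "visuospatial": 40}  # spatial deficit

double_dissociation = (
    impaired(left_lesion["sign_language"])
    and not impaired(left_lesion["visuospatial"])
    and impaired(right_lesion["visuospatial"])
    and not impaired(right_lesion["sign_language"])
)
print(double_dissociation)  # → True
```

A single dissociation (one patient, one spared and one impaired ability) could still reflect differing task difficulty; the crossed pattern above is what licenses the stronger separability claim.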


Furthermore, Hickok et al. (1999) found that patients may be impaired visuospatially, with consequences at the extra-grammatical level in ASL (i.e., for spatial consistency in framework and topic coherence), while visuospatial processing remains intact at the other ASL levels of language processing. The finding is clear in that language processing at the grammatical level is not disrupted. Add to this the finding that emotional facial processing is dissociable from grammatical functions controlled by the face in ASL (Corina, 1989; cf. Campbell, Woll, Benson, & Wallace, 1999).

Thus, qualitatively different representations are abstracted from a visuospatial signal and can be selectively impaired. What we have seen on the basis of patient data is that there are several general visuospatial functions which can be dissociated from visuospatial sign language functions: performance in standard neuropsychological tasks, emotional facial expressions, and extra-grammatical spatial functions.

3.3. Linguistic and non-linguistic motor dissociations

Analogous to the issue of visuospatial dissociations, several studies have also addressed the dissociation of linguistic motor functions from general motor functions. One such strand of research concerns the linguistic behaviour of signers (ASL) with Parkinson's disease (PD). PD patients show qualitatively different linguistic expressions compared to deaf aphasic signers. For example, in Poizner and Kegl (1992) the observation is that the PD signer typically shows a dampening of facial expression and a reduction of the amplitude of movement in general – much in the same way as syllable boundaries become indistinct in the slurred speech of the PD speaker. Also, PD signers keep a constant pausing in their stream of signs, producing some negative grammatical effects (Kegl, Cohen, & Poizner, 1999); although perhaps delivered in a "whispering" fashion, sign production is generally comprehensible in the PD signer. In contrast, deaf aphasic signers with left-posterior damage show problems with selecting appropriate distinctive features (Brentari, Poizner, & Kegl, 1995). It is commonly argued that one prominent cognitive consequence of PD is a relative loss of executive control (see Rönnberg, 1999, for a comparison across motor disorders). From that perspective it may be understood that the strategy of reducing temporal pauses and transitions between phonemes is in part explained by loss of executive control over temporal sequencing (Kegl et al., 1999).

Several cases of left-hemisphere lesions show sign language aphasia while the patients perform normally in pantomime recognition and apraxia tests (Poizner & Kegl, 1992; Kegl & Poizner, 1997). For example, matching a sign to a picture is much harder than matching a pantomimed gesture to a picture (Corina et al., 1992). In a larger sample, Hickok, Kritchevsky, Bellugi, and Klima (1996) found evidence for a dissociation (i.e., a relative lack of correlation) between performance on Kimura's (1993) apraxia test and parameters of sign language aphasia.

The specialization and involvement of the left hemisphere in sign processing have also been investigated from the perspective of hand preference. Bonvillian, Richards, and Dooley (1997) found that signing infants show a right-hand preference for sign vs. non-sign actions. This preference shows up early in development, suggesting that a biologically determined laterality for signed language, rather than, e.g., parental training, is responsible for the effects demonstrated (Bonvillian et al., 1997).

The general conclusion that can be drawn is that in many sign aphasic patients the motor processes specifically tied to producing sign language are dissociated from general motor performance as indicated by apraxia tests, and from the specific disorders of sign expression exhibited by PD patients. The biological roots of the lateralization of right-handedness for sign versus non-sign motor actions further attest to the dominance and interaction between language and motor control.

3.4. Imaging the emergence of right-hemisphere involvement

Imaging studies further confirm the lesion data, which suggest that classical left-hemisphere areas are engaged in the processing of signed language (e.g., Bavelier, Corina, & Neville, 1998; Neville et al., 1997, 1998; Nishimura et al., 1999; Söderfeldt, Rönnberg, & Risberg, 1994a,b; Söderfeldt et al., 1997). It even suffices to imagine the production of signs to activate Broca's area (McGuire et al., 1997, for BSL; Andersson, Carlsson, Lundberg, & Söderfeldt, 1998, for SSL). What seems to be new information with respect to laterality, however, is the fact that the right hemisphere is also involved.

Neville et al. (1997) observed significant ERPs in both hemispheres for posterior temporal and for occipital regions when deaf native signers processed ASL signs in the middle of sentences. Pronounced processing in the occipital regions for the deaf but not for the hearing native signers extends previous results of auditory deprivation to the language domain. Right parietal cortex is also implicated for hearing native sign language users (Neville et al., 1997). In a further functional magnetic resonance imaging (fMRI) study by Neville et al. (1998) it was again documented that, in addition to the classical left-hemisphere areas activated by sign language, homologous right-hemisphere areas were significantly activated in both deaf native signers and hearing native signers. In the deaf native signers there were no lateralization effects at all, and for the hearing native signers left-hemisphere dominance was found for Broca's area, but equality in activation was upheld for the PT-Wernicke regions (Bavelier et al., 1998).
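One common way to quantify such lateralization in imaging work is a laterality index computed over homologous left/right regions of interest. The formula below is a standard index from the imaging literature, not a measure reported in these particular studies, and the activation values are hypothetical:

```python
def laterality_index(left: float, right: float) -> float:
    """LI = (L - R) / (L + R): values near 0 indicate bilateral
    activation; positive values indicate left-hemisphere dominance."""
    return (left - right) / (left + right)


# Hypothetical activation values echoing the pattern described:
# deaf native signers bilateral (LI near 0), hearing native signers
# left-dominant in Broca's area (clearly positive LI).
print(round(laterality_index(1.00, 0.95), 2))  # → 0.03 (bilateral)
print(round(laterality_index(1.40, 0.60), 2))  # → 0.4 (left-dominant)
```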

Independent regional cerebral blood flow (rCBF) data collected by Söderfeldt, Rönnberg, and Risberg (1995) from SSL users also demonstrate a shift in the cortical processing of presented stories towards right parieto-occipital regions in the deaf native signer, compared to the hearing native signer (Söderfeldt et al., 1995). Preliminary data also suggest that signing skill involves a successively higher degree of right-hemisphere involvement (Söderfeldt et al., 1995). In a recent positron emission tomography (PET) study by Nishimura et al. (1999) of a deaf native signer, it was found that the supratemporal gyri were activated bilaterally. Note that only single signs were used – not sentences or discourse – which may be part of the explanation why no Wernicke areas were active, as in the Neville studies. Thus, the above three sets of data by Neville, Nishimura, and Söderfeldt et al. all suggest that the right hemisphere is also involved during signed language processing.


In the first, pioneering study by Söderfeldt et al. (1994a) of hearing native signers, they even found rCBF evidence to support a stronger bilateral SSL activation in posterior temporal regions than was the case for pure listening to stories (cf. Nishimura et al., 1999). In that study they were unable to demonstrate differences in activation between sign language perception (seeing the signer) and audiovisual spoken perception (seeing and hearing the speaker). This can be taken as evidence that audiovisual perception and signed language have a visual component in common, and that, as long as this component is present, the same cortical structures are activated. Hickok, Bellugi, and Klima (1998), commenting on the Söderfeldt et al. (1994a) study, tended to interpret this as support for their basic hypothesis of involvement of essentially the same brain structures across languages. Nevertheless, they also thought that it would be important to conduct this comparison of natural conditions, with the speaker and signer in full view, with a more advanced brain imaging methodology.

To get a deeper understanding of what the right-hemisphere involvement might mean, it is important to discuss the nature of the control conditions used. In the Neville et al. (1998) study, the control for ASL sentence perception was meaningless arbitrary gestures, compared to silent reading of written English sentences (with consonants as control). In a study by Decety et al. (1997) it was shown that hearing non-signers who perceive meaningful non-linguistic gestures do not activate right parietal regions, which again means that the right hemisphere does not play a role for gestures in general. It rather supports the contention that, to the extent that there are robust right-hemisphere activations, there may be genuine linguistic right-hemisphere components for signed language.

In a PET study by Söderfeldt et al. (1997) of hearing native signers – addressing some of the shortcomings implied by Hickok et al. (1998) – both the experimental and control conditions entailed seeing (and/or hearing) a signer (or speaker) presenting short stories (cf. Nishimura et al., 1999). In brief, SSL activation (condition 1) was compared to (2) audiovisual spoken presentation, to (3) spoken language with the mouth covered, and to (4) heard spoken language with a still image of the same speaker as in the other conditions. Here, activation specific to sign language was primarily found bilaterally in visual association areas (i.e., in the inferoposterior temporal lobe (Brodmann 37) and in the middle occipital gyrus (Brodmann 19)) in hearing native signers, when condition 1 was compared to conditions 2, 3, and 4. As expected, the perisylvian areas were bilaterally activated for audiovisual spoken language (condition 2 – condition 1). The general point here is that signed language perception differs from audiovisual spoken language perception, significantly so in both the left and the right hemispheres, and that the specific differences are mainly located in the visual association areas. Compared to the Neville et al. (1998) study, Söderfeldt et al. (1997) did not capture the whole spectrum of linguistic activation, since the contrasts focus on relative differences among linguistic conditions. Thus, the claim that signed language activates exactly the same structures as spoken language (audiovisual or visual) is simply wrong. Whether the differences obtained are language specific or modality specific can still be debated. But two important things were under control here: in addition to keeping a visual linguistic component present in all linguistic control conditions, we used the auditory speech signal in all conditions but the sign language condition. Neither an auditory deprivation argument nor the argument of unnatural conditions can be used as an explanation for the results of this study.

In addition, when conditions 2 and 4 were compared, we found an effect specific to lipreading only in left Brodmann 37 (cf. Calvert et al., 1997). So, if we compare the lipreading-specific effect (speech conveyed by a moving face) with the sign-specific effects, we still have a remaining signed-language-specific activation (i.e., comparing conditions 1 and 2) which clearly extends beyond a left Brodmann 37 activation, viz., a bilateral activation of Brodmann 37 and 19.
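The subtraction logic behind these condition comparisons can be sketched as follows. This is an illustrative reconstruction of the contrast arithmetic only – not the authors' actual analysis pipeline – and the per-condition activation maps are simulated data:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n_voxels = 1_000  # hypothetical flattened volume of normalized rCBF values

# Mean activation map per condition (simulated):
# 1 = SSL stories, 2 = audiovisual speech, 3 = speech with mouth covered,
# 4 = heard speech with a still image of the speaker.
activation = {c: rng.normal(size=n_voxels) for c in (1, 2, 3, 4)}


def contrast(a: int, b: int) -> np.ndarray:
    """Subtraction contrast: activation specific to condition a over b."""
    return activation[a] - activation[b]


# Sign-language-specific effects: condition 1 against each control.
sign_specific = {b: contrast(1, b) for b in (2, 3, 4)}

# Audiovisual-speech-specific effect: condition 2 - condition 1.
speech_specific = contrast(2, 1)

# The two directions of a contrast are mirror images of each other,
# which is why "condition 2 - condition 1" isolates the reverse effect.
assert np.allclose(sign_specific[2], -speech_specific)
```

In a real analysis the contrast maps would then be thresholded voxelwise for statistical significance; the point here is only the directional subtraction logic of conditions 1 through 4.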

To us it seems that signed language therefore engages right-hemisphere structures that truly have to do with levels of language beyond the auditory or visuospatial modalities. Preliminary proposals are that discourse (Hickok et al., 1999) and prosodic functions (Bavelier et al., 1998), as well as direct spatial encoding of objects in signed language, may play a role (Hickok et al., 1998). More analytical studies have to be performed before we can conclude exactly what kinds of functions are served by the right hemisphere in signed language processing. Irrespective of whether it can be concluded that some level of signed language activates the right hemisphere in a way that is not present for similar spoken conditions, we have expanded our knowledge significantly about the flexibility/invariance of brain organization for language by means of a new potent tool: signed language (Peperkamp & Mehler, 1999).

In sum: right-hemisphere effects can be documented at the lexical (Nishimura et al., 1999), sentence (Neville et al., 1997, 1998), and discourse comprehension levels of signed language (Söderfeldt et al., 1994a,b, 1997). Right-hemisphere involvement is generalizable across imaging techniques and sign languages. Certain homologous structures seem to be common denominators. Even when comparisons are made across different natural linguistic conditions (i.e., keeping the speaker/signer in view in all cases) in hearing native signers, there seems to be some sign-language-specific right-hemisphere involvement. As argued above, this is not due to modality of presentation or imaging technique. It should be added that right-hemisphere involvement in signed language is by no means a unique linguistic phenomenon. Several studies have reported right-hemisphere activation during speech production, especially for automatic speech like repetition of the months of the year (Larsen, Skinhøj, & Lassen, 1978; Ryding, Bradvik, & Ingvar, 1987, 1996).

4. Brain plasticity: cognitive and communicative consequences of signed language

4.1. Early acquisition

One area of research that speaks to the general issue of malleability of the cortical basis of signed language addresses the consequences of early acquisition. It is not quite clear how deafness and early, native sign language acquisition interact. In addition to the acquisition aspect of signed language, there is a potentially very important area of research which assesses cognitive changes as a function of sign language use.

In the previously cited fMRI study by Neville et al. (1998), deaf native signers, hearing native signers, and hearing non-signing subjects participated. It was shown that, as long as early acquisition is accomplished – either in ASL or in English – left-hemisphere cortical areas are activated. This means that, independent of linguistic form, early acquisition seems to be crucial in mediating the development of language processing in classical brain areas. This represents a further feature of the specific processing requirements that characterize the brain organization for signed language and for language in general. Once the brain organization is established for signed language (i.e., in Broca's and Wernicke's areas, the dorsolateral prefrontal cortex, the precentral sulcus, and the superior temporal sulcus), another type of activation is found when sign language users read English: they read English primarily by activating right middle and posterior temporal-parietal structures (Neville et al., 1998). However, the findings of right-hemisphere processing specific to deaf native sign language users may prove to be an interesting exception (e.g., Neville et al., 1998; Söderfeldt et al., 1997).

As the hearing native signers in the Neville et al. (1998) study show high variability in their cortical responses, it might be suspected that the timing of learning a second language is important in the fine-tuning of the cortical organization of brain systems mediating signed language. This type of study, where age of acquisition of sign language is explicitly varied, has not yet been carried out. Awaiting such a study, other independent data by Weber-Fox and Neville (1996) are very suggestive as to the possibility of critical periods in language acquisition in Chinese/English bilinguals. They found that ERPs associated with syntactic judgements were affected by very short delays (i.e., already after 1–3 yrs) in exposure to English, whereas ERP effects following judgement of semantic anomalies were only detected for those bilinguals with a later exposure to English (i.e., after 11 yrs of age). In addition, qualitative differences between semantic and syntactic ERPs emerged, suggesting that maturational differences exist in the brain subsystems responsible for language development.

Thus, brain organization for language (and signed language) seems to interact with sensitive periods of acquisition of the language. In particular, early acquisition promotes the establishment of left-hemisphere activation of brain structures. Deaf native sign language users show activation patterns in the left hemisphere similar to those of their hearing non-signing counterparts with respect to written English. Further studies will hopefully reveal the developmental unfolding and plasticity of brain subsystems mediating acquisition of signed language and potential cross-modal plasticity (cf. Cohen et al., 1999).

4.2. Cognitive consequences

Recent work on the cognitive consequences of early sign language use has started to paint an exciting picture: enhanced visual attention to peripheral events is systematically observed in deaf native signers (Neville & Lawson, 1987a,b,c), as are improved visuospatial cognition (Emmorey, Kosslyn, & Bellugi, 1993; Parasnis, Samar, Bettger, & Sathe, 1996) and better mental rotation of non-linguistic objects (Emmorey, Klima, & Hickok, 1998). The expertise of deaf ASL signers in extracting and remembering facial features (McCullough & Emmorey, 1997) – in combination with the above improvements – suggests that early use of sign language – typically combined with congenital deafness – may impact and cortically reorganize some brain functions (Neville, 1990; see also Wolff & Thatcher, 1990). It seems to be the case that cortical, intermodal plasticity is to be expected in the superior temporal sulcus, where different sensory inputs converge. In a recent fMRI and magnetoencephalographic (MEG) study it was found that compensatory cortical plasticity occurs primarily in the non-primary cortices under attentionally demanding conditions (Hickok et al., 1997), supporting the reasoning of Neville (1990).

There are also further, perhaps surprising, cognitive aspects of sign language acquisition and communicative development which build on specific spatial properties of sign language (French sign language, FSL; Courtin & Melot, 1998; Courtin, 2000). As the spatial aspects of syntax in signed language typically use the referential perspective of the signer, they demand of the receiver an ability to perform mental rotations. As a matter of fact, mental rotations using ASL may be selectively spared in right-hemisphere damaged signers, while non-linguistic mental rotation performance is severely compromised (Emmorey et al., 1998). Signing also demands ``spatial mapping'' (Courtin, 2000) of objects in a scene, and the perspective of the current scene being signed can be changed back and forth depending on communicative conflicts and demands. Multiple perspective taking in the deaf sign language user has recently been shown by Courtin and colleagues (Courtin, 2000) to have significant effects on what is fashionably called ``theories of mind'' (e.g., Peterson & Siegal, 1995). Courtin has been able to show that deaf native signers outperform non-native signers and deaf non-signers on so-called false belief attribution tasks. However, this issue is not settled, since other studies suggest that deaf native signers do not perform better than normal hearing children and deaf non-signers (oral deaf children) (e.g., Peterson & Siegal, 1999). What seems to be generalizable, though, is the fact that deaf non-native signers suffer in false belief tasks relative to deaf native signers (Russell et al., 1998; Scott et al., 1999).

False belief tasks measure a child's ability to take another person's perspective when deciding on what is a true state of the world. A certain belief may be true given the limited information one person has about the world (e.g., where an object is placed), but clearly false from the perspective of a second person, because the second person has moved the object to another place while the first person was out of the room. Again, this type of conceptual perspective taking is apparently facilitated by the use of signed language, and is supposedly a cornerstone in children's ``theories'' of others' minds.

There may even be a neurocognitive connection to the fact that the perspective taking inherent in false belief tasks promotes communicative and social development. Such studies have not yet been performed on sign language users. But it seems obvious that everyday communication builds on theories of others' minds and intentions in the role taking of a dialogue. The existence of such a capacity is taken for granted in most people, but it is lacking in various ways in children with, e.g., autism spectrum disorders. The hypothesis that there may be a social module that is somehow deficient or lacking in the autistic child has been evaluated in story comprehension tasks involving mentalizing states. It was found that the middle frontal gyrus on the left side (i.e., Brodmann area 8) was specifically associated with the mentalizing required by a theory of mind task (Fletcher et al., 1995). This finding converges with the current state of knowledge regarding the relative sparing of frontal structures in hypersocial individuals with Williams syndrome (Bellugi, Adolphs, Cassady, & Chiles, 1999; Karmiloff-Smith, Klima, Bellugi, Grant, & Baron-Cohen, 1995). It therefore seems to be a distinct possibility that there is a separate and dissociable brain function, with specific neural correlates, that may also be particularly prone to stimulation by early use of signed language.

Thus, there exist several visual-spatial, memory and attentional cognitive consequences of having sign language as a first language in the deaf native signer, representing compelling examples of cognitive compensation. There are potentially also communicative and social consequences of using sign language, rooted in the cognitive capacity of having ``theories'' of others' minds.

4.3. Working memory

One cognitive function, studied in many contexts and central to both perception and production of language, is working memory (Baddeley, 1990). In a discussion of the functions played by working memory in bridging the gap between signal analysis and dialogue, Rönnberg et al. (1998) proposed a general working memory system for poorly specified linguistic input, applicable to various speech-based forms of communication in the profoundly hearing-impaired or deaf. Some of the bottlenecks in working memory, while reading poorly conveyed visual, tactile or auditory information about speech, are general information-processing speed and decoding ability (Rönnberg, 1990). The existence of phonological representations is necessary for efficient decoding and lexical access, and is important for success or failure as a deaf speech-reader (Rönnberg et al., 1999). In Rönnberg, Söderfeldt, and Risberg (1998) it was also ventured that this working memory system might include sign language as well, due both to behavioural analogies and to some neurophysiological similarities between the two languages.

More direct behavioural evidence pointing to the possibility of an analogous ``phonological loop'' effect in signed language has been elegantly provided by Wilson and Emmorey (1997). They designed a set of short-term serial recall experiments modelled on the classical manipulations of the phonological loop: the effect of phonological similarity was accomplished by sign similarity, hand movements played the role of articulatory suppression, and sign length simulated the word length effect. All the classical effects were replicated.

Similar behavioural findings have been reported by Marschark and Mayer (1998). They found that deaf subjects' recall is affected by manual articulatory suppression and sign length, and, interestingly, they were able to demonstrate that deaf subjects' sign loop has the same capacity as that of hearing subjects when adjusted for articulation time. These data have two types of consequences:

1. they undoubtedly give rise to a critical perspective on the traditional position that working memory of the deaf is inferior to that of hearing subjects (e.g., Hanson & Lichtenstein, 1990);

2. they offer new input to theorizing about working memory and its component functions.

In our view (Rönnberg et al., 1998), the above data imply that the brain has the capacity for a variety of ``phonological'' codes, coding schemes that capitalize on a rapid combinatorial capacity for sublexical units. This apparent flexibility in coding strategy also implies either that working memory can be conceptualized as having a variety of loops for linguistic or other materials, or that there is at least some more amodal, basic sublexical processor common to all linguistic ``loop effects'' (cf. Leybaert, 2000). Only recently have imaging data been presented that can help structure the components of working memory.

To the extent that silent articulation of sentences engages the phonological loop responsible for inner speech, a recent imaging study found that ``inner signing'' of sentences also engages functional networks in the brain similar to those associated with the activation of the phonological loop, e.g., left inferior frontal cortex, rather than visuospatial areas (McGuire et al., 1997). Another imaging study used sentence generation as one of the focal task demands on working memory resources, and found that the left middle/inferior frontal gyrus (Brodmann area 46) was the main area responsible for executing the task (Muller et al., 1997; cf. Stowe et al., 1998). Still other studies have dissociated several putative components of working memory by means of PET, demonstrating different main foci of activation for spatial and verbal information, and for passive storage vs. active maintenance, including different tasks assumed to measure the phonological loop and the central executive (see Smith & Jonides, 1997, for a review). Imaging research on working memory for sign is still in the making. Several convergences and contrasts lie ahead.

Nevertheless, we (Rönnberg et al., 1998) have carried out a first general study on the interaction between memory and language systems in both deaf native signers and hearing non-signing individuals. This study used unrelated lexical items (signed, or spoken with auditory presentation only) as materials, and demonstrated that right-hemisphere involvement was obtained for signed episodic materials, whereas in the signed semantic memory task left-hemisphere activation was obtained. For the hearing non-signing subjects, both episodic and semantic memory activation engaged the left hemisphere. It seems that early acquisition of sign language contributes to a selective transfer of the typical activation patterns of different memory systems to the right hemisphere. The episodic memory activations include temporo-parietal, right-hemisphere cortical regions, a finding that makes good sense in that signed episodes appropriately, and in a rich and multidimensional way (i.e., time, place, and space event coordinates), operationalize the concept of episodic memory (Tulving, 1983). However, before firmer conclusions can be reached on this topic, encoding-retrieval designs need to be used and evaluated, as they have been for other stimulus materials in the episodic memory domain (e.g., Cabeza & Nyberg, 2000; Nyberg, Cabeza, & Tulving, 1996; Shallice et al., 1994).

In sum, there are reasons to believe that working memory for sign and speech share behavioural and functional similarities. Neuroimaging of various components of working memory for spatial and verbal materials has rapidly generated solid knowledge, but such analytical studies of signed language are sadly lacking. However, results from episodic-semantic memory paradigms suggest that episodic memory for signed episodes activates right temporo-parietal cortical areas, still another aspect of the emerging role of the right hemisphere in signed language processing.

5. General conclusions

By now it seems clear that many ground-breaking studies have been carried out using signed language as a tool for furthering our understanding of the brain bases of cognition and language. There are many case studies, but also group studies, resting on data from patients with brain lesions. These studies are very consistent in demonstrating that the same anatomical structures in the left hemisphere are responsible for spoken as well as signed language aphasia. In accord with the general credo of a left-hemisphere bias for language, and not just for spoken language, these studies have systematically dissociated non-linguistic motor functions from linguistic motor functions (e.g., Parkinsonian vs. aphasic signers), and they have also systematically dissociated visuospatial non-linguistic functions (block design, spatial rotation of objects) from linguistically important functions such as the spatial syntax and morphology inherent in signed language. With the emergence of imaging studies of signed language in neurologically intact participants, two patterns emerge:

(a) classical left-hemisphere structures, important for spoken language perception and production, remain important for sign language perception and production;
(b) recent imaging studies suggest that the right hemisphere plays a role in signed language.

Presently, several competing hypotheses exist, none of which has been thoroughly examined in the imaging context:

(a) extra-grammatical, prosodic, and topic coherence functions have been suggested;
(b) signed language triggers visuospatial modality-specific functions;
(c) signed language activates sign-specific, but non-modality-specific, functions.

Brain plasticity can be seen in the sense that the period of acquisition of signed and spoken language determines the degree of lateralization of activation. Apart from the general issue of the period of acquisition of a language, several positive cognitive side-effects have recently been observed as a function of the early use of signed language: e.g., enhanced spatial cognition, better memory for faces, and more efficient peripheral attentional mechanisms. The cognitive consequences may even spread to communicative and social functions, in that intrinsic perspective-taking abilities, theories of mind, supposedly are facilitated by the use of signed language. There may even be separate, dissociable brain structures responsible for this putative cognitive-social perspective-taking component. One cognitive function which may prove to be an important future research topic in the cognitive neuroscience of signed language is working memory. Working memory is important for both perception and production of language and may even be related to the observed compensatory cognitive enhancements. In fact, recent studies have replicated behavioural ``phonological loop'' effects in working memory data with signed materials. Much work remains if the goal is to match the relatively well-developed knowledge base that exists with respect to imaging of working memory components for the spoken language modality. Episodic memory for signed words has also been documented by means of neurofunctional imaging. Again, more analytical research is called for before firm conclusions can be reached.

In conclusion, sign language based neurocognitive research has rapidly expanded its domain in the last few years and is becoming a major area of cognitive neuroscience research. The output from the area has already documented interesting and far-reaching results with respect to brain organization and brain plasticity of cognitive and language processes.

Acknowledgements

The research was supported by a programme grant from the Swedish Council for Social Research to Dr. Rönnberg, and by grants to Dr. Risberg from the Medical Research Council and The Council for Research in the Humanities and Social Sciences. The constructive and positive comments by two anonymous reviewers are gratefully acknowledged.

References

Andersson, A., Carlsson, J., Lundberg, P., & Söderfeldt, B. (1998). Sign language studied by functional MRI. Abstract, fourth international conference on functional mapping of the human brain.
Baddeley, A. D. (1990). Human memory: theory and practice. Hove: Lawrence Erlbaum.
Bavelier, D., Corina, D., Jezzard, P., Clark, V., Karni, A., Lalwani, A., Rauschecker, J. R., Braun, A., Turner, R., & Neville, H. J. (1998). Hemispheric specialization for English and ASL: left invariance-right variability. Neuroreport, 9, 1537–1542.
Bavelier, D., Corina, D., & Neville, H. J. (1998). Brain and language: a perspective from sign language. Neuron, 21, 275–278.
Bellugi, U., Adolphs, R., Cassady, C., & Chiles, M. (1999). Towards the neural basis for hypersociability in a genetic syndrome. Neuroreport, 10, 1653–1657.
Bellugi, U., Poizner, H., & Klima, E. S. (1983). Brain organization for language: clues from sign aphasia. Human Neurobiology, 2, 155–170.
Bergman, B. (1982). Studies in Swedish sign language. Doctoral dissertation, Institute of Linguistics, University of Stockholm.
Bergman, B. (1983). Verbs and adjectives: morphological processes in Swedish sign language. In J. Kyle, & B. Woll (Eds.), Language in sign: an international perspective on sign language. London: Croom Helm.
Bergman, B. (1990). Grammaticalisation of location. In W. H. Edmondson, & F. Karlsson (Eds.), SLR'87. Papers from the fourth international symposium on sign language research, Lappeenranta, Finland. Hamburg: SIGNUM-Press.
Bonvillian, J. D., Richards, H. C., & Dooley, T. T. (1997). Early sign language acquisition and the development of hand preference in young children. Brain and Language, 58, 1–22.
Brentari, D., Poizner, H., & Kegl, J. (1995). Aphasic and Parkinsonian signing: differences in phonological disruption. Brain and Language, 48, 69–105.
Cabeza, R., & Nyberg, L. (2000). Imaging cognition II: empirical review of 275 PET and fMRI studies. Journal of Cognitive Neuroscience, 12(1), 1–47.
Calvert, G., Bullmore, E., Brammer, M., Campbell, R., Woodruff, P., McGuire, P., Williams, S., Iversen, S. D., & David, A. S. (1997). Activation of auditory cortex during silent speechreading. Science, 276, 593–596.
Campbell, R., Woll, B., Benson, P., & Wallace, S. B. (1999). Categorical perception of face actions: their role in sign language and in communicative facial displays. Quarterly Journal of Experimental Psychology. A, Human Experimental Psychology, 52A, 67–95.
Cohen, L. G., Weeks, R. A., Sadato, N., Celnik, P., Ishii, K., & Hallett, M. (1999). Period of susceptibility for cross-modal plasticity in the blind. Annals of Neurology, 45, 451–460.
Corina, D. (1989). Recognition of affective and noncanonical linguistic facial expressions in hearing and deaf subjects. Brain and Cognition, 9, 227–237.
Corina, D., Kritchevsky, M., & Bellugi, U. (1996). Visual language processing and unilateral neglect: evidence from American sign language. Cognitive Neuropsychology, 3, 321–356.
Corina, D., Poizner, H., Bellugi, U., Batch, L., Feinberg, D., & Dowd, D. (1992). Dissociation between linguistic and non-linguistic gestural systems: a case for compositionality. Brain and Language, 43, 414–447.
Courtin, C. (2000). The impact of sign language on the cognitive development of deaf children: the case of theories of mind. Journal of Deaf Studies and Deaf Education, 5, 266–276.
Courtin, C., & Melot, A.-M. (1998). The development of theories of mind in deaf children. In M. Marschark, & M. D. Clark (Eds.), Psychological perspectives on deafness (Vol. 2, pp. 79–102). Mahwah, NJ: Lawrence Erlbaum.
Decety, J., Grezes, J., Costes, N., Perani, D., Jeannerod, M., Procyk, E., Grassi, F., & Fazio, F. (1997). Brain activity during observation of actions: influence of action content and subject's strategy. Brain, 120, 1763–1777.
Emmorey, K., Klima, E., & Hickok, G. (1998). Mental rotation within linguistic and non-linguistic domains in users of American sign language. Cognition, 68, 221–246.
Emmorey, K., Kosslyn, S. M., & Bellugi, U. (1993). Visual imagery and visual-spatial language: enhanced imagery abilities in deaf and hearing ASL signers. Cognition, 46, 139–181.
Fletcher, P. C., Happe, F., Frith, U., Baker, S. C., Dolan, R. J., Frackowiak, R. S. J., & Frith, C. D. (1995). Other minds in the brain: a functional imaging study of theory of mind in story comprehension. Cognition, 57, 109–128.
Hanson, V. L., & Lichtenstein, E. H. (1990). Short-term memory coding by deaf signers: the primary language coding hypothesis reconsidered. Cognitive Psychology, 22, 211–224.
Hickok, G., Bellugi, U., & Klima, E. S. (1998). What's right about the neural organization of sign language? A perspective on recent neuroimaging results. Trends in Cognitive Sciences, 2, 465–467.
Hickok, G., Klima, E. S., & Bellugi, U. (1996). The neurobiology of sign language and its implications for the neural basis of language. Nature, 381, 699–702.
Hickok, G., Kritchevsky, M., Bellugi, U., & Klima, E. S. (1996). The role of the frontal operculum in sign language aphasia. Neurocase, 5, 373–380.
Hickok, G., Poeppel, D., Clark, K., Buxton, R. B., Rowley, H. A., & Roberts, T. P. L. (1997). Sensory mapping in a congenitally deaf subject: MEG and fMRI studies of cross-modal non-plasticity. Human Brain Mapping, 5, 437–444.
Hickok, G., Say, K., Bellugi, U., & Klima, E. S. (1996). The basis for hemispheric asymmetries for language and spatial cognition: clues from focal brain damage in two deaf signers. Aphasiology, 10, 577–591.
Hickok, G., Wilson, M., Clark, K., Klima, E. S., Kritchevsky, M., & Bellugi, U. (1999). Discourse deficits following right hemisphere damage in deaf signers. Brain and Language, 66, 233–248.
Karmiloff-Smith, A., Klima, E., Bellugi, U., Grant, J., & Baron-Cohen, S. (1995). Is there a social module? Language, face processing, and theory of mind in individuals with Williams syndrome. Journal of Cognitive Neuroscience, 7, 196–208.
Kegl, J., Cohen, H., & Poizner, H. (1999). Articulatory consequences of Parkinson's disease: perspectives from two modalities. Brain and Cognition, 40, 355–386.
Kegl, J., & Poizner, H. (1997). Crosslinguistic/crossmodal syntactic consequences of left-hemisphere damage: evidence from an aphasic signer and his identical twin. Aphasiology, 11, 1–37.
Kimura, D. (1993). Neuromotor mechanisms in human communication. Oxford: Oxford University Press.
Larsen, B., Skinhoj, E., & Lassen, N. A. (1978). Variations in regional cortical blood flow in the right and left hemispheres during automatic speech. Brain, 101, 193–209.
Leybaert, J. (2000). The role of language in structuring working memory: evidence from deaf children educated with Cued Speech. International Journal of Psychology. Paper presented at the ICP 2000, Stockholm.
Lillo-Martin, D. (1997). The modular effects of sign language acquisition. In M. Marschark, & V. S. Everhart (Eds.), Relations of language and thought: the view from sign language and deaf children (pp. 62–109). Oxford: Oxford University Press.
Lucas, C., & Valli, C. (1990). Predicates of perceived motion in ASL. In S. D. Fischer, & P. Siple (Eds.), Theoretical issues in sign language research (Vol. 1: Linguistics, pp. 153–166). Chicago, IL: University of Chicago Press.
Lucas, C., & Valli, C. (1991). ASL or contact signing: issues of judgment. Language in Society, 20, 201–216.
Marschark, M., & Mayer, T. S. (1998). Interactions of language and memory in deaf children and adults. Scandinavian Journal of Psychology, 39, 145–148.
McCullough, S., & Emmorey, K. (1997). Face processing by deaf ASL signers: evidence for expertise in distinguishing local features. Journal of Deaf Studies and Deaf Education, 2, 212–222.
McGuire, P. K., Robertson, D., Thacker, A., David, A. S., Kitson, N., Frackowiak, R. S. J., & Frith, C. D. (1997). Neural correlates of thinking in sign language. Neuroreport, 8, 695–698.
Muller, R.-A., Rothermel, R. D., Behen, M. E., Musik, O., Mangner, T. J., & Chugani, H. T. (1997). Receptive and expressive language activations for sentences: a PET study. Neuroreport, 8, 3767–3770.
Neville, H. J. (1990). Intermodal competition and compensation in development: evidence from studies of the visual system in congenitally deaf adults. In A. Diamond (Ed.), The development and neural bases of higher cognitive functions (pp. 71–91). New York: The New York Academy of Sciences.
Neville, H. J., & Bavelier, D. (1998). Current Opinion in Neurobiology, 8, 254–258.
Neville, H. J., Bavelier, D., Corina, D., Rauschecker, J., Karni, A., Lalwani, A., Braun, A., Clark, V., Jezzard, P., & Turner, R. (1998). Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience. Proceedings of the National Academy of Sciences, 95, 922–929.
Neville, H. J., Coffey, S. A., Lawson, D. S., Fischer, A., Emmorey, K., & Bellugi, U. (1997). Neural systems mediating American sign language: effects of sensory experience and age of acquisition. Brain and Language, 57, 285–308.
Neville, H. J., & Lawson, D. (1987a). Attention to central and peripheral visual space in a movement detection task. I. Normal hearing adults. Brain Research, 405, 253–267.
Neville, H. J., & Lawson, D. (1987b). Attention to central and peripheral visual space in a movement detection task. II. Congenitally deaf subjects. Brain Research, 405, 268–283.
Neville, H. J., & Lawson, D. (1987c). Attention to central and peripheral visual space in a movement detection task. III. Separate effects of auditory deprivation and acquisition of a visual language. Brain Research, 405, 284–294.
Nishimura, H., Hashikawa, K., Doi, K., Iwaki, T., Watanabe, Y., Kusuoka, H., Nishimura, T., & Kubo, T. (1999). Sign language ``heard'' in the auditory cortex. Nature, 397, 116.
Nyberg, L., Cabeza, R., & Tulving, E. (1996). PET studies of encoding and retrieval: the HERA model. Psychonomic Bulletin and Review, 3, 135–148.
Parasnis, I., Samar, V., Bettger, J., & Sathe, K. (1996). Does deafness lead to enhancement of visual spatial cognition in children? Negative evidence from deaf nonsigners. Journal of Deaf Studies and Deaf Education, 1(2), 145–152.
Peperkamp, S., & Mehler, J. (1999). Signed and spoken language: a unique underlying system? Language and Speech, 42, 333–346.
Peterson, C. C., & Siegal, M. (1995). Deafness, conversation and theory of mind. Journal of Child Psychology and Psychiatry and Allied Disciplines, 36, 458–474.
Peterson, C. C., & Siegal, M. (1999). Representing inner worlds: theory of mind in autistic, deaf, and normal hearing children. Psychological Science, 10, 126–129.
Poizner, H., Bellugi, U., & Klima, E. S. (1987). What the hands reveal about the brain. Cambridge, MA: MIT Press.
Poizner, H., & Kegl, J. (1992). Neural basis of language and motor behavior: perspectives from American sign language. Aphasiology, 6, 219–256.
Poizner, H., Klima, E. S., & Bellugi, U. (1990). Biological foundations of language: clues from sign language. Annual Review of Neuroscience, 13, 283–307.
Poizner, H., & Tallal, P. (1987). Temporal processing in deaf signers. Brain and Language, 30, 52–62.
Ryding, E., Bradvik, B., & Ingvar, D. H. (1987). Changes of regional cerebral blood flow measured simultaneously in the right and left hemisphere during automatic speech and humming. Brain, 110, 1345–1358.
Ryding, E., Bradvik, B., & Ingvar, D. H. (1996). Silent speech activates prefrontal cortical regions asymmetrically, as well as speech-related areas in the dominant hemisphere. Brain and Language, 52, 435–451.
Rönnberg, J. (1990). Cognitive and communicative function: the effects of chronological age and ``handicap age''. European Journal of Cognitive Psychology, 2, 253–273.
Rönnberg, J. (1995). Perceptual compensation in the deaf and blind: myth or reality? In R. Dixon, & L. Bäckman (Eds.), Compensating for psychological deficits and declines: managing losses and promoting gains (pp. 251–274). Mahwah, NJ: Lawrence Erlbaum.
Rönnberg, J. (1999). Cognitive and communicative perspectives on physiotherapy: a review. Advances in Physiotherapy, 1, 37–44.
Rönnberg, J., Andersson, J., Andersson, U., Johansson, K., Lyxell, B., & Samuelsson, S. (1998). Cognition as a bridge between signal and dialogue: communication in the hearing impaired and deaf. Scandinavian Audiology, 27(Suppl. 49), 101–108.
Rönnberg, J., Andersson, J., Samuelsson, S., Söderfeldt, B., Lyxell, B., & Risberg, J. (1999). A visual speechreading expert: the case of MM. Journal of Speech, Language and Hearing Research, 42, 5–20.
Rönnberg, J., Söderfeldt, B., & Risberg, J. (1998). Regional cerebral blood flow in signed and heard episodic and semantic memory tasks. Applied Neuropsychology, 5, 132–138.
Russell, P. A., Hosie, J. A., Gray, C. D., Scott, C., Hunter, N., Banks, J. S., & Macaulay, M. C. (1998). The development of theory of mind in deaf children. Journal of Child Psychology and Psychiatry and Allied Disciplines, 39, 903–910.
Scott, C., Russell, P. A., Gray, C. D., Hosie, J. A., & Hunter, N. (1999). The interpretation of line of regard by prelingually deaf children. Social Development, 8, 412–426.
Shallice, T., Fletcher, P., Frith, C. D., Grasby, P., Frackowiak, R. S. J., & Dolan, R. J. (1994). Brain regions associated with acquisition and retrieval of verbal episodic memory. Nature, 368, 633–635.
Siple, P. (1997). Universals, generalizability, and the acquisition of sign language. In M. Marschark, & V. S. Everhart (Eds.), Relations of language and thought: the view from sign language and deaf children (pp. 24–61). Oxford: Oxford University Press.
Smith, E. E., & Jonides, J. (1997). Working memory: a view from neuroimaging. Cognitive Psychology, 33, 5–42.
Stokoe, W. C. (1960). Sign language structure: an outline of the visual communication systems of the American deaf. Studies in linguistics: occasional papers (Vol. 8). Buffalo: University of Buffalo.
Stokoe, W. C., Casterline, D., & Croneberg, C. (1965). Dictionary of American sign language. Washington, DC: Gallaudet College Press.
Stowe, L. A., Broere, C. A. J., Paans, A. M. J., Wijers, A. A., Mulder, G., Vaalburg, W., & Zwarts, F. (1998). Localizing components of a complex task: sentence processing and working memory. Neuroreport, 9, 2995–2999.
Studdert-Kennedy, M. (1983). On learning to speak. Human Neurobiology, 2, 191–195.
Sutton-Spence, R., & Woll, B. (1998). Fingerspelling as a resource for lexical innovation in British sign language. In M. K. Verma et al. (Eds.), Sociolinguistics, language and society. Language and development (Vol. 5, pp. 10–143). New Delhi, India: Sage Publications.
Söderfeldt, B., Ingvar, M., Rönnberg, J., Eriksson, L., Serrander, M., & Stone-Elander, S. (1997). Signed and spoken language perception studied by positron emission tomography. Neurology, 49, 82–87.
Söderfeldt, B., Rönnberg, J., & Risberg, J. (1994a). Regional cerebral blood flow in sign-language users. Brain and Language, 46, 59–68.
Söderfeldt, B., Rönnberg, J., & Risberg, J. (1994b). Regional cerebral blood flow during sign language perception: a comparison between deaf and hearing subjects with deaf parents. Sign Language Studies, 84, 199–208.
Söderfeldt, B., Rönnberg, J., & Risberg, J. (1995). Regional cerebral blood flow during sign language perception and speechreading in hard-of-hearing subjects. Ms.
Tallal, P., Merzenich, M., Miller, S., & Jenkins, W. (1998). Language learning impairment: integrating research and remediation. Scandinavian Journal of Psychology, 39, 197–199.
Tulving, E. (1983). Elements of episodic memory. Oxford: Clarendon Press.
Wallin, L. (1996). Polysynthetic signs in Swedish sign language. Doctoral dissertation, Stockholm University, Sweden.
Weber-Fox, C. M., & Neville, H. J. (1996). Maturational constraints on functional specializations for language processing: ERP and behavioral evidence in bilingual speakers. Journal of Cognitive Neuroscience, 8, 231–256.
Wilson, K., & Emmorey, K. (1997). Working memory for sign language: a window into the architecture of the working memory system. Journal of Deaf Studies and Deaf Education, 2, 121–130.
Wolff, A. B., & Thatcher, R. W. (1990). Cortical reorganization in deaf children. Journal of Clinical and Experimental Neuropsychology, 12, 209–221.
Woll, B. (1990). Sign language. In N. E. Collinge (Ed.), An encyclopedia of language (pp. 740–783). London: Routledge & Kegan Paul.

