Review

The neural organization of language: evidence from sign language aphasia

Gregory Hickok, Ursula Bellugi and Edward S. Klima

G. Hickok is at the Department of Cognitive Sciences, University of California, Irvine, CA 92697, USA (tel: +1 714 824 1409; fax: +1 714 824 2307; e-mail: gshickok@uci.edu). U. Bellugi is at the Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA. E.S. Klima is at the Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA, and the Department of Linguistics, University of California, San Diego, La Jolla, CA 92093, USA.

To what extent is the neural organization of language dependent on factors specific to the modalities in which language is perceived and through which it is produced? That is, is the left-hemisphere dominance for language a function of a linguistic specialization or a function of some domain-general specialization(s), such as temporal processing or motor planning? Investigations of the neurobiology of signed language can help answer these questions. As with spoken languages, signed languages of the deaf display complex grammatical structure but are perceived and produced via radically different modalities. Thus, by mapping out the neurological similarities and differences between signed and spoken language, it is possible to identify modality-specific contributions to brain organization for language. Research to date has shown a significant degree of similarity in the neurobiology of signed and spoken languages, suggesting that the neural organization of language is largely modality-independent.

While the left cerebral hemisphere's dominance for language is well established, the basis of this asymmetry is still a matter of debate. Some have argued that the left hemisphere contains neural systems specialized for processing linguistic information [1-3], while others have claimed that these left-hemisphere systems are not specialized for language processing per se, but rather for a more domain-general process, or set of processes [4,5]. On the latter view, there is no direct neurobiological association between language itself and the left hemisphere, except insofar as language processes involve these more basic, domain-general operations. Two principal candidates have been proposed for a domain-general basis of language asymmetry: (1) processing of rapidly changing temporal information [4], and (2) controlling the articulation of complex motor sequences [5]. To be sure, processing language involves neural mechanisms such as these to some extent, and there is evidence to suggest that an impairment in fast temporal processing or in the articulation of motor commands can affect aspects of language performance, at least under some conditions [4,5]. However, in this paper we review data that call into question the hypothesis that left-hemisphere dominance for language can be reduced fully to domain-general processes. The data come from studies of deaf individuals who have unilateral brain lesions and whose primary means of communication is American Sign Language (ASL). This population provides a means to examine the neurobiology of a natural, highly structured human language (see Box 1) with modality-dependent effects (such as a heavy reliance on processing fast temporal information) factored out, and it is thus well suited to addressing questions concerning the basis of brain organization for language generally.

In discussing the neural organization of language, it is worth clarifying at the outset exactly which aspects of language we are talking about. Language is a complex, multi-level system. One can talk about phonological, morphological and syntactic processes, as well as lexical, semantic, discourse and pragmatic-level processes, among others. When it is claimed that language is predominantly processed by the left hemisphere, this usually refers to the phonological, morphological and syntactic levels (what can be called grammatical structure), as well as aspects of lexical-semantic processing. Unless otherwise indicated, we will follow this convention when we refer to sign language.

Box 1. Sign language as a natural human language

Like spoken languages, signed languages of the deaf are formal, highly structured linguistic systems, passed down from one generation to the next, with a rigid developmental course, including a critical period for acquisition [a,b]. Signed languages have emerged independently of the language used among hearing individuals in the surrounding community: ASL and British Sign Language, for example, are mutually incomprehensible, despite the fact that English is the dominant spoken language in both surrounding communities. Signed and spoken languages, however, share all the underlying structural complexities of human language [c]. That is, all natural human languages have linguistic structure at phonological, morphological and syntactic levels, and signed languages are no exception (see Figs A-C).

At the phonological level, research has shown that, like the words of spoken languages, signs are fractionated into sub-lexical elements, including various recurring handshapes, articulation locations, and limb/hand movements, among other features [d,e]. Further, comparison of two different signed languages (ASL and Chinese Sign Language) reveals that there are even fine-level systematic phonetic differences, leading to an 'accent' when native users of one sign language learn another [f,g]. At the morphological level, ASL, for example, has developed grammatical markers that serve as inflectional and derivational morphemes; these are regular changes in form across classes of lexical items associated with systematic changes in meaning [g]. At the syntactic level, ASL specifies relations among lexical items using a variety of mechanisms, including (a) sign order, (b) the manipulation of sign forms (usually verbs) in space, where different spatial relations between signs have systematic differences in meaning, and (c) a small set of grammaticized facial expressions that are used to mark questions, topicalized sentences, and conditionals [h-j].

In sum, ASL has developed as a fully autonomous language, with grammatical structuring at the same levels as spoken language and with similar kinds of organizational principles. Yet the surface form that this grammatical structuring assumes in a visual-spatial language is shaped by the modality in which the language developed, in that there is a strong tendency to encode grammatical relations spatially rather than temporally. The implication for research on the neurobiology of language is that we have the opportunity to study a linguistic system that is essentially identical to spoken language in its underlying linguistic (i.e. representational) structure, but that is implemented in a radically different perceptual signal. In effect, we have a well-controlled experimental manipulation: linguistic structure is held constant while surface perceptual form is varied. By studying the brain organization of such a system as it compares to spoken language, we can gain insights into the factors that drive brain organization for language.

Fig. A,B. Spatial contrasts at the lexical and morphological levels in ASL. (A) The same sign at different spatial locations has three different meanings. (B) Different inflections of the word 'give', shown by increasing sign complexity. (Reproduced, with permission, from Ref. 15.)

Fig. C. The syntactic level in ASL. Illustration of a complete sentence by the spatial relation between signs. Examples are seen from in front of and from above the signer.

References
a Newport, E. and Meier, R. (1985) The acquisition of American Sign Language, in The Crosslinguistic Study of Language Acquisition: The Data (Vol. 1) (Slobin, D.I., ed.), pp. 881-938, Erlbaum
b Newport, E.L. (1991) Contrasting conceptions of the critical period for language, in The Epigenesis of Mind: Essays in Biology and Cognition (Carey, S. and Gelman, R., eds), pp. 111-130, Erlbaum
c Goldin-Meadow, S. and Mylander, C. (1998) Spontaneous sign systems created by deaf children in two cultures, Nature 391, 279-281
d Corina, D. and Sandler, W. (1993) On the nature of phonological structure in sign language, Phonology 10, 165-207
e Perlmutter, D.M. (1992) Sonority and syllable structure in American Sign Language, Linguistic Inquiry 23, 407-442
f Poizner, H., Klima, E.S. and Bellugi, U. (1987) What the Hands Reveal About the Brain, MIT Press
g Klima, E. and Bellugi, U. (1979) The Signs of Language, Harvard University Press
h Lillo-Martin, D. and Klima, E.S. (1990) Pointing out differences: ASL pronouns in syntactic theory, in Theoretical Issues in Sign Language Research, Vol. 1: Linguistics (Fischer, S.D. and Siple, P., eds), pp. 191-210, University of Chicago Press
i Lillo-Martin, D. (1991) Universal Grammar and American Sign Language: Setting the Null Argument Parameters, Kluwer Academic Publishers
j Liddell, S. (1980) American Sign Language Syntax, Mouton

The signs of aphasia

A number of case studies have suggested that in deaf life-long signers the left hemisphere is dominant for the comprehension and production of signed language [3,6-9] (see also previous case-study reviews [3,10,11]). In these studies, left-hemisphere damaged (LHD) signers presented with disruptions in various aspects of their signing ability (aphasia), whereas right-hemisphere damaged (RHD) signers were non-aphasic. Here, we provide some examples of the types of sign language deficits found in individual cases, which make the point that the kinds of deficits observed in deaf signers are quite similar to those found in hearing aphasics. We will then turn to group-level studies comparing the effects of left- versus right-hemisphere damage on sign language ability.

Case studies

(1) Paraphasic errors: a hallmark of aphasia is the presence of paraphasic errors in language production. Several types of paraphasic errors occur to varying degrees in different types of aphasia. These include literal or phonemic paraphasias, in which errors are made in the sound pattern of words (e.g. 'tagle' for 'table'); verbal or semantic paraphasias, in which a semantically related word is substituted for the target (e.g. 'uncle' for 'brother'); and paragrammatic paraphasias, in which inappropriate or neologistic words or morphemes are selected during running speech. A similar range of paraphasic errors has been documented in many of the LHD signers that have been studied. Some examples of phonemic and paragrammatic paraphasias produced by LHD signers are shown in Fig. 1. Note that the phonemic errors (Fig. 1A) represent a substitution of one ASL phoneme for another, whereas the paragrammatic error (Fig. 1B) represents an illegal combination of ASL morphemes: the root sign 'brilliant' combined with the grammatical inflection that conveys a meaning similar to 'characteristically'. Errors like these are fairly common in LHD signers, but not in RHD signers [15].

Fig. 1. Examples of paraphasias in deaf left-hemisphere damaged (LHD) signers. In phonological errors (A) an incorrect American Sign Language (ASL) phoneme is substituted for the correct one. In paragrammatic errors (B) an illegal combination of ASL morphemes or signs is used.

(2) Fluent and non-fluent aphasias: the distinction between fluent and non-fluent aphasia types (defined in terms of the number of words uttered in an uninterrupted string) is prominent in aphasia research and appears to be a fairly robust dichotomy [12]. This distinction has also been observed in sign language aphasia [11]. One subject presented with a Broca's-aphasia-like syndrome in which her sign output was extremely effortful, dysarthric, and restricted to one or two signs at a time (mostly nouns); comprehension was relatively spared. Consistent with data from hearing aphasics, her lesion involved most of the left lateral frontal lobe [3]. A contrasting case sustained a mostly posterior lesion. He was able to string together sequences of several signs effortlessly, but he produced frequent paraphasic errors, including several paragrammatical errors [3].

(3) Naming: a virtually universal symptom of aphasia is difficulty in retrieving words, a deficit that is often revealed when subjects are asked to provide the name for pictured objects or actions [13]. This appears to be true also for sign language aphasia: on a 32-item naming test, RHD subjects correctly named an average of 31.1 items (range 30-32), whereas LHD subjects correctly named an average of 25.2 items (range 12-32), with 61% scoring below the RHD range. Excluding LHD subjects with non-perisylvian or subcortical lesions, the percentage of subjects scoring below the RHD range increased to 80% (8/10), and one of the two subjects who performed well on the naming task had marked sign-finding problems during conversational production. Difficulty in naming, then, appears to be a pervasive symptom of sign language aphasia, just as it is in spoken language aphasia.
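To make the 'below the RHD range' criterion in (3) concrete, here is a minimal Python sketch of the computation. The study reports only group summaries, so the per-subject scores below are hypothetical values chosen to be consistent with those summaries, not the actual data:

```python
# Hypothetical naming-test scores (max 32), chosen to match the reported
# summaries: RHD mean 31.1, range 30-32; LHD mean ~25, range 12-32.
rhd_scores = [30, 30, 31, 31, 31, 31, 31, 32, 32, 32]              # 10 RHD signers
lhd_scores = [12, 17, 20, 22, 24, 26, 28, 29, 30, 30, 31, 32, 32]  # 13 LHD signers

cutoff = min(rhd_scores)  # worst RHD score (30) defines the floor of the RHD range
below = [s for s in lhd_scores if s < cutoff]

print(f"RHD floor: {cutoff}")
print(f"LHD scoring below the RHD range: {len(below)}/{len(lhd_scores)} "
      f"({100 * len(below) / len(lhd_scores):.1f}%)")  # 8/13 = 61.5%
```

Dropping the three hypothetical LHD subjects at or above the cutoff who stand in for the non-perisylvian/subcortical cases would give the reported 8/10 (80%) figure.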

(4) Agrammatism: a common feature of aphasia produced by anterior left perisylvian lesions is agrammatism, a tendency to omit grammatical function morphemes in speech [13]. A similar deficit has been observed in a Broca's-aphasia-like subject [3]. Her production, in addition to being severely non-fluent, lacked all of the syntactic and morphological markings required in ASL.

(5) A case of 'sign blindness': pure word blindness, or alexia without agraphia, has been well documented in the literature [14]. Patients with this disorder have normal spoken and heard language capacity and are able to write, but cannot read even their own writing. The lesion typically involves the left primary visual cortex and the splenium of the corpus callosum. Language areas are preserved, allowing normal production, comprehension and writing, but they are isolated from visual input. A recently reported deaf signer had such a lesion and, as expected, was alexic [9]. Would her signing be similarly affected? The answer was indeed yes. Her signing was fluent and grammatical, yet her comprehension was profoundly impaired; she could not follow even simple one-step ASL commands. Visual object recognition, however, was unimpaired. It would appear that this patient was essentially blind for sign language as a result of her left medial occipito-temporal lesion, which isolated the left-hemisphere language systems from visual information. This case provides strong evidence favoring the view that the left hemisphere is dominant for ASL in deaf individuals, because it demonstrates that the right hemisphere by itself has little capacity to process signed language beyond rudimentary single-sign comprehension [9].

(6) Double dissociation between sign language ability and non-linguistic spatial cognitive ability: to determine whether deficits in sign language processing are simply a function of deficits in general spatial cognitive ability, standard measures of visuospatial cognition have been administered. Examples of performance on such tasks by four aphasic LHD signers and four non-aphasic RHD signers are presented in Fig. 2, showing clear dissociations between language and non-linguistic visuospatial abilities [3,15].

Group studies

(1) Language assessment: a recent group study comparing 13 LHD and 10 RHD signers on a range of standard language tests has confirmed the hypothesis, suggested by the case studies, that the left hemisphere is dominant for sign language [15]. Using an ASL-adapted version of the Boston Diagnostic Aphasia Examination [16], this study assessed each subject's competence in several basic aspects of language use: production, comprehension, naming and repetition. The LHD signers performed significantly worse than the RHD signers on all measures. The differences held even in the subset of subjects who were both deaf from birth and exposed to ASL prelingually (3 RHD, 3 LHD). Crucially, this subset includes contrasting LHD and RHD subjects with large perisylvian lesions.

Fig. 2. Performance on spatial cognitive tasks by four aphasic LHD signers (top) and four non-aphasic RHD signers (bottom). The task in each case was to copy the model drawing in the centre panel. RHD signers show a greater impairment than LHD signers, indicating a dissociation between language and non-linguistic visuospatial functions. (Reproduced, with permission, from Ref. 15.)

(2) Non-linguistic spatial cognitive assessment: it was noted above that a double dissociation has been documented between sign language deficits and deficits in non-linguistic visuospatial abilities. The visuospatial abilities assessed in that work involved relatively gross, global disruptions (see Fig. 2) of the type typically associated with right-hemisphere damage. While these data make the point that there is a fair degree of separability between language and gross visuospatial function, they do not address the question of whether more subtle visuospatial deficits, of the type commonly associated with left-hemisphere damage in the hearing/speaking population [17], might underlie sign language deficits. A recent experiment addressed this issue [18]: a group of left- or right-lesioned deaf signers were asked to reproduce drawings that contained hierarchically organized structure [17]. Consistent with data from hearing subjects, LHD deaf subjects were significantly better at reproducing global-level features (global configuration), whereas RHD deaf subjects were significantly better at reproducing local-level features (internal details). However, local-level visuospatial deficits in LHD signers did not correlate with expressive or receptive sign language measures. These findings suggest that language deficits in LHD deaf signers cannot be attributed to domain-general visuospatial deficits.

Within-hemisphere organization of signed language

Given that the sensory/motor and input/output systems are radically different in signed versus spoken language, one might expect significant reorganization of the within-hemisphere areas involved in signed language processing. For example, it seems reasonable to suppose that regions involved in sign language perception would be located in visual association cortex, rather than in auditory association cortex as is the case for spoken language. Likewise, one might hypothesize that signed language production would involve pre-motor regions anterior to the part of motor cortex representing the hand and arm, rather than anterior to motor cortex for the oral articulators, where Broca's area is. Given the sparseness of the clinical population, this is a difficult issue to address using the lesion approach, but some preliminary data have emerged.

To the extent that the types and patterns of deficits found in aphasia for sign language are similar to those found in aphasia for spoken language, a common functional organization is suggested between the two forms of language within the left hemisphere. There do seem to be a number of commonalities in the language deficits found in signed and spoken language. Thus, many of the aphasic symptom clusters that have been observed in deaf signers fall into classical clinical categories defined on the basis of hearing aphasics, and the lesions producing these patterns of deficits in LHD signers are consistent with clinical-anatomic correlations in the hearing population. Examples include the following observations: there has not been a case in which a lesion outside the perisylvian language zone has led to a primary aphasia (although admittedly there have not been many subjects with extra-perisylvian lesions); non-fluent aphasic signers have lesions involving anterior language regions, and fluent aphasic signers have lesions involving posterior language regions [11]; and Broca's area appears to play a role in ASL production [19] (see Box 2). In addition, the range of common deficit types reported in hearing aphasics has been observed regularly in sign language aphasia (see above). Thus, based on available evidence, it is reasonable to hypothesize that the functional organization of signed and spoken language in the left hemisphere is very similar.

Box 2. The role of Broca's area in sign language production

Broca's area has figured prominently in hypotheses concerning the anatomy of speech production [a]. While recent studies have shown convincingly that lesions restricted to Broca's area do not lead to a lasting, severe speech production deficit [b-d], evidence from the acute postictal syndrome, from cortical stimulation [e,f], and from functional neuroimaging [g,h] suggests at least some role for Broca's area in speech production. The idea that Broca's area is involved in speech production holds considerable intuitive appeal: it makes sense that an area involved in programming speech should be topographically situated near the motor cortex controlling speech-related musculature. In fact, it is tempting to hypothesize that the location of motor cortex for the speech articulators is what drives the topographic organization of Broca's area (functionally speaking) in development. Determining whether Broca's area (anatomically speaking) plays a role in the production of sign language (which uses articulators that are controlled by superior-lateral motor cortex) will contribute to answering the question: to what extent is the cerebral organization of language areas driven by the cerebral topography of the sensory-motor systems?

Relevant to this question is a recent case study of a congenitally deaf, native user of ASL, who suffered an ischemic infarct involving Broca's area and the inferior portion of the primary motor cortex [i] (see Fig.). This study revealed both similarities and differences in the aphasia syndrome compared with that found in hearing/speaking subjects following such a lesion. In short, the set of symptoms observed in this case of sign language aphasia was a superset of that noted in spoken language [c,d]: consistent with the effects of similar lesions in hearing subjects, the deaf subject presented with an acute mutism that quickly resolved. But whereas hearing subjects are typically left with only a very mild aphasia, or no aphasia at all, the deaf subject was left with a chronic fluent aphasia characterized by frequent phonemic paraphasias. These findings suggest that Broca's area does indeed play a role in sign language production, which in turn suggests that at least some aspects of the within-hemisphere organization for language are modality independent.

Fig. Broca's area lesion in a deaf native signer. (A) Cortical and subcortical regions corresponding to Broca's area. (B) Lesioned area in the patient, shown in increasingly dorsal horizontal sections. (Reproduced, with permission, from Ref. i.)

References
a Mohr, J.P. (1976) Broca's area and Broca's aphasia, in Studies in Neurolinguistics (Vol. 1) (Whitaker, H. and Whitaker, H.A., eds), pp. 201-235, Academic Press
b Mohr, J.P. et al. (1978) Broca's aphasia, Pathol. Clin. Neurol. 28, 311-324
c Alexander, M.P., Naeser, M.A. and Palumbo, C. (1990) Broca's area aphasias: aphasia after lesions including the frontal operculum, Neurology 40, 353-362
d Tonkonogy, J. and Goodglass, H. (1981) Language function, foot of the third frontal gyrus, and rolandic operculum, Arch. Neurol. 38, 486-490
e Penfield, W. and Roberts, L. (1959) Speech and Brain-Mechanisms, Princeton University Press
f Ojemann, G.A. (1983) Brain organization for language from the perspective of electrical stimulation mapping, Behav. Brain Sci. 6, 189-230
g Petersen, S.E. et al. (1988) Positron emission tomographic studies of the cortical anatomy of single-word processing, Nature 331, 585-589
h Hinke, R.M. et al. (1993) Functional magnetic resonance imaging of Broca's area during internal speech, NeuroReport 4, 675-678
i Hickok, G. et al. (1996) The role of the left frontal operculum in sign language aphasia, Neurocase 2, 373-380

Functional dissociations

Here, we describe a number of functional dissociations that shed light on the relation between language and other cognitive domains. The first two concern the separability of sign language deficits from non-linguistic processes that involve the manual modality, namely manual praxis and gestural ability; the third examines the use of space to represent linguistic information in ASL (as in identifying the subject and object of a sentence) compared with the use of space to represent spatial information itself.

On the relation between apraxia and aphasia

Dissociations between motor control in the service of sign language and in the service of non-linguistic behaviors have been reported by several authors [3,7,20]. Some of these data come from fine-grained analyses of the effects of Parkinson's disease (PD) on deaf signers. Poizner and colleagues [20,21] report that while signers with PD exhibit disruptions in temporal organization and coordination during sign production, linguistic distinctions are preserved, in contrast to aphasic signers. Other workers, however, have reported an association between sign language disruptions and disruptions of domain-general motor control, which has led to the claim that sign aphasia is merely a reflection of apraxia [5]. To address this claim, a group of LHD signers were asked to copy non-representational manual movements [3,5,22] using the arm ipsilateral to the lesion. Varying degrees of disruption in the ability to perform this task were noted, but apraxia scores did not correlate significantly with measures of sign production during connected signing [15]. These data suggest that there is a significant amount of variability in at least some aspects of sign language disruption that cannot be accounted for solely by a disruption of motor control.
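The logic of that null result can be made explicit with a small sketch. If sign aphasia were merely apraxia, apraxia severity should track sign-production impairment across LHD signers; a correlation near zero argues against that reduction. The Python sketch below computes a Pearson correlation over hypothetical severity scores (illustrative numbers only; the study's actual measures and values are not reproduced here):

```python
from statistics import mean, pstdev

# Hypothetical severity scores for ten LHD signers (higher = worse).
apraxia     = [3, 9, 5, 2, 8, 4, 7, 1, 6, 5]  # errors copying non-representational movements
sign_errors = [9, 4, 8, 4, 3, 1, 7, 4, 7, 3]  # paraphasias during connected signing

def pearson_r(x, y):
    """Pearson correlation: covariance normalized by the two standard deviations."""
    mx, my = mean(x), mean(y)
    cov = mean([(a - mx) * (b - my) for a, b in zip(x, y)])
    return cov / (pstdev(x) * pstdev(y))

# A value near zero means apraxia severity does not predict sign-production
# impairment, which is the pattern reported for the actual LHD group.
print(f"r = {pearson_r(apraxia, sign_errors):.2f}")  # r = -0.02 for these numbers
```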

Dissociation of gesture and sign language

Evidence supporting the view that deficits in sign language are qualitatively different from deficits in the ability to produce and understand pantomimic gesture comes from a case study of an LHD signer [8]. Following an ischemic infarct involving both anterior and posterior perisylvian regions, the subject became aphasic for sign language. His comprehension was poor, and his sign production was characterized by frequent paraphasias and a tendency to substitute pantomime for ASL signs, a tendency not present prior to his stroke. These pantomimic gestures were used even in cases in which the gesture involved similar or more elaborate sequences of movements, arguing against a complexity-based explanation of his performance. A similar dissociation was noted in his comprehension of signs versus pantomime. This case makes the point that disruptions in sign language ability are not merely the result of more general disruptions in the ability to communicate through symbolic gesture.

Use of space to communicate grammatical information versus spatial information directly

In addition to using space to encode grammatical information, ASL uses space to represent spatial information directly, as, for example, in describing the layout of objects in a room. It is worth making clear the distinction between the grammatical use of space in ASL, as in the encoding of phonological, morphological and syntactic information described previously, and the use of space to encode spatial information directly in ASL discourse. The latter refers to the ability to use language to communicate spatial information, and it occurs in both signed and spoken language. Spoken language communicates spatial information through the use of prepositions and spatial description words, as in, 'The cup is near the left, front corner of the table.' Note that the grammatical structure of such a sentence is independent of how accurate the spatial information is. In ASL, instead of using lexical means to communicate spatial information, in many cases the location of objects relative to one another is literally mapped out in (signing) space. Again, the grammatical structure of a signed sentence is independent of the truth value of its content.

Can the grammatical use of space (i.e. language) be dissociated from the use of space to communicate spatial information (i.e. spatial cognition), even when these types of information are expressed in the same channel? Two deaf, native signers, one LHD and one RHD, participated in comprehension tasks involving these two uses of space within ASL [23]. The grammatical task involved signed sentences similar to 'the cat chased the dog', in which the grammatical subject and object of the verb were indicated spatially. The spatial task involved a signed description of the layout of furniture in a room. In both tasks, subjects were asked to match the signed stimulus to a picture. The LHD signer was impaired on the grammatical task but performed well on the spatial task; the RHD signer showed the reverse pattern. These data suggest that the neural organization for language and spatial cognition is driven by the type of representation that is ultimately constructed from the signal (grammatical versus spatial), rather than by the physical properties of the signal itself.

Neuroanatomy of sign language as revealed by functional neuroimaging

Recent studies have begun looking at the neural organization of sign language using various functional imaging techniques. One such study addressed the question: does Broca's area play a role in sign language production? Native deaf signers participated in a functional magnetic resonance imaging (fMRI) experiment in which they covertly produced ASL signs. Similar tasks (including covert production) have produced fMRI activations in Broca's and Wernicke's areas in hearing subjects producing spoken words [24]. In preliminary analyses, it was found that deaf subjects did indeed show activation in Broca's area (G. Hickok et al., reported at Society for Neuroscience, 1995), corroborating earlier lesion data [19]. Another recent fMRI study, carried out by Neville and colleagues [25], has provided further evidence that classic left-hemisphere language regions are activated during the perception of ASL sentences by deaf native signers. Event-related potential (ERP) work [26] has also suggested that the within-hemisphere organization of the neural systems mediating grammatical and lexical aspects of signed and spoken language is quite similar. Taken together, these recent neuroimaging studies corroborate the lesion data and suggest that at least some major components of the neural organization for language processing are modality independent.

There has been the additional suggestion, based primarily on some of the above-mentioned neuroimaging work, that while classic left-hemisphere language areas are involved in sign language production and comprehension, there may be more extensive involvement of right-hemisphere systems in processing signed compared with spoken language [25-27], a claim that appears contradictory to what has been concluded from lesion work. There are several possible explanations for this discrepancy. The first is that the functional imaging studies contrasted deaf subjects perceiving sign language sentences with hearing subjects reading printed English sentences. Thus the conditions differ in (1) their prosodic content, a right-hemisphere language function [28,29], and (2) the visual presence of a human source for the linguistic signal, which may have included non-linguistic communicative signals as well. Secondly, the sign stimuli used in these studies may have incorporated some of the spatial description mechanisms described above, which could have driven right-hemisphere systems. A final, more general point is that lesion studies often target specific behaviors (e.g. naming ability, phonological processing) and ignore others (e.g. discourse, prosodic abilities), whereas an imaging study may reflect processes across a wider range of linguistic and non-linguistic domains simultaneously. It is therefore difficult to know which aspect(s) of the stimulus, or which higher-level cognitive operation on the stimulus, is driving the additional activations (e.g. deaf people might use visual imagery to a greater extent during language processing). Much work thus remains to be done before strong conclusions can be drawn from these neuroimaging studies regarding possible differences in the hemispheric organization of signed and spoken language, but the general claim regarding the involvement of classic left-hemisphere systems in deaf signers has been confirmed by these results.

Outstanding questions

- Language processes appear to be separable from domain-general processes in the adult. Might these domain-general processes contribute to language organization in development, or perhaps in evolution?
- How do lower-level visual perception systems, such as those involved in processing motion and shape, interact with the perception of the movement and handshape components of signed utterances?
- What neural pathways are involved in the transmission of information from the visual system to temporal-lobe language systems in sign language perception? Might these pathways overlap with those involved in the perception of the printed form of spoken language by hearing/speaking individuals?
- Will it be possible to treat a patient with sign blindness by training them to perceive sign language in the tactile modality, as is done with deaf-blind individuals? The analysis of sign blindness presented above predicts that this should be possible.

Conclusion

Research investigating the neural organization of signed language and spatial cognition in deaf individuals has demonstrated the same hemispheric asymmetries found in hearing, speaking individuals. This suggests that the hemispheric organization of grammatical aspects of language is independent of modality and, more specifically, unaffected by the fact that signed language involves a significant degree of visuospatial processing. Recent investigations of the within-hemisphere organization of sign language have hinted that there is also some degree of similarity in the neural organization of signed and spoken language even within the left hemisphere, although much work remains. Since the time-course of sign language articulation is significantly slower than that of spoken language, the existence of sign language deficits following left-hemisphere damage argues against the view that the lateralization of language is simply a function of a more general left-hemisphere bias for rapid temporal processing. Further, a series of functional dissociations has shown that deficits in processing sign language cannot be explained in terms of other domain-general deficits, such as deficits of manual praxis or symbolic communication (pantomime). Finally, dissociations within aspects of sign language, and between sign language and non-linguistic spatial abilities (all of which might fall under the general rubric of 'visuospatial processing'), suggest that the functional organization of cognitive systems is to some extent modular, with the modules organized with respect to the representational properties of the systems (e.g. grammatical representations versus purely spatial representations), rather than in terms of the physical characteristics of the stimulus (e.g. visuospatial versus temporal).

Acknowledgement

This research was supported in part by National Institutes of Health grant R01 DC00201 to U.B. at The Salk Institute for Biological Studies.

References
1 Bellugi, U., Poizner, H. and Klima, E. (1989) Language, modality, and the brain, Trends Neurosci. 10, 380-388
2 Corina, D.P., Jyotsna, V. and Bellugi, U. (1992) The linguistic basis of left hemisphere specialization, Science 255, 1258-1260
3 Poizner, H., Klima, E.S. and Bellugi, U. (1987) What the Hands Reveal About the Brain, MIT Press
4 Tallal, P., Miller, S. and Fitch, R.H. (1993) Neurobiological basis of speech: a case for the pre-eminence of temporal processing, Ann. New York Acad. Sci. 682, 27-47
5 Kimura, D. (1993) Neuromotor Mechanisms in Human Communication, Oxford University Press
6 Kimura, D., Battison, R. and Lubert, B. (1976) Impairment of nonlinguistic hand movements in a deaf aphasic, Brain Lang. 3, 566-571
7 Chiarello, C., Knight, R. and Mandel, M. (1982) Aphasia in a prelingually deaf woman, Brain 105, 29-51
8 Corina, D. et al. (1992) Dissociation between linguistic and nonlinguistic gestural systems: a case for compositionality, Brain Lang. 43, 414-447
9 Hickok, G. et al. (1995) A case of sign blindness following left occipital damage in a deaf signer, Neuropsychologia 33, 1597-1606
10 Kimura, D. (1981) Neural mechanisms in manual signing, Sign Lang. Studies 33, 291-312
11 Corina, D. (1998) The processing of sign language: evidence from aphasia, in Handbook of Neurolinguistics (Stemmer, B. and Whitaker, H.A., eds), pp. 313-329, Academic Press
12 Goodglass, H., Quadfasel, F. and Timberlake, W. (1964) Phrase length and the type and severity of aphasia, Cortex 1, 133-153
13 Goodglass, H. (1993) Understanding Aphasia, Academic Press
14 Friedman, R.B. and Albert, M.L. (1985) Alexia, in Clinical Neuropsychology (Heilman, K.M. and Valenstein, E., eds), pp. 49-73, Oxford University Press
15 Hickok, G., Bellugi, U. and Klima, E.S. (1996) The neurobiology of signed language and its implications for the neural basis of language, Nature 381, 699-702
16 Goodglass, H. and Kaplan, E. (1976) The Assessment of Aphasia and Related Disorders, Lea & Febiger
17 Delis, D.C., Kiefner, M.G. and Fridlund, A.J. (1988) Visuospatial dysfunction following unilateral brain damage: dissociations in hierarchical and hemispatial analysis, J. Clin. Exp. Neuropsychol. 10, 421-431
18 Hickok, G., Kirk, K. and Bellugi, U. Lateralization of local and global processing and its relation to sign language aphasia in deaf brain-lesioned signers, in Proc. Fifth Annu. Meet. Cognit. Neurosci. Soc., 5-7 April 1998, MIT Press/Cognit. Neurosci. Soc. (in press)
19 Hickok, G. et al. (1996) The role of the left frontal operculum in sign language aphasia, Neurocase 2, 373-380
20 Poizner, H. and Kegl, J. (1993) Neural disorders of the linguistic use of space and movement, Ann. New York Acad. Sci. 682, 192-213
21 Brentari, D., Poizner, H. and Kegl, J. (1995) Aphasic and parkinsonian signing: differences in phonological disruption, Brain Lang. 48, 69-105
22 Kimura, D. (1982) Left-hemisphere control of oral and brachial movements and their relation to communication, Philos. Trans. R. Soc. London Ser. B 298, 135-149
23 Hickok, G. et al. (1996) The basis of hemispheric asymmetry for language and spatial cognition: clues from focal brain damage in two deaf native signers, Aphasiology 10, 577-591
24 Hinke, R.M. et al. (1993) Functional magnetic resonance imaging of Broca's area during internal speech, NeuroReport 4, 675-678
25 Neville, H. et al. (1998) Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience, Proc. Natl. Acad. Sci. U. S. A. 95, 922-929
26 Neville, H.J. et al. (1997) Neural systems mediating American Sign Language: effects of sensory experience and age of acquisition, Brain Lang. 57, 285-308
27 Corina, D.P. (1998) Studies of neural processing in deaf signers: toward a neurocognitive model of language processing in the deaf, J. Deaf Studies Deaf Educ. 3, 35-48
28 Ross, E.D. (1981) The aprosodias: functional anatomic organization of the affective components of language in the right hemisphere, Arch. Neurol. 38, 561-569
29 Ross, E.D. et al. (1988) Acoustic analysis of affective prosody during right-sided Wada Test: a within-subjects verification of the right hemisphere's role in language, Brain Lang. 33, 128-145