Several people, deaf and hearing, have asked me why I’m studying Language Sciences and why I get so enthused about people’s brains (disclaimer: I only like them as ghostly digital scans – the pink and fibrous horror of exposed and dissected brains is completely revolting). What are you doing that for, they ask? The answer is that the study of atypical signing (signing brains that are damaged or otherwise different in some way) not only helps us gather further evidence that sign languages really are full and proper languages, but is now giving us a different and necessary perspective on the cognitive processes behind all language: in other words, both the similarities and differences between spoken and signed languages tell us something useful about language and the potential of the human brain. Sometimes people get frustrated when the answers to their questions are just more questions, but I love it: a better question should be a good enough answer.
We have known for over a century that fairly specific regions of the left hemisphere of the brain are critical for language (in over 90% of the population). Damage to these left hemisphere regions usually creates a range of difficulties with producing or understanding language, generally termed “aphasia”, and damage to the equivalent region in the right hemisphere does not result in the same difficulties: this seems to be true for both signed and spoken languages. But just why the left side is “special” remains unclear, and the waters have been muddied for many of us by a lot of mainstream media coverage that makes wild and vague claims about “left-brained” and “right-brained” people (“neurobollocks”, if you’ll pardon the technical term).
So does the right hemisphere play no role in language? Does a specific brain impairment affect the use of all languages in the same way? What are we assuming that language is, anyway? Here are a few studies of right hemisphere involvement in language use.
Spaces and places
Atkinson et al. (2005) investigated the difficulties that deaf native BSL users have with tasks involving the comprehension of different types of locative sentence after they had suffered a stroke (CVA). A “locative sentence” is one which gives information about the relationships of objects in space, and BSL and most other sign languages have at least two distinct ways to do this. First, there is the use of “entity classifiers”, symbolic indicators of the form or function of objects or actions, to directly and visually indicate spatial relationships or movements: this is a feature of sign languages which has no direct equivalent in spoken languages. A second, distinct feature found in some people’s sign language production is the “prepositional” sentence, similar to structures found in many spoken languages.
The subjects were in three groups: those with left hemisphere damage (LHD), those with right hemisphere damage (RHD), and those with no damage (controls). An examiner demonstrated a series of short sentences containing both types of construction, for example, a BSL sentence which could be translated into English as “the pen is on the book”; the subjects then selected the one of four pictures which they believed the sentence represented. One of the pictures was always a direct reversal of the target (eg. a book on a pen) and one was a semantic substitution (a pen on a chair). The results were that the LHD group tended to have difficulties with the prepositional sentences, while the RHD group had difficulties with these and with distinguishing between reversed items in the classifier sentences; the LHD group chose fewer reversals than the RHD group but made more semantic errors.
Although the authors caution that the groups were small and the findings tentative, they note that the results for their deaf LHD group are consistent with other findings that left hemisphere damage tends to produce difficulties with lexical selection and with prepositional sentences in non-deaf subjects. However, the study also indicates that RHD does create a different class of linguistic difficulties for signed language users, arising from a reduced ability to directly perceive spatial or topographical relationships. This raises questions about the degree of separation assumed in other studies between the linguistic and spatial faculties, and about what we mean by a “language impairment”.
Beyond single words and sentences
The study of aphasia (language impairment from LHD) and the conclusions drawn about the nature of language are often focused on deficits at the lexical and grammatical levels – single words and single sentences. But the effective use of language does require longer-term planning and maintenance at the discourse level – skilled language users can construct a point, and take several sentences to do it – and there is evidence that the right hemisphere plays a role in this. In Emmorey (2002, pp.304-306) and Woll (2012, p.4), you can read reviews of several cited studies which indicate that RHD signers exhibit various language impairments at the discourse level, similar to RHD spoken language users, such as going off at tangents or getting stuck on irrelevant details. But looking at atypical signing added some refinements. There was some evidence that impaired spatial cognition plays a role for some RHD atypical signers, through a reduced ability to maintain “placement” and characterise “role shifts” (features of signed but not spoken languages); yet there were also individual cases where spatial reference broke down only at the wider discourse level, with indexes and verb agreements being preserved at sentence level (Hickok et al., 1999). It seems that impairment in spatial cognition does not correlate well with impaired discourse ability, even in signed language users, who make heavy use of spatial cognition.
No, no and no
Languages are very flexible and endlessly creative, and British Sign Language (BSL) is no exception: it has many forms of negation. Three examples are: a manual negative (a sign) which can be variously translated as “nothing”, “no-one”, “none” or “no”, eg. ship see none (“I see no ships”); a manual suffix which has the same effect as the English “not” on adjectives, eg. book index-3 bad-not (“That book isn’t bad”); and facial/head movement negation, sometimes referred to as a “non-manual feature”, shown as a negating head shake which continues throughout a negated sentence constituent, eg. party i-am-going[neg] (“I’m not going to the party”). There are several more types, and variations on those types, but these examples illustrate that generally we could divide BSL negation into “manual” forms (lexical signs) and non-manual prosodic features or “co-speech gesture”. (See Sutton-Spence & Woll 1999, pp.72-77 for more.)
How does the brain process these different forms of negation? A study by Atkinson et al. (2004) followed up previously cited ASL studies by investigating differences in hemisphere association with different forms of negation in BSL. It was found that three BSL users with RHD had difficulties in comprehending only the facial/prosodic negation, while three BSL users with LHD had no such difficulties but again displayed evidence of sign aphasia comparable to what you might see in a spoken language user. The authors suggest that “non-manual negation in sign may not be a direct surface realisation of syntax”, meaning that non-manual elements of sign language may rely on prosodic interpretation processes which take place in the right or minor hemisphere.
This has implications for everyone. Non-aphasic language impairments – problems with language which aren’t obvious at the word and sentence levels – could be neglected in existing definitions of communication disorder. Gesture and prosody are not only a feature of signed languages but are routinely used to add propositional, contextual, substantive and pragmatic content to speech utterances (Kendon 2000, pp.51-60). Spoken languages are less rich and useful when gesture and facial expression are removed, as anyone who has had an argument on the phone can tell you, so why should the same not be true for signed languages as well? There are those who are now asking good questions about the composition of signed languages: what is and is not a gesture in a signed language?
This is just a taster of the evidence that the right or non-dominant hemisphere has a subtle but important role to play in language, and we have it because of people interested in the signing brain who did not assume that language has a specific modality, that language is only spoken or written. At the very least, we can now ask better questions and challenge received wisdom about the cognition of language and the implications of left hemisphere lateralisation. In a review of various sign language studies examining the role of each hemisphere in language cognition, Bavelier et al. (1998) conclude that while “standard left hemisphere language areas may be common to all natural languages, the final organisation of the language system appears to be determined by the exact language experience of the individual”.
- Atkinson, J., Campbell, R., Marshall, J., Thacker, A. & Woll, B. (2004): “Understanding ‘not’: neuropsychological dissociations between hand and head markers of negation in BSL”, in Neuropsychologia, vol. 42, Elsevier
- Atkinson, J., Marshall, J., Woll, B. & Thacker, A. (2005): “Testing comprehension abilities in users of British Sign Language following CVA”, in Brain and Language, vol. 94, Elsevier
- Bavelier, D., Corina, D.P. & Neville, H.J. (1998): “Brain and Language: a Perspective from Sign Language”, in Neuron, vol. 21, Cell Press
- Emmorey, K. (2002): Language, Cognition and the Brain: insights from sign language research, Lawrence Erlbaum Associates
- Hickok, G., Wilson, M., Clark, K. & Klima, E.S. (1999): “Discourse deficits following right hemisphere damage in deaf signers”, in Brain and Language, vol. 66, Elsevier
- Kendon, A. (2000): “Language and gesture: unity or duality?”, in McNeill, D. (ed.) Language and Gesture, Cambridge University Press
- Sutton-Spence, R. & Woll, B. (1999): The Linguistics of British Sign Language: An Introduction, Cambridge University Press
- Woll, B. (2012): “Atypical signing”, in Pfau, R., Steinbach, M., & Woll, B. (eds.) Sign Language: an International Handbook, Mouton de Gruyter