10-Minute Talks: Looking at sign languages
14 Jul 2021
Professor Bencie Woll FBA introduces research on the sign languages of deaf communities.
I’m Professor Bencie Woll, a linguist and Honorary Professor of Sign Language and Deaf Studies, based at the Deafness Cognition and Language Research Centre at University College London, and also a Fellow of the British Academy. Unlike most linguists, the languages I research are sign languages, and in particular, British Sign Language (BSL for short), the preferred language of tens of thousands of deaf people in the UK. BSL is the third most widely used indigenous language in the UK, after English and Welsh.
Like the other sign languages of deaf communities around the world, BSL is a natural language, not invented by anyone, with the first descriptions of BSL going back to the 16th century. BSL is unrelated to English and has its own vocabulary and grammar. Because of Britain's colonial history, closely related sign languages can be found in Australia (Australian Sign Language or Auslan), New Zealand (NZ Sign Language), Malta (Maltese Sign Language) and in some parts of South Africa, India, and Canada (Maritime Sign Language).
Just as with spoken language, there are many sign languages. In the European Union, for example, 23 official spoken languages and 31 sign languages have been documented. But it should be noted that the boundaries between spoken languages do not always match those between sign languages. For example, Swedish and Finnish Sign Languages are closely related, although the spoken languages are very different, while British Sign Language and American Sign Language are unrelated languages and are not mutually intelligible.
While spoken languages use movements of the mouth, tongue and teeth, sign languages use movements of both hands, the upper body and face. And of course, spoken languages are primarily received through audition, while sign languages are primarily received through vision. Recognising these modality differences and their impact on language structure has caused a revolution in linguistics, psycholinguistics and our understanding of the neurobiology of language.
I research many different aspects of BSL and in the remainder of this talk, I will discuss some of my recent research in the following areas: sign language and the evolution of human language; sign language and the brain – including studies of signers with acquired neurological impairments, such as stroke; and how children learn BSL. I’ll end with a few comments about the future of BSL.
Some writers have suggested that the earliest human language was sign language, but we cannot use modern sign languages as evidence for how human language evolved, since all modern humans have "language-ready" brains, and there are no "ancestral" sign languages older than spoken languages. More relevant for the evolution of language is the observation that all human communication is multi-modal. That is to say, speakers typically use vocal and manual/bodily gestures when speaking, and signers always use facial and body gestures when signing.
However, one striking feature of sign languages does suggest that the development of symbols to represent referents (objects, concepts, actions and so on) is far easier in the visual modality. Spoken languages do make use of iconicity – in English, for example, it is difficult to think of a word referring to a sound that is not iconic: murmur, whisper, clang, thud, meow – but most referents are not associated with sounds. In contrast, humans primarily perceive the world visually, and it is clear that actions of the hands can be used to represent qualities of a referent, such as shape or movement. Thus new sign languages can be created relatively easily.
Different human languages – spoken or signed – also differ from each other in which properties of objects and events must be expressed. This requirement to attend to certain features when speaking can be seen in spoken languages themselves. For example, in English the gender of nouns is not marked – nouns are not masculine, feminine or neuter. So in English, a word such as "friend" can be used in a general way, whereas in Italian – which does mark the gender of nouns – the speaker needs to pay attention to whether the friend is male or female, and this will change the form of the word for friend. Because every noun in Italian has a gender – not just those relating to people – Italian speakers are always forced to pay more attention to the gender of a noun than English speakers.
In sign languages, because a spatial/visual modality is used to talk about things and actions, signers express spatial properties – such as the shape of objects or the layout of a room – in a way that takes their relative positions into account far more than spoken languages do.
An example of this is that in English one would say “there is a table in my kitchen”, whereas in BSL, the signer may refer to the table in a way that reflects its actual physical position, orientation and size in the kitchen. For this reason, signers may pay greater attention to these aspects of the world than users of spoken languages. These differences may underlie some differences in the brain networks used in processing language by signers and speakers.
Therefore, in the light of these differences in form and function, we might ask whether the same areas of the brain are used to process sign language and spoken language. Remarkably, the answer to this is "mostly, yes!". Our studies comparing the areas of the brain involved in the processing of BSL sentences by Deaf signers with those involved in the processing of English sentences by hearing non-signers show a striking similarity. Both English and BSL are primarily left-lateralised, and both make use of Broca’s and Wernicke’s areas – two areas which have been known for well over 100 years to be crucial to language in the brain. Additionally, signers who have suffered damage to the brain – for example, as a result of a stroke – show very similar language problems to those of hearing speakers with damage to the same brain regions. For example, damage to the front of the brain on the left side can lead to problems with both sign and speech production – aphasia. BSL signers and English speakers with right hemisphere strokes show a loss of facial expression and understanding of affect. Questions and negation are not understood in sentences where the meaning is conveyed through intonation (in spoken languages) or visual prosody (in sign languages). In other neurological conditions in signers, such as Parkinson’s disease, disturbances in movement may result in “micro-signing” – comparable to micrographia (tiny writing) in hearing people with Parkinson’s disease.
Children usually learn their first language in interaction with their family and others in the environment. Deaf or hearing children with deaf parents learn a sign language in a parallel way to hearing children of hearing parents. But only five to ten per cent of deaf children have deaf parents. So for many deaf children, acquisition of a first language poses a great challenge. Their parents are unlikely to know BSL and, even with the provision of a cochlear implant, acquisition of spoken language may be delayed or incomplete. Where English is the only language provided to the child – often in the erroneous belief that bilingualism in BSL and English would be disadvantageous – there are frequently long-term effects of delayed first language acquisition in terms of efficiency of language processing in the brain and of educational achievement, including poor literacy and poor exam results.
Our research strongly indicates that children should be exposed to BSL during the period when they are also acquiring spoken language, in order to acquire the skills of effective communication and to develop cognition, including abstract concepts such as those linked to time and space. Early BSL input can act as a bridge later to speech and literacy. Children with signing parents have been found in several studies to have good communication, language, literacy and social-emotional development.
The history of sign languages, like that of many minority languages, cannot be separated from a study of their relationship with the majority language communities which surround them. In 2021, there are contrasting futures. On one side are pressures such as the decrease in opportunities for deaf children to use sign language with their peers as a result of the shift to mainstream education, and the possible decrease in the deaf population as a result of medical intervention and advances in genetics. On the other are increased interest and demand from the hearing community for courses in sign language (including a new GCSE in BSL), increased use of sign language in public contexts, including provision of BSL/English interpreting, and the increased pride of the deaf community in their distinctive language and culture. It is to be hoped and expected that sign languages will continue to be living languages.
This talk originally took place on 14 July 2021, part of the series The British Academy 10-Minute Talks, where the world’s leading professors explain the latest thinking in the humanities and social sciences in just 10 minutes. 10-Minute Talks are screened each Wednesday, 13:00-13:10, on YouTube and available on Apple Podcasts.