
Human Movement Science 28 (2009) 556–565


Language as gesture
Michael C. Corballis *
Department of Psychology, University of Auckland, Private Bag 92019, Auckland 1142, New Zealand

Abstract
Language can be understood as an embodied system, expressible as gestures. Perception of these gestures depends on the mirror system, first discovered in monkeys, in which the same neural elements respond both when the animal makes a movement and when it perceives the same movement made by others. This system allows gestures to be understood in terms of how they are produced, as in the so-called motor theory of speech perception. I argue that human speech evolved from manual gestures, with vocal gestures being gradually incorporated into the mirror system in the course of hominin evolution. Speech may have become the dominant mode only with the emergence of Homo sapiens some 170,000 years ago, although language as a relatively complex syntactic system probably emerged over the past 2 million years, initially as a predominantly manual system. Despite the present-day dominance of speech, manual gestures accompany speech, and visuomanual forms of language persist in signed languages of the deaf, in handwriting, and even in such forms as texting. © 2009 Elsevier B.V. All rights reserved.

Article history: Available online 8 August 2009
PsycINFO classification: 2330; 2720
Keywords: Gesture; Language; Mirror neurons; Sign language; Speech; Evolution

1. Introduction

Language takes many forms. It may be spoken, signed, typed, handwritten, and even whistled (Carreiras, Lopez, Rivero, & Corina, 2005). Language may therefore be regarded as a product of the human mind, expressible through different media and perceived through different senses. At the same time, though, it is an embodied system, made possible through the diverse and flexible ways humans can represent information about the world using their bodies. In other words, language is based on gesture.

* Tel.: +64 9 373 7599 x88561. E-mail address: m.corballis@auckland.ac.nz
doi:10.1016/j.humov.2009.07.003


Of course, the gestural nature of language is obvious in such forms as sign language and handwriting, but even speech, which is the dominant form of language in the modern world, is better regarded as a gestural system than as a sound-based one. The realization that this is so is relatively recent. Traditionally, speech has been regarded as made up of discrete elements of sound, called phonemes, despite the fact that phonemes do not exist as discrete units in the acoustic signal (Joos, 1948), and are not discretely discernible in mechanical recordings of sound, such as a sound spectrograph (Liberman, Cooper, Shankweiler, & Studdert-Kennedy, 1967). One reason for this is that the acoustic signals corresponding to individual phonemes vary widely, depending on the contexts in which they are embedded. So long as speech is considered in auditory terms, then, it must be assumed that the acoustic signal undergoes complex transformation for individual phonemes to be perceived as such. Yet we can perceive speech at remarkably high rates, up to at least 10–15 phonemes per second, which seems at odds with the idea that some complex, context-dependent transformation is necessary.

These problems led to the so-called motor theory of speech perception, whereby speech is perceived in terms of how it is produced, rather than how it sounds (Liberman et al., 1967; see also Galantucci, Fowler, & Turvey, 2006). This in turn led to the concept of articulatory phonology (Browman & Goldstein, 1995), in which speech is described in terms of the movements, or gestures, of the six articulatory organs: the lips, the velum, the larynx, and the blade, body, and root of the tongue. Each is controlled separately, so that individual speech units are made up of different combinations of movements. The distribution of action over these articulators means that the elements overlap in time, which makes possible the high rates of production and perception.
The motor theory of speech perception was motivated by the discovery that the elements usually considered the basic particles of speech, phonemes, are not discernible in a sound spectrograph, which is a physical record of the sound patterns (Liberman et al., 1967). In contrast, speech is much more readily perceived in a physical record of the gestures that produce it. To quote Studdert-Kennedy (2005), "... as a unit of phonetic action the gesture can be directly observed by a variety of recording techniques, including X-ray, magnetic resonance imaging, and palatography" (p. 57). In other words, we seem to be wired to perceive speech as a series of gestures rather than as a sequence of sounds.

2. Mirror neurons

The motor theory of speech perception, and of gestural perception generally, was boosted with the discovery of mirror neurons. These neurons, first recorded in area F5 in the ventral premotor cortex of the monkey, fire both when the animal makes grasping movements and when it observes another individual making the same movements (Rizzolatti, Fadiga, Fogassi, & Gallese, 1996). Although mirror neurons, at least in the monkey, have to do with manual grasping, and not with speech, they offer support for a motor theory of perception, in that they map perceived movement onto its production.

It has also become apparent that mirror neurons are part of a more general mirror system that involves other regions of the brain as well. The superior temporal sulcus (STS) also contains cells that respond to observed biological actions, including grasping actions (Perrett et al., 1989), although few if any respond when the animal itself performs an action. F5 and STS are connected to area PF in the inferior parietal lobule, where there are also neurons that respond both to the execution and perception of actions. These neurons are now known as PF mirror neurons (Rizzolatti, Fogassi, & Gallese, 2001). Other areas, such as the amygdala and orbitofrontal cortex, may also be part of the mirror system.

A similar system has been inferred in humans, based on evidence from electroencephalography (Muthukumaraswamy, Johnson, & McNair, 2004), magnetoencephalography (Hari et al., 1998), transcranial magnetic stimulation (Fadiga, Fogassi, Pavesi, & Rizzolatti, 1995), and functional magnetic resonance imaging (fMRI) (Iacoboni et al., 1999). The mirror system in humans appears to involve areas in the frontal, temporal, and parietal lobes that are homologous to those in the monkey, although there is some evidence that these areas tend to be lateralized to the left hemisphere in humans (Fecteau, Lassonde, & Theoret, 2005; Iacoboni et al., 1999; Nishitani & Hari, 2000).
It is well established that manual apraxia, especially for actions involving fine motor control, is associated with left-hemisphere damage (Heilman, Meador, & Loring, 2000).


In humans, at least, the mirror system is tuned to a wide variety of actions, and especially those related to human movement. An fMRI study shows, for example, that it is activated when people watch mouth actions, such as biting, lip-smacking, and oral movements involved in vocalization (e.g., speech reading, barking), performed by people, but not when they watch such actions performed by a monkey or a dog. Actions belonging to the observer's own motor repertoire are mapped onto the observer's motor system, while those that do not belong are not; instead, they are perceived in terms of their visual rather than their motor properties (Buccino et al., 2004). Watching speech movements, and even stills of a mouth making a speech sound, activates the mirror system, including Broca's area, which is one of the main cortical areas underlying the production of speech (Calvert & Campbell, 2003).

If human language, including speech, is essentially a matter of producing, perceiving, and interpreting gestures, then it is natural to suppose that language, whether spoken or signed, incorporates the mirror system. Of course, the grasping movements of monkeys are not language, but they may well have provided the platform from which more complex interpretative structures evolved. The area of the human brain that corresponds most closely to area F5 in the monkey includes Broca's area, suggesting that speech itself was inducted into the mirror system. Quite apart from any involvement in language, though, the mirror system in humans seems to have properties that are somewhat language-like. For example, unlike the mirror system in monkeys, the human mirror system appears to be activated by movements that need not be directed toward an object (Rizzolatti et al., 2001), although there is evidence that it is activated more by actions that are object-directed than by those that are not (Muthukumaraswamy et al., 2004).
Activation by non-object-directed action may reflect adaptation of the system for more abstract signaling. But the neural areas involved in language, and not just Broca's area, seem to overlap with the mirror system itself, providing strong evidence that language developed from the mirror system, and is part of a more general capacity for the understanding of biological motion.

One might also expect the mirror system to be activated by writing, or by stimuli associated with writing, although there is little direct evidence to date that this is so. Indirect evidence, though, comes from a rare condition known as echographia, in which patients automatically translate visual or auditory stimuli into writing. Patients with this disorder typically have lesions of the medial frontal and temporal cortices. It has been suggested that the fronto-parietal circuits of the mirror system are spared in these patients, resulting in the release of incessant, automatic writing induced by the perception of written or spoken words (Berthier, Pulvermuller, Green, & Higueros, 2006).

Among the primates, at least, the incorporation of vocalization into the mirror system may be unique to humans, although there are mirror neurons in the monkey that do respond to the sounds of actions, such as the sound of tearing paper or of nuts being cracked open (Kohler et al., 2002). But these sounds are based on manual action, and monkey calls themselves did not activate the system. A likely reason for this is that there is little if any intentional control over vocalization in non-human primates. Indeed, vocalization in non-human primates appears to be primarily under limbic control, whereas incorporation of vocal control into the pyramidal system, providing the level of intentional control necessary for speech, is unique to humans (Ploog, 2002). Even in the chimpanzee, a species closely related to humans, voluntary control of vocalization appears to be extremely limited, at best (Goodall, 1986).

3. Evolution of language

3.1. The gestural theory

The idea that language evolved from manual rather than vocal gestures is often attributed to the 18th-century philosopher Condillac (1971 [1746]), and was revived in the modern era by Hewes (1973). Although the idea was controversial at the time, and remains so, it continues to be advocated, often independently (e.g., Armstrong, 1999; Armstrong, Stokoe, & Wilcox, 1995; Armstrong & Wilcox, 2007; Corballis, 2002; Gentilucci & Corballis, 2006; Givón, 1995; Rizzolatti & Arbib, 1998; Rizzolatti & Sinigaglia, 2008). Part of the evidence comes from attempts to teach language to great apes, and


especially to our closest relatives, the chimpanzee and bonobo. It soon emerged that chimpanzees were essentially unable to speak; in one famous example, a baby chimpanzee reared in a human family proved able to articulate only three or four words, and these were actually whispered rather than vocalized. He was soon outstripped by the human children in the family (Hayes, 1952). It was then realized that the failure to speak may have resulted from deficiencies of the vocal apparatus, and perhaps of cortical control of vocal output, rather than a failure of language itself. Subsequent attempts to teach language to non-human primates have therefore been based on manual action and visual representations. For example, the chimpanzee Washoe was taught over 100 manual signs, based loosely on American Sign Language, and was able to combine signs into two- or three-word sequences to make simple requests (Gardner & Gardner, 1969). The bonobo Kanzi has an even larger vocabulary, but his ability to construct meaningful sequences appears to be limited to only two or three words. Nevertheless, Kanzi has shown an impressive ability to follow instructions conveyed in spoken sentences of as many as seven or eight words (Savage-Rumbaugh, Shanker, & Taylor, 1998).

There seems to be a general consensus, though, that these exploits are still not language; as Pinker (1994, p. 340) put it, the great apes "just don't get it." Kanzi's ability to understand spoken sentences, although seemingly impressive, was shown to be roughly equivalent to that of a two-and-a-half-year-old girl (Savage-Rumbaugh et al., 1998), and is probably based on the extraction of two or three key words rather than a full decoding of the syntax of the sentences. His ability to produce symbol sequences is also at about the level of the average two-year-old human.
In human children, grammar typically emerges between the ages of two and four, so the linguistic capabilities of Kanzi and other great apes are generally taken as equivalent to those of children in whom grammar has not yet emerged. Bickerton (1995, p. 339) wrote that the chimps' "abilities at anything one would want to call grammar were next to nil," and has labeled this pre-grammatical level of linguistic performance "protolanguage." Even so, manuovisual communication in these apes is much closer to language than are their vocalizations.

The study of Kanzi and other linguistic apes is based on attempts to teach forms of human language, but further evidence comes from a comparison of the natural communications of animals in the wild. Animal calls are communicative, but otherwise have none of the essential properties of language. Where language is flexible and conveys propositional information, animal calls are typically stereotyped and inflexible, and convey information about fixed situations, such as threat of predation, mating, and territorial claims. In our closest non-human relatives, chimpanzees and bonobos, bodily gestures are much less tied to context than are vocalizations (Pollick & de Waal, 2007). Freedom from context is one of the characteristics of language.

Language is critically dependent on social learning and on sensitivity to attentional state. Relatively few species are capable of vocal learning; these include elephants, seals, killer whales, and some birds, but of the primates only humans are vocal learners (Jarvis, 2006). Indeed, the adaptations necessary for flexible control of vocalization appear to have emerged late in hominin evolution, well after the split of the hominins from the line leading to modern chimpanzees and bonobos, and the control of vocalization sufficient to sustain autonomous speech was possibly not complete until the emergence of our own species, Homo sapiens, within the past 200,000 years (Lieberman, 1998).
In marked contrast to the inflexibility of vocalizations in primates, the communicative bodily gestures of gorillas (Pika, Liebal, & Tomasello, 2003), chimpanzees (Liebal, Call, & Tomasello, 2004), and bonobos (Pika, Liebal, & Tomasello, 2005) are both subject to social learning and sensitive to the attentional state of the recipient, both prerequisites for language. Thus our great-ape heritage ensured that our hominin forebears were much better preadapted to a form of language based on manual and other bodily gestures than to one based on vocalization. The likely scenario, then, is that language evolved from a system of manual gestures, with the gradual incorporation of vocalizations (Corballis, 2002, 2003, 2006; Gentilucci & Corballis, 2006).

3.2. Language as a big bang

Some authors have proposed that true language evolved in a single step, perhaps with the emergence of our own species, Homo sapiens. This so-called big bang theory is often attributed to Bickerton (1995), who wrote that "... true language, via the emergence of syntax, was a catastrophic event, occurring within the first few generations of Homo sapiens sapiens" (p. 69). Even more radically, Crow (2002) has proposed that a genetic mutation gave rise to the speciation of Homo sapiens, along with such uniquely human attributes as language, cerebral asymmetry, theory of mind, and a vulnerability to psychosis.

Part of the argument for a late and sudden emergence of grammatical language is that the development of manufacture in Homo was in fact very slow. As Bickerton (2002) put it, for the first 1.95 million years after the emergence of erectus "almost nothing happened: The clunky stone tools became less clunky and slightly more diversified stone tools, and everything beyond that, from bone tools to supercomputers, happened in the last one-fortieth of the period in question" (p. 104). Although there was something of an advance some 300,000 years ago, it was not until the emergence of Homo sapiens that manufacture really began to progress, and perhaps only in the last 40,000 years that so-called modern behavior became truly evident, in what has been termed the human revolution (Mellars & Stringer, 1989). In Bickerton's view, these late developments are evidence of a recent big bang not only in language but in the complexity of human thought and behavior.¹

Another reason to suppose that language may have emerged only recently in the evolution of Homo is that speech itself appears to have been a recent development. As noted earlier, it has proven virtually impossible to teach chimpanzees to speak, and the fossil evidence suggests that the alterations to the vocal tract and breathing apparatus necessary for articulate speech were completed late in hominid evolution (e.g., Lieberman, 1998; Lieberman, Crelin, & Klatt, 1972; MacLarnon & Hewitt, 1999), and perhaps only with the emergence of our own species, Homo sapiens, some 170,000 years ago.
Thus Lieberman's (1998) book, Eve Spoke: Human Language and Human Evolution, apparently equates the late emergence of speech with the late emergence of language itself. The flaw in this argument, though, is that language is not necessarily equated with speech, as explained below.

3.3. Big bang refuted: A gradualist theory

Language is a highly complex system, involving rules that are not yet fully understood even by professional linguists. There are phonological rules governing the formation of phonemes (whether conceived as speech sounds or manual signs) and morphosyntactic rules governing the formation of morphemes and sentences. This double system of rules is known as duality of structure. The sheer complexity of language, then, makes it highly unlikely that language evolved in a single big bang, or indeed that it was restricted to our own species.

The notion that language evolved from manual gestures allows for a much more continuous view of language evolution, with vocalizations gradually added to the gestural repertoire, achieving dominance, perhaps, with the emergence of Homo sapiens. And even today, language can of course exist without speech. Reading and writing can be accomplished silently, and although these skills are perhaps parasitic on speech, a skilled reader may have little access to the sounds of the words she reads. More compellingly, it is now well documented that signed languages have all of the grammatical sophistication of spoken languages, and in many deaf people have no connection with speech at all (Armstrong et al., 1995; Emmorey, 2002; Neidle, Kegl, MacLaughlin, Bahan, & Lee, 2000). Moreover, children exposed only to manual sign language go through the same stages of language development as those exposed to speech, even babbling in sign (Petitto & Marentette, 1991), suggesting that manual language is as natural as spoken language.
It is likely that the progression from protolanguage to more sophisticated grammatical language, far from being a big bang linked to our own species, began with the emergence of the genus Homo some 2 million years ago, well after the split between the hominin and chimpanzee lineages, but well before Homo sapiens evolved some 170,000 years ago. Manufactured stone tools, often considered to be a conceptual advance beyond the opportunistic use of sticks or rocks as tools, do not appear in the fossil record until some 2.5 million years ago, perhaps in Homo rudolfensis, a precursor to Homo erectus (Semaw et al., 1997). From some 1.8 million years ago, erectus began to migrate out of Africa into Asia and later into Europe (Tattersall, 2003), and the Acheulian industry emerged, with large bifacial tools and handaxes that seemed to mark a significant advance over the simple flaked tools of the earlier Oldowan industry (Gowlett, 1992).

Another, perhaps more compelling, pointer to the emergence of language is the increase in brain size. The brain size of the early hominins was about the same, relative to body size, as that of the present-day great apes, but from the emergence of the genus Homo some 2–2.5 million years ago brain size increased, and had doubled by about 1.2 million years ago. It reached a peak, not with Homo sapiens, but with the Neanderthals, who shared a common ancestry with modern humans from about 700,000 years ago (Noonan et al., 2006). In some individual Neanderthals, brain capacity seems to have been as high as 1800 cc, with an average of around 1450 cc. Brain size in our own species, Homo sapiens, is a little lower, with a present-day average of about 1350 cc (Wood & Collard, 1999). This is about three times the size expected for a great ape of the same body size.

During most of the period of increased brain size, it is likely that language was primarily gestural rather than vocal; we saw above that the modifications necessary for articulate speech were probably not complete until the emergence of our own species within the past 200,000 years. This implies that the increase in brain size may have played an important role in the control and sequencing of movements. It should be noted, though, that true language, whether spoken or signed, is not simply a matter of movement control. It also involves the development of complex grammatical structures, and other cognitive changes such as enhanced memory capacity and attentional control.

¹ I argue below that the big bang theory is unlikely to be true, but even if it were, it is unlikely to have included writing. Codified forms of writing did not develop until between about 4000 and 3000 BC in the Fertile Crescent, and later in other regions (Gaur, 1987). Moreover, language, whether spoken or gestured, is universal, whereas for most of human history writing was restricted to a small minority of the population.
It is not clear which aspects of language were critically dependent on the increase in brain size, or whether the increase in brain size was necessary for the motor component itself, or for specific acts such as writing. The increase in brain size corresponds at least approximately to the era known as the Pleistocene, usually dated from about 1.8 million years to about 10,000 years ago (e.g., Janis, 1993), although it has been argued that it should be dated from as early as 2.58 million years ago (Suc, Bertini, Leroy, & Suballyova, 1997), which corresponds more closely to the emergence of the genus Homo.

With the global shift to a cooler climate after 2.5 million years ago, much of southern and eastern Africa probably became more open and sparsely wooded (Foley, 1987). This left the hominins not only more exposed to attack from dangerous predators, such as saber-tooth cats, lions, and hyenas, but also obliged to compete with them as carnivores. The solution was not to compete on the same terms, but to establish what Tooby and DeVore (1987) called the "cognitive niche," relying on social cooperation and intelligent planning for survival. As Pinker (2003, p. 27) put it, it became increasingly important to encode, and no doubt express, information as to who did what to whom, when, where, and why. The problem is that the number of combinations of actions, actors, locations, time periods, implements, and so forth that define episodes becomes very large, and a system of holistic calls to describe those episodes rapidly taxes the perceptual and memory systems. Syntax may then have emerged as a series of rules whereby episodic elements could be combined.

4. Varieties of expression

As noted at the outset of this article, language can be expressed in a variety of ways. As an embodied system, it is natural to suppose that it began as a system of manual gestures. As primates, humans are preadapted for intentional movements of the limbs, especially the hands, and bipedalism in the hominins would have enhanced the ability to represent events using the hands. But even language based on manual gestures, as in the signed languages of the deaf, quickly becomes conventionalized and loses most of its iconic or mimetic aspect. Once language develops as a conventionalized symbolic system, it can be maintained by culture, and can take different forms.

Vocal language offers a number of advantages over a manual system, and in the course of hominin evolution there was probably selective pressure to introduce vocal elements, to the point that speech eventually became dominant. This process was undoubtedly slow, because our primate forebears had little if any intentional control over vocalization, and were largely incapable of vocal learning. The evolution of speech involved significant changes to the vocal apparatus, control of breathing, and incorporation of vocalization into the mirror system. There is reason to believe that these changes were not complete until the emergence of our own species, Homo sapiens (Lieberman, 1998, 2002).

So what are the advantages of speech over manual language? Perhaps the most important is that it frees the hands for other activities, such as carrying and manufacture. It also allows people to speak and use tools at the same time, leading perhaps to pedagogy, and a significant advancement in technology (Corballis, 2002). The advantages of speech over manual gesture may even help explain why humans eventually predominated over other large-brained hominins, including the Neanderthals, who died out some 30,000 years ago. By the same token, it may help explain the so-called human revolution (Mellars & Stringer, 1989), manifest in the dramatic appearance of more sophisticated tools, bodily ornamentation, art, and perhaps music, dating from some 40,000 years ago in Europe, and probably earlier in Africa (McBrearty & Brooks, 2000; Oppenheimer, 2003). Through speech, humans achieved what has been termed modernity: unlike other primates, or indeed other hominins, we live in an accelerating spiral of technology and cultural complexity (Corballis, 2004).

These were not the only advantages associated with speech. For one thing, speech is much less energy-consuming than manual gesture. Anecdotal evidence from courses in sign language suggests that the instructors require regular massages in order to meet the sheer physical demands of sign-language expression. In contrast, the physiological costs of speech are so low as to be nearly unmeasurable (Russell, Cerny, & Stathopoulos, 1998). In terms of expenditure of energy, speech adds little to the cost of breathing, which we must do anyway to sustain life. Speech is also less attentionally demanding than signed language; one can attend to speech (or at least pretend to) with one's eyes shut, or when watching something else.
Speech also allows communication over longer distances, as well as communication at night or when the speaker is not visible to the listener. The San, a modern hunter-gatherer society, are known to talk late at night, sometimes all through the night, to resolve conflict and share knowledge (Konner, 1982).

Of course, there are also advantages to visual language over vocal language. Vocal language is denied to those unable to hear or to speak, and signed languages form a natural substitute. Visual language is also more iconic, and most people resort to gesture, or drawing, when trying to communicate with those who speak a different language. Some sort of manual gesture is necessary even for the acquisition of speech; in learning the names of objects, for example, there must be some means of indicating which object has which name. Even adults gesture as they speak, and their gestures can add a significant component of meaning (Goldin-Meadow & McNeill, 1999). For example, people regularly point to indicate directions: "He went that way ...", accompanied by pointing. Language has still not fully escaped its manual origins.

An important function of language is storage. Nonliterate societies tell stories to maintain historical records and cultural mythologies through the generations. The invention of writing enabled information to be stored in more tangible and lasting fashion. Some scripts are of course based on spoken language, but others, such as Chinese, are primarily visual, with relatively little connection to the spoken word. Of course, with modern recording devices, information can be stored in auditory form, or with modern computers in digital forms that can be retrieved as either visual or auditory sequences. We may even be seeing a reversion to manuovisual language, as texting seems to be taking over as the dominant form of interpersonal communication; Bianchi and Phillips (2005) report that younger adults use text on their mobile phones more than older adults do.
More disturbing, perhaps, is that texters tend to be more lonely and socially anxious than those who talk on their cellphones, and more likely to disclose their "real self" through text than through voiced exchanges (Reid & Reid, 2004). Humans are vastly flexible and creative in the ways they communicate and store information; indeed, perhaps the most important sources of change in human history and prehistory have to do with ways of communicating and storing information, beginning with the switch from manual to vocal language, and continuing with tablets, books, phonograph records, tape recorders, videotapes, computers, and cellphones.

Communication not only enables the transmission of propositional information, but also reflects individual idiosyncrasies and emotional states. Both speech and signing carry the imprint of individual identity, and can vary within individuals depending on mood or pathological states. The same is true of handwriting, which has been shown to be sensitive to depression (Mergl et al., 2004), aging (Slavin, Phillips, & Bradshaw, 1996), schizophrenia (Gallucci, Phillips, Bradshaw, Vaddadi, &


Pantelis, 1997), degenerative disorders (Slavin, Phillips, Bradshaw, Hall, & Presnell, 1999), and the chewing of nicotine gum (Tucha & Lange, 2004)!

5. Summary

I have argued in this paper that language has its origins in manual gesture rather than in animal calls. This position is based on the role of the mirror system in mediating the perception and production of intentional movements, the flexibility and intentionality of manual gestures and the relative inflexibility of vocal calls in non-human primates, the ease with which people can learn signed languages when deprived of speech, and the prominent role of manual gestures in ordinary human conversation. Anatomical and archeological evidence suggests, in fact, that speech emerged late in human evolution, perhaps replacing a predominantly gestural language only with the emergence of our own species, Homo sapiens, some 200,000 years ago. Although the earliest true languages may have been manual, writing developed well after the emergence of speech, and was parasitic on it. Nevertheless, writing is itself a form of gesture, and probably involves the mirror system. In that sense, it may have roots that go far back in primate evolution.

Acknowledgments

A number of colleagues and friends have helped me develop the ideas expressed in this paper, over many years. They include Michael Arbib, Dick Byrne, Tecumseh Fitch, Maurizio Gentilucci, Russell Gray, Jim Hurford, Giacomo Rizzolatti, Michael Studdert-Kennedy, and Andrew Whiten. Not all of them agree with me.

References
Armstrong, D. F. (1999). Original signs: Gesture, sign, and the source of language. Washington, DC: Gallaudet University Press.
Armstrong, D. F., Stokoe, W. C., & Wilcox, S. E. (1995). Gesture and the nature of language. Cambridge: Cambridge University Press.
Armstrong, D. F., & Wilcox, S. E. (2007). The gestural origin of language. Oxford: Oxford University Press.
Berthier, M. L., Pulvermüller, F., Green, C., & Higueros, C. (2006). Are release phenomena explained by disinhibition of mirror neuron circuits? Arnold Pick's remarks on echographia and their relevance for modern cognitive neuroscience. Aphasiology, 20, 462–480.
Bianchi, A., & Phillips, J. G. (2005). Psychological predictors of problem mobile phone use. CyberPsychology and Behavior, 8, 39–51.
Bickerton, D. (1995). Language and human behavior. Seattle, WA: University of Washington Press.
Bickerton, D. (2002). From protolanguage to language. In T. J. Crow (Ed.), The speciation of modern Homo sapiens (pp. 103–120). Oxford: Oxford University Press.
Browman, C. P., & Goldstein, L. F. (1995). Dynamics and articulatory phonology. In T. van Gelder & R. F. Port (Eds.), Mind as motion. Cambridge, MA: MIT Press.
Buccino, G., Lui, F., Canessa, N., Patteri, I., Lagravinese, G., Benuzzi, F., et al. (2004). Neural circuits involved in the recognition of actions performed by nonconspecifics: An fMRI study. Journal of Cognitive Neuroscience, 16, 114–126.
Calvert, G. A., & Campbell, R. (2003). Reading speech from still and moving faces: The neural substrates of visible speech. Journal of Cognitive Neuroscience, 15, 57–70.
Carreiras, M., Lopez, J., Rivero, F., & Corina, D. (2005). Neural processing of a whistled language. Nature, 433, 31–32.
Condillac, E. B. de. (1971). An essay on the origin of human knowledge (T. Nugent, Trans.). Gainesville, FL: Scholars' Facsimiles and Reprints (Originally published 1746).
Corballis, M. C. (2002). From hand to mouth: The origins of language. Princeton, NJ: Princeton University Press.
Corballis, M. C. (2003). From mouth to hand: Gesture, speech, and the evolution of right-handedness. Behavioral and Brain Sciences, 26, 199–260.
Corballis, M. C. (2004). The origins of modernity: Was autonomous speech the critical factor? Psychological Review, 111, 543–552.
Corballis, M. C. (2006). Evolution of language as a gestural system. Marges Linguistiques, 11, 218–229.
Crow, T. J. (2002). Sexual selection, timing, and an XY homologous gene: Did Homo sapiens speciate on the Y chromosome? In T. J. Crow (Ed.), The speciation of modern Homo sapiens (pp. 197–216). Oxford, UK: Oxford University Press.
Emmorey, K. (2002). Language, cognition, and brain: Insights from sign language research. Hillsdale, NJ: Erlbaum.
Fadiga, L., Fogassi, L., Pavesi, G., & Rizzolatti, G. (1995). Motor facilitation during action observation: A magnetic stimulation study. Journal of Neurophysiology, 73, 2608–2611.
Fecteau, S., Lassonde, M., & Théoret, H. (2005). Modulation of motor cortex excitability during action observation in disconnected hemispheres. Neuroreport, 16, 1591–1594.
Foley, R. (1987). Another unique species: Patterns in human evolutionary ecology. Harlow: Longman Scientific and Technical.
Galantucci, B., Fowler, C. A., & Turvey, M. T. (2006). The motor theory of speech perception reviewed. Psychonomic Bulletin and Review, 13, 361–377.
Gallucci, R. M., Phillips, J. G., Bradshaw, J. L., Vaddadi, K. S., & Pantelis, C. (1997). Kinematic analysis of handwriting movements in schizophrenic patients. Biological Psychiatry, 41, 830–833.
Gardner, R. A., & Gardner, B. T. (1969). Teaching sign language to a chimpanzee. Science, 165, 664–672.
Gaur, A. (1987). A history of writing. London: The British Library.
Gentilucci, M., & Corballis, M. C. (2006). From manual gesture to speech: A gradual transition. Neuroscience and Biobehavioral Reviews, 30, 949–960.
Givón, T. (1995). Functionalism and grammar. Philadelphia, PA: Benjamins.
Goldin-Meadow, S., & McNeill, D. (1999). The role of gesture and mimetic representation in making language the province of speech. In M. C. Corballis & S. E. G. Lea (Eds.), The descent of mind (pp. 155–172). Oxford, UK: Oxford University Press.
Goodall, J. (1986). The chimpanzees of Gombe: Patterns of behavior. Cambridge, MA: Harvard University Press.
Gowlett, J. A. J. (1992). Early human mental abilities. In S. Jones, R. Martin, & D. Pilbeam (Eds.), The Cambridge encyclopedia of human evolution (pp. 341–345). Cambridge: Cambridge University Press.
Hari, R., Forss, N., Avikainen, S., Kirveskari, E., Salenius, S., & Rizzolatti, G. (1998). Activation of human primary motor cortex during action observation: A neuromagnetic study. Proceedings of the National Academy of Sciences, USA, 95, 15061–15065.
Hayes, C. (1952). The ape in our house. London: Gollancz.
Heilman, K. M., Meador, K. J., & Loring, D. W. (2000). Hemispheric asymmetries of limb-kinetic apraxia: A loss of deftness. Neurology, 55, 523–526.
Hewes, G. W. (1973). Primate communication and the gestural origins of language. Current Anthropology, 14, 5–24.
Iacoboni, M., Woods, R. P., Brass, M., Bekkering, H., Mazziotta, J. C., & Rizzolatti, G. (1999). Cortical mechanisms of human imitation. Science, 286, 2526–2528.
Janis, C. (1993). Victors by default: The mammalian succession. In S. J. Gould (Ed.), The book of life (pp. 169–217). New York: W.W. Norton.
Jarvis, E. D. (2006). Selection for and against vocal learning in birds and mammals. Ornithological Science, 5, 5–14.
Joos, M. (1948). Acoustic phonetics. Language monograph no. 23. Baltimore, MD: Linguistic Society of America.
Kohler, E., Keysers, C., Umiltà, M. A., Fogassi, L., Gallese, V., & Rizzolatti, G. (2002). Hearing sounds, understanding actions: Action representation in mirror neurons. Science, 297, 846–848.
Konner, M. (1982). The tangled wing: Biological constraints on the human spirit. New York: Harper.
Liberman, A. M., Cooper, F. S., Shankweiler, D. P., & Studdert-Kennedy, M. (1967). Perception of the speech code. Psychological Review, 74, 431–461.
Liebal, K., Call, J., & Tomasello, M. (2004). Use of gesture sequences in chimpanzees. American Journal of Primatology, 64, 377–396.
Lieberman, D. E. (1998a). Sphenoid shortening and the evolution of modern cranial shape. Nature, 393, 158–162.
Lieberman, P. (1998b). Eve spoke: Human language and human evolution. New York: W.W. Norton.
Lieberman, P. (2002). On the nature and evolution of the neural bases of human language. Yearbook of Physical Anthropology, 45, 36–62.
Lieberman, P., Crelin, E. S., & Klatt, D. H. (1972). Phonetic ability and related anatomy of the new-born, adult human, Neanderthal man, and the chimpanzee. American Anthropologist, 74, 287–307.
MacLarnon, A., & Hewitt, G. (1999). The evolution of human speech: The role of enhanced breathing control. American Journal of Physical Anthropology, 109, 341–363.
Mellars, P. A., & Stringer, C. B. (Eds.). (1989). The human revolution: Behavioral and biological perspectives on the origins of modern humans. Edinburgh: Edinburgh University Press.
McBrearty, S., & Brooks, A. S. (2000). The revolution that wasn't: A new interpretation of the origin of modern human behavior. Journal of Human Evolution, 39, 453–563.
Mergl, R., Juckel, G., Rihl, J., Henkel, V., Karner, M., Tiggs, P., et al. (2004). Kinematic analysis of handwriting in depressed patients. Acta Psychiatrica Scandinavica, 109, 391–393.
Muthukumaraswamy, S. D., Johnson, B. W., & McNair, N. A. (2004). Mu rhythm modulation during observation of an object-directed grasp. Cognitive Brain Research, 19, 195–201.
Neidle, C., Kegl, J., MacLaughlin, D., Bahan, B., & Lee, R. G. (2000). The syntax of American Sign Language. Cambridge, MA: The MIT Press.
Nishitani, N., & Hari, R. (2000). Temporal dynamics of cortical representation for action. Proceedings of the National Academy of Sciences, USA, 97, 913–918.
Noonan, J. P., Coop, G., Kudaravalli, S., Smith, D., Krause, J., Alessi, J., et al. (2006). Sequencing and analysis of Neanderthal genomic DNA. Science, 314, 1113–1121.
Oppenheimer, S. (2003). Out of Eden: The peopling of the world. London: Constable.
Perrett, D. I., Harries, M. H., Bevan, R., Thomas, S., Benson, P. J., Mistlin, A. J., et al. (1989). Frameworks of analysis for the neural representation of animate objects and actions. Journal of Experimental Biology, 146, 87–113.
Petitto, L. A., & Marentette, P. (1991). Babbling in the manual mode: Evidence for the ontogeny of language. Science, 251, 1483–1496.
Pika, S., Liebal, K., & Tomasello, M. (2003). Gestural communication in young gorillas (Gorilla gorilla): Gestural repertoire, learning, and use. American Journal of Primatology, 60, 95–111.
Pika, S., Liebal, K., & Tomasello, M. (2005). Gestural communication in subadult bonobos (Pan paniscus): Repertoire and use. American Journal of Primatology, 65, 39–61.
Pinker, S. (1994). The language instinct. New York: Morrow.
Pinker, S. (2003). Language as an adaptation to the cognitive niche. In M. H. Christiansen & S. Kirby (Eds.), Language evolution (pp. 16–37). Oxford: Oxford University Press.
Ploog, D. (2002). Is the neural basis of vocalisation different in non-human primates and Homo sapiens? In T. J. Crow (Ed.), The speciation of modern Homo sapiens (pp. 121–135). Oxford, UK: Oxford University Press.
Pollick, A. S., & de Waal, F. B. M. (2007). Ape gestures and language evolution. Proceedings of the National Academy of Sciences, 104, 8184–8189.
Reid, D., & Reid, F. (2004). Insights into the social and psychological effects of SMS text messaging. See http://socio.ch/mobile/index_mobile.htm.
Rizzolatti, G., & Arbib, M. A. (1998). Language within our grasp. Trends in Neurosciences, 21, 188–194.
Rizzolatti, G., Fadiga, L., Fogassi, L., & Gallese, V. (1996). Premotor cortex and the recognition of motor actions. Cognitive Brain Research, 3, 131–141.
Rizzolatti, G., Fogassi, L., & Gallese, V. (2001). Neurophysiological mechanisms underlying the understanding and imitation of action. Nature Reviews Neuroscience, 2, 661–670.
Rizzolatti, G., & Sinigaglia, C. (2008). Mirrors in the brain. Oxford: Oxford University Press.
Russell, B. A., Cerny, F. J., & Stathopoulos, E. T. (1998). Effects of varied vocal intensity on ventilation and energy expenditure in women and men. Journal of Speech, Language, and Hearing Research, 41, 239–248.
Savage-Rumbaugh, S., Shanker, S. G., & Taylor, T. J. (1998). Apes, language, and the human mind. New York: Oxford University Press.
Semaw, S., Renne, P., Harris, J. W. K., Feibel, C. S., Bernor, R. L., Fesseha, N., et al. (1997). 2.5-million-year-old stone tools from Gona, Ethiopia. Nature, 385, 333–336.
Slavin, M. J., Phillips, J. G., & Bradshaw, J. L. (1996). Visual cues and handwriting of older adults. Psychology & Aging, 11, 521–526.
Slavin, M. J., Phillips, J. G., Bradshaw, J. L., Hall, K. A., & Presnell, I. (1999). Consistency of handwriting movements in dementia of the Alzheimer's type: A comparison with Huntington's and Parkinson's diseases. Journal of the International Neuropsychological Society, 5, 20–25.
Studdert-Kennedy, M. (2005). How did language go discrete? In M. Tallerman (Ed.), Language origins: Perspectives on evolution (pp. 48–67). Oxford, UK: Oxford University Press.
Suc, J.-P., Bertini, A., Leroy, S. A. G., & Suballyova, D. (1997). Towards the lowering of the Pliocene/Pleistocene boundary to the Gauss–Matuyama reversal. Quaternary International, 40, 37–42.
Tattersall, I. (2003). Once we were not alone. Scientific American, 13, 20–27.
Tooby, J., & DeVore, I. (1987). The reconstruction of hominin evolution through strategic modeling. In W. G. Kinzey (Ed.), The evolution of human behavior: Primate models (pp. 183–237). Albany, NY: SUNY Press.
Tucha, O., & Lange, K. W. (2004). Effects of nicotine on a real-life motor task: A kinematic analysis of handwriting movements in smokers and non-smokers. Psychopharmacology, 173, 49–56.
Wood, B., & Collard, M. (1999). The human genus. Science, 284, 65–71.
