The First Word: The Search for the Origins of Language (2007) by Christine Kenneally.
A fascinating and readable popularizing book by linguist Christine Kenneally.
Part 1: Language Is Not a Thing.
Traces the historical arc of interest in the origins of language and language evolution, culminating with four recent "figures that bear much of the responsibility for the current state of the art" in the study of language evolution. Chapters:
Chomsky's signature claim is that all humans share a 'universal grammar,' ... a set of rules that can generate the syntax of every human language. ... Traditional researchers committed to Chomskyan linguistics believed that universal grammar exists in some part of our brain in a language organ that all humans possess but no other animals have. [p.25] |
His view was that "language is a uniquely human phenomenon, distinct from the adaptations of all other organisms on the planet ... Not only does language differentiate us from all other animal life; it also exists separate from other cognitive abilities like memory, perception, and even the act of speech itself. Researchers in this tradition have searched for ... a part of the brain devoted solely to linguistic skills ... [and for] the roots of language in the fine grain of the human genome, maintaining, in some cases, that certain genes may exist for the sole purpose of encoding grammar." [p.9] The Chomskyan view dominated from the mid-20th century until the 1990s.
Chomsky's legend ... stifled language evolution research during the latter half of the twentieth century. [p.39] |
no single dramatic event gave birth to human language. The Chomskyan idea of an ideal speaker and hearer confused the origins of language. [p.72] |
Part 2: If You Have Human Language ...
This Part is full of experimental results, exploring how "the language suite — what abilities you have if you have human language" [p. 10] evolved. Chapters:
Sue Savage-Rumbaugh observed two apes that had some use of sign language engage in a sign-shouting match; neither ape was willing to listen. Language, wrote Savage-Rumbaugh,
"coordinates behaviors between individuals by a complex process of exchanging behaviors that are punctuated by speech." At its most fundamental, language is an act of shared attention, and without the fundamentally human willingness to listen to what another person is saying, language would not work. |
[George Kingsley Zipf] got his graduate students to count how often particular letters appeared in different texts, like Ulysses, and plotted the frequency of each letter in descending order on a log scale. He found the slope he had plotted had a -1 gradient ... [and] that most human languages, whether written or spoken, had approximately the same slope of -1. Zipf also established that completely disordered sets of symbols produced a slope of 0 ... all elements occurred more or less equally. Zipf applied the tool to babies' babbling, and the resulting slope was closer to the horizontal. |
When Doyle and McCowan applied Zipf's law to the sounds of dolphins and plotted sound frequency:
they discovered that, like human language, it [their graph] had a slope of -1. A dolphin's signal ... had structure and complexity. ... [Furthermore] signals produced by squirrel monkeys ... [have a slope of] -0.6, suggesting that they have a less complex form of vocalization. |
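The rank-frequency analysis that Zipf, and later Doyle and McCowan, describe can be sketched in a few lines of code. The Python fragment below is a minimal illustration, not taken from the book: it counts symbol frequencies, ranks them in descending order, and fits the slope of log(frequency) against log(rank). The inline sample text is an assumption, standing in for a full text such as Ulysses or a set of recorded animal calls.

```python
# Minimal sketch (illustrative, not from the book) of a Zipf-style analysis:
# count symbol frequencies, rank them in descending order, and estimate the
# slope of log(frequency) vs. log(rank). Human language tends toward a slope
# near -1; a uniformly random symbol set comes out near 0.
from collections import Counter
import math

def zipf_slope(symbols):
    counts = Counter(symbols)
    freqs = sorted(counts.values(), reverse=True)   # frequency by rank
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    # Ordinary least-squares slope of log(frequency) against log(rank).
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Short inline sample; in practice you would feed in a long text (or a
# transcription of dolphin whistles) to get a stable estimate.
sample = "counting how often each letter of the alphabet appears in a text"
letters = [c for c in sample.lower() if c.isalpha()]
print(zipf_slope(letters))
```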
Also see the application of entropy to information, as described at [p.143-4]:
developed by Claude Shannon, who ... [calculated] how much information was actually passing through a given phone wire. Entropy can be measured regardless of what is being communicated because instead of gauging meaning, it computes the information content of a signal. The more complex a signal is, the more information it can carry. Entropy can indicate the complexity of a signal like speech or whistling, even if the person measuring the signal doesn't know what it means. ... SETI plans to use entropy to evaluate signals from outer space ... to give us an idea about the intelligence of the beings that transmitted the signal even if we can't decode the message itself. ... Human languages are approximately ninth-order entropy, which means that if you had a nine-word (or shorter) sequence from, say, English, you would have a chance of guessing what might come next. If the sequence is ten words or more, you'll have no chance of guessing the next word correctly. The simplest forms of communication have first-order entropy. Squirrel monkeys have second or third-order, and dolphins measure higher, around fourth-order. |
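A rough Python sketch, again not from the book, shows how the two quantities mentioned here could be estimated from data: plain first-order Shannon entropy of a symbol stream, and a higher-order estimate that conditions on the preceding symbols. The word-level tokenization and the toy sentence are assumptions made for the example.

```python
# Rough sketch (not from the book): Shannon entropy of a symbol stream, and
# a simple k-th order (conditional) estimate based on n-gram counts.
from collections import Counter
import math

def entropy(symbols):
    """First-order entropy: how unpredictable a single symbol is, in bits."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def conditional_entropy(symbols, k):
    """Average uncertainty of the next symbol given the previous k symbols.
    Structure in the signal (as in human language) drives this value down."""
    contexts = Counter(tuple(symbols[i:i + k]) for i in range(len(symbols) - k))
    grams = Counter(tuple(symbols[i:i + k + 1]) for i in range(len(symbols) - k))
    total = sum(grams.values())
    return -sum((n / total) * math.log2(n / contexts[g[:k]])
                for g, n in grams.items())

words = "the dog saw the cat and the cat saw the dog".split()
print(entropy(words))                 # unpredictability of isolated words
print(conditional_entropy(words, 1))  # given one word of context
```

For a structured signal like English, the conditional estimate keeps dropping as the amount of context grows, which is the sense in which the book describes human language as roughly ninth-order: a long enough context makes the next word partly predictable.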
Jackendoff calls this kind of [Chomskyan] approach "syntactocentric," meaning that syntax is regarded as the fundamental element of language. In contrast, he says, "in a number of different quarters, another approach has been emerging in distinction to Chomsky's." In this new way of accounting for structure in language, words and phrases are as important as the rules that combine them, and the idea of pure syntax is downplayed. ... Rather than thinking of syntax as a set of computational algorithms, Jackendoff and Pinker call it a "sophisticated accounting system" for tracking the various layers of relationship and meaning that can be encoded in speech and then decoded by a listener. |
The brain is very plastic, with remaining regions taking over for parts that are removed. One of its hemispheres can be removed (in a 'hemispherectomy') to prevent seizures. In the case of at least one 9-year-old, a first spoken language was acquired after such surgery.
Also discusses mirror neurons and the importance of the cerebellum in coordination of movement and language.
Language is constituted of an aggregate of different traits and processes
that have developed over time.
There was no one moment at which humans became definably human,
just as language did not appear suddenly from the ether.
... In the end, you have to be human to have human language.
... Humans won't speak or produce language unless they are taught to do so, which means that our remarkable capacity doesn't amount to much at all if someone isn't there to provide a model for how to use it. |
Part 3: What Evolves?
How did the language of our parents get here in the first place? Includes computer modeling by Kirby and Christiansen, viewing language "as a virus, one that grows and evolves symbiotically with human beings ... language shifts around and adapts itself in order to develop and survive" [p.11].
Chapters:
When did language begin? Its foundation can be traced back to our common ancestor with
primordial lizards.
Yet what we recognize as language today took shape sometime in the last six
million years.
It's clear just from the distribution of elements of the language suite
over different animal species and throughout the lineage of the human species
that language did not evolve overnight, turning us from animals into people.
The mere fact that different traits are shared with different animals suggests that
language came together in bits and pieces, step by step.
... How old is gesture? At least as old as our shared ancestor with other great apes. How old is simple syntax? Perhaps as old as our common ancestor with monkeys, which lived forty-five million years ago. ... When you look at how these pieces evolved, you can start to narrow down why they evolved. |
Part 4: Where Next?
Chapters:
Concludes with the question Kenneally put to 15 key researchers she interviewed:
If we shipwrecked a boatload of babies on the Galápagos Islands — assuming they had all the food, water, and shelter they needed to thrive — would they produce language in any form when they grew up? And if they did, how many individuals would you need for it to take off, what form might it take, and how would it change over the generations? |
Their responses ranged from 'Yes' to 'No'.
A terrific book for thoughtful students of language.