
Imam Mohammed bin Saud University

English and Translation College


Linguistics
ENG 601

Universal Grammar

By: Naif Abdulkareem Alhwaimil

Supervised by: Dr. Abdullah Alsaif


Introduction

During the first half of the 20th century, linguists who theorized about the human
ability to speak did so from the behaviorist perspective that prevailed at that time. They
therefore held that language learning, like any other kind of learning, could be explained
by a succession of trials, errors, and rewards for success. In other words, children learned
their mother tongue by simple imitation, listening to and repeating what adults said.

This view was radically challenged, however, by the American linguist Noam
Chomsky. For Chomsky, acquiring language cannot be reduced to simply developing an
inventory of responses to stimuli, because every sentence that anyone produces can be a
totally new combination of words. When we speak, we combine a finite number of
elements—the words of our language—to create an infinite number of larger structures—
sentences.

Moreover, language is governed by a large number of rules and principles,
particularly those of syntax, which determine the order of words in sentences. The term
“generative grammar” refers to the set of rules that enables us to understand sentences but
of which we are usually totally unaware. It is because of generative grammar that
everyone says “that’s how you say it” rather than “how that’s you it say”, or that the
words “Bob” and “him” cannot refer to the same person in the sentence “Bob loves him”
but can do so in “Bob knows that his father loves him.” (Note in passing that generative
grammar has nothing to do with grammar textbooks, whose purpose is simply to explain
what is grammatically correct and incorrect in a given language.)
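To make the idea of a generative rule system more concrete, the following short Python sketch (an illustration added here, not Chomsky's own formalism; the rules and vocabulary are invented) shows how a handful of rewrite rules over a finite vocabulary can generate many different sentences while fixing the order of words.

import random

# A toy set of rewrite rules: a finite vocabulary and a few syntactic rules
# can generate a very large (in principle unbounded) set of sentences.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["Det", "N", "PP"]],
    "VP":  [["V", "NP"], ["V", "NP", "PP"]],
    "PP":  [["P", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["linguist"], ["sentence"], ["child"]],
    "V":   [["studies"], ["produces"]],
    "P":   [["about"], ["near"]],
}

def generate(symbol="S"):
    """Expand a symbol by picking one of its rewrite rules at random."""
    if symbol not in RULES:          # a terminal word: nothing left to expand
        return [symbol]
    expansion = random.choice(RULES[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

for _ in range(3):
    print(" ".join(generate()))
# e.g. "the child produces a sentence about the linguist"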

Even before the age of 5, children can, without having had any formal instruction,
consistently produce and interpret sentences that they have never encountered before. It is
this extraordinary ability to use language despite having had only very partial exposure to
the allowable syntactic variants that led Chomsky to formulate his “poverty of the
stimulus” argument, which was the foundation for the new approach that he proposed in
the early 1960s.

In Chomsky’s view, the reason that children so easily master the complex
operations of language is that they have innate knowledge of certain principles that guide
them in developing the grammar of their language. In other words, Chomsky’s theory is
that language learning is facilitated by a predisposition that our brains have for certain
structures of language.

But what language? For Chomsky’s theory to hold true, all of the languages in the
world must share certain structural properties. And indeed, Chomsky and other
generative linguists like him have shown that the 5000 to 6000 languages in the world,
despite their very different grammars, do share a set of syntactic rules and principles.
These linguists believe that this “universal grammar” is innate and is embedded
somewhere in the neuronal circuitry of the human brain. And that would be why children
can select, from all the sentences that come to their minds, only those that conform to a
“deep structure” encoded in the brain’s circuits.

About Chomsky

Born in Philadelphia on December 7, 1928, Noam Chomsky was an
intellectual prodigy who went on to earn a PhD in linguistics at the University of
Pennsylvania. Since 1955, he has been a professor at Massachusetts Institute of
Technology (MIT) and has produced groundbreaking, controversial theories on human
linguistic capacity. Chomsky is widely published, both on topics in his field and on issues
of dissent and U.S. foreign policy.

Universal grammar

The concept of universal grammar has been traced to the observation of Roger
Bacon, a 13th-century Franciscan friar and philosopher, that all languages are built upon
a common grammar. The expression was popularized in the 1950s and 1960s by Noam
Chomsky and other linguists.

Universal grammar, then, consists of a set of unconscious constraints that let us
decide whether a sentence is correctly formed. This mental grammar is not necessarily
the same for all languages. But according to Chomskyian theorists, the process by which,
in any given language, certain sentences are perceived as correct while others are not, is
universal and independent of meaning.

Thus, we immediately perceive that the sentence “Robert book reads the” is not
correct English, even though we have a pretty good idea of what it means. Conversely,
we recognize that a sentence such as "Colorless green ideas sleep furiously" is
grammatically correct English, even though it is nonsense.
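This separation of form from meaning can be sketched in a few lines of Python (an illustration added here, with a made-up lexicon and toy phrase patterns, not a real parser): the check looks only at word categories, so the nonsense sentence passes while the scrambled one fails.

import re

# Toy lexicon mapping words to categories; meaning plays no role at all.
LEXICON = {
    "colorless": "Adj", "green": "Adj", "ideas": "N", "sleep": "V",
    "furiously": "Adv", "robert": "N", "book": "N", "reads": "V", "the": "Det",
}

# A few toy phrase-structure patterns encoded over category tags:
# NP -> (Det)? Adj* N ;  S -> NP V (NP)? (Adv)?
NP = r"(Det )?(Adj )*N "
SENTENCE = re.compile(rf"^{NP}V ({NP})?(Adv )?$")

def is_grammatical(sentence):
    """Judge well-formedness from categories alone (unknown words would raise KeyError)."""
    tags = " ".join(LEXICON[w.lower()] for w in sentence.split()) + " "
    return bool(SENTENCE.match(tags))

print(is_grammatical("Colorless green ideas sleep furiously"))  # True
print(is_grammatical("Robert book reads the"))                  # False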

A pair of dice offers a useful metaphor to explain what Chomsky means when he
refers to universal grammar as a “set of constraints”. Before we throw the pair of dice, we
know that the result will be a number from 2 to 12, but nobody would take a bet on its
being 3.143. Similarly, a newborn baby has the potential to speak any of a number of
languages, depending on what country it is born in, but it will not just speak them any
way it likes: it will adopt certain preferred, innate structures. One way to describe these
structures would be that they are not things that babies and children learn, but rather
things that happen to them. Just as babies naturally develop arms and not wings while
they are still in the womb, once they are born they naturally learn to speak, and not to
chirp or neigh.
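The dice metaphor can even be made literal with a couple of lines of Python (an illustrative aside, not part of the original discussion): every throw comes out differently, yet the possible outcomes are tightly constrained in advance.

import random
from collections import Counter

# Simulate many throws of a pair of dice: the mechanism varies freely,
# but the outcomes are constrained to the whole numbers 2 through 12.
counts = Counter(random.randint(1, 6) + random.randint(1, 6) for _ in range(10_000))
print(sorted(counts))      # always a subset of [2, 3, ..., 12]
print(3.143 in counts)     # False: some results are simply impossible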

Observations that support the Chomskyian view of language

Until Chomsky propounded his theory of universal grammar in the 1960s, the
empiricist school that had dominated thinking about language since the Enlightenment
held that when children came into the world, their minds were like a blank slate.
Chomsky’s theory had the impact of a large rock thrown into this previously tranquil,
undisturbed pond of empiricism.

Subsequent research in the cognitive sciences, which combined the tools of
psychology, linguistics, computer science, and philosophy, soon lent further support to
the theory of universal grammar. For example, researchers found that babies only a few
days old could distinguish the phonemes of any language and seemed to have an innate
mechanism for processing the sounds of the human voice.

Thus, from birth, children would appear to have certain linguistic abilities that
predispose them not only to acquire a complex language, but even to create one from
whole cloth if the situation requires. One example of such a situation dates back to the
time of plantations and slavery. On many plantations, the slaves came from many
different places and so had different mother tongues. They therefore developed what are
known as pidgin languages to communicate with one another. Pidgin languages are not
languages in the true sense, because they employ words so chaotically—there is
tremendous variation in word order, and very little grammar. But these slaves’ children,
though exposed to these pidgins at the age when children normally acquire their first
language, were not content to merely imitate them. Instead, the children spontaneously
introduced grammatical complexity into their speech, thus in the space of one generation
creating new languages, known as creoles.

Chomsky and the evolution of language

Many authors, adopting the approach of evolutionary psychology, believe that
language has been shaped by natural selection. In their view, certain random genetic
mutations were thus selected over many thousands of years to provide certain individuals
with a decisive adaptive advantage. Whether the advantage that language provided was in
coordinating hunting parties, warning of danger, or communicating with sexual partners
remains uncertain, however.

Chomsky, for his part, does not see our linguistic faculties as having originated
from any particular selective pressure, but rather as a sort of fortuitous accident. He bases
this view, among other things, on studies which found that recursivity—the ability to
embed one clause inside another, as in “the person who was singing yesterday had a
lovely voice”—might be the only specifically human component of language. According
to the authors of these studies, recursivity originally developed not to help us
communicate, but rather to help us solve other problems connected, for example, with
numerical quantification or social relations, and humans did not become capable of
complex language until recursivity was linked with the other motor and perceptual
abilities needed for this purpose. (Thus recursivity would meet the definition of a
spandrel offered by Stephen Jay Gould.) According to Chomsky and his colleagues, there
is nothing to indicate that this linkage was achieved through natural selection. They
believe that it might simply be the result of some other kind of neuronal reorganization.
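Recursivity itself is easy to illustrate with a short Python sketch (my own toy example, not drawn from the cited studies): the same embedding rule can apply to its own output, so clauses can nest inside clauses without any principled limit.

def embed(clause, depth):
    """Embed `clause` inside a higher clause `depth` times."""
    if depth == 0:
        return clause
    return embed(f"the neighbour claims that {clause}", depth - 1)

base = "the person who was singing yesterday had a lovely voice"
print(embed(base, 0))
print(embed(base, 2))
# -> "the neighbour claims that the neighbour claims that the person who ..."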

The minimalist program

In the 1990s, Chomsky’s research focused on what he called the “minimalist
program”, which attempted to demonstrate that the brain’s language faculties are the
minimum faculties that could be expected, given certain external conditions that are
imposed on us independently. In other words, Chomsky began to place less emphasis on
something such as a universal grammar embedded in the human brain, and more
emphasis on a large number of plastic cerebral circuits. And along with this plasticity
would come an infinite number of concepts. The brain would then proceed to associate
sounds and concepts, and the rules of grammar that we observe would in fact be only the
consequences, or side effects, of the way that language works. Analogously, we can, for
example, use rules to describe the way a muscle operates, but these rules do nothing but
explain what happens in the muscle; they do not explain the mechanisms that the brain
uses to generate these rules.

Criticisms of Chomsky’s theories

Chomsky thus continues to believe that language is “pre-organized” in some way
or other within the neuronal structure of the human brain, and that the environment only
shapes the contours of this network into a particular language. His approach thus remains
radically opposed to that of Skinner or Piaget, for whom language is constructed solely
through simple interaction with the environment. This latter, behaviourist model, in
which the acquisition of language is nothing but a by-product of general cognitive
development based on sensorimotor interaction with the world, would appear to have
been abandoned as the result of Chomsky’s theories.

Since Chomsky first advanced these theories, however, evolutionary biologists
have undermined them with the proposition that it may be only the brain’s general
abilities that are “pre-organized”. These biologists believe that to try to understand
language, we must approach it not from the standpoint of syntax, but rather from that of
evolution and the biological structures that have resulted from it. According to Philip
Lieberman, for example, language is not an instinct encoded in the cortical networks of a
“language organ”, but rather a learned skill based on a “functional language system”
distributed across numerous cortical and subcortical structures.

Though Lieberman does recognize that human language is by far the most
sophisticated form of animal communication, he does not believe that it is a qualitatively
different form, as Chomsky claims. Lieberman sees no need to posit a quantum leap in
evolution or a specific area of the brain that would have been the seat of this innovation.
On the contrary, he says that language can be described as a neurological system
composed of several separate functional abilities.

For Lieberman and other authors, such as Terrence Deacon, it is the neural
circuits of this system, and not some “language organ”, that constitute a genetically
predetermined set that limits the possible characteristics of a language. In other words,
these authors believe that our ancestors invented modes of communication that were
compatible with the brain’s natural abilities. And the constraints inherent in these natural
abilities would then have manifested themselves in the universal structures of language.

Another approach that offers an alternative to Chomsky’s universal grammar is
generative semantics, developed by linguist George Lakoff of the University of
California at Berkeley. In contrast to Chomsky, for whom syntax is independent of such
things as meaning, context, knowledge, and memory, Lakoff shows that semantics,
context, and other factors can come into play in the rules that govern syntax. In addition,
metaphor, which earlier authors saw as a simple linguistic device, becomes for Lakoff a
conceptual construct that is essential and central to the development of thought.

Lastly, even among those authors who embrace Chomsky’s universal grammar,
there are various conflicting positions, in particular about how this universal grammar
may have emerged. Steven Pinker, for instance, takes an adaptationist position that
departs considerably from the exaptation thesis proposed by Chomsky.

Universal Grammar and Second Language Acquisition

A key question now is: do learners of a second language use Universal Grammar?
In other words, can they apply the parameters to a second language, something which
must involve resetting the parameters to the L2? If they don’t reset the parameters, L1
parameters may often get in the way of learning the L2. This may mean learning an L2
requires the use of cognitive skills to overcome L1 ‘interference’.

From the perspective of UG, the L2 learner is faced with one major problem: the
need to reset parameters when the L2 has different parameters to their L1. Someone who
has English as their L1 has their head parameter set to head first; if that person is going to
learn Japanese, they need to set the parameter for the L2 to head last. Is this possible?
Can UG parameters have different settings for different languages? Or does the learner
have to use their cognitive skills rather than UG to learn where heads are placed in
Japanese? Or are L1 parameter settings also applied to the L2, meaning the learner has to
subsequently revise the parameter settings according to input? Does the L2 learner even
have access to UG? Does the older L2 learner, with their more developed cognitive skills,
even need UG? Maybe UG is only needed by infants because they have lousy cognitive
skills.
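The head-first/head-last contrast discussed above can be sketched in a few lines of Python (an illustration added here, not taken from Cook's text): the same verb and object are linearized differently depending on how the parameter is set.

def verb_phrase(verb, obj, head_parameter):
    """Order the head (the verb) and its complement according to the parameter setting."""
    if head_parameter == "head-first":    # e.g. English
        return f"{verb} {obj}"
    if head_parameter == "head-last":     # e.g. Japanese
        return f"{obj} {verb}"
    raise ValueError(f"unknown setting: {head_parameter}")

print(verb_phrase("reads", "books", "head-first"))  # English-style: "reads books"
print(verb_phrase("yomu", "hon o", "head-last"))    # Japanese-style: "hon o yomu"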

Research findings are ambiguous, and conflicting theories abound, with some
academics arguing that UG is only available for the first few years of a child’s life (the
critical period).

The Language Acquisition Device

Earlier theories regarded language acquisition as a process
of imitation and reinforcement, a kind of 'habit formation'. According to this view, the
child would learn linguistic forms by a process of analogy with other forms. The last
decades have marked the decline of this concept of language acquisition. Many
observations and studies indicate that the child cannot proceed in the acquisition of
language by relying only on a process of analogy. Such a process cannot, in fact,
account for the richness, creativity, and complexity of language, given the limited data
actually available to the child.

Later formulations of grammar acquisition in the context of generativism
postulate the existence of some kind of cognitive mechanism governing and permitting
the acquisition of language, the 'language acquisition device' (henceforth LAD). It is
undeniable that the environment affects L1 learners. In order to learn a language, children
need the incoming data, but also something that allows them to process the data they are
exposed to. In the following passage, Chomsky postulates the existence of LAD:

"Having some knowledge of the characteristics of the acquired grammars and the
limitations on the available data, we can formulate quite reasonable and fairly
strong empirical hypotheses regarding the internal structure of the language-
acquisition device that constructs the postulated grammars from the given
data"(Chomsky, 1968: 113).

According to this view, the content of LAD is a system of universal principles and
parameters fixed through the available data.
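The idea of parameters being fixed through the available data can be illustrated with a deliberately simplified Python sketch (my own illustration, not Chomsky's model): given a few observed clauses, the device chooses the head-direction setting that fits them.

def set_head_parameter(observed_clauses):
    """Each clause is a list of (tag, word) pairs with the verb tagged as 'V'."""
    head_first = 0
    head_last = 0
    for clause in observed_clauses:
        verb_index = next(i for i, (tag, _) in enumerate(clause) if tag == "V")
        if verb_index < len(clause) - 1:   # something follows the verb
            head_first += 1
        else:                              # the verb comes last
            head_last += 1
    return "head-first" if head_first >= head_last else "head-last"

english_like = [[("N", "Mary"), ("V", "reads"), ("N", "books")]]
japanese_like = [[("N", "Mary"), ("N", "hon o"), ("V", "yomu")]]
print(set_head_parameter(english_like))    # "head-first"
print(set_head_parameter(japanese_like))   # "head-last"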

There is agreement among linguists that the process of acquiring a language is
very peculiar and complex. There is, however, not much consensus about the nature of
the mechanism which governs it. In particular, various proposals have been made about
the nature of the LAD and its psychological basis.

The 'modular mind'

It is possible to assume such a mental representation by positing the existence of a
certain set of parameters and of universal principles. The central idea is that the human
mind is made up of different modules: one of them is UG (the language faculty), others
are the vocal system, vision, hearing, and so on. Lightfoot (1982: 43)
makes a distinction between a perceptual mechanism, grammar and conceptual
knowledge. Each module has a separate set of universal principles within and can be
evaluated separately. The important aspect of the 'modular mind' (Fodor, 1983) is that the
connections between modules are very different from the modules themselves: the
modules are 'hardwired' and autonomous, in other words, they are very precisely
specified and are transmitted genetically. Moreover, the information inside these modules
is said to be 'encapsulated', namely, filled with information specific to the module (e.g. the
'θ-criterion' is specific only to UG); crucially, the connection between modules is not
modular. Modules may or may not interact with each other (meaning and grammar can be
evaluated separately). If one thinks of language not as the output of a single module but
as the interface of a number of modules, then parameters are exactly what one expects:
there are different logical options, and the decision between them has to be taken on the
basis of the input data; the connection between modules is then left open to the different
types of parameter setting. In this context, the violation of a principle of grammar "can be
decided only in light of the success of the theory of the mind as a whole" (Lightfoot, 1982).
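The following loose Python sketch (my own illustration of the modular idea, not Fodor's or Lightfoot's formal proposal; the modules and principles are invented) shows the key point that each module encapsulates its own principles and can be evaluated separately, so grammar and meaning can give independent verdicts on the same sentence.

class Module:
    def __init__(self, name, principles):
        self.name = name
        self._principles = principles   # module-internal ("encapsulated") knowledge

    def evaluate(self, sentence):
        """Judge a sentence using only this module's own principles."""
        return all(principle(sentence) for principle in self._principles)

# Two toy modules with made-up principles, just to show the separation.
grammar = Module("grammar", [lambda s: not s.startswith("how that's")])
meaning = Module("concepts", [lambda s: "colorless green" not in s])

sentence = "colorless green ideas sleep furiously"
print(grammar.evaluate(sentence))   # True: acceptable to the grammar module
print(meaning.evaluate(sentence))   # False: anomalous for the meaning module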

Every module is "likely to develop in time and to have distinct initial and mature
states" (Lightfoot, 1982 : 46). This proposal involves the idea that

"the theory of grammar is a hypothesis about the initial state of the mental organ,
the innate capacity of the child, and a particular grammar conforming to this theory is a
hypothesis about the final state, the grammar eventually attained".

Works Cited
"Avram Noam Chomsky." 2014. The Biography.com website. Apr 23 2014

http://www.biography.com/people/noam-chomsky-

Cornelius, Charles. "Universal Grammar (Chomsky)." Eal360. N.p., 25 Apr. 2014. Web.
08 May 2014.

Gentile, Giuseppe. Language Acquisition and Universal Grammar. N.p., 1994-99.

"Tool Module: Chomsky’s Universal Grammar." Tool Module Chomsky’s Universal


Grammar. N.p., N.d. Web. 05 May 2014.

"Universal Grammar and Second Language Acquisition by Vivian Cook." Universal
Grammar and Second Language Acquisition by Vivian Cook. N.p., n.d. Web. 05
May 2014.
