
Sociolinguistics

CHARLES J. FILLMORE

Lucía Molina Martínez

Charles J. Fillmore
Charles J. Fillmore (1929-2014) was an American linguist. He was one of the world’s pre-eminent scholars of lexical meaning and its relationship with context, grammar and computation, and his work had an enormous impact on computational linguistics. He read Noam Chomsky’s Syntactic Structures and became an immediate proponent of the new transformational grammar. He was extremely influential in the areas of syntax and lexical semantics; he was one of the founders of cognitive linguistics, and he developed the theories of Case Grammar and Frame Semantics. In all of his research he illuminated the fundamental importance of semantics and its role in motivating syntactic and morphological phenomena. His earlier work, in collaboration with Paul Kay and George Lakoff, was generalized into the theory of Construction Grammar.
His last major project, FrameNet, is a wide-ranging on-line description of the English lexicon. In this project, words are described in terms of the frames they evoke.

His seminal publications include:


● "The Position of Embedding Transformations in a Grammar" (1963). In Word 19:208-
231.
● "The Case for Case" (1968). In Bach and Harms (Ed.): Universals in Linguistic
Theory. New York: Holt, Rinehart, and Winston, 1-88.
● "Frame semantics and the nature of language" (1976): . In Annals of the New York
Academy of Sciences: Conference on the Origin and Development of Language and
Speech. Volume 280: 20-32.
● "Frame semantics" (1982). In Linguistics in the Morning Calm. Seoul, Hanshin
Publishing Co., 111-137.
● (with Paul Kay and Mary Catherine O'Connor) "Regularity and Idiomaticity in
Grammatical Constructions: The Case of Let Alone" (1988). Language, Vol. 64, No. 3
(Sep., 1988), 501-538.
● (with Sue Atkins) "Starting where the dictionaries stop: The challenge for
computational lexicography". (1994). In Atkins, B. T. S. and A. Zampolli (Eds.)
Computational Approaches to the Lexicon. Oxford: Oxford University Press, 349-393.
● Lectures on Deixis (1997). Stanford: CSLI Publications. (originally distributed as
Fillmore (1975/1971) Santa Cruz Lectures on Deixis by the Indiana University
Linguistics Club).

Lexical Semantics:
Lexical semantics is a subfield of linguistics: the study of what the words of a language denote and how. Words may be taken to denote either things in the world or concepts, depending on the particular approach to lexical semantics.
Cognitive Linguistics:
Cognitive linguistics refers to the school of linguistics that understands language creation, learning and usage as best explained by reference to human cognition in general. It is characterized by adherence to three central positions:
● First, it denies that there is an autonomous linguistic faculty in the mind;
● Second, it understands grammar in terms of conceptualization;
● Third, it claims that knowledge of language arises out of language use.
Cognitive linguists deny that the mind has any module for language acquisition that is unique and autonomous. This stands in contrast to the work done in the field of generative grammar by Noam Chomsky. Although cognitive linguists do not necessarily deny that part of human linguistic ability is innate, they deny that it is separate from the rest of cognition. Thus, they argue that knowledge of linguistic phenomena is essentially conceptual in nature. Moreover, they argue that the storage and retrieval of linguistic data is not significantly different from the storage and retrieval of other knowledge, and that the use of language in understanding employs cognitive abilities similar to those used in other, non-linguistic tasks.
Departing from the tradition of truth-conditional semantics, cognitive linguists view meaning in terms of conceptualization. Instead of viewing meaning in terms of models of the world, they view it in terms of mental spaces.
Finally, cognitive linguistics argues that language is both embodied and situated in a specific environment. This can be considered a moderate offshoot of the Sapir-Whorf hypothesis, in that language and cognition mutually influence one another, and both are embedded in the experiences and environment of their users.
Some basic aspects of cognitive linguistics:
● some shared tenets
● network architectures
● phenomena studied
● increasing understanding of how they relate
● differences with Construction Grammar and other related theories

Case Grammar
Case Grammar is a system of linguistic analysis focusing on the link between the valence of a verb (the number of subjects, objects, etc. it takes) and the grammatical context it requires. The system was created by Fillmore in 1968, in the context of Transformational Grammar. It analyzes the surface syntactic structure of sentences by studying the combination of deep cases (i.e. semantic roles) such as agent, object, benefactive, location or instrument that a specific verb requires.
According to Fillmore, each verb selects a certain number of deep cases which form its case frame. A case frame thus describes important aspects of the semantic valency of verbs, adjectives and nouns. Case frames are subject to certain constraints, such as that a deep case can occur only once per sentence. Some of the cases are obligatory and others are optional; obligatory cases may not be deleted without producing ungrammatical sentences. For example, “Mary gave the apples” is ungrammatical in this sense, because the recipient is missing.
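The obligatory/optional distinction can be sketched in a few lines of Python. This is a toy illustration, not Fillmore's formalism; the particular case inventories assigned to "give" and "open" are assumptions made for the example.

```python
# Toy case frames: verb -> (obligatory deep cases, optional deep cases).
# The assignments below are illustrative assumptions, not Fillmore's own.
CASE_FRAMES = {
    "give": ({"Agent", "Object", "Dative"}, {"Time", "Locative"}),
    "open": ({"Object"}, {"Agent", "Instrument", "Locative"}),
}

def is_complete(verb, filled_cases):
    """A sentence counts as well formed here only if every obligatory
    deep case of the verb is filled, each case occurring at most once
    (sets enforce that), and no case outside the frame is used."""
    obligatory, optional = CASE_FRAMES[verb]
    filled = set(filled_cases)
    return obligatory <= filled and filled <= obligatory | optional

# "Mary gave the apples" fills Agent and Object but omits the Dative:
print(is_complete("give", {"Agent", "Object"}))            # False
print(is_complete("give", {"Agent", "Object", "Dative"}))  # True
```

Deleting an optional case (e.g. the Locative of "open") leaves the check satisfied, while deleting an obligatory one fails it, mirroring the grammaticality contrast described above.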
Kinds of cases:
The eight cases are as follows, with examples either of the English case or of the English
syntactic alternative to case:
● Nominative case: indicates the subject of a finite verb.
Ex. We went to the store. // She bought a new dress.
● Accusative case: indicates the direct object of a verb.
Ex. The clerk remembered us. // He forgot her.
● Dative case: indicates the indirect object of a verb.
Ex. The clerk gave us a discount. // He gave a flower to his mom.
● Ablative case: indicates movement away from something, or a cause.
Ex. The patient went to the doctor because she had a headache. // He was unhappy
because of depression.
● Genitive case: indicates the possessor of another noun; it roughly corresponds to the
English possessive case and the preposition of.
Ex. John’s book was on the table.
● Vocative case: indicates an addressee.
Ex. John, are you alright? // Hello, John!
● Locative case: indicates a location.
Ex. We live in China. // John is at the supermarket.
● Instrumental case: indicates an object used in performing an action:
Ex. We wiped the floor with a mop. // The essay was written by hand.
There are two notable features illustrated in the example representation. First, the cases associated with a verb seem to correspond to questions that one would naturally ask about an event: who did what to whom, when? The representation is thus well adapted to the retrieval of the information provided in a sentence. This feature was particularly appealing to psychologists and computational linguists.
A second interesting feature is that the same representation is produced by both the active and passive forms of a sentence (in the original figure, the active form is shown above the representation and the passive form below). This is consistent with the finding that we rarely recall the exact syntactic form of a sentence but do recall the basic information it provides.
To conclude:
Case grammar uses semantic case analysis to find the link between syntactically distinct sentences: “The boy will open the door with the key.” // “The key will open the door.” Key is a prepositional object in the first sentence and the subject in the second. However, the relationship between key, open and door is identical in both sentences.
Case grammar thus allows a word to have the same semantic role despite having a different syntactic designation.
Case analysis looks at the relationships between concepts in terms of:
● Agent: who?
● Time: specifies tense
● Instrument: with what?
● Relation: the action
● Recipient: to what?
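The point about identical roles under different syntax can be made concrete with a small sketch. The analyses below are hypothetical hand-assignments using the Agent/Instrument/Relation/Recipient labels above; nothing here is an actual parser.

```python
# Hand-built case analyses of the two example sentences. The role
# assignments are illustrative, following the scheme listed above.
sentence1 = {  # "The boy will open the door with the key."
    "Relation": "open", "Agent": "boy",
    "Instrument": "key", "Recipient": "door",
}
sentence2 = {  # "The key will open the door."
    "Relation": "open", "Instrument": "key", "Recipient": "door",
}

def role_of(analysis, concept):
    """Return the deep case a concept fills, regardless of its
    surface syntactic position."""
    for role, filler in analysis.items():
        if filler == concept:
            return role

# "key" is a prepositional object in one sentence and the subject in
# the other, yet it fills the same semantic role in both:
print(role_of(sentence1, "key"), role_of(sentence2, "key"))  # Instrument Instrument
```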
The case grammar approach is supported by normal readers’ difficulties with ambiguous sentences in which the “obvious” syntactic structure results in an unintelligible sentence.
Case grammar specifies two processes that take place during normal comprehension:
1. We start comprehension immediately: we don’t wait for the end of the sentence in order
to begin our understanding.
2. This analysis is a process of assigning roles to each of the concepts within the sentence.
The influence of case grammar on contemporary linguistics has been significant, to the extent that numerous linguistic theories incorporate deep roles in one form or another, such as the so-called Thematic Structure in Government and Binding theory. It has also inspired the development of frame-based representations in AI research.
Frame semantics:
Frame semantics is a theory of linguistic meaning developed by Charles J. Fillmore that extends
his earlier case grammar. It relates linguistic semantics to encyclopedic knowledge. The basic
idea is that one cannot understand the meaning of a single word without access to all the
essential knowledge that relates to that word. For example, one would not be able to understand
the word "sell" without knowing anything about the situation of commercial transfer, which also
involves, among other things, a seller, a buyer, goods, money, the relation between the money
and the goods, the relations between the seller and the goods and the money, the relation
between the buyer and the goods and the money and so on.
Thus, a word activates, or evokes, a frame of semantic knowledge relating to the specific
concept to which it refers (or highlights, in frame semantic terminology).
The theory applies the notion of a semantic frame, also used in artificial intelligence: a collection of facts that specify "characteristic features, attributes, and functions of a denotatum, and its characteristic interactions with things necessarily or typically associated with it." A semantic frame can also be defined as a coherent structure of related concepts, such that without knowledge of all of them one does not have complete knowledge of any one; in that sense they are a type of gestalt. Frames are based on recurring experiences; the commercial transaction frame, for example, is based on recurring experiences of commercial transactions.
Words not only highlight individual concepts, but also specify a certain perspective from which
the frame is viewed. For example "sell" views the situation from the perspective of the seller and
"buy" from the perspective of the buyer. This, according to Fillmore, explains the observed
asymmetries in many lexical relations.
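The buy/sell asymmetry can be illustrated with a toy frame: one shared set of participants, with each verb profiling a different participant as its subject. The participant names and the "pay" entry are illustrative assumptions, not FrameNet's actual frame definitions.

```python
# One shared commercial-transaction frame (participant names assumed
# for illustration), viewed from different perspectives by each verb.
COMMERCIAL_TRANSACTION = {"Buyer", "Seller", "Goods", "Money"}

PERSPECTIVES = {
    "buy":  {"subject": "Buyer",  "object": "Goods"},
    "sell": {"subject": "Seller", "object": "Goods"},
    "pay":  {"subject": "Buyer",  "object": "Money"},
}

def perspective(verb):
    """Which frame participant the verb profiles as its subject."""
    return PERSPECTIVES[verb]["subject"]

# All three verbs evoke the same frame; they differ only in viewpoint:
print(perspective("buy"), perspective("sell"))  # Buyer Seller
```

The asymmetry between "buy" and "sell" is thus located not in the frame itself but in which participant each verb foregrounds.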
While originally applied only to lexemes, frame semantics has now been expanded to grammatical constructions and other larger and more complex linguistic units, and has more or less been integrated into construction grammar as its main semantic principle. Semantic frames are also increasingly used in information modeling, for example in Gelish, especially in the form of 'definition models' and 'knowledge models'.
FrameNet:
In the 1990s, Fillmore taught classes in computational lexicography at the University of Pisa,
where he met Sue Atkins, who was conducting frame-semantic analyses from a lexicographic
perspective. In their subsequent discussions and collaborations, Fillmore came to acknowledge
the importance of considering corpus data. They discussed the "dictionary of the future", in which
every word would be linked to example sentences from corpora.
The FrameNet corpus is a lexical database of English that is both human- and machine-
readable, based on annotating examples of how words are used in actual texts. FrameNet is
based on a theory of meaning called Frame Semantics, deriving from the work of Charles J.
Fillmore and colleagues. The basic idea is straightforward: that the meanings of most words can
best be understood on the basis of a semantic frame: a description of a type of event, relation, or
entity and the participants in it. For example, the concept of cooking typically involves a person
doing the cooking (Cook), the food that is to be cooked (Food), something to hold the food while
cooking (Container) and a source of heat (Heating_instrument). In the FrameNet project, this is
represented as a frame called Apply_heat, and the Cook, Food, Heating_instrument and
Container are called frame elements (FEs). Words that evoke this frame, such as fry, bake, boil,
and broil, are called lexical units (LUs) of the Apply_heat frame. The job of FrameNet is to define
the frames and to annotate sentences to show how the FEs fit syntactically around the word that
evokes the frame.
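The Apply_heat example can be rendered as a toy data structure. This is not the FrameNet database or its API (NLTK provides a real `framenet` corpus reader); it simply mirrors the frame, FEs and LUs named in the paragraph above.

```python
# Toy rendering of the Apply_heat frame described above, with its
# frame elements (FEs) and lexical units (LUs). Real FrameNet data
# is far richer; only the example's content is reproduced here.
APPLY_HEAT = {
    "name": "Apply_heat",
    "FEs": ["Cook", "Food", "Heating_instrument", "Container"],
    "LUs": ["fry", "bake", "boil", "broil"],
}

def evoking_frame(word, frames):
    """Find which frame a word evokes, i.e. which frame lists it
    among its lexical units; None if no frame matches."""
    return next((f["name"] for f in frames if word in f["LUs"]), None)

print(evoking_frame("boil", [APPLY_HEAT]))  # Apply_heat
```

Annotation, in these terms, is the task of marking which spans of a real sentence ("Matilde fried the catfish in a heavy iron skillet") fill each FE of the frame its verb evokes.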
BIBLIOGRAPHY:
http://linguistictheoryevolution.blogspot.com/2012/05/charles-fillmores-grammatical-cases.html
https://es.wikipedia.org/wiki/Charles_J._Fillmore
https://en.wikipedia.org/wiki/Frame_semantics_(linguistics)
http://brenocon.com/Fillmore%201982_2up.pdf
http://www.nltk.org//howto/framenet.html
