Assignment
On Applications/Uses of Computers in Literature and Linguistics
What would the machines do? Count angels in Milton? Measure Hemingway's sentences?
That kind of scholarship, a lot of humanists believed, was dreary enough when done slowly and
without computers. Would mathematical profiles of style determine if St. Paul wrote the Epistles,
or if Thomas More wrote one of Shakespeare's plays? Several ambitious computerized studies
along these lines proved extremely controversial.
Yet computers have grown popular in a wide variety of relatively mechanical tasks. From
etymology to bibliography, from the analysis of words in foreign languages to the quick
manipulation of cumbersome texts, computers have made significant changes in research
methods.
Nearly all of ancient Greek literature, from Homer to the sixth century A.D., is now available on computer tape, and it can be manipulated with computer programs that elucidate grammar and perform other analytic tasks.
The project's equipment includes a reading machine that scans printed documents and
automatically enters them into a computer, as well as laser printers and a computerized type font
that spews out Chinese and other non-Roman characters.
Identifying Stylistic Quirks:
Computerized literary studies are a notably international field. Last April, 120 experts from
Israel, Taiwan, Singapore, Western Europe and the United States gathered at Louvain-la-Neuve
in Belgium for a conference of the Association of Literary and Linguistic Computing, a mainly
European group.
Participants delivered papers on a computer study of royal English legal charters, on the
new French dictionary being produced in France with the aid of some 1,700 computerized
French literary texts dating back to the 17th century, and on computerized stylistic studies of
William Blake, Thomas Carlyle, Emile Zola and Andre Gide.
For years, a band of computer scholars has been counting words, measuring sentence
length, figuring the ratios of unique words to common words and quantifying literature in dozens
of similar ways.
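The measurements described here, counting words, measuring sentence length, and figuring the ratio of unique words to total words, can be sketched in a few lines of Python. The sample sentence and the crude rule for splitting sentences below are illustrative only, not drawn from any actual stylometric study.

```python
import re

def stylometric_profile(text):
    """Compute simple stylometric counts: total words, unique words,
    type-token ratio, and average sentence length in words."""
    # Split into sentences on terminal punctuation (a crude heuristic).
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    # Tokenize into lowercase words, keeping internal apostrophes.
    words = re.findall(r"[A-Za-z']+", text.lower())
    unique = set(words)
    return {
        "words": len(words),
        "unique_words": len(unique),
        "type_token_ratio": len(unique) / len(words) if words else 0.0,
        "avg_sentence_length": len(words) / len(sentences) if sentences else 0.0,
    }

sample = "The cat sat. The cat slept. A dog barked loudly at the cat."
print(stylometric_profile(sample))
```

The type-token ratio is one of the oldest measures in this tradition: a low value suggests repetitive vocabulary, a high value a more varied one, though the figure is sensitive to text length.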
The field is known as computer stylistics and, according to the scholar who spoke in
Belgium on the development of Carlyle's style - Robert L. Oakman, professor of English at the
University of South Carolina - there are important elements of style, such as Carlyle's penchant
for Germanic syntax, that computers pick up faster than readers.
''People are always asking me, 'So what?' '' said Louis T. Milic, a professor of English at
Cleveland State University who has devoted years to the quantification of 18th-century English
literature. But Dr. Milic has stuck to his computer measurements.
Asked about non-computerized critics, Dr. Milic asserted, ''Most literary criticism of style
has been pretty subjective, intuitive, impressionistic and essentially not of much use.''
Few specialists in computer stylistics sound as radical as Dr. Milic, but they all defend the
potential value of their research.
One area of literature where computers are being used with more than average enthusiasm is
the study of religious texts. Apart from the concordance to Wesley, huge quantities of
Mormon theological materials are being recorded and manipulated by computer scientists in
Utah, and at the University of Pennsylvania a group of scholars under the direction of Robert
Kraft, professor of religious studies, is computerizing the early Greek translation of the Old
Testament known as the Septuagint.
Text Analysis:
With each year that passes, more texts become available to the scholarly community in machine-readable form, whether from text archives, such as the one maintained at Oxford University, or from commercial distributors such as Chadwyck-Healey (of Cambridge), which has released a comprehensive database of English poetry.[2] Users of such texts, however, are still confined largely to a small and specialized research community with the technical skills to make use of electronic text. The notable absence so far of computer-assisted research in the leading scholarly journals is one sign that the field is still marginal.
Willie van Peer has pointed to a basic problem with the present state of text analysis methods, which deal largely in counting words.[5] The quantification offered by text analysis enables only relatively primitive methods of examination. The frequencies of words, collocations, or particular stylistic features tell us rather little about the literary qualities of a text, since these aspects of a text find their meaning only within the larger and constantly shifting context constituted by the reading process. Text as object (a pattern of words) is a quite different entity from text as communication (a reader's interaction with a text).
This is not to argue that computer methods of analysis have no place: a number of interesting studies could be cited to show the opposite, from Oakman's analysis of Carlyle's prose style to Burrows's study of the idiolects of the characters in Jane Austen's novels.[6] The issue, rather, is that no paradigm shift (to use that much overworked concept) in our theoretical understanding has been effected by our use of computer methods in literary scholarship. Nor is one likely to occur until a much more refined and accurate understanding of human cognitive processes is available, and of the process of literary reading in particular.
There are many uses of computers in linguistics, some of which are discussed below.
Oral communication, without any doubt, is the first tale-teller of the linguistic background of every speaker, as the ear immediately picks up what is different, odd or strange from what it normally hears. Written communication does not so easily and quickly betray the linguistic signposts of the speaker. The role of phonologists is, therefore, to collect and collate the different sounds that emanate from speakers for the purpose of understanding the sound system of the spoken language.
Computational Linguistics:
Computational linguistics (CL) is a discipline between linguistics and computer science
which is concerned with the computational aspects of the human language faculty. It belongs to
the cognitive sciences and overlaps with the field of artificial intelligence (AI), a branch of
computer science aiming at computational models of human cognition. Computational linguistics
has applied and theoretical components.
Machine Translation
Speech Recognition
Text-to-Speech
Natural Language Generation
Human-Computer Dialogs
Information Retrieval
Computational Modeling
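One item on this list, natural language generation, can be caricatured with a bigram model: record which words follow which in a training text, then chain randomly chosen successors. The toy corpus below is invented for illustration; real generation systems work from grammars or far larger statistical models.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Map each word to the list of words that follow it in the training text."""
    words = text.lower().split()
    follows = defaultdict(list)
    for w1, w2 in zip(words, words[1:]):
        follows[w1].append(w2)
    return follows

def generate(follows, start, n=8, seed=0):
    """Generate up to n words by repeatedly sampling a recorded successor."""
    rng = random.Random(seed)  # fixed seed for reproducible output
    out = [start]
    for _ in range(n - 1):
        successors = follows.get(out[-1])
        if not successors:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Even this trivial model produces locally plausible word sequences, which hints at why statistical methods became central to the applications listed above.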
Natural language interfaces enable the user to communicate with the computer in French,
English, German, or another human language. Some applications of such interfaces are database
queries, information retrieval from texts, so-called expert systems, and robot control. Current
advances in the recognition of spoken language improve the usability of many types of natural
language systems. Communication with computers using spoken language will have a lasting
impact upon the work environment; completely new areas of application for information
technology will open up. However, spoken language needs to be combined with other modes of
communication, such as pointing with a mouse or finger. If such multimodal communication is
finally embedded in an effective general model of cooperation, we will have succeeded in turning
the machine into a partner.
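The database-query application mentioned above can be illustrated, in deliberately simplified form, by a pattern-matching front end over a tiny in-memory table. The regular expression and the data below are invented for this sketch; real natural language interfaces rely on full parsing and semantic analysis rather than a single pattern.

```python
import re

# A toy in-memory "database" of capital cities (illustrative data only).
CAPITALS = {"france": "Paris", "germany": "Berlin", "japan": "Tokyo"}

def answer(question):
    """Answer questions of the form 'What is the capital of X?' by
    pattern-matching the question and looking up the toy database."""
    m = re.search(r"capital of (\w+)", question.lower())
    if not m:
        return "Sorry, I only understand questions about capitals."
    country = m.group(1)
    city = CAPITALS.get(country)
    if city is None:
        return f"I have no entry for {country.title()}."
    return f"The capital of {country.title()} is {city}."

print(answer("What is the capital of France?"))  # → The capital of France is Paris.
```

The gap between this sketch and a usable interface, handling paraphrase, ambiguity, and follow-up questions, is exactly where the linguistic theory discussed in this section comes in.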