Computational linguistics
[Diagram: academic fields on a rigor spectrum, from more rigorous (physics, chemistry, biology) through neuropsychology and psychology to less rigorous (sociology, literary criticism), with linguistics and engineering placed between]
[Diagram: subfields of linguistics on the same spectrum, from more rigorous (experimental phonetics, historical linguistics) to less rigorous (other areas of sociolinguistics, e.g. Deborah Tannen; theoretical linguistics, e.g. minimalist syntax)]
Computational linguistics
Often thought of as natural language engineering
Machine learning in CL
In general it's a plus, since it has meant that evaluation has become more rigorous. But it's important that the field not turn into applied machine learning. For this to be avoided, people need to continue to focus on which linguistic features are important. Fortunately, this seems to be happening.
Grammar induction:
Linguists have done a poor job at their stated goal of explaining how humans learn grammar
Some applications
Analysis of word structure: morphology
Analysis of sentence structure: part-of-speech tagging, parsing
Regular languages
A regular language is a language over a finite alphabet that can be constructed from individual symbols (and the empty string) using one or more of the following operations:
Set union
Concatenation
Transitive closure (Kleene star)
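The three operations can be illustrated with Python's `re` module, since regular expressions over a finite alphabet denote exactly these regular languages (the pattern below is an invented example, not from the slides):

```python
import re

# union (|), concatenation (juxtaposition), and Kleene star (*)
# combined in one pattern: ({ab} ∪ {cd}) · {ef}*
pattern = re.compile(r"^(ab|cd)(ef)*$")

print(bool(pattern.match("ab")))      # -> True
print(bool(pattern.match("cdefef")))  # -> True
print(bool(pattern.match("ef")))      # -> False
```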
Every regular language can be recognized by a finite-state automaton, and every finite-state automaton recognizes a regular language (Kleene's theorem).
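One direction of the theorem can be sketched concretely: a deterministic finite-state automaton, written as a transition table, recognizing the regular language (ab)* (the states and transitions are illustrative, not from the slides):

```python
# DFA for (ab)*: state 0 expects 'a', state 1 expects 'b';
# any other symbol leads to an implicit dead state.
TRANSITIONS = {(0, "a"): 1, (1, "b"): 0}
START, ACCEPT = 0, {0}

def accepts(s):
    state = START
    for ch in s:
        if (state, ch) not in TRANSITIONS:
            return False  # dead state: reject
        state = TRANSITIONS[(state, ch)]
    return state in ACCEPT

print(accepts("abab"), accepts("aba"))  # -> True False
```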
Finite-state transducers
An FST
Composition
In addition to union, concatenation and Kleene closure, regular relations are closed under composition. Composition is to be understood here the same way as composition in algebra:
R1 ∘ R2 means take the output of R1 and feed it to the input of R2
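This definition can be sketched with relations as Python sets of (input, output) pairs, standing in for transducers (the relations below are toy examples):

```python
# Compose two relations: feed each output of R1 to the input side of R2.
def compose(r1, r2):
    return {(x, z) for (x, y1) in r1 for (y2, z) in r2 if y1 == y2}

R1 = {("a", "b"), ("a", "c")}
R2 = {("b", "d"), ("c", "e")}
print(sorted(compose(R1, R2)))  # -> [('a', 'd'), ('a', 'e')]
```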
Composition: an illustration
[Figures: R1 as a transducer, R2 as a transducer, and their composition R1 ∘ R2]
With weights: pronunciation modeling and language modeling for speech recognition
What is morphology?
scrīpsērunt is the third person, plural, perfect, active of scrībō ('I write')
Morphology relates word forms: the lemma of scrīpsērunt is scrībō
Morphology is a relation
Imagine you have a Latin morphological analyzer comprising:
D: a relation that maps between surface form and decomposed form
L: a relation that maps between decomposed form and lemma
Then:
scrīpsērunt ∘ D = scrīb+s+ērunt
scrīpsērunt ∘ D ∘ L = scrībō
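The relational view above can be sketched with D and L as Python dicts and composition as sequential lookup (a toy version with a single form; macrons are omitted for ASCII):

```python
# D: surface form -> decomposed form; L: decomposed form -> lemma
D = {"scripserunt": "scrib+s+erunt"}
L = {"scrib+s+erunt": "scribo"}

def apply(form, *relations):
    # compose the relations by feeding each output to the next input
    for r in relations:
        form = r[form]
    return form

print(apply("scripserunt", D))     # -> "scrib+s+erunt"
print(apply("scripserunt", D, L))  # -> "scribo"
```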
A transducer for each affix transforms the base into the required templatic form and appends the relevant string.
Subtractive morphology
Bontoc infixation
Insert a marker > after the first consonant (if any)
Change > into the infix -um-
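The two steps above can be sketched as cascaded regex rewrites, standing in for a pair of composed transducers (a minimal sketch; the example form fikas 'strong' is a standard Bontoc illustration, and vowel-initial words are not handled here):

```python
import re

def mark(word):
    # step 1: insert a marker > after the first consonant, if any
    return re.sub(r"^([^aeiou])", r"\1>", word, count=1)

def realize(word):
    # step 2: rewrite the marker as the infix -um-
    return word.replace(">", "um")

def infix(word):
    return realize(mark(word))

print(infix("fikas"))  # -> "fumikas"
```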
English expletive infixation: Kalamazoo → Kalama-f*****g-zoo
Reduplication: Gothic
Factoring Reduplication
Prosodic constraints
Non-Exact Copies
Dakota (Inkelas & Zoll, 1999):
Non-Exact Copies
Basic and modified stems in Sye (Inkelas & Zoll, 1999):
Most linguistic accounts of reduplication assume that the copying is done as part of the morphology. In MDT (Morphological Doubling Theory):
Reduplication involves doubling at the morphosyntactic level, i.e. one is actually simply repeating words or morphemes
Phonological doubling is thus expected, but not required
Summary
If Inkelas & Zoll are right, then all morphology can be computed using regular relations. This in turn suggests that computational morphology has picked the right tool for the job.
Grammar induction
The common nativist view in linguistics
From Gilbert Harman's review of Chomsky's New Horizons in the Study of Language and Mind (published in Journal of Philosophy, 98(5), May 2001): Further reflection along these lines and a great deal of empirical study of particular languages has led to the "principles and parameters" framework which has dominated linguistics in the last few decades. The idea is that languages are basically the same in structure, up to certain parameters, for example, whether the head of a phrase goes at the beginning of a phrase or at the end. Children do not have to learn the basic principles, they only need to set the parameters. Linguistics aims at stating the basic principles and parameters by considering how languages differ in certain more or less subtle respects. The result of this approach has been a truly amazing outpouring of discoveries about how languages are the same yet different.
Similarly
Cedric Boeckx and Norbert Hornstein. 2003. The Varying Aims of Linguistic Theory. Children come equipped with a set of principles of grammar construction (i.e. Universal Grammar (UG)). The principles of UG have open parameters. Specific grammars arise once values for these open parameters are specified. Parameter values are determined on the basis of [the primary linguistic data]. A language-specific grammar, then, is simply a specification of the values that the principles of UG leave open.
Performance
Improvements
Constituent structure can be induced in a similar way to inducing word classes (e.g. parts of speech), by considering the environments in which the putative constituent finds itself. In Klein & Manning's constituent-context model (CCM), the probability of a bracketing is computed as follows:
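The formula itself did not survive extraction; from Klein & Manning (2002) it is, roughly, a joint model over a sentence S and a bracketing B in which each span is generated from its bracketing decision (notation ours):

```latex
P(S, B) = P(B) \prod_{\langle i,j \rangle}
    P(\alpha_{ij} \mid B_{ij}) \, P(\beta_{ij} \mid B_{ij})
```

where, for each span ⟨i, j⟩, α is its yield (the words it contains), β is its context (the words immediately adjacent), and B_ij records whether it is a constituent.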
Combined DMV+CCM
Subsequent work, e.g. Rens Bod's 2006 Unsupervised Data-Oriented Parsing, reports F-scores close to 83.0. For comparison, the best supervised parsers get about 91.0.
Performance of automatic induction algorithms is still far from human performance, so they do not constitute evidence that we can do away with (nativist) linguistic theories of language acquisition
They do not show this. But the argument would have more weight if nativist theories had already been demonstrated to contribute to a working model of grammar induction
But Computational Linguistics is starting to make some serious contributions to this 50-year-old debate
Examples from: Stump, Gregory (2001) Inflectional Morphology: A Theory of Paradigm Structure. Cambridge University Press.
A multi-agent simulation
System is seeded with a grammar and small number of agents
Each agent randomly selects a set of phonetic rules to apply to forms
Agents are assigned to one of a small number of social groups
Children must learn to generalize to unseen slots for words
Learning algorithm similar to: David Yarowsky and Richard Wicentowski (2000) "Minimally supervised morphological analysis by multimodal alignment." Proceedings of ACL-2000, Hong Kong, pages 207-216. Features include last n characters of input form, plus semantic class
Learners select the optimal surface form to derive other forms from (optimal = requiring the simplest resulting ruleset, a Minimum Description Length criterion)
Forms are periodically pooled among all agents and the n best forms are kept for each word and each slot
Population grows, but is kept in check by natural disasters and a quasi-Malthusian model of resource limitations
Agents age and die according to reasonably realistic mortality statistics
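The MDL criterion in the description above can be sketched as follows: for each candidate base slot, measure the complexity of the suffix rewrites needed to derive the rest of the paradigm, and pick the slot with the cheapest total (the cost measure and the paradigm are invented for illustration, not from the simulation itself):

```python
def rule_cost(base, derived):
    # cost of the suffix rewrite turning base into derived:
    # length of material deleted plus length of material added
    i = 0
    while i < min(len(base), len(derived)) and base[i] == derived[i]:
        i += 1
    return (len(base) - i) + (len(derived) - i)

def best_base(paradigm):
    # paradigm: slot name -> surface form; return the slot whose form
    # derives all the others with the smallest total rewrite cost
    return min(paradigm,
               key=lambda b: sum(rule_cost(paradigm[b], f)
                                 for f in paradigm.values()))

# made-up paradigm: the prefixed participle is a poor base choice
paradigm = {"inf": "walken", "past": "walkte", "part": "gewalkt"}
print(best_base(paradigm))
```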
Another example
Kirby, Simon. 2001. Spontaneous evolution of linguistic structure: an iterated learning model of the emergence of regularity and irregularity. IEEE Transactions on Evolutionary Computation, 5(2):102-110.
Assumes two meaning components, each with 5 values, for 25 possible words
Initial speaker randomly selects examples from the 25, producing random strings for each, and teaches them to the hearer
Not all of the slots are filled, thus producing a bottleneck: the hearer must compute forms for the missing slots
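The transmission setup can be sketched in a few lines: 25 meanings, a lexicon of random strings, and a bottleneck at each generation (a minimal sketch of the setup only; Kirby's learners additionally induce grammar rules from the observed pairs, which is what lets regularity emerge, and that step is omitted here):

```python
import random

random.seed(0)
MEANINGS = [(a, b) for a in range(5) for b in range(5)]  # 2 components x 5 values

def random_word():
    return "".join(random.choice("abcd") for _ in range(3))

def transmit(lexicon, bottleneck=15):
    # learner observes only a subset of the meaning-form pairs...
    seen = dict(random.sample(sorted(lexicon.items()), bottleneck))
    # ...and must supply forms for the unseen slots (here: invention)
    return {m: seen.get(m, random_word()) for m in MEANINGS}

lexicon = {m: random_word() for m in MEANINGS}
for _ in range(10):
    lexicon = transmit(lexicon)
print(len(lexicon))  # -> 25
```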
Final state
Summary
Evolutionary modeling is evolving slowly
We are a long way from being able to model the complexities of known language evolution
Nonetheless, computational approaches promise to lend insights into how complex social systems such as language change over time, and complement discoveries in historical linguistics
Final thoughts
Language is central to what it means to be human. Language is used to:
Communicate information
Communicate requests
Persuade, cajole
(In written form) record history
Deceive
Other animals do some or most of these things (cf. Anindya Sinha's work on bonnet macaques), but humans are better at all of them.
Final thoughts
So the scientific study of language ought to be more central than it is. We need to learn much more about how language works:
How humans evolved language
How languages changed over time
How humans learn language