Can Machines Think?
Daniel Dennett, Tufts University
NOBEL CONFERENCE XX
Gustavus Adolphus College, St. Peter, Minnesota

HOW WE KNOW

Edited by
MICHAEL SHAFTO

With Contributions by
DANIEL C. DENNETT

Harper & Row, Publishers, San Francisco
Perhaps he was inspired by Descartes, who in his Discourse on Method (1637) plausibly argued that there was no more demanding test of human mentality than the capacity to hold an intelligent conversation:

It is indeed conceivable that a machine could be so made that it would utter words, and even words appropriate to the presence of physical acts or objects which cause some change in its organs; as, for example, if it was touched in some spot that it would ask what you wanted to say to it; if in another, that it would cry that it was hurt, and so on for similar things. But it could never modify its phrases to reply to the sense of whatever was said in its presence, as even the most stupid men can do.2

This seemed obvious to Descartes in the seventeenth century, but of course the fanciest machines he knew were elaborate clockwork figures, not electronic computers. Today it is far from obvious that such machines are impossible, but Descartes's hunch that ordinary conversation would put as severe a strain on artificial intelligence as any other test was shared by Turing. Of course there is nothing sacred about the particular conversational game chosen by Turing for his test; it is just a cannily chosen test of more general intelligence. The assumption Turing was prepared to make was this: Nothing could possibly pass the Turing test by winning the imitation game without being able to perform indefinitely many other clearly intelligent actions. Let us call that assumption the quick-probe assumption. Turing realized, as anyone would, that there are hundreds and thousands of telling signs of intelligent thinking to be observed in our fellow creatures, and one could, if one wanted, compile a vast battery of different tests to assay the capacity for intelligent thought. But success on his chosen test, he thought, would be highly predictive of success on many other intuitively acceptable tests of intelligence. Remember, failure on the Turing test does not predict failure on those others, but success would surely predict success. His test was so severe, he thought, that nothing that could pass it fair and square would disappoint us in other quarters. Maybe it wouldn't do everything we hoped (maybe it wouldn't appreciate ballet, or understand quantum physics, or have a good plan for world peace), but we'd all see that it was surely one of the intelligent, thinking entities in the neighborhood.

Is this high opinion of the Turing test's severity misguided? Certainly many have thought so, but usually because they have not imagined the test in sufficient detail, and hence have underestimated it. Trying to forestall this skepticism, Turing imagined several lines of questioning that a judge might employ in this game, about writing poetry, or playing chess, that would be taxing indeed, but with thirty years' experience with the actual talents and foibles of computers behind us, perhaps we can add a few more tough lines of questioning.

Terry Winograd, a leader in artificial intelligence efforts to produce conversational ability in a computer, draws our attention to a pair of sentences.3 They differ in only one word. The first sentence is this:

The committee denied the group a parade permit because they advocated violence.

Here's the second sentence:

The committee denied the group a parade permit because they feared violence.

The difference is just in the verb, advocated or feared. As Winograd points out, the pronoun they in each sentence is officially ambiguous. Both readings of the pronoun are always legal. Thus we can imagine a world in which governmental committees in charge of parade permits advocate violence in the streets and, for some strange reason, use this as their pretext for denying a parade permit. But the natural, reasonable, intelligent reading of the first sentence is that it's the group that advocated violence, and of the second, that it's the committee that feared the violence.

Now if sentences like this are embedded in a conversation, the computer must figure out which reading of the pronoun is meant, if it is to respond intelligently. But mere rules of grammar or vocabulary will not fix the right reading. What fixes the right reading for us is knowledge about the world, about politics, social circumstances, committees and their attitudes, groups that want to parade, how they tend to behave, and the like. One must know about the world, in short, to make sense of such a sentence.

In the jargon of artificial intelligence (AI), a conversational computer needs lots of world knowledge to do its job. But, it seems, if somehow it is endowed with that world knowledge on many topics, it should be able to do much more with that world knowledge than merely make sense of a conversation containing just that sentence. The only way, it appears, for a computer to disambiguate that sentence and keep up its end of a conversation that uses that sentence would be for it to have a much more general ability to respond intelligently to information about social and political circumstances, and many other topics. Thus, such sentences, by putting a demand on such abilities, are good quick probes. That is, they test for a wider competence.

People typically ignore the prospect of having the judge ask off-the-wall questions in the Turing test, and hence they underestimate the competence a computer would have to have to pass the test. But remember, the rules of the imitation game as Turing presented them place no limits at all on the judge's lines of questioning.

Candidate 1: A computer is intelligent if it wins the World Chess Championship.

That's not a good test, as it turns out. Chess prowess has proven to be an isolatable talent. There are programs today that can play fine chess but can do nothing else. So the quick-probe assumption is false for the test of playing winning chess.

Candidate 2: The computer is intelligent if it solves the Arab-Israeli conflict.

This is surely a more severe test than Turing's. But it has some defects: it is unrepeatable, if passed once; slow, no doubt; and it is not crisply clear what would count as passing it. Here's another prospect, then:

Candidate 3: A computer is intelligent if it succeeds in stealing the British crown jewels without the use of force or violence.
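Winograd's pair of parade-permit sentences can be made concrete with a toy sketch. The fragment below is purely illustrative: the VERB_BIAS table, the function names, and the whole approach are invented for this example, and no real conversational program works this simply. It shows the shape of the problem: grammar leaves both antecedents of "they" legal, and only a hand-coded scrap of world knowledge about committees and groups selects the natural reading.

```python
# Toy illustration of Winograd's ambiguous pronoun. All names and the
# "knowledge" table are invented for this sketch, not taken from any
# real AI system.

TEMPLATE = "The committee denied the group a parade permit because they {} violence."

# Grammar alone: both antecedents of "they" are always legal.
CANDIDATES = ("the committee", "the group")

# A scrap of hand-coded world knowledge: permit-granting committees
# typically fear violence; permit-seeking groups are the ones suspected
# of advocating it.
VERB_BIAS = {"advocated": "the group", "feared": "the committee"}

def resolve_they(verb):
    """Pick the natural antecedent; grammar offers no help, the table does."""
    assert len(CANDIDATES) == 2   # officially ambiguous either way
    return VERB_BIAS[verb]

for verb in ("advocated", "feared"):
    print(TEMPLATE.format(verb), "->", resolve_they(verb))
```

The catch, and the point of the quick-probe argument, is that the table just is world knowledge: a judge who steps one sentence sideways leaves the two-entry table helpless, so nothing short of wide general competence could keep this up over a whole conversation.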
".,
I
I! 130 llOW WE KNOW
(
(In 12 percent of those correct responses there were trivial, correctable defects.) Of course, Woods's motive in creating LUNAR was not to trick unwary geologists into thinking they were conversing with an intelligent being. And if that had been his motive, his project would still be a long way from success.

For it is easy enough to unmask LUNAR without ever straying from the prescribed topic of moon rocks. Put LUNAR in one room, and a moon rocks specialist in another, and then ask them both their opinion of the social value of the moon-rocks-gathering expeditions, for instance. Or ask the contestants their opinion of the suitability of moon rocks as ashtrays, or whether people who have touched moon rocks are ineligible for the draft. Any intelligent person knows a lot more about moon rocks than their geology. Although it might be unfair to demand this extra knowledge of a computer moon rock specialist, it would be an easy way to get it to fail the Turing test.

But just suppose that someone could extend LUNAR to cover itself plausibly on such probes, so long as the topic was still, however indirectly, moon rocks. We might come to think it was a lot more like the human moon rocks specialist than it really was. The moral we should draw is that as Turing test judges we should resist all limitations and waterings-down of the Turing test. They make the game too easy, vastly easier than the original test. Hence they lead us into the risk of overestimating the actual comprehension of the system being tested.

Consider a different limitation on the Turing test that should strike a suspicious chord in us as soon as we hear it. This is a variation on a theme developed in a recent article by Ned Block.4 Suppose someone were to propose to restrict the judge to a vocabulary of, say, the 850 words of "Basic English," and to single-sentence probes, that is, "moves," of no more than four words. Moreover, contestants must respond to these probes with no more than four words per move, and a test may involve no more than forty questions.

Is this an innocent variation on Turing's original test? These restrictions would make the imitation game clearly finite. That is, the total number of all possible permissible games is a large, but finite, number. One might suspect that such a limitation would permit the trickster simply to store, in alphabetical order, all the possible good conversations within the limits and beat the judge with nothing more sophisticated than a system of table lookup. In fact, that isn't in the cards. Even with these severe and improbable and suspicious restrictions imposed upon the imitation game, the number of legal games, though finite, is mind-bogglingly large. I haven't bothered trying to calculate it, but it surely exceeds astronomically the number of possible chess games with no more than forty moves, and that number has been calculated. John Haugeland says it's in the neighborhood of ten to the one hundred twentieth power. For comparison, Haugeland suggests there have only been ten to the eighteenth seconds since the beginning of the universe.5

Of course, the number of good, sensible conversations under these limits is a tiny fraction, maybe one in a quadrillion, of the number of merely grammatically well formed conversations. So let's say, to be very conservative, that there are only ten to the fiftieth different smart conversations such a computer would have to store. Well, the task shouldn't take more than a few trillion years, given generous federal support. Finite numbers can be very large.

So though we needn't worry that this particular trick of storing all the smart conversations would work, we can appreciate that there are lots of ways of making the task easier that may appear innocent at first. We also get a reassuring measure of just how severe the unrestricted Turing test is by reflecting on the more than astronomical size of even that severely restricted version of it.

Block's imagined, and utterly impossible, program exhibits the dreaded feature known in computer science circles as combinatorial explosion. No conceivable computer could overpower a combinatorial explosion with sheer speed and size. Since the problem areas addressed by artificial intelligence are veritable minefields of combinatorial explosion, and since it has often proven difficult to find any solution to a problem that avoids them, there is considerable plausibility in Newell and Simon's proposal that avoiding combinatorial explosion (by any means at all) be viewed as one of the hallmarks of intelligence.

Our brains are millions of times bigger than the brains of gnats, but they are still, for all their vast complexity, compact, efficient, timely organs that somehow or other manage to perform all their tasks while avoiding combinatorial explosion. A computer a million times bigger or faster than a human brain might not look like the brain of a human being, or even be internally organized like the brain of a human being, but if, for all its differences, it somehow managed to control a wise and timely set of activities, it would have to be the beneficiary of a very special design that avoided combinatorial explosion, and whatever that design was, would we not be right to consider the entity intelligent?

Turing's test was designed to allow for this possibility. His point was that we should not be species-chauvinistic, or anthropocentric, about the insides of an intelligent being, for there might be inhuman ways of being intelligent.

To my knowledge, the only serious and interesting attempt by any program designer to win even a severely modified Turing test has been Kenneth Colby's. Colby is a psychiatrist and intelligence artificer at UCLA. He has a program called PARRY, which is a computer simulation of a paranoid patient who has delusions about the Mafia being out to get him. As you do with other conversational programs, you interact with it by sitting at a terminal and typing questions and answers back and forth. A number of years ago, Colby put PARRY to a very restricted test. He had genuine psychiatrists interview PARRY. He did not suggest to them that they might be talking or typing to a computer; rather, he made up some plausible story about why they were communicating with a real live patient by teletype. He also had the psychiatrists interview real, human paranoids via teletype. Then he took a PARRY transcript, inserted it in a group of teletype transcripts from real patients, gave them to another group of experts, more psychiatrists, and said, "One of these was a conversation with a computer. Can you figure out which one it was?" They couldn't. They didn't do better than chance.

Colby presented this with some huzzah, but critics scoffed at the suggestion that this was a legitimate Turing test. My favorite commentary on it was Joseph Weizenbaum's; in a letter to the Communications of the Association for Computing Machinery,6 he said that, inspired by Colby, he had designed an even better program, which passed the same test. His also had the virtue of being a very inexpensive program, in these times of tight money. In fact you didn't even need a computer for it. All you needed was an electric typewriter. His program modeled infantile autism. And the transcripts (you type in your questions, and the thing just sits there and hums) cannot be distinguished by experts from transcripts of real conversations with infantile autistic patients. What was wrong, of course, with Colby's test was that the unsuspecting interviewers had no motivation at all to try out any of the sorts of questions that easily would have unmasked PARRY.

Colby was undaunted, and after his team had improved PARRY he put it to a much more severe test, a surprisingly severe test. This time, the interviewers, again psychiatrists, were given the task at the outset of telling the computer from the real patient. They were set up in a classic Turing machine test: the patient in one room, the computer PARRY in the other room, with the judges conducting interviews with both of them (on successive days). The judges' task was to find out which one was the computer and which one was the real patient. Amazingly, they didn't do much better, which leads some people to say, "Well, that just confirms my impression of the intelligence of psychiatrists!"

But now, more seriously, was this an honest-to-goodness Turing test? Were there tacit restrictions on the lines of questioning of the judges? Like the geologists interacting with LUNAR, the psychiatrists' professional preoccupations and habits kept them from asking the sorts of unlikely questions that would have easily unmasked PARRY. After all, they realized that since one of the contestants was a real, live paranoid person, medical ethics virtually forbade them from toying with, upsetting, or attempting to confuse their interlocutors. Moreover, they also knew that this was a test of a model of paranoia, so there were certain questions that wouldn't be deemed to be relevant to testing the model as a model of paranoia. So, they asked just the sort of questions that therapists typically ask of such patients, and of course PARRY had been ingeniously and laboriously prepared to deal with just that sort of question.

One of the psychiatrist judges did, in fact, make a rather halfhearted attempt to break out of the mold and ask some telling questions: "Maybe you've heard the saying 'Don't cry over spilled milk.' What does that mean to you?" PARRY answered: "Maybe you have to watch out for the Mafia." When then asked "Okay, now if you were in a movie theater watching a movie and smelled something like burning wood or rubber, what would you do?" PARRY replied: "You know, they know me." And the next question was, "If you found a stamped, addressed letter in your path as you were walking down the street, what would you do?" PARRY replied: "What else do you want to know?"7

Clearly PARRY was, you might say, parrying these questions, which were incomprehensible to it, with more or less stock paranoid formulas. We see a bit of a dodge, which is apt to work, apt to seem plausible to the judge, only because the "contestant" is supposed to be a paranoid, and such people are expected to respond uncooperatively on such occasions. These unimpressive responses didn't particularly arouse the suspicions of the judges, as a matter of fact, though probably they should have.

PARRY, like all other large computer programs, is dramatically bound by limitations of cost-effectiveness. What was important to Colby and his crew was simulating his model of paranoia. This was a massive effort. PARRY has a thesaurus or dictionary of about 4500 words and 700 idioms and the grammatical competence to use it, a parser, in the jargon of computational linguistics. The entire PARRY program takes up about 200,000 words of computer memory, all laboriously installed by the programming team. Now once all the effort had gone into devising the model of paranoid thought processes and linguistic ability, there was little if any time, energy, money, or interest left over to build in huge amounts of world knowledge of the sort that any actual paranoid, of course, would have. (Not that anyone yet knows how to build in world knowledge in the first place.) Building in the world knowledge, if one could even do it, would no doubt have made PARRY orders of magnitude larger and slower. And what would have been the point, given Colby's theoretical aims?

PARRY is a theoretician's model of a psychological phenomenon: paranoia. It is not intended to have practical applications. But in recent years a branch of AI (knowledge engineering) has appeared that develops what are now called expert systems. Expert systems are designed to be practical. They are software superspecialist consultants, typically, that can be asked to diagnose medical problems, to analyze geological data, to analyze the results of scientific experiments, and the like. Some of them are very impressive. SRI in California announced a few years ago that PROSPECTOR, an SRI-developed expert system in geology, had correctly predicted the existence of a large, important mineral deposit that had been entirely unanticipated by the human geologists who had fed it its data. MYCIN, perhaps the most famous of these expert systems, diagnoses infections of the blood, and it does probably as well as, maybe better than, any human consultants. And many other expert systems are on the way.

All expert systems, like all other large AI programs, are what you might call Potemkin villages. That is, they are cleverly constructed facades, like cinema sets. The actual filling-in of details of AI programs is time-consuming, costly work, so economy dictates that only those surfaces of the phenomenon that are likely to be probed or observed are represented.

Consider, for example, the CYRUS program developed by Janet Kolodner in Roger Schank's AI group at Yale a few years ago.8 CYRUS stands (we are told) for Computerized Yale Retrieval and Updating System, but surely it is no accident that CYRUS modeled the memory of Cyrus Vance, who was then secretary of state in the Carter administration. The point of the CYRUS project was to devise and test some plausible ideas about how people organize their memories of the events they participate in; hence it was meant to be a "pure" AI system, a scientific model, not an expert system intended for any practical purpose. CYRUS was updated daily by being fed all UPI wire service news stories that mentioned Vance, and it was fed them directly, with no doctoring and no human intervention. Thanks to an ingenious news-reading program called FRUMP, it could take any story just as it came in on the wire and could digest it and use it to update its data base so that it could answer more questions. You could address questions to CYRUS in English by typing at a terminal. You addressed them in the second person, as if you were talking with Cyrus Vance himself. The results looked like this:

Q: Last time you went to Saudi Arabia, where did you stay?
A: In a palace in Saudi Arabia on September 23, 1978.
Q: Did you go sightseeing there?
A: Yes, at an oilfield in Dharan on September 23, 1978.
Q: Has your wife ever met Mrs. Begin?
A: Yes, most recently at a state dinner in Israel in January 1980.
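The flavor of that kind of exchange, impressive on anticipated questions and helpless off the facade, can be suggested with a tiny sketch. What follows is emphatically not the real CYRUS or FRUMP, which were substantial Yale systems; the story format, the patterns, and all the names are invented here to illustrate template-driven digestion and lookup.

```python
import re

# A toy, invented illustration of template-driven news digestion and
# question answering. Not the actual CYRUS/FRUMP systems.

database = []  # records of the form {"event": ..., "place": ..., "date": ...}

STORY = re.compile(
    r"Vance (?P<event>\w+ [\w ]*?) in (?P<place>[\w ]+) on (?P<date>[\w ]+ \d+, \d+)\."
)

def digest(story):
    """File any story that fits the anticipated template; silently drop the rest."""
    m = STORY.search(story)
    if m:
        database.append(m.groupdict())

def answer(question):
    """Answer only the one anticipated question form: 'Where did Vance <verb>?'"""
    m = re.match(r"Where did Vance (?P<verb>\w+)\?", question)
    if m:
        for rec in reversed(database):            # most recent event first
            if rec["event"].startswith(m.group("verb")):
                return f"In {rec['place']} on {rec['date']}."
    return "I do not understand the question."    # anything off the facade fails

digest("Vance stayed at a palace in Saudi Arabia on September 23, 1978.")
digest("Vance attended a state dinner in Israel on January 7, 1980.")
print(answer("Where did Vance stay?"))
print(answer("Should moon rocks be used as ashtrays?"))
```

Everything here hangs on the two anticipated templates; a question about ashtrays, like the probes suggested for LUNAR, falls off the edge of the Potemkin facade at once.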
CAN MACIIINES TIIINK? / 139
138 I llOW WE KNOW
exhibits the sons of shortcomings and failures that the designer
taken to correct these flaws. For instance, my former colleague al knows the system lo have.._This would not be a substitute, however,
,i Tuhs, Benjamin Kuipers, is cunently working on an expert system
j\ for an attitude ofcaulio~ almost obsessive, skepticis111 on the pan
in ncphrology-for diag11osi11g kidney ailments-that will be of users, for designers arc often, if nol always, unaware of the
based 011 an elaborate syste111 of causal reasoning about the subtler flaws in the products they produce. Thal is inevitable and
phcno111c11a being diagnosed. But this is a very ambitious, lo11g-
natural, given the way system designers must think. They are
range project of considerable theoretical dilliculty . A11d even if all trained lo think positively-constructively, one might say-about
the reasonable, cost-clkctive steps are taken lo minimize the su-
the designs that they are constructing.
perficiality of expert systems, they will still be facades, just some- I come, then, lo my conclusions . First, a ·philosophical or thco-
what thicker or wider facades. ret_ical co~~lu_sion: The Turi_ng _lcW!.-.una<luher_a.te<l, unrestricted
When we were considering the fantastic case of the crazy Cham- fo1 m, as l unng presented ll, 1s ,plenl y strong if well used. I am
ber of Commerce of Great Falls, Montana, we couldn't imagine a confident that no computer in the next twenty years is going to
plausible motive for anyone going to any sort of trouble to trick pass the unrestricted Turing tesl. They may well win the World
the Dcnnell tcsl. The quick probe assumption for the Dennett test Chess Championship or even a Nobel Prize in physics, but they
looked quite secure. But when we look al expert systems, we see won't pass the unrestricted Turing tesl. Nevertheless, it is 1101, I
that, however innocently, their designers do have motivation for think, i111possible in principle for a computer lo pass 1he test, fair
doing exactly the sort of trick l hat would fool an unsuspicious and square. I'm not running one of those a priori "computers can't
Turing tester. First, since expert systems arc all superspecialists think" arguments. I stand unabashedly ready, moreover, to de-
who arc only supposed to know about some narrow subject, users clare that any computer that actually passes the unrestricted Tur-
of such systems, not having much time to kill, do not bother prob- ing lest will be, in every theoretically interesting sense, a thinking
ing them at the boundaries at all. They don't bother asking "silly"
thing.
or irrclcvanl questions. Instead, they concenlrale-not unrcasona- But remembering how very strong the Turing test is, we must
bly-on exploiting the syste111's strength~. But shouldn't they try also recognize that there may also be interesting varieties of thin'k-
to obtain a dear vision of such a system 's weaknesses as well? The ing or intelligence that arc not well poised t.o play and win the
normal habit ofhuma11 thoughl when conversing with one another
.,I imitation game. Thal 110 11onh11ma11 Turi11g test winners are yet
is to assume general comprehension, to assume rationality, to visible on the horizon does not mean that there aren't machines
assume, moreover, that the quick probe assumption is, in general,
that already exhibit some of the i111ponanl features of thought.
sound . This amiable habit of thought almost incsistibly leads to A~out them, it is probably liJtile to ask my title question, Oo they
putting too much faith in computer systems, especially user-
thmk? Oo they really think? In some regards they do, and in some
fricndly systems that present themselves 111 a very anthropo- regards they don't. Only a detailed look al what they <lo, and how
morphic manner. they are structured, will reveal what is interesting about them. The
Pan of the solution lo this problem is to leach all users of Turing test, not being a scientific lest, is of scant help on that task,
computers, especially users or expert syste111s, how to probe their but there are p'enty of other ways of examining such systems.
systems before they rely 011 the111 , how to search oul a11d explore Verdicts 011 their i111elligc11ce or capacity for thought or conscious-
the boumbries of the facade . This is an exercise that calls 11ol only
ness would be only as informative and persuasive as the theories
for intelligence a11d imagi11at ion, but also a bit of special under- of intelligence or Lhough1 or consciousness the verdicts were
standing about the limitations and actual structure of computer based 011, and since our task is to create such theories, we should
progra111s . It would · help, or course , if we had standards of truth get 011 with it a1ld leave the Big Verdict for a1101hcr occasion. 111
in advertisi11g, in effect, for expert systems. For instance, each such
the 111ea11li111c, should anyone want a surefire, almos1-g11ara11teed-
system should come with a special demonstration routine that
';! ·;
I
I
'I
140 I 110\V WE KNO\\' ( CAN Mi\ClllNES TlllN I 141
I
lo-be-fail-safe test of thinking by a computer, the Turing lest will going back on 1'i1y earlier claims? Not al all. I am merely pointing
do very nicely. out that the Turing test is so powcrhil that it will ensure indirectly
l\ly second conclusion is more practical, and hence in one clear that these conditions, if they arc truly necessary, are met by any
sense more important. Cheapened versions of the Turing test are succ~ssful contestant. .
everywhere in the air. Tming's test is not just elkctive, it is entirely "You may well be r~t," Turing could say, "that eyes, cars,
natural-this is, a lier all, t ht: way we assay the intelligence of each hands, and a history are necessary conditions for thinking. If so,
other <.:very day . And sine<.: incautious use of such judgments <iml then' I submit that nothing could pass the Turing test 1hat didn't
such tests is the norm, w<.: arc in some considt:rabk danger of have eyes, cars, hands, and a history. That is an empirical claim,
extrapolating loo c<isily, ;ind judging too generously, about the which we can someday hope to test. If you suggest th<il these arc
undt:rstanding or the systems WC are using. Th<.: problem of conceptually necessary, 1101jus1 practically or physically necessary,
overcst imation of cog·nitivc prowess, of comprehension, of intelli- conditions for 1hinki11g, you make a philosophical claim that I for
gence, is not, then, just a philosophical problem, but a real social one would not know how, or carc.,.jp 8sscss . Isn't it more interest-
problem, and we should alert oursdvcs lo it, and take steps lo ing and important in the cml to disc~ver whether or nol it is true
avert it. that no bedridden system could pass a demanding Turing test?"
Suppose we pul; Lo Turing the suggestion that he add another
component to his tesl: Not only must an entity win the ·I mitation
POSTSCRIPT: EYES, EARS, HANDS, AND HISTORY
game, but also it must be able to identify-using whatever sensory
My philosophical conclusion in this paper is that any computer that apparatus it has available to it-a variety of familiar objects placed
actually passed the Turing test would be a thinking thing in every in its room: a tennis racket, a polled palm, a buckcl of yellow paint,
theoretically interesting sense. This conclusion seems to some a live dog. This would ensure l hat somehow or other the entil y was
people to fly in the face of what I have myself argued 011 other capable of moving around and distinguishing things in the world.
occasions. Peter Bieri, commenting on this paper <it Boston Uni- Turing could reply, I a1i1 asserting, that this is an utterly unneces-
versity, noted that I have of'tcn cl;iimed to show the importance lo sary addition lo his lest, making it no more demanding than it
genuine understanding of a rich and intimal e perceptual intercon- already was. A suitably probing convct·sation would surely estab-
nection between an entity and its surrounding world-the need for lish, beyond a shadow of a doubt, that the contestant knew its way
something like eyes and cars-and a similarly complex active cn- around in the real world. The imagined altcrn<il ive of somehow
gagcmcnl with clements in that world-the need for somcl11ing "prestocking" a bedridden, blind computer with enough informa-
like hands with which lo do things in that world. Moreover, I have tion, and a clever enough program, to trick the Turing test is
often held that only a biography of sorts, a history of actual pro- science fiction of the worst kind-possible "in principle" but not
jects, learning experiences, and other bouts with reality, could remotely possible in fact, g·ivcn the combinatorial explosion of
produce the sorts of complexities (both external, or behavioral, possible variation such a system would have to cope with .
and internal) that arc needed lo ground a principled interpretation "But suppose you're wrong. What would you say of an entity
of an entity as a thinking thing, an entity with beliefs, desires, that was created all al once (by some programmers, perhaps) , an
intentions, and other mental altitudes . instant individual wilh all the conversational talents of an embod-
. j
' But the opaque screen in the Turing test discounts or dismisses ied, experienced human being?" This is like the question: "Would
these factors altogether, it scTn1s, by focusing allcntion m1 only the you call a hunk of 11 2 0 that was as hard as steel ;it room tempera-
contemporaneous capacity to engage in crnc very limited sort of ture ice?" I do not know what Turing would say, of course, so I
activity: verbal communication. (I have even coined a pc:jorative will speak for myself. Faced with such an improbable violation of
label for such purely language-using systems: bedridden.) Am I what I take to he the laws of nature, I would probably be speech-
. ·: r.~Er~lli~J~~rr!~~~j1
~,;.~~~~~fiwwte.@~Y.-?.4Ntt<""'·
.... '!V..Ji':C-.:L!;1 .-.'S ~ ~ .....; .. .... t,,..:. ..J+~• - ' .
•.c-~er.-.,~~~,;-. ... ;,.;-~~~ 11 ~Mii~~.£&f&&i&#Q4-j
(
142 I 110\V WE KNOW CAN MACHINES THINK? / 143
less. The least of my worries would be about which lexicographical leap to take:

A. "It turns out, to my amazement, that something can think without having had the benefit of eyes, ears, hands, and a history."
B. "It turns out, to my amazement, that something can pass the Turing test without thinking."

Choosing between these ways of expressing my astonishment would be asking myself a question "too meaningless to deserve discussion."

DISCUSSION

Q: Why was Turing interested in differentiating a man from a woman in his famous test?
A: That was just an example. He described a parlor game in which a man would try to fool the judge by answering questions as a woman would answer. I suppose that Turing was playing on the idea that maybe, just maybe, there is a big difference between the way men think and the way women think. But of course they're both thinkers. He wanted to use that fact to make us realize that, even if there were clear differences between the way a computer and a person thought, they'd both still be thinking.

Q: Why does it seem that some people are upset by AI research? Does AI research threaten our self-esteem?
A: I think Herb Simon has already given the canniest diagnosis of that. For many people the mind is the last refuge of mystery against the encroaching spread of science, and they don't like the idea of science engulfing the last bit of terra incognita. This means that they are threatened, I think irrationally, by the prospect that researchers in artificial intelligence may come to understand the human mind as well as biologists understand the genetic code, or as well as physicists understand electricity and magnetism. This could lead to the "evil scientist" (to take a stock character from science fiction) who can control you because he or she has a deep understanding of what's going on in your mind. This seems to me to be a totally valueless fear, one that you can set aside, for the simple reason that the human mind is full of an extraordinary amount of detailed knowledge, as, for example, Roger Schank has been pointing out.

As long as the scientist who is attempting to manipulate you does not share all your knowledge, his or her chances of manipulating you are minimal. People can always hit you over the head. They can do that now. We don't need artificial intelligence to manipulate people by putting them in chains or torturing them. But if someone tries to manipulate you by controlling your thoughts and ideas, that person will have to know what you know and more. The best way to keep yourself safe from that kind of manipulation is to be well informed.

Q: Do you think we will be able to program self-consciousness into a computer?
A: Yes, I do think that it's possible to program self-consciousness into a computer. Self-consciousness can mean many things. If you take the simplest, crudest notion of self-consciousness, I suppose that would be the sort of self-consciousness that a lobster has: When it's hungry, it eats something, but it never eats itself. It has some way of distinguishing between itself and the rest of the world, and it has a rather special regard for itself.

The lowly lobster is, in one regard, self-conscious. If you want to know whether or not you can create that on the computer, the answer is yes. It's no trouble at all. The computer is already a self-watching, self-monitoring sort of thing. That is an established part of the technology.

But, of course, most people have something more in mind when they speak of self-consciousness. It is that special inner light, that private way that it is with you that nobody else can share, something that is forever outside the bounds of computer science. How could a computer ever be conscious in this sense?

That belief, that very gripping, powerful intuition is, I think, in the end simply an illusion of common sense. It is as gripping as the commonsense illusion that the earth stands still and the sun goes around the earth. But the only way that those of us who do not believe in the illusion will ever convince the general public that it is an illusion is by gradually unfolding
a very difficult and fascinating story about just what is going on in our minds.

In the interim, people like me--philosophers who have to live by our wits and tell a lot of stories--use what I call intuition pumps, little examples that help to free up the imagination. I simply want to draw your attention to one fact. If you look at a computer--I don't care whether it's a giant Cray or a personal computer--if you open up the box and look inside and see those chips, you say, "No way could that be conscious. No way could that be self-conscious." But the same thing is true if you take the top off somebody's skull and look at the gray matter pulsing away in there. You think, "That is conscious? No way could that lump of stuff be conscious."

Of course, it makes no difference whether you look at it with a microscope or with a macroscope: At no level of inspection does a brain look like the seat of consciousness. Therefore, don't expect a computer to look like the seat of consciousness. If you want to get a grasp of how a computer could be conscious, it's no more difficult in the end than getting a grasp of how a brain could be conscious.

As we develop good accounts of consciousness, it will no longer seem so obvious to everyone that the idea of a self-conscious computer is a contradiction in terms. At the same time, I doubt that there will ever be self-conscious robots. But for boring reasons. There won't be any point in making them.

Theoretically, could we make a gall bladder out of atoms? In principle we could. A gall bladder is just a collection of atoms, but manufacturing one would cost the moon. It would be more expensive than every project NASA has ever dreamed of, and there would be no scientific payoff. We wouldn't learn anything new about how gall bladders work. For the same reason, I don't think we're going to see really humanoid robots, because practical, cost-effective robots don't need to be very humanoid at all. They need to be like the robots you can already see at General Motors, or like boxy little computers that do special-purpose things.

The theoretical issues will be studied by artificial intelligence researchers by looking at models that, to the layman, will show very little sign of humanity at all, and it will be only by rather indirect arguments that anyone will be able to appreciate that these models cast light on the deep theoretical question of how the mind is organized.

NOTES

1. Alan M. Turing, "Computing Machinery and Intelligence," Mind 59 (1950).
2. Rene Descartes, Discourse on Method (1637), trans. Lawrence Lafleur (New York: Bobbs-Merrill, 1960).
3. Terry Winograd, Understanding Natural Language (New York: Academic Press, 1972).
4. Ned Block, "Psychologism and Behaviorism," Philosophical Review 90 (1981).
5. John Haugeland, Mind Design (Cambridge, Mass.: Bradford Books/MIT Press, 1981), p. 16.
6. Joseph Weizenbaum, CACM 17, no. 9 (September 1974), p. 543.
7. I thank Kenneth Colby for providing me with the complete transcripts (including the judges' commentaries and reactions), from which these exchanges are quoted. The first published account of the experiment is Jon F. Heiser, Kenneth Mark Colby, William S. Faught, and Roger C. Parkison, "Can Psychiatrists Distinguish a Computer Simulation of Paranoia from the Real Thing? The Limitations of Turing-Like Tests as Measures of the Adequacy of Simulations," Journal of Psychiatric Research 15, no. 3 (1980), pp. 149-62. Colby discusses PARRY and its implications in "Modeling a Paranoid Mind," Behavioral and Brain Sciences 4, no. 4 (1981), pp. 515-60.
8. Janet L. Kolodner, "Retrieval and Organization Strategies in Conceptual Memory: A Computer Model" (Ph.D. diss.), Research Report #187, Dept. of Computer Science, Yale University; idem, "Maintaining Organization in a Dynamic Long-term Memory," Cognitive Science 7 (1983), 243-280; idem, "Reconstructive Memory: A Computer Model," Cognitive Science 7 (1983), 281-328.