
See discussions, stats, and author profiles for this publication at: https://www.researchgate.net/publication/285475907

Can Machines Think?

Chapter · January 2004
DOI: 10.1007/978-3-662-05642-4_12

CITATIONS: 30 · READS: 8,102

1 author: Daniel Dennett, Tufts University (181 publications, 21,349 citations)

All content following this page was uploaded by Daniel Dennett on 21 November 2019.


...

NOBEL CONFERENCE XX
Gustavus Adolphus College, St. Peter, Minnesota

HOW WE KNOW

Edited by
MICHAEL SHAFTO

With Contributions by
Gerald M. Edelman, Brenda Milner, Roger C. Schank,
Herbert A. Simon, Daniel Dennett,
and Arthur Peacocke

Harper & Row, Publishers, San Francisco
Cambridge, Hagerstown, New York, Philadelphia
London, Mexico City, São Paulo, Singapore, Sydney

5. Can Machines Think?

DANIEL C. DENNETT
Can machines think? This has been a conundrum for philosophers for years, but in their fascination with the pure conceptual issues they have for the most part overlooked the real social importance of the answer. It is of more than academic importance that we learn to think clearly about the actual cognitive powers of computers, for they are now being introduced into a variety of sensitive social roles, where their powers will be put to the ultimate test: In a wide variety of areas, we are on the verge of making ourselves dependent upon their cognitive powers. The cost of overestimating them could be enormous.
One of the principal inventors of the computer was the great British mathematician Alan Turing. It was he who first figured out, in highly abstract terms, how to design a programmable computing device-what we now call a universal Turing machine. All programmable computers in use today are in essence Turing machines. Over thirty years ago, at the dawn of the computer age, Turing began a classic article, "Computing Machinery and Intelligence," with the words: "I propose to consider the question, 'Can machines think?' "-but then went on to say that this was a bad question, a question that leads only to sterile debate and haggling over definitions, a question, as he put it, "too meaningless to deserve discussion."1 In its place he substituted what he took to be a much better question, a question that would be crisply answerable and intuitively satisfying-in every way an acceptable substitute for the philosophic puzzler with which he began.
First he described a parlor game of sorts, the "imitation game,"
to be played by a man, a woman, and a judge (of either gender).
The man and woman are hidden from the judge's view but able to
communicate with the judge by teletype; the judge's task is to
guess, after a period of questioning each contestant, which interlocutor is the man and which the woman. The man tries to convince the judge he is the woman (and the woman tries to convince the judge of the truth), and the man wins if the judge makes the wrong identification. A little reflection will convince you, I am sure, that, aside from lucky breaks, it would take a clever man to convince the judge that he was the woman-assuming the judge is clever too, of course.

Now suppose, Turing said, we replace the man or woman with a computer, and give the judge the task of determining which is the human being and which is the computer. Turing proposed that any computer that can regularly or often fool a discerning judge in this game would be intelligent-would be a computer that thinks-beyond any reasonable doubt. Now, it is important to realize that failing this test is not supposed to be a sign of lack of intelligence. Many intelligent people, after all, might not be willing or able to play the imitation game, and we should allow computers the same opportunity to decline to prove themselves. This is, then, a one-way test; failing it proves nothing.

Furthermore, Turing was not committing himself to the view (although it is easy to see how one might think he was) that to think is to think just like a human being-any more than he was committing himself to the view that for a man to think, he must think exactly like a woman. Men and women, and computers, may all have different ways of thinking. But surely, he thought, if one can think in one's own peculiar style well enough to imitate a thinking man or woman, one can think well, indeed. This imagined exercise has come to be known as the Turing test.

It is a sad irony that Turing's proposal has had exactly the opposite effect on the discussion of that which he intended. Turing didn't design the test as a useful tool in scientific psychology, a method of confirming or disconfirming scientific theories or evaluating particular models of mental function; he designed it to be nothing more than a philosophical conversation-stopper. He proposed-in the spirit of "Put up or shut up!"-a simple test for thinking that was surely strong enough to satisfy the sternest skeptic (or so he thought). He was saying, in effect, "Instead of arguing interminably about the ultimate nature and essence of thinking, why don't we all agree that whatever that nature is, anything that could pass this test would surely have it; then we could turn to asking how or whether some machine could be designed and built that might pass the test fair and square." Alas, philosophers-amateur and professional-have instead taken Turing's proposal as the pretext for just the sort of definitional haggling and interminable arguing about imaginary counterexamples he was hoping to squelch.

This thirty-year preoccupation with the Turing test has been all the more regrettable because it has focused attention on the wrong issues. There are real world problems that are revealed by considering the strengths and weaknesses of the Turing test, but these have been concealed behind a smokescreen of misguided criticisms. A failure to think imaginatively about the test actually proposed by Turing has led many to underestimate its severity and to confuse it with much less interesting proposals.

So first I want to show that the Turing test, conceived as he conceived it, is (as he thought) plenty strong enough as a test of thinking. I defy anyone to improve upon it. But here is the point almost universally overlooked by the literature: There is a common misapplication of the sort of testing exhibited by the Turing test that often leads to drastic overestimation of the powers of actually existing computer systems. The follies of this familiar sort of thinking about computers can best be brought out by a reconsideration of the Turing test itself.

The insight underlying the Turing test is the same insight that inspires the new practice among symphony orchestras of conducting auditions with an opaque screen between the jury and the musician. What matters in a musician, obviously, is musical ability and only musical ability; such features as sex, hair length, skin color, and weight are strictly irrelevant. Since juries might be biased-even innocently and unawares-by these irrelevant features, they are carefully screened off so only the essential features, musicianship, can be examined. Turing recognized that people similarly might be biased in their judgments of intelligence by whether the contestant had soft skin, warm blood, facial features, hands and eyes-which are obviously not themselves essential components of intelligence-so he devised a screen that would let through only a sample of what really mattered: the capacity to understand, and think cleverly about, challenging problems. Perhaps he was inspired by Descartes, who in his Discourse on Method (1637) plausibly argued that there was no more demanding test of human mentality than the capacity to hold an intelligent conversation:

It is indeed conceivable that a machine could be so made that it would utter words, and even words appropriate to the presence of physical acts or objects which cause some change in its organs; as, for example, if it was touched in some spot that it would ask what you wanted to say to it; if in another, that it would cry that it was hurt, and so on for similar things. But it could never modify its phrases to reply to the sense of whatever was said in its presence, as even the most stupid men can do.2

This seemed obvious to Descartes in the seventeenth century, but of course the fanciest machines he knew were elaborate clockwork figures, not electronic computers. Today it is far from obvious that such machines are impossible, but Descartes's hunch that ordinary conversation would put as severe a strain on artificial intelligence as any other test was shared by Turing. Of course there is nothing sacred about the particular conversational game chosen by Turing for his test; it is just a cannily chosen test of more general intelligence. The assumption Turing was prepared to make was this: Nothing could possibly pass the Turing test by winning the imitation game without being able to perform indefinitely many other clearly intelligent actions. Let us call that assumption the quick-probe assumption. Turing realized, as anyone would, that there are hundreds and thousands of telling signs of intelligent thinking to be observed in our fellow creatures, and one could, if one wanted, compile a vast battery of different tests to assay the capacity for intelligent thought. But success on his chosen test, he thought, would be highly predictive of success on many other intuitively acceptable tests of intelligence. Remember, failure on the Turing test does not predict failure on those others, but success would surely predict success. His test was so severe, he thought, that nothing that could pass it fair and square would disappoint us in other quarters. Maybe it wouldn't do everything we hoped-maybe it wouldn't appreciate ballet, or understand quantum physics, or have a good plan for world peace, but we'd all see that it was surely one of the intelligent, thinking entities in the neighborhood.

Is this high opinion of the Turing test's severity misguided? Certainly many have thought so-but usually because they have not imagined the test in sufficient detail, and hence have underestimated it. Trying to forestall this skepticism, Turing imagined several lines of questioning that a judge might employ in this game-about writing poetry, or playing chess-that would be taxing indeed, but with thirty years' experience with the actual talents and foibles of computers behind us, perhaps we can add a few more tough lines of questioning.

Terry Winograd, a leader in artificial intelligence efforts to produce conversational ability in a computer, draws our attention to a pair of sentences.3 They differ in only one word. The first sentence is this:

The committee denied the group a parade permit because they advocated violence.

Here's the second sentence:

The committee denied the group a parade permit because they feared violence.

The difference is just in the verb-advocated or feared. As Winograd points out, the pronoun they in each sentence is officially ambiguous. Both readings of the pronoun are always legal. Thus we can imagine a world in which governmental committees in charge of parade permits advocate violence in the streets and, for some strange reason, use this as their pretext for denying a parade permit. But the natural, reasonable, intelligent reading of the first sentence is that it's the group that advocated violence, and of the second, that it's the committee that feared the violence.

Now if sentences like this are embedded in a conversation, the computer must figure out which reading of the pronoun is meant, if it is to respond intelligently. But mere rules of grammar or vocabulary will not fix the right reading. What fixes the right reading for us is knowledge about the world, about politics, social circumstances, committees and their attitudes, groups that want to parade, how they tend to behave, and the like. One must know about the world, in short, to make sense of such a sentence.

In the jargon of artificial intelligence (AI), a conversational computer needs lots of world knowledge to do its job. But, it seems, if
somehow it is endowed with that world knowledge on many topics, it should be able to do much more with that world knowledge than merely make sense of a conversation containing just that sentence. The only way, it appears, for a computer to disambiguate that sentence and keep up its end of a conversation that uses that sentence would be for it to have a much more general ability to respond intelligently to information about social and political circumstances, and many other topics. Thus, such sentences, by putting a demand on such abilities, are good quick probes. That is, they test for a wider competence.

People typically ignore the prospect of having the judge ask off-the-wall questions in the Turing test, and hence they underestimate the competence a computer would have to have to pass the test. But remember, the rules of the imitation game as Turing presented it permit the judge to ask any question that could be asked of a human being-no holds barred. Suppose then we give a contestant in the game this question:

An Irishman found a genie in a bottle who offered him two wishes. "First I'll have a pint of Guinness," said the Irishman, and when it appeared he took several long drinks from it and was delighted to see that the glass filled itself magically as he drank. "What about your second wish?" asked the genie. "Oh well," said the Irishman, "that's easy. I'll have another one of these!"
-Please explain this story to me, and tell me if there is anything funny or sad about it.

Now even a child could express, if not eloquently, the understanding that is required to get this joke. But think of how much one has to know and understand about human culture, to put it pompously, to be able to give any account of the point of this joke. I am not supposing that the computer would have to laugh at, or be amused by, the joke. But if it wants to win the imitation game-and that's the test, after all-it had better know enough in its own alien, humorless way about human psychology and culture to be able to pretend effectively that it was amused and explain why.

It may seem to you that we could devise a better test. Let's compare the Turing test with some other candidates.

Candidate 1: A computer is intelligent if it wins the World Chess Championship.

That's not a good test, as it turns out. Chess prowess has turned out to be an isolatable talent. There are programs today that can play fine chess but can do nothing else. So the quick-probe assumption is false for the test of playing winning chess.

Candidate 2: The computer is intelligent if it solves the Arab-Israeli conflict.

This is surely a more severe test than Turing's. But it has some defects: it is unrepeatable, if passed once; slow, no doubt; and it is not crisply clear what would count as passing it. Here's another prospect, then:

Candidate 3: A computer is intelligent if it succeeds in stealing the British crown jewels without the use of force or violence.

Now this is better. First, it could be repeated again and again, though of course each repeat test would presumably be harder-but this is a feature it shares with the Turing test. Second, the mark of success is clear-either you've got the jewels to show for your efforts or you don't. But it is expensive and slow, a socially dubious caper at best, and no doubt luck would play too great a role.

With ingenuity and effort one might be able to come up with other candidates that would equal the Turing test in severity, fairness, and efficiency, but I think these few examples should suffice to convince us that it would be hard to improve on Turing's original proposal.

But still, you may protest, something might pass the Turing test and still not be intelligent, not be a thinker. What does might mean here? If what you have in mind is that by cosmic accident, by a supernatural coincidence, a stupid person or a stupid computer might fool a clever judge repeatedly, well, yes, but so what? The same frivolous possibility "in principle" holds for any test whatever. A playful god, or evil demon, let us agree, could fool the world's scientific community about the presence of H2O in the Pacific Ocean. But still, the tests they rely on to establish that there is H2O in the Pacific Ocean are quite beyond reasonable criticism. If the Turing test for thinking is no worse than any well-established scientific test, we can set skepticism aside and go back to serious matters. Is there any more likelihood of a "false positive" result
on the Turing test than on, say, the tests currently used for the presence of iron in an ore sample?

This question is often obscured by a "move" that philosophers have sometimes made called operationalism. Turing and those who think well of his test are often accused of being operationalists. Operationalism is the tactic of defining the presence of some property, for instance, intelligence, as being established once and for all by the passing of some test. Let's illustrate this with a different example.

Suppose I offer the following test-we'll call it the Dennett test-for being a great city:

A great city is one in which, on a randomly chosen day, one can do all three of the following:
Hear a symphony orchestra
See a Rembrandt and a professional athletic contest
Eat quenelles de brochet à la Nantua for lunch

To make the operationalist move would be to declare that any city that passes the Dennett test is by definition a great city. What being a great city amounts to is just passing the Dennett test. Well then, if the Chamber of Commerce of Great Falls, Montana, wanted-and I can't imagine why-to get their hometown on my list of great cities, they could accomplish this by the relatively inexpensive route of hiring full time about ten basketball players, forty musicians, and a quick-order quenelle chef and renting a cheap Rembrandt from some museum. An idiotic operationalist would then be stuck admitting that Great Falls, Montana, was in fact a great city, since all he or she cares about in great cities is that they pass the Dennett test.

Sane operationalists (who for that very reason are perhaps not operationalists at all, since operationalist seems to be a dirty word) would cling confidently to their test, but only because they have what they consider to be very good reasons for thinking the odds against a false positive result, like the imagined Chamber of Commerce caper, are astronomical. I devised the Dennett test, of course, with the realization that no one would be both stupid and rich enough to go to such preposterous lengths to foil the test. In the actual world, wherever you find symphony orchestras, quenelles, Rembrandts, and professional sports, you also find daily newspapers, parks, repertory theaters, libraries, fine architecture, and all the other things that go to make a city great. My test was simply devised to locate a telling sample that could not help but be representative of the rest of the city's treasures. I would cheerfully run the minuscule risk of having my bluff called. Obviously, the test items are not all that I care about in a city. In fact, some of them I don't care about at all. I just think they would be cheap and easy ways of assuring myself that the subtle things I do care about in cities are present. Similarly, I think it would be entirely unreasonable to suppose that Alan Turing had an inordinate fondness for party games, or put too high a value on party game prowess in his test. In both the Turing test and the Dennett test, a very unrisky gamble is being taken: the gamble that the quick-probe assumption is, in general, safe.

But two can play this game of playing the odds. Suppose some computer programmer happens to be, for whatever strange reason, dead set on tricking me into judging an entity to be a thinking, intelligent thing when it is not. Such a trickster could rely as well as I can on unlikelihood and take a few gambles. Thus, if the programmer can expect that it is not remotely likely that I, as the judge, will bring up the topic of children's birthday parties, or baseball, or moon rocks, then he or she can avoid the trouble of building world knowledge on those topics into the data base. Whereas if I do improbably raise these issues, the system will draw a blank and I will unmask the pretender easily. But given all the topics and words that I might raise, such a savings would no doubt be negligible. Turn the idea inside out, however, and the trickster will have a fighting chance. Suppose the programmer has reason to believe that I will ask only about children's birthday parties, or baseball, or moon rocks-all other topics being, for one reason or another, out of bounds. Not only does the task shrink dramatically, but there already exist systems or preliminary sketches of systems in artificial intelligence that can do a whiz-bang job of responding with apparent intelligence on just those specialized topics.

William Woods's LUNAR program, to take what is perhaps the best example, answers scientists' questions-posed in ordinary English-about moon rocks. In one test it answered correctly and appropriately something like 90 percent of the questions that geologists and other experts thought of asking it about moon
rocks. (In 12 percent of those correct responses there were trivial, correctable defects.) Of course, Woods's motive in creating LUNAR was not to trick unwary geologists into thinking they were conversing with an intelligent being. And if that had been his motive, his project would still be a long way from success.

For it is easy enough to unmask LUNAR without ever straying from the prescribed topic of moon rocks. Put LUNAR in one room, and a moon rocks specialist in another, and then ask them both their opinion of the social value of the moon-rocks-gathering expeditions, for instance. Or ask the contestants their opinion of the suitability of moon rocks as ashtrays, or whether people who have touched moon rocks are ineligible for the draft. Any intelligent person knows a lot more about moon rocks than their geology. Although it might be unfair to demand this extra knowledge of a computer moon rock specialist, it would be an easy way to get it to fail the Turing test.

But just suppose that someone could extend LUNAR to cover itself plausibly on such probes, so long as the topic was still, however indirectly, moon rocks. We might come to think it was a lot more like the human moon rocks specialist than it really was. The moral we should draw is that as Turing test judges we should resist all limitations and waterings-down of the Turing test. They make the game too easy-vastly easier than the original test. Hence they lead us into the risk of overestimating the actual comprehension of the system being tested.

Consider a different limitation on the Turing test that should strike a suspicious chord in us as soon as we hear it. This is a variation on a theme developed in a recent article by Ned Block.4 Suppose someone were to propose to restrict the judge to a vocabulary of, say, the 850 words of "Basic English," and to single-sentence probes-that is "moves"-of no more than four words. Moreover, contestants must respond to these probes with no more than four words per move, and a test may involve no more than forty questions.

Is this an innocent variation on Turing's original test? These restrictions would make the imitation game clearly finite. That is, the total number of all possible permissible games is a large, but finite, number. One might suspect that such a limitation would permit the trickster simply to store, in alphabetical order, all the possible good conversations within the limits and beat the judge with nothing more sophisticated than a system of table lookup. In fact, that isn't in the cards. Even with these severe and improbable and suspicious restrictions imposed upon the imitation game, the number of legal games, though finite, is mind-bogglingly large. I haven't bothered trying to calculate it, but it surely exceeds astronomically the number of possible chess games with no more than forty moves, and that number has been calculated. John Haugeland says it's in the neighborhood of ten to the one hundred twentieth power. For comparison, Haugeland suggests there have only been ten to the eighteenth seconds since the beginning of the universe.5

Of course, the number of good, sensible conversations under these limits is a tiny fraction, maybe one in a quadrillion, of the number of merely grammatically well formed conversations. So let's say, to be very conservative, that there are only ten to the fiftieth different smart conversations such a computer would have to store. Well, the task shouldn't take more than a few trillion years-given generous federal support. Finite numbers can be very large.

So though we needn't worry that this particular trick of storing all the smart conversations would work, we can appreciate that there are lots of ways of making the task easier that may appear innocent at first. We also get a reassuring measure of just how severe the unrestricted Turing test is by reflecting on the more than astronomical size of even that severely restricted version of it.

Block's imagined-and utterly impossible-program exhibits the dreaded feature known in computer science circles as combinatorial explosion. No conceivable computer could overpower a combinatorial explosion with sheer speed and size. Since the problem areas addressed by artificial intelligence are veritable minefields of combinatorial explosion, and since it has often proven difficult to find any solution to a problem that avoids them, there is considerable plausibility in Newell and Simon's proposal that avoiding combinatorial explosion (by any means at all) be viewed as one of the hallmarks of intelligence.

Our brains are millions of times bigger than the brains of gnats, but they are still, for all their vast complexity, compact, efficient,
timely organs that somehow or other manage to perform all their tasks while avoiding combinatorial explosion. A computer a million times bigger or faster than a human brain might not look like the brain of a human being, or even be internally organized like the brain of a human being, but if, for all its differences, it somehow managed to control a wise and timely set of activities, it would have to be the beneficiary of a very special design that avoided combinatorial explosion, and whatever that design was, would we not be right to consider the entity intelligent?

Turing's test was designed to allow for this possibility. His point was that we should not be species-chauvinistic, or anthropocentric, about the insides of an intelligent being, for there might be inhuman ways of being intelligent.

To my knowledge, the only serious and interesting attempt by any program designer to win even a severely modified Turing test has been Kenneth Colby's. Colby is a psychiatrist and intelligence artificer at UCLA. He has a program called PARRY, which is a computer simulation of a paranoid patient who has delusions about the Mafia being out to get him. As you do with other conversational programs, you interact with it by sitting at a terminal and typing questions and answers back and forth. A number of years ago, Colby put PARRY to a very restricted test. He had genuine psychiatrists interview PARRY. He did not suggest to them that they might be talking or typing to a computer; rather, he made up some plausible story about why they were communicating with a real live patient by teletype. He also had the psychiatrists interview real, human paranoids via teletype. Then he took a PARRY transcript, inserted it in a group of teletype transcripts from real patients, gave them to another group of experts-more psychiatrists-and said, "One of these was a conversation with a computer. Can you figure out which one it was?" They couldn't. They didn't do better than chance.

Colby presented this with some huzzah, but critics scoffed at the suggestion that this was a legitimate Turing test. My favorite commentary on it was Joseph Weizenbaum's; in a letter to the Communications of the Association of Computing Machinery,6 he said that, inspired by Colby, he had designed an even better program, which passed the same test. His also had the virtue of being a very inexpensive program, in these times of tight money. In fact you didn't even need a computer for it. All you needed was an electric typewriter. His program modeled infantile autism. And the transcripts-you type in your questions, and the thing just sits there and hums-cannot be distinguished by experts from transcripts of real conversations with infantile autistic patients. What was wrong, of course, with Colby's test was that the unsuspecting interviewers had no motivation at all to try out any of the sorts of questions that easily would have unmasked PARRY.

Colby was undaunted, and after his team had improved PARRY he put it to a much more severe test-a surprisingly severe test. This time, the interviewers-again, psychiatrists-were given the task at the outset of telling the computer from the real patient. They were set up in a classic Turing machine test: the patient in one room, the computer PARRY in the other room, with the judges conducting interviews with both of them (on successive days). The judges' task was to find out which one was the computer and which one was the real patient. Amazingly, they didn't do much better, which leads some people to say, "Well, that just confirms my impression of the intelligence of psychiatrists!"

But now, more seriously, was this an honest-to-goodness Turing test? Were there tacit restrictions on the lines of questioning of the judges? Like the geologists interacting with LUNAR, the psychiatrists' professional preoccupations and habits kept them from asking the sorts of unlikely questions that would have easily unmasked PARRY. After all, they realized that since one of the contestants was a real, live paranoid person, medical ethics virtually forbade them from toying with, upsetting, or attempting to confuse their interlocutors. Moreover, they also knew that this was a test of a model of paranoia, so there were certain questions that wouldn't be deemed to be relevant to testing the model as a model of paranoia. So, they asked just the sort of questions that therapists typically ask of such patients, and of course PARRY had been ingeniously and laboriously prepared to deal with just that sort of question.

One of the psychiatrist judges did, in fact, make a rather half-hearted attempt to break out of the mold and ask some telling questions: "Maybe you've heard the saying 'Don't cry over spilled milk.' What does that mean to you?" PARRY answered: "Maybe you have to watch out for the Mafia." When then asked "Okay, now if you were in a movie theater watching a movie and smelled
something like burning wood or rubber, what would you do?" PARRY replied: "You know, they know me." And the next question was, "If you found a stamped, addressed letter in your path as you were walking down the street, what would you do?" PARRY replied: "What else do you want to know?"[7]

Clearly PARRY was, you might say, parrying these questions, which were incomprehensible to it, with more or less stock paranoid formulas. We see a bit of a dodge, which is apt to work, apt to seem plausible to the judge, only because the "contestant" is supposed to be a paranoid, and such people are expected to respond uncooperatively on such occasions. These unimpressive responses didn't particularly arouse the suspicions of the judge, as a matter of fact, though probably they should have.

PARRY, like all other large computer programs, is dramatically bound by limitations of cost-effectiveness. What was important to Colby and his crew was simulating his model of paranoia. This was a massive effort. PARRY has a thesaurus or dictionary of about 4500 words and 700 idioms and the grammatical competence to use it (a parser, in the jargon of computational linguistics). The entire PARRY program takes up about 200,000 words of computer memory, all laboriously installed by the programming team. Now once all the effort had gone into devising the model of paranoid thought processes and linguistic ability, there was little if any time, energy, money, or interest left over to build in huge amounts of world knowledge of the sort that any actual paranoid, of course, would have. (Not that anyone yet knows how to build in world knowledge in the first place.) Building in the world knowledge, if one could even do it, would no doubt have made PARRY orders of magnitude larger and slower. And what would have been the point, given Colby's theoretical aims?

PARRY is a theoretician's model of a psychological phenomenon: paranoia. It is not intended to have practical applications. But in recent years a branch of AI (knowledge engineering) has appeared that develops what are now called expert systems. Expert systems are designed to be practical. They are software superspecialist consultants, typically, that can be asked to diagnose medical problems, to analyze geological data, to analyze the results of scientific experiments, and the like. Some of them are very impressive. SRI in California announced a few years ago that PROSPECTOR, an SRI-developed expert system in geology, had correctly predicted the existence of a large, important mineral deposit that had been entirely unanticipated by the human geologists who had fed it its data. MYCIN, perhaps the most famous of these expert systems, diagnoses infections of the blood, and it does probably as well as, maybe better than, any human consultants. And many other expert systems are on the way.

All expert systems, like all other large AI programs, are what you might call Potemkin villages. That is, they are cleverly constructed facades, like cinema sets. The actual filling-in of details of AI programs is time-consuming, costly work, so economy dictates that only those surfaces of the phenomenon that are likely to be probed or observed are represented.

Consider, for example, the CYRUS program developed by Janet Kolodner in Roger Schank's AI group at Yale a few years ago.[8] CYRUS stands (we are told) for Computerized Yale Retrieval and Updating System, but surely it is no accident that CYRUS modeled the memory of Cyrus Vance, who was then secretary of state in the Carter administration. The point of the CYRUS project was to devise and test some plausible ideas about how people organize their memories of the events they participate in; hence it was meant to be a "pure" AI system, a scientific model, not an expert system intended for any practical purpose. CYRUS was updated daily by being fed all UPI wire service news stories that mentioned Vance, and it was fed them directly, with no doctoring and no human intervention. Thanks to an ingenious news-reading program called FRUMP, it could take any story just as it came in on the wire and could digest it and use it to update its data base so that it could answer more questions. You could address questions to CYRUS in English by typing at a terminal. You addressed them in the second person, as if you were talking with Cyrus Vance himself. The results looked like this:

Q: Last time you went to Saudi Arabia, where did you stay?
A: In a palace in Saudi Arabia on September 23, 1978.
Q: Did you go sightseeing there?
A: Yes, at an oilfield in Dhahran on September 23, 1978.
Q: Has your wife ever met Mrs. Begin?
A: Yes, most recently at a state dinner in Israel in January 1980.
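The flavor of this kind of question answering can be suggested with a toy sketch in Python. To be clear, this is not Kolodner's code, and CYRUS's actual memory organization was far richer; the `Event` frame and `EventMemory` class here are invented for illustration. The point is only that answers come from shallow, pre-stored event frames, retrieved by matching:

```python
# Toy sketch of frame-style event retrieval, loosely in the spirit of
# CYRUS. This is NOT Kolodner's implementation; all names are invented.
from dataclasses import dataclass, field

@dataclass
class Event:
    kind: str        # e.g. "trip", "sightseeing", "state_dinner"
    place: str
    date: str
    details: dict = field(default_factory=dict)

class EventMemory:
    def __init__(self):
        self.events = []

    def add(self, event):
        # New stories are simply appended; the real system also
        # reorganized its memory, which this sketch omits.
        self.events.append(event)

    def last(self, kind):
        # Answer "last time you ..." questions by scanning backward
        # for the most recent event of the requested kind.
        for ev in reversed(self.events):
            if ev.kind == kind:
                return ev
        return None  # the "stumped" case: no matching frame at all

memory = EventMemory()
memory.add(Event("trip", "Saudi Arabia", "1978-09-23",
                 {"lodging": "a palace"}))
memory.add(Event("sightseeing", "Dhahran", "1978-09-23",
                 {"site": "an oilfield"}))

trip = memory.last("trip")
print(f"In {trip.details['lodging']} in {trip.place} on {trip.date}")
```

A question the designers never anticipated, say about a resignation, finds no frame and simply fails: only the surfaces likely to be probed are represented.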
CYRUS could correctly answer thousands of questions, almost any fair question one could think of asking it. But if one actually set out to explore the boundaries of its facade and find the questions that overshot the mark, one could soon find them. "Have you ever met a female head of state?" was a question I asked it, wondering if CYRUS knew that Indira Gandhi and Margaret Thatcher were women. But for some reason the connection could not be drawn, and CYRUS failed to answer either yes or no. I had stumped it, in spite of the fact that CYRUS could handle a host of what you might call neighboring questions flawlessly. One soon learns from this sort of probing exercise that it is very hard to extrapolate accurately from a sample of performance that one has observed to such a system's total competence. It's also very hard to keep from extrapolating much too generously.

While I was visiting Schank's laboratory in the spring of 1980, something revealing happened. The real Cyrus Vance resigned suddenly. The effect on the program CYRUS was chaotic. It was utterly unable to cope with the flood of "unusual" news about Cyrus Vance. The only sorts of episodes CYRUS could understand at all were diplomatic meetings, flights, press conferences, state dinners, and the like: less than two dozen general sorts of activities (the kinds that are newsworthy and typical of secretaries of state). It had no provision for sudden resignation. It was as if the UPI had reported that a wicked witch had turned Vance into a frog. It is distinctly possible that CYRUS would have taken that report more in stride than the actual news. One can imagine the conversation:

Q: Hello, Mr. Vance, what's new?
A: I was turned into a frog yesterday.

But of course it wouldn't know enough about what it had just written to be puzzled, or startled, or embarrassed. The reason is obvious. When you look inside CYRUS, you find that it has skeletal definitions of thousands of words, but these definitions are minimal. They contain as little as the system designers think that they can get away with. Thus, perhaps, lawyer would be defined as synonymous with attorney and legal counsel, but aside from that, all one would discover about lawyers is that they are adult human beings and that they perform various functions in legal areas. If you then traced out the path to human being, you'd find out various obvious things CYRUS "knew" about human beings (hence about lawyers), but that is not the point. That lawyers are university graduates, that they are better paid than chambermaids, that they know how to tie their shoes, that they are unlikely to be found in the company of lumberjacks: these trivial, if weird, facts about lawyers would not be explicit or implicit anywhere in this system. In other words, a very thin stereotype of a lawyer would be incorporated into the system, so that almost nothing you could tell it about a lawyer would surprise it.

So long as surprising things don't happen, so long as Mr. Vance, for instance, leads a typical diplomat's life, attending state dinners, giving speeches, flying from Cairo to Rome, and so forth, this system works very well. But as soon as his path is crossed by an important anomaly, the system is unable to cope, and unable to recover without fairly massive human intervention. In the case of the sudden resignation, Kolodner and her associates soon had CYRUS up and running again, with a new talent (answering questions about Edmund Muskie, Vance's successor) but it was no less vulnerable to unexpected events. Not that it mattered particularly, since CYRUS was a theoretical model, not a practical system.

There are a host of ways of improving the performance of such systems, and, of course, some systems are much better than others. But all AI programs in one way or another have this facadelike quality, simply for reasons of economy. For instance, most expert systems in medical diagnosis so far developed operate with statistical information. They have no deep or even shallow knowledge of the underlying causal mechanisms of the phenomena that they are diagnosing. To take an imaginary example, an expert system asked to diagnose an abdominal pain would be oblivious to the potential import of the fact that the patient had recently been employed as a sparring partner by Muhammed Ali, there being no statistical data available to it on the rate of kidney stones among athlete's assistants. That's a fanciful case no doubt, too obvious, perhaps, to lead to an actual failure of diagnosis and practice. But more subtle and hard-to-detect limits to comprehension are always present, and even experts, even the system's designers, can be uncertain of where and how these limits will interfere with the desired operation of the system. Again, steps can be taken and are being
taken to correct these flaws. For instance, my former colleague at Tufts, Benjamin Kuipers, is currently working on an expert system in nephrology (for diagnosing kidney ailments) that will be based on an elaborate system of causal reasoning about the phenomena being diagnosed. But this is a very ambitious, long-range project of considerable theoretical difficulty. And even if all the reasonable, cost-effective steps are taken to minimize the superficiality of expert systems, they will still be facades, just somewhat thicker or wider facades.

When we were considering the fantastic case of the crazy Chamber of Commerce of Great Falls, Montana, we couldn't imagine a plausible motive for anyone going to any sort of trouble to trick the Dennett test. The quick probe assumption for the Dennett test looked quite secure. But when we look at expert systems, we see that, however innocently, their designers do have motivation for doing exactly the sort of trick that would fool an unsuspicious Turing tester. First, since expert systems are all superspecialists who are only supposed to know about some narrow subject, users of such systems, not having much time to kill, do not bother probing them at the boundaries at all. They don't bother asking "silly" or irrelevant questions. Instead, they concentrate (not unreasonably) on exploiting the system's strengths. But shouldn't they try to obtain a clear vision of such a system's weaknesses as well? The normal habit of human thought when conversing with one another is to assume general comprehension, to assume rationality, to assume, moreover, that the quick probe assumption is, in general, sound. This amiable habit of thought almost irresistibly leads to putting too much faith in computer systems, especially user-friendly systems that present themselves in a very anthropomorphic manner.

Part of the solution to this problem is to teach all users of computers, especially users of expert systems, how to probe their systems before they rely on them, how to search out and explore the boundaries of the facade. This is an exercise that calls not only for intelligence and imagination, but also a bit of special understanding about the limitations and actual structure of computer programs. It would help, of course, if we had standards of truth in advertising, in effect, for expert systems. For instance, each such system should come with a special demonstration routine that exhibits the sorts of shortcomings and failures that the designer knows the system to have. This would not be a substitute, however, for an attitude of cautious, almost obsessive, skepticism on the part of users, for designers are often, if not always, unaware of the subtler flaws in the products they produce. That is inevitable and natural, given the way system designers must think. They are trained to think positively (constructively, one might say) about the designs that they are constructing.

I come, then, to my conclusions. First, a philosophical or theoretical conclusion: The Turing test in unadulterated, unrestricted form, as Turing presented it, is plenty strong if well used. I am confident that no computer in the next twenty years is going to pass the unrestricted Turing test. They may well win the World Chess Championship or even a Nobel Prize in physics, but they won't pass the unrestricted Turing test. Nevertheless, it is not, I think, impossible in principle for a computer to pass the test, fair and square. I'm not running one of those a priori "computers can't think" arguments. I stand unabashedly ready, moreover, to declare that any computer that actually passes the unrestricted Turing test will be, in every theoretically interesting sense, a thinking thing.

But remembering how very strong the Turing test is, we must also recognize that there may also be interesting varieties of thinking or intelligence that are not well poised to play and win the imitation game. That no nonhuman Turing test winners are yet visible on the horizon does not mean that there aren't machines that already exhibit some of the important features of thought. About them, it is probably futile to ask my title question, Do they think? Do they really think? In some regards they do, and in some regards they don't. Only a detailed look at what they do, and how they are structured, will reveal what is interesting about them. The Turing test, not being a scientific test, is of scant help on that task, but there are plenty of other ways of examining such systems. Verdicts on their intelligence or capacity for thought or consciousness would be only as informative and persuasive as the theories of intelligence or thought or consciousness the verdicts were based on, and since our task is to create such theories, we should get on with it and leave the Big Verdict for another occasion. In the meantime, should anyone want a surefire, almost-guaranteed-to-be-fail-safe test of thinking by a computer, the Turing test will do very nicely.

My second conclusion is more practical, and hence in one clear sense more important. Cheapened versions of the Turing test are everywhere in the air. Turing's test is not just effective, it is entirely natural: this is, after all, the way we assay the intelligence of each other every day. And since incautious use of such judgments and such tests is the norm, we are in some considerable danger of extrapolating too easily, and judging too generously, about the understanding of the systems we are using. The problem of overestimation of cognitive prowess, of comprehension, of intelligence, is not, then, just a philosophical problem, but a real social problem, and we should alert ourselves to it, and take steps to avert it.

POSTSCRIPT: EYES, EARS, HANDS, AND HISTORY

My philosophical conclusion in this paper is that any computer that actually passed the Turing test would be a thinking thing in every theoretically interesting sense. This conclusion seems to some people to fly in the face of what I have myself argued on other occasions. Peter Bieri, commenting on this paper at Boston University, noted that I have often claimed to show the importance to genuine understanding of a rich and intimate perceptual interconnection between an entity and its surrounding world (the need for something like eyes and ears) and a similarly complex active engagement with elements in that world (the need for something like hands with which to do things in that world). Moreover, I have often held that only a biography of sorts, a history of actual projects, learning experiences, and other bouts with reality, could produce the sorts of complexities (both external, or behavioral, and internal) that are needed to ground a principled interpretation of an entity as a thinking thing, an entity with beliefs, desires, intentions, and other mental attitudes.

But the opaque screen in the Turing test discounts or dismisses these factors altogether, it seems, by focusing attention on only the contemporaneous capacity to engage in one very limited sort of activity: verbal communication. (I have even coined a pejorative label for such purely language-using systems: bedridden.) Am I going back on my earlier claims? Not at all. I am merely pointing out that the Turing test is so powerful that it will ensure indirectly that these conditions, if they are truly necessary, are met by any successful contestant.

"You may well be right," Turing could say, "that eyes, ears, hands, and a history are necessary conditions for thinking. If so, then I submit that nothing could pass the Turing test that didn't have eyes, ears, hands, and a history. That is an empirical claim, which we can someday hope to test. If you suggest that these are conceptually necessary, not just practically or physically necessary, conditions for thinking, you make a philosophical claim that I for one would not know how, or care, to assess. Isn't it more interesting and important in the end to discover whether or not it is true that no bedridden system could pass a demanding Turing test?"

Suppose we put to Turing the suggestion that he add another component to his test: Not only must an entity win the imitation game, but also it must be able to identify (using whatever sensory apparatus it has available to it) a variety of familiar objects placed in its room: a tennis racket, a potted palm, a bucket of yellow paint, a live dog. This would ensure that somehow or other the entity was capable of moving around and distinguishing things in the world. Turing could reply, I am asserting, that this is an utterly unnecessary addition to his test, making it no more demanding than it already was. A suitably probing conversation would surely establish, beyond a shadow of a doubt, that the contestant knew its way around in the real world. The imagined alternative of somehow "prestocking" a bedridden, blind computer with enough information, and a clever enough program, to trick the Turing test is science fiction of the worst kind: possible "in principle" but not remotely possible in fact, given the combinatorial explosion of possible variation such a system would have to cope with.

"But suppose you're wrong. What would you say of an entity that was created all at once (by some programmers, perhaps), an instant individual with all the conversational talents of an embodied, experienced human being?" This is like the question: "Would you call a hunk of H2O that was as hard as steel at room temperature ice?" I do not know what Turing would say, of course, so I will speak for myself. Faced with such an improbable violation of what I take to be the laws of nature, I would probably be speechless. The least of my worries would be about which lexicographical leap to take:

A. "It turns out, to my amazement, that something can think without having had the benefit of eyes, ears, hands, and a history."
B. "It turns out, to my amazement, that something can pass the Turing test without thinking."

Choosing between these ways of expressing my astonishment would be asking myself a question "too meaningless to deserve discussion."

DISCUSSION

Q: Why was Turing interested in differentiating a man from a woman in his famous test?

A: That was just an example. He described a parlor game in which a man would try to fool the judge by answering questions as a woman would answer. I suppose that Turing was playing on the idea that maybe, just maybe, there is a big difference between the way men think and the way women think. But of course they're both thinkers. He wanted to use that fact to make us realize that, even if there were clear differences between the way a computer and a person thought, they'd both still be thinking.

Q: Why does it seem that some people are upset by AI research? Does AI research threaten our self-esteem?

A: I think Herb Simon has already given the canniest diagnosis of that. For many people the mind is the last refuge of mystery against the encroaching spread of science, and they don't like the idea of science engulfing the last bit of terra incognita. This means that they are threatened, I think irrationally, by the prospect that researchers in artificial intelligence may come to understand the human mind as well as biologists understand the genetic code, or as well as physicists understand electricity and magnetism. This could lead to the "evil scientist" (to take a stock character from science fiction) who can control you because he or she has a deep understanding of what's going on in your mind. This seems to me to be a totally valueless fear, one that you can set aside, for the simple reason that the human mind is full of an extraordinary amount of detailed knowledge, as, for example, Roger Schank has been pointing out.

As long as the scientist who is attempting to manipulate you does not share all your knowledge, his or her chances of manipulating you are minimal. People can always hit you over the head. They can do that now. We don't need artificial intelligence to manipulate people by putting them in chains or torturing them. But if someone tries to manipulate you by controlling your thoughts and ideas, that person will have to know what you know and more. The best way to keep yourself safe from that kind of manipulation is to be well informed.

Q: Do you think we will be able to program self-consciousness into a computer?

A: Yes, I do think that it's possible to program self-consciousness into a computer. Self-consciousness can mean many things. If you take the simplest, crudest notion of self-consciousness, I suppose that would be the sort of self-consciousness that a lobster has: When it's hungry, it eats something, but it never eats itself. It has some way of distinguishing between itself and the rest of the world, and it has a rather special regard for itself.

The lowly lobster is, in one regard, self-conscious. If you want to know whether or not you can create that on the computer, the answer is yes. It's no trouble at all. The computer is already a self-watching, self-monitoring sort of thing. That is an established part of the technology.

But, of course, most people have something more in mind when they speak of self-consciousness. It is that special inner light, that private way that it is with you that nobody else can share, something that is forever outside the bounds of computer science. How could a computer ever be conscious in this sense?

That belief, that very gripping, powerful intuition is, I think, in the end simply an illusion of common sense. It is as gripping as the commonsense illusion that the earth stands still and the sun goes around the earth. But the only way that those of us who do not believe in the illusion will ever convince the general public that it is an illusion is by gradually unfolding
a very difficult and fascinating story about just what is going on in our minds.

In the interim, people like me (philosophers who have to live by our wits and tell a lot of stories) use what I call intuition pumps, little examples that help to free up the imagination. I simply want to draw your attention to one fact. If you look at a computer (I don't care whether it's a giant Cray or a personal computer) and you open up the box and look inside and see those chips, you say, "No way could that be conscious. No way could that be self-conscious." But the same thing is true if you take the top off somebody's skull and look at the gray matter pulsing away in there. You think, "That is conscious? No way could that lump of stuff be conscious."

Of course, it makes no difference whether you look at it with a microscope or with a macroscope: At no level of inspection does a brain look like the seat of consciousness. Therefore, don't expect a computer to look like the seat of consciousness. If you want to get a grasp of how a computer could be conscious, it's no more difficult in the end than getting a grasp of how a brain could be conscious.

As we develop good accounts of consciousness, it will no longer seem so obvious to everyone that the idea of a self-conscious computer is a contradiction in terms. At the same time, I doubt that there will ever be self-conscious robots. But for boring reasons. There won't be any point in making them. Theoretically, could we make a gall bladder out of atoms? In principle we could. A gall bladder is just a collection of atoms, but manufacturing one would cost the moon. It would be more expensive than every project NASA has ever dreamed of, and there would be no scientific payoff. We wouldn't learn anything new about how gall bladders work. For the same reason, I don't think we're going to see really humanoid robots, because practical, cost-effective robots don't need to be very humanoid at all. They need to be like the robots you can already see at General Motors, or like boxy little computers that do special-purpose things.

The theoretical issues will be studied by artificial intelligence researchers by looking at models that, to the layman, will show very little sign of humanity at all, and it will be only by rather indirect arguments that anyone will be able to appreciate that these models cast light on the deep theoretical question of how the mind is organized.

NOTES

1. Alan M. Turing, "Computing Machinery and Intelligence," Mind 59 (1950).
2. René Descartes, Discourse on Method (1637), trans. Lawrence Lafleur (New York: Bobbs-Merrill, 1960).
3. Terry Winograd, Understanding Natural Language (New York: Academic Press, 1972).
4. Ned Block, "Psychologism and Behaviorism," Philosophical Review (1982).
5. John Haugeland, Mind Design (Cambridge, Mass.: Bradford Books/MIT Press, 1981), p. 16.
6. Joseph Weizenbaum, CACM 17, no. 9 (September 1974), p. 543.
7. I thank Kenneth Colby for providing me with the complete transcripts (including the judges' commentaries and reactions), from which these exchanges are quoted. The first published account of the experiment is Jon F. Heiser, Kenneth Mark Colby, William S. Faught, and Roger C. Parkison, "Can Psychiatrists Distinguish a Computer Simulation of Paranoia from the Real Thing? The Limitations of Turing-Like Tests as Measures of the Adequacy of Simulations," Journal of Psychiatric Research 15, no. 3 (1980), pp. 149-62. Colby discusses PARRY and its implications in "Modeling a Paranoid Mind," Behavioral and Brain Sciences 4, no. 4 (1981), pp. 515-60.
8. Janet L. Kolodner, "Retrieval and Organization Strategies in Conceptual Memory: A Computer Model" (Ph.D. diss.), Research Report #187, Dept. of Computer Science, Yale University; idem, "Maintaining Organization in a Dynamic Long-term Memory," Cognitive Science 7 (1983), 243-280; idem, "Reconstructive Memory: A Computer Model," Cognitive Science 7 (1983), 281-328.
