VOL. LIX. No. 236.]    [October, 1950

MIND
A QUARTERLY REVIEW
OF PSYCHOLOGY AND PHILOSOPHY

COMPUTING MACHINERY AND INTELLIGENCE

By A. M. TURING
1. The Imitation Game.

I propose to consider the question, 'Can machines think?' This should begin with definitions of the meaning of the terms 'machine' and 'think'. The definitions might be framed so as to reflect so far as possible the normal use of the words, but this attitude is dangerous. If the meaning of the words 'machine' and 'think' are to be found by examining how they are commonly used it is difficult to escape the conclusion that the meaning and the answer to the question, 'Can machines think?' is to be sought in a statistical survey such as a Gallup poll. But this is absurd. Instead of attempting such a definition I shall replace the question by another, which is closely related to it and is expressed in relatively unambiguous words.
The new form of the problem can be described in terms of a game which we call the 'imitation game'. It is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either 'X is A and Y is B' or 'X is B and Y is A'. The interrogator is allowed to put questions to A and B thus:

C: Will X please tell me the length of his or her hair?

Now suppose X is actually A, then A must answer. It is A's object in the game to try and cause C to make the wrong identification.
434 A. M. TURING:
2. Critique of the New Problem.

As well as asking, 'What is the answer to this new form of the question', one may ask, 'Is this new question a worthy one to investigate?' This latter question we investigate without further ado, thereby cutting short an infinite regress.
The new problem has the advantage of drawing a fairly sharp line between the physical and the intellectual capacities of a man. No engineer or chemist claims to be able to produce a material which is indistinguishable from the human skin. It is possible that at some time this might be done, but even supposing this invention available we should feel there was little point in trying to make a 'thinking machine' more human by dressing it up in such artificial flesh. The form in which we have set the problem reflects this fact in the condition which prevents the interrogator from seeing or touching the other competitors, or hearing their voices. Some other advantages of the proposed criterion may be shown up by specimen questions and answers. Thus:
Q: Please write me a sonnet on the subject of the Forth Bridge.
A: Count me out on this one. I never could write poetry.
Q: Add 34957 to 70764.
A: (Pause about 30 seconds and then give as answer) 105621.
Q: Do you play chess?
A: Yes.
COMPUTING MACHINERY AND INTELLIGENCE 435
Q: I have K at my K1, and no other pieces. You have only K at K6 and R at R1. It is your move. What do you play?
A: (After a pause of 15 seconds) R-R8 mate.
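Both specimen answers can be checked mechanically. The sketch below (my illustration, not part of the original) reads the descriptive notation from the machine's side of the board: own K on e6, own R on a1 (taking R1 as QR1; reading it as KR1 gives the mirror position), opponent's lone K on e8, so that R-R8 places the rook on a8. It also checks the sum: the specimen's 105621 differs from the true total by exactly 100, just the sort of slip a human calculator might make.

```python
# The addition: the reply 105621 is off by exactly 100 from the true sum.
total = 34957 + 70764
print(total, total - 105621)            # 105721, 100

# The chess problem after R-R8. Squares as (file, rank), both 0-7:
WK, WR, BK = (4, 5), (0, 7), (4, 7)     # own K e6, own R a8, opposing K e8

def attacked(sq):
    """True if the K+R side attacks sq. The lone defending king is ignored
    as a blocker, since it is the piece trying to move away."""
    if sq != WK and max(abs(sq[0] - WK[0]), abs(sq[1] - WK[1])) == 1:
        return True                     # adjacent to the attacking king
    if sq != WR and (sq[0] == WR[0] or sq[1] == WR[1]):
        # rook line; blocked only if the attacking king stands strictly between
        if WK[0] == WR[0] == sq[0] and min(WR[1], sq[1]) < WK[1] < max(WR[1], sq[1]):
            return False
        if WK[1] == WR[1] == sq[1] and min(WR[0], sq[0]) < WK[0] < max(WR[0], sq[0]):
            return False
        return True
    return False

in_check = attacked(BK)
escapes = [(BK[0] + dx, BK[1] + dy)
           for dx in (-1, 0, 1) for dy in (-1, 0, 1)
           if (dx, dy) != (0, 0)
           and 0 <= BK[0] + dx < 8 and 0 <= BK[1] + dy < 8
           and not attacked((BK[0] + dx, BK[1] + dy))]
print(in_check, escapes)                # True []  -> R-R8 is indeed mate
```

The lone king is in check along the eighth rank and every adjacent square is covered by the rook or the attacking king, so the claimed mate is sound.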
The question and answer method seems to be suitable for introducing almost any one of the fields of human endeavour that we wish to include. We do not wish to penalise the machine for its inability to shine in beauty competitions, nor to penalise a man for losing in a race against an aeroplane. The conditions of our game make these disabilities irrelevant. The 'witnesses' can brag, if they consider it advisable, as much as they please about their charms, strength or heroism, but the interrogator cannot demand practical demonstrations.
The game may perhaps be criticised on the ground that the odds are weighted too heavily against the machine. If the man were to try and pretend to be the machine he would clearly make a very poor showing. He would be given away at once by slowness and inaccuracy in arithmetic. May not machines carry out something which ought to be described as thinking but which is very different from what a man does? This objection is a very strong one, but at least we can say that if, nevertheless, a machine can be constructed to play the imitation game satisfactorily, we need not be troubled by this objection.

It might be urged that when playing the 'imitation game' the best strategy for the machine may possibly be something other than imitation of the behaviour of a man. This may be, but I think it is unlikely that there is any great effect of this kind. In any case there is no intention to investigate here the theory of the game, and it will be assumed that the best strategy is to try to provide answers that would naturally be given by a man.
4. Digital Computers.

The idea behind digital computers may be explained by saying that these machines are intended to carry out any operations which could be done by a human computer. The human computer is supposed to be following fixed rules; he has no authority to deviate from them in any detail. We may suppose that these rules are supplied in a book, which is altered whenever he is put on to a new job. He has also an unlimited supply of paper on which he does his calculations. He may also do his multiplications and additions on a 'desk machine', but this is not important.

If we use the above explanation as a definition we shall be in danger of circularity of argument. We avoid this by giving an outline of the means by which the desired effect is achieved. A digital computer can usually be regarded as consisting of three parts:

(i) Store.
(ii) Executive unit.
(iii) Control.
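The division of labour between the three parts can be illustrated by a toy interpreter (my sketch, not a description of any machine of the period): the store is a list of numbered cells holding both instructions and data, the control decides which instruction is obeyed next, and the executive unit carries out each individual operation.

```python
# A toy three-part machine (illustrative sketch only).
def run(store):
    pc = 0                              # control: position of the next instruction
    while True:
        op = store[pc]
        if op[0] == "halt":
            return store
        if op[0] == "add":              # executive unit: one elementary operation
            _, a, b, dst = op
            store[dst] = store[a] + store[b]
            pc += 1
        elif op[0] == "jump":           # control: take the next instruction elsewhere
            pc = op[1]

# cells 0-1 hold instructions; cells 2-4 hold data
memory = [("add", 2, 3, 4), ("halt",), 2, 3, 0]
print(run(memory)[4])                   # 5
```

Keeping instructions and data in the same store is the essential point: the book of rules followed by the human computer becomes just more contents of the store.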
5. Universality of Digital Computers.

The digital computers considered in the last section may be classified amongst the 'discrete state machines'. These are the machines which move by sudden jumps or clicks from one quite definite state to another. These states are sufficiently different for the possibility of confusion between them to be ignored. Strictly speaking there are no such machines. Everything really moves continuously. But there are many kinds of machine which can profitably be thought of as being discrete state machines. For instance in considering the switches for a lighting system it is a convenient fiction that each switch must be definitely on or definitely off. There must be intermediate positions, but for most purposes we can forget about them. As an example of a discrete state machine we might consider a wheel which clicks round through 120° once a second, but may be stopped by a lever which can be operated from outside; in addition a lamp is to light in one of the positions of the wheel.
The machine may be described by the following table, giving the next state for each combination of present state and input, and the output for each state:

              Last State
              q1    q2    q3
Input   i0    q2    q3    q1
        i1    q1    q2    q3

State    q1    q2    q3
Output   o0    o0    o1
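The three-state machine tabulated above transcribes directly into a table-driven program (a sketch; the state and signal names follow the table, with i0 letting the wheel click round one position, i1 holding it still, and o1 the lit lamp in state q3):

```python
# The wheel as a discrete state machine.
NEXT = {
    ("q1", "i0"): "q2", ("q2", "i0"): "q3", ("q3", "i0"): "q1",
    ("q1", "i1"): "q1", ("q2", "i1"): "q2", ("q3", "i1"): "q3",
}
LAMP = {"q1": "o0", "q2": "o0", "q3": "o1"}

def run(state, inputs):
    """Return the final state and the output observed after each input signal."""
    outputs = []
    for signal in inputs:
        state = NEXT[(state, signal)]
        outputs.append(LAMP[state])
    return state, outputs

print(run("q1", ["i0", "i0", "i1", "i0"]))   # ('q1', ['o0', 'o1', 'o1', 'o0'])
```

Given the table and the initial state, every future output is determined; this is exactly what makes prediction of such machines possible.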
Be kind, resourceful, beautiful, friendly (p. 448), have initiative, have a sense of humour, tell right from wrong, make mistakes (p. 448), fall in love, enjoy strawberries and cream (p. 448), make some one fall in love with it, learn from experience (pp. 456 f.), use words properly, be the subject of its own thought (p. 449), have as much diversity of behaviour as a man, do something really new (p. 450). (Some of these disabilities are given special consideration as indicated by the page numbers.)
7. Learning Machines.

The reader will have anticipated that I have no very convincing arguments of a positive nature to support my views. If I had I should not have taken such pains to point out the fallacies in contrary views. Such evidence as I have I shall now give.

Let us return for a moment to Lady Lovelace's objection, which stated that the machine can only do what we tell it to do. One could say that a man can 'inject' an idea into the machine, and that it will respond to a certain extent and then drop into quiescence, like a piano string struck by a hammer. Another simile would be an atomic pile of less than critical size: an injected idea is to correspond to a neutron entering the pile from without. Each such neutron will cause a certain disturbance which eventually dies away. If, however, the size of the pile is sufficiently increased, the disturbance caused by such an incoming neutron will very likely go on and on increasing until the whole pile is destroyed. Is there a corresponding phenomenon for minds, and is there one for machines? There does seem to be one for the human mind. The majority of them seem to be 'sub-critical', i.e. to correspond in this analogy to piles of sub-critical size. An idea presented to such a mind will on average give rise to less than one idea in reply. A smallish proportion are super-critical. An idea presented to such a mind may give rise to a whole 'theory' consisting of secondary, tertiary and more remote ideas. Animals' minds seem to be very definitely sub-critical. Adhering to this analogy we ask, 'Can a machine be made to be super-critical?'
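The pile analogy can be made quantitative as a branching process (my illustration, not in the original): each idea provokes a random number of follow-up ideas with mean m; for m < 1 the cascade almost surely dies out, while for m > 1 it has a positive chance of growing without limit.

```python
import random

def cascade(children, p, rng, cap=10_000):
    """Total ideas produced when each idea spawns `children` follow-ups with
    probability p (so the mean is m = children * p), stopping at `cap`."""
    alive, total = 1, 0
    while alive and total < cap:
        total += alive
        alive = sum(children for _ in range(alive) if rng.random() < p)
    return total

rng = random.Random(0)
sub = [cascade(1, 0.5, rng) for _ in range(200)]      # m = 0.5, sub-critical
sup = [cascade(2, 0.75, rng) for _ in range(200)]     # m = 1.5, super-critical
print(sum(sub) / len(sub))           # in theory the mean total is 1/(1 - m) = 2
print(sum(t >= 10_000 for t in sup)) # a sizeable fraction of runs hit the cap
```

The sub-critical runs give a handful of ideas and stop; a large share of the super-critical runs explode until the cap intervenes, mirroring the pile that is destroyed.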
The 'skin of an onion' analogy is also helpful. In considering the functions of the mind or the brain we find certain operations which we can explain in purely mechanical terms. This we say does not correspond to the real mind: it is a sort of skin which we must strip off if we are to find the real mind. But then in what remains we find a further skin to be stripped off, and so on. Proceeding in this way do we ever come to the 'real' mind, or do we eventually come to the skin which has nothing in it? In the latter case the whole mind is mechanical. (It would not be a discrete-state machine however. We have discussed this.)

These last two paragraphs do not claim to be convincing arguments. They should rather be described as 'recitations tending to produce belief'.
The only really satisfactory support that can be given for the view expressed at the beginning of § 6 will be that provided by waiting for the end of the century and then doing the experiment described. But what can we say in the meantime? What steps should be taken now if the experiment is to be successful?

As I have explained, the problem is mainly one of programming. Advances in engineering will have to be made too, but it seems unlikely that these will not be adequate for the requirements. Estimates of the storage capacity of the brain vary from 10^10 to 10^15 binary digits. I incline to the lower values and believe that only a very small fraction is used for the higher types of thinking. Most of it is probably used for the retention of visual impressions. I should be surprised if more than 10^9 was required for satisfactory playing of the imitation game, at any rate against a blind man. (Note: The capacity of the Encyclopaedia Britannica, 11th edition, is 2 × 10^9.) A storage capacity of 10^7 would be a very practicable possibility even by present techniques. It is probably not necessary to increase the speed of operations of the machines at all. Parts of modern machines which can be regarded as analogues of nerve cells work about a thousand times faster than the latter. This should provide a 'margin of safety' which could cover losses of speed arising in many ways. Our problem then is to find out how to programme these machines to play the game. At my present rate of working I produce about a thousand digits of programme a day, so that about sixty workers, working steadily through the fifty years might accomplish the job, if nothing went into the waste-paper basket. Some more expeditious method seems desirable.
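The staffing estimate is easily checked (my arithmetic; the figure of roughly 330 working days a year is an assumption, since the text gives only workers, years, and the daily rate):

```python
# Sixty workers, fifty years, a thousand digits of programme per worker per day.
digits = 60 * 50 * 330 * 1000
print(f"{digits:.1e}")   # 9.9e+08, close to the 10^9 storage estimate above
```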
In the process of trying to imitate an adult human mind we are bound to think a good deal about the process which has brought it to the state that it is in. We may notice three components,

(a) The initial state of the mind, say at birth,
(b) The education to which it has been subjected,
(c) Other experience, not to be described as education, to which it has been subjected.

One may hope, however, that this process will be more expeditious than evolution. The survival of the fittest is a slow method for measuring advantages. The experimenter, by the exercise of intelligence, should be able to speed it up. Equally important is the fact that he is not restricted to random mutations. If he can trace a cause for some weakness he can probably think of the kind of mutation which will improve it.
It will not be possible to apply exactly the same teaching process to the machine as to a normal child. It will not, for instance, be provided with legs, so that it could not be asked to go out and fill the coal scuttle. Possibly it might not have eyes. But however well these deficiencies might be overcome by clever engineering, one could not send the creature to school without the other children making excessive fun of it. It must be given some tuition. We need not be too concerned about the legs, eyes, etc. The example of Miss Helen Keller shows that education can take place provided that communication in both directions between teacher and pupil can take place by some means or other.
We normally associate punishments and rewards with the teaching process. Some simple child-machines can be constructed or programmed on this sort of principle. The machine has to be so constructed that events which shortly preceded the occurrence of a punishment-signal are unlikely to be repeated, whereas a reward-signal increases the probability of repetition of the events which led up to it. These definitions do not presuppose any feelings on the part of the machine. I have done some experiments with one such child-machine, and succeeded in teaching it a few things, but the teaching method was too unorthodox for the experiment to be considered really successful.
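A child-machine of the kind just described can be sketched in a few lines (my illustration, not a reconstruction of Turing's actual experiment): a punishment-signal halves the weight of the response just given, a reward-signal doubles it, so rewarded behaviour becomes more probable without presupposing any feelings on the machine's part.

```python
import random

class ChildMachine:
    """Responses are chosen by weight; reward and punishment signals reshape
    the weight of whatever was last done (illustrative sketch)."""
    def __init__(self, responses):
        self.responses = responses
        self.weights = {}                 # (situation, response) -> weight
        self.last = None

    def act(self, situation):
        w = [self.weights.get((situation, r), 1.0) for r in self.responses]
        choice = random.choices(self.responses, weights=w)[0]
        self.last = (situation, choice)
        return choice

    def punish(self):                     # the event just past becomes less likely
        self.weights[self.last] = self.weights.get(self.last, 1.0) * 0.5

    def reward(self):                     # ...or more likely to be repeated
        self.weights[self.last] = self.weights.get(self.last, 1.0) * 2.0

random.seed(0)
pupil = ChildMachine(["3", "4", "5"])
for _ in range(200):
    if pupil.act("2+2") == "4":
        pupil.reward()
    else:
        pupil.punish()
best = max(pupil.responses, key=lambda r: pupil.weights.get(("2+2", r), 1.0))
print(best)   # almost surely '4' after 200 rounds of teaching
```

After a couple of hundred signals the rewarded answer dominates; nothing in the mechanism requires the machine to feel the blows or the sweets.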
The use of punishments and rewards can at best be a part of the teaching process. Roughly speaking, if the teacher has no other means of communicating to the pupil, the amount of information which can reach him does not exceed the total number of rewards and punishments applied. By the time a child has learnt to repeat 'Casabianca' he would probably feel very sore indeed, if the text could only be discovered by a 'Twenty Questions' technique, every 'NO' taking the form of a blow. It is necessary therefore to have some other 'unemotional' channels of communication. If these are available it is possible to teach a machine by punishments and rewards to obey orders given in some language, e.g. a symbolic language. These orders are to be transmitted through the 'unemotional' channels. The use of this language will diminish greatly the number of punishments and rewards required.
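The information bound can be put in rough figures (my illustration; the poem's length here is an assumed round number): each reward or punishment conveys at most one binary digit, so dictating a text through such signals alone needs at least its information content in blows.

```python
import math

# At most one binary digit per signal, so a 'Twenty Questions' dictation of a
# text needs at least its information content in yes/no signals.
letters = 300                      # assumed approximate letter count of the poem
bits = letters * math.log2(26)     # about 4.7 bits per letter of a 26-letter alphabet
print(round(bits))                 # minimum number of signals required
```

On these assumed figures well over a thousand blows would be needed, which is why the 'unemotional' channels matter.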
Opinions may vary as to the complexity which is suitable in the child machine. One might try to make it as simple as possible consistently with the general principles. Alternatively one might have a complete system of logical inference 'built in'.¹ In the latter case the store would be largely occupied with definitions and propositions. The propositions would have various kinds of status, e.g. well-established facts, conjectures, mathematically proved theorems, statements given by an authority, expressions having the logical form of proposition but not belief-value. Certain propositions may be described as 'imperatives'. The machine should be so constructed that as soon as an imperative is classed as 'well-established' the appropriate action automatically takes place. To illustrate this, suppose the teacher says to the machine, 'Do your homework now'. This may cause "Teacher says 'Do your homework now'" to be included amongst the well-established facts. Another such fact might be, 'Everything that teacher says is true'.

¹ Or rather 'programmed in', for our child-machine will be programmed in a digital computer. But the logical system will not have to be learnt.
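The imperative mechanism can be sketched as follows (my illustration; the class and proposition names are invented): propositions sit in the store with a status label, and the moment an imperative's status becomes 'well-established' its attached action fires.

```python
class LogicMachine:
    """A store of (proposition, status) pairs; an imperative classed as
    'well-established' triggers its action at once (illustrative sketch)."""
    def __init__(self):
        self.store = []
        self.actions = {}          # imperative proposition -> action to perform
        self.log = []

    def on_imperative(self, prop, action):
        self.actions[prop] = action

    def classify(self, prop, status):
        self.store.append((prop, status))
        if status == "well-established" and prop in self.actions:
            self.actions[prop]()   # the appropriate action takes place automatically

m = LogicMachine()
m.on_imperative("Do your homework now", lambda: m.log.append("homework begun"))
m.classify("Do your homework now", "conjecture")         # nothing happens yet
m.classify("Do your homework now", "well-established")   # action fires
print(m.log)                                             # ['homework begun']
```

The point of the design is that action is tied to epistemic status, not to the mere presence of the sentence in the store.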
BIBLIOGRAPHY

Samuel Butler, Erewhon, London, 1865. Chapters 23, 24, 25, The Book of the Machines.
Alonzo Church, "An Unsolvable Problem of Elementary Number Theory", American J. of Math., 58 (1936), 345-363.
K. Gödel, "Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme, I", Monatshefte für Math. und Phys., (1931), 173-189.
D. R. Hartree, Calculating Instruments and Machines, New York, 1949.
S. C. Kleene, "General Recursive Functions of Natural Numbers", American J. of Math., 57 (1935), 153-173 and 219-244.
G. Jefferson, "The Mind of Mechanical Man". Lister Oration for 1949. British Medical Journal, vol. i (1949), 1105-1121.
Countess of Lovelace, "Translator's notes to an article on Babbage's Analytical Engine", Scientific Memoirs (ed. by R. Taylor), vol. 3 (1842), 691-731.
Bertrand Russell, History of Western Philosophy, London, 1940.
A. M. Turing, "On Computable Numbers, with an Application to the Entscheidungsproblem", Proc. London Math. Soc. (2), 42 (1937), 230-265.
Victoria University of Manchester.