
The principle of the modern computer was first described by mathematician and
pioneering computer scientist Alan Turing, who set out the idea in his seminal 1936 paper,[20] On
Computable Numbers. Turing reformulated Kurt Gödel's 1931 results on the limits of proof and
computation, replacing Gödel's universal arithmetic-based formal language with the formal and
simple hypothetical devices that became known as Turing machines. He proved that some such
machine would be capable of performing any conceivable mathematical computation if it were
representable as an algorithm. He went on to prove that there was no solution to
the Entscheidungsproblem by first showing that the halting problem for Turing machines
is undecidable: in general, it is not possible to decide algorithmically whether a given Turing machine
will ever halt.
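Turing's argument for undecidability can be sketched as a proof by contradiction. The code below is an illustrative sketch only: the function names `halts` and `paradox` are invented for this example, and `halts` is exactly the impossible decider whose existence the proof refutes.

```python
# Sketch of Turing's diagonal argument. Assume, for contradiction, that a
# total, correct decider `halts(program, program_input)` existed.

def halts(program, program_input):
    """Hypothetical oracle: True iff program(program_input) eventually halts.
    Turing proved no algorithm can implement this correctly in general."""
    raise NotImplementedError("no general halting decider can exist")

def paradox(program):
    # Do the opposite of whatever `halts` predicts for `program` run on itself:
    # loop forever if it would halt, halt immediately if it would loop.
    if halts(program, program):
        while True:
            pass

# Applying paradox to itself yields a contradiction either way:
# if paradox(paradox) halts, then halts() said it loops, and vice versa.
# Hence `halts` cannot exist, and the halting problem is undecidable.
```

The contradiction arises only when the machine is fed its own description, which is why the argument is called diagonal: it defeats any proposed decider by construction.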
He also introduced the notion of a 'Universal Machine' (now known as a Universal Turing machine),
with the idea that such a machine could perform the tasks of any other machine, or in other words, it
is provably capable of computing anything that is computable by executing a program stored on
tape, allowing the machine to be programmable. Von Neumann acknowledged that the central
concept of the modern computer was due to this paper.[21] Turing machines are to this day a central
object of study in theory of computation. Except for the limitations imposed by their finite memory
stores, modern computers are said to be Turing-complete, which is to say, they
have algorithm execution capability equivalent to a universal Turing machine.
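The stored-program idea described above can be illustrated with a toy single-tape Turing machine simulator. This is a minimal sketch, not from the article: the transition-table encoding, the state names, and the example machine (which flips every bit of its input) are all invented for illustration.

```python
# Minimal single-tape Turing machine simulator (illustrative sketch).
# A machine is just data: a transition table interpreted by a fixed loop,
# mirroring how a universal machine executes a program stored on its tape.

def run_turing_machine(transitions, tape, start_state, accept_state,
                       max_steps=10_000):
    """Simulate a Turing machine.

    transitions maps (state, symbol) -> (new_state, written_symbol, move),
    where move is -1 (left) or +1 (right) and '_' is the blank symbol.
    Returns the final non-blank tape contents as a string.
    """
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells are blank
    state, head = start_state, 0
    for _ in range(max_steps):
        if state == accept_state:
            break
        symbol = cells.get(head, "_")
        state, cells[head], move = transitions[(state, symbol)]
        head += move
    else:
        # The step limit stands in for the undecidable general question
        # of whether this machine ever halts.
        raise RuntimeError("step limit reached; machine may not halt")
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example machine: flip every bit, then halt at the first blank.
flip = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", "_"): ("done", "_", +1),
}
print(run_turing_machine(flip, "1011", "scan", "done"))  # prints 0100
```

Note that the simulator itself never changes between machines; only the transition table does. That separation of fixed interpreter from stored program is the essence of the universal machine and of Turing completeness.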
