
HISTORY OF

COMPUTERS

Maria Fe Novere C. Evangelio


Computer
machine that manipulates data
according to a set of instructions
the first devices resembling modern
computers were developed in the
mid-20th century (1940-1945)
the first electronic computers were the
size of a large room, consuming as much
power as several hundred modern
personal computers
Modern computers
based on tiny integrated circuits
millions to billions of times more
capable than the early machines
occupy a fraction of the space
Simple computers
small enough to fit into a wristwatch
can be powered by a watch battery
ability to store and execute lists of
instructions called programs makes
computers extremely versatile,
distinguishing them from calculators
History of the modern
computer
begins with two separate technologies:
automated calculation and programmability
Examples of early mechanical
calculating devices
the abacus
the slide rule and arguably the astrolabe
the Antikythera mechanism (which dates
from about 150-100 BC)
Images: abacus, astrolabe, Antikythera
mechanism, slide rule
castle clock" - astronomical clock
invented by Al-Jazari in 1206
considered to be the earliest
programmable analog computer
displayed the zodiac, the solar and
lunar orbits
a crescent moon-shaped pointer
traveling across a gateway causing
automatic doors to open every hour
five robotic musicians who played music
when struck by levers operated by a
camshaft attached to a water wheel
Middle Ages saw a re-invigoration of
European mathematics and engineering
Wilhelm Schickard's 1623 device
first of a number of mechanical calculators
constructed by European engineers
In 1801, Joseph Marie Jacquard - textile loom
introduced a series of punched paper cards
as a template, which allowed his loom to
weave intricate patterns automatically
use of punched cards to define woven
patterns can be viewed as an early, albeit
limited, form of programmability (see the
sketch below)
Jacquard loom
- one of the first programmable devices
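The punched-card idea can be illustrated in a few lines. Below is a minimal Python sketch, assuming each card row is a bit pattern in which a hole (1) lifts a warp thread and a blank (0) leaves it down; the card patterns themselves are invented for illustration, not taken from any real Jacquard design.

```python
# Illustrative sketch only: each "card" row is a bit pattern, where a
# hole (1) lifts the corresponding warp thread and a blank (0) does not.
# The patterns below are made up for demonstration.

CARDS = [  # one row of holes per weft pass
    0b10101010,
    0b01010101,
    0b11001100,
    0b00110011,
]

def weave(cards, width=8):
    """Render each card row: '#' where a warp thread is lifted, '.' where not."""
    for row in cards:
        line = "".join("#" if (row >> (width - 1 - i)) & 1 else "."
                       for i in range(width))
        print(line)

weave(CARDS)
# Changing the card deck changes the woven pattern -- the cards act,
# in effect, as a stored program that the loom executes.
```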
it was the fusion of automatic calculation
with programmability that produced the
first recognizable computers
In 1837, Charles Babbage - first to
conceptualize and design a fully
programmable mechanical computer
the Analytical Engine
In the late 1880s, Herman Hollerith
invented the recording of data on a
machine-readable medium
he settled on punched cards
to process the cards, he invented the
tabulator and the key punch machines
Large-scale automated data processing
of punched cards was performed for the
1890 United States Census by Hollerith's
company, which later became the core
of IBM
by the end of the 19th century, a number
of technologies that would later prove
useful had appeared:
the punched card
Boolean algebra
the vacuum tube (thermionic valve)
the teleprinter
first half of the 20th century
many scientific computing needs were
met by increasingly sophisticated
analog computers
used a direct mechanical or electrical
model of the problem as a basis for
computation
not programmable and generally lacked
the versatility and accuracy of modern
digital computers.
Alan Turing - widely regarded as the father
of modern computer science
In 1936, Turing provided an influential
formalization of the concepts of algorithm
and computation with the Turing machine
Time magazine, naming Turing one of the
100 most influential people of the 20th
century, states: "The fact remains that
everyone who taps at a keyboard, opening
a spreadsheet or a word-processing
program, is working on an incarnation of
a Turing machine"
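Turing's model can be made concrete in a few lines. Below is a minimal Python sketch, assuming a single tape and a transition table; the bit-flipping machine used as the example is an illustrative choice, not one of Turing's own machines.

```python
# Minimal Turing machine sketch: a finite state control, a tape, and a
# transition table mapping (state, symbol) -> (write, move, next state).

def run_turing_machine(rules, tape, state="start", blank="_"):
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Example machine (illustrative): flip every bit, halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),  # flip 0 -> 1, keep scanning right
    ("start", "1"): ("0", "R", "start"),  # flip 1 -> 0
    ("start", "_"): ("_", "R", "halt"),   # blank reached: stop
}

print(run_turing_machine(rules, "10110"))  # -> 01001_
```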
George Stibitz - internationally recognized as
a father of the modern digital computer
invented and built a relay-based calculator
he dubbed the "Model K" (for "kitchen
table", on which he had assembled it)
first to use binary circuits to perform
an arithmetic operation (a sketch of
binary addition follows below)
Later models added greater sophistication,
including complex arithmetic and
programmability
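Binary arithmetic in such circuits reduces to Boolean logic. Below is a minimal Python sketch of a full adder chained into a ripple-carry adder, illustrating the principle only; it makes no claim about the Model K's actual relay wiring.

```python
# Logic-level illustration of binary addition, as a relay or gate
# circuit would compute it.

def full_adder(a, b, carry_in):
    """One binary digit position, built only from Boolean operations."""
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry to next position
    return s, carry_out

def add_binary(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists (LSB first)."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # final carry becomes the top bit
    return out

# 3 (011) + 5 (101), least significant bit first:
print(add_binary([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1] -> 1000 binary = 8
```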
First Generation - 1940-1956
vacuum tubes for circuitry & magnetic
drums for memory
often enormous, taking up entire rooms
very expensive to operate and, in addition
to using a great deal of electricity,
generated a lot of heat, which was often
the cause of malfunctions
relied on machine language to perform
operations
could only solve one problem at a time
Input was based on punched cards and
paper tape
output was displayed on printouts
UNIVAC and ENIAC computers are
examples of first-generation computing
devices. The UNIVAC was the first
commercial computer, delivered to its
first client, the U.S. Census Bureau,
in 1951
UNIVAC - UNIVersal Automatic Computer
Vacuum Tubes
Second Generation - 1956-1963:
Transistors
the transistor was invented in 1947 but
did not see widespread use in computers
until the late 1950s
transistors allowed computers to become
smaller, faster, cheaper, more energy-
efficient and more reliable than their
first-generation predecessors
still relied on punched cards for input
and printouts for output.
moved from cryptic binary machine
language to symbolic, or assembly,
languages, which allowed programmers to
specify instructions in words (see the
sketch below)
Punch Cards
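The difference between the two levels can be shown with a toy assembler. The three-instruction machine below (LOAD/ADD/STORE, 4-bit opcodes and operands) is a hypothetical teaching architecture, not any real instruction set.

```python
# Hypothetical instruction set for illustration only.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(program):
    """Translate mnemonics into 8-bit machine words (4-bit opcode + 4-bit operand)."""
    words = []
    for line in program:
        mnemonic, operand = line.split()
        words.append((OPCODES[mnemonic] << 4) | int(operand))
    return words

source = ["LOAD 5", "ADD 3", "STORE 9"]  # what the programmer writes
for word in assemble(source):
    print(f"{word:08b}")                 # what the machine actually runs
# 00010101
# 00100011
# 00111001
```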
Third Generation - 1964-1971:
Integrated Circuits
the integrated circuit was the hallmark
of the third generation of computers
transistors were miniaturized and placed
on silicon chips, called semiconductors,
which drastically increased the speed
and efficiency of computers
users interacted with third-generation
computers through keyboards and
monitors and interfaced with an
operating system
Computers for the first time became
accessible to a mass audience because
they were smaller and cheaper than
their predecessors
Fourth Generation - 1971-
Present: Microprocessors
the microprocessor brought the fourth
generation of computers, as thousands
of integrated circuits were built onto
a single silicon chip
Intel 4004 chip, developed in 1971,
located all the components of the
computer - from the central processing
unit and memory to input/output
controls - on a single chip.
In 1981 IBM introduced its first
computer for the home user
in 1984 Apple introduced the Macintosh
Microprocessors also moved out of the
realm of desktop computers and into
many areas of life as more and more
everyday products began to use
microprocessors
as computers became more powerful,
they could be linked together to form
networks, which eventually led to the
development of the Internet
Fourth-generation computers also saw
the development of GUIs, the mouse
and handheld devices
A graphical user interface (GUI)
human-computer interface (i.e., a way
for humans to interact with computers)
that uses windows, icons and menus
and which can be manipulated by a
mouse (and often to a limited extent by
a keyboard as well).
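As a minimal sketch of this windows-icons-menus model, the following Python program uses the standard tkinter toolkit to create a window with a menu and a mouse-clickable button.

```python
import tkinter as tk

root = tk.Tk()
root.title("GUI sketch")

# A menu bar with a File menu, operated by the mouse.
menubar = tk.Menu(root)
file_menu = tk.Menu(menubar, tearoff=0)
file_menu.add_command(label="Quit", command=root.destroy)
menubar.add_cascade(label="File", menu=file_menu)
root.config(menu=menubar)

# A button the user clicks with the mouse.
tk.Button(root, text="Click me",
          command=lambda: print("clicked")).pack(padx=40, pady=20)

root.mainloop()  # event loop: waits for mouse and keyboard events
```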
Fifth Generation - Present and
Beyond: Artificial Intelligence
fifth-generation computers, based on
artificial intelligence, are still in
development, though some applications,
such as voice recognition, are being
used today
use of parallel processing and
superconductors is helping to make
artificial intelligence a reality
Quantum computation and molecular
and nanotechnology
The goal of fifth-generation computing
is to develop devices that respond to
natural language input and are capable
of learning and self-organization
ASIMO - Honda's humanoid robot
