History of Computers

A presentation by Abhimanyu Kumar
Introduction
A computer is a device that can be instructed to carry out sequences of
arithmetic or logical operations automatically via computer programming.

The abacus was an early calculating device; one of its best-known forms was developed in China.

The Difference Engine, designed in the 1820s by the English mathematician and inventor Charles Babbage, could automatically compute mathematical tables which, until that time, had been tediously calculated by hand and were prone to error.

ENIAC, short for Electronic Numerical Integrator and Computer, was the
first general-purpose electronic computer. It filled an entire room, weighed
thirty tons, and consumed two hundred kilowatts of power.
First Generation of Computers
The first generation of computers was from 1940 to 1956.
The first computer systems used vacuum tubes for circuitry and
magnetic drums for memory.
They were often enormous, taking up entire rooms.
First generation computers relied on machine language, the
lowest-level programming language understood by computers, to
perform operations, and they could only solve one problem at a
time.
The UNIVAC and ENIAC computers are examples of first-generation
computing devices. The UNIVAC was the first commercially produced
computer in the United States; the first unit was delivered to the
U.S. Census Bureau in 1951.
ENIAC
ENIAC was the first general-purpose electronic computer. When it was
finished, ENIAC filled an entire room, weighed 30 tons, and
consumed 200 kilowatts of power. It generated so much heat that it
had to be placed in a room with a forced-air cooling system. Vacuum
tubes, nearly 18,000 of them, were the principal elements in the
computer’s circuitry. In one second, ENIAC could perform 5,000
additions, 357 multiplications, or 38 divisions.

UNIVAC
UNIVAC was the first commercial computer made in the United States.
It handled both numbers and alphabetic characters equally well.
In 1952, UNIVAC successfully predicted the outcome of the U.S.
presidential election during a televised news broadcast.
Second Generation of Computers

The second generation of computers was from 1956 to 1963.


In the second generation, vacuum tubes were replaced by
transistors.
The transistor was far superior to the vacuum tube, allowing
computers to become smaller, faster, cheaper, more energy-
efficient and more reliable than their first-generation
predecessors.
Second-generation computers moved from cryptic binary
machine language to symbolic, or assembly, languages, which
allowed programmers to specify instructions in words. High-
level programming languages were also being developed at
this time, such as early versions of COBOL and FORTRAN.
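
To make that jump in abstraction concrete, here is a toy Python sketch of the three levels described above, assuming an invented accumulator machine; the opcodes are made up purely for illustration and do not come from any real second-generation instruction set.

# A toy sketch, not historical code: the same small computation at the
# three levels described above, on an invented accumulator machine.

MACHINE_CODE = [0b0001_0101,   # raw bits: "load 5 into the accumulator"
                0b0010_0011]   # raw bits: "add 3 to the accumulator"

ASSEMBLY = ["LOAD 5",          # symbolic mnemonics replace bit patterns
            "ADD 3"]

HIGH_LEVEL = "result = 5 + 3"  # one high-level statement, as in FORTRAN

def run(program):
    # Interpret the toy machine code: high nibble = opcode, low nibble = operand.
    acc = 0
    for word in program:
        opcode, operand = word >> 4, word & 0x0F
        if opcode == 0b0001:    # LOAD
            acc = operand
        elif opcode == 0b0010:  # ADD
            acc += operand
    return acc

print(run(MACHINE_CODE))  # prints 8

Each step up the ladder trades direct control of the hardware for readability, which is exactly the shift this generation introduced.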
Third Generation of Computers
The third generation of computers was from 1964 to 1971.
The development of the integrated circuit was the hallmark of the third
generation of computers. An integrated circuit (IC) is a small electronic
device made out of a semiconductor material.
Transistors were miniaturized and placed on silicon chips, which
drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third
generation computers through keyboards and monitors and interfaced with
an operating system, which allowed the device to run many different
applications at one time with a central program that monitored the memory.
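
As a rough illustration of that time-sharing idea, here is a minimal round-robin sketch in Python; the job names and step counts are invented, and real third-generation operating systems were far more elaborate than this toy loop.

from collections import deque

# Each entry is (application name, time slices still needed); invented data.
jobs = deque([("payroll", 3), ("editor", 2), ("report", 1)])

while jobs:
    name, steps = jobs.popleft()        # give the next waiting application a turn
    print(f"running {name} for one time slice")
    if steps > 1:
        jobs.append((name, steps - 1))  # unfinished jobs rejoin the queue
    else:
        print(f"{name} finished")

Because every application gets a short turn before any single one finishes, all of them appear to the user to run at the same time.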
Computers for the first time became accessible to a mass audience
because they were smaller and cheaper than their predecessors.
Fourth Generation of Computers
The fourth generation of computers extends from 1971 to the
present.
The microprocessor brought the fourth generation of computers, as
thousands of integrated circuits were built onto a single silicon chip.
The Intel 4004 chip, developed in 1971, located all the components of
the computer—from the central processing unit and memory to
input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in
1984 Apple introduced the Macintosh. Microprocessors also moved out
of the realm of desktop computers and into many areas of life as more
and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked
together to form networks, which eventually led to the development of
the Internet.
Fourth generation computers also saw the development of graphical
user interfaces (GUIs), the mouse, and handheld devices.
Fifth Generation of Computers
Fifth generation computing devices, based on artificial
intelligence, are still in development, though there are some
applications, such as voice recognition, that are being used
today.
The use of parallel processing and superconductors is
helping to make artificial intelligence a reality.
Quantum computation and molecular and nanotechnology
will radically change the face of computers in years to
come. The goal of fifth-generation computing is to develop
devices that respond to natural language input and are
capable of learning and self-organization.
Artificial Intelligence
Artificial intelligence is the branch of computer science concerned with making
computers behave like humans. The term was coined in 1956 by John McCarthy at
the Dartmouth Conference.
Artificial intelligence includes the following areas of specialization:
Playing games: programming computers to play games against human opponents.
The greatest advances have occurred in the field of game playing. In May 1997, an
IBM supercomputer called Deep Blue defeated world chess champion Garry
Kasparov in a chess match.
Expert Systems: programming computers to make decisions in real-life situations (for
example, some expert systems help doctors diagnose diseases based on symptoms);
a minimal sketch follows this list.
Natural Language: programming computers to understand natural human languages.
Neural Networks: systems that simulate intelligence by attempting to reproduce the
types of physical connections that occur in animal brains.
Robotics: programming computers to see, hear, and react to other sensory
stimuli.
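
Here is the minimal expert-system sketch promised above, in Python; the rules and symptoms are invented for illustration, and real medical expert systems encode far larger rule bases.

# A minimal sketch of the expert-system idea: "if these symptoms, then this
# condition" rules matched against a patient's symptoms. Invented data.

RULES = {
    "common cold": {"sneezing", "runny nose"},
    "flu": {"fever", "aches", "fatigue"},
}

def diagnose(symptoms):
    # Return every condition whose required symptoms are all present.
    return [condition for condition, required in RULES.items()
            if required <= symptoms]  # subset test: all required symptoms match

print(diagnose({"fever", "aches", "fatigue", "sneezing"}))  # ['flu']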
There are several programming languages that are known as AI languages because
they are used almost exclusively for AI applications. The two most common are LISP
and Prolog.
