Bus IT 2
Introduction
The computer was born not for entertainment or email but out of a need to solve a serious number-crunching
crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the
U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based computers that took up entire rooms.
Today, we carry more computing power on our smartphones than was available in these early models. The
following brief history of computing is a timeline of how computers evolved from their humble beginnings to
the machines of today that surf the Internet, play games and stream multimedia in addition to crunching
numbers.
Earliest Computer
• Originally, calculations were performed by humans, whose job title was "computer."
• These human computers were typically engaged in the calculation of a mathematical expression.
• The calculations of this period were specialized and expensive, requiring years of training in
mathematics.
• The first use of the word "computer" was recorded in 1613, referring to a person who carried out
calculations, or computations, and the word continued to be used in that sense until the middle of the
20th century.
TALLY STICKS
• A tally stick was an ancient memory aid used to record and document numbers, quantities and messages.
ABACUS
An abacus is a mechanical device used to aid an individual in performing mathematical calculations. The
abacus was invented in Babylonia around 2400 B.C. The abacus in the form we are most familiar with was first used
in China in around 500 B.C. It was used to perform basic arithmetic operations.
NAPIER’S BONES
• Invented by John Napier in 1614.
• Allowed the operator to multiply, divide and calculate square and cube roots by moving the rods around and placing them in specially constructed boards.
SLIDE RULE
• Invented by William Oughtred in 1622.
• Based on Napier's ideas about logarithms.
• The slide rule, also known colloquially in the United States as a slipstick, is a mechanical analog computer.
• Used primarily for multiplication, division, roots, logarithms and trigonometry.
• Not normally used for addition or subtraction.
PASCALINE
• Invented by Blaise Pascal in 1642.
• It was the first digital calculator.
• Its main limitation was that it could only add and subtract.
• It was also very expensive.
STEPPED RECKONER
• Invented by Gottfried Wilhelm Leibniz in 1672.
• A machine that could add, subtract, multiply and divide automatically.
• An interesting thing about the Stepped Reckoner is that Leibniz's design was way ahead of its time. A working
model of the machine did not appear until 1791, long after the inventor's death.
JACQUARD LOOM
• The Jacquard loom is a mechanical loom, invented by Joseph-Marie Jacquard in 1804.
• It is an automatic loom controlled by punched cards.
• His invention also provided a model for the input and output of data in the electro-mechanical and electronic computing industry.
ANALYTICAL ENGINE
• Designed by Charles Babbage in 1837.
• The Analytical Engine was a general-purpose design that used punched cards as input; its machined parts could not be made accurately enough and it was never quite completed.
TABULATING MACHINE
• Invented by Herman Hollerith in 1890.
• Herman Hollerith worked as a statistician
for the U.S. Census Bureau in the 1880s and 1890s.
• It was used to assist in summarizing information and accounting.
HARVARD MARK 1
• Also known as the IBM Automatic Sequence Controlled Calculator (ASCC).
• Invented by Howard H. Aiken in 1943.
• The first electro-mechanical computer.
• The computer had mechanical relays (switches) which flip-flopped back and forth to represent mathematical data. It was huge, weighing about five tons and containing some 500 miles of wiring.
Z1
• The first programmable computer.
• Created by Konrad Zuse in Germany
from 1936 to 1938.
• To program the Z1 required that the user
insert punch tape into a punch tape reader
and all output was also generated through
punch tape.
ENIAC
• ENIAC stands for
Electronic Numerical Integrator and Computer.
• It was the first electronic general purpose computer.
• Completed in 1946.
• Developed by John Presper Eckert and John W. Mauchly.
• The ENIAC weighed 30 tons, used 18,000 vacuum tubes, and had little more computing power than a modern calculator.
UNIVAC 1
• The UNIVAC I (UNIVersal Automatic Computer I) was the first commercial computer produced in the United States.
• Designed by J. Presper Eckert and John Mauchly.
EDVAC
• EDVAC stands for Electronic Discrete Variable Automatic Computer.
• The first stored-program computer design.
• Based on John von Neumann's 1945 design; it began operating in 1951.
• It had a memory to hold both a stored program and data.
COMPUTER GENERATIONS
The evolution of the computer started in the 16th century and resulted in the form we see today. The present-day
computer, however, has also undergone rapid change during the last fifty years. The period during which this
evolution took place can be divided into five distinct phases, on the basis of the type of switching
circuits used, known as the generations of computers.
The First Generation
• First generation computers relied on machine language, the lowest-level programming language understood by
computers, to perform operations, and they could only solve one problem at a time.
• Input was based on punched cards and paper tape, and output was displayed on printouts.
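The idea that machine language is nothing but raw numbers can be sketched with a toy interpreter. The opcodes below are invented for illustration (they do not belong to any real instruction set); the point is only that each numeric code drives the machine directly, one step at a time:

```python
# Hypothetical numeric opcodes -- machine language is just numbers.
LOAD, ADD, PRINT, HALT = 1, 2, 3, 0

def run(program):
    """Interpret (opcode, operand) pairs against a single accumulator."""
    acc = 0
    for opcode, operand in program:
        if opcode == LOAD:      # overwrite the accumulator
            acc = operand
        elif opcode == ADD:     # add to the accumulator
            acc += operand
        elif opcode == PRINT:   # output the current value
            print(acc)
        elif opcode == HALT:    # stop the machine
            break
    return acc

# A "program" written purely as numbers: load 2, add 3, print, halt.
run([(1, 2), (2, 3), (3, 0), (0, 0)])  # prints 5
```

Programming at this level, with no symbolic names at all, is why first-generation machines were so laborious to use.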
The Fifth Generation
• Fifth generation computing, still largely in development, is helping to make artificial intelligence a reality.
• The goal is to develop devices that respond to natural
language input and are capable of learning and self-organization.
• There are some applications, such as voice recognition, that are
being used today.
MOORE’S LAW
What of the future? The power of computers (the number of components packed on a chip) has doubled roughly every
18 months to 2 years since the 1960s. But the laws of physics are expected to bring a halt to Moore's Law, as this
idea is known, and force us to explore entirely new ways of building computers. What will tomorrow's PCs look like?
One long-touted idea is that they'll be using particles of light—photons—instead of electrons, an approach known as
optical computing or photonics. Currently, much of the smart money is betting on quantum computers, which deploy
cunning ways of manipulating atoms to process and store information at lightning speed. There's also hope we might
use spintronics (harnessing the "spin" of particles) and biomolecular technology (computing with DNA, proteins, and
other biological molecules), though both are in the very early stages of research. Chips made from new materials
such as graphene may also offer ways of extending Moore's law. Whichever technology wins out, you can be quite
certain the future of computing will be just as exciting as the past!
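The doubling described above compounds quickly, which is why Moore's Law matters. Here is a toy illustration, assuming a doubling period of two years (the figures and function name are ours, chosen for the example):

```python
def moores_law_projection(initial_count, years, doubling_period=2):
    """Project a component count that doubles every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Intel's 4004 chip (1971) had roughly 2,300 transistors. Fifty years
# of doubling every two years is 25 doublings:
print(round(moores_law_projection(2300, 50)))  # about 77 billion
```

Twenty-five doublings turn a few thousand transistors into tens of billions, which is broadly what happened between the first microprocessors and today's largest chips.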
Definition of Computer
• Computer is a programmable machine.
• Computer is a machine that manipulates data according to a list of instructions.
• Computer is any device which aids humans in performing various kinds of computations or
calculations.
The term computer has many definitions. The easiest to understand and remember is: a computer is an electronic
device designed to manipulate data so that useful information can be generated. In the succeeding chapter, you will
learn the rudiments of how a computer works. After you develop computer awareness, you will be able
to determine your place in the 'electronic revolution'.