Presented By: Varsha Sukhramani, Shabeen Samnani

 Introduction
 History of Computing
 How Computers Were Introduced
 The Abacus
 Charles Babbage
 Harvard Mark I
 ENIAC
 Supercomputers
 Development of Computers
 How Computers Work
A computer is a machine which manipulates data according to
a list of instructions.
The first devices that resemble modern computers date to the mid-
20th century (around 1940-1941). Early electronic computers
were the size of a large room, consuming as much power as
several hundred modern personal computers. Today, simple
computers may be made small enough to fit into a wrist watch
and be powered from a watch battery. However, the most
common form of computer in use today is by far the embedded
computer. Embedded computers are small, simple devices that
are often used to control other devices.
Any computer with a certain minimum capability is, in principle,
capable of performing the same tasks that any other computer can perform.
It is difficult to identify any one device as the earliest computer, partly
because the term "computer" has been subject to varying
interpretations over time. Originally, the term "computer" referred to a
person who performed numerical calculations (a human computer),
often with the aid of a mechanical calculating device. In 1801, Joseph
Marie Jacquard made an improvement to the textile loom that used a
series of punched paper cards as a template to allow his loom to weave
intricate patterns automatically. In 1837, Charles Babbage was the first
to conceptualize and design a fully programmable mechanical
computer, which he called "The Analytical Engine". During the first half
of the 20th century, many scientific computing needs were met by
increasingly sophisticated analog computers, which used a direct
mechanical or electrical model of the problem as a basis for
computation. However, these were not programmable and generally
lacked the versatility and accuracy of modern digital computers.
The first stored-program computer to be demonstrated working was the
Manchester Small-Scale Experimental Machine (SSEM). The technologies used
in computers have changed dramatically since the first electronic
general-purpose computers of the 1940s. Vacuum tube-based computers were in
use throughout the 1950s, but were largely replaced in the 1960s by
transistor-based machines, which were smaller, faster, cheaper, used less
power and were more reliable. By the 1970s, the adoption of integrated circuit
technology and the subsequent creation of microprocessors such as the Intel
4004 brought further gains in size, speed, cost and reliability. The early and
mid-1980s saw machines with a modest number of vector processors working in
parallel become the standard; typical numbers of processors were in the range
of four to sixteen. By the 1980s, computers had become sufficiently small and
cheap to replace simple mechanical controls in domestic appliances such as
washing machines. Around the same time, computers became widely accessible
for personal use by individuals in the form of home computers and the now
ubiquitous personal computer. In the later 1980s and 1990s, attention turned
from vector processors to massive parallel processing systems with thousands
of "ordinary" CPUs, some being off-the-shelf units and others being custom
designs.
The old Abacus

The Abacus was an early aid for mathematical computations. Its
only value is that it aids the memory of the human performing the
calculations. A skilled abacus operator can work on addition and
subtraction problems at the speed of a person equipped with a
hand calculator (multiplication and division are slower). The
oldest surviving Abacus was used around 300 BC by the Babylonians.
The Abacus is still in use today, principally in the Far East. A modern
Abacus consists of rings that slide over rods, but the older one
dates from the time when pebbles were used for counting. The
modern Abacus is just a representation of the human fingers:
the lower rings on each rod represent the 5 fingers and the 2 upper
rings represent the 2 hands.
The modern Abacus
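
The hand analogy above amounts to a simple positional code: each upper
ring counts as five (a hand) and each lower ring as one (a finger), so a
single rod can show any decimal digit. A minimal Python sketch of this
encoding (the function name is invented for illustration):

    def abacus_digit(d):
        # Split one decimal digit into (upper, lower) bead counts:
        # each upper ring stands for a hand (5), each lower ring
        # for a finger (1).
        if not 0 <= d <= 9:
            raise ValueError("one rod holds a single decimal digit")
        return d // 5, d % 5

    print(abacus_digit(7))   # (1, 2): one upper bead plus two lower beads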
By 1822 the English mathematician Charles Babbage was proposing
a steam-driven calculating machine the size of a room, which he
called the Difference Engine. This machine would be able to compute
tables of numbers, such as logarithm tables. He obtained government
funding for this project due to the importance of numeric tables in
ocean navigation. It was
Babbage who made an important intellectual leap regarding the
punched cards. Babbage realized that punched paper could be
employed as a storage mechanism, holding computed numbers for
future reference. Babbage called the two main parts of his Analytic
Engine the "Store" and the "Mill", as both terms are used in the
weaving industry. The Store was where numbers were held and the
Mill was where they were "woven" into new results. In a modern
computer these same parts are called the memory unit and the
central processing unit (CPU).
Small section of the type of mechanism employed in Babbage's Difference Engine
One early success was the Harvard Mark I computer which was built as a
partnership between Harvard and IBM in 1944. This was the first
programmable digital computer made in the U.S. But it was not a purely
electronic computer. Instead the Mark I was constructed out of switches,
relays, rotating shafts, and clutches. One of the primary
programmers for the Mark I was a woman, Grace Hopper. Hopper found
the first computer "bug." The word "bug" had been used to describe a defect
since at least 1889, but Hopper is credited with coining the word
"debugging" to describe the work of eliminating program faults.
The Mark I operated on numbers that were 23 digits wide. It could add or
subtract two of these numbers in three-tenths of a second, multiply them
in four seconds, and divide them in ten seconds. Forty-five years later
computers could perform an addition in a billionth of a second, a speedup
of roughly 300 million times. This kind of speed is obviously impossible
for a machine which must move a rotating shaft, and that is why electronic
computers replaced their mechanical predecessors.
The Harvard Mark I: an electro-mechanical computer
The title of forefather of today's all-electronic digital computers is
usually awarded to ENIAC, which stood for Electronic Numerical
Integrator and Calculator. ENIAC was built at the University of
Pennsylvania between 1943 and 1945 by two professors, John
Mauchly and the 24-year-old J. Presper Eckert. Like the Mark I,
ENIAC employed paper card readers obtained from IBM.

To perform even a simple computation on ENIAC you had to rearrange a
large number of patch cords and then locate three particular knobs on
that vast wall of knobs and set them to 3, 1, and 4.
One of the most obvious problems was that the design would require
18,000 vacuum tubes to all work simultaneously.
"Electronic Numerical Integrator and Calculator"
A supercomputer is a computer that is considered, or was considered at
the time of its introduction, to be at the frontline in terms of processing
capacity, particularly speed of calculation.
Supercomputers introduced in the 1960s were
designed primarily by Seymour Cray at Control Data Corporation
(CDC), and led the market into the 1970s until Cray left to form his own
company, Cray Research. He then took over the supercomputer market
with his new designs, holding the top spot in supercomputing for five
years (1985–1990).
The term supercomputer itself is rather fluid, and today's
supercomputer tends to become tomorrow's normal computer. In the
1970s most supercomputers were dedicated to running a vector
processor. Today, parallel designs are based on "off the shelf" server-
class microprocessors, such as the PowerPC, Itanium, or x86-64.
The Cray-2 was the world's fastest computer from 1985 to 1989.
 First Generation Computers
 Second Generation Computers
 Third Generation Computers
 Fourth Generation Computers
As time progressed, people found they were using adding machines
and slide rules to perform more and more extremely tedious
calculations. Howard Aiken developed the Mark I in 1944 to ease this
calculating burden. During World War II, researchers made more
advances to ease the burden of performing calculations. In 1946,
they developed the ENIAC, Electronic Numerical Integrator and
Calculator. The computer had 18,000 vacuum tubes which were
used to perform calculations at a rate of 5,000 additions per second.
In the next few years, a number of other "first generation"
computers were built. All of these early computers used vacuum
tubes to perform their calculations. In 1945, John von Neumann
wrote a paper describing how a binary program could be
electronically stored in a computer. In 1947, the EDVAC, Electronic
Discrete Variable Automatic Computer, was built by Eckert and
Mauchly. In 1951, Eckert and Mauchly built the UNIVAC for use
by the U.S. Census Bureau.
In 1947, Bell Laboratories invented the transistor. This
creation sparked the production of a wave of "second
generation" computers. By using transistors in place of
vacuum tubes, manufacturers could produce more reliable
computers. Using transistors was also less expensive than
building a computer with vacuum tubes. The combination of
smaller size, better reliability, and lower cost made these
second generation computers very popular with buyers. For
scientists and engineers, large powerful computers were built
which were good at performing calculations. For banks and
insurance companies, computers which were smaller and faster, and
which were good at sorting and printing, were built.
Computer companies found that it was expensive to produce
Vanguard Small Motion Analyzr
In 1958, the first integrated circuit was made. This invention has
led to the widespread use of computers today. Scientists found a
way to reduce the size of transistors so they could place hundreds
of them on a small silicon chip, about a quarter of an inch on
each side. This enabled computer manufacturers to build smaller
computers. In 1956, FORTRAN, the first programming
language, was developed. The introduction of programming
languages enabled this third generation of computers to contain
something called an operating system. Another new feature of the
third generation machines was multiprogramming, which enabled the
computer to run a number of jobs simultaneously.
Vectral General Computer
Then, in 1971 Intel created the first microprocessor. The
microprocessor was a large-scale integrated circuit which
contained thousands of transistors. In 1976, Steve Jobs and Steve
Wozniak built the first Apple computer in a garage in California.
Then, in 1981, IBM introduced its first personal computer. The
personal computer was such a revolutionary concept and was
expected to have such an impact on society that in 1982 Time
magazine named the computer its "Machine of the Year." Within a
matter of years, computers spread from the work place into the
home. Personal computers have changed a great deal since the
early eighties. The hardware has definitely changed: computers
are now faster and have more memory, and the increased processing
speed and memory has led to an increase in the quality of computer
graphics. The introduction of the integrated circuit and its
development into the very-large-scale integrated (VLSI) circuit
are what made this fourth generation of computers possible.
 Control Unit
 Memory
 Input/Output (I/O)
 Arithmetic/Logic Unit (ALU)
The control unit directs the various components of a computer. It reads
and interprets instructions in the program. Control systems in
advanced computers may change the order of some instructions so as to
improve performance. A key component common to all CPUs is the
program counter. The control system's function is as follows. Some of
these steps may be performed concurrently or in a different order
depending on the type of CPU:
 Read the code for the next instruction from the cell indicated by the
program counter.
 Decode the numerical code for the instruction into a set of commands
or signals for each of the other systems.
 Increment the program counter so it points to the next instruction.
 Read whatever data the instruction requires from cells in memory.
The location of this required data is typically stored within the
instruction code.
 Provide the necessary data to an ALU or register.
 If the instruction requires an ALU or specialized hardware to
complete, instruct the hardware to perform the requested operation
(a toy version of the whole loop is sketched after this list).
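
The sketch below walks through those steps in Python. The three-opcode
instruction set (LOAD, ADD, HALT) and the single accumulator register are
assumptions invented for the example, not any real CPU design:

    LOAD, ADD, HALT = 0, 1, 2

    def run(memory):
        pc = 0       # program counter: cell of the next instruction
        acc = 0      # a single register for intermediate results
        while True:
            opcode, operand = memory[pc]   # read the next instruction's code
            pc += 1                        # increment the program counter
            if opcode == LOAD:             # decode, then carry out the command
                acc = memory[operand]      # read the required data from memory
            elif opcode == ADD:
                acc += memory[operand]     # provide the data to the "ALU"
            elif opcode == HALT:
                return acc

    # A three-instruction program; cells 3 and 4 hold the data values 2 and 5.
    memory = [(LOAD, 3), (ADD, 4), (HALT, 0), 2, 5]
    print(run(memory))   # 7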
Magnetic core memory was popular main memory for computers
through the 1960s until it was completely replaced by semiconductor
memory. A computer's memory can be
viewed as a list of cells into which numbers can be placed or read. The
information stored in memory may represent practically anything:
letters, numbers, even computer instructions. In almost all modern
computers, each memory cell is set up to store binary numbers in
groups of eight bits (called a byte). Each byte is able to represent 256
different numbers. A computer can store any kind of information in
memory as long as it can be somehow represented in numerical form.
Computer main memory comes in two principal varieties: random
access memory or RAM and read-only memory or ROM. ROM is pre-
loaded with data and software that never changes. The contents of
RAM are erased when the power to the computer is turned off, while
ROM retains its data indefinitely. Software that is stored in ROM is
often called firmware because it is notionally more like hardware
than software.
Magnetic Core Memory
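
Python's built-in bytearray behaves exactly like such a list of byte-sized
cells, which makes the point easy to try out; the stored values below are
arbitrary examples:

    memory = bytearray(8)        # eight cells, each holding a number 0..255

    memory[0] = 72               # a number...
    memory[1] = 105              # ...and another, also readable as letters

    print(memory[0], memory[1])  # 72 105
    print(memory[:2].decode())   # Hi  (the same two bytes read as text)

    try:
        memory[2] = 256          # one byte offers only 2**8 = 256 values
    except ValueError as err:
        print(err)               # byte must be in range(0, 256)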
Hard disks are common I/O devices used with computers. I/O is the
means by which a computer receives information from the outside
world and sends results back. Devices that provide input or output to
the computer are called peripherals. On a typical personal computer,
peripherals include input devices like the keyboard and mouse, and
output devices such as the display and printer. Hard disk drives,
floppy disk drives and optical disc drives serve as both input and
output devices. Computer networking is another form of I/O. Often,
I/O devices are complex computers in their own right with their own
CPU and memory. A graphics processing unit might contain fifty or
more tiny computers that perform the calculations necessary to
display 3D graphics. Modern desktop computers contain many
smaller computers that assist the main CPU in performing I/O.
Common I/O Devices Used With Computers
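
From a program's point of view every peripheral follows the same pattern:
data comes in from the outside world and results go back out. A minimal
Python sketch treating the keyboard, the screen, and a disk file as
peripherals (the file name greeting.txt is invented for the example):

    name = input("Your name: ")           # input from one peripheral, the keyboard

    greeting = "Hello, " + name

    print(greeting)                       # output to another peripheral, the screen

    with open("greeting.txt", "w") as f:  # a disk file serves as an output device...
        f.write(greeting)

    with open("greeting.txt") as f:       # ...and as an input device when read back
        print(f.read())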
The ALU is capable of performing two classes of operations:
arithmetic and logic. The set of arithmetic operations that a
particular ALU supports may be limited to adding and
subtracting, or might include multiplying or dividing. Any
computer can be programmed to perform any arithmetic
operation, although it will take more time to do so if its ALU
does not directly support the operation. Logic operations involve
Boolean logic: AND, OR, XOR and NOT. These can be useful
both for creating complicated conditional statements and
processing Boolean logic. Superscalar computers contain multiple
ALUs so that they can process several instructions at the same
time. Graphics processors and computers with SIMD and MIMD
features often contain ALUs that can perform arithmetic on
vectors and matrices.
