
Kurdistan Technical Institute

IT Department

First Semester

Group B

The History and Development of Computers

Prepared by:
Sara Abdulla Ali

Supervised by:
Mr. Jawhar

2021-2022
Contents:
 Background
 Analysis
 Introduction
 Achieving Practical Devices
 Development of Computers
Background:
Computers and electronics play an enormous role in today's
society, impacting everything from communication and medicine
to science.

Although computers are typically viewed as a modern invention involving electronics, computing predates the use of electrical devices. The ancient abacus was perhaps the first digital computing device. Analog computing dates back several millennia: primitive computing devices were used as early as the time of the ancient Greeks and Romans, the best known and most complex of which is the Antikythera mechanism. Later devices such as the castle clock (1206), the slide rule (c. 1624), and Babbage's Difference Engine (1822) are further examples of early mechanical analog computers.

The introduction of electric power in the 19th century led to the rise of electrical and hybrid electro-mechanical devices to carry out both digital (Hollerith punch-card machine) and analog (Bush's differential analyzer) calculation. Telephone switching came to be based on this technology, which led to the development of machines that we would recognize as early computers.

The presentation of the Edison Effect in 1885 provided the theoretical background for electronic devices. Originally in the form of vacuum tubes, electronic components were rapidly integrated into electric devices, revolutionizing radio and later television. It was in computers, however, where the full impact of electronics was felt. Analog computers used to calculate ballistics were crucial to the outcome of World War II, and the Colossus and the ENIAC, the two earliest electronic digital computers, were developed during the war.

With the invention of solid-state electronics, the transistor, and ultimately the integrated circuit, computers would become much smaller and eventually affordable for the average consumer. Today "computers" are present in nearly every aspect of everyday life, from watches to automobiles.

Analysis:
Some of the most difficult problems in science and technology
involve solving equations relating to complex physical situations,
such as predicting the heights of tides, designing antenna systems
for radio communication, creating a reliable electrical power grid,
solving problems in nuclear physics, and accurately predicting
where an artillery shell would fall. These problems could only be solved once mechanical analog devices were invented to aid in the solution of differential equations. The
creation of the differential analyzer in the first half of the 20th
century was a breakthrough that allowed for advances in these
and many other areas.
Introduction:
Many problems encountered in practical science and engineering
involve changing values of speed, voltage, heights, acceleration,
and similar measurements. These problems are fundamental to
the design of almost everything we take for granted today:
electrical power systems, radio and television, prediction of tides
and ocean currents, design of airplanes, and ballistic problems
such as accurate aiming of artillery. The solutions to most of these
problems involve representing the situation as a mathematical
model, and this model usually involves an area of mathematics
known as differential equations. Obtaining useful numerical results from these models required integrating the differential equations. The numerical solution of differential equations by
hand usually involves some form of numerical integration, a
process that is labor intensive, error prone, and difficult,
occasionally even impossible, without modern digital computing
machines.
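As a concrete illustration, here is a minimal Python sketch of the kind of step-by-step computation this involves, using Euler's method, one of the simplest numerical integration schemes. The equation y' = -y and the step size are illustrative assumptions, not taken from the text above.

# A minimal sketch of numerical integration of a differential equation
# by Euler's method. Each pass through the loop mirrors one round of
# hand calculation; the equation and step size are illustrative choices.

def euler(f, x0, y0, h, steps):
    """Advance y' = f(x, y) from (x0, y0) in fixed steps of size h."""
    x, y = x0, y0
    for _ in range(steps):
        y += h * f(x, y)   # one hand-calculation step: y_new = y + h*f(x, y)
        x += h
    return y

# Example: y' = -y with y(0) = 1, whose exact solution is e^(-x).
print(euler(lambda x, y: -y, 0.0, 1.0, 0.01, 100))  # ~0.3660 vs e^-1 ~ 0.3679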

The process of integration is essentially finding the area under the graph of the function. While this sounds simple, it can be difficult in practice because the equation of the graph is sometimes not known; for example, the amount of light received from a variable star can be measured and graphed, but the equation that produces this graph is not known. Simple expedients can be used to gain some idea of the numerical solution to these problems. One involves nothing more than graph paper and an accurate chemical balance. If you plot the function to be integrated on the graph paper and cut out the resulting area between the x axis and the function value, then the weight of this cut-out piece of paper is proportional to the area. If you then cut out a unit square from the same paper, you can weigh the graph, divide its weight by the weight of the unit square, and obtain a reasonable approximation to the area. The method is, of course, full of potential sources of error.
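On a modern machine the same area is found by numerical quadrature rather than a chemical balance. Below is a minimal Python sketch of the trapezoidal rule, the digital counterpart of the paper-weighing trick; the function sin(x) and the interval are illustrative assumptions.

import math

# A minimal sketch of finding the area under a graph with the
# trapezoidal rule; the function and interval are illustrative.

def trapezoid_area(f, a, b, n):
    """Approximate the area under f between a and b using n trapezoids."""
    h = (b - a) / n
    area = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        area += f(a + i * h)
    return area * h

# Example: the area under sin(x) from 0 to pi is exactly 2.
print(trapezoid_area(math.sin, 0.0, math.pi, 1000))  # ~1.9999984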

Achieving Practical Devices:
The problem was finally solved in 1930 by Vannevar Bush, a professor at MIT. (He was later to become vice president of MIT and a very influential advisor to the President of the USA during the Second World War.) Bush began to consider the problem of how to construct a mechanical integrating mechanism when he had to solve some differential equations concerned with an electrical power network. After spending several months attempting to solve just one equation, he decided it was better to spend his time designing an analog computing mechanism that could solve many such equations.

Bush finally resolved the problem of connecting several disk-wheel integrators together for the solution of real-world problems when he invented a device that would amplify the torque in the rotating shaft, giving enough power to drive other devices while still permitting the delicate movements of the integrating wheel on the disk. This torque amplifier, based on the same principles as the capstan used to raise an anchor on a ship, was an extremely delicate device to construct and maintain. One of the very best differential analyzers was constructed at the Institute of Physics in Oslo, Norway, in 1938, just prior to the Second World War. Svein Rosseland, the astrophysicist in charge of the machine, realized that the German military would be interested in using this device. After Norway was invaded by Germany, Rosseland removed the delicate torque amplifiers and, after carefully wrapping them to prevent corrosion, buried them in the garden. This simple expedient rendered the fine machine useless until the amplifiers were dug up and reinstalled after the war was over.
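The capstan principle the amplifier relied on can be put in a single formula. As a sketch (the friction coefficient and wrap angle below are illustrative assumptions, not figures from Bush's design), the capstan equation relates the feeble input torque to the powerful output torque:

% Capstan equation: a small holding torque controls a much larger one.
% mu is the friction coefficient; theta is the total wrap angle.
T_{\text{out}} = T_{\text{in}} \, e^{\mu \theta}

With, say, mu = 0.3 and two full turns of string (theta = 4*pi), the gain e^(mu*theta) is roughly 43, which is why a gentle nudge on the input arm could control a heavily loaded output shaft.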

Figure 4: Bush's torque amplifier mechanism. Source: V. Bush, "The differential analyzer" (see reference of historical significance).

The torque amplifier consisted of two "friction drums" that were rotated by belts attached to electrical motors. An input shaft was connected to the knife-edge wheel of an integrator, and that, in turn, would gently nudge an input arm. The slight movement of the input arm would cause the string to momentarily tighten on the right-hand friction drum, giving a pull on the output arm, which would then cause the output shaft to rotate with considerable force, the power coming from the left-hand friction drum.
Figure 5: An integrating mechanism from a differential analyzer
constructed at Manchester University, England. The steel
integrating disk and knife-edge wheel are on the right and the torque amplifier is on the left. Source: photo by the author.

It was now possible to have several integrating units with their power-amplified outputs mechanically connected together via gears and the final result connected to a plotting table. The usual format was to have several input tables (one for each function being integrated) on one side of the machine. As human operators followed the graphs with pointers (usually with the aid of magnifying glasses), the output from these tables would be used to position the knife-edge wheel along the radius of the integrating disk. The rotations of the knife-edge wheel would be enhanced by the torque amplifiers and fed into an interconnection table of gears and other mechanisms. These mechanisms could add, subtract, and even multiply two quantities, and the results would, in turn, be combined with others until the final result was obtained. The result was usually sent to a plotting table to produce a graphical representation on paper.
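In modern terms, the machine described above is a network of integrators wired into a feedback loop. Below is a minimal Python sketch of two chained "integrating units" solving y'' = -y (simple harmonic motion); the equation and step size are illustrative assumptions, not a problem from the original machines.

import math

# A minimal sketch of a differential analyzer as two chained integrators
# solving y'' = -y. Each integrator accumulates its input, just as the
# disk-and-wheel units accumulated shaft rotations.

def run_analyzer(steps, h):
    y, dy = 1.0, 0.0        # initial set-up: y(0) = 1, y'(0) = 0
    for _ in range(steps):
        ddy = -y            # "interconnection gearing": feed -y back in
        dy += ddy * h       # integrator 1: accumulate y'' into y'
        y += dy * h         # integrator 2: accumulate y' into y
    return y

# Running the "machine" out to x = pi should plot y close to cos(pi) = -1.
h = 0.001
print(run_analyzer(int(math.pi / h), h))  # ~ -1.0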

The gearing for the arithmetic operations was well understood. For example, the gearing required to add together two sets of values (rotations) is nothing other than the differential gear mechanism used in the rear axle of rear-wheel-drive automobiles. Similar gearing systems were well understood for other types of arithmetic operations. The only real problem was that gears have backlash. That is, when a gear is rotated, it tends to rotate slightly in the opposite direction when stopped. This led to inaccuracies in the final result that became more significant as the number of interconnecting gears increased for complex problems. This problem was solved by using gears with specially shaped teeth.
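To see why the differential gear adds, note that its carrier turns at the average of its two side gears, so the sum is simply twice the carrier's rotation. A minimal Python sketch, with the 2:1 readout gearing as an illustrative assumption:

# A minimal sketch of addition with a differential gear: the carrier
# rotates at the average of the two side gears, and a 2:1 readout gear
# (an illustrative assumption) removes the factor of two.

def differential_add(rot_a, rot_b):
    carrier = (rot_a + rot_b) / 2  # the carrier averages the side gears
    return 2 * carrier             # 2:1 output gearing recovers a + b

print(differential_add(3.5, 1.25))  # 4.75 output-shaft rotations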

Figure 6: Vannevar Bush examining the interconnection gearing of a differential analyzer at the Aberdeen Proving Ground in Maryland. The integrating mechanisms are on the far side, and one of the input or output tables is immediately behind him. Source: photo courtesy of MIT.
Development of Computers
Although the development of digital computers is rooted in
the abacus and early mechanical calculating devices, Charles Babbage is
credited with the design of the first modern computer, the analytical
engine, during the 1830s. Vannevar Bush built a mechanically operated
device, called a differential analyzer, in 1930; it was the first general-
purpose analog computer. John Atanasoff constructed the first electronic
digital computing device in 1939; a full-scale version of the prototype
was completed in 1942 at Iowa State College (now Iowa State Univ.). In
1941 Konrad Zuse built the Z3, a fully operational electromechanical
computer.
During World War II, the Colossus was developed for British
codebreakers; it was the first programmable electronic digital computer.
The Mark I, or Automatic Sequence Controlled Calculator, completed in
1944 at Harvard by Howard Aiken, was the first machine to execute
long calculations automatically, while the first all-purpose electronic
digital computer, ENIAC (Electronic Numerical Integrator and
Computer), which used thousands of vacuum tubes, was completed in
1946 at the Univ. of Pennsylvania. UNIVAC (UNIVersal Automatic
Computer) became (1951) the first computer to handle both numeric and
alphabetic data with equal facility; intended for business and
government use, this was the first widely sold commercial computer.
First-generation computers were supplanted by the transistorized
computers (see transistor) of the late 1950s and early 60s, second-
generation machines that were smaller, used less power, and could
perform a million operations per second. They, in turn, were replaced by
the third-generation integrated-circuit machines of the mid-1960s and
1970s that were even smaller and were far more reliable. The 1970s,
80s, and 90s were characterized by the development of the
microprocessor and the evolution of increasingly smaller but powerful
computers, such as the personal computer and personal digital
assistant (PDA), which ushered in a period of rapid growth in the
computer industry.
The World Wide Web was unveiled in 1990, and with the development
of graphical web browser programs in succeeding years the Web and the
Internet spurred the growth of general purpose home computing and the
use of computing devices as a means of social interaction. Smartphones,
which integrate a range of computer software with a cellular
telephone that now typically has a touchscreen interface, date to 2000
when a PDA was combined with a cellphone. Although computer tablets
date to the 1990s, they only succeeded commercially in 2010 with the
introduction of Apple's iPad, which built on software developed for
smartphones. The increasing screen size on some smartphones has made
them the equivalent of smaller computer tablets, leading some to call
them phablets.

computer, device capable of performing a series of arithmetic or logical operations. A computer is distinguished from a calculating
machine, such as an electronic calculator, by being able to store
a computer program (so that it can repeat its operations and make logical
decisions), by the number and complexity of the operations it can
perform, and by its ability to process, store, and retrieve data without
human intervention. Computers developed along two separate
engineering paths, producing two distinct types of computer—analog
and digital. An analog computer operates on continuously varying data;
a digital computer performs operations on discrete data.
Computers are categorized by both size and the number of people who
can use them concurrently. Supercomputers are sophisticated machines
designed to perform complex calculations at maximum speed; they are
used to model very large dynamic systems, such as weather
patterns. Mainframes, the largest and most powerful general-purpose
systems, are designed to meet the computing needs of a large
organization by serving hundreds of computer terminals at the same
time. Minicomputers, though somewhat smaller, also are multiuser
computers, intended to meet the needs of a small company by serving up
to a hundred terminals. Microcomputers, computers powered by
a microprocessor, are subdivided into personal computers and
workstations, the latter typically incorporating RISC processors.
Although microcomputers were originally single-user computers, the
distinction between them and minicomputers has blurred as
microprocessors have become more powerful. Linking multiple
microcomputers together through a local area network or joining
multiple microprocessors together in a parallel-processing system has
enabled smaller systems to perform tasks once reserved for mainframes,
and the techniques of grid computing have enabled computer scientists
to utilize the unused processing power of computers connected
over a network or the Internet.
Advances in the technology of integrated circuits have spurred the
development of smaller and more powerful general-purpose digital
computers. Not only has this reduced the size of the large, multi-user
mainframe computers—which in their early years were large enough to
walk through—to that of pieces of furniture, but it has also made
possible powerful, single-user personal computers and workstations that
can sit on a desktop or be easily carried. These, because of their
relatively low cost and versatility, have replaced typewriters in the
workplace and rendered the analog computer obsolete. The reduced
size of computer components has also led to the development of thin,
lightweight notebook computers and even smaller computer tablets and
smartphones that have much more computing and storage capacity than
that of the desktop computers that were available in the early 1990s.

References:

https://ethw.org/Category:Computing_and_electronics

https://www.infoplease.com/encyclopedia/science/engineering/computer/computer
