
Assignment of Computer:

PREPARED BY:
M. FAHAD BUTT (ROLL #43)
MBA (MORNING)
2010-2013
SUBMITTED TO: SIR SOHAIL.

DATE ON:
GENERATIONS OF COMPUTER:
The generations of computers can also be described as the past, present and future of the computer.

The following are the generations of computers according to time period.

 The First Generation
 The Second Generation
 The Third Generation
 The Fourth Generation
 The Fifth Generation

The computer age changed very fast; the first four generations spanned over 50 years.

 Trends across generations

– Decreasing size
– Increasing speed

“The First Generation”:


Time period of first generation:
1942—1955
Technology used:
Vacuum Tube

Language used:
Machine language
Memory:
Magnetic core memory
Storage:
Punched cards
Tape (1957)

Examples:
 1951, UNIVAC,
 1951, SAGE
 1952, EDVAC
 1953, IBM 701
 1953, The Whirlwind
Characteristics of 1st Generation Computers
 Computers were big and clumsy
 Electricity consumption was high
 Electrical failures occurred regularly, so computers were not very reliable
 Large air conditioners were necessary because the computers generated heat
 Batch processing

1951, UNIVAC
 Eckert and Mauchly completed the first commercial computer in the USA – the
UNIVAC (Universal Automatic Computer)
 First computer built for business
 Short Code - a set of instructions called Short Code was developed for the UNIVAC for
programmers
 The most famous UNIVAC product was the UNIVAC I mainframe computer of 1951,
which became known for predicting the outcome of the U.S. presidential election the
following year. This incident is particularly infamous because the computer predicted an
Eisenhower landslide when traditional pollsters all called it for Adlai Stevenson. The
numbers were so skewed that CBS's news boss in New York, Mickelson, decided the
computer was in error and refused to allow the prediction to be read. Instead they showed
some staged theatrics that suggested the computer was not responsive, and announced it was
predicting 8-7 odds for an Eisenhower win (the actual prediction was 100-1). When the
predictions proved true and Eisenhower won a landslide within 1% of the initial prediction,
Charles Collingwood, the on-air announcer, embarrassingly announced that they had
covered up the earlier prediction.

1951, SAGE – Semi-Automatic Ground Environment


IBM built the SAGE computers, became a leader in real-time applications, and used the
technology of the Whirlwind.

 SAGE computers were used in an early U.S. air defense system. They were fully deployed
in 1963 and consisted of 27 centers throughout North America, each with a duplexed
AN/FSQ-7 computer system containing over 50,000 vacuum tubes, weighing 250 tons and
occupying an acre of floor space.
 SAGE was the first large computer network to provide man-machine interaction in real time.

1952, EDVAC – Electronic Discrete Variable Automatic Computer


John von Neumann designed it with a central control unit, which would calculate and output all
mathematical and logical problems, and a memory that could be written to and read (RAM in
modern terms), which would store programs and data.

1953, IBM 701


The 701 was formally announced on May 21, 1952. It was the unit of the overall 701 Data
Processing System in which actual calculations were performed. That activity involved 274
assemblies executing all the system's computing and control functions by means of electronic pulses
emitted at speeds of up to one million per second. Nineteen IBM 701 systems were installed. The
University of California at Livermore developed a language compilation and runtime system called the
KOMPILER for their 701. A FORTRAN compiler was not released by IBM until the IBM 704.
The 701 can claim to be the first computer to display the potential of artificial intelligence, through
the Samuel checkers-playing program.

1953, the Whirlwind


Whirlwind was a large-scale, general-purpose digital computer, begun at the Servomechanisms
Laboratory of the Massachusetts Institute of Technology in 1946.
Advantages:
 Vacuum tubes were the only electronic components available in those days.
 Vacuum tube technology made it possible to build digital computers.
 These computers could calculate data in milliseconds.

Disadvantages:
 These computers were very large in size.
 They consumed a large amount of energy.
 They heated up very quickly due to the thousands of vacuum tubes.
 They were not very reliable.
 Air conditioning was required.
 Non-portable
 Costly commercial production
 Limited commercial use
 Continuous checking of hardware
 Difficult to use
“Second Generation Computers”:

Time period of second generation:


1955-1964
Technology used:
Transistor
– Smaller
– No warm-up time
– Less energy
– Less heat
– Faster
– More reliable
Storage:
Removable disk pack (1954)
Magnetic tape
Programming languages:
– Assembly language
– FORTRAN (1954)
– COBOL(1959)
Examples:
 1963, Mini-computer: PDP-8
 1964, IBM’s System 360
 1964 Real-time reservation system
 CDC 1604, etc.

Characteristics of 2nd Generation Computers:


– Computers became smaller
– Generated less heat
– Electricity consumption was lower
– More reliable and faster
– Core memory developed
– Magnetic tapes and disks used
– First operating systems developed
– A new processing method was needed: time-sharing
Examples:
 1963, Mini-computer: PDP-8:
 1964 Real-time reservation system:
 1964, IBM’s System 360:
 1964, BASIC (programming language):

1963, Mini-computer: PDP-8:


Digital introduced the first successful minicomputer – the PDP-8. It
was about as large as a fridge and used transistors and magnetic core
memory.
1964 Real-time reservation system:
IBM developed a real-time computerised ticket reservation system
for American Airlines.
 It was smaller than SAGE and was called SABRE
(Semi-Automatic Business Research Environment).

1964, IBM’s System 360:


The IBM System/360 (S/360) was a mainframe computer system family first announced by IBM
on April 7, 1964, and sold between 1964 and 1978. It was the first family of computers designed to
cover the complete range of applications, from small to large, both
commercial and scientific. The design made a clear distinction
between architecture and implementation, allowing IBM to release
a suite of compatible designs at different prices. All but the most
expensive systems used microcode to implement the instruction
set, which featured 8-bit byte addressing and binary, decimal and
floating-point calculations. It consisted of 6 processor models and 40
peripheral units. More than 100 computers per month were
ordered. The 360s were extremely successful in the market,
allowing customers to purchase a smaller system with the
knowledge they would always be able to migrate upward if their
needs grew, without reprogramming their application software. The
design is considered by many to be one of the most successful
computers in history, influencing computer design for years to
come.
1964, BASIC (programming language):
A programming language was necessary that could be used in a time-sharing environment and that
could serve as a training language.

Advantages:
 Smaller in size compared to first generation computers.
 More reliable.
 Used less energy and were not heated as much.
 Better portability.
 Better speed; could calculate data in microseconds.
 Used faster peripherals like tape drives, magnetic disks, printers, etc.
 Accuracy improved.

Disadvantages:
 Air conditioning required
 Constant maintenance was required
 Commercial production was difficult
 Only used for specific purposes
 Costly and not versatile
 Punched cards were used for input.
“Third Generation Computers”:
Time period of Third generation:

“1964-1971”

Technology used:
“Integrated Circuit”
 Electronic circuit on small
silicon chip
 Reliability
 Compactness
 Low cost
 Inexpensive – mass-produced
Programming languages:
High-level languages appeared

Brief introduction:
The development of the integrated circuit was the hallmark of the third generation of computers.
Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically
increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through
keyboards and monitors and interfaced with an operating system, which allowed the device to run
many different applications at one time with a central program that monitored the memory.
Computers for the first time became accessible to a mass audience because they were smaller and
cheaper than their predecessors.
Fairchild Camera and Instrument Corp. built the first standard metal oxide semiconductor product
for data processing applications, an eight-bit arithmetic unit and accumulator.  The fundamental
components of this semiconductor laid the groundwork for the future discovery of the
microprocessor in 1971.  Another company that took advantage of the third generation
advancements was IBM with the unveiling of the IBM System/360.  The company was making a
transition from discrete transistors to integrated circuits, and its major source of revenue moved
from punched-card equipment to electronic computer systems.
    In 1969 AT&T Bell Laboratories programmers Kenneth Thompson and Dennis Ritchie
developed the UNIX operating system on a spare DEC minicomputer.  UNIX was the first
modern operating system that provided a sound intermediary between software and hardware. 
UNIX provided the user with the means to allocate resources on the fly, rather than requiring the
resources to be allocated in the design stages. The UNIX operating system quickly secured a wide
following, particularly among engineers and scientists at universities and other computer science
organizations.
Examples:

 IBM 370
 UNIVAC 1108
 UNIVAC 9000, etc.
1965, Gordon Moore
– The semiconductor pioneer Gordon Moore
(founder of Intel) predicted that the number of transistors
on a microchip would double every year. It
became known as Moore’s Law and is still valid today.
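
To put the doubling claim in concrete numbers, here is a short Python sketch; the starting transistor count (2,300, roughly the Intel 4004 of 1971) and the ten-year horizon are assumptions chosen for the example, not figures from the text above.

# Illustrative sketch of Moore's Law as stated above: the transistor count
# doubles every `doubling_period` years.
def projected_transistors(start_count, years, doubling_period=1):
    return start_count * 2 ** (years / doubling_period)

print(projected_transistors(2300, 10))      # yearly doubling: 2,355,200
print(projected_transistors(2300, 10, 2))   # two-year doubling: 73,600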

 Burroughs used integrated circuits in parts of two computers - the B2500 and the B3500.
 Control Data and NCR made two computers using only integrated circuits - the CDC 7600
and the Century series respectively.

1968, Intel was founded (Integrated Electronics).


– They developed more sophisticated memory chips.
 1968, Magnetic core memory was replaced by a microchip.
– The first 256-bit RAM microchips, and later the first 1 Kb RAM (1,024-bit) chips,
caused the disappearance of the magnetic core memory that had been used since the
mid-1950s.
 1970, IBM replaced the System/360 with the System/370, which used only
integrated circuits.

“UNIVAC 1108”:

The UNIVAC 1108 was the second member of Sperry
Rand's UNIVAC 1100 series of computers, introduced
in 1964. Integrated circuits replaced the thin film
memory that the UNIVAC 1107 used for register
storage. Smaller and faster cores, compared to the
1107, were used for main memory.
In addition to faster components, two significant
design improvements were incorporated: base
registers and additional hardware instructions. The
two 18-bit base registers (one for instruction storage
and one for data storage) permitted dynamic
relocation: as a program got swapped in and out of
main memory, its instructions and data could be placed anywhere each time it got reloaded. The
additional hardware instructions included double precision arithmetic, double word load, store, and
comparison instructions. The processor could have up to 16 input/output channels for peripherals.
Just as the first UNIVAC 1108 systems were being delivered in 1965, Sperry Rand announced the
UNIVAC 1108 II (also known as the UNIVAC 1108A) which had support for multiprocessing: up
to three CPUs, four memory banks totaling 262,144 words, and two independent programmable
input/output controllers (IOCs). With everything busy, five activities could be going on at the same
moment: three programs running in the CPUs and two input/output processes in the IOCs. One
more instruction was incorporated: test-and-set, to provide for synchronization between the CPUs.
Although a 1964 internal study indicated only about 43 might sell, in all, 296 processors were
produced.
When Sperry Rand replaced the core memory with semiconductor memory, the same machine was
released as the UNIVAC 1100/20. In this new naming convention, the final digit represented the
number of CPUs in the system (e.g., the 1100/22 was a system with two CPUs).

IBM System/370

The IBM System/370 (S/370) was a model range of IBM mainframes announced on June 30,
1970 as the successors to the System/360 family. The series maintained backward compatibility
with the S/360, allowing an easy migration path for customers; this, plus improved performance,
were the dominant themes of the product announcement. Improvements over the S/360 first
released in the S/370 model range included:
 standard dual-processor capability;
 "monolithic main memory" based on integrated circuits instead of magnetic cores;
 full support for virtual memory through a new microcode floppy disk on the 370/145 and a
hardware upgrade to include a DAT box on the 370/155 and 370/165; these were not
announced until 1972;
 128-bit floating point arithmetic;
 address relocation hardware on all S/370s except the original models 155 and 165;
 the new S/370-158 and -168;
 four new operating systems: DOS/VS (DOS with virtual storage), OS/VS1 (OS/360 MFT
with virtual storage), OS/VS2 (OS/360 MVT with virtual storage) Release 1, termed SVS
(Single Virtual Storage), and Release 2, termed MVS (Multiple Virtual Storage), planned to
be available 20 months later (at the end of March 1974), and VM/370 – the re-implemented
CP/CMS.

“UNIVAC9000”
The UNIVAC 9000 Series (9200, 9300, 9400, and 9700) was
introduced in the mid-1960s to compete with the low end of the IBM
360 series. The 9000 series implemented the IBM 360 instruction
set. The 9200 and 9300 (which differed only in CPU speed)
implemented the same restricted 360 subset as the IBM 360/20,
while the UNIVAC 9400 implemented the full 360 instruction set. The 9400 was roughly equivalent
to the IBM 360/30. The 9000 series used plated wire memory, which functioned somewhat like core
memory but used a non-destructive read. Since the 9000 series was intended as a direct competitor to
IBM, it used 80-column cards and EBCDIC character encoding.

Advantages:
 Smaller in size as compared to previous generations.
 More reliable.
 Used less energy.
 Produced less heat as compared to previous generation computers.
 Better in speed and could calculate data in nanoseconds.
 Used fans for heat discharge to prevent damage.
 Maintenance cost was low because hardware failures were rare.
 Totally general-purpose computers.
 Could be used for high-level languages.
 Good storage.
 Versatile to an extent.
 Less expensive.
 Better accuracy.
 Commercial production increased.
 Used mouse and keyboard for input.

Disadvantages: -
 Air conditioning was required.
 Highly sophisticated technology was required for the maintenance of IC chips.
“Fifth Generation” - Present and Beyond: Artificial Intelligence
Intelligent computers

Artificial intelligence
Expert systems
Natural language

“Fifth generation computing devices, based on artificial intelligence, are still in development,
though there are some applications, such as voice recognition, that are being used today”.

Applications for 5th Gen computers

 Intelligent robots that could ‘see’ their environment (visual input – e.g. a video camera), could
be programmed to carry out certain tasks, and should be able to decide for themselves how a
task should be accomplished, based on the observations they made of their environment.
 Intelligent systems that could control the route of a missile and defence-systems that could fend
off attacks. 
 Word processors that could be controlled by means of speech recognition.
 Programs that could translate documents from one language to another.

Some technological developments that could make the development of fifth-generation computers
possible include:

– Parallel-processing - many processors are grouped to function as one large group processor
(a small sketch of this idea follows this list).
– Superconductors - a superconductor is a conductor through which electricity can travel
without any resistance resulting in faster transfer of information between the components of a
computer.
– Expert Systems - help doctors to reach a diagnosis by following the logical steps of
problem solving, just as the doctor would have done.
– Speech recognition systems - capable of recognising dictation and entering the text into a
word processor, are already available.
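
As a concrete illustration of the parallel-processing idea above, here is a minimal Python sketch using the standard multiprocessing module; the workload (summing squares over number ranges), the chunk boundaries and the worker count are arbitrary choices for the example, not details from the text.

from multiprocessing import Pool

# Minimal sketch of parallel processing: several worker processes act as one
# "group processor", each handling an independent slice of the same task.
def partial_sum(bounds):
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))   # arbitrary CPU-bound work

if __name__ == "__main__":
    chunks = [(0, 2_500_000), (2_500_000, 5_000_000),
              (5_000_000, 7_500_000), (7_500_000, 10_000_000)]
    with Pool(processes=4) as pool:            # four workers run at the same time
        total = sum(pool.map(partial_sum, chunks))
    print(total)                               # same answer as one loop, computed in parallel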
The Fifth Generation AI – Artificial Intelligence
How computers can be used for tasks that require human characteristics.

1. Problem Solving by Search:


An important aspect of intelligence is goal-based problem solving. The solution of many
problems (e.g. noughts and crosses, timetabling, chess) can be described by finding a sequence of
actions that lead to a desirable goal. Each action changes the state, and the aim is to find the
sequence of actions and states that lead from the initial (start) state to a final (goal) state (a
minimal sketch of such a search follows the list below).

 A well-defined problem can be described by:

1. Initial state
2. Operator or successor function - for any state x, returns s(x), the set of states
reachable from x with one action
3. State space - all states reachable from the initial state by any sequence of actions
4. Path - a sequence through the state space
5. Path cost - a function that assigns a cost to a path; the cost of a path is the sum of the
costs of the individual actions along the path
6. Goal test - a test to determine whether a state is a goal state
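
A minimal Python sketch of such a search, built from the components just listed, follows; the toy state space (states are integers, the two actions are "add 1" and "double") is invented for illustration and is not part of the original text. Breadth-first search is used because, when every action has the same cost, the path cost is just the number of actions and the first goal found lies on a cheapest path.

from collections import deque

# Minimal sketch of goal-based problem solving by search: an initial state,
# a successor function s(x), and a goal test, explored breadth-first.
def breadth_first_search(initial_state, successors, is_goal):
    frontier = deque([[initial_state]])        # paths waiting to be extended
    visited = {initial_state}                  # states already reached
    while frontier:
        path = frontier.popleft()
        state = path[-1]
        if is_goal(state):
            return path                        # states from the start to the goal
        for nxt in successors(state):          # s(x): states reachable in one action
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None                                # no sequence of actions reaches the goal

# Toy problem, invented for illustration: reach 11 from 1 using "add 1" or "double".
print(breadth_first_search(1, lambda x: [x + 1, x * 2], lambda x: x == 11))
# Prints [1, 2, 4, 5, 10, 11]: five actions from the start state to the goal.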
2. Expert Systems
Programming computers to make decisions in real-life situations (for example, some expert systems
help doctors diagnose diseases based on symptoms). Software is used with an extensive set of
organized data that presents the computer as an expert on a particular topic.
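
A minimal rule-based sketch of that idea in Python follows; the symptom-to-condition rules are invented purely for illustration and do not represent real medical knowledge.

# Minimal sketch of an expert system: organized knowledge stored as rules,
# plus a simple inference step that matches observed facts against them.
# The rules below are invented for illustration, not medical advice.
RULES = [
    ({"fever", "cough", "fatigue"}, "possible flu"),
    ({"sneezing", "runny nose"}, "possible common cold"),
    ({"headache", "sensitivity to light"}, "possible migraine"),
]

def diagnose(symptoms):
    observed = set(symptoms)
    # A rule fires when every symptom it requires is present in the input.
    return [conclusion for required, conclusion in RULES if required <= observed]

print(diagnose(["fever", "cough", "fatigue", "headache"]))   # ['possible flu']

Real expert systems add many more rules and an explanation facility, but the pattern of matching observed facts against stored expert rules is the same.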

3. Natural Language:

Humans communicate with computers in the language they use on a daily basis.

Programming computers to understand natural human languages. Natural-language processing offers
the greatest potential rewards because it would allow people to interact with computers without
needing any specialized knowledge. You could simply walk up to a computer and talk to it.
Unfortunately, programming computers to understand natural languages has proved to be more
difficult than originally thought. Some rudimentary translation systems that translate from one
human language to another are in existence, but they are not nearly as good as human translators.

4. Robotics:
A computer-controlled device that can physically manipulate its surroundings.

Robot development firm Speecys Corp. of Tokyo developed a small humanoid robot, powered
entirely by easy-to-replace, environmentally friendly fuel-cell batteries.

Today, the hottest area of artificial intelligence is neural networks, which are proving successful in
a number of disciplines such as voice recognition and natural-language processing. There are
several programming languages that are known as AI languages because they are used almost
exclusively for AI applications. The two most common are

 LISP
 Prolog.

The goal of fifth-generation computing is to develop devices that respond to natural language input
and are capable of learning and self-organization.

The End
