
The Evolution of Cloud Computing

• Hardware Evolution
– First-Generation Computers
– Second-Generation Computers
– Third-Generation Computers
– Fourth-Generation Computers
– Fifth-Generation Computers
• Software Generations
• History of Internet Software Evolution
• Server Virtualization
First-Generation Computers: 1940-1956
• The first computers used vacuum tubes for circuitry
and magnetic drums for memory.
• They were often enormous, taking up entire rooms.
• First-generation computers relied on machine language.
• They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
• The UNIVAC and ENIAC computers are examples
of first-generation computing devices.
First-Generation Computers
UNIVAC (Universal Automatic Computer) – the first commercially available electronic computer, delivered to the US Census Bureau in 1951; it replaced the punched-card tabulating machines that had counted US censuses since 1890.

VACUUM TUBES – electronic tubes about the size of light bulbs.
Second-Generation Computers: 1956-1963
• Transistors replaced vacuum tubes and
ushered in the second generation of
computers.
• Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages.
• High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN.
• These were also the first computers that stored
their instructions in their memory.
Second-Generation Computers: 1956-1963
TRANSISTOR – a three-legged component that shrank the size of first-generation computers; it occupied only about 1/100th of the space taken by a vacuum tube.
Benefits: more reliable, greater computational speed, no warm-up time, and far lower electricity consumption.
Third-Generation Computers: 1964-1971
• The development of the integrated circuit was the
hallmark of the third generation of computers.
• Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased speed and efficiency.
• Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system.
• This allowed the device to run many different applications at one time.
Third-Generation Computers: 1964-1971
• INTEGRATED CIRCUITS – square silicon chips containing circuitry that can perform the functions of hundreds of transistors.
Fourth-Generation Computers: 1971-present
• The microprocessor brought in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip.
• The Intel 4004 chip, developed in 1971, located all the components of the computer – from the central processing unit and memory to input/output controls – on a single chip.
• Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.
Fourth-Generation Computers: 1971-present
MICROPROCESSOR – a silicon chip that contains the CPU, the part of the computer where all processing takes place.
4004 chip – the first microprocessor, introduced by Intel Corporation in 1971.
The laws influencing Information
Technology
• The laws that are generally accepted as governing the spread of technology:
1. Moore's Law
2. Gilder's Law
3. Metcalfe's Law
4. Disk Law
Putting together network, storage, and computing speed: rates of growth of digital power
1. Moore's Law – the number of transistors on a single chip doubles approximately every 18–24 months.
2. Gilder's Law – aggregate bandwidth triples approximately every year.
3. Metcalfe's Law – the value of a network grows roughly with the square of the number of participants (community law).
4. Disk Law – storage doubles every 12 months.
[Chart: growth in transistor counts (Moore), bandwidth (Gilder), and network nodes (Metcalfe), 1970–2010]
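As a rough sanity check, the growth rates above can be put into code. The 2,300-transistor starting figure matches the Intel 4004; everything else is illustrative arithmetic, not measured data.

```python
# Illustrative sketch of the growth laws above (toy numbers, not data).

def moore_transistors(start, years, doubling_period=2.0):
    """Moore's Law: transistor counts double roughly every 18-24 months."""
    return start * 2 ** (years / doubling_period)

def gilder_bandwidth(start, years):
    """Gilder's Law: aggregate bandwidth triples roughly every year."""
    return start * 3 ** years

def metcalfe_value(participants):
    """Metcalfe's Law: network value grows roughly as participants squared."""
    return participants ** 2

# Starting from the Intel 4004's 2,300 transistors, ten years at a
# two-year doubling period gives a 2**5 = 32x increase.
print(moore_transistors(2_300, 10))                 # 73600.0
print(gilder_bandwidth(1.0, 3))                     # 27.0 (triples 3 times)
print(metcalfe_value(1000) / metcalfe_value(100))   # 100.0 (10x users, 100x value)
```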
Fifth-Generation Computers
• The Fifth Generation Computer Systems
project (FGCS) was an initiative by Japan's
Ministry of International Trade and Industry,
begun in 1982, to create a computer using
massively parallel computing/processing. It was
to be the result of a massive government/industry
research project in Japan during the 1980s. It
aimed to create an "epoch-making computer"
with supercomputer-like performance and to
provide a platform for future developments in
artificial intelligence.
Fifth-Generation Computers (present and beyond)
• Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications in use, such as voice recognition.
• The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
• The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.
Software Generations
• There was also a parallel set of generations
for software:
1. First generation: Machine language.
2. Second generation: Assembly language.
3. Third generation: Structured programming
languages such as C, COBOL and FORTRAN.
4. Fourth generation: Domain-specific languages such as SQL (for database access) and TeX (for text formatting).
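The jump from the third to the fourth generation is easiest to see side by side: a procedural 3GL program spells out how to compute a result, while a 4GL such as SQL only declares what result is wanted. A minimal sketch using Python's bundled sqlite3 module (the table and the salary figures are made up for the demo):

```python
import sqlite3

# Third-generation (procedural) style: say HOW, step by step.
employees = [("Ada", 120000), ("Grace", 115000), ("Alan", 90000)]
high_paid = []
for name, salary in employees:       # explicit loop and condition
    if salary > 100000:
        high_paid.append(name)

# Fourth-generation (declarative) style: say WHAT, and let the SQL
# engine decide how to execute it.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
db.executemany("INSERT INTO employees VALUES (?, ?)", employees)
rows = db.execute("SELECT name FROM employees WHERE salary > 100000").fetchall()

print(high_paid)                # ['Ada', 'Grace']
print([r[0] for r in rows])     # ['Ada', 'Grace']
```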
Internet Software Evolution
• Common protocol for the Internet
• IPv6
• Communicate using the Internet Protocol
• Building a common interface to the Internet
Common types of protocols
The Internet Protocol is used in concert with other protocols within the Internet Protocol Suite:
• Transmission Control Protocol (TCP)
• User Datagram Protocol (UDP)
• Internet Control Message Protocol (ICMP)
• Hypertext Transfer Protocol (HTTP)
• Post Office Protocol (POP)
• File Transfer Protocol (FTP)
• Internet Message Access Protocol (IMAP)
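The difference between TCP and UDP in the list above is concrete: TCP sets up a connection and delivers a reliable byte stream, while UDP ships individual datagrams with no handshake. A minimal UDP sketch over the loopback interface, using only Python's standard socket module (binding to port 0 asks the OS for any free port):

```python
import socket

# UDP is connectionless: each sendto() ships one self-contained datagram,
# with no handshake and no delivery guarantee (unlike TCP's byte stream).
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello over IP", addr)      # one datagram, fire and forget

data, peer = receiver.recvfrom(1024)
print(data)                                # b'hello over IP'
sender.close()
receiver.close()
```

Loopback delivery is reliable in practice, which is why this toy works; over a real network a UDP sender gets no such guarantee.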
• The Internet is named after the Internet
Protocol, the standard communications
protocol used by every computer on the
Internet.
• Conceptual foundation for creation of the
Internet
– Vannevar Bush, Norbert Wiener,  Marshall
McLuhan
• 1957 – ARPA ( Advanced Research Projects Agency)
• 1972 - renamed DARPA (Defense Advanced
Research Projects Agency)
SAGE (Semi-Automatic Ground Environment)
– was a continental air-defense network commissioned by the U.S. military and designed to help protect the United States against an airborne Soviet nuclear attack.
BBN (Bolt Beranek and Newman) – the company that supplied the first computers connected on the ARPANET.
Interface Message Processor (IMP)
– A minicomputer that connected each host site to the ARPANET
• NCP – Network Control Program
– First networking protocol
• TCP/IP – 1983
– Replaced NCP
– Now the most widely used networking protocol in the world
Common protocol for the Internet
• AHHP (ARPANET Host-to-Host Protocol)
• ICP (Initial Connection Protocol)
• FTP (File Transfer Protocol)
• SMTP (Simple Mail Transfer Protocol)
• 1983 – TCP/IP
• Versions of TCP/IP
– TCP v1, TCP v2, a split into TCP v3 and IP v3, and TCP v4 and IPv4
– IPv4 became the standard protocol
• 1990 – ARPANET transferred to NSFNET
(National Science Foundation Network)
• Connected to CSNET (Computer Science
Network)
– Linked universities around North America
• EUnet
– Connected research facilities in Europe
Evolution of IPv6
• IPv4
– Not designed to scale to a global level
• IPv6 was designed to increase the available address space and to process larger data packets
– This resulted in longer IP addresses
– Longer addresses caused problems for existing hardware and software
• Solving those problems required
– Changes to TCP/IP routing software
– New architectures and hardware implementations
IPv6
• Internet Engineering Task Force (IETF)
– IPv6, which was released in January 1995 as RFC
1752 
• Next Generation Internet Protocol
(IPNG) or TCP/IP v6
• 2004 - widely available from industry as an
integrated TCP/IP protocol
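The address-space jump that motivated IPv6 can be seen with Python's standard ipaddress module; the two addresses below come from the documentation-reserved example ranges.

```python
import ipaddress

# IPv4 uses 32-bit addresses (~4.3 billion in total); IPv6 widens this to
# 128 bits -- the "longer IP addresses" that forced routing changes.
v4 = ipaddress.ip_address("192.0.2.1")      # reserved IPv4 example range
v6 = ipaddress.ip_address("2001:db8::1")    # reserved IPv6 example range

print(v4.version, v6.version)   # 4 6
print(2 ** 32)                  # 4294967296 possible IPv4 addresses
print(2 ** 128)                 # ~3.4e38 possible IPv6 addresses
print(v6.exploded)              # 2001:0db8:0000:0000:0000:0000:0000:0001
```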
Finding a Common Method to Communicate
Using the Internet Protocol
• Engelbart – 1962
• Hypertext
• GUI – mouse
• NLS (oN-Line System)
– first working hypertext system
– designed to cross-reference research papers for sharing among
geographically distributed researchers
– Provided groupware capabilities, screen sharing among
remote users, and reference links for moving between
sentences within a research paper and from one research
paper to another
• Engelbart’s NLS system was chosen as the
second node on the ARPANET, giving him a role
in the invention of the Internet as well as the
World Wide Web
• 1980 – Tim Berners-Lee built ENQUIRE, a precursor to the web; he and Robert Cailliau later proposed the World Wide Web (1989-1990)
• HyperCard (Apple, 1987)
– first hypertext editing system available to the general public
• 1993 – the National Center for Supercomputing Applications (NCSA), a research institute at the University of Illinois, released the Mosaic browser; members of the Mosaic team went on to create Netscape
Building a Common Interface to the Internet

• Berners-Lee – first web browser
• info.cern.ch – world's first web server
• ViolaWWW browser
• Mosaic was the first widely popular web
browser available to the general public
• Mosaic Communications, which was later
renamed Netscape Communications
• October 1994 – Netscape released the first beta version of its browser, Mozilla 0.96b
• The final version, named Mozilla 1.0, was released in December 1994
– First commercial web browser
• Netscape Navigator
– The Mosaic programming team then developed another web browser, released as Netscape Navigator
– It was later renamed Netscape Communicator, then renamed back to just Netscape
• Microsoft
– 1995
– Internet Explorer 1.0
• Shipped as an add-on to the Windows 95 OS
• Succeeded largely because no manual installation was needed
• Netscape
– 2002
» Free, open-source software version of Netscape
» Mozilla – popular on non-Windows platforms
– 2004
» Mozilla Firefox
The Appearance of Cloud Formations (From One
Computer to a Grid of Many)

• Clustering
• Load balancing
• Data residency
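Load balancing in its simplest form is round-robin: hand each incoming request to the next backend server in turn. A toy sketch with Python's itertools (the server names are made up):

```python
from itertools import cycle

# Round-robin load balancing: requests rotate through the backends,
# spreading work evenly.  Backend names are hypothetical.
backends = ["app-1", "app-2", "app-3"]
pick = cycle(backends)

assignments = [next(pick) for _ in range(7)]
print(assignments)
# ['app-1', 'app-2', 'app-3', 'app-1', 'app-2', 'app-3', 'app-1']
```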
• 1990s – grid computing
– Storage management
– Migration of data
– Security provisioning
• Globus Toolkit
– An open-source software toolkit
– Used to build grid systems and applications
– Allows people to share computing power, databases, instruments, and other online tools securely across corporate, institutional, and geographic boundaries without sacrificing local autonomy
• Cloud computing
– Propagates the grid computing model
– Data centers are built on the concept of the grid
• Amazon S3 (Simple Storage Service)
• EMC CAS (Content Addressable Storage) – Centera
Server Virtualization

• Virtualization
– A method of running multiple independent virtual operating systems on a single physical computer
– 1960s – the virtual machine, or pseudo-machine, concept
– Platform virtualization
• Control program (host)
• Guest software
Parallel Processing

• The simultaneous execution of program instructions that have been allocated across multiple processors, with the objective of running a program in less time
• Multiprogramming
– Scheduling – Round Robin
– FIFO
– Context switch
– Deadlock
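Round-robin scheduling with context switches, listed above, can be simulated in a few lines: each process runs for one time quantum, and an unfinished process is preempted and moved to the back of the ready queue. The burst times are made-up numbers.

```python
from collections import deque

def round_robin(bursts, quantum):
    """Simulate round-robin CPU scheduling; return the dispatch order."""
    ready = deque(bursts.items())          # (pid, remaining time) pairs
    order = []                             # one entry per context switch
    while ready:
        pid, remaining = ready.popleft()   # dispatch the next ready process
        order.append(pid)
        if remaining > quantum:            # preempted: requeue the remainder
            ready.append((pid, remaining - quantum))
    return order

print(round_robin({"P1": 5, "P2": 3, "P3": 1}, quantum=2))
# ['P1', 'P2', 'P3', 'P1', 'P2', 'P1']
```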
Vector Processing

• Multiprocessing
– Master/slave model

Symmetric Multiprocessing Systems (SMP)
• To the user, an SMP system looks much like a single-processor multiprogramming system, but the workload is shared symmetrically across multiple processors
Massively Parallel Processing Systems
• A computer system with many independent arithmetic units or entire microprocessors that run in parallel; "massive" connotes hundreds if not thousands of such units
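The fan-out idea behind these parallel architectures (many units applying the same operation to different data) can be sketched with a worker pool. Note that CPython threads interleave rather than run CPU-bound work truly in parallel, so this illustrates the programming model, not MPP hardware:

```python
from concurrent.futures import ThreadPoolExecutor

def unit_of_work(n):
    """The per-unit computation: here, just squaring a number."""
    return n * n

# Fan the same function out across a pool of workers and collect the
# results in order, mirroring the "many independent units" model above.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(unit_of_work, range(8)))

print(results)   # [0, 1, 4, 9, 16, 25, 36, 49]
```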
