
Course Number : CC213

Course Title : IT TOOLS APPLICATION IN BUSINESS


Credits : 3 units (3 hours lecture; 2 hours lab)
Week : 2 (two) : September 7-11, 2020
Instructor : NELSON G. BALNEG JR.
MODULE 2: COMPUTER CONCEPT AND DESCRIPTION
UNIT 1: HISTORY OF COMPUTERS
I. TOPICS:
Lesson 1: Earliest Computer
Lesson 2: Evolution of Computer
Lesson 3: Computer Generations
Lesson 4: Definition of Computer

II. LEARNING OUTCOME(S):


At the end of this unit, you must be able to:
1. Determine key events in the history of computers;
2. Describe how calculations were made during ancient times;
3. Identify the people who made significant inventions of calculating machines; and
4. Differentiate the forms and types of computers that emerged throughout history.

III. COURSE CONTENT

Introduction

The computer was born not for entertainment or email but out of a need to solve a serious number-crunching
crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the
U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based
computers that took up entire rooms.

Today, we carry more computing power on our smartphones than was available in these early models. The
following brief history of computing is a timeline of how computers evolved from their humble beginnings to
the machines of today that surf the Internet, play games and stream multimedia in addition to crunching
numbers.

Lesson 1: Earliest Computer

• Originally, calculations were computed by humans, whose job title was "computer".
• These human computers were typically engaged in the calculation of a mathematical expression.
• The calculations of this period were specialized and expensive, requiring years of training in
mathematics.
• The first use of the word "computer" was recorded in 1613, referring to a person who carried out
calculations, or computations, and the word continued to be used in that sense until the middle of the
20th century.

Lesson 2: Evolution of Computer

TALLY STICKS

A tally stick was an ancient memory aid device used to record and document numbers, quantities, or even messages.

ABACUS

An abacus is a mechanical device used to aid an individual in performing mathematical calculations. The
abacus was invented in Babylonia around 2400 B.C. The abacus in the form we are most familiar with was first used
in China around 500 B.C. It was used to perform basic arithmetic operations.

NAPIER’S BONES
• Invented by John Napier in 1614.
• Allowed the operator to multiply, divide, and calculate square and cube roots by moving the rods around and placing them in specially constructed boards.

SLIDE RULE
• Invented by William Oughtred in 1622.
• Based on Napier's ideas about logarithms: since log(ab) = log a + log b, multiplication can be reduced to adding lengths on two sliding logarithmic scales (see the sketch after this list).
• The slide rule, also known colloquially in the United States as a slipstick, is a mechanical analog computer.
• Used primarily for multiplication, division, roots, logarithms, and trigonometry.
• Not normally used for addition or subtraction.
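To make the logarithm idea concrete, here is a minimal Python sketch (not part of the original module) of the principle a slide rule mechanizes: add the logarithms of two numbers, then take the antilog to recover their product.

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply the way a slide rule does: add logarithms,
    then take the antilog to read off the product."""
    # Sliding one log scale along another physically adds log(a) + log(b).
    total_log = math.log10(a) + math.log10(b)
    # Reading the result off the scale corresponds to 10 ** x.
    return 10 ** total_log

print(slide_rule_multiply(3, 4))  # ~12.0, up to floating-point rounding
```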

PASCALINE
• Invented in 1642 by Blaise Pascal, the famous French philosopher and mathematician.
• It was the first digital calculator.
• Its limitation was that it could only add and subtract.
• It was too expensive.

STEPPED RECKONER
• Invented by Gottfried Wilhelm Leibniz in 1672.
• A machine that could add, subtract, multiply, and divide automatically.
• An interesting thing about the Stepped Reckoner is that Leibniz’s design was far ahead of its time: a working model of the machine did not appear until 1791, long after the inventor had died.

JACQUARD LOOM
• The Jacquard loom is a mechanical loom, invented by Joseph-Marie Jacquard in 1804.
• It is an automatic loom controlled by punched cards.
• His invention also provided a model for the input and output of data in the electro-mechanical and electronic computing industry.

DIFFERENCE ENGINE AND ANALYTICAL ENGINE
• Invented by Charles Babbage in 1822 and 1834, respectively.
• Charles Babbage is recognized today as the Father of Computers.
• The Difference Engine was the first mechanical computer: an automatic, mechanical calculator designed to tabulate polynomial functions.
• Built in the early 1800s as a special-purpose calculator, the Difference Engine was used for producing naval navigation charts.
• The Analytical Engine was a general-purpose design that used punched cards as input; because machined parts could not be made accurately enough, it was never quite completed.

TABULATING MACHINE
• Invented by Herman Hollerith in 1890.
• Hollerith worked as a statistician for the U.S. Census Bureau in the 1880s and 1890s.
• It was designed to assist in summarizing information and in accounting.

HARVARD MARK 1
• Also known as the IBM Automatic Sequence Controlled Calculator (ASCC).
• Invented by Howard H. Aiken in 1943.
• The first electro-mechanical computer.
• The computer had mechanical relays (switches) that flip-flopped back and forth to represent mathematical data. It was huge, weighing about five tons, with 500 miles of wiring.

Z1
• The first programmable computer.
• Created by Konrad Zuse in Germany between 1936 and 1938.
• Programming the Z1 required the user to insert punched tape into a punch-tape reader, and all output was also produced on punched tape.

ATANASOFF-BERRY COMPUTER (ABC)
• The first electronic digital computing device.
• Invented by Professor John Atanasoff and graduate student Clifford Berry at Iowa State University between 1939 and 1942.
• The ABC was called a digital computer because it processed data in discrete, digital units (the digits 1 and 0).
• The ABC used vacuum tubes, punched cards, and a drum-shaped memory device.

ENIAC
• ENIAC stands for Electronic Numerical Integrator and Computer.
• It was the first electronic general-purpose computer.
• Completed in 1946.
• Developed by John Presper Eckert and John W. Mauchly.
• The ENIAC weighed 30 tons and contained 18,000 vacuum tubes, yet had little more computing power than a modern calculator.

UNIVAC 1
• The UNIVAC I (UNIVersal Automatic Computer I) was the first commercial computer.
• Designed by J. Presper Eckert and John Mauchly.

EDVAC
• EDVAC stands for Electronic Discrete Variable Automatic Computer.
• The first stored-program computer.
• Its stored-program design was described by John von Neumann, and the machine began operating in the early 1950s.
• It had a memory that held both a stored program and data.

THE FIRST PORTABLE COMPUTER
• The Osborne 1 was the first portable computer.
• Released in 1981 by the Osborne Computer Corporation.

THE FIRST COMPUTER COMPANY
• The first computer company was the Electronic Controls Company.
• Founded in 1946 by J. Presper Eckert and John Mauchly.

Lesson 3: COMPUTER GENERATIONS


The evolution of the computer started in the 16th century and resulted in the form we see today. The present-day
computer, however, has also undergone rapid change during the last fifty years. The period during which this
evolution took place can be divided into five distinct phases, on the basis of the type of switching circuits used,
known as the generations of computers.

There are five generations of computer:

• First generation – 1946-1958
• Second generation – 1959-1964
• Third generation – 1965-1970
• Fourth generation – 1971-today
• Fifth generation – today to the future

The First Generation

• The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms.
• They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
• First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time.
• Input was based on punched cards and paper tape, and output was displayed on printouts.

The Second Generation

• Transistors replaced vacuum tubes and ushered in the second generation of computers.
• One transistor replaced the equivalent of 40 vacuum tubes, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable.
• They still generated a great deal of heat that could damage the computer.
• Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words (see the illustrative snippet after this list).
• Second-generation computers still relied on punched cards for input and printouts for output.
• These were also the first computers that stored their instructions in memory, which moved from magnetic-drum to magnetic-core technology.
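To see what the move from binary machine language to assembly meant for programmers, here is a toy Python sketch (the two-instruction machine and its opcodes are invented for illustration and do not correspond to any real instruction set). A tiny "assembler" translates readable mnemonics into the raw binary words a first-generation programmer would have written by hand.

```python
# Hypothetical 8-bit machine: a 4-bit opcode followed by a 4-bit operand.
MNEMONIC_TO_OPCODE = {"LOAD": 0b0001, "ADD": 0b0010}

def assemble(program):
    """Translate symbolic (mnemonic, operand) pairs into 8-bit machine words."""
    return [(MNEMONIC_TO_OPCODE[op] << 4) | operand for op, operand in program]

# Assembly language: instructions specified in words.
assembly = [("LOAD", 7), ("ADD", 5)]

# Machine language: the same program as raw binary.
machine_code = assemble(assembly)
print([f"{word:08b}" for word in machine_code])
# ['00010111', '00100101']
```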

The Third Generation

• The integrated circuit was invented separately by two people around 1958: Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor (1958-59).
• The development of the integrated circuit was the hallmark of the third generation of computers.
• Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
• Third-generation computers were much smaller and cheaper compared to second-generation computers.
• They could carry out instructions in billionths of a second.
• Users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory.
• Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

The Fourth Generation


• The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip.
• As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet.
• Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

The Fifth Generation


• Based on Artificial Intelligence (AI).
• Still in development.
• The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
• The goal is to develop devices that respond to natural-language input and are capable of learning and self-organization.
• Some applications, such as voice recognition, are already in use today.

MOORE’S LAW

• Deals with the steady rate of miniaturization of technology.
• Named for Intel co-founder Gordon Moore.
• Not really a law; more a "rule of thumb", a practical way to think about something.
• The observation is that chip density roughly doubles every 18 months, while prices decline.
• First described in 1965.
• Experts predicted this trend might continue until about 2020.
• It is limited once component sizes reach the molecular level.

What Is Moore's Law?

• Moore's Law asserts that the number of transistors on a microchip doubles about every two years, while the cost of computers is halved.
• In other words, we can expect that the speed and capability of our computers will increase every couple of years, and that we will pay less for them.
• Another tenet of Moore's Law is that this growth in the microprocessor industry is exponential, meaning that it expands steadily and rapidly over time (see the sketch below).
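As a quick worked illustration of the doubling rule, the transistor count after t years is N = N0 × 2^(t/2). Here is a minimal Python sketch (the starting point, the Intel 4004's roughly 2,300 transistors, is chosen for illustration and is not from the module):

```python
def projected_transistors(initial_count: int, years: float,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count under Moore's Law:
    the count doubles once every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Start from ~2,300 transistors (the Intel 4004, 1971).
for year in (1971, 1981, 1991, 2001):
    count = projected_transistors(2300, year - 1971)
    print(f"{year}: ~{count:,.0f} transistors")
# Doubling every 2 years compounds to roughly 32x growth per decade.
```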

And now where?

What of the future? The power of computers (the number of components packed on a chip) has doubled roughly every
18 months to 2 years since the 1960s. But the laws of physics are expected to bring a halt to Moore's Law, as this
idea is known, and force us to explore entirely new ways of building computers. What will tomorrow's PCs look like?
One long-touted idea is that they'll be using particles of light—photons—instead of electrons, an approach known as
optical computing or photonics. Currently, much of the smart money is betting on quantum computers, which deploy
cunning ways of manipulating atoms to process and store information at lightning speed. There's also hope we might
use spintronics (harnessing the "spin" of particles) and biomolecular technology (computing with DNA, proteins, and
other biological molecules), though both are in the very early stages of research. Chips made from new materials
such as graphene may also offer ways of extending Moore's Law. Whichever technology wins out, you can be quite
certain the future of computing will be just as exciting as the past!

Lesson 4: Definition of Computer

• A computer is a programmable machine.
• A computer is a machine that manipulates data according to a list of instructions.
• A computer is any device that aids humans in performing various kinds of computations or calculations.

The term computer has many definitions. The easiest to understand and remember is this: a computer is an electronic
device designed to manipulate data so that useful information can be generated. In the succeeding chapter, you will
learn the rudiments of how a computer works. After you develop computer awareness, you will be able to determine
your place in the ‘electronic revolution’. A tiny sketch of the "list of instructions" idea follows below.
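To make "manipulates data according to a list of instructions" concrete, here is a minimal Python sketch (the three-instruction machine is invented for illustration and is not any real design) of a computer stepping through a stored program:

```python
def run(program, value=0):
    """Execute a stored list of (instruction, operand) pairs --
    a machine manipulating data according to a list of instructions."""
    for instruction, operand in program:
        if instruction == "ADD":
            value += operand
        elif instruction == "MUL":
            value *= operand
        elif instruction == "PRINT":
            print(value)  # data out: useful information
    return value

# A stored program: raw data in, useful information out.
run([("ADD", 5), ("MUL", 3), ("PRINT", None)])  # prints 15
```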
