
COMP 1153

INTRODUCTION TO INFORMATION AND COMMUNICATIONS TECHNOLOGY


INTRODUCTION TO DATA PROCESSING
We use many different things today to make our work easier, and humans have always looked for ways to change our way of life by simplifying work. One of the most important modern inventions for communication is the cell phone, which continues to develop (for example, through 3G networks) so that we can stay in touch with loved ones who are far away.
What is Data?
- Data are raw materials. These are the things we see around us.
What is Data Processing?
- The manipulation of data into a more useful form.
- A Data Processing System refers to the equipment or devices and the procedures by which the result is achieved.
Information
- Processed data: data that have been gathered or put together to create real value.
CATEGORIES OF DATA PROCESSING
1. Mechanical Data Processing uses a combination of manual procedures and mechanical equipment (e.g., typewriters, calculators).
2. Electronic Data Processing uses different types of input, output, and storage devices interconnected with an electronic computer to process data.
DATA PROCESSING CYCLE

INPUT → PROCESS → OUTPUT

1. INPUT - input data are prepared in a convenient form for processing.
2. PROCESS - input data are changed, usually combined with other information; this is the action done to the data.
3. OUTPUT - the results of the preceding processing steps are collected; this is the result of the processed data.
Expanded Data Processing Cycle
4. Origination - refers to the process of collecting the original data. Source documents are the original recordings of data.
5. Distribution - refers to the distribution of the output data. Report documents are recordings of the output data.
6. Storage - the storing of results. A File is a unified set of data in storage; files are collections of records.

Origination → INPUT → PROCESS → OUTPUT → Distribution
(with Storage linked to PROCESS for saving and retrieving results)
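The expanded cycle can be sketched as a small program in which each stage is a function. The stage names mirror the cycle above; the sample data and function bodies are purely illustrative:

```python
# A minimal sketch of the expanded data processing cycle.
# Stage names mirror the cycle; the data and functions are illustrative.

def originate():
    # Origination: collect original data from source documents.
    return ["23", "7", "42"]

def process(inputs):
    # Process: change the input data into a more useful form.
    return sum(int(x) for x in inputs)

storage = {}  # Storage: files kept for future reference

def distribute(result):
    # Distribution: deliver the output as a report document.
    print(f"Report: total = {result}")

raw = originate()           # Origination
data = raw                  # Input: prepared in a convenient form
result = process(data)      # Process
distribute(result)          # Output / Distribution
storage["report"] = result  # Storage
```

Each arrow in the diagram corresponds to one hand-off between functions; a real system would add error handling and persistent files at the Storage stage.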

AREAS OF DATA PROCESSING

Data processing can be classified into 2 areas:

1. Scientific in nature - involves a limited volume of input and many logical or arithmetic calculations.
2. Business in nature - involves the need to establish, retain, and process files of data for producing useful information.

DATA PROCESSING OPERATIONS


1. Recording - refers to the transfer of data onto some form of document.
2. Verifying - data are carefully checked for errors.
3. Duplicating - reproducing the data onto many forms of documents.
4. Classifying - separates data into various categories; identifying and arranging items with like characteristics into groups or classes.
a. Classifying is done using a method called a Code.
b. 3 types of code:
i. Numeric
ii. Alphabetic
iii. Alphanumeric
5. Sorting - arranging data in a specific order.
6. Calculating - automatic operations on or manipulation of the data.
7. Summarizing and Reporting - a collection of data is condensed and certain conclusions from the data are represented in a meaningful format.
8. Merging - sets of data are put together to form a single set of data.
9. Storing - placing similar files together for future reference.
10. Retrieving - recovering stored data or information when needed.
11. Feedback - comparison of the output(s) with the goal set in advance.
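Several of these operations (merging, sorting, classifying, summarizing) can be illustrated with a short sketch. The records and the alphanumeric category codes here are invented for demonstration:

```python
# Illustrative records; the alphanumeric codes are made up for this example.
batch_a = [{"code": "A1", "amount": 100}, {"code": "B2", "amount": 50}]
batch_b = [{"code": "A1", "amount": 25}, {"code": "C3", "amount": 75}]

# Merging: two sets of data put together to form a single set.
merged = batch_a + batch_b

# Sorting: arranging data in a specific order (here, by code).
merged.sort(key=lambda r: r["code"])

# Classifying: grouping items with like characteristics by their code.
groups = {}
for record in merged:
    groups.setdefault(record["code"], []).append(record)

# Summarizing and Reporting: condensing the data into a meaningful format.
summary = {code: sum(r["amount"] for r in recs) for code, recs in groups.items()}
print(summary)  # {'A1': 125, 'B2': 50, 'C3': 75}
```

Recording, verifying, and storing would wrap this same pipeline with input capture, validation checks, and writes to a file.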
METHODS OF PROCESSING DATA
These are used to cope with demands for the timeliness, effectiveness, and availability of information.

1. Batch Processing - the simplest form of data processing; a technique in which data to be processed or programs to be executed are collected into groups to permit convenient, efficient, and serial processing.
2. Online Processing - refers to equipment or devices under the direct control of the central processing unit of a computer (e.g., inquiries).
3. Real-Time Processing - a method of data processing with the capability of a fast response: it obtains data from an activity or a physical process, performs computations, and returns a response rapidly enough to affect the outcome of the activity or process (e.g., airlines).
4. Distributed Processing - the most complex level of computer processing; smaller computers are connected to large host computers (e.g., electric power plants).
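The contrast between batch and online processing can be sketched as a toy example (real systems use job schedulers or transaction monitors; the doubling step stands in for any processing):

```python
# Batch processing: transactions are collected into a group first,
# then processed together in one serial run.
def process_batch(transactions):
    return [t * 2 for t in transactions]  # the "processing" is illustrative

collected = []
for t in [1, 2, 3]:
    collected.append(t)                   # collect first...
batch_results = process_batch(collected)  # ...then process all at once

# Online processing: each transaction is handled immediately on
# arrival, as with an interactive inquiry.
online_results = []
for t in [1, 2, 3]:
    online_results.append(t * 2)          # processed as soon as it arrives
```

Both paths produce the same results; they differ in *when* the processing happens, which is what determines the timeliness of the information.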
HISTORY OF COMPUTERS
3 types of devices that man has invented to help in calculating and processing data:
1. Manual-Mechanical - powered by hand
2. Electromechanical - powered by an electric motor
3. Electronic - uses circuit boards, transistors, or silicon chips
Earliest Computing Device
1. Abacus - developed in China in the 12th Century A.D. This machine is simple and effective.
2. Napier's Bones - invented by John Napier, a Scottish mathematician who also invented logarithms. Napier's Bones represented a significant contribution to the development of computing devices: the machine could reduce multiplication calculations to a simple addition method.
3. Oughtred's Slide Rule - invented by William Oughtred, an English mathematician. His invention made multiplication and division easy.
4. Pascal's Calculator (The Pascaline) - invented by Blaise Pascal, a French mathematician and experimental physicist. In 1645, he developed a calculating machine able to add and subtract numbers of up to 8 digits.
5. Leibniz's Calculator (the Leibniz Stepped Drum) - a German polymath, mathematician, and philosopher named Gottfried Wilhelm von Leibniz invented this in the 17th century. He was also the co-inventor of differential calculus. He completed his calculator in 1694: a machine that could add, subtract, multiply, divide, and extract square roots.
6. Analytical Engine - an Englishman named Charles Babbage was the inventor of this machine. He is commonly referred to as the father of modern computational devices because of his ideas. In 1822 he started work on a machine called the Difference Engine, which could calculate roots of polynomials, but he never finished it. He then started creating the Analytical Engine, which used Operation Cards containing the functions to be performed by the machine and Variable Cards to specify the actual data. Lady Ada Byron, Countess of Lovelace, worked with Babbage; she wrote demonstration programs for the Analytical Engine and has become known as the First Lady Programmer.

7. Hollerith's Punched Card Machine - In 1880, a statistician at the US Bureau of the Census named Herman Hollerith invented this machine to help with their tabulating problems. He used electricity to build an electromechanical counting machine that sorted and tabulated the data. Hollerith later resigned and built his own company selling tabulating machines; that company became a forerunner of the IBM Corporation.

DEVELOPMENT IN ELECTRONIC DATA PROCESSING


Many consider that the modern computer era commenced with the first large-scale
automatic digital computer, which was developed between 1939 and 1944
1. MARK I - invented by Howard H. Aiken at Harvard University with the help of graduate students and engineers from IBM.
- It was officially known as the IBM Automatic Sequence Controlled Calculator (ASCC), but is more commonly referred to as the Harvard Mark I.

- Description of Mark I
It was constructed out of switches, relays, rotating shafts, and clutches, and was described as sounding like "a roomful of ladies knitting." The machine contained more than 750,000 components, was 50 feet long and 8 feet tall, and weighed approximately 5 tons!
The device consisted of many calculators which worked
on parts of the same problem under the guidance of a
single control unit. Instructions were read in on paper
tape, data was provided on punched cards, and the
device could only perform operations in the sequence in
which they were received.
This machine was based on numbers that were 23 digits wide -- it could add or subtract
two of these numbers in three-tenths of a second, multiply them in four seconds, and
divide them in ten seconds.
2. ENIAC - In 1946, John Mauchly and J. Presper Eckert developed the ENIAC (Electronic Numerical Integrator And Calculator). The U.S. military sponsored their research; they needed a calculating device for writing artillery-firing tables (the settings used for different weapons under varied conditions for target accuracy).
Mauchly had begun designing (1942) a better calculating machine, based on the work of John Atanasoff, that would use vacuum tubes to speed up calculations.
On May 31, 1943, work on the new computer began under military commission; John Mauchly was the chief consultant and J. Presper Eckert was the chief engineer.
It took the team about one year to design the ENIAC and 18 months
and 500,000 tax dollars to build it. By that time, the war was over.
Description of ENIAC
The ENIAC contained 17,468 vacuum tubes, along with
70,000 resistors, 10,000 capacitors, 1,500 relays, 6,000
manual switches and 5 million soldered joints. It covered
1800 square feet (167 square meters) of floor space,
weighed 30 tons, consumed 160 kilowatts of electrical
power. In one second, the ENIAC (one thousand times faster
than any other calculating machine to date) could perform
5,000 additions, 357 multiplications or 38 divisions.
Programming changes would take the technicians weeks, and the machine always
required long hours of maintenance. As a side note, research on the ENIAC led to many
improvements in the vacuum tube.
In 1948, Dr. John Von Neumann made several modifications to the ENIAC. The
ENIAC had performed arithmetic and transfer operations concurrently, which caused
programming difficulties. Von Neumann suggested that switches control code selection
so pluggable cable connections could remain fixed. He added a converter code to enable
serial operation.
In 1946, J. Presper Eckert and John Mauchly started the Eckert-Mauchly Computer Corporation. In 1949, their company launched the BINAC (BINary Automatic Computer), which used magnetic tape to store data.

J Presper Eckert and John Mauchly both received the IEEE Computer Society
Pioneer Award in 1980. At 11:45 p.m., October 2, 1955, with the power finally shut off,
the ENIAC retired.
3. EDVAC - (Electronic Discrete Variable Automatic Computer) was one of the earliest
electronic computers. Unlike the ENIAC, it was binary rather than decimal.
The design for the EDVAC was developed before the ENIAC was even
operational. It was intended to resolve many of the problems created by the
ENIAC's design. Like the ENIAC, the EDVAC was built for the U.S. Army's
Ballistics Research Laboratory at the Aberdeen Proving Ground by the
University of Pennsylvania. The ENIAC designers Eckert & Mauchly were joined
by John von Neumann and some others and the new design was based on von
Neumann's 1945 report, First Draft of a Report on the EDVAC.
A contract to build the new computer was signed in April 1946 with an initial
budget of US$100,000 and the contract named the device the Electronic
Discrete Variable Automatic Calculator. A major concern in construction was to
balance reliability and economy. The final cost of EDVAC, however, ended up
similar to the ENIAC's at just under $500,000; five times the initial estimate.
Technical description
The computer that was built was to be binary with automatic
addition, subtraction, multiplication, programmed division and
automatic checking with a memory capacity of 1,000 44-bit words
(later set to 1,024 words, thus giving a memory, in modern terms, of
5.5 kilobytes).
Physically, the computer was built out of the following components:
- a magnetic tape reader-recorder
- a control unit with an oscilloscope
- a dispatcher unit to receive instructions from the control and memory and direct them to other units
- a computational unit to perform arithmetic operations on a pair of numbers at a time and send the result to memory after checking on a duplicate unit
- a timer
- a dual memory unit consisting of two sets of 64 mercury acoustic delay lines with a capacity of eight words per line
- three temporary tanks, each holding a single word

EDVAC's addition time was 864 microseconds and its multiplication time was 2900
microseconds (2.9 milliseconds).
The computer had almost 6,000 vacuum tubes and 12,000 diodes, and consumed 56 kW of power. It covered 490 ft² (45.5 m²) of floor space and weighed 17,300 lb (7,850 kg). The full complement of operating personnel was thirty people per eight-hour shift.
The computer began operation in 1951 although only on a limited basis. Its
completion was delayed because of a dispute over patent rights between Eckert &
Mauchly and the University of Pennsylvania. This resulted in Eckert and Mauchly leaving
to form the Eckert-Mauchly Computer Corporation and taking most of the senior
engineers with them.
By 1960 EDVAC was running over 20 hours a day with error-free run time averaging
eight hours. EDVAC received a number of upgrades including punch-card I/O in 1953,
extra memory in slower magnetic drum form in 1954, and a floating point arithmetic unit

in 1958. EDVAC ran until 1961, when it was replaced by BRLESC. During its lifetime it proved to be reliable and productive for its time.

THE FIVE GENERATIONS OF COMPUTERS


The history of computer development is often referred to in reference to the
different generations of computing devices. Each generation of computer is
characterized by a major technological development that fundamentally changed the
way computers operate, resulting in increasingly smaller, cheaper, more powerful and
more efficient and reliable devices. Read about each generation and the developments
that led to the current devices that we use today.

First Generation - 1940-1956: Vacuum Tubes


The first computers used vacuum tubes for circuitry and magnetic
drums for memory, and were often enormous, taking up entire rooms.
They were very expensive to operate and in addition to using a great
deal of electricity, generated a lot of heat, which was often the cause
of malfunctions. First generation computers relied on machine
language to perform operations, and they could
only solve one problem at a time. Input was based
on punched cards and paper tape, and output was
displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation
computing devices. The UNIVAC was the first commercial computer
delivered to a business client, the U.S. Census Bureau in 1951.
Second Generation - 1956-1963: Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers.
The transistor was invented in 1947 but did not see widespread use in computers until
the late 50s. The transistor was far superior to the vacuum tube, allowing computers to
become smaller, faster, cheaper, more energy-efficient and
more reliable than their first-generation predecessors. Though
the transistor still generated a great deal of heat that
subjected the computer to damage, it was a vast improvement
over the vacuum tube. Second-generation computers still
relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary
machine language to symbolic, or assembly, languages, which
allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early
versions of COBOL and FORTRAN. These were also the first computers that stored their
instructions in their memory, which moved from a magnetic drum to magnetic core
technology. The first computers of this generation were developed for the atomic energy
industry.

Third Generation - 1964-1971: Integrated Circuits


The development of the integrated circuit was the hallmark of the third generation of
computers. Transistors were miniaturized and placed on silicon chips, called
semiconductors, which drastically increased the speed and efficiency
of computers.
Instead of punched cards and printouts, users interacted with third
generation computers through keyboards and monitors and
interfaced with an operating system, which allowed the device to run
many different applications at one time with a central program that monitored the
memory. Computers for the first time became accessible to a mass audience because
they were smaller and cheaper than their predecessors.
Fourth Generation - 1971-Present: Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of
integrated circuits were built onto a single silicon chip. What in the first generation filled
an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in
1971, located all the components of the computer - from the central processing unit and
memory to input/output controls - on a single chip.

In 1981 IBM introduced its first computer for the home user, and in
1984 Apple introduced the Macintosh. Microprocessors also moved
out of the realm of desktop computers and into many areas of life as
more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be
linked together to form networks, which eventually led to the
development of the Internet. Fourth generation computers also saw the development of
GUIs, the mouse and handheld devices.
Fifth Generation - Present and Beyond: Artificial Intelligence
Fifth generation computing devices, based on artificial
intelligence, are still in development, though there are some
applications, such as voice recognition, that are being used today.
The use of parallel processing and superconductors is helping
to make artificial intelligence a reality. Quantum computation and
molecular and nanotechnology will radically change the face of
computers in years to come. The goal of fifth-generation computing
is to develop devices that respond to natural language input and are
capable of learning and self-organization.
