[Photograph courtesy International Business Machines Corp.]
Edison was illuminating the world and the same electrical power
was brightening the future of computing machines. As early as 1915
the Ford Instrument Company was producing in quantity a device
known as “Range Keeper Mark I,” thought to be the first electrical-
analog computer. In 1920, General Electric built a “short-circuit
calculating board” that was an analog or model of the real circuits
being tested. Westinghouse came up with an “alternating-current
network analyzer” in 1933, and this analog computer was found to
be a powerful tool for mathematics.
[Photograph courtesy International Business Machines Corp.]
Shannon’s work and the thinking of others in the field indicated the
power of the digital, yes-no, approach. A single switch can only be
on or off, but many such switches properly interconnected can do
amazing things. At first these switches were electromechanical; in
the Eckert-Mauchly ENIAC, completed for the government in 1946,
vacuum tubes in the Eccles-Jordan “flip-flop” circuit married
electronics and the computer. The progeny have been many, and
their generations faster than those of man. ENIAC has been followed
by BINAC and MANIAC, and even JOHNNIAC. UNIVAC and
RECOMP and STRETCH and LARC and a whole host of other
machines have been produced. At the start of 1962 there were some
6,000 electronic digital computers in service; by year’s end there will
be 8,000. The golden age of the computer may be here, but as we
have seen, it did not come overnight. The revolution has been slow,
gathering early momentum with the golden wheels of Homer’s
mechanical information-seeking vehicles that brought the word from
the gods. Where it goes from here depends on us, and maybe on the
computer itself.
“Theory is the guide to practice, and practice is the ratification and
life of theory.”
—John Weiss
3: How Computers Work
COMPUTER DICTIONARY
Access Time—Time required for computer to locate data and
transfer it from one computer element to another.
Adder—Device for forming sums in the computer.
Address—Specific location of information in computer memory.
Analog Computer—A physical or electrical simulator which
produces an analogy of the mathematical problem to be solved.
Arithmetic Unit—Unit that performs arithmetical and logical
operations.
Binary Code—Representation of numbers or other information
using only one and zero, to take advantage of open and closed
circuits.
Bit—A binary digit, either one or zero; used to make binary
numbers.
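To make the binary idea concrete, here is a sketch in a present-day programming language (Python); it is purely illustrative, not anything a machine of this era would run. Each one or zero stands for a closed or open circuit:

```python
# Encode a decimal number as a string of bits (ones and zeros).
def to_binary(n):
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits   # remainder by 2 gives the lowest bit
        n //= 2
    return bits or "0"

# Decode: reading left to right, each bit doubles the running total.
def from_binary(bits):
    value = 0
    for b in bits:
        value = value * 2 + int(b)
    return value

print(to_binary(13))        # "1101"
print(from_binary("1101"))  # 13
```

Thirteen becomes 1101: one eight, one four, no two, one one.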
Block—Group of words handled as a unit, particularly with
reference to input and output.
Buffer—Storage device to compensate for difference in input and
operation rate.
Control Unit—Portion of the computer that controls arithmetic
and logical operations and transfer of information.
Delay Line—Memory device to store and later reinsert
information; uses physical, mechanical, or electrical techniques.
Digital Computer—A computer that uses discrete numbers to
represent information.
Flip-Flop—A circuit or device which remains in either of two
states until the application of a signal.
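A flip-flop's two-state behavior can be sketched in modern code (Python here, with names of our own invention); the point is only that the state persists until a signal changes it:

```python
# A flip-flop modeled as an object holding one of two states
# until the application of a signal.
class FlipFlop:
    def __init__(self):
        self.state = 0               # starts in the "off" state

    def set(self):
        self.state = 1               # a signal drives it "on"

    def reset(self):
        self.state = 0               # a signal drives it back "off"

    def toggle(self):
        self.state = 1 - self.state  # flip to the other state

ff = FlipFlop()
ff.set()
print(ff.state)   # 1 -- and it remains 1 until another signal
ff.toggle()
print(ff.state)   # 0
```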
Gate—A circuit with more than one input, and an output
dependent on these inputs. An AND gate’s output is energized only
when all inputs are energized. An OR gate’s output is energized when
one or more inputs are energized. There are also NOT-AND gates,
EXCLUSIVE-OR gates, etc.
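The gate definitions above translate directly into a short sketch (present-day Python, for illustration only), where 1 means an energized input or output and 0 means de-energized:

```python
# Each gate's output depends only on its inputs.
def and_gate(*inputs):
    return int(all(inputs))       # energized only when ALL inputs are

def or_gate(*inputs):
    return int(any(inputs))       # energized when one or more are

def not_and_gate(*inputs):        # NOT-AND: the opposite of AND
    return 1 - and_gate(*inputs)

def exclusive_or_gate(a, b):      # energized when the inputs differ
    return int(a != b)

print(and_gate(1, 1, 0))          # 0
print(or_gate(1, 0, 0))           # 1
print(not_and_gate(1, 1))         # 0
print(exclusive_or_gate(1, 0))    # 1
```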
Logical Operation—A nonarithmetical operation, e.g., decision-
making, data-sorting, searching, etc.
Magnetic Drum—Rotating cylinder storage device for memory
unit; stores data in coded form.
Matrix—Circuitry for transformation of digital codes from one type
to another; uses wires, diodes, relays, etc.
Memory Unit—That part of the computer that stores information
in machine language, using electrical or magnetic techniques.
Microsecond—One millionth of a second.
Millisecond—One thousandth of a second.
Nanosecond—One billionth of a second.
Parallel Operation—Digital computer operation in which all
digits are handled simultaneously.
Programming—Steps to be executed by computer to solve
problem.
Random Access—A memory system that permits more nearly
equal access time to all memory locations than does a nonrandom
system. Magnetic core memory is a random type, compared with a
tape reel memory.
Real Time—Computer operation simultaneous with input of
information; e.g., control of a guided missile or of an assembly line.
Register—Storage device for small amount of information while,
or until, it is needed.
Serial Operation—Digital computer operation in which all digits
are handled serially.
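The serial case can be sketched (present-day Python, purely illustrative): the digits of two binary numbers pass through a single one-digit adder lowest bit first, with a carry held over between steps. A parallel machine would instead have one adder per digit position, all working simultaneously.

```python
# Serial addition of two binary numbers, one digit at a time.
def serial_add(a_bits, b_bits):
    a_bits, b_bits = list(a_bits), list(b_bits)
    result, carry = [], 0
    while a_bits or b_bits or carry:
        a = int(a_bits.pop()) if a_bits else 0   # lowest remaining bit
        b = int(b_bits.pop()) if b_bits else 0
        total = a + b + carry
        result.insert(0, str(total % 2))         # this position's digit
        carry = total // 2                       # carry to the next step
    return "".join(result)

print(serial_add("1101", "0110"))   # 13 + 6 = 19 -> "10011"
```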
Storage—Use of drums, tapes, cards, and so on to store data
outside the computer proper.
The Computer’s Parts
Looking at computers from a distance, we are vaguely aware that
they are given problems in the form of coded instructions and that
through some electronic metamorphosis this problem turns into an
answer that is produced at the readout end of the machine. There is
an engineering technique called the “black box” concept, in which we
are concerned only with input to this box and its output. We could
extend this concept to “black-magic box” and apply it to the
computer, but breaking the system down into its components is quite
simple and much more informative.
There are five components that make up a computer: input,
control, arithmetic (or logic) unit, memory, and output. As machine
intelligence expert Dr. W. Ross Ashby points out, we can get no
more out of a brain—mechanical or human—than we put into it. So
we must have an input. The kind of input depends largely on the
degree of sophistication of the machine we are considering.
With the abacus we set in the problem mechanically, with our
fingers. Using a desk calculator we punch buttons: a more refined
mechanical input. Punched cards or perforated tapes are much used
input methods. Computers are evolving rapidly: some can now
“read” for themselves, making the input visual, and there are even
computers that understand spoken commands.
Input should not be confused with the control portion of the
computer’s anatomy. We feed in data, but we must also tell the
computer what to do with the information. Shall it count the number
of cards that fly through it, or shall it add the numbers shown on the
cards, record the maximum and minimum, and print out an average?
Control involves programming, a computer term that was among the
first to be assimilated into ordinary language.
The arithmetic unit—that part of the computer that the pioneer
Babbage called his “mill”—is the nuts and bolts end of the business.
Here are the gears and shafts, the electromechanical relays, or the
vacuum tubes, transistors, and magnetic cores that do the addition,
multiplication, and other mathematical operations. Sometimes this is
called the “logic” unit, since often it manipulates the ANDS, ORS,
NORS, and other connectives in the logical algebra of Boole and his
followers.
The memory unit is just that: a place where numbers, words, or
other data are stored and ready to be called into use whenever
needed. There are two broad types of memory, internal and external,
and they parallel the kind of memory we use ourselves. While our
brain can store many, many facts, it does have a practical limit. This
is why we have phone books, logarithm tables, strings around
fingers, and so on. The computer likewise has its external memory
that may store thousands of times the capacity of its internal
memory. Babbage’s machine could remember a thousand fifty-digit
numbers; today’s large computers call on millions of bits of data.
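The internal-external division can be sketched in present-day code (Python, with a table of squares standing in for the phone book or logarithm table; the names are our own invention):

```python
# Internal memory: small and fast. External memory: vast but slower,
# consulted only when the fact is not already "in mind".
internal = {}                                  # a few facts held ready
external = {n: n * n for n in range(10000)}    # the big reference table

def recall(key):
    if key in internal:        # fast path: already remembered
        return internal[key]
    value = external[key]      # slow path: look it up outside
    internal[key] = value      # remember it for next time
    return value

print(recall(12))   # 144 -- fetched from the external table
print(recall(12))   # 144 -- now answered from internal memory
```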
After we have dumped in the data and told the computer what to
do with them, and the arithmetic and memory units have collaborated, it
remains only for the computer to display the result. This is the output
of the computer, and it can take many forms. If we are using a simple
analog computer such as a slide rule, the answer is found under the
hairline on the slide. An electronic computer in a bank prints out the
results of the day’s transactions in neat type at hundreds of lines a
minute. The SAGE defense computer system displays an invading
bomber and plots the correct course for interceptors on a scope; a
computer in a playful mood might type out its next move—King to Q7
and checkmate.
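The five units just described can be sketched working together in a toy model (present-day Python; the instruction names and layout are invented for this sketch, not those of any real machine):

```python
# A toy model of the five components: input, control, arithmetic
# unit, memory, and output.
class ToyComputer:
    def __init__(self):
        self.memory = {}                 # memory unit: addressed storage

    def run(self, program, data):        # control unit: steps through
        for instruction, address in program:      # the programmed orders
            if instruction == "LOAD":             # input -> memory
                self.memory[address] = data.pop(0)
            elif instruction == "ADD":            # arithmetic unit
                self.memory["acc"] = (self.memory.get("acc", 0)
                                      + self.memory[address])
            elif instruction == "PRINT":          # output
                print(self.memory[address])

computer = ToyComputer()
program = [("LOAD", 0), ("LOAD", 1),     # read two numbers in
           ("ADD", 0), ("ADD", 1),       # sum them in the accumulator
           ("PRINT", "acc")]             # display the result
computer.run(program, [5, 7])            # prints 12
```

The data list plays the part of the input deck, the program list the control instructions, and the printed result the output.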
With this sketchy over-all description to get us started, let us study
each unit in a little more detail. It is interesting to compare these
operations with those of our human computer, our brain, as we go
along.