Brief History of Computers


A Look Back at Computing


Computers have become one of the most important parts of modern society. Nearly
everything modern requires or uses computer-related technology in some way. But
how did computers as we know them come to exist? Did someone sitting in a lab
just one day say, "Aha! I've got it! The computer!"? No, that is not how it
happened. Rather, many years of brilliant ideas and research from many
different individuals contributed to modern computing.
The Early Days (1,000 B.C. to 1940)
Ancient Civilizations
Computers are so named because they perform mathematical computations at high speed. As a
result, the history of computing goes back at least 3,000 years, to when ancient civilizations
were making great strides in arithmetic and mathematics. The Greeks, Egyptians, Babylonians,
Indians, Chinese, and Persians were all interested in logic and numerical computation. The
Greeks focused on geometry and rationality [1], the Egyptians on simple addition and
subtraction [2], the Babylonians on multiplication and division [3], the Indians on the base-10 decimal
numbering system and the concept of zero [4], the Chinese on trigonometry, and the Persians on
algorithmic problem solving. [5] These developments carried over into more modern
centuries, fueling advancements in areas like astronomy, chemistry, and medicine.
Pascal, Leibniz, and Jacquard
During the first half of the 17th century there were very important advancements in the
automation and simplification of arithmetic computation. John Napier invented logarithms to
simplify difficult mathematical computations. [6] The slide rule was introduced in the year
1622 [7], and Blaise Pascal spent much of his life in the 1600s working on a calculator called the
Pascaline. [9] The Pascaline was mostly finished by 1645 and was able to do addition and
subtraction by way of mechanical cogs and gears. [8] In 1674 the German mathematician
Gottfried Leibniz created a mechanical calculator called the Leibniz Wheel. [10] This 'wheel'
could perform addition, subtraction, multiplication, and division, albeit not very well in all
instances.
Neither the Pascaline nor the Leibniz Wheel can be categorized as a computer, because they did not
have memory where information could be stored and because they were not
programmable. [5] The first device that did satisfy these requirements was a loom developed in
1801 by Joseph Jacquard. [11] Jacquard built his loom to automate the process of weaving rugs
and clothing. It did this using punched cards that told the machine what pattern to weave: where
there was a hole in the card the machine would weave, and where there was no hole it
would not. Jacquard's idea of punched cards was later used by computer companies like
IBM to program software.
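
As a purely illustrative aside (not taken from the article or from any historical source), the hole/no-hole idea can be sketched in a few lines of Python: the 'card' is just data, and changing the cards changes what the machine produces. The card symbols and pattern below are invented for the example.

# Toy sketch of a punched-card-controlled loom (illustrative only).
# Each "card" is a string: '#' marks a hole (weave), '.' marks no hole (skip).
cards = [
    "#.#.#.#.",
    ".#.#.#.#",
    "##..##..",
]

def weave_row(card: str) -> str:
    # The machine reads the card position by position:
    # a hole raises the thread ('X'); no hole leaves a gap ('-').
    return "".join("X" if punch == "#" else "-" for punch in card)

for card in cards:
    print(weave_row(card))
# Swapping in different cards produces a different pattern: the "program" is the data.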
Hollerith
In America during the late 1800s, immigrants were pouring in from all over the
world. Officials at the U.S. Census Bureau estimated that it would take ten to twelve years to do
the 1890 census. By the time they finished it would be 1900, and they would have to do the census all
over again! The problem was that all of the calculations for the census were performed manually.
To solve this problem the U.S. Census Bureau held a competition that called for proposals
outlining a better way to do the census. [17] The winner of the competition was Herman Hollerith,
a statistician, who proposed that the use of automated machines would greatly reduce the time
needed to do the census. He then designed and built programmable card-processing machines
that would read, tally, and sort data entered on punched cards. The census data was coded onto
cards using a keypunch. These cards were then taken to a tabulator (for counting and tallying) or
a sorter (for ordering alphabetically or numerically). [18]
Hollerith's machines were not all-purpose computers, but they were a step in that direction.
They successfully completed the 1890 census in just two years. The 1880 census had taken eight years to
complete, and the population had been 30% smaller then, which showed that automated processing was
far more efficient for large-scale operations. [5] Hollerith saw the potential in his tabulating
and sorting machines, so he left the U.S. Census Bureau to found the Tabulating Machine
Company. His punch-card machines became national bestsellers, and in 1924, after a series of
mergers with other similar companies, Hollerith's company changed its name to
IBM. [19] The computer age was about to begin.

Birth of Computers (1940-1950)


WWII
World War II brought concerns about how to calculate the logistics of such a large-scale conflict.
The United States needed to calculate ballistics, deploy massive numbers of troops, and crack
secret codes. The military started a number of research projects to try to build computers that
could help with these tasks and more. In 1939 IBM and Harvard University began working together to
build a general-purpose computer called the Mark 1. It was programmable and was built from
electromechanical relays, switches, magnets, and gears. The
Mark 1 was completed in 1944. [20] It had memory for 72 numbers and could perform
23-digit multiplication in 4 seconds. [5] It was operational for 15 years and performed many
calculations for the U.S. Navy during WWII.
Von Neumann
Though the computers developed during the Second World War were definitely computers, they
were not the kind of computers we are used to in modern times. John von Neumann helped work
on the ENIAC and figured out how to make computers even better. The ENIAC was programmed
externally with wires, connectors, and plugs. Von Neumann wanted to make programming
something that was internalized: instead of rerouting wires and plugs, a person could write a
different sequence of instructions that changed the way the computer ran. Von Neumann formalized the
idea of the stored program, which is still implemented today in computers that use the
'Von Neumann architecture'. [24]
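
To make the stored-program idea concrete, here is a minimal, purely illustrative Python sketch (not drawn from the article or from any real historical machine): the program is just data held in the same memory as the numbers it operates on, so changing the machine's behavior means changing that data rather than rewiring anything. The instruction names and memory layout are invented for the example.

# Minimal sketch of a stored-program machine (illustrative only).
# Memory holds both instructions and data; a program counter walks through it.
memory = [
    ("LOAD", 9),    # 0: load the value at address 9 into the accumulator
    ("ADD", 10),    # 1: add the value at address 10
    ("STORE", 11),  # 2: store the accumulator at address 11
    ("PRINT", 11),  # 3: print the value at address 11
    ("HALT", 0),    # 4: stop
    0, 0, 0, 0,     # 5-8: unused cells
    2, 3,           # 9-10: data the program operates on
    0,              # 11: the result goes here
]

def run(memory):
    accumulator = 0
    pc = 0  # program counter: address of the next instruction
    while True:
        op, addr = memory[pc]
        pc += 1
        if op == "LOAD":
            accumulator = memory[addr]
        elif op == "ADD":
            accumulator += memory[addr]
        elif op == "STORE":
            memory[addr] = accumulator
        elif op == "PRINT":
            print(memory[addr])
        elif op == "HALT":
            break

run(memory)  # prints 5; editing the instructions held in memory changes the behavior, with no rewiring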

First Generation (1950 - 1957)

One of the first computers to implement Von Neumann's idea was the EDVAC, whose design Von
Neumann himself had helped describe; it was completed in 1951. At about the same time a stored-program
computer called the EDSAC was developed in England. [25] The EDVAC design was commercialized as the
UNIVAC 1, which was sold to the U.S. Bureau of the Census in March 1951. It was one of the
first computers ever built for commercial sale. [26] The UNIVAC 1 made a famous appearance on CBS in
November 1952 during the presidential election. [27] The television network had rented the
computer to boost ratings, planning to have it predict who would win the election. The
UNIVAC predicted very early on that Eisenhower would beat Stevenson, which turned out to be correct.
Network executives were skeptical and did not go on the air with the prediction until they had arrived at
the same conclusion using manual methods. The UNIVAC sat right behind the CBS staff during the
broadcast, and it was the first time that many people had the chance to see this elusive new
technology called the computer.
IBM's first production computer was the IBM 701 Defense Calculator, introduced in April
1952. [28] The IBM 701 was used mostly for scientific calculation. The EDVAC, EDSAC, UNIVAC
1, and IBM 701 were all large, expensive, slow, and unreliable pieces of technology, like all
computers of this time. [29] Some other computers of this era worth mentioning are the
Whirlwind, developed at the Massachusetts Institute of Technology, and the JOHNNIAC, built by the RAND
Corporation. The Whirlwind was the first computer to display real-time video and to use core
memory. [33] The JOHNNIAC was named in honor of John von Neumann. Computers at this time
were usually kept in special locations such as government and university research labs or military
compounds. Only specially trained personnel were granted access to these computers. Because
they used vacuum tubes to calculate and store information, these computers were also very hard
to maintain. First-generation computers also used punched cards to store programs written in symbolic
programming languages. [5] Most people were only indirectly affected by this first generation of
computing machines and knew little of their existence.

Second Generation (1957 - 1965)


The second generation of computing took place between 1957 and 1965. Computers now
used transistors, which had been invented in 1947 by a group of researchers at Bell
Laboratories, instead of vacuum tubes. [30] Because of the transistor and advances in electrical
engineering, computers were now smaller, faster, more reliable, and cheaper than ever before.
More universities, businesses, and government agencies could actually afford computers.
In 1957 the first FORTRAN compiler was released. FORTRAN was the first high-level
programming language ever made. [31] It was developed by IBM for scientific and engineering
use. In 1959, the COmmon Business-Oriented Language (COBOL) was
released. Where FORTRAN was designed for science and engineering, COBOL was designed to
serve business environments with their finances and administrative tasks. [32] These two
programming languages essentially helped create the occupation of programmer. Before
these languages existed, programming computers required electrical engineering knowledge.
This generation of computers also saw an increase in the use of core memory and disks for
mass storage. A notable computer from this time period is the IBM System/360, a
mainframe that is considered one of the important milestones in the industry. It was
actually a family of computer models that could be sold to a wide variety of businesses and
institutions. [37]

Third Generation (1965 - 1975)

The third generation of computing spanned from 1965 to 1975. During this time integrated
circuits, in which transistors, resistors, and capacitors are etched onto a single piece of silicon, came
into use. This reduced the price and size of computers, adding to a general trend in the computer industry
toward miniaturization. In 1960 the Digital Equipment Corporation introduced the Programmed Data
Processor-1 (PDP-1), which can be called the first minicomputer due to its relatively small
size. [34] It is classified as a third generation computer because of the way it was built, even
though it was made before 1965. The PDP-1 was also the computer that ran one of the very first video
games, Spacewar (written in 1962). [35]
The software industry came into existence in the mid-1970s as companies formed to write
programs that would satisfy the increasing number of computer users. Computers were being
used everywhere in business, government, military, and education environments. Because of
their target market, the first software companies mostly offered accounting and statistical
programs. [5] This time period also saw the first computing standards created for
compatibility between systems.
E-mail originated sometime between 1961 and 1966, allowing computer users to send
messages to each other as long as they were connected through a network. [38] This is closely
tied to the work being done on the Advanced Research Projects Agency Network
(ARPANET), networking technology and innovation that would one day bring about the internet. [50]

Fourth Generation (1975 - 1985)


The fourth generation of computing spanned from 1975 to 1985. Computer technology had
advanced so rapidly that a computer could now fit in something the size of a typewriter. These
machines were called microcomputers, the first being the Altair 8800, which debuted in 1975 as a
mail-order hobby kit. Many people credit the Altair 8800 as the computer that sparked the
modern computer revolution, especially since Bill Gates and Paul Allen founded Microsoft around
Altair BASIC, a programming language made specifically for the 8800. [36] Now that
computers could fit on desks, they became much more common.
A small company called Apple Computer, Inc. was established in 1976 and single-handedly
changed the industry forever. Steve Wozniak and Steve Jobs began to sell their Apple 1
computer that same year, and it quickly gained popularity. It was sold as a fully assembled
circuit board to which the user only needed to attach a keyboard and a monitor, a novel idea for
computers at that time. The Apple II was released the next year; it was one of the first mass-produced
microcomputers to be sold commercially, and it ushered in the era of personal computing.
In 1981, the Microsoft Disk Operating System (MS-DOS) was released to run on the Intel 8086
microprocessor. [39] Over the next few years MS-DOS became the most popular operating
system in the world, eventually leading to Microsoft Windows 1.0 being released in 1985. [40] In
1984 Apple introduced the Macintosh and its Mac OS, one of the first operating systems to be completely
graphical. Both Mac OS and Windows used pull-down menus, icons, and windows to make
computing more user-friendly. Computers were now being controlled with a mouse as well as
a keyboard; Xerox had shipped one of the first commercial mice in 1981. [41]
Software became much more common and diverse during this period with the development of
spreadsheets, databases, and drawing programs. Computer networks and e-mail became much
more prevalent as well.
The first truly portable computer, called the Osborne 1, was released in 1981. [37] Portable
computers like the TRS-80 Model 100/102 and IBM 5155 followed afterward. [38]
Not all the computers of the time were small, of course. Supercomputers were still being
built with the aim of being as fast as possible, and they were sold to companies,
universities, and the military. An example of one such supercomputer is the Cray-1,
released in 1976 by Cray Research. [39] It became one of the best-known and most successful
supercomputers ever thanks to its unique design and speed of up to 250 MFLOPS.
This generation was also important for the development of embedded systems: special
systems, usually very small, that have a computer inside to control their
operation. [42] These embedded systems were put into things like cars, thermostats, microwave
ovens, wristwatches, and more.
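
As another purely illustrative aside (not from the article), the kind of control loop a tiny embedded computer runs can be sketched in a few lines of Python; the temperature values, thresholds, and function names are invented for the example, and the sensor and heater are simulated so the sketch is self-contained.

# Toy sketch of an embedded thermostat control loop (illustrative only).
import random

TARGET_TEMP = 20.0   # desired temperature in degrees Celsius (made-up value)
HYSTERESIS = 0.5     # dead band so the heater does not switch constantly

def read_temperature() -> float:
    # Stand-in for reading a hardware temperature sensor.
    return 18.0 + random.random() * 4.0

def set_heater(on: bool) -> None:
    # Stand-in for switching a heater relay on or off.
    print("heater", "ON" if on else "OFF")

def control_step(heater_on: bool) -> bool:
    temp = read_temperature()
    if temp < TARGET_TEMP - HYSTERESIS:
        heater_on = True
    elif temp > TARGET_TEMP + HYSTERESIS:
        heater_on = False
    set_heater(heater_on)
    return heater_on

heater_on = False
for _ in range(5):   # a real embedded device would loop forever
    heater_on = control_step(heater_on)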

Fifth Generation (1985 - Present)

The changes that have occurred since 1985 are plentiful. Computers have gotten smaller, more
reliable, and many times faster. Computers are mostly built using components from many
different corporations, so it is easier to focus on the advancement of specific components.
Intel and AMD are the main computer processor companies in the world today and are constant
rivals. [42] There are many different personal computer companies that usually sell their
hardware with a Microsoft Windows operating system preinstalled. Apple has a wide line of
hardware and software as well. [45] Computer graphics have become very powerful and can
display full three-dimensional graphics at high resolution. [41] Nvidia and ATI are two companies
in a constant battle with one another to be the computer graphics hardware king.
The software industry has grown a great deal as well, offering all kinds of programs for almost anything
you can think of. Microsoft Windows still dominates the operating system scene. In 1995
Microsoft released Windows 95, an operating system that catapulted the company to a new level of
dominance. [46] In 1999 Apple revamped its operating system with the release of Mac OS
X. [47] In 1991 Linus Torvalds wrote the Linux kernel, which has since spawned countless open-source
operating systems and other open-source software. [44]
Computers have become more and more online-oriented in modern times, especially with the
development of the World Wide Web. Popular companies like Google and Yahoo! were started
because of the internet. [43]
In 2008 the IBM Roadrunner was introduced as the fastest computer in the world at
1.026 PFLOPS. [40] Fast supercomputers aid in the production of movie special effects and the
making of computer-animated movies. [48][49]
