
ICT-101 BLIS-1A

HISTORY OF COMPUTERS
A computer is a digital electronic machine that can be programmed to carry
out sequences of arithmetic or logical operations (computation) automatically. Modern computers can
perform generic sets of operations known as programs. These programs enable computers to perform a
wide range of tasks. A computer system is a "complete" computer that includes the hardware, operating
system (main software), and peripheral equipment needed and used for "full" operation. This term may
also refer to a group of computers that are linked and function together, such as a computer
network or computer cluster.
The history of computers dates back over 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. The advancement of technology enabled ever more complex computers by the early 20th century, and computers became larger and more powerful.

Pre-20th century
 Devices have been used to aid computation for thousands of years, mostly using one-to-one
correspondence with fingers. The abacus was initially used for arithmetic tasks. The Roman
abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many
other forms of reckoning boards or tables have been invented. In a medieval European
counting house, a checkered cloth would be placed on a table, and markers moved around on
it according to certain rules, as an aid to calculating sums of money. Such devices were the origin of the concept of the computer.

First computer
 Charles Babbage, an English mechanical engineer and polymath, originated the concept of a
programmable computer. Considered the "father of the computer", he conceptualized and
invented the first mechanical computer in the early 19th century. After working on his
revolutionary difference engine, designed to aid in navigational calculations, in 1833 he
realized that a much more general design, an Analytical Engine, was possible. The input of
programs and data was to be provided to the machine via punched cards, a method being used
at the time to direct mechanical looms such as the Jacquard loom. For output, the machine
would have a printer, a curve plotter and a bell. The machine would also be able to punch
numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit,
control flow in the form of conditional branching and loops, and integrated memory, making
it the first design for a general-purpose computer that could be described in modern terms as
Turing-complete.

Analog computers

 During the first half of the 20th century, many scientific computing needs were met by
increasingly sophisticated analog computers, which used a direct mechanical or electrical
model of the problem as a basis for computation. However, these were not programmable and
generally lacked the versatility and accuracy of modern digital computers. The first modern
analog computer was a tide-predicting machine, invented by Sir William Thomson (later to
become Lord Kelvin) in 1872. The differential analyzer, a mechanical analog computer
designed to solve differential equations by integration using wheel-and-disc mechanisms, was
conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William
Thomson.

Digital computers
Electromechanical
 By 1938, the United States Navy had developed an electromechanical analog computer small
enough to use aboard a submarine. This was the Torpedo Data Computer, which used
trigonometry to solve the problem of firing a torpedo at a moving target. During World War
II similar devices were developed in other countries as well. The Z2, created by German
engineer Konrad Zuse in 1939, was one of the earliest examples of an electromechanical
relay computer.
 In 1941, Zuse followed his earlier machine up with the Z3, the world's first working
electromechanical programmable, fully automatic digital computer. The Z3 was built with
2,000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of
memory or supplied from the keyboard. It was quite similar to modern machines in some
respects, pioneering numerous advances such as floating-point numbers. Because it used a binary system rather than the harder-to-implement decimal system of Charles Babbage's earlier design, Zuse's machine was easier to build and potentially more reliable, given the technologies available at the time.
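The advantage of binary is easy to see in miniature. Below is a minimal sketch in Python (purely illustrative; the function names and the eight-relay width are assumptions for this example, not details of Zuse's design) showing how a row of on/off relays naturally encodes a number in binary, whereas a decimal digit would require ten reliably distinguishable states per position:

    # Illustrative sketch: a relay is either open (0) or closed (1), so a
    # row of relays naturally stores a binary number.

    def to_relays(value, width=8):
        # Represent an integer as a list of relay states (MSB first).
        return [(value >> bit) & 1 for bit in reversed(range(width))]

    def from_relays(relays):
        # Read an integer back from a row of relay states.
        result = 0
        for state in relays:
            result = (result << 1) | state
        return result

    relays = to_relays(22)
    print(relays)               # [0, 0, 0, 1, 0, 1, 1, 0]
    print(from_relays(relays))  # 22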

Vacuum tubes and digital electronic circuits


 Purely electronic circuit elements soon replaced their mechanical and electromechanical
equivalents, at the same time that digital calculation replaced analog. In the US, John Vincent
Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–
Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". The design was all-electronic and used about 300 vacuum tubes, with capacitors fixed in a
mechanically rotating drum for memory.
 Colossus was the world's first electronic digital programmable computer. It used a large
number of valves (vacuum tubes). It had paper-tape input and was capable of being
configured to perform a variety of Boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Colossus Mark I contained 1,500 thermionic valves (tubes); the Mark II, with 2,400 valves, was both five times faster and simpler to operate than the Mark I, greatly speeding the decoding process.

Modern computers and computer inventions


Concept of modern computer
 The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper,
On Computable Numbers. Turing proposed a simple device that he called "Universal Computing
machine" and that is now known as a universal Turing machine. He proved that such a machine is
capable of computing anything that is computable by executing instructions (a program) stored on
tape, allowing the machine to be programmable. The fundamental concept of Turing's design is
the stored program, where all the instructions for computing are stored in memory. Von Neumann
acknowledged that the central concept of the modern computer was due to this paper. Turing
machines are to this day a central object of study in theory of computation. Except for the
limitations imposed by their finite memory stores, modern computers are said to be Turing-
complete, which is to say, they have algorithm execution capability equivalent to a universal
Turing machine.
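To make the stored-program idea concrete, here is a minimal Turing machine simulator in Python (an illustrative toy, not Turing's original formulation; the rule table and function below are invented for this example). The machine's behavior is determined entirely by its table of instructions: change the table, and the same machinery computes something else.

    def run_turing_machine(tape, rules, state="start", halt="halt"):
        # rules maps (state, symbol) -> (new_state, symbol_to_write, move),
        # where move is -1 (left), +1 (right), or 0 (stay).
        tape = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        while state != halt:
            symbol = tape.get(head, "_")  # "_" is the blank symbol
            state, tape[head], move = rules[(state, symbol)]
            head += move
        return "".join(tape[i] for i in sorted(tape)).strip("_")

    # Rules for incrementing a binary number: walk right to the end,
    # then carry 1s to 0s leftward until a 0 (or a blank) becomes 1.
    rules = {
        ("start", "0"): ("start", "0", +1),
        ("start", "1"): ("start", "1", +1),
        ("start", "_"): ("carry", "_", -1),
        ("carry", "1"): ("carry", "0", -1),
        ("carry", "0"): ("halt", "1", 0),
        ("carry", "_"): ("halt", "1", 0),
    }

    print(run_turing_machine("1011", rules))  # 1011 (11) + 1 -> 1100 (12)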

First Computer Transistor


1947: John Bardeen, Walter Brattain, and William Shockley of Bell Labs invented the first transistor, drastically changing the course of computing history. The transistor replaced the common vacuum tube, allowing computers to become much more efficient while greatly reducing their size and energy requirements.

First General-Purpose Commercial Computer

1951: John Mauchly and J. Presper Eckert built UNIVAC (Universal Automatic Computer), the first general-purpose commercial computer in history. The early UNIVAC models utilized 5,000 vacuum tubes, but later models in the series adopted transistors. It was a massive computer, weighing in at around 16,000 pounds. Despite its size, it could perform more than 1,000 computations per second.

First Computer Programming Language


1954: A team at IBM led by John Backus created FORTRAN, the first commercially available general-purpose computer programming language. FORTRAN stands for Formula Translation and is still used today. When the language first appeared, however, there were bugs and inefficiencies that led people to question FORTRAN's commercial viability. Yet the bugs were worked out, and many of the programming languages that came after were inspired by FORTRAN.

First Computer Operating System


1956: The first computer operating system in history, the GM-NAA I/O, was released in 1956 by General Motors. It was created by Robert L. Patrick and allowed for direct input and output, hence the name. It also allowed for batch processing: the ability to execute a new program automatically after the current one finishes.
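A toy sketch of the batch idea in Python, purely for illustration (the job names and the queue-based monitor below are invented for this example, not a description of GM-NAA I/O itself):

    from collections import deque
    import time

    def run_batch(jobs):
        # Run queued jobs one after another, like an early batch monitor:
        # as soon as one job finishes, the next starts automatically.
        queue = deque(jobs)
        while queue:
            name, job = queue.popleft()
            print("starting", name)
            job()                    # run the job to completion...
            print(name, "finished")  # ...then take the next one

    run_batch([
        ("payroll", lambda: time.sleep(0.1)),
        ("inventory", lambda: time.sleep(0.1)),
    ])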

First Supercomputer
1964: History’s first supercomputer, known as the CDC 6600, was developed by Control Data Corp. It consisted of 400,000 transistors and 100 miles of wiring, and used Freon for internal cooling. The CDC 6600 was able to reach a processing speed of up to 3 million floating-point operations per second (3 megaFLOPS). Amazingly, this supercomputer was ten times faster than the fastest computer at the time and cost a whopping $8 million.

First Computer Mouse

1964: Douglas Engelbart invented the first computer mouse in history, though it wouldn’t reach a mass audience until it accompanied the first Apple Macintosh in 1984. The computer mouse allowed for additional control of the computer in conjunction with the keyboard. These two input devices have been the primary source of user input ever since, although voice commands on present-day smart devices are increasingly becoming the norm.

First Wide Area Computer Network

1969: DARPA created the first Wide Area Network in the history of computers, called ARPAnet, which was a precursor to the internet. It allowed computers to connect to a central hub and interact in nearly real time. The term “internet” wouldn’t come around until 1973, when computers in Norway and England connected to ARPAnet. Although the internet has continued to advance through the decades, many of the same protocols from ARPAnet are still standards today.

First Personal Computer

1971: The first personal computer in history, the Kenbak-1, was created by John Blankenbaker and sold for only $750. However, only around 40 of these computers were ever sold. As small as it was, it was able to execute hundreds of calculations in a single second. Blankenbaker had the idea for the personal computer more than two decades before completing his first one.

First Computer Microprocessor

1971: Intel released the first microprocessor in the history of computers, the Intel 4004. This tiny microprocessor had the same computing power as the ENIAC computer, which was the size of an entire room. Even by today’s standards, the Intel 4004 is a small microprocessor, produced on a 2-inch wafer as opposed to today’s 12-inch wafers. That said, the initial model had only 2,300 transistors, while it’s not uncommon for today’s microprocessors to have several hundred million transistors.

First Apple Computer

1976: Apple took the stage and released its first computer: the Apple-1. The Apple-1 was different from other computers at the time: it came fully assembled on a single motherboard. It sold for nearly $700 and had only 4 KB of memory, which is almost laughable by today’s standards. However, that was plenty of memory for the applications at the time.

First IBM Personal Computer

1981: IBM launched its first personal computer, the IBM Model 5150. It only took a year to develop and cost $1,600. However, that was a steep drop from earlier IBM computers, which sold for several million dollars. The IBM Model 5150 had only 16 KB of RAM when it was first released, but could eventually be expanded to a maximum of 640 KB.

First Laptop Computer

1981: The first laptop in the history of computers, the Osborne 1, was released by the Osborne Computer Corporation. It had an incredibly small 5-inch display screen, a bulky fold-out keyboard, 64 KB of main memory, and weighed 24 pounds. Not surprisingly, the Osborne 1 was very popular, selling more than 125,000 units in 1982 alone. The going rate for an Osborne 1 was $1,795.

First Windows Operating System

1985: Microsoft released its first version of the Windows operating system, Windows 1.0. What made Windows 1.0 remarkable was its reliance on the computer mouse, which wasn’t standard yet. It even included a game, Reversi, to help users become accustomed to the new input device. Love it or hate it, the Windows 1.0 operating system and its subsequent versions have become commonplace among computers ever since its creation. The development of the original Windows OS was led by none other than Bill Gates himself.

World Wide Web Is Created

1989: The World Wide Web was created by Sir Tim Berners-Lee of CERN. When it was first created, it wasn’t intended to grow into a massive platform that would connect the average person. Rather, it was originally intended simply to share information between scientists and universities. The first website in the history of computers was actually just a guide to using the World Wide Web.

First Flash-Based Solid State Drive

1991: The first flash-based solid-state drive was created by SanDisk (at the time called SunDisk). These drives presented an alternative to hard drives and would prove to be very useful in computers, cell phones, and similar devices. This first flash-based SSD had a capacity of 20 MB and sold for approximately $1,000.

First Smartphone Is Created

1992: IBM created the first-ever smartphone in history, the IBM Simon, which was released two years later in 1994. It was a far cry from the smartphones we’re used to today. However, at the time, the IBM Simon was a game-changer. It sold for $1,100 when it was first released and even had a touchscreen and several applications, including mail, a calendar, a to-do list, and a few more.

First Platform Independent Language

1995: Sun Microsystems released the first iteration of the Java programming language. Java was the first computer programming language in history to be platform-independent, popularizing the phrase “Write once, run anywhere.” Unlike programs written in other computer programming languages at the time, a program written in Java could run on any device with a Java Virtual Machine (JVM).

First Smartwatch Is Released

1998: The first-ever smartwatch, the Ruputer, was released by the watch company Seiko. If you look at the original Ruputer, you’ll see that, aside from its poorer display and minor styling differences, it doesn’t look much different from present-day smartwatches. As it wasn’t a touchscreen, a small joystick assisted with navigating the various features of the watch.

First USB Flash Drive

2000: The first USB flash drive in computer history, the ThumbDrive, was released by Trek, a company out of Singapore. Other flash drives hit the market almost immediately after, such as IBM’s 8 MB DiskOnKey, which led to some speculation as to who was actually first. However, as evidenced by Trek’s 1999 patent application, and the fact that Trek’s ThumbDrive made it to market first, the debate was quickly settled.

iPhone Generation-1 Released


2007: Steve Jobs of Apple released the first-ever iPhone, revolutionizing the smartphone industry. The screen was 50% bigger than those of popular smartphones at the time, such as the beloved BlackBerry and Treo, and it had a much longer-lasting battery. Additionally, the iPhone normalized web browsing and video playback on phones, setting a new standard across the industry. The cost was what you would expect from an iPhone: around $600, more than twice as much as its competitors.

Apple’s iPad Is Released


2010: Only three years after the iPhone was released, Steve Jobs announced the first-ever iPad, Apple’s first tablet computer. It came with a 9.7-inch touchscreen and options for 16 GB, 32 GB, or 64 GB of storage. The beauty of the iPad was that it was basically a large iPhone, as it ran the same iOS and offered the same functionality. The original iPad started at $499, with the 64 GB Wi-Fi + 3G version selling for $829.

First Reprogrammable Quantum Computer


2016: Quantum computers made considerable progress, and the first reprogrammable quantum computer was finally completed. It was made up of five individual atoms that served as qubits, their states controlled by laser beams. This leap brought us a step closer to quantum supremacy.

First Brain-Computer Interface


2019: Elon Musk announced Neuralink’s progress on its brain-machine interface, which would lend humans some of the information-processing abilities of computers while linking them to artificial intelligence. In the announcement, Neuralink revealed that it had already successfully tested the technology on rats and monkeys.

IBM’s “Eagle” Quantum Computer Chip


2021: IBM continues to lead the charge in quantum computer development, and in November it showcased its new “Eagle” chip. This is currently the most cutting-edge quantum chip in existence, packing 127 qubits and making it the first to exceed 100 qubits. IBM plans to create a new chip more than three times as powerful as the “Eagle” in 2022.
