Information Age
Period of rapid development in computing
The Information Age (also known as the Computer Age, Digital Age, Silicon Age, or New Media Age) is a historical period that began in the mid-20th century, characterized by a rapid epochal shift from traditional industry established by the Industrial Revolution to an economy primarily based upon information technology. The onset of the Information Age has been associated with the development of the transistor in 1947, the basic building block of modern electronics; the optical amplifier in 1957, the basis of long-distance fiber optic communications; and Unix time, measured from the start of January 1, 1970, the basis of Coordinated Universal Time and the Network Time Protocol, which now synchronizes computers connected to the Internet.
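Unix time, mentioned above, is simply a count of seconds elapsed since 00:00:00 UTC on 1 January 1970 (the "epoch"). A short illustration using only Python's standard library:

```python
import datetime

# Unix time counts seconds since the epoch: 00:00:00 UTC, 1 January 1970.
# Converting an epoch value back to a calendar date and time:
epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
one_billion = epoch + datetime.timedelta(seconds=1_000_000_000)

print(one_billion.isoformat())  # 2001-09-09T01:46:40+00:00
```

Network Time Protocol clients distribute clock corrections in terms of this same uniform timescale, which is what lets machines across the Internet agree on the current time.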
[Image caption: A laptop connects to the Internet to display information from Wikipedia; long-distance communication between computer systems is a hallmark of the Information Age.]
According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on computer microminiaturization advances, which led to modernized information systems and internet communications as the driving force of social evolution.
Overview of early developments
Information transmission

The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986; 715 (optimally compressed) exabytes in 1993; 1.2 (optimally compressed) zettabytes in 2000; and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per person per day.
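In round numbers (1 exabyte = 10^18 bytes, 1 zettabyte = 10^21 bytes), the broadcast figures above imply roughly fourfold growth over two decades; a quick arithmetic check on the article's own numbers:

```python
# Growth in one-way broadcast capacity from the figures above,
# 1986 -> 2007 (1 exabyte = 1e18 bytes, 1 zettabyte = 1e21 bytes).
eb, zb = 1e18, 1e21
capacity_1986 = 432 * eb
capacity_2007 = 1.9 * zb

print(f"{capacity_2007 / capacity_1986:.1f}x growth over 21 years")  # ~4.4x
```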
The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986; 471 petabytes in 1993; 2.2 (optimally compressed) exabytes in 2000; and 65 (optimally compressed) exabytes in 2007, the information equivalent of 6 newspapers per person per day. In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. Technology was developing so quickly that a computer costing $3,000 in 1997 would cost $2,000 two years later and $1,000 the following year.
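The price decline above can be restated as annual rates. A small sketch, assuming smooth exponential decay between the quoted price points (the assumption of smoothness is mine, not the article's):

```python
# Implied annual price declines for the example computer:
# $3000 (1997) -> $2000 (1999) -> $1000 (2000).
p1997, p1999, p2000 = 3000, 2000, 1000

# Geometric-mean price ratio per year over 1997-1999, then the final year:
r_early = (p1999 / p1997) ** 0.5   # ~0.816 -> ~18% decline per year
r_late = p2000 / p1999             # 0.5    -> 50% decline in the last year

print(f"1997-99: {(1 - r_early):.0%} per year; 1999-2000: {(1 - r_late):.0%}")
```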
Computation

The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993; to 2.9 × 10^11 MIPS in 2000; to 6.4 × 10^12 MIPS in 2007. An article featured in the journal Trends in Ecology and Evolution in 2016 reported that:
Digital technology has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5 × 10^21 bytes per 7.2 × 10^9 people).
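The quote's closing per-capita claim can be checked directly from its own numbers:

```python
# Checking the quote's arithmetic: global digital storage divided
# across the world population, compared against the ~1e12-byte
# estimate for a single human brain.
global_storage_bytes = 5e21
population = 7.2e9
brain_bytes = 1e12

per_capita = global_storage_bytes / population
print(f"{per_capita:.2e} bytes per person")  # ~6.94e+11, on the order of 1e12
print(per_capita / brain_bytes)              # ~0.69 of one brain's capacity
```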
Genetic information

Genetic code may also be considered part of the information revolution. Now that sequencing has been computerized, the genome can be rendered and manipulated as data. This started with DNA sequencing, invented by Walter Gilbert and Allan Maxam in 1976–1977 and by Frederick Sanger in 1977; grew steadily with the Human Genome Project, initially conceived by Gilbert; and finally reached practical applications of sequencing, such as gene testing, after the discovery by Myriad Genetics of the BRCA1 breast cancer gene mutation. Sequence data in GenBank has grown from the 606 sequences registered in December 1982 to the 231 million genomes in August 2021. An additional 13 trillion incomplete sequences were registered in the Whole Genome Shotgun submission database as of August 2021. The information contained in these registered sequences has doubled every 18 months.
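A doubling time of 18 months implies exponential growth. An illustrative sketch of the multiplier this produces (the ten-year horizon is an arbitrary example, not a figure from the article):

```python
# Growth under a fixed 18-month doubling time.
DOUBLING_MONTHS = 18

def growth_factor(months: float) -> float:
    """Multiplier after `months`, doubling every 18 months."""
    return 2 ** (months / DOUBLING_MONTHS)

print(growth_factor(120))  # after 10 years: 2^(120/18) ~ 101x
```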
Different stage conceptualizations

During rare times in human history, there have been periods of innovation that have transformed human life. The Neolithic Age, the Scientific Age and the Industrial Age all, ultimately, induced discontinuous and irreversible changes in the economic, social and cultural elements of the daily life of most people. Traditionally, these epochs have taken place over hundreds, or in the case of the Neolithic Revolution, thousands of years, whereas the Information Age swept to all parts of the globe in just a few years. The reason for its rapid adoption is the rapidly advancing speed of information exchange.
Between 7,000 and 10,000 years ago during the Neolithic period, humans began to domesticate animals, began to farm grains and to replace stone tools with ones made of metal. These innovations allowed nomadic hunter-gatherers to settle down. Villages formed along the Yangtze River in China in 6,500 B.C., the Nile River region of Africa and in Mesopotamia (Iraq) in 6,000 B.C. Cities emerged between 6,000 B.C. and 3,500 B.C. The development of written communication (cuneiform in Sumeria and hieroglyphs in Egypt in 3,500 B.C., writing in Egypt in 2,560 B.C., and in Minoa and China around 1,450 B.C.) enabled ideas to be preserved for extended periods and to spread extensively. In all, Neolithic developments, augmented by writing as an information tool, laid the groundwork for the advent of civilization.
The Scientific Age began in the period between Copernicus's 1543 demonstration that the planets orbit the sun and Newton's publication of the laws of motion and gravity in Principia in 1687. This age of discovery continued through the 18th century, accelerated by widespread use of the moveable type printing press by Johannes Gutenberg.
The Industrial Age began in Great Britain in 1760 and continued into the mid-19th century. It altered many aspects of life around the world. The invention of machines such as the mechanical textile weaver by Edmund Cartwright, the rotating shaft steam engine by James Watt and the cotton gin by Eli Whitney, along with processes for mass manufacturing, came to serve the needs of a growing global population. The Industrial Age harnessed steam and waterpower to reduce the dependence on animal and human physical labor as the primary means of production. Thus, the core of the Industrial Revolution was the generation and distribution of energy from coal and water to produce steam and, later in the 20th century, electricity.
The Information Age also requires electricity to power the global networks of computers that process and store data. However, what dramatically accelerated the pace of adoption of the Information Age, as compared to previous ones, was the speed by which knowledge could be transferred and pervaded the entire human family in a few short decades. This acceleration came about with the adoption of a new form of power. Beginning in 1972, engineers devised ways to harness light to convey data through fiber optic cable. Today, light-based optical networking systems at the heart of telecom networks and the Internet span the globe and carry most of the information traffic to and from users and data storage systems.
Economics

Eventually, information and communication technology (ICT)—i.e. computers, computerized machinery, fiber optics, communication satellites, the Internet, and other ICT tools—became a significant part of the world economy, as the development of optical networking and microcomputers greatly changed many businesses and industries. Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital, in which he discusses the similarities and differences between products made of atoms and products made of bits.
Automation, productivity, and job gain

Information-intensive industry

Main article: Information industry

Industry has become more information-intensive while less labor- and capital-intensive. This has left important implications for the workforce, as workers have become increasingly productive as the value of their labor decreases. For the system of capitalism itself, as the value of labor decreases, the value of capital increases. In the classical model, investments in human and financial capital are important predictors of the performance of a new venture. However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale.
Innovations

Transistors

Main articles: Transistor, History of the transistor, and MOSFET
Further information: Semiconductor device

The onset of the Information Age can be associated with the development of transistor technology. The concept of a field-effect transistor was first theorized by Julius Edgar Lilienfeld in 1925. The first practical transistor was the point-contact transistor, invented by the engineers Walter Houser Brattain and John Bardeen while working for William Shockley at Bell Labs in 1947. This was a breakthrough that laid the foundations for modern technology. Shockley's research team also invented the bipolar junction transistor in 1952. The most widely used type of transistor is the metal–oxide–semiconductor field-effect transistor (MOSFET), invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1960. The complementary MOS (CMOS) fabrication process was developed by Frank Wanlass and Chih-Tang Sah in 1963.
Computers

Main articles: Computer and History of computers
Further information: Integrated circuit, Invention of the integrated circuit, Microprocessor, and Moore's law

Data

Further information: History of telecommunications, Computer memory, Computer data storage, Data compression, Internet access, and Social media
The first developments for storing data were initially based on photographs, starting with microphotography in 1851 and then microform in the 1920s, with the ability to store documents on film, making them much more compact. Early information theory and Hamming codes were developed about 1950, but awaited technical innovations in data transmission and storage to be put to full use. Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947 and An Wang at Harvard University in 1949. With the advent of the MOS transistor, MOS semiconductor memory was developed by John Schmidt at Fairchild Semiconductor in 1964. In 1967, Dawon Kahng and Simon Sze at Bell Labs described how the floating gate of an MOS semiconductor device could be used for the cell of a reprogrammable ROM. Following the invention of flash memory by Fujio Masuoka at Toshiba in 1980, Toshiba commercialized NAND flash memory in 1987.
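Hamming codes, mentioned above, add parity bits so a receiver can locate and fix a single flipped bit. A minimal illustrative sketch of the classic Hamming(7,4) scheme (4 data bits protected by 3 parity bits; not tied to any particular historical implementation):

```python
# Hamming(7,4): encode 4 data bits into a 7-bit codeword that can
# correct any single-bit error.

def encode(d):
    """d = [d1, d2, d3, d4] -> 7-bit codeword (positions 1..7)."""
    p1 = d[0] ^ d[1] ^ d[3]  # covers positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]  # covers positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]  # covers positions 4, 5, 6, 7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):
    """Fix up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    # Each syndrome bit re-checks one parity group; together they
    # spell out the 1-based position of a single flipped bit.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1  # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

word = encode([1, 0, 1, 1])
word[3] ^= 1          # corrupt one bit "in transit"
print(correct(word))  # [1, 0, 1, 1] — original data restored
```

The same locate-and-flip idea, in far stronger codes, is what made early noisy storage and transmission channels usable once the hardware caught up.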
Copper wire cables transmitting digital data connected computer terminals and peripherals to mainframes, and special message-sharing systems leading to email were first developed in the 1960s. Independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (coined in 1974). Access to the Internet improved with the invention of the World Wide Web in 1991. The capacity expansion from dense wave division multiplexing, optical amplification and optical networking in the mid-1990s led to record data transfer rates. By 2018, optical networks routinely delivered 30.4 terabits/s over a fiber optic pair, the data equivalent of 1.2 million simultaneous 4K HD video streams.
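The 2018 figure implies a per-stream bandwidth; dividing the stated aggregate capacity by the stated stream count (simple arithmetic on the article's numbers):

```python
# Implied bandwidth per stream for the 2018 figure: aggregate
# fiber-pair capacity divided by the number of simultaneous streams.
link_bits_per_s = 30.4e12  # 30.4 terabits/s
streams = 1.2e6            # 1.2 million 4K streams

per_stream_mbps = link_bits_per_s / streams / 1e6
print(f"{per_stream_mbps:.1f} Mbit/s per stream")  # ~25.3 Mbit/s
```

The result, about 25 Mbit/s, is consistent with typical bitrates for compressed 4K video delivery.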
MOSFET scaling, the rapid miniaturization of MOSFETs at a rate predicted by Moore's law, led to computers becoming smaller and more powerful, to the point where they could be carried.
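Moore's law is commonly stated as transistor counts doubling roughly every two years. An illustrative sketch of the compounding this produces, using the widely cited transistor count of the 1971 Intel 4004 as a starting point (that figure is background knowledge, not from this article):

```python
# Moore's law as commonly stated: transistor counts double roughly
# every two years. Illustrative projection only.
def transistors(start_count: float, years: float, doubling_years: float = 2.0) -> float:
    """Projected transistor count after `years` of steady doubling."""
    return start_count * 2 ** (years / doubling_years)

# From 2,300 transistors (Intel 4004, 1971) over 30 years:
print(f"{transistors(2300, 30):,.0f}")  # 2300 * 2^15 = 75,366,400
```

Thirty years of doubling turns thousands of transistors into tens of millions, which is the scale of early-2000s microprocessors and the reason computing became portable.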
During the 1980s–1990s, laptops were developed as a form of portable computer, and personal digital assistants (PDAs) could be used while standing or walking. Pagers, widely used by the 1980s, were largely replaced by mobile phones beginning in the late 1990s, providing mobile networking features to some computers. Now commonplace, this technology is extended to digital cameras and other wearable devices. Starting in the late 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing.

Metal–oxide–semiconductor (MOS) image sensors, which first began appearing in the late 1960s, led to the transition from analog to digital imaging, and from analog to digital cameras, during the 1980s–1990s. The most common image sensors are the charge-coupled device (CCD) sensor and the CMOS (complementary MOS) active-pixel sensor.

Electronic paper, which has origins in the 1970s, allows digital information to appear as paper documents.
Personal computers

Main article: History of personal computers

By 1976, there were several firms racing to introduce the first truly successful commercial personal computers. Three machines, the Apple II, PET 2001 and TRS-80, were all released in 1977, becoming the most popular by late 1978. Byte magazine later referred to Commodore, Apple, and Tandy as the "1977 Trinity". Also in 1977, Sord Computer Corporation released the Sord M200 Smart Home Computer in Japan.
Apple II

Main article: Apple II

The Apple II was released in April 1977.
Steve Wozniak (known as "Woz"), a regular visitor to Homebrew Computer Club meetings, designed the single-board Apple I computer and first demonstrated it there. With specifications in hand and an order for 100 machines at US$500 each from the Byte Shop, Woz and his friend Steve Jobs founded Apple Computer.

About 200 of the machines sold before the company announced the Apple II as a complete computer. It had color graphics, a full QWERTY keyboard, and internal slots for expansion, all mounted in a high quality streamlined plastic case. The monitor and I/O devices were sold separately. The original Apple II operating system was only the built-in BASIC interpreter contained in ROM. Apple DOS was added to support the diskette drive; the last version was "Apple DOS 3.3".
Its higher price and lack of floating point BASIC, along with a lack of retail distribution sites, caused it to lag in sales behind the other Trinity machines until 1979, when it surpassed the PET. It was again pushed into fourth place when Atari introduced its popular Atari 8-bit systems. Despite slow initial sales, the Apple II's lifetime was about eight years longer than other machines', and so it accumulated the highest total sales. By 1985, 2.1 million had sold, and more than 4 million Apple IIs were shipped by the end of its production in 1993.
Optical networking

Further information: Optical communication, Image sensor, and Optical fiber

Optical communication plays a crucial role in communication networks. It provides the transmission backbone for the telecommunications and computer networks that underlie the Internet, the foundation for the Digital Revolution and Information Age. The two core technologies are the optical fiber and light amplification (the optical amplifier). In 1953, Bram van Heel demonstrated image transmission through bundles of optical fibers with a transparent cladding. The same year, Harold Hopkins and Narinder Singh Kapany at Imperial College succeeded in making image-transmitting bundles with over 10,000 optical fibers, and subsequently achieved image transmission through a 75 cm long bundle which combined several thousand fibers.
Gordon Gould invented the optical amplifier and the laser, and also established the first optical telecommunications company, Optelecom, to design communication systems. The firm was a co-founder of Ciena Corp., the venture that popularized the optical amplifier with the introduction of the first dense wave division multiplexing system. This massive-scale communication technology has emerged as the common basis of all telecommunication networks and, thus, a foundation of the Information Age.
Economy, society and culture

Manuel Castells captures the significance of the Information Age in The Information Age: Economy, Society and Culture when he writes of our global interdependence and the new relationships between economy, state and society, what he calls "a new society-in-the-making." He cautions that just because humans have dominated the material world does not mean that the Information Age is the end of history:
It is in fact, quite the opposite: history is just beginning, if by history we understand the moment when, after millennia of a prehistoric battle with Nature, first to survive, then to conquer it, our species has reached the level of knowledge and social organization that will allow us to live in a predominantly social world. It is the beginning of a new existence, and indeed the beginning of a new age, The Information Age, marked by the autonomy of culture vis-à-vis the material basis of our existence.
See also

Attention economy
Attention inequality
Big data
Cognitive-cultural economy
Computer crime
Cyberterrorism
Cyberwarfare
Datamation – first print magazine dedicated solely to covering information technology
Digital dark age
Digital detox
Digital divide
Digital transformation
Digital world
Imagination age – the hypothesized successor of the information age: a period in which creativity and imagination become the primary creators of economic value
Indigo Era
Information explosion
Information revolution
Information society
Internet governance
Netocracy
Social Age
Technological determinism
Telecommunications
Zettabyte Era
The Hacker Ethic and the Spirit of the Information Age
Information and communication technologies for environmental sustainability