
Information Age

Period of rapid development in computing

The Information Age (also known as the Computer Age, Digital Age, Silicon Age, or New Media Age) is a historical period that began in the mid-20th century, characterized by a rapid epochal shift from traditional industry established by the Industrial Revolution to an economy primarily based upon information technology. The onset of the Information Age has been associated with the development of the transistor in 1947, the basic building block of modern electronics; the optical amplifier in 1957, the basis of long-distance fiber-optic communications; and Unix time, measured from the start of 1 January 1970 in Coordinated Universal Time and used by the Network Time Protocol, which now synchronizes computers connected to the Internet.
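As an illustration of what "Unix time" means here, the short Python sketch below (an illustrative example, not part of the original article) counts seconds elapsed since 1 January 1970 UTC and converts them back to a calendar date, the kind of timestamp the Network Time Protocol keeps synchronized:

```python
# Minimal sketch: Unix time counts seconds since 1970-01-01 00:00:00 UTC (the epoch).
from datetime import datetime, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
now = datetime.now(timezone.utc)
unix_seconds = (now - epoch).total_seconds()
print(f"Unix time: {unix_seconds:.0f} seconds since the epoch")

# Converting back recovers the UTC calendar date that protocols such as NTP distribute.
print("UTC date:", datetime.fromtimestamp(unix_seconds, tz=timezone.utc).isoformat())
```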
A laptop connects to the Internet to display information from Wikipedia; long-distance communication between computer systems is a hallmark of the Information Age.

According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on computer microminiaturization advances, which led to modernized information systems and Internet communications as the driving force of social evolution.

Overview of early developments

A timeline of major milestones of the Information Age, from the first message sent by the Internet protocol suite to global Internet access

Library expansion and Moore's law
Library expansion was calculated in 1945 by Fremont Rider to double in capacity every 16 years where sufficient space was made available. He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons and other institutions. Rider did not foresee, however, the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media, whereby vast increases in the rapidity of information growth would be made possible through automated, potentially lossless digital technologies. Accordingly, Moore's law, formulated around 1965, observed that the number of transistors in a dense integrated circuit doubles approximately every two years.
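Moore's law amounts to exponential growth, N(t) = N0 · 2^((t - t0)/2) with t in years. The Python sketch below illustrates the doubling rule; the 1965 baseline of 64 transistors is an assumed figure chosen only for illustration:

```python
# Illustrative sketch of Moore's law: transistor counts double roughly every two years.
# The 1965 baseline of 64 transistors is an assumed figure, used only for illustration.
def transistor_count(year, base_year=1965, base_count=64, doubling_period=2):
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1965, 1975, 1985, 1995, 2005):
    print(year, f"{transistor_count(year):,.0f}")
```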
By the early 1980s, along with improvements in computing power, the proliferation of smaller and less expensive personal computers allowed for immediate access to information and the ability to share and store it. Connectivity between computers within organizations enabled access to greater amounts of information.
Information storage and Kryder's law

Main articles: Data storage and Computer data storage

Hilbert & López (2011). The World's Technological Capacity to Store, Communicate, and Compute Information. Science, 332(6025), 60–65. https://www.science.org/doi/pdf/10.1126/science.1200970
The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes (EB) in 1986 to 15.8 EB in 1993; over 54.5 EB in 2000; and to 295 (optimally compressed) EB in 2007. This is the informational equivalent of less than one 730-megabyte (MB) CD-ROM per person in 1986 (539 MB per person); roughly four CD-ROMs per person in 1993; twelve CD-ROMs per person in the year 2000; and almost sixty-one CD-ROMs per person in 2007. It is estimated that the world's capacity to store information had reached 5 zettabytes by 2014, the informational equivalent of 4,500 stacks of printed books from the Earth to the Sun. The amount of digital data stored appears to be growing approximately exponentially, reminiscent of Moore's law. As such, Kryder's law holds that the amount of storage space available appears to be growing approximately exponentially.
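The per-person equivalents above are straightforward division; the following sketch reproduces the 1986 and 2007 figures, using rough world-population values (about 4.9 billion and 6.6 billion, which are assumptions rather than numbers from the article):

```python
# Rough check of the per-capita storage equivalents above (illustrative only).
# World population values (~4.9 billion in 1986, ~6.6 billion in 2007) are assumptions.
EB = 10**18          # exabyte in bytes
MB = 10**6           # megabyte in bytes
CD_ROM = 730 * MB    # capacity of one CD-ROM as used in the text

for year, capacity_eb, population in [(1986, 2.6, 4.9e9), (2007, 295, 6.6e9)]:
    per_person = capacity_eb * EB / population
    print(f"{year}: {per_person / MB:.0f} MB per person, "
          f"about {per_person / CD_ROM:.1f} CD-ROMs per person")
```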

Information transmission

The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986; 715 (optimally compressed) exabytes in 1993; 1.2 (optimally compressed) zettabytes in 2000; and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per person per day.

The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986; 471 petabytes in 1993; 2.2 (optimally compressed) exabytes in 2000; and 65 (optimally compressed) exabytes in 2007, the information equivalent of 6 newspapers per person per day. In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. Technology was developing so quickly that a computer costing $3,000 in 1997 would cost $2,000 two years later and $1,000 the following year.
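These endpoints imply markedly different growth rates for broadcast and two-way networks; a compound annual growth rate over the 1986 to 2007 interval can be computed as in the illustrative sketch below:

```python
# Illustrative compound annual growth rates implied by the 1986 and 2007 endpoints above.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# Broadcast: 432 EB (1986) to 1.9 ZB = 1,900 EB (2007).
# Two-way telecom: 281 PB (1986) to 65 EB = 65,000 PB (2007).
print(f"One-way broadcast: {cagr(432, 1_900, 21):.1%} per year")
print(f"Two-way telecom:   {cagr(281, 65_000, 21):.1%} per year")
```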

Computation

The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993; to 2.9 × 10^11 MIPS in 2000; to 6.4 × 10^12 MIPS in 2007. An article featured in the journal Trends in Ecology and Evolution in 2016 reported that:

Digital technology has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5×10^21 bytes per 7.2×10^9 people).
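The per capita comparison in the quotation can be checked with a line of arithmetic; the following sketch (illustrative only) does the division:

```python
# Reproducing the quotation's per-capita comparison (illustrative arithmetic only).
digital_storage_bytes = 5e21   # estimated global digital storage
world_population = 7.2e9
brain_capacity_bytes = 1e12    # estimated storage capacity of one human brain

per_capita = digital_storage_bytes / world_population
print(f"Digital storage per person: {per_capita:.1e} bytes")          # ~6.9e11 bytes
print(f"Relative to one brain:      {per_capita / brain_capacity_bytes:.2f}x")
```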
Genetic information

Genetic code may also be considered part of the information revolution. Now that sequencing has been computerized, the genome can be rendered and manipulated as data. This started with DNA sequencing, invented by Walter Gilbert and Allan Maxam in 1976–1977 and by Frederick Sanger in 1977; grew steadily with the Human Genome Project, initially conceived by Gilbert; and finally reached practical applications of sequencing, such as gene testing, after the discovery by Myriad Genetics of the BRCA1 breast cancer gene mutation. Sequence data in GenBank has grown from the 606 genome sequences registered in December 1982 to the 231 million genomes registered in August 2021. An additional 13 trillion incomplete sequences are registered in the Whole Genome Shotgun submission database as of August 2021. The information contained in these registered sequences has doubled every 18 months.
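An 18-month doubling time implies growth of roughly 60% per year, or about a hundredfold per decade; the short sketch below makes that arithmetic explicit (an illustration of the stated rate, not additional data):

```python
# Growth implied by an 18-month doubling time (illustrative arithmetic).
doubling_time_years = 1.5
per_year = 2 ** (1 / doubling_time_years)
per_decade = 2 ** (10 / doubling_time_years)
print(f"Growth per year:   x{per_year:.2f}")    # about 1.59
print(f"Growth per decade: x{per_decade:.0f}")  # about 100
```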

Different stage conceptualizations

During rare times in human history, there have been periods of innovation that have transformed human life. The Neolithic Age, the Scientific Age and the Industrial Age all, ultimately, induced discontinuous and irreversible changes in the economic, social and cultural elements of the daily life of most people. Traditionally, these epochs have taken place over hundreds, or in the case of the Neolithic Revolution, thousands of years, whereas the Information Age swept to all parts of the globe in just a few years. The reason for its rapid adoption is the rapidly advancing speed of information exchange.
Between 7,000 and 10,000 years ago during the Neolithic period, humans began to domesticate animals, began to farm grains and to replace stone tools with ones made of metal. These innovations allowed nomadic hunter-gatherers to settle down. Villages formed along the Yangtze River in China in 6,500 B.C., the Nile River region of Africa and in Mesopotamia (Iraq) in 6,000 B.C. Cities emerged between 6,000 B.C. and 3,500 B.C. The development of written communication (cuneiform in Sumeria and hieroglyphs in Egypt in 3,500 B.C. and writing in Egypt in 2,560 B.C. and in Minoa and China around 1,450 B.C.) enabled ideas to be preserved for extended periods and to spread extensively. In all, Neolithic developments, augmented by writing as an information tool, laid the groundwork for the advent of civilization.
The Scientific Age began in the period between Copernicus's 1543 demonstration that the planets orbit the Sun and Newton's publication of the laws of motion and gravity in Principia in 1687. This age of discovery continued through the 18th century, accelerated by widespread use of the movable type printing press by Johannes Gutenberg.
The Industrial Age began in Great Britain in 1760 and continued into the mid-19th century. It altered many aspects of life around the world. The invention of machines such as the mechanical textile weaver by Edmund Cartwright, the rotating shaft steam engine by James Watt and the cotton gin by Eli Whitney, along with processes for mass manufacturing, came to serve the needs of a growing global population. The Industrial Age harnessed steam and waterpower to reduce the dependence on animal and human physical labor as the primary means of production. Thus, the core of the Industrial Revolution was the generation and distribution of energy from coal and water to produce steam and, later in the 20th century, electricity.
The Information Age also requires electricity to power the global networks of computers that process and store data. However, what dramatically accelerated the pace of adoption of the Information Age, as compared with previous ages, was the speed with which knowledge could be transferred and pervade the entire human family in a few short decades. This acceleration came about with the adoption of a new form of power. Beginning in 1972, engineers devised ways to harness light to convey data through fiber optic cable. Today, light-based optical networking systems at the heart of telecom networks and the Internet span the globe and carry most of the information traffic to and from users and data storage systems.

Three stages of the Information Age

There are different conceptualizations of the Information Age. Some focus on the evolution of information over the ages, distinguishing between the Primary Information Age and the Secondary Information Age. Information in the Primary Information Age was handled by newspapers, radio and television. The Secondary Information Age was developed by the Internet, satellite televisions and mobile phones. The Tertiary Information Age emerged from media of the Primary Information Age interconnected with media of the Secondary Information Age, as presently experienced.
Others classify it in terms of the well-established Schumpeterian long waves or Kondratiev waves. Here authors distinguish three different long-term metaparadigms, each with different long waves. The first focused on the transformation of material, including stone, bronze, and iron. The second, often referred to as the industrial revolution, was dedicated to the transformation of energy, including water, steam, electric, and combustion power. Finally, the most recent metaparadigm aims at transforming information. It started out with the proliferation of communication and stored data and has now entered the age of algorithms, which aims at creating automated processes to convert the existing information into actionable knowledge.

Economics

Eventually, information and communication technology (ICT), i.e. computers, computerized machinery, fiber optics, communication satellites, the Internet, and other ICT tools, became a significant part of the world economy, as the development of optical networking and microcomputers greatly changed many businesses and industries. Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital, in which he discusses the similarities and differences between products made of atoms and products made of bits.

Jobs and income distribution

The Information Age has affected the workforce in several ways, such as compelling workers to compete in a global job market. One of the most evident concerns is the replacement of human labor by computers that can do their jobs faster and more effectively, thus creating a situation in which individuals who perform tasks that can easily be automated are forced to find employment where their labor is not as disposable. This especially creates issues for those in industrial cities, where solutions typically involve lowering working time, which is often highly resisted. Thus, individuals who lose their jobs may be pressed to move up into joining "mind workers" (e.g. engineers, doctors, lawyers, teachers, professors, scientists, executives, journalists, consultants), who are able to compete successfully in the world market and receive (relatively) high wages.
Along with automation, jobs traditionally associated with the middle class (e.g. assembly line, data processing, management, and supervision) have also begun to disappear as a result of outsourcing. Unable to compete with those in developing countries, production and service workers in post-industrial (i.e. developed) societies either lose their jobs through outsourcing, accept wage cuts, or settle for low-skill, low-wage service jobs. In the past, the economic fate of individuals would be tied to that of their nation. For example, workers in the United States were once well paid in comparison to those in other countries. With the advent of the Information Age and improvements in communication, this is no longer the case, as workers must now compete in a global job market, whereby wages are less dependent on the success or failure of individual economies.

In effectuating a globalized workforce, the internet has just as well allowed for increased opportunity in developing countries, making it possible for workers in such places to provide in-person services, therefore competing directly with their counterparts in other nations. This competitive advantage translates into increased opportunities and higher wages.

Automation, productivity, and job gain

The Information Age has affected the workforce in that automation and computerization have resulted in higher productivity coupled with net job loss in manufacturing. In the United States, for example, from January 1972 to August 2010, the number of people employed in manufacturing jobs fell from 17,500,000 to 11,500,000 while manufacturing value rose 270%. Although it initially appeared that job loss in the industrial sector might be partially offset by the rapid growth of jobs in information technology, the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the sector. This pattern of decrease in jobs would continue until 2003, and data has shown that, overall, technology creates more jobs than it destroys even in the short run.

Information-intensive industry

Main article: Information industry

Industry has become more information-intensive while less labor- and capital-intensive. This has important implications for the workforce, as workers have become increasingly productive even as the value of their labor decreases. For the system of capitalism itself, as the value of labor decreases, the value of capital increases. In the classical model, investments in human and financial capital are important predictors of the performance of a new venture. However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale.

Innovations

A visualization of the various routes through a portion of the Internet

The Information Age was enabled by technology developed in the Digital Revolution, which was itself enabled by building on the developments of the Technological Revolution.

Transistors

Main articles: Transistor, History of the transistor, and MOSFET
Further information: Semiconductor device

The onset of the Information Age can be associated with the development of transistor technology. The concept of a field-effect transistor was first theorized by Julius Edgar Lilienfeld in 1925. The first practical transistor was the point-contact transistor, invented by the engineers Walter Houser Brattain and John Bardeen while working for William Shockley at Bell Labs in 1947. This was a breakthrough that laid the foundations for modern technology. Shockley's research team also invented the bipolar junction transistor in 1952. The most widely used type of transistor is the metal–oxide–semiconductor field-effect transistor (MOSFET), invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1960. The complementary MOS (CMOS) fabrication process was developed by Frank Wanlass and Chih-Tang Sah in 1963.

Computers

Main articles: Computer and History of computers
Further information: Integrated circuit, Invention of the integrated circuit, Microprocessor, and Moore's law

Before the advent of electronics, mechanical computers, like the Analytical Engine in 1837, were designed to provide routine mathematical calculation and simple decision-making capabilities. Military needs during World War II drove development of the first electronic computers, based on vacuum tubes, including the Z3, the Atanasoff–Berry Computer, the Colossus computer, and ENIAC.

The invention of the transistor enabled the era of mainframe computers (1950s–1970s), typified by the IBM 360. These large, room-sized computers provided data calculation and manipulation that was much faster than humanly possible, but were expensive to buy and maintain, so were initially limited to a few scientific institutions, large corporations, and government agencies.
The germanium integrated circuit (IC) was invented by Jack Kilby at Texas Instruments in 1958. The silicon integrated circuit was then invented in 1959 by Robert Noyce at Fairchild Semiconductor, using the planar process developed by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface passivation method developed at Bell Labs in 1957. Following the invention of the MOS transistor by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959, the MOS integrated circuit was developed by Fred Heiman and Steven Hofstein at RCA in 1962. The silicon-gate MOS IC was later developed by Federico Faggin at Fairchild Semiconductor in 1968. With the advent of the MOS transistor and the MOS IC, transistor technology rapidly improved, and the ratio of computing power to size increased dramatically, giving direct access to computers to ever smaller groups of people.
The first commercial single-chip microprocessor, the Intel 4004, launched in 1971; it was developed by Federico Faggin using his silicon-gate MOS IC technology, along with Marcian Hoff, Masatoshi Shima and Stan Mazor.
Along with electronic arcade machines and home video game consoles pioneered by Nolan Bushnell in the 1970s, the development of personal computers like the Commodore PET and Apple II (both in 1977) gave individuals access to the computer. But data sharing between individual computers was either non-existent or largely manual, at first using punched cards and magnetic tape, and later floppy disks.

Data

Further information: History of telecommunications, Computer memory, Computer data storage, Data compression, Internet access, and Social media
The first developments for storing data were initially based on photographs, starting with microphotography in 1851 and then microform in the 1920s, with the ability to store documents on film, making them much more compact. Early information theory and Hamming codes were developed about 1950, but awaited technical innovations in data transmission and storage to be put to full use. Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947 and An Wang at Harvard University in 1949. With the advent of the MOS transistor, MOS semiconductor memory was developed by John Schmidt at Fairchild Semiconductor in 1964. In 1967, Dawon Kahng and Simon Sze at Bell Labs described how the floating gate of an MOS semiconductor device could be used for the cell of a reprogrammable ROM. Following the invention of flash memory by Fujio Masuoka at Toshiba in 1980, Toshiba commercialized NAND flash memory in 1987.
Copper wire cables transmitting digital data connected computer terminals and peripherals to mainframes, and special message-sharing systems leading to email were first developed in the 1960s. Independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (coined in 1974). Access to the Internet improved with the invention of the World Wide Web in 1991. The capacity expansion from dense wave division multiplexing, optical amplification and optical networking in the mid-1990s led to record data transfer rates. By 2018, optical networks routinely delivered 30.4 terabits/s over a fiber optic pair, the data equivalent of 1.2 million simultaneous 4K HD video streams.
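That equivalence works out to roughly 25 Mbit/s per video stream, as the illustrative division below shows:

```python
# Illustrative check: 30.4 Tbit/s shared across 1.2 million simultaneous streams.
total_tbps = 30.4
streams = 1.2e6
per_stream_mbps = total_tbps * 1e6 / streams   # Tbit/s to Mbit/s
print(f"About {per_stream_mbps:.1f} Mbit/s per 4K stream")
```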
MOSFET scaling, the rapid miniaturization of MOSFETs at a rate predicted by Moore's law, led to computers becoming smaller and more powerful, to the point where they could be carried. During the 1980s–1990s, laptops were developed as a form of portable computer, and personal digital assistants (PDAs) could be used while standing or walking. Pagers, widely used by the 1980s, were largely replaced by mobile phones beginning in the late 1990s, providing mobile networking features to some computers. Now commonplace, this technology is extended to digital cameras and other wearable devices. Starting in the late 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing. Metal–oxide–semiconductor (MOS) image sensors, which first began appearing in the late 1960s, led to the transition from analog to digital imaging, and from analog to digital cameras, during the 1980s–1990s. The most common image sensors are the charge-coupled device (CCD) sensor and the CMOS (complementary MOS) active-pixel sensor (CMOS sensor). Electronic paper, which has origins in the 1970s, allows digital information to appear as paper documents.
Personal computers

Main article: History of personal computers

By 1976, there were several firms racing to introduce the first truly successful commercial personal computers. Three machines, the Apple II, PET 2001 and TRS-80, were all released in 1977, becoming the most popular by late 1978. Byte magazine later referred to Commodore, Apple, and Tandy as the "1977 Trinity". Also in 1977, Sord Computer Corporation released the Sord M200 Smart Home Computer in Japan.
Apple II

Main article: Apple II

April 1977: Apple II.

Steve Wozniak (known as "Woz"), a regular visitor to Homebrew Computer Club meetings, designed the single-board Apple I computer and first demonstrated it there. With specifications in hand and an order for 100 machines at US$500 each from the Byte Shop, Woz and his friend Steve Jobs founded Apple Computer.

About 200 of the machines sold before the company announced the Apple II as a complete computer. It had color graphics, a full QWERTY keyboard, and internal slots for expansion, which were mounted in a high quality streamlined plastic case. The monitor and I/O devices were sold separately. The original Apple II operating system was only the built-in BASIC interpreter contained in ROM. Apple DOS was added to support the diskette drive; the last version was "Apple DOS 3.3".

Its higher price and lack of floating point BASIC, along with a lack of retail distribution sites, caused it to lag in sales behind the other Trinity machines until 1979, when it surpassed the PET. It was again pushed into fourth place when Atari introduced its popular Atari 8-bit systems.
Despite slow initial sales, the Apple II's lifetime was about eight years longer than that of other machines, and so it accumulated the highest total sales. By 1985, 2.1 million had been sold, and more than 4 million Apple IIs had been shipped by the end of its production in 1993.

Optical networking

Further information: Optical communication, Image sensor, and Optical fiber

Optical communication plays a crucial role in communication networks. Optical communication provides the transmission backbone for the telecommunications and computer networks that underlie the Internet, the foundation for the Digital Revolution and Information Age.

The two core technologies are the optical fiber and light amplification (the optical amplifier). In 1953, Bram van Heel demonstrated image transmission through bundles of optical fibers with a transparent cladding. The same year, Harold Hopkins and Narinder Singh Kapany at Imperial College succeeded in making image-transmitting bundles with over 10,000 optical fibers, and subsequently achieved image transmission through a 75 cm long bundle which combined several thousand fibers.
Gordon Gould invented the optical amplifier and the laser, and also established the first optical telecommunications company, Optelecom, to design communication systems. The firm was a co-founder of Ciena Corp., the venture that popularized the optical amplifier with the introduction of the first dense wave division multiplexing system. This massive-scale communication technology has emerged as the common basis of all telecommunication networks and, thus, a foundation of the Information Age.
Economy, society and culture

Manuel Castells captures the significance of the Information Age in The Information Age: Economy, Society and Culture when he writes of our global interdependence and the new relationships between economy, state and society, what he calls "a new society-in-the-making." He cautions that just because humans have dominated the material world, this does not mean that the Information Age is the end of history:

It is in fact, quite the opposite: history is just beginning, if by history we understand the moment when, after millennia of a prehistoric battle with Nature, first to survive, then to conquer it, our species has reached the level of knowledge and social organization that will allow us to live in a predominantly social world. It is the beginning of a new existence, and indeed the beginning of a new age, the Information Age, marked by the autonomy of culture vis-à-vis the material basis of our existence.

See also

Attention economy
Attention inequality
Big data
Cognitive-cultural economy
Computer crime
Cyberterrorism
Cyberwarfare
Datamation – first print magazine dedicated solely to covering information technology
Digital dark age
Digital detox
Digital divide
Digital transformation
Digital world
Imagination age – the hypothesized successor of the information age: a period in which creativity and imagination become the primary creators of economic value
Indigo Era
Information explosion
Information revolution
Information society
Internet governance
Netocracy
Social Age
Technological determinism
Telecommunications
Zettabyte Era
The Hacker Ethic and the Spirit of the Information Age
Information and communication technologies for environmental sustainability
