What Is Analog-To-Digital Conversion (ADC)


By Paul Kirvan, WhatIs.com

What is analog-to-digital conversion (ADC)?
Analog-to-digital conversion (ADC) is an electronic process in which a continuously
variable, or analog, signal is changed into a multilevel digital signal without altering its
essential content.

An analog-to-digital converter changes an analog signal that's continuous in terms of
both time and amplitude to a digital signal that's discrete in terms of both time and
amplitude. The analog input to a converter consists of a voltage that varies among a
theoretically infinite number of values. Examples are sine waves, the waveforms
representing human speech and the signals from a conventional television camera.

The output of the analog-to-digital converter has defined levels or states. The number of
states is almost always a power of two -- that is, 2, 4, 8, 16, etc. The simplest digital
signals have only two states and are called binary. All whole numbers can be
represented in binary form as strings of ones and zeros.
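To make the discrete levels concrete, here is a minimal Python sketch -- an illustration, not part of the original definition -- of how an n-bit converter maps a continuously variable voltage onto one of 2^n output codes. The 3-bit resolution and 0-5 V input range are assumptions chosen for the example:

    def quantize(voltage, v_min=0.0, v_max=5.0, bits=3):
        """Map an analog voltage onto one of 2**bits discrete output codes.

        A hypothetical 3-bit converter spanning 0-5 V has 2**3 = 8 states,
        so each step covers (5 - 0) / 8 = 0.625 V.
        """
        levels = 2 ** bits                      # number of output states
        step = (v_max - v_min) / levels         # voltage span per state
        v = min(max(voltage, v_min), v_max)     # clamp to the input range
        return min(int((v - v_min) / step), levels - 1)

    # A theoretically infinite range of inputs collapses to 8 binary codes:
    for v in (0.1, 1.3, 2.49, 2.51, 4.9):
        print(f"{v:.2f} V -> {quantize(v):03b}")

Note how 2.49 V and 2.51 V land in adjacent codes (011 and 100): the converter preserves the essential content of the signal while reducing it to a handful of states.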

[Figure: Analog-to-digital conversion changes continuous analog signals to discrete digital signals.]

Why is digitization important?


Digital signals propagate more efficiently than analog signals, largely because digital
impulses are well defined and orderly. They're also easier for electronic circuits to
distinguish from noise, which is chaotic. That is the chief advantage of digital
communication modes.

Computers "talk" and "think" in terms of binary digital data. While a microprocessor can
analyze analog data, it must be converted into digital form for the computer to make
sense of it.

A typical telephone modem makes use of ADC to convert the incoming audio from a
twisted-pair line into signals the computer can understand. In a digital signal
processing system, an analog-to-digital converter is required if the input signal is
analog.

What is the Nyquist theorem and why is it important?


The Nyquist, or sampling, theorem underpins analog-to-digital conversion. It defines the
minimum measurement rate, known as the sample rate, at which an analog signal such as a
pure sine wave can be captured and accurately reproduced.

The way people experience the world is mostly analog; think sound and light waves.
Digital electronics, however, works in discrete numbers, so for these signals to be used
in computing, they must be converted to digital form. To convert an analog signal to a
digital one, measurements must be sampled at a regular frequency, and the sample rate
must be at least twice the highest frequency in the signal. This approach is used in
digital audio and video to prevent aliasing, or the production of a false frequency.

A sample rate that is too low won't accurately depict the original signal; it will be
distorted or have aliasing when reproduced. A rate that is too high will use more storage
and processing resources than necessary.

The Nyquist theorem is used to locate the sample rate at which just the right amount of
information is gathered. The theorem is also known as the Nyquist-Shannon theorem or the
Whittaker-Nyquist-Shannon sampling theorem.

How the Nyquist theorem works
Analog signal frequency is measured in hertz, which describes the number of times a
signal goes up and down in a second. Electrical engineer and mathematician Claude
Shannon explained the theorem as follows: "If a function x(t) contains no frequencies
higher than B hertz, it is completely determined by giving its ordinates at a series of
points spaced 1/(2B) seconds apart."
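
Written out formally -- this formalization is standard, though the article itself stops at the quote -- Shannon's statement implies the sinc-interpolation reconstruction formula:

    x(t) = \sum_{n=-\infty}^{\infty} x\left(\frac{n}{2B}\right)\operatorname{sinc}(2Bt - n),
    \qquad \text{where } \operatorname{sinc}(u) = \frac{\sin(\pi u)}{\pi u}.

In words: samples taken every 1/(2B) seconds carry everything needed to rebuild the original band-limited signal exactly.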

To correctly reproduce a signal, the sample rate has to be at least two times the highest
frequency.
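
As a quick illustration of the rule -- a sketch with assumed example frequencies, not taken from the article -- the standard frequency-folding formula predicts what a sampled sine wave appears to be:

    def alias_frequency(f_signal, f_sample):
        """Apparent frequency of a sine sampled at f_sample (folding formula)."""
        return abs(f_signal - f_sample * round(f_signal / f_sample))

    # A 5 Hz sine wave needs a sample rate of at least 2 * 5 = 10 Hz.
    print(alias_frequency(5.0, 12.0))  # 5.0 -- above Nyquist, reproduced faithfully
    print(alias_frequency(5.0, 6.0))   # 1.0 -- below Nyquist, a false 1 Hz tone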

To show how this works, imagine a sensor on the Earth tasked with measuring the
brightness of the sky. It takes a measurement once a day, every 24 hours. Data from
this sensor would lead a researcher to inaccurately believe the sky stays at a constant
brightness throughout the day. If the experiment is changed so that measurements are
taken 18 hours apart, it produces equally misleading data, with readings cycling
seemingly at random through full daylight, complete darkness and dim light.

Have the sensor take a measurement every 12 hours, however, and the results depict
the Earth's day-night cycle over a 24-hour period. To get an accurate measure of the
Earth's 24-hour cycle, measurements must be taken at least twice per cycle -- that is,
at intervals of 12 hours or less.
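
Running the article's thought experiment through the same folding formula -- again, just an illustrative sketch -- reproduces all three outcomes:

    def alias_frequency(f_signal, f_sample):
        # Same folding formula as in the earlier sketch.
        return abs(f_signal - f_sample * round(f_signal / f_sample))

    f_sky = 1 / 24                      # day-night cycle: 1 cycle per 24 hours

    for hours in (24, 18, 12):
        fa = alias_frequency(f_sky, 1 / hours)
        period = f"{1 / fa:.0f} h" if fa else "constant (no cycle seen)"
        print(f"sampling every {hours} h -> apparent period: {period}")

    # every 24 h -> constant brightness; every 18 h -> a false 72-hour cycle;
    # every 12 h -> the true 24-hour day-night cycle. (At exactly the Nyquist
    # rate, Python's round-half-to-even resolves 0.5 to 0, keeping the true
    # frequency.)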

Importance of analog-to-digital conversion


ADC has played a key role in modern technology development, notably in the evolution of
voice communication systems from old-style analog signal processing to today's voice over
IP, or VoIP, systems. From the 1950s through the 1970s, telephone systems were unable
to communicate directly with computers. The emergence of modems made direct
communication possible, but they were not always cost-effective.

For computer input devices, such as teletypewriters, to communicate with computer
systems, they had to connect to a modem that linked to the front end of a computer
system, such as a mainframe. Modem transmission speeds were slow compared with
system, such as a mainframe. Modem transmission speeds were slow compared with
today's ultrahigh-speed networks. A fast modem in the 1960s and 1970s provided 2,400
bits per second of throughput to computers. By contrast, today's systems operate at
gigabit speeds.

ADC technology became the linchpin for developing digital private branch exchange, or
PBX, systems, as well as systems for smaller office applications. These systems used a
fully digital switching architecture, and ADC units embedded in telephone sets -- and,
sometimes, within the switch itself -- converted analog voice signals to digital bit
streams that the digital switch could process.

The opposite process was used when a voice call was delivered to another user: a
digital-to-analog converter, or DAC, converted the digital code from the switch into
audible analog signals. This model is still in use today, more than 50 years later.

ADC technology is also used to process video signals into digital bit streams so that
visual images can be transmitted along with voice communications.

The future of analog-to-digital conversion


The nature of audible sound -- the sine waves that ADC units can sample -- is not
going to change unless there is a quantum shift in physics. As such, ADC technology
is likely to be embedded in all types of computing devices long into the future. ADC is
perhaps one of the most significant technological advancements of the past century.

This was last updated in July 2022
