
Digital Information - the signals that represent information are restricted to a specific, limited group of values.

Information - knowledge or intelligence that is communicated between two or more points.

Analog Signal - represents a varying physical quantity such as voltage, voice, temperature, pressure, or even video intensity.

Digital Modulation/Digital Radio - transmittal of digitally modulated analog signals between two or more points in a communication system.

Digital Signal - a physical signal that is a representation of a sequence of discrete values (a quantified discrete-time signal), for example of an arbitrary bit stream, or of a digitized (sampled and analog-to-digital converted) analog signal.

Digital Transmission - transmittal of digital pulses between two or more points in a communication system.

Factors that determine the relationship between an analog value and its corresponding digital value:
Accuracy - how perfect and correct the digital equivalent of the original analog value is when compared to a better standard.
Resolution - how fine the gradations of the digital values are, or into how many distinct values the overall signal span has been divided.

Resolution = V max / (2^n - 1)

_____________________________________
Information Theory - a highly theoretical study of the efficient use of bandwidth to propagate information through an electronic communication system; a quantitative body of knowledge established about information, which enables system designers and users to use the channels allocated to them as efficiently as possible.
Information Measure - the information sent from a digital source when the ith message is transmitted:

I = log2 (1/P)   bits

Entropy - average information content.

H = Σ P log2 (1/P)   bits/symbol

Quantization Error - describes the fact that a digital value corresponds to a distinct span of analog signal, not a single point.
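The information measure and entropy formulas above can be checked numerically; a minimal sketch in Python, using an illustrative 4-symbol source (the probabilities are assumptions, not from the source):

```python
import math

def info_bits(p):
    """Information carried by a message of probability p: I = log2(1/p) bits."""
    return math.log2(1 / p)

def entropy(probs):
    """Average information content: H = sum(P * log2(1/P)) bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs)

# Illustrative 4-symbol source with unequal probabilities
probs = [0.5, 0.25, 0.125, 0.125]
print(info_bits(0.125))            # rarer messages carry more information: 3 bits
print(entropy(probs))              # H = 1.75 bits/symbol
print(math.log2(len(probs)))       # H_max = log2(N) = 2 bits/symbol
```

Note that H reaches H max = log2 N only when all N symbols are equally probable.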

Dynamic Range - the ratio of the largest signal value that can be expressed to the smallest signal value.

DR = V max / V min = V max / Resolution = 2^n - 1

DR (dB) = 20 log10 (DR)

H max = log2 N   (for N equally probable symbols)

1 dit (Hartley, decit) = 3.32 bits
1 nat = 1.443 bits
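The resolution and dynamic range relations above can be sketched as follows; the 10 V full-scale voltage and 8-bit word length are illustrative assumptions, not values from the source:

```python
import math

def dynamic_range(n_bits):
    """DR = 2^n - 1 for an n-bit code (sign bit excluded)."""
    return 2 ** n_bits - 1

def resolution(v_max, n_bits):
    """Smallest distinguishable step: V_max / (2^n - 1)."""
    return v_max / dynamic_range(n_bits)

n = 8
v_max = 10.0                     # assumed full-scale voltage
dr = dynamic_range(n)            # 255
dr_db = 20 * math.log10(dr)      # about 48.1 dB
step = resolution(v_max, n)      # about 39.2 mV per step
print(dr, round(dr_db, 1), step)
```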
Relative Entropy

H r = H / H max

Redundancy

r = 1 - H r

Rate of Information/Data Rate

R = H / T bits/sec = H × r s,  where r s = 1/T symbols (keys) per sec

Coding Efficiency

Coding Efficiency = (B min / B max) × 100%

Coding Redundancy
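A short sketch tying together relative entropy, redundancy, and information rate; the values H = 1.75 bits/symbol, H max = 2, and T = 1 ms are illustrative assumptions:

```python
def relative_entropy(h, h_max):
    """H_r = H / H_max (dimensionless, between 0 and 1)."""
    return h / h_max

def redundancy(h_r):
    """r = 1 - H_r."""
    return 1 - h_r

def info_rate(h, t_symbol):
    """R = H / T bits/sec, with T the duration of one symbol."""
    return h / t_symbol

h, h_max, t = 1.75, 2.0, 1e-3    # assumed values
h_r = relative_entropy(h, h_max)  # 0.875
r = redundancy(h_r)               # 0.125
rate = info_rate(h, t)            # 1750 bits/sec
print(h_r, r, rate)
```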

Noiseless Channel - a channel that is both lossless and deterministic.

C3 = log2 N = log2 M
Shannon Limit for Information Capacity

C = BW log2 (1 + S/N)

Shannon-Hartley Theorem

C = 2 BW log2 M

Additive White Gaussian Noise (AWGN) Channel

C = (1/2) log2 (1 + S/N)

M - output symbols | N - input symbols
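The two bandwidth-based capacity formulas can be compared numerically; the 3100 Hz bandwidth, 30 dB SNR, and M = 4 below are illustrative voice-channel values, not taken from the source:

```python
import math

def shannon_capacity(bw_hz, snr_linear):
    """Shannon limit: C = BW * log2(1 + S/N), with S/N as a linear ratio."""
    return bw_hz * math.log2(1 + snr_linear)

def mary_capacity(bw_hz, m):
    """C = 2 * BW * log2(M) for M-level signaling."""
    return 2 * bw_hz * math.log2(m)

bw = 3100.0                        # assumed voice-grade bandwidth, Hz
snr_db = 30.0
snr = 10 ** (snr_db / 10)          # convert dB to a linear power ratio
print(shannon_capacity(bw, snr))   # about 30,900 bits/sec
print(mary_capacity(bw, 4))        # 12,400 bits/sec for M = 4
```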


_______________________________________
Channel Capacity - the maximum rate at which information can be transmitted through a channel.
Information Capacity - a measure of how much information can be propagated through a communication system; it is a function of bandwidth and transmission time.
Lossless Channel/Source Entropy - a channel described by a channel matrix with only one non-zero element in each column.

C = log2 N

Deterministic Channel/Destination Entropy - a channel described by a channel matrix with only one non-zero element in each row.

C2 = log2 M

M = 2^n,  n - number of bits per sample (sign bit excluded)

Nyquist Sampling Theorem

f s ≥ 2 f m
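A minimal sketch of the sampling and coding-level relations above; the 4 kHz voice bandwidth is an illustrative assumption:

```python
def min_sampling_rate(f_max_hz):
    """Nyquist sampling theorem: f_s >= 2 * f_m."""
    return 2 * f_max_hz

def levels(n_bits):
    """M = 2^n coding levels for n bits per sample (sign bit excluded)."""
    return 2 ** n_bits

print(min_sampling_rate(4000))   # 8000 samples/sec for a 4 kHz voice channel
print(levels(8))                 # 256 levels for 8 bits per sample
```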

Bit Error Rate (BER) - an empirical record of a system's actual bit error performance.
Probability of Error - a theoretical expectation of the bit error rate of a given system.
______________________________________
Line Coding - mapping of a binary information sequence into a digital signal that enters the channel.
Unipolar NRZ
Polar NRZ (NRZ-L)
NRZ-inverted (NRZ-M, differential encoding)
Bipolar encoding / Alternate Mark Inversion (AMI)
Manchester encoding (Biphase-L)
Differential Manchester encoding
HDB3 (High Density Bipolar with a maximum of 3 consecutive zeros)
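As a sketch of two of the schemes above, AMI and Manchester encoders can be written as follows; the level values (+1/-1) and the Manchester polarity convention are assumptions, since conventions differ between references:

```python
def ami_encode(bits):
    """Alternate Mark Inversion: 0 -> zero level, 1 -> alternating +1/-1 pulses."""
    out, polarity = [], 1
    for b in bits:
        if b:
            out.append(polarity)
            polarity = -polarity   # successive marks alternate in sign
        else:
            out.append(0)
    return out

def manchester_encode(bits):
    """Manchester (Biphase-L): every bit has a mid-bit transition.
    Here 1 -> high-to-low and 0 -> low-to-high (one common convention)."""
    out = []
    for b in bits:
        out.extend([1, -1] if b else [-1, 1])
    return out

bits = [1, 0, 1, 1, 0]
print(ami_encode(bits))          # [1, 0, -1, 1, 0]
print(manchester_encode(bits))   # two half-bit levels per input bit
```

AMI guarantees no DC component; Manchester additionally guarantees a transition in every bit period, which aids clock recovery.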
