An Introduction to Digital Communications
Costas N. Georghiades
Electrical Engineering Department
Texas A&M University
These notes are made available for students of EE 455, and they are to be used to enhance understanding of the course. Any unauthorized copy and distribution of these notes is prohibited.
Course Outline

Introduction
- Analog vs. digital communication systems
- A general communication system

Some Probability Theory
- Random variables, density functions, independence
- Expectation, conditional expectation, Bayes rule
- Stochastic processes, autocorrelation function, spectral density
Outline (contd)

Analog-to-Digital Conversion
- Sampling and the sampling theorem
- Quantization
- Data compression: Huffman coding, run-length coding, Lempel-Ziv

Communication Channels
- Bandlimited channels
- The AWGN channel, fading channels
Outline (contd)

Receiver Design
- M-ary signaling
- Maximum-likelihood receivers
- Performance in an AWGN channel
- The Chernoff and union/Chernoff bounds
- Simulation techniques
- Signal spaces
- Modulation: PAM, QAM, PSK, DPSK, coherent FSK, incoherent FSK
Outline (contd)

Channel Coding
- Block codes and their performance
- Convolutional codes, the Viterbi algorithm, performance bounds
- Trellis-coded modulation (TCM)

Signaling Through Bandlimited Channels
- Partial-response signaling
- Equalization
Introduction

A General Communication System

Source → Transmitter → Channel → Receiver → User

- Source: speech, video, etc.
- Transmitter: conveys the information
- Channel: invariably distorts signals
- Receiver: extracts the information signal
- User: utilizes the information
Analog Receiver

[Figure: block diagram of an analog receiver, including an RF oscillator.]
Digital Communication

A digital transmitter conveys information from a discrete alphabet. Example: the binary stream 0110010... is transmitted by sending $s_1(t)$ for a "1" and $s_2(t)$ for a "0".

[Figure: a binary transmission system. The transmitter sends $s_1(t)$ or $s_2(t)$; noise is added in the channel, producing the received signal $r(t)$. The receiver multiplies $r(t)$ by $s_1(t)$, integrates over $(0, T)$, samples at $t = T$, and a comparator decides "1" or "0" by comparing the result against a threshold of 0.]
A Digital Communication System

[Figure: Source → A/D Conversion → Source Encoder → Channel Encoder → Modulator → Channel → Demodulator → Channel Decoder → Source Decoder → D/A Conversion → User, with a Synchronization block supporting the receiver.]
Definition: A class of subsets, $\mathcal{A}$, of a space $\Omega$ is an algebra if:

1) $A_i \in \mathcal{A} \Rightarrow \bar{A}_i \in \mathcal{A}$, and
2) $A_i, A_j \in \mathcal{A} \Rightarrow A_i \cup A_j \in \mathcal{A}$.

Examples, for $\Omega = \{0, 1, 2\}$:

1) $\mathcal{A} = \{\emptyset, \Omega\}$: an algebra
2) $\mathcal{A} = \{\emptyset, \Omega, \{1\}, \{2\}, \{0\}, \{1,2\}, \{1,0\}, \{0,2\}\}$: an algebra
3) $\mathcal{A} = \{\emptyset, \Omega, \{0\}, \{1\}, \{2\}\}$: not an algebra
Probability Measure

Definition: A class of subsets, $\mathcal{A}$, of a space $\Omega$ is a $\sigma$-algebra (or a Borel algebra) if:

1) $A_i \in \mathcal{A} \Rightarrow \bar{A}_i \in \mathcal{A}$.
2) $A_i \in \mathcal{A},\; i = 1, 2, 3, \ldots \Rightarrow \bigcup_{i=1}^{\infty} A_i \in \mathcal{A}$.

Definition: A probability measure $P$ assigns to every event $A \in \mathcal{A}$ a number $P[A] \ge 0$ such that:

1) $P[\Omega] = 1$.
2) $P\!\left[\bigcup_{i=1}^{\infty} A_i\right] = \sum_{i=1}^{\infty} P[A_i]$ for $A_i \cap A_j = \emptyset,\; i \ne j$.
Probability Measure

Let $\Omega = \mathbb{R}$ (the real line) and $\mathcal{A}$ be the set of all intervals $(x_1, x_2]$ in $\mathbb{R}$. Also, define a real-valued function $f$ on $\mathbb{R}$ such that:

1) $f(x) \ge 0\ \ \forall x$.
2) $\int_{-\infty}^{\infty} f(x)\,dx = 1$.

Then:
$$P\left[\{x;\; x_1 < x \le x_2\}\right] = P\left[(x_1, x_2]\right] = \int_{x_1}^{x_2} f(x)\,dx .$$
Probability Space

The following conclusions can be drawn from the above definition:

1) $P[\emptyset] = 0$.
2) $P[\bar{A}] = 1 - P[A]$ (since $P(A \cup \bar{A}) = P(\Omega) = 1 = P(A) + P(\bar{A})$).
3) If $A_1 \subset A_2 \Rightarrow P(A_1) \le P(A_2)$.
4) $P[A_1 \cup A_2] = P[A_1] + P[A_2] - P[A_1 \cap A_2]$.

The triple $(\Omega, \mathcal{A}, P)$, consisting of the sample space, the event space, and the probability measure, is called a probability space.
Random Variables

A random variable $X$ is characterized by a density function $f_X(x)$ with $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$; its distribution function is defined as follows:
$$F_X(x) = P[X \le x] = \int_{-\infty}^{x} f_X(\alpha)\,d\alpha .$$
Density Functions

We have the following observations based on the above definitions:

1) $F_X(-\infty) = \int_{-\infty}^{-\infty} f_X(x)\,dx = 0$
2) $F_X(\infty) = \int_{-\infty}^{\infty} f_X(x)\,dx = 1$
3) If $x_1 \le x_2 \Rightarrow F_X(x_1) \le F_X(x_2)$ ($F_X(x)$ is non-decreasing)

Examples of density functions:

a) The Gaussian (Normal) density function:
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, \exp\!\left(-\frac{(x - \eta)^2}{2\sigma^2}\right)$$
b) The uniform density function:
$$f_X(x) = \begin{cases} 1, & x \in [0, 1] \\ 0, & \text{otherwise} \end{cases}$$

[Figure: $f_X(x) = 1$ on $[0, 1]$, zero elsewhere.]

c) The Laplacian density function:
$$f_X(x) = \frac{a}{2}\, \exp(-a|x|)$$
Conditional Probability

Let A and B be two events from the event space $\mathcal{A}$. Then the probability of event A, given that event B has occurred, $P[A \mid B]$, is given by
$$P[A \mid B] = \frac{P[A \cap B]}{P[B]} .$$

Example: Consider the tossing of a die:
$$P[\{2\} \mid \text{"even outcome"}] = 1/3, \qquad P[\{2\}] = 1/6 .$$

Thus, conditioning can increase or decrease the probability of an event, compared to its unconditioned value.

The Law of Total Probability: Let $A_1, A_2, \ldots, A_M$ be events such that $\bigcup_{i=1}^{M} A_i = \Omega$ and $A_i \cap A_j = \emptyset$, $i \ne j$. Then
$$P[B] = \sum_{i=1}^{M} P[B \mid A_i]\, P[A_i], \qquad B \in \mathcal{A} .$$
[Figure: Venn diagram of a space partitioned into $A_1$, $A_2$, $A_3$, with an event $B$ overlapping the partition, illustrating $P(B \mid A_2)$.]
Example (Binary Symmetric Channel): A source transmits 0 or 1 with $\Pr(0) = \Pr(1) = \tfrac{1}{2}$. The channel has transition probabilities

$P_{00} = P[\text{receive } 0 \mid 0 \text{ sent}]$
$P_{01} = P[\text{receive } 1 \mid 0 \text{ sent}]$
$P_{10} = P[\text{receive } 0 \mid 1 \text{ sent}]$
$P_{11} = P[\text{receive } 1 \mid 1 \text{ sent}]$

Let $P_{01} = P_{10} = 0.01$. By the law of total probability, the probability of error is
$$P[\text{error}] = \frac{1}{2}(0.01) + \frac{1}{2}(0.01) = 0.01 .$$
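The computation above is easy to verify numerically. A minimal Python sketch, using the values assumed in the example (equally likely inputs, crossover probability 0.01):

```python
# Law of total probability applied to the binary symmetric channel example.
priors = {0: 0.5, 1: 0.5}       # Pr(0 sent), Pr(1 sent)
p_wrong = {0: 0.01, 1: 0.01}    # P[receive 1 | 0 sent], P[receive 0 | 1 sent]

# P[error] = sum_i P[error | i sent] * P[i sent]
p_error = sum(p_wrong[i] * priors[i] for i in priors)
print(p_error)   # 0.01
```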
Bayes Law

Bayes Law: Let $A_1, A_2, \ldots, A_M$ partition $\Omega$. Then
$$P[A_j \mid B] = \frac{P[B \mid A_j]\, P[A_j]}{\sum_{i=1}^{M} P[B \mid A_i]\, P[A_i]} .$$

Proof: Since $P[A_j \cap B] = P[A_j \mid B]\, P[B] = P[B \mid A_j]\, P[A_j]$, we have
$$P[A_j \mid B] = \frac{P[A_j \cap B]}{P(B)} = \frac{P[B \mid A_j]\, P[A_j]}{\sum_{i=1}^{M} P[B \mid A_i]\, P[A_i]},$$
where the last step uses the law of total probability.
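As an illustration, the posterior probability for the binary symmetric channel of the previous example follows directly from Bayes law (a sketch; the channel values are the ones assumed earlier):

```python
# P[0 sent | 0 received] for the BSC with crossover probability 0.01.
priors = {0: 0.5, 1: 0.5}
lik = {(0, 0): 0.99, (1, 0): 0.01}   # (sent, received=0): P[receive 0 | sent]

evidence = sum(lik[(i, 0)] * priors[i] for i in priors)   # P[receive 0]
posterior = lik[(0, 0)] * priors[0] / evidence            # Bayes law
print(posterior)   # 0.99
```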
Expectation

Consider a random variable X with density $f_X(x)$. The expected (or mean) value of X is given by
$$E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx .$$

More generally, for a function $g(\cdot)$ of X,
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx .$$

The variance of X is
$$\mathrm{Var}(X) = E\left[(X - E(X))^2\right] = \int_{-\infty}^{\infty} (x - E(X))^2 f_X(x)\,dx = E(X^2) - E^2(X) .$$
Example: Expectation

Example: Let X be Gaussian with
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x - \eta)^2}{2\sigma^2}\right) .$$

Then:
$$E(X) = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} x\, e^{-\frac{(x - \eta)^2}{2\sigma^2}}\,dx = \eta ,$$
$$\mathrm{Var}(X) = E[X^2] - E^2(X) = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx - \eta^2 = \sigma^2 .$$
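A quick Monte Carlo check of these two results (a sketch; the mean and variance values are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
eta, sigma = 2.0, 3.0                       # illustrative values, not from the notes
x = rng.normal(eta, sigma, size=1_000_000)

print(x.mean())                      # ~2.0: estimate of E(X) = eta
print((x**2).mean() - x.mean()**2)   # ~9.0: estimate of E(X^2) - E^2(X) = sigma^2
```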
Random Vectors

Definition: A random vector is a vector whose elements are random variables, i.e., if $X_1, X_2, \ldots, X_n$ are random variables, then
$$\mathbf{X} = (X_1, X_2, \ldots, X_n)$$
is a random vector.

Random vectors can be described statistically by their joint density function
$$f_{\mathbf{X}}(\mathbf{x}) = f_{X_1 X_2 \ldots X_n}(x_1, x_2, \ldots, x_n) .$$

Example: Consider tossing a coin twice. Let $X_1$ be the random variable associated with the outcome of the first toss, defined by
$$X_1 = \begin{cases} 1, & \text{if heads} \\ 0, & \text{if tails} \end{cases}$$
Similarly, let $X_2$ be the random variable associated with the second toss, defined as
$$X_2 = \begin{cases} 1, & \text{if heads} \\ 0, & \text{if tails} \end{cases}$$
The vector $\mathbf{X} = (X_1, X_2)$ is a random vector.
Independence

Definition: Two random variables X and Y are independent if
$$f_{X,Y}(x, y) = f_X(x)\, f_Y(y) .$$

The definition can be extended to independence among an arbitrary number of random variables, in which case their joint density function is the product of their marginal density functions.

Definition: Two random variables X and Y are uncorrelated if
$$E[XY] = E[X]\, E[Y] .$$

It is easily seen that independence implies uncorrelatedness, but not necessarily the other way around. Thus, independence is the stronger property.
Characteristic and Moment-Generating Functions

Definition: The characteristic function of a random variable X is
$$\Phi_X(j\omega) = E\left[e^{j\omega X}\right] = \int_{-\infty}^{\infty} e^{j\omega x} f_X(x)\,dx .$$

Example: For a Gaussian random variable X with mean $\eta$ and variance $\sigma^2$,
$$\Phi_X(j\omega) = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} e^{j\omega x}\, e^{-\frac{(x - \eta)^2}{2\sigma^2}}\,dx = e^{j\omega\eta - \frac{1}{2}\omega^2\sigma^2} .$$

Definition: The moment-generating function of X is
$$M_X(s) = E\left[e^{sX}\right] = \int_{-\infty}^{\infty} e^{sx} f_X(x)\,dx .$$

Fact: The moment-generating function of a random variable X can be used to obtain its moments according to:
$$E[X^n] = \left.\frac{d^n M_X(s)}{ds^n}\right|_{s=0}$$
Stochastic Processes
A stochastic process {X (t ); < t < } is an ensemble of signals, each of which can be
realized (i.e. it can be observed) with a certain statistical probability. The value of a
stochastic process at any given time, say t1, (i.e., X(t1)) is a random variable.
Definition: A Gaussian stochastic process is one for which X(t) is a Gaussian random
variable for every time t.
[Figure: two sample realizations of a stochastic process over the time interval (0, 1): a fast-varying waveform with amplitude roughly within ±5, and a slow-varying waveform with amplitude roughly within ±1.]
Stochastic Processes (contd)

The mean and variance of a stochastic process at time t are
$$E[X(t)] = \int_{-\infty}^{\infty} x\, f_{X(t)}(x; t)\,dx , \qquad \mathrm{VAR}[X(t)] = E\left[\left(X(t) - \eta_X(t)\right)^2\right] .$$

Example: Consider the Gaussian random process whose value X(t) at time t is a Gaussian random variable having density
$$f_X(x; t) = \frac{1}{\sqrt{2\pi t}} \exp\!\left(-\frac{x^2}{2t}\right) .$$

[Figure: the density $f_X(x; t)$ at $t = 2$; the curve peaks near 0.3 at $x = 0$ and spreads out as t increases.]
Autocovariance and Autocorrelation

The autocovariance and autocorrelation functions of a process X(t) are defined as
$$C_{XX}(t_1, t_2) = E\left[\left(X(t_1) - \eta_X(t_1)\right)\left(X(t_2) - \eta_X(t_2)\right)\right], \qquad t_1, t_2 \in \mathbb{R} ,$$
$$R_{XX}(t_1, t_2) = E\left[X(t_1)\, X(t_2)\right], \qquad t_1, t_2 \in \mathbb{R} .$$
Spectral Density

Example (correlation-stationary process): For a wide-sense stationary process, the autocorrelation function depends only on the time difference $\tau = t_1 - t_2$, i.e., $R_{XX}(t_1, t_2) = R_{XX}(\tau)$.

Definition: For a wide-sense stationary process we can define a spectral density, which is the Fourier transform of the stochastic process's autocorrelation function:
$$S_X(f) = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-j 2\pi f \tau}\, d\tau .$$

The autocorrelation function is the inverse Fourier transform of the spectral density:
$$R_{XX}(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j 2\pi f \tau}\, df .$$

In particular, the power of the process is $R_{XX}(0) = \int_{-\infty}^{\infty} S_X(f)\, df$.
When a wide-sense stationary process x(t) is passed through a linear filter with frequency response H(f), producing output y(t), the output spectral density is
$$S_Y(f) = S_X(f)\, |H(f)|^2 .$$
Analog-to-Digital Conversion

Two steps:
- Discretize time: sampling
- Discretize amplitude: quantization

[Figure: an analog waveform, amplitude versus time, with sampling instants and quantized amplitude levels marked.]
Sampling

A signal x(t) and its frequency content X(f) are related through the Fourier transform pair
$$X(f) = \int_{-\infty}^{\infty} x(t)\, e^{-j 2\pi f t}\, dt , \qquad x(t) = \int_{-\infty}^{\infty} X(f)\, e^{j 2\pi f t}\, df .$$

[Figure: a time-domain waveform x(t) (time in sec) and its bandlimited spectrum X(f), which vanishes beyond about 0.5 Hz.]
Ideal Sampling

Mathematically, the sampled version, $x_s(t)$, of signal x(t) is
$$x_s(t) = h(t)\, x(t) \;\Longleftrightarrow\; X_s(f) = H(f) * X(f) ,$$
where the sampling function h(t) is an impulse train:
$$h(t) = \sum_{k=-\infty}^{\infty} \delta(t - kT_s) = \frac{1}{T_s} \sum_{k=-\infty}^{\infty} e^{j \frac{2\pi k}{T_s} t} .$$

[Figure: the impulse train h(t) with spacing $T_s$, and the resulting sampled signal $x_s(t)$.]
Ideal Sampling

The Fourier transform of the sampling function is
$$H(f) = \frac{1}{T_s} \sum_{k=-\infty}^{\infty} \delta\!\left(f - \frac{k}{T_s}\right) .$$

Then:
$$X_s(f) = H(f) * X(f) = \frac{1}{T_s} \sum_{k=-\infty}^{\infty} X\!\left(f - \frac{k}{T_s}\right) .$$

[Figure: (a) the sampled spectrum $X_s(f)$ for $f_s > 2W$: the replicas of X(f) at multiples of $f_s$ do not overlap (no aliasing); (b) the sampled spectrum for $f_s < 2W$: the replicas overlap (aliasing).]
Ideal Sampling

If $f_s > 2W$, the original signal x(t) can be obtained from $x_s(t)$ through simple low-pass filtering. In the frequency domain, we have
$$X(f) = X_s(f)\, G(f),$$
where
$$G(f) = \begin{cases} T_s, & |f| \le B \\ 0, & \text{otherwise,} \end{cases}$$
for $W \le B \le f_s - W$. The corresponding impulse response is
$$g(t) = \mathcal{F}^{-1}\{G(f)\} = \int_{-B}^{B} G(f)\, e^{j 2\pi f t}\, df = 2BT_s\, \frac{\sin(2\pi B t)}{2\pi B t} .$$

[Figure: the ideal low-pass reconstruction filter G(f), of height $T_s$ over $|f| \le B$.]
Ideal Sampling

[Figure: G(f) of height $T_s$ over $-B \le f \le B$ (with $W \le B$), and the corresponding sinc pulse g(t).]

The Sampling Theorem: A bandlimited signal with no spectral components above W Hz can be recovered uniquely from its samples taken every $T_s$ seconds, provided that
$$T_s \le \frac{1}{2W}, \quad \text{or, equivalently,} \quad f_s \ge 2W \quad \text{(the Nyquist rate).}$$

Extraction of x(t) from its samples can be done by passing the sampled signal through a low-pass filter. Mathematically, x(t) can be expressed in terms of its samples by:
$$x(t) = \sum_{k} x(kT_s)\, g(t - kT_s)$$
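The interpolation formula can be demonstrated directly. A sketch that reconstructs a 3 Hz cosine from samples taken at $f_s = 10 > 2W$ (both values are assumptions for the example), with $g(t)$ the ideal sinc pulse for $B = f_s/2$:

```python
import numpy as np

f0, fs = 3.0, 10.0          # signal frequency and sampling rate (fs > 2*f0)
Ts = 1.0 / fs

k = np.arange(-200, 201)               # finite window of sample indices
samples = np.cos(2*np.pi*f0*k*Ts)      # x(kTs)

t = np.linspace(-1.0, 1.0, 1001)
# With B = fs/2, g(t) = sinc(t/Ts); note np.sinc(x) = sin(pi x)/(pi x),
# so g(t - kTs) = np.sinc(fs*t - k).
x_hat = sum(samples[i] * np.sinc(fs*t - k[i]) for i in range(len(k)))

err = np.max(np.abs(x_hat - np.cos(2*np.pi*f0*t)))
print(err)   # small; limited only by truncating the sum to a finite window
```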
Natural Sampling

A delta function can be approximated by a rectangular pulse p(t):
$$p(t) = \begin{cases} \frac{1}{T_1}, & -T_2 \le t \le T_2 \\ 0, & \text{elsewhere,} \end{cases}$$
so that the sampling function becomes the pulse train
$$h_p(t) = \sum_{k=-\infty}^{\infty} p(t - kT_s) .$$

It can be shown that in this case as well the original signal can be reconstructed from its samples at or above the Nyquist rate through simple low-pass filtering.
Zero-Order-Hold Sampling

In zero-order-hold sampling, each sample value is held for a pulse duration:
$$x_s(t) = p(t) * \left[x(t)\, h(t)\right]$$

[Figure: x(t) is sampled and held, producing a staircase waveform $x_s(t)$; reconstruction passes $x_s(t)$ through an equalizer $\frac{1}{P(f)}$ followed by a low-pass filter G(f) to recover x(t).]

Reconstruction is possible, but an equalizer may be needed.
[Figure: spectrum of a music signal, in dB.]

Example: Music in general has a spectrum with frequency components in the range up to about 20 kHz. The ideal, smallest sampling frequency $f_s$ is then 40 ksamples/sec. The smallest practical sampling frequency is 44 ksamples/sec. In compact disc players, the sampling frequency is 44.1 ksamples/sec.
[Figure: a waveform x(t) and its ideal samples $x_s(t)$ taken every $T_s$ seconds, with $f_s = \frac{1}{T_s} > 2W$.]

[Figure: the sampled signal $x_s(t)$ passed through a low-pass filter recovers x(t).]
Quantization

Quantization can be:
- Uniform vs. nonuniform
- Scalar vs. vector
Example (Quantization)

Let N = 3 bits. This corresponds to $L = 2^N = 8$ quantization levels.

[Figure: a waveform x(t) quantized by a 3-bit uniform quantizer; the eight levels carry the codewords 111, 110, 101, 100, 000, 001, 010, 011.]
Quantization (contd)

Examples:
- Telephone speech signals: 8-bit quantization
- CD digital audio: 16-bit quantization
Input-Output Characteristic

[Figure: input-output characteristic of a 3-bit (8-level) uniform quantizer with step size $\Delta$; the output $\hat{x} = Q(x)$ takes the levels $\pm\frac{\Delta}{2}, \pm\frac{3\Delta}{2}, \pm\frac{5\Delta}{2}, \pm\frac{7\Delta}{2}$ as the input x increases.]

Quantization error:
$$d = (x - \hat{x}) = (x - Q(x))$$
Signal-to-Quantization-Noise Ratio (SQNR)

For a deterministic signal x(t), the signal-to-quantization-noise ratio is
$$\mathrm{SQNR} = \frac{P_X}{D}, \qquad P_X = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt, \qquad D = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} \left[x(t) - Q(x(t))\right]^2 dt .$$

For a stationary random signal X(t),
$$\mathrm{SQNR} = \frac{P_X}{D}, \qquad P_X = E\left[X^2\right], \qquad D = E\left[(X - Q(X))^2\right] .$$

If the quantization error e is uniformly distributed on $(-\Delta/2, \Delta/2)$, with density $p(e) = 1/\Delta$ on that interval, then
$$D = \frac{1}{\Delta} \int_{-\Delta/2}^{\Delta/2} e^2\, de = \frac{\Delta^2}{12} .$$

Example: For a full-scale sinusoid of amplitude V quantized with N bits ($\Delta = 2V/2^N$),
$$P_X = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} V^2 \sin^2(\omega t)\, dt = \frac{V^2}{2},$$
so that
$$\mathrm{SQNR} = 10 \log_{10}\!\left(\frac{P_X}{D}\right) = 6.02\, N + 1.76 \text{ dB} .$$
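The 6.02N + 1.76 dB rule is easy to confirm by simulation. A sketch for an N = 8 bit mid-rise uniform quantizer driven by a full-scale sinusoid (V and N are example values):

```python
import numpy as np

rng = np.random.default_rng(1)
V, N = 1.0, 8
delta = 2*V / 2**N                      # step size 2V / 2^N

phase = rng.uniform(0.0, 1.0, 1_000_000)
x = V * np.sin(2*np.pi*phase)           # sinusoid samples, power V^2/2
# mid-rise uniform quantizer, outputs clipped to the outermost levels
xq = np.clip(delta*(np.floor(x/delta) + 0.5), -V + delta/2, V - delta/2)

sqnr = 10*np.log10(np.mean(x**2) / np.mean((x - xq)**2))
print(sqnr, 6.02*N + 1.76)              # both ~49.9 dB
```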
Example

A zero-mean, stationary Gaussian source X(t) having spectral density as given below is to be quantized using a 2-bit quantizer. The quantization intervals and levels are as indicated below. Find the resulting SQNR.

$$S_X(f) = \frac{200}{1 + (2\pi f)^2} \;\Longleftrightarrow\; R_{XX}(\tau) = 100\, e^{-|\tau|}, \qquad P_X = R_{XX}(0) = 100 .$$

The first-order density of the source is
$$f_X(x) = \frac{1}{\sqrt{200\pi}}\, e^{-\frac{x^2}{200}} .$$

[Figure: quantizer with interval boundaries at $-10, 0, 10$ and output levels at $-15, -5, 5, 15$.]

By symmetry,
$$D = E\left[(X - Q(X))^2\right] = 2 \int_0^{10} (x - 5)^2 f_X(x)\,dx + 2 \int_{10}^{\infty} (x - 15)^2 f_X(x)\,dx = 11.885 ,$$
$$\mathrm{SQNR} = 10 \log_{10}\!\left(\frac{100}{11.885}\right) = 9.25 \text{ dB} .$$
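The numbers in this example can be reproduced by simulation (a sketch; the boundaries at 0, ±10 and levels at ±5, ±15 are the ones read off the figure above):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 10.0, size=2_000_000)   # zero mean, P_X = E[X^2] = 100

def Q(x):
    # 2-bit quantizer: |x| <= 10 -> +/-5, |x| > 10 -> +/-15
    return np.where(np.abs(x) <= 10.0, np.sign(x)*5.0, np.sign(x)*15.0)

D = np.mean((x - Q(x))**2)
print(D)                          # ~11.9, close to 11.885
print(10*np.log10(100.0 / D))     # ~9.25 dB
```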
Non-uniform Quantization
Companding (Compressing-Expanding)

[Figure: companding system: Compressor → Uniform Quantizer → Expander → Low-pass Filter.]

$\mu$-law companding: the compressor characteristic is
$$g(x) = \frac{\ln(1 + \mu |x|)}{\ln(1 + \mu)}\, \mathrm{sgn}(x), \qquad -1 \le x \le 1 ,$$
and the expander is its inverse,
$$g^{-1}(x) = \frac{1}{\mu}\left[(1 + \mu)^{|x|} - 1\right] \mathrm{sgn}(x), \qquad -1 \le x \le 1 .$$

[Figure: compressor and expander characteristics for $\mu = 0, 10, 255$; larger $\mu$ compresses large amplitudes more strongly, while $\mu = 0$ reduces to the identity.]
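A direct implementation of the compressor and expander above (a sketch; inputs are assumed normalized to $|x| \le 1$):

```python
import numpy as np

MU = 255.0   # the telephony value from the figure

def compress(x, mu=MU):
    # mu-law compressor g(x)
    return np.sign(x) * np.log1p(mu*np.abs(x)) / np.log1p(mu)

def expand(y, mu=MU):
    # inverse characteristic g^{-1}(y)
    return np.sign(y) * ((1.0 + mu)**np.abs(y) - 1.0) / mu

x = np.linspace(-1.0, 1.0, 9)
print(np.allclose(expand(compress(x)), x))   # True: expander undoes compressor
```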
Data Compression

[Figure: an analog source followed by an A/D converter (sampler and quantizer) forms a discrete source; a source encoder then compresses its output, e.g. 001011001...]

[Figure: a discrete source emitting 01101001... is compressed by a source encoder into a shorter stream, e.g. 10011...]
Measuring Information

Not all sources provide the same amount of information. Example:

- Discrete Source 1: P(0) = 1, P(1) = 0. No information provided.
- Discrete Source 2: P(0) = 0.99, P(1) = 0.01. Little information provided.
- Discrete Source 3: P(0) = P(1) = 0.5. Maximum information provided.

The information conveyed by a symbol of probability p is
$$I(x) = -\log_2(p) \text{ bits},$$
and the average information (entropy) of a binary source is
$$H(x) = -p \log_2(p) - (1 - p) \log_2(1 - p) .$$

[Figure: the binary entropy function H(x) versus p; it peaks at 1 bit for p = 0.5.]
Non-Binary Sources

In general, the entropy of a source that produces L symbols with probabilities $p_1, p_2, \ldots, p_L$ is
$$H(X) = -\sum_{i=1}^{L} p_i \log_2(p_i) \text{ bits}, \qquad 0 \le H(X) \le \log_2(L),$$
with equality on the right iff the source probabilities are all equal.
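A small helper makes these entropy statements concrete (a sketch):

```python
import math

def entropy(probs):
    # H(X) = -sum_i p_i log2(p_i), in bits (terms with p_i = 0 contribute 0)
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1.0, 0.0]))     # 0.0: a deterministic source conveys nothing
print(entropy([0.99, 0.01]))   # ~0.081 bits: a heavily biased source says little
print(entropy([0.5, 0.5]))     # 1.0 bit: the binary maximum
print(entropy([0.25]*4))       # 2.0 = log2(4): equiprobable symbols, L = 4
```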
Example (Source Coding)

Probability   Codeword   Codeword length
3/8           0          1
3/8           11         2
1/8           100        3
1/8           101        3

The average codeword length is
$$\bar{M} = \sum_{i=1}^{4} m_i\, p_i = 1 \cdot \frac{3}{8} + 2 \cdot \frac{3}{8} + 3 \cdot \frac{1}{8} + 3 \cdot \frac{1}{8} = 1.875 \text{ bits/symbol}$$
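The code in this table is a Huffman code, and its average length can be checked with a small implementation (a sketch; the tie-breaking inside the heap is arbitrary, so the codewords may differ from the table while the lengths, and hence the average, match):

```python
import heapq, itertools

def huffman(probs):
    # Repeatedly merge the two least probable nodes, prepending a bit each time.
    counter = itertools.count()   # unique tie-breaker for the heap
    heap = [(p, next(counter), {i: ''}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: '0' + w for s, w in c0.items()}
        merged.update({s: '1' + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

probs = [3/8, 3/8, 1/8, 1/8]
code = huffman(probs)
print(code)                                            # codeword lengths 1, 2, 3, 3
print(sum(probs[s]*len(w) for s, w in code.items()))   # 1.875 bits/symbol
```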
Lossless Compression Algorithms
- Huffman coding
- Run-length coding
- Lempel-Ziv

There are also lossy compression algorithms that do not exactly represent the source, but do a good job. These provide much better compression ratios (more than a factor of 10, depending on reproduction quality).
Example (Huffman Coding)

Three-bit blocks from a binary source are Huffman-encoded; the block probabilities (.729 = 0.9^3, .081, .009, .001) correspond to a per-bit P(1) = 0.9. The symbols are sorted from largest to smallest probability and the two smallest are merged repeatedly:

Symbol   Probability   Codeword
111      .729          1
110      .081          011
101      .081          010
011      .081          001
100      .009          00011
010      .009          00010
001      .009          00001
000      .001          00000

[Figure: the Huffman tree. The merge probabilities are .001 + .009 = .01, .009 + .009 = .018, .01 + .018 = .028, .028 + .081 = .109, .081 + .081 = .162, .109 + .162 = .271, and .729 + .271 = 1.0; each merge labels one branch 0 and the other 1.]
Run-Length Coding

Run (source sequence)   Probability   Codeword
1                       0.100         1000
01                      0.090         1001
001                     0.081         1010
0001                    0.073         1011
00001                   0.066         1100
000001                  0.059         1101
0000001                 0.053         1110
00000001                0.048         1111
00000000                0.430         0

The average codeword length is $\bar{M}_1 = 2.710$ bits per run, while a run contains $\bar{M}_2 = 5.710$ source bits on average, giving a compression ratio of
$$\frac{\bar{M}_1}{\bar{M}_2} = \frac{2.710}{5.710} = 0.475 .$$
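The two averages can be verified from the assumed source model P(0) = 0.9 (a sketch):

```python
# Run-length example: runs of 0s terminated by a 1, capped at eight 0s.
p = 0.9
runs = ['1', '01', '001', '0001', '00001',
        '000001', '0000001', '00000001', '00000000']
probs = [(p**z)*(1 - p) for z in range(8)] + [p**8]   # 0.100, 0.090, ..., 0.430
bits = [4]*8 + [1]                                     # 4-bit codewords plus '0'

M1 = sum(pr*b for pr, b in zip(probs, bits))        # average code bits per run
M2 = sum(pr*len(r) for pr, r in zip(probs, runs))   # average source bits per run
print(M1, M2, M1/M2)    # ~2.71, ~5.71, ~0.475
```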
Algorithms

[Slide: compression algorithms in applications such as video-conferencing and MPEG-2.]