Notes 1
548 Coding, Information Theory (and Advanced Modulation)
Prof. Jay Weitzen
Ball 411
Jay_weitzen@uml.edu
Class Coverage
Fundamentals of Information Theory (4 weeks)
Block Coding (3 weeks)
Advanced coding and modulation as a way of achieving the Shannon capacity bound: convolutional coding, trellis-coded modulation, turbo coding, and space-time coding (7 weeks)
Grading Policy
4 Mini-Projects (25% each project). This is important.
Source Modeling
It has been said that if you sit enough monkeys down at enough typewriters, eventually they will produce the complete works of Shakespeare.
Definition of Entropy
Shannon used the ideas of randomness and entropy from the study of thermodynamics to estimate the randomness (i.e., the information content, or entropy) of a process.
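To make the definition concrete, here is a minimal Python sketch (an illustration, not from the notes) that computes H(X) = -sum(p * log2 p) for a discrete source; the probability values are just assumed examples.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per symbol.
    Zero-probability symbols contribute nothing (0 * log 0 is taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed example: a 3-symbol source
print(entropy([0.5, 0.3, 0.2]))    # about 1.485 bits/symbol
print(entropy([1/3, 1/3, 1/3]))    # log2(3) ~ 1.585 bits, the maximum for 3 symbols
```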
Kind of intuitive, but hard to prove.
Bounds on Entropy
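For a source alphabet of M symbols, the standard bounds are

$$0 \le H(X) \le \log_2 M$$

with equality on the left when one symbol has probability 1, and equality on the right when all M symbols are equally likely.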
Fred Flintstone's lateness to work, N, has the probability function

N      1     2     3
F(N)   0.5   0.3   0.2

An example of a joint probability function whose row and column marginals are both F(N):

         1      2      3
1       0.35   0.10   0.05
2       0.15   0.10   0.05
3       0.00   0.10   0.10
Since he rides to work with Fred (at least until the directing career works out), Barney Rubble is late to work with the same probability function too. What do you think the joint probability function for Fred and Barney looks like?

It's diagonal!

            Barney=1   Barney=2   Barney=3
Fred=1        0.5        0          0
Fred=2        0          0.3        0
Fred=3        0          0          0.2
A little-known fact: there is actually another famous person who is late to work like this. SPOCK! (Pretty embarrassing for a Vulcan.)

Before you try to guess what the joint density function for Fred and Spock is, remember that Spock lives millions of miles (and years) from Fred, so we wouldn't expect these variables to influence each other at all.

In fact, they're independent.
Since they are independent, each entry of the joint table is the product of the marginals, F(i) x F(j):

            Spock=1   Spock=2   Spock=3
Fred=1        0.25      0.15      0.10
Fred=2        0.15      0.09      0.06
Fred=3        0.10      0.06      0.04
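A small Python sketch (an illustration, not from the notes) that builds both joint tables above from the marginal F(N) and checks that summing rows or columns recovers F(N):

```python
# Marginal probability that Fred (or Barney, or Spock) is late by amount N
F = {1: 0.5, 2: 0.3, 3: 0.2}

# Fully dependent case (Fred and Barney ride together): all mass on the diagonal
joint_dependent = {(i, j): (F[i] if i == j else 0.0) for i in F for j in F}

# Independent case (Fred and Spock): joint = product of the marginals
joint_independent = {(i, j): F[i] * F[j] for i in F for j in F}

# Marginalizing either joint table (summing rows or columns) recovers F(N)
for joint in (joint_dependent, joint_independent):
    marg_first = {i: sum(joint[i, j] for j in F) for i in F}
    marg_second = {j: sum(joint[i, j] for i in F) for j in F}
    assert all(abs(marg_first[n] - F[n]) < 1e-12 for n in F)
    assert all(abs(marg_second[n] - F[n]) < 1e-12 for n in F)

print(joint_independent[1, 1])   # 0.25, matching the table above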
Marginal Density Functions
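For discrete random variables X and Y with joint probability function p_{X,Y}(x, y), the marginals are obtained by summing out the other variable:

$$p_X(x) = \sum_y p_{X,Y}(x, y), \qquad p_Y(y) = \sum_x p_{X,Y}(x, y)$$

This is exactly what the row and column sums in the Fred/Barney and Fred/Spock tables above compute.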
Conditional Probability

The probability of an event A given that another event B has occurred:

$$P(A \mid B) = \frac{P(A, B)}{P(B)}$$
Conditional Probability (worked example)

Venn diagram: P(A only) = 0.25, P(A ∩ B) = 0.25, P(B only) = 0.45, so P(A) = 0.50 and P(B) = 0.70.

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{0.25}{0.70} \approx 0.357$$

$$P(B \mid A) = \frac{P(A \cap B)}{P(A)} = \frac{0.25}{0.50} = 0.5$$
Law of total probability:

$$P(A) = P(B)\,P(A \mid B) + P(B')\,P(A \mid B')$$
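As a quick check (my arithmetic, continuing the worked example above): P(A | B') = 0.25/0.30 ≈ 0.833, so

$$P(A) = 0.70 \times 0.357 + 0.30 \times 0.833 = 0.25 + 0.25 = 0.50$$

which matches P(A) from the Venn diagram.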
Bayes' Theorem

$$P(B_i \mid A) = \frac{P(B_i)\,P(A \mid B_i)}{P(B_1)\,P(A \mid B_1) + P(B_2)\,P(A \mid B_2) + \cdots + P(B_k)\,P(A \mid B_k)}, \qquad i = 1, 2, \ldots, k$$
Urn Problems
Applications of Bayes' theorem. Begin to think about the concepts of maximum-likelihood (ML) and MAP detection, which we will use throughout coding theory (see the sketch below).
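As a preview, here is a hypothetical two-urn example (the urn contents and the prior are assumed for illustration, not taken from the notes) showing Bayes' theorem and the difference between an ML decision and a MAP decision:

```python
# Hypothetical setup: urn1 holds 8 red / 2 blue balls, urn2 holds 3 red / 7 blue.
# An urn is chosen with prior P(urn1) = 0.9, P(urn2) = 0.1, and one ball is drawn.
priors = {"urn1": 0.9, "urn2": 0.1}
likelihood = {                       # P(color | urn)
    "urn1": {"red": 0.8, "blue": 0.2},
    "urn2": {"red": 0.3, "blue": 0.7},
}

def posterior(observed_color):
    """Bayes' theorem: P(urn | color) = P(urn) P(color | urn) / sum_k P(urn_k) P(color | urn_k)."""
    evidence = sum(priors[u] * likelihood[u][observed_color] for u in priors)
    return {u: priors[u] * likelihood[u][observed_color] / evidence for u in priors}

color = "blue"
post = posterior(color)

ml_choice = max(likelihood, key=lambda u: likelihood[u][color])   # ML: ignores the prior
map_choice = max(post, key=post.get)                              # MAP: uses the prior

print(post)         # {'urn1': 0.72, 'urn2': 0.28}
print(ml_choice)    # urn2 (blue is more likely to come from urn2)
print(map_choice)   # urn1 (the strong prior on urn1 wins)
```

The point of the example: the ML and MAP decisions disagree whenever the prior is skewed enough to outweigh the likelihoods, which is exactly the distinction we will need when comparing detectors in coding theory.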
End of Notes 1