Shannon Final SCS
http://ee.stanford.edu/~gray
Source coding/compression/quantization:

    source {Xn} → encoder → bits → decoder → {X̂n} reproduction

Simulation/synthesis/fake process:

    random bits → coder → {X̃n}
Shannon entropy:

$$
H(X^N) = H(\mu_N) =
\begin{cases}
-\sum_{x^N} \mu_N(x^N) \log \mu_N(x^N) & A_X \text{ discrete} \\
\infty & \text{otherwise}
\end{cases}
$$

Shannon entropy (rate):

$$
H(X) = H(\mu) = \inf_N \frac{H(X^N)}{N} = \lim_{N \to \infty} \frac{H(X^N)}{N}
$$
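The block-entropy and entropy-rate definitions can be sanity-checked numerically for a small discrete source; a minimal sketch (the i.i.d. Bernoulli source and brute-force enumeration over blocks are illustrative choices, not from the slides):

```python
import math
from itertools import product

def entropy(pmf):
    """Shannon entropy -sum p log2 p of a discrete pmf (iterable of probs)."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# i.i.d. Bernoulli(p) source: mu_N(x^N) = prod_n mu(x_n)
p = 0.3
mu = [1 - p, p]

def block_entropy(mu, N):
    """H(X^N) computed by brute force over all |A_X|^N blocks."""
    probs = [math.prod(mu[x] for x in block)
             for block in product(range(len(mu)), repeat=N)]
    return entropy(probs)

# For an i.i.d. source, H(X^N)/N = H(mu) for every N, so the
# inf and the limit in the entropy-rate definition coincide trivially.
for N in (1, 2, 4):
    print(N, block_entropy(mu, N) / N)  # ≈ 0.8813 bits for each N
```

For sources with memory, H(X^N)/N is nonincreasing in N and the inf/limit is strictly below the single-letter entropy.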
[Shannon (1959)]
What is the best tradeoff between the rate in bits per source sample
and the quality of the reproduction with respect to the input?
simulation:

    random bits → coder → {X̃n}

source coding:

    source {Xn} → encoder → bits → decoder → {X̂n} reproduction
source coding:

    source {Xn} → encoder → bits → decoder → {X̂n} reproduction

Simulation/synthesis/fake process?

    random bits → coder → {X̃n}
• Far more is known about design: e.g., transform codes, vector quantization, clustering.
• Does not preserve key properties (stationarity, ergodicity, mixing, 0-1 law).

In general the output is neither stationary nor ergodic (it is N-stationary and can have a periodic structure, and it is not necessarily N-ergodic). One can "stationarize" with a uniform random start, but this retains possible periodicities. Not equivalent to sliding-block coding (SBC) of the input.
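For the vector-quantization/clustering design point above, the classical Lloyd algorithm is the standard illustration; a minimal scalar sketch (the codebook size, Gaussian data, and iteration count are arbitrary choices for the example):

```python
import random

def lloyd(samples, codebook_size, iters=50):
    """Design a scalar quantizer codebook with the Lloyd algorithm:
    alternate nearest-codeword partitioning and centroid updates."""
    codebook = random.sample(samples, codebook_size)
    for _ in range(iters):
        cells = [[] for _ in codebook]
        for x in samples:
            i = min(range(len(codebook)), key=lambda j: (x - codebook[j]) ** 2)
            cells[i].append(x)
        # Centroid update; keep the old codeword if a cell went empty.
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return sorted(codebook)

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(2000)]
cb = lloyd(data, 4)      # rate = log2(4) = 2 bits/sample
mse = sum(min((x - c) ** 2 for c in cb) for x in data) / len(data)
print(cb, mse)           # mse well below the source variance of 1.0
```

Applied per block, the same alternation designs a vector quantizer; it is exactly k-means with squared-error distortion.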
Encoder $E_N : A_X^N \to \mathcal{I}$

Distortion:

$$
D(E_N, D_N) = E\, d_N\big(X^N, D_N(E_N(X^N))\big)
$$

Rate:

$$
R(E_N) =
\begin{cases}
N^{-1} \log M & \text{fixed-rate} \\
N^{-1} H(E_N(X^N)) & \text{variable-rate}
\end{cases}
$$

Optimal performance:

$$
\delta^{(N)}_{BC}(R) = \inf_{E_N, D_N :\, R(E_N) \le R} D(E_N, D_N)
$$

$$
\delta^{(N,K)}_{SBC}(R) = \inf_{f_N, g_K :\, R(f) \le R} D(f_N, g_K)
$$

(distortion and rate for sliding-block codes defined analogously)
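The fixed-rate definitions can be instantiated with a toy block code; a sketch assuming squared-error per-letter distortion and a hand-picked N = 2 codebook (both are illustrative assumptions, not from the slides):

```python
import math
import random

# Toy fixed-rate block code with N = 2, M = 4 codewords:
# the encoder E_N maps a source block to the nearest codeword's index,
# the decoder D_N maps the index back to that codeword.
codebook = [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)]
N, M = 2, len(codebook)

def d_N(x, y):
    """Per-letter squared-error distortion d_N(x^N, y^N)."""
    return sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)

def encode(x):                 # E_N : A_X^N -> {0, ..., M-1}
    return min(range(M), key=lambda i: d_N(x, codebook[i]))

def decode(i):                 # D_N : index -> reproduction block
    return codebook[i]

random.seed(1)
blocks = [tuple(random.gauss(0, 1) for _ in range(N)) for _ in range(5000)]
distortion = sum(d_N(x, decode(encode(x))) for x in blocks) / len(blocks)
rate = math.log2(M) / N        # fixed-rate: (1/N) log M = 1 bit/sample
print(rate, distortion)
```

Minimizing this distortion over all (E_N, D_N) with rate at most R is exactly the optimization defining the block-coding OPTA above.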
$$
\bar\rho_{1/r}(\mu_N, \nu_N) \;\triangleq\; \ell_r(\mu_N, \nu_N)
= \Big[ N\, \bar d_N(\mu_N, \nu_N) \Big]^{1/r}
= \inf_{p_N \Rightarrow \mu_N, \nu_N} \Big[ E\big( \|X^N - Y^N\|_r^r \big) \Big]^{1/r}
$$

$$
\delta^{(N)}_{BC}(R) = \inf_{\nu_N} \bar\rho_N(\mu_N, \nu_N)
$$
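For scalar marginals the infimum over couplings p_N ⇒ µ_N, ν_N is attained by pairing quantiles, i.e., sorting (a standard one-dimensional fact for r ≥ 1), which gives a simple empirical estimate of ℓ_r from samples; a sketch under that one-dimensional assumption:

```python
import random

def ell_r(xs, ys, r=2):
    """Empirical l_r distance between two equal-size scalar samples:
    [inf over couplings of E|X - Y|^r]^(1/r).  In one dimension the
    optimal coupling pairs the sorted samples (quantile coupling)."""
    assert len(xs) == len(ys)
    cost = sum(abs(a - b) ** r for a, b in zip(sorted(xs), sorted(ys)))
    return (cost / len(xs)) ** (1 / r)

random.seed(2)
a = [random.gauss(0.0, 1.0) for _ in range(4000)]
b = [random.gauss(0.5, 1.0) for _ in range(4000)]
print(ell_r(a, b))   # ≈ 0.5: equal-variance Gaussians shifted by 0.5
```

This is the empirical Wasserstein-type distance in the slide's notation; in higher dimensions the coupling infimum no longer reduces to sorting.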
Sinai theorem:

    Xn → f → Un → g → X̂n ∼ µ̂
Sort of . . .
• The d̄-close-to-iid property is nice for intuition, but does it actually help?