
The Role of Specialization in LDPC Codes

Jeremy Thorpe
Pizza Meeting Talk
2/12/03
Talk Overview

LDPC Codes
Message Passing Decoding
Analysis of Message Passing Decoding (Density Evolution)
Approximations to Density Evolution
Design of LDPC Codes using D.E.
The Channel Coding Strategy

The encoder chooses the m-th codeword x in the codebook C (m ∈ {0,1}^k, x ∈ C ⊆ X^n) and transmits it across the channel.
The decoder observes the channel output y ∈ Y^n and generates an estimate m' ∈ {0,1}^k based on knowledge of the codebook C and the channel statistics.
(Block diagram: m → Encoder → x → Channel → y → Decoder → m')
Linear Codes

A linear code C (over a finite field) can be defined in terms of either a generator matrix or a parity-check matrix.
Generator matrix G (k × n):   C = { mG : m ∈ {0,1}^k }
Parity-check matrix H ((n−k) × n):   C = { c : cHᵀ = 0 }
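As a concrete illustration (not from the slides), here is a minimal GF(2) sketch; the toy [7,4] matrices below are made-up values chosen so that GHᵀ = 0.

```python
import numpy as np
from itertools import product

# Toy [7,4] code in systematic form: G = [I | P], H = [P^T | I].
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])        # generator matrix, k x n
H = np.hstack([P.T, np.eye(3, dtype=int)])      # parity-check matrix, (n-k) x n

# C = { mG } : enumerate all 2^k codewords (arithmetic over GF(2)).
codewords = [tuple(np.dot(m, G) % 2) for m in product([0, 1], repeat=4)]

# Every codeword satisfies c H^T = 0 over GF(2).
assert all((np.dot(c, H.T) % 2 == 0).all() for c in codewords)
print(len(codewords), "codewords, all satisfy the parity checks")
```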
LDPC Codes

LDPC codes are linear codes defined in terms of H.
H has a small average number of non-zero elements per row.

H = [ 0 1 1 0 0 1 0 0 0 0
      1 0 1 1 0 0 0 0 1 1
      1 1 0 0 1 0 1 1 1 0
      0 1 0 1 1 1 0 0 1 0
      1 0 0 1 0 0 1 1 0 1 ]
Graph Representation of LDPC Codes

H is represented by a bipartite graph, with variable nodes on one side and check nodes on the other.
There is an edge between variable node v and check node c if and only if the corresponding entry of H is non-zero: H(v, c) ≠ 0.
A codeword is an assignment of the v's such that, for every check node c,
    ∑_{v|c} x_v = 0   (mod 2)
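A small sketch (an illustration, not from the slides) of reading the bipartite graph off the example H above and testing the parity condition:

```python
import numpy as np

# The example parity-check matrix from the previous slide.
H = np.array([[0,1,1,0,0,1,0,0,0,0],
              [1,0,1,1,0,0,0,0,1,1],
              [1,1,0,0,1,0,1,1,1,0],
              [0,1,0,1,1,1,0,0,1,0],
              [1,0,0,1,0,0,1,1,0,1]])

# Bipartite graph: check node c is adjacent to the variable nodes v whose
# column has a non-zero entry in row c (and vice versa).
check_neighbors = [np.flatnonzero(row).tolist() for row in H]
var_neighbors = [np.flatnonzero(col).tolist() for col in H.T]

def is_codeword(x):
    """x is a codeword iff every check's neighborhood sums to 0 mod 2."""
    return all(sum(x[v] for v in nbrs) % 2 == 0 for nbrs in check_neighbors)

print(is_codeword(np.zeros(10, dtype=int)))  # the all-zeros word is always a codeword
```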
Regular (dv, dc) LDPC codes

Every variable node has degree dv, and every check node has degree dc.
The ensemble is defined by matching the left edge "stubs" (on the variable nodes) with the right edge "stubs" (on the check nodes) via a random permutation.
Message-Passing Decoding of LDPC Codes

Message Passing (or Belief Propagation) decoding is a low-complexity algorithm which approximately answers the question: what is the most likely x given y?
MP recursively defines messages m_{v,c}^(i) and m_{c,v}^(i) between each variable node v and each adjacent check node c, for iterations i = 0, 1, ...
Two Types of Messages...

Likelihood Ratio:        λ_{x,y} = p(y | x=1) / p(y | x=0)
Probability Difference:  δ_{x,y} = p(x=0 | y) − p(x=1 | y)

For y_1, ..., y_n independent conditionally on x, likelihood ratios multiply:
    λ_{x,y} = ∏_{i=1}^n λ_{x,y_i}
For x = x_1 ⊕ ... ⊕ x_n with x_1, ..., x_n independent, probability differences multiply:
    δ_{x,y} = ∏_{i=1}^n δ_{x_i,y_i}
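A quick numeric check of the two combining rules (not from the slides; the BSC crossover probability and the bit probabilities below are made up):

```python
import numpy as np

# (1) Likelihood ratios multiply for conditionally independent observations.
#     Assume a BSC with crossover probability eps observing the same bit x.
eps = 0.1
ys = [1, 0, 1]
p1 = np.prod([(1 - eps) if y else eps for y in ys])       # p(y | x=1)
p0 = np.prod([eps if y else (1 - eps) for y in ys])       # p(y | x=0)
per_symbol = np.prod([((1 - eps) / eps) if y else (eps / (1 - eps)) for y in ys])
print(np.isclose(p1 / p0, per_symbol))                    # True

# (2) Probability differences multiply for the parity of independent bits:
#     if x = x1 xor x2, then P(x=0)-P(x=1) = (P(x1=0)-P(x1=1)) * (P(x2=0)-P(x2=1)).
q1, q2 = 0.8, 0.65                                        # P(x1=0), P(x2=0), made up
p_x0 = q1 * q2 + (1 - q1) * (1 - q2)                      # ways the parity can be 0
print(np.isclose(p_x0 - (1 - p_x0), (2*q1 - 1) * (2*q2 - 1)))   # True
```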
...Related by the Bilinear Transform

Definition:   B(x) = (1 − x) / (1 + x)

Properties:
    B(B(x)) = x
    B(λ_{x,y}) = δ_{x,y}, since

    B( p(y|x=1) / p(y|x=0) ) = [p(y|x=0) − p(y|x=1)] / [p(y|x=0) + p(y|x=1)]
                             = [2 p(x=0|y) p(y) − 2 p(x=1|y) p(y)] / [2 p(y)]
                             = p(x=0|y) − p(x=1|y)

    (using a uniform prior on x, so that p(y|x=b) = 2 p(x=b|y) p(y)).
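A tiny numeric sketch of these two properties (the channel likelihood values are made up):

```python
# B is an involution and maps the likelihood ratio to the probability
# difference under a uniform prior on x.
def B(x):
    return (1 - x) / (1 + x)

p_y_given_x0, p_y_given_x1 = 0.7, 0.2              # hypothetical channel likelihoods
lam = p_y_given_x1 / p_y_given_x0                  # likelihood ratio

p_y = 0.5 * p_y_given_x0 + 0.5 * p_y_given_x1      # uniform prior on x
delta = (p_y_given_x0 - p_y_given_x1) / (2 * p_y)  # p(x=0|y) - p(x=1|y)

print(abs(B(B(lam)) - lam) < 1e-12)                # involution: B(B(x)) = x
print(abs(B(lam) - delta) < 1e-12)                 # B(lambda) = delta
```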
Message Domains

Probability Difference:  P(x=0 | y) − P(x=1 | y)
Likelihood Ratio:        P(y | x=1) / P(y | x=0)
These two domains are related by B(·).

Log Prob. Difference and Log Likelihood Ratio are the corresponding log domains, related by B'(·).
Variable to Check Messages

On any iteration i, the message from variable node v to check node c is the likelihood ratio
    m_{v,c}^(i) = P(e | x_v = 1) / P(e | x_v = 0),
where e is the evidence available to v other than from c.
It is computed as:
    m_{v,c}^(i) = λ_{x_v, y_v} · ∏_{c'|v, c'≠c} B(m_{c',v}^(i−1))
Check to Variable Messages

On any iteration i, the message from check node c to variable node v is the probability difference
    m_{c,v}^(i) = P(x_v = 0 | e) − P(x_v = 1 | e),
where e is the evidence reaching c from its other neighbors.
It is computed as:
    m_{c,v}^(i) = ∏_{v'|c, v'≠v} B(m_{v',c}^(i))
Assumption: incoming messages are independent.
Decision Rule

After sufficiently many iterations, combine all the evidence at each variable node into a single likelihood ratio and threshold it:

    x̂_v = 0   if  λ_{x_v, y_v} · ∏_{c|v} B(m_{c,v}^(i−1)) < 1
    x̂_v = 1   otherwise
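The following is a minimal sketch of the decoder described above, working directly in the likelihood-ratio / probability-difference domains. The BSC channel model, the crossover probability eps, and the iteration count are assumptions made for illustration; the parity-check matrix is the small example from earlier.

```python
import numpy as np

H = np.array([[0,1,1,0,0,1,0,0,0,0],
              [1,0,1,1,0,0,0,0,1,1],
              [1,1,0,0,1,0,1,1,1,0],
              [0,1,0,1,1,1,0,0,1,0],
              [1,0,0,1,0,0,1,1,0,1]])
n_checks, n_vars = H.shape
check_nbrs = [np.flatnonzero(r) for r in H]
var_nbrs = [np.flatnonzero(c) for c in H.T]

def B(x):
    return (1 - x) / (1 + x)

def decode_bsc(y, eps, n_iters=10):
    # Channel likelihood ratios lambda_v = P(y_v | x_v=1) / P(y_v | x_v=0).
    lam = np.where(y == 1, (1 - eps) / eps, eps / (1 - eps))

    # Messages live on edges (c, v); variable-to-check starts at the channel value.
    m_vc = {(c, v): lam[v] for c in range(n_checks) for v in check_nbrs[c]}
    m_cv = {}

    for _ in range(n_iters):
        # Check-to-variable: product of probability differences of the other edges.
        for c in range(n_checks):
            for v in check_nbrs[c]:
                m_cv[(c, v)] = np.prod([B(m_vc[(c, v2)]) for v2 in check_nbrs[c] if v2 != v])
        # Variable-to-check: channel ratio times B() of the other incoming messages.
        for v in range(n_vars):
            for c in var_nbrs[v]:
                m_vc[(c, v)] = lam[v] * np.prod([B(m_cv[(c2, v)]) for c2 in var_nbrs[v] if c2 != c])

    # Decision: combine all evidence at each variable and threshold the ratio at 1.
    total = np.array([lam[v] * np.prod([B(m_cv[(c, v)]) for c in var_nbrs[v]])
                      for v in range(n_vars)])
    return (total >= 1).astype(int)

x = np.zeros(n_vars, dtype=int)     # transmit the all-zeros codeword
y = x.copy(); y[0] ^= 1             # one channel flip
print(decode_bsc(y, eps=0.05))      # expected to recover the all-zeros word
```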
Theorem about MP Algorithm

If the algorithm stops after r iterations, then it returns the maximum a posteriori probability estimate of x_v given the observations y within radius r of v.
However, this holds only if the variables within radius r of v are constrained only by the equations within radius r of v, i.e. the radius-r neighborhood of v contains no cycles.
Analysis of Message Passing Decoding (Density Evolution)

In Density Evolution we keep track of message densities, rather than the messages themselves.
At each iteration, we average over all of the edges which are connected by a permutation.
We assume that the all-zeros codeword was transmitted (which requires that the channel be symmetric).
D.E. Update Rule

The update rule for Density Evolution is defined in the additive domain of each type of node.
Whereas in B.P. we add (log-domain) messages:
    m_{c,v}^(i) = ∑_{v'|c, v'≠v} B'(m_{v',c}^(i))
in D.E. we convolve the message densities:
    D(m_{c,v}^(i)) = ⊛_{v'|c, v'≠v} B'(D(m_{v',c}^(i)))
(here B' acts on a density via the corresponding change of variables).
Familiar Example:

If one die has a density function that is uniform on {1, 2, 3, 4, 5, 6},
then the density function for the sum of two dice is given by the convolution of that density with itself (supported on {2, 3, ..., 12}).
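The dice example as a convolution of discrete densities (an illustrative sketch):

```python
import numpy as np

one_die = np.full(6, 1 / 6)                 # uniform density on outcomes 1..6
two_dice = np.convolve(one_die, one_die)    # density of the sum, outcomes 2..12

for outcome, p in zip(range(2, 13), two_dice):
    print(outcome, round(p, 4))             # triangular shape, peaking at 7 (p = 6/36)
```

Density Evolution applies the same idea at each node, convolving the log-domain message densities instead of dice densities.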
D.E. Threshold

For a fixed channel (and hence a fixed channel message density), the message densities will either "converge" to minus infinity, or they won't.
For the Gaussian channel, the smallest (infimum) SNR for which the densities converge is called the density evolution threshold.
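Density evolution is especially simple on the binary erasure channel, where each message density reduces to a single erasure probability. The sketch below (a simplified stand-in for the Gaussian-channel computation in the talk; all parameters are illustrative) finds the threshold of the regular (3,6) ensemble by bisection.

```python
def bec_converges(erasure_prob, dv=3, dc=6, n_iters=5000, tol=1e-7):
    """One-dimensional density evolution for a regular (dv, dc) code on the BEC."""
    x = erasure_prob
    for _ in range(n_iters):
        # Check-to-variable then variable-to-check erasure probability update.
        x = erasure_prob * (1 - (1 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

# Bisect for the largest channel erasure probability that still converges.
lo, hi = 0.0, 1.0
for _ in range(40):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if bec_converges(mid) else (lo, mid)
print(round(lo, 4))   # close to the known (3,6) BEC threshold of about 0.429
```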
Regular (dv, dc) LDPC codes

Every variable node has degree dv and every check node has degree dc.
The best rate-1/2 regular code is (3,6), with a threshold of 1.09 dB.
This code had already been invented by Robert Gallager in 1962.
D.E. Simulation of (3,6) codes

Set the SNR to 1.12 dB (0.03 dB above threshold).
Watch the fraction of "erroneous" messages from check to variable.
(Note that this fraction does not characterize the distribution fully.)
Improvement vs. current error fraction for Regular (3,6)

The improvement per iteration is plotted against the current error fraction.
Note that there is a single bottleneck, which took most of the decoding iterations.
Irregular (λ, ρ) LDPC codes

A fraction λ_i of the variable nodes have degree i, and a fraction ρ_i of the check nodes have degree i.
Edges are connected by a single random permutation.
Nodes have become specialized.
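A small sketch (with made-up node-perspective degree fractions) of how the number of check nodes, and hence the design rate, follows from the degree fractions by matching edge stubs on the two sides:

```python
var_frac = {2: 0.5, 3: 0.3, 10: 0.2}    # fraction of variable nodes of each degree
chk_frac = {7: 1.0}                     # fraction of check nodes of each degree

avg_var_deg = sum(d * f for d, f in var_frac.items())   # edge stubs per variable node
avg_chk_deg = sum(d * f for d, f in chk_frac.items())   # edge stubs per check node

n = 10000                                # number of variable nodes (illustrative)
m = n * avg_var_deg / avg_chk_deg        # check nodes needed to absorb all the stubs
print(round(m), 1 - m / n)               # number of checks and the design rate
```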
D.E. Simulation of Irregular Codes (Maximum degree 10)

Set the SNR to 0.42 dB (~0.03 dB above threshold).
Watch the fraction of erroneous check-to-variable messages.
This code was designed by Richardson et al.
Comparison of Regular and Irregular codes

Notice that the irregular curve is much flatter.
Note: capacity-achieving LDPC codes for the erasure channel were designed by making this line exactly flat.
Multi-edge-type construction

Edges of a particular "color" are connected through their own permutation.
Edges become specialized: each edge type has a different message distribution at each iteration.
D.E. of MET codes

For Multi-edge-type codes, Density Evolution tracks the density of each type of message separately.
A comparison was made to a real decoder; see the next slide (courtesy of Ken Andrews).
MET D.E. vs. decoder simulation
Regular, Irregular, MET comparison

Multi-edge-type LDPC codes improve gradually through most of the decoding.
MET codes have a threshold below that of the more "complex" irregular codes.
Design of Codes using D.E.

Density Evolution provides a moderately fast way to evaluate single- and multi-edge-type degree distributions (hypothesis testing).
Much faster approximations to Density Evolution can easily be put into an outer loop which performs function minimization.
Semi-analytic techniques exist as well.
Review

Iterative Message Passing can be used to decode LDPC codes.
Density Evolution can be used to predict the performance of MP for infinitely large codes.
More sophisticated classes of codes can be designed to close the gap to the Shannon limit.
Beyond MET Codes (future work)

Traditional LDPC codes are designed in two stages: design of the degree distribution, then design of the specific graph.
Using very fast and accurate approximations to density evolution, we can evaluate the effect of placing or removing a single edge in the graph.
Using a stochastic search algorithm such as Simulated Annealing, we can then optimize the graph directly, changing the degree profile as needed; a sketch of this loop follows below.
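A rough sketch of that optimization loop (hypothetical throughout: approx_de_cost stands in for a fast approximation to density evolution, and the move set simply toggles single edges):

```python
import math
import random

def anneal_graph(H, approx_de_cost, n_steps=10000, t0=1.0, t_final=1e-3):
    """Directly optimize the parity-check graph by toggling single edges.

    H is a list of sets: H[c] holds the variable nodes adjacent to check c.
    approx_de_cost(H) is a placeholder for a fast density-evolution
    approximation that scores the current graph (lower is better).
    """
    n_checks = len(H)
    n_vars = 1 + max(v for row in H for v in row)
    cost = approx_de_cost(H)

    for step in range(n_steps):
        # Geometric cooling schedule from t0 down to t_final.
        t = t0 * (t_final / t0) ** (step / n_steps)

        # Propose placing or removing one edge (this changes the degree profile).
        c, v = random.randrange(n_checks), random.randrange(n_vars)
        H[c].symmetric_difference_update({v})
        new_cost = approx_de_cost(H)

        # Standard Metropolis acceptance rule.
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
        else:
            H[c].symmetric_difference_update({v})   # undo the move
    return H, cost
```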
