Lecture Notes 5, ELEC5510 (2017)


ELEC5510 Satellite Communication Systems
Lecture 5: Error Control Coding II

Prof. Yonghui Li
Lecture Outline

• Decoding of convolutional codes - the Viterbi algorithm
• Performance of convolutional codes
• Concatenated codes
• Interleaving
• LDPC codes
• LDPC standards in satellite systems
Satellite Communication Systems

[Block diagram: earth station transmit chain (speech encoder → channel encoder → modulator → power amplifier → antenna) → uplink channel → satellite → downlink channel → earth station receive chain (antenna → demodulator → channel decoder → speech decoder). Error control coding is performed by the channel encoder and decoder.]
Decoding of Convolutional Codes

• Consider transmission of convolutionally encoded data over a BSC.
• The demodulator in this model makes hard decisions and presents a binary sequence at the input of the decoder.
• Since the channel error probability is typically small, the most likely error patterns are those with the minimum number of errors.
• Thus, the maximum likelihood rule, which gives the minimum error probability, consists of finding the code sequence with the minimum Hamming distance from the received sequence.
• The search for the transmitted code sequence can therefore be carried out by comparing the received binary sequence with all possible paths in the trellis.
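As a small illustration, here is a minimal Python sketch of minimum-Hamming-distance (maximum likelihood) decoding by direct comparison, using the all-zero transmission from the example on the next slide. The candidate list is a hand-picked set of valid code sequences of this code; a real decoder must compare against every path in the trellis, which is what makes exhaustive search expensive.

```python
# Sketch: ML decoding over a BSC = pick the candidate code sequence at minimum
# Hamming distance from the received sequence. The candidates are a few valid
# code sequences of the (2,1,2) code, chosen by hand for illustration only.
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

received   = "01000010000000"                  # r from the example on slide 5
candidates = ["00000000000000",                # the all-zero path
              "11011100000000",                # two other trellis paths
              "00110111000000"]
best = min(candidates, key=lambda c: hamming(c, received))
print(best, hamming(best, received))           # all-zero path, distance 2
```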
Decoding of Convolutional Codes

• Each path in the trellis is a valid code sequence.

[Trellis diagram of the (2,1,2) code: states 00, 01, 10, 11 at time instants S0–S7, with branches labelled input/output (0/00, 1/11, 0/11, 1/00, 0/10, 1/01, 0/01, 1/10).]

• Send v = (00 00 00 00 00 00 00) (encoded sequence)
• Receive r = (01 00 00 10 00 00 00)
• The closest path in the trellis is (00 00 00 00 00 00 00) – the 2 errors are corrected.
Decoding of Convolutional Codes

• The minimum Hamming distance between code sequences is the free distance dfree, which is equal to 5 for this code.

[Same trellis diagram as on the previous slide.]

• Send v = (00 00 00 00 00 00 00) (encoded sequence)
• Receive r = (01 01 10 00 00 00 00)
• The closest path in the trellis is (11 01 11 00 00 00 00) – the decoder is unable to correct the errors and introduces two more!
Viterbi Algorithm

• Exhaustive search for the closest sequence is very complex.
• The Viterbi algorithm is a maximum likelihood decoding method with reduced computational complexity.
• It is based on eliminating less likely paths at each node and keeping only the one path with the highest likelihood.
• The comparison can be performed over sequences of length approximately five times the code memory order.
• Let us consider a message c

c = (c_0, c_1, …, c_l, …)

where the l-th message block is

c_l = (c_l^(1), c_l^(2), …, c_l^(k))
Viterbi Algorithm

• The code sequence v at the output of an (n, k, m) convolutional encoder consists of code blocks of n digits each

v = (v_0, v_1, …, v_l, …)

where the l-th code block is given by

v_l = (v_l^(1), v_l^(2), …, v_l^(n))

• Each code block v_l is represented by a branch and each code sequence v by a path in the code trellis.
• Let us assume that code sequence v is transmitted. Let

r = (r_0, r_1, …, r_l, …)

be the received sequence, where the l-th received block is

r_l = (r_l^(1), r_l^(2), …, r_l^(n))
Branch and Path Metrics

• The branch metric, denoted by μ_l(r_l, v_l), is defined as the Hamming distance between the received block r_l and a code block v_l in the trellis:

μ_l(r_l, v_l) = d(r_l, v_l)

• The path metric, denoted by M_l^(i)(r, v), is defined as the Hamming distance between the received sequence r and a path v in the trellis. It is computed as the sum of branch metrics:

M_l^(i)(r, v) = Σ_{j=1}^{l} μ_j(r_j, v_j),   i = 1, 2, …, 2^l        (15)

• For soft decision decoding, the squared Euclidean distance is used for the branch and path metrics.
• The Viterbi decoding procedure consists of the following operations.
Summary of the Viterbi Algorithm

1. Generate the trellis for the code.
2. Assume that the optimum signal sequences from the infinite past to all trellis states at time l are known; their path metrics are denoted by M_l^(i), i = 1, …, 2^m, where m is the memory order of the code.
3. Increase time l by 1. At time (l + 1), the decoder computes the path metrics M_{l+1}^(i) for all the paths entering the (l + 1)st nodes by adding branch metrics to the path metrics of the connecting survivors M_l^(i).
4. Compare the path metrics of all paths entering each node and choose the path with the minimum path metric. These paths are called survivors. Store the 2^m survivors.
5. Repeat the procedure iteratively.
6. Select the symbol from the common history path at time (l - D) as the decoded output, where D is called the decision depth (approximately five times the code memory order).

A small code sketch of this procedure follows.
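The following Python sketch illustrates the procedure for the (2,1,2) code of the examples. The generator sequences g^(1) = (101) and g^(2) = (111) are an assumption consistent with the encoded sequence given on slide 11; the function and variable names are illustrative, and full survivor paths are stored for simplicity instead of using a sliding decision depth.

```python
# Minimal hard-decision Viterbi decoder sketch for the (2,1,2) code of the
# examples (assumed generators g1 = 101, g2 = 111; branch metric = Hamming
# distance between the received block and the branch label).
G = [(1, 0, 1), (1, 1, 1)]                 # generator sequences g^(1), g^(2)

def branch(state, bit):
    """One trellis branch: state = (u_{l-1}, u_{l-2}) -> (output block, next state)."""
    regs = (bit,) + state                  # (u_l, u_{l-1}, u_{l-2})
    out = tuple(sum(g[i] * regs[i] for i in range(3)) % 2 for g in G)
    return out, (bit, state[0])

def viterbi_decode(r_blocks):
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    metric = {s: 0 if s == (0, 0) else float("inf") for s in states}
    path = {s: [] for s in states}
    for r in r_blocks:
        new_metric, new_path = {}, {}
        for s in states:
            for bit in (0, 1):
                out, nxt = branch(s, bit)
                m = metric[s] + sum(a != b for a, b in zip(out, r))   # add branch metric
                if nxt not in new_metric or m < new_metric[nxt]:
                    new_metric[nxt], new_path[nxt] = m, path[s] + [out]   # keep survivor
        metric, path = new_metric, new_path
    return path[(0, 0)], metric[(0, 0)]    # terminated code: trace back from state 00

r = [(0, 1), (1, 1), (1, 0), (1, 0), (0, 0), (1, 1), (1, 0)]
v_hat, pm = viterbi_decode(r)
print(v_hat, pm)   # should give the blocks 00 11 10 10 00 01 11 with path metric 3
```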
Example of the Viterbi Algorithm

• If the transmitted sequence is finite, m zero symbols are appended at the end to clear the encoder shift register.
• Example: Consider the (2,1,2) convolutional code from the previous example (encoder diagram not reproduced here).
• Let us assume that the code sequence for a message of length L = 5 is transmitted,

v = (00, 11, 10, 10, 00, 01, 11)

over a BSC, and that the received sequence is

r = (01, 11, 10, 10, 00, 11, 10).
The Viterbi Decoding Algorithm

• Example (continued): Slides 12–72 of the original notes step through the trellis for r = (01, 11, 10, 10, 00, 11, 10). At each time instant the decoder adds the branch metrics (Hamming distances between the received block and the branch labels) to the path metrics of the connecting survivors, compares the paths entering each node, and keeps only the path with the smallest metric as the survivor for that node; the accumulated path metrics are written above the nodes. [The step-by-step trellis diagrams are not reproduced in this extraction.]
• After the last received block has been processed, the survivor with the smallest overall path metric (3 in this example) is traced back through the trellis, giving the decoded code sequence

v̂ = (00, 11, 10, 10, 00, 01, 11)

which equals the transmitted sequence, so the three channel errors in r have been corrected.
The Viterbi Decoding Algorithm

• Viterbi soft-decision decoding for the AWGN channel is very similar: the same trellis is searched, but the squared Euclidean distance between the unquantised demodulator samples and the modulated branch symbols is used as the branch and path metric.
• Example: received sequence of demodulator samples (two samples per trellis branch)

r = (-2, 1, 2, 2, -3, 3, 1, 1, -1, -1, 3, -1, 1, -2)

• For instance, the squared Euclidean distance between the first received pair (-2, 1) and a branch whose BPSK symbols are (+1, +1) is

d_E^2 = (-2 - 1)^2 + (1 - 1)^2 = 3^2 + 0^2 = 9

[Trellis diagrams with Euclidean branch metrics (slides 73–76) are not reproduced in this extraction.]
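A sketch of the soft-decision branch metric, assuming the BPSK mapping bit b → 2b − 1 (0 → −1, 1 → +1), which is not stated explicitly on the slides. In the decoder sketch above one would feed pairs of unquantised samples and replace the Hamming branch metric with this squared Euclidean distance.

```python
# Soft-decision branch metric: squared Euclidean distance between received
# samples and the branch's BPSK symbols (assumed mapping: bit b -> 2*b - 1).
def soft_branch_metric(r_pair, out_bits):
    symbols = [2 * b - 1 for b in out_bits]
    return sum((r - s) ** 2 for r, s in zip(r_pair, symbols))

# First received pair (-2, 1) against a branch transmitted as (+1, +1):
print(soft_branch_metric((-2, 1), (1, 1)))   # 9, matching d_E^2 on the slide
```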
Example of the Viterbi Algorithm

• If the transmitted sequence has a finite length, it is decoded by appending 2 zeros (in general m zeros) to the end of the message sequence to clear the register.
• If the received sequence is very long and continuous, the decision depth is fixed at 10 branches (in general 5m). After processing each new branch the decoder moves back 10 branches and decodes a message block on the path with the smallest metric.
77
Error Performance for Convolutional Codes with Hard Decision Decoding

• A measure of error performance is the decoded bit-error rate (BER), denoted by P_b.
• For a BSC with channel transition probability p, P_b can be approximated as follows:

P_b ≈ (1/k) · B_dfree · 2^dfree · p^(dfree/2)

where B_dfree is the total number of nonzero message bits on all the code sequences of weight dfree.

78
Error Performance for Convolutional Codes with Hard Decision Decoding

• On a BSC the transition probability for BPSK and QPSK is given by

p = Q(√(2E/N_0)) ≈ (1/2) e^(-E/N_0)

where E is the energy per transmitted uncoded bit, N_0 is the one-sided noise power spectral density, and the approximation applies for large E/N_0.
• For a code of rate R = k/n, the energy per message bit is

E_b = E / R

• For large E_b/N_0, the bit-error probability with hard decision decoding is

P_b ≈ (1/k) · B_dfree · 2^(dfree/2) · e^(-(R·dfree/2)(E_b/N_0))
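A quick numerical check of the transition probability expression above; the E/N_0 value is an arbitrary illustration, and Q(x) is evaluated via the complementary error function.

```python
# BSC transition probability for BPSK: exact Q-function value and the
# exponential form 0.5*exp(-E/N0) used on the slide (an upper bound on Q).
import math

def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))   # Gaussian tail probability

EN0_dB = 6.0                                   # illustrative E/N0 in dB
EN0 = 10 ** (EN0_dB / 10)
p_exact = Q(math.sqrt(2 * EN0))
p_bound = 0.5 * math.exp(-EN0)
print(p_exact, p_bound)                        # the exponential form upper-bounds Q
```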
Error Performance of Convolutional Codes with Soft Decision Decoding

• For large E_b/N_0 the BER for soft decision decoding is approximated by

P_b ≈ (1/(2k)) · B_dfree · e^(-(R·dfree)(E_b/N_0))

80
Coding Gain of Convolutional Codes

• Hard-decision decoding: the asymptotic coding gain of the coded system over the uncoded BPSK system is

G = 10 log10(R·dfree / 2)  (dB)

• Soft-decision decoding: the asymptotic coding gain of a coded system with soft-decision decoding over an uncoded BPSK system is

G = 10 log10(R·dfree)  (dB)

• A soft decision decoder requires 3 dB less power than a hard decision decoder to achieve the same error probability at very high E_b/N_0.

81
Coding Gain of the (2,1,6) Convolutional Code with Hard Decision Decoding

• If the demodulator output is quantised to 8 levels, the coding gain of soft-decision decoding is only about 0.25 dB lower than with an unquantised demodulator output.
• Example: The most widely used convolutional code is the (2,1,6) code generated by the generator sequences

g^(1) = (1101101)
g^(2) = (1001111)

• This code has dfree = 10. The hard-decision asymptotic coding gain is

G = 10 log10(R·dfree / 2) = 10 log10(10/4) = 3.98 dB
82
Coding Gain of the (2,1,6) Convolutional Code with Soft Decision Decoding

• With soft-decision decoding, the coding gain is

G = 10 log10(R·dfree) = 10 log10(10/2) = 6.99 dB

83
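A small check of the two coding gain formulas for the (2,1,6) example (R = 1/2, dfree = 10); this simply reproduces the arithmetic on the previous two slides.

```python
# Asymptotic coding gains for the (2,1,6) code with R = 1/2 and dfree = 10.
import math

R, dfree = 1 / 2, 10
G_hard = 10 * math.log10(R * dfree / 2)   # about 3.98 dB
G_soft = 10 * math.log10(R * dfree)       # about 6.99 dB
print(G_hard, G_soft)
```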
FEC Codes in Intelsat International Business Services (IBS) Systems

[Slide content (figure/table of FEC schemes) not reproduced in this extraction.]
Concatenated Codes

• Two-stage coding scheme
• Inner code with soft decision decoding
• Inner decoding errors are bursty
• Outer RS code corrects inner decoding errors
• Good for achieving very low BER (down to about 10^-12)
85
A Concatenated Coded System

[Block diagram of a concatenated coded system (outer encoder and decoder around an inner encoder/decoder and the channel) not reproduced in this extraction.]
Standard Concatenated Codes

• The standard NASA (Galileo) and ESA concatenated code consists of a (255,223) outer Reed-Solomon code and a (2,1) memory-order-6 convolutional inner code.
• The DVB-S standard uses a concatenated code consisting of a variable-rate memory-order-6 convolutional inner code and a shortened RS(204,188) outer code.
• DVB-S2 consists of an LDPC inner code and a BCH outer code.
• The outer code corrects the sporadic errors remaining after LDPC decoding.
• The BCH codes have the same length as the LDPC codes and an error correction capability of 8–12 bits.

87
Interleaving

• Used against burst errors.
• If the number of errors within a codeword exceeds the code's error-correcting capability, the decoder fails to recover the original codeword.
• Interleaving spreads the transmitted code symbols across a number of codewords and thus produces a more uniform distribution of errors.
• Types of interleavers:
  - Block
  - Convolutional
  - Random
88
Block Interleaver

[Diagram of a block interleaver: code symbols written into a rectangular array row by row and read out column by column; not reproduced in this extraction.]
Block Interleaver

• Read into the interleaver row by row: x0, x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11
• Interleaved and transmitted column by column: x0, x3, x6, x9, x1, x4, x7, x10, x2, x5, x8, x11
• Transmission with a burst error: the burst hits a run of consecutive transmitted symbols (for example x9, x1, x4, x7).
• Received codewords after deinterleaving:
  x0, x1, x2,
  x3, x4, x5,
  x6, x7, x8,
  x9, x10, x11
• One error in each codeword – easier to correct than a burst.

90
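A minimal sketch of the 4×3 block interleaver described above; the function names and array size are illustrative.

```python
# Block interleaver: write symbols into a rows x cols array row by row and
# transmit column by column; the deinterleaver reverses the mapping.
def interleave(symbols, rows, cols):
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(symbols, rows, cols):
    return [symbols[c * rows + r] for r in range(rows) for c in range(cols)]

x = [f"x{i}" for i in range(12)]
tx = interleave(x, rows=4, cols=3)
print(tx)                              # x0, x3, x6, x9, x1, x4, ... as on the slide
print(deinterleave(tx, 4, 3) == x)     # True: a burst in tx is spread across rows
```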
Convolutional Interleaver

[Diagram of a convolutional interleaver: a commutator cycles over parallel branches containing shift registers (d – shift register; memory of order 4 in the example); not reproduced in this extraction.]

Example output order: x0, x-3, x-6, x-9, x4, x1, x-2, x-5
91
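A sketch of a convolutional interleaver consistent with the output order above, assuming B = 4 branches in which branch j delays its symbols by j·B positions; the class structure and names are illustrative.

```python
# Convolutional interleaver with B branches: a commutator cycles over the
# branches, branch j being a FIFO of j cells, so a symbol entering branch j
# leaves it j*B time slots later.
from collections import deque

class ConvInterleaver:
    def __init__(self, branches=4):
        self.B = branches
        self.regs = [deque() for _ in range(branches)]
        self.t = 0

    def push(self, symbol):
        j = self.t % self.B            # branch selected by the commutator
        self.t += 1
        if j == 0:
            return symbol              # branch 0: no delay
        self.regs[j].append(symbol)
        if len(self.regs[j]) <= j:
            return None                # register still filling up
        return self.regs[j].popleft()

intlv = ConvInterleaver()
print([intlv.push(f"x{i}") for i in range(8)])
# ['x0', None, None, None, 'x4', 'x1', None, None]; in steady state the None
# slots carry the earlier symbols x-3, x-6, x-9, x-2, x-5 as on the slide.
```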
Error Performance of the Concatenated Code Used in the NASA Standard

[BER performance curves not reproduced in this extraction.]
92
Four Jupiter Moons (Galileo Photos)

[Images not reproduced in this extraction.]
93
Binary Linear Block Codes: Review

• An (n, k) binary linear block code can be generated by

v = c · G

where c is the message, G is the generator matrix and v is the codeword.
• If the generator matrix is of the form

G = [P  I_k]

where P is a k×(n−k) binary matrix and I_k is a k×k identity matrix, the code is systematic, which means that the last k symbols of each codeword are the message and the first n−k symbols are the parity symbols.
Parity Check Matrix of Linear Block Codes

• The parity check matrix of an (n, k) binary linear block code describes the linear relations that the code symbols must satisfy.
• The parity check matrix of a systematic code can be obtained from its generator matrix as

H = [I_{n−k}  P^T]

• For any codeword v in a linear block code

v · H^T = (0 0 … 0)   (a row of n − k zeros)

• This is a useful property for decoding.

95
Parity Check Matrix - Example

• Example: Consider a (7,4) linear systematic code with generator matrix G = [P  I_4] and parity check matrix H = [I_3  P^T]:

G = [ 1 1 0 | 1 0 0 0 ]        H = [ 1 0 0 | 1 0 1 1 ]
    [ 0 1 1 | 0 1 0 0 ]            [ 0 1 0 | 1 1 1 0 ]
    [ 1 1 1 | 0 0 1 0 ]            [ 0 0 1 | 0 1 1 1 ]
    [ 1 0 1 | 0 0 0 1 ]

• Let v = (1 1 1 0 0 1 0). Then

v · H^T = (0 0 0)
Syndrome and Error Detection

• To test whether a received vector r contains transmission errors, we compute the syndrome of r:

s = (s_0, s_1, …, s_{n−k−1}) = r · H^T

• If s = 0, r is a codeword; it is assumed to be error-free and accepted by the receiver.
• If s ≠ 0, r is not a codeword and must contain transmission errors.
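A quick numerical sketch of the syndrome test for the (7,4) example code, using numpy with mod-2 arithmetic; the error position chosen is arbitrary.

```python
# Syndrome computation s = r * H^T (mod 2) for the (7,4) example code.
import numpy as np

H = np.array([[1, 0, 0, 1, 0, 1, 1],
              [0, 1, 0, 1, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1]])

def syndrome(r):
    return r @ H.T % 2

v = np.array([1, 1, 1, 0, 0, 1, 0])    # the codeword from the example
print(syndrome(v))                      # [0 0 0] -> accepted as error-free
r = v.copy(); r[3] ^= 1                 # introduce a single transmission error
print(syndrome(r))                      # nonzero -> errors detected
```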
LDPC Codes

• LDPC codes are specified by a sparse parity check matrix.
• LDPC codes can be decoded by an iterative probabilistic decoding method.
• Sparseness of the parity check matrix reduces the decoding complexity.
• Long LDPC codes can perform within 0.0045 dB of the Shannon limit.

98
Tanner Graph

• A Tanner graph is a bipartite graph which contains two types of nodes: variable nodes (v-nodes) and check nodes (c-nodes).
• Check node j is connected to variable node i if and only if element h_ji of H equals 1.
• All of the v-nodes connected to a particular c-node must sum (modulo 2) to zero.

99
Tanner Graph - Example

• Example: Consider the (7,4) linear block code with parity check matrix

H = [ 1 0 0 1 0 1 1 ]
    [ 0 1 0 1 1 1 0 ]
    [ 0 0 1 0 1 1 1 ]

It can be represented by the following Tanner graph:

[Tanner graph: variable nodes v0 … v6 on top, check nodes s0, s1, s2 below; check node s_j is connected to the variable nodes whose entries in row j of H are 1.]
100
Encoding of LDPC Codes

• LDPC codes are specified by their parity check matrix H.
• The generator matrix G can be derived from H.
• Transform H into systematic form by row operations:

H --(row operations)--> [I_{n−k}  P^T]

• Collect the parity part and construct G:

G = [P  I_k]

• Encode:

v = c · G
101
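A sketch of this encoding procedure for the small (7,4) matrix used in the examples (a real LDPC matrix would be much larger and sparse); the Gaussian elimination assumes H can be brought to systematic form without column permutations.

```python
# Derive G = [P | I_k] from H by row operations over GF(2), then encode.
import numpy as np

def h_to_g(H):
    H = H.copy() % 2
    m, n = H.shape                           # m = n - k parity checks
    for i in range(m):
        pivot = i + int(np.argmax(H[i:, i])) # find a row with a 1 in column i
        H[[i, pivot]] = H[[pivot, i]]
        for r in range(m):
            if r != i and H[r, i]:
                H[r] ^= H[i]                 # clear column i in the other rows
    P = H[:, m:].T                           # right-hand part of H is P^T
    return np.hstack([P, np.eye(n - m, dtype=int)])

H = np.array([[1, 0, 0, 1, 0, 1, 1],
              [0, 1, 0, 1, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1]])
G = h_to_g(H)
c = np.array([1, 0, 1, 1])                   # message
v = c @ G % 2                                # codeword v = c*G
print(v, v @ H.T % 2)                        # syndrome is all-zero
```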
Decoding of LDPC Codes

• LDPC codes can be decoded in various ways:
  - Bit-flipping (BF) decoding
  - Sum-product algorithm (SPA)

102
LDPC Decoding by Bit-Flipping

• Received binary sequence: r = (1 0 1 1 0 0 1)
• Transmitted codeword (unknown to the receiver): u = (1 0 1 0 0 0 1)
• The code is the (7,4) code with the parity check matrix H given above; its Tanner graph (variable nodes v0 … v6, check nodes s0, s1, s2) is used in the following steps.
103
LDPC Decoding by Bit-Flipping

• Set the initial variable nodes equal to the received binary sequence:

v = (v0, v1, v2, v3, v4, v5, v6) = (1 0 1 1 0 0 1)

• The check nodes are s = (s0, s1, s2).

104
LDPC Decoding by Bit-Flipping

• Calculate the check nodes as the syndrome components:

s = (s0, s1, s2) = v · H^T = r · H^T = (1 1 0)

s0 = v0 ⊕ v3 ⊕ v5 ⊕ v6 = 1;  s1 = v1 ⊕ v3 ⊕ v4 ⊕ v5 = 1;  s2 = v2 ⊕ v4 ⊕ v5 ⊕ v6 = 0

• Two syndrome components differ from 0.
• Check which variable node appears in both of these failed checks.
LDPC Decoding by Bit-Flipping

• We see that v3 is contained in both s0 and s1.
• Flip the variable node v3 from v3 = 1 to v3 = 0, giving v = (1 0 1 0 0 0 1).
106
LDPC Decoding by Bit-Flipping

• Pass this new value of the variable node v3 to all connected check nodes and recalculate the parity check values.
• They are all zero now, s = (s0, s1, s2) = (0 0 0), which means that v = (1 0 1 0 0 0 1) is a codeword.
• Output this codeword as the estimate of the transmitted codeword.

107
LDPC Decoding by Bit-Flipping

• Decoded codeword: v = (1 0 1 0 0 0 1)
• Transmitted codeword: u = (1 0 1 0 0 0 1)
• The decoded codeword equals the transmitted one, so the single channel error has been corrected.

108
Generalised Bit-Flipping Decoding

• Step 1: Compute the parity checks.
  - If all checks are zero, stop decoding.
• Step 2: Flip every digit contained in T or more failed check equations.
• Step 3: Repeat Steps 1 and 2 until all the parity checks are zero or a maximum number of iterations is reached.
• The parameter T can be varied for faster convergence. A code sketch of a bit-flipping decoder follows.

109
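A sketch of bit-flipping decoding for the worked example above. Rather than a fixed threshold T, this variant flips the bit(s) whose count of failed minus satisfied checks is largest (a common refinement of the threshold rule); on the example it flips only v3 and recovers the transmitted codeword. Names and the iteration limit are illustrative.

```python
# Bit-flipping LDPC decoding (variant: flip the bits with the largest
# "failed minus satisfied checks" count) for the (7,4) example.
import numpy as np

def bit_flip_decode(r, H, max_iter=20):
    v = r.copy() % 2
    for _ in range(max_iter):
        s = v @ H.T % 2                        # parity checks (Step 1)
        if not s.any():
            break                              # all checks satisfied: stop
        unreliability = (2 * s - 1) @ H        # failed minus satisfied checks per bit
        v[unreliability == unreliability.max()] ^= 1   # flip worst bit(s) (Step 2)
    return v

H = np.array([[1, 0, 0, 1, 0, 1, 1],
              [0, 1, 0, 1, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1]])
r = np.array([1, 0, 1, 1, 0, 0, 1])            # received sequence from slide 103
print(bit_flip_decode(r, H))                   # [1 0 1 0 0 0 1] = u
```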
Sum-Product Algorithm (SPA)

• Decoding is accomplished by passing messages along the edges of the Tanner graph.
• The messages on the edges connected to the i-th variable node v_i are estimates of Pr[v_i = 1] (or some equivalent information).
• At the nodes the various estimates are combined in a particular way and transmitted again.
110
Performance of LDPC Codes

[BER performance curve for a length n = 10^6, rate R = 1/2 LDPC code; plot not reproduced in this extraction.]

111
LDPC Standards

DVB-S2 Standard
Applications: Satellite Video Broadcasting

LDPC codes:

Block Length | Code Rates
16200        | 1/5, 1/3, 2/5, 4/9, 3/5, 2/3, 11/15, 7/9, 37/45, 8/9
64800        | 1/4, 1/3, 2/5, 1/2, 3/5, 2/3, 3/4, 4/5, 5/6, 8/9, 9/10
112
References

1. S. Lin and D. J. Costello, Error Control Coding, Second Edition, Prentice Hall.
2. Digital Video Broadcasting (DVB): Framing structure, channel coding and modulation for 11/12 GHz satellite services. Available online:
   http://www.etsi.org/deliver/etsi_en/300400_300499/300421/01.01.02_60/en_300421v010102p.pdf
3. Digital Video Broadcasting (DVB): Second generation framing structure, channel coding and modulation systems for Broadcasting, Interactive Services, News Gathering and other broadband satellite applications (DVB-S2). Available online:
   http://www.etsi.org/deliver/etsi_en/302300_302399/302307/01.02.01_60/en_302307v010201p.pdf
4. Y. Kou, S. Lin, and M. Fossorier, "Low-Density Parity-Check Codes Based on Finite Geometries: A Rediscovery and New Results", IEEE Transactions on Information Theory, vol. 47, no. 7, p. 2711, November 2001.
5. K. Andrews, S. Dolinar et al., "Design of Low-Density Parity-Check (LDPC) Codes for Deep-Space Applications",
   http://ipnpr.jpl.nasa.gov/progress_report/42-159/159K.pdf

113
