ECEVSP L04 Channel Coding
[Block diagram: Source → Compression encoder → Error-correction encoder → Modulator → Medium (noise) → Demodulator → Error-correction decoder → Compression decoder → Sink]
Noisy Channels

Binary symmetric channel (BSC): input x ∈ A_X, output y ∈ A_Y, transition probabilities Q_{j|i} = P(y = b_j | x = a_i):

    P(y = 0 | x = 0) = 1 − f    P(y = 0 | x = 1) = f
    P(y = 1 | x = 0) = f        P(y = 1 | x = 1) = 1 − f
Binary erasure channel (BEC): input x ∈ {0, 1}, output y ∈ {0, ?, 1}:

    P(y = 0 | x = 0) = 1 − f    P(y = 0 | x = 1) = 0
    P(y = ? | x = 0) = f        P(y = ? | x = 1) = f
    P(y = 1 | x = 0) = 0        P(y = 1 | x = 1) = 1 − f
Gaussian noise: z ∼ Normal(µ, σ²), with density p(z) = (1/√(2πσ²)) e^(−(z−µ)²/(2σ²))
Shot noise:
- random fluctuations of electric current in electronic devices
Man-made noise:
- cross-talk across wires
- interfering wireless signals
[Diagram: Encoder → Noisy Channel → Decoder]
http://www.inference.org.uk/itprnn/book.pdf
Some binary codes
(and encoding and decoding)
Source → s → Encoder → t → Noisy Channel → r → Decoder → ŝ → Sink
More generally

Each transmitted bit t passes through a BSC with flip probability f: it is received correctly with probability 1 − f and flipped with probability f. Without coding, the bit error probability is simply p_b = f.
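The uncoded baseline can be checked with a quick Monte-Carlo sketch (NumPy assumed; sample size and seed are illustrative choices):

```python
import numpy as np

# Monte-Carlo check that uncoded transmission over a BSC(f) gives p_b = f
rng = np.random.default_rng(0)
f = 0.1
t = rng.integers(0, 2, size=100_000)  # transmitted bits
flips = rng.random(t.size) < f        # each bit flips independently with probability f
r = t ^ flips                         # received bits
p_b = np.mean(r != t)
print(p_b)  # close to f = 0.1
```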
Repetition code

Encoding: t = [s s ... s] (N copies)    Code rate: R = 1/N
Decoding: majority vote

Example: N = 3

  s      1    0    1    0    0    0    1
  t    111  000  111  000  000  000  111
  n    000  001  000  000  110  000  000
  r    111  001  111  000  110  000  111
  ŝ      1    0    1    0    1    0    1
              *              *
  * corrected error (block 2): a single flip is outvoted.
  * undetected error (block 5): two flips in one block win the vote.
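The N = 3 example above can be reproduced in a few lines (a sketch using NumPy; the noise vector is copied from the example):

```python
import numpy as np

def encode(s, N=3):
    """Repetition code: repeat each source bit N times."""
    return np.repeat(s, N)

def decode(r, N=3):
    """Majority-vote decoding over each block of N received bits."""
    blocks = np.asarray(r).reshape(-1, N)
    return (blocks.sum(axis=1) > N // 2).astype(int)

s = np.array([1, 0, 1, 0, 0, 0, 1])
t = encode(s)
# noise from the example: one flip in block 2, two flips in block 5
n = np.array([0,0,0, 0,0,1, 0,0,0, 0,0,0, 1,1,0, 0,0,0, 0,0,0])
r = (t + n) % 2
s_hat = decode(r)
print(s_hat)  # the fifth bit is decoded incorrectly
```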
Repetition code - error probability

(Each block carries one source bit, so block and bit error probabilities coincide.) For N = 3, decoding fails when two or more of the three copies are flipped:

    p_b = 3 f²(1 − f) + f³

[Plot: p_b versus rate R for repetition codes at f = 0.1; left panel linear scale, right panel log scale down to 10⁻¹⁵. More useful codes would sit in the low-p_b, high-rate corner.]
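The formula generalises to any odd N as a binomial tail sum; a small standard-library sketch:

```python
from math import comb

def repetition_pb(N, f):
    """P(majority vote fails) = P(more than N/2 of the N copies flipped) on a BSC(f)."""
    return sum(comb(N, k) * f**k * (1 - f)**(N - k)
               for k in range((N + 1) // 2, N + 1))

print(repetition_pb(3, 0.1))  # ≈ 0.028, i.e. 3 f²(1−f) + f³ at f = 0.1
```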
The (7,4) Hamming code
A K/N block code with K=4 and N=7.
Code rate R = K/N = 4/7
Encoding (additions modulo 2):

    t = [s1, s2, s3, s4, s1+s2+s3, s2+s3+s4, s1+s3+s4]

[Venn diagram: source bits s1..s4 in the overlaps of three circles; each parity bit t5, t6, t7 makes the parity of its circle even.]
Example: s = [1, 0, 0, 0] encodes to t = [1, 0, 0, 0, 1, 0, 1]; the parity additions are XOR (modulo-2).

[Table: the sixteen (s, t) codeword pairs. Venn diagram: received bits r1..r7 in the same three-circle layout, used for decoding.]
Syndrome decoding:

Transmission: r = t + n
Syndrome computation: z = rHᵀ = nHᵀ (since tHᵀ = 0), with parity-check matrix

    H = [ 1 1 1 0 1 0 0 ]
        [ 0 1 1 1 0 1 0 ]
        [ 1 0 1 1 0 0 1 ]

If z = 0, accept r; otherwise flip the single bit whose column of H equals z.
Decoding failure - Example

Transmitted: s = [1, 0, 0, 0], t = [1, 0, 0, 0, 1, 0, 1]
Received (bits t3 and t7 flipped): r = t + n = [1, 0, 1, 0, 1, 0, 0]
Syndrome: z = rHᵀ = [1, 1, 0], which matches column 2 of H, so the decoder flips r2:
Decoded: t̂ = [1, 1, 1, 0, 1, 0, 0], ŝ = [1, 1, 1, 0]

A two-bit error is mis-corrected into a three-bit error: the (7,4) Hamming code corrects any single-bit error but fails on double errors.
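Both the single-error correction and the double-error failure above can be reproduced with a short NumPy sketch (G is derived from the parity rules t5 = s1+s2+s3, t6 = s2+s3+s4, t7 = s1+s3+s4; variable names are my own):

```python
import numpy as np

G = np.array([[1,0,0,0,1,0,1],   # s1 feeds t5 and t7
              [0,1,0,0,1,1,0],   # s2 feeds t5 and t6
              [0,0,1,0,1,1,1],   # s3 feeds t5, t6, t7
              [0,0,0,1,0,1,1]])  # s4 feeds t6 and t7
H = np.array([[1,1,1,0,1,0,0],
              [0,1,1,1,0,1,0],
              [1,0,1,1,0,0,1]])

def encode(s):
    return s @ G % 2

def decode(r):
    """Syndrome decoding: flip the bit whose column of H equals the syndrome."""
    z = r @ H.T % 2
    t_hat = r.copy()
    if z.any():
        col = int(np.argmax((H.T == z).all(axis=1)))  # first matching column
        t_hat[col] ^= 1
    return t_hat

s = np.array([1, 0, 0, 0])
t = encode(s)                        # [1 0 0 0 1 0 1]

r1 = t ^ np.array([0,0,1,0,0,0,0])   # one flip: corrected
print(decode(r1)[:4])                # [1 0 0 0]

r2 = t ^ np.array([0,0,1,0,0,0,1])   # two flips (t3 and t7): mis-corrected
print(decode(r2)[:4])                # [1 1 1 0]
```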
(7,4) Hamming code - error probability

[Plot: p_b versus rate R at f = 0.1, now including the (7,4) Hamming code alongside the repetition codes; left panel linear scale, right panel log scale.]
Convolutional Codes
Use shift registers for encoding
Example: a rate-1/2 encoder produces two output streams t^(a) and t^(b) from the source stream s via a shift register.

Encoding (a convolution, modulo 2):   t_m^(i) = Σ_{ℓ=0}^{k} g_ℓ^(i) s_{m−ℓ}
Example: s = [1 0 1 1 1 1 0 1 ...]

Feeding the source bits through the shift register (initial state z1 z2 = 00) one at a time:

  step   input s_m   state z1 z2   output t_m^(a) t_m^(b)
    1        1           00                11
    2        0           10                01
    3        1           01                00
    4        1           10                10

so t = [11 01 00 10 ...].
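The stepping above can be reproduced with a small encoder sketch (generators g_a = 101 and g_b = 111, i.e. the (5,7)_8 code that appears later in these slides):

```python
# Rate-1/2 convolutional encoder, constraint length 3, generators (5,7) in octal.
def conv_encode(s, g_a=(1, 0, 1), g_b=(1, 1, 1)):
    z1 = z2 = 0  # shift-register state
    t = []
    for bit in s:
        taps = (bit, z1, z2)
        ta = sum(g * x for g, x in zip(g_a, taps)) % 2  # t^(a) stream
        tb = sum(g * x for g, x in zip(g_b, taps)) % 2  # t^(b) stream
        t += [ta, tb]
        z1, z2 = bit, z1  # shift the register
    return t

print(conv_encode([1, 0, 1, 1]))  # [1, 1, 0, 1, 0, 0, 1, 0], i.e. t = 11 01 00 10
```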
Convolutional Codes

State transition diagram

The encoder is a finite-state machine whose state (z1, z2) is the shift-register contents. Each edge is labelled (s_m, t_m^(a) t_m^(b)):

  from 00:  input 0 → output 00, next 00;   input 1 → output 11, next 10
  from 01:  input 0 → output 11, next 00;   input 1 → output 00, next 10
  from 10:  input 0 → output 01, next 01;   input 1 → output 10, next 11
  from 11:  input 0 → output 10, next 01;   input 1 → output 01, next 11
Convolutional Codes

Trellis diagram: unrolling the state transition diagram in time gives one column of the four states per step, with the same labelled edges between consecutive columns.
Convolutional Codes

Decoding

Branch metric: the Hamming distance between the received pair and the output label of a trellis branch.

  For received pair 10:  branch output 00 → 1,  11 → 1,  01 → 2,  10 → 0
  For received pair 11:  branch output 00 → 2,  11 → 0,  01 → 1,  10 → 1
Path metric: the running sum of branch metrics along a path through the trellis.

For received sequence 11 10 10, accumulating branch metrics step by step gives, for example, path metric 0 for the path 00 → 10 → 11 → 01 (inputs 1, 1, 0; outputs 11 10 10) and larger metrics for all competing paths.
Convolutional Codes

Decoding via the Viterbi algorithm

At every step, each state keeps only its best (minimum-metric) incoming path:

  pathmetric_{m+1}(00) = min( pathmetric_m(00) + d(r_m, 00), pathmetric_m(01) + d(r_m, 11) )

and similarly for the other three states, where d(r_m, ·) is the branch metric of the received pair r_m against the branch output. For r_m = 11 the two candidates into state 00 are pathmetric_m(00) + 2 and pathmetric_m(01) + 0.
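The add-compare-select recursion can be sketched as a minimal hard-decision Viterbi decoder for this trellis (illustrative names; assumes the encoder starts in state 00):

```python
def step(state, bit):
    """One shift-register step of the (5,7)_8 encoder: (next_state, output pair)."""
    z1, z2 = state
    ta = (bit + z2) % 2        # generator g_a = 101
    tb = (bit + z1 + z2) % 2   # generator g_b = 111
    return (bit, z1), (ta, tb)

def viterbi(received_pairs):
    """Hard-decision Viterbi: keep the minimum-metric path into each state."""
    INF = float("inf")
    states = [(z1, z2) for z1 in (0, 1) for z2 in (0, 1)]
    pm = {s: (INF, []) for s in states}
    pm[(0, 0)] = (0, [])  # encoder starts in the all-zero state
    for r in received_pairs:
        new = {s: (INF, []) for s in states}
        for state, (metric, path) in pm.items():
            if metric == INF:
                continue
            for bit in (0, 1):
                nxt, out = step(state, bit)
                # branch metric: Hamming distance to the received pair
                branch = (out[0] != r[0]) + (out[1] != r[1])
                if metric + branch < new[nxt][0]:
                    new[nxt] = (metric + branch, path + [bit])
        pm = new
    return min(pm.values())[1]  # survivor path with the smallest final metric

print(viterbi([(1, 1), (1, 0), (1, 0)]))  # [1, 1, 0]
```

For the received sequence 11 10 10 the zero-metric path through states 00 → 10 → 11 → 01 wins, giving decoded inputs 1, 1, 0. (A sketch that stores whole survivor paths for clarity; a production decoder would use traceback instead.)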
Error probability

[Plot: p_b versus rate R at f = 0.1, adding the (5,7)_8 convolutional code to the repetition and Hamming codes; left panel linear scale, right panel log scale.]
Error probability
[Plot: p_b (log scale, 10⁻¹ down to 10⁻⁶) versus 1 − f for 0.9 ≤ 1 − f ≤ 0.99: uncoded transmission, the (5,7)_8 code, and the (171,133)_8 code, both convolutional codes at rate R = 1/2.]