
Communication Systems

Technology Embedded in Daily Life

VSP 2019, Dr. Lutz Lampe & Dr. Paul Lusina


Importance of Redundancy
Nature loves to repeat itself.
Redundancy … what is it?
• Good
• Bad
• Ugly

[Images: www.shirtaday.com and Niels Heldenreich, labeled for reuse]


Text Errors 1: (random errors)
The PrimC Direc-ive is not just a set of rules. It is a
piilosophy,band a_very !orrect one. His5ory ha~
.rm{en aggin ano again that whKGUver|mankiRz
_nterQeleb with a lhEs }e}elope~xcivyUizatron, no
matteR0itw wslI intenX+UnedMizbtJint&}f#rencl Naf
bN, theAres7_ts 2e #nva}i_El-zdi3actr#u{.
Jean-Luc Picard, on the Prime Directive
Text Errors 2: (systematic errors)
Zhree Rings for zhe Elven-kings under zhe sky,
Seveq for zhe Dwarf-lords iq zheir halls of szoqe,
Qiqe for Morzal Meq doomed zo die,
Oqe for zje Dark Lord oq jix dark zjroqe,
Iq zje Laqd of Mokdok wjeke zje Xjadowx lie,
Oqe kiqg zo kule zjem all, oqe kiqg zo fiqv zjem,
Oqe kiqg zo bkiqg zjem abb aqv iq zje vakkqexx biqv zjem
Iq zje Baqv of Mokvok wjeke zje Xjavowx bie.
J.R.R. Tolkien, The Lord of the Rings
Text Errors 3: (burst erasure errors)
From the*very beginning— from *he first moment, I may almost
say— of my acquaintanc**wit* y*u, your mannor, impre**ing
me with the fullest**e**f of your arrogance, your **nceit, a**
your selfish di**ain of**he feelings of ***ers, were such as***
form***e groundwork of dis***robation o***hich succ***ing
events ***e built so immovable a disl**** ****I ha****t known
you a month before I****t tha****u****e the last man in the
w**** whom I could***** be *****iled on to*****y.
Jane Austen, Pride and Prejudice
Errors in numbers:

Your lottery number has come up and you have won:


1. $5,040,112 (random errors)
2. $3,999,999 (systematic errors)
3. $*,00*,0*0 (erasure errors)
Reflecting on your experience:

• Were you able to read parts of the text messages – why / why not?
• Were you able to read parts of the numeric message – why / why not?
• Which error type is the most severe (random, systematic, burst erasure)?
Fundamental questions about redundancy

Question                                                  Communication system topic
How much redundancy is in a message?                      Shannon information content
What is useful / not useful redundancy?                   Information theory
How do we remove non-useful redundancy from a message?    Source coding
How do we add useful redundancy to a message?             Error correction coding
How do we add useful redundancy to a signal?              Signals & Systems
Communication systems:
The art of delivering a message by removing and adding redundancy.

Communication block         Redundancy strategy
Modulation                  Create a symbol alphabet using amplitude, phase and frequency
Error correction code       Add controlled redundancy for detecting and correcting errors
Source code                 Map the message to a form that uses fewer resources
Guidelines for designing a system by managing redundancy

User (complexity constraint)
• Determine the required reliability of the message
• Determine the resources available for sending the message

Channel
• Know the type of error: random / systematic / burst
• Estimate the frequency of the error events

Source / Error correction coding
• Choose a method of adding / removing redundancy ‘easily’

Signals & Systems
• Design the signal to send the message using the allowed resources (bandwidth, energy, rate, etc.)
Communication channel
The part of a communication system which cannot or should not be changed (and hence cannot be optimized).
We will now look at the unreliable channel:

Source → Compression Encoder → Error Correction Encoder → Modulator → Medium →
Demodulator → Error Correction Decoder → Compression Decoder → Sink

Three questions of error correction coding


1. How do we express the reliability of the channel?
2. How can we encode (decode) data to achieve reliable communication?
3. What is the smallest packet size possible (limit)?
Noisy Channels

$x \in \mathcal{A}_X \qquad Q_{j|i} = P(y = b_j \mid x = a_i) \qquad y \in \mathcal{A}_Y$

Binary symmetric channel (BSC):

$P(y = 0 \mid x = 0) = 1 - f \qquad P(y = 0 \mid x = 1) = f$
$P(y = 1 \mid x = 0) = f \qquad P(y = 1 \mid x = 1) = 1 - f$
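A quick way to build intuition is to simulate the BSC. A minimal sketch in Python (NumPy assumed; the helper name `bsc` is ours, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def bsc(x, f):
    """Pass a binary vector x through a BSC: each bit flips independently with probability f."""
    flips = rng.random(x.shape) < f
    return np.bitwise_xor(x, flips.astype(x.dtype))

x = rng.integers(0, 2, size=100_000)
y = bsc(x, f=0.1)
print(np.mean(x != y))  # empirical flip rate, close to f = 0.1
```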
Noisy Channels

$x \in \mathcal{A}_X \qquad Q_{j|i} = P(y = b_j \mid x = a_i) \qquad y \in \mathcal{A}_Y$

Binary erasure channel (BEC):

$P(y = 0 \mid x = 0) = 1 - f \qquad P(y = 0 \mid x = 1) = 0$
$P(y = \,? \mid x = 0) = f \qquad P(y = \,? \mid x = 1) = f$
$P(y = 1 \mid x = 0) = 0 \qquad P(y = 1 \mid x = 1) = 1 - f$
Noisy Channels

Gaussian channel with quantization: $y = x + n$, $n \sim \mathrm{Normal}(0, \sigma_n^2)$,
$x \in \mathcal{A}_X$, $y \in \mathcal{A}_Y$

$z \sim \mathrm{Normal}(\mu, \sigma^2): \quad p(z) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(z-\mu)^2/(2\sigma^2)}$
probability density function
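To connect this channel to the binary ones, we can sample $y = x + n$ and quantize with a hard decision. A small sketch, assuming a BPSK-style mapping $0 \to +1$, $1 \to -1$ and $\sigma_n = 0.5$ (both are our assumptions; the slide fixes neither):

```python
import numpy as np

rng = np.random.default_rng(0)

bits = rng.integers(0, 2, size=100_000)
x = 1.0 - 2.0 * bits                 # BPSK-style mapping: 0 -> +1, 1 -> -1 (an assumption)
sigma_n = 0.5                        # noise standard deviation (an assumption)
y = x + rng.normal(0.0, sigma_n, size=bits.shape)   # n ~ Normal(0, sigma_n^2)
bits_hat = (y < 0).astype(int)       # 1-bit quantization: hard decision at threshold 0
print(np.mean(bits != bits_hat))     # empirical bit error rate
```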


Sources of noise
Thermal noise:
- electromagnetic radiation generated by objects in the
environment of an antenna
- noise generated by electric circuits due to thermal fluctuations

Shot noise:
- random fluctuations of electric current in electronic devices

Man-made noise:
- cross-talk across wires
- interfering wireless signals

Summation over a large number of different and independent contributions to the effective
noise allows us to invoke the central limit theorem ⇒ Gaussian noise.
How can we get the error rate down?
Add redundancy!

- Instead of transmitting k bits, transmit n bits where n > k.
- The redundant (n − k) bits will help us reverse the errors if we are clever.
Error correction coding

Source → Encoder → Noisy Channel → Decoder → Sink
http://www.inference.org.uk/itprnn/book.pdf
Some binary codes
(and encoding and decoding)

Source (s) → Encoder (t) → Noisy Channel (r) → Decoder (ŝ) → Sink

Probability of bit error: $p_b = P(\hat{s}_k \neq s_k)$

Probability of block error: $p_B = P(\hat{\mathbf{s}} \neq \mathbf{s})$


Error probabilities

More generally:

Probability of block error: $p_B(\mathbf{s}) = P(\hat{\mathbf{s}} \neq \mathbf{s} \mid \mathbf{s})$

Average probability of block error: $p_B = \sum_{\mathbf{s}} P(\mathbf{s})\, P(\hat{\mathbf{s}} \neq \mathbf{s} \mid \mathbf{s})$

Probability of bit error: $p_b(s_k) = P(\hat{s}_k \neq s_k \mid s_k)$

Average probability of bit error: $p_b = \frac{1}{K} \sum_{k=1}^{K} \sum_{s_k} P(s_k)\, P(\hat{s}_k \neq s_k \mid s_k)$
Binary symmetric channel (BSC)

t → r with crossover probability f:
0 stays 0 with probability 1 − f and flips to 1 with probability f;
1 stays 1 with probability 1 − f and flips to 0 with probability f.

For uncoded transmission: $p_b = f$
Repetition code
Encoding: $\mathbf{t} = [\,\underbrace{s\ s\ \dots\ s}_{N}\,]$    Code rate: $R = 1/N$
Decoding: majority vote

Example: N = 3
s    1    0    1    0    0    0    1
t   111  000  111  000  000  000  111
n   000  001  000  000  110  000  000
r   111  001  111  000  110  000  111
ŝ    1    0    1    0    1    0    1

(block 2: single flip, error corrected; block 5: double flip, undetected error)
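The encoding and majority-vote decoding are easy to check in code. A minimal sketch reproducing the N = 3 example above (function names are ours):

```python
import numpy as np

def encode_repetition(s, N=3):
    return np.repeat(s, N)                       # t = [s s ... s], N copies of each bit

def decode_repetition(r, N=3):
    blocks = r.reshape(-1, N)
    return (blocks.sum(axis=1) > N // 2).astype(int)   # majority vote per block

s = np.array([1, 0, 1, 0, 0, 0, 1])
t = encode_repetition(s)
n = np.array([0,0,0, 0,0,1, 0,0,0, 0,0,0, 1,1,0, 0,0,0, 0,0,0])  # noise from the example
r = np.bitwise_xor(t, n)
print(decode_repetition(r))   # [1 0 1 0 1 0 1] -- block 2 corrected, block 5 undetected
```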
Repetition code - error probability
Note: bit error = block error here, since each codeword carries a single bit.

For the BSC with N odd, majority decoding fails when more than half of the N copies flip:

$p_b = \sum_{n=(N+1)/2}^{N} \binom{N}{n} f^n (1-f)^{N-n}$
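The sum can be evaluated directly; a small sketch using Python's math.comb (the function name repetition_pb is ours):

```python
from math import comb

def repetition_pb(N, f):
    """Bit error probability of the N-fold repetition code on a BSC(f), N odd:
    majority decoding fails iff more than half of the N copies are flipped."""
    return sum(comb(N, n) * f**n * (1 - f)**(N - n) for n in range((N + 1) // 2, N + 1))

print(repetition_pb(3, 0.1))    # ~0.028
print(repetition_pb(11, 0.1))   # ~3.0e-4: lower p_b, but the rate drops to 1/11
```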
Repetition code - error probability

[Figure: $p_b$ vs. code rate R for repetition codes over a BSC with f = 0.1; left panel linear
scale, right panel log scale down to $10^{-15}$. More useful codes lie toward high rate and
low $p_b$.]
The (7,4) Hamming code
A K/N block code with K=4 and N=7.
Code rate R = K/N = 4/7

Encoding: each parity bit $t_5, t_6, t_7$ covers one circle of the Venn-diagram picture over
the source bits $s_1, \dots, s_4$; + denotes XOR (modulo-2 addition):

$\mathbf{t} = [t_1, t_2, t_3, t_4, t_5, t_6, t_7] = [s_1,\ s_2,\ s_3,\ s_4,\ s_1{+}s_2{+}s_3,\ s_2{+}s_3{+}s_4,\ s_1{+}s_3{+}s_4]$
The (7,4) Hamming code
 s     t         s     t         s     t         s     t
0000  0000000   0100  0100110   1000  1000101   1100  1100011
0001  0001011   0101  0101101   1001  1001110   1101  1101000
0010  0010111   0110  0110001   1010  1010010   1110  1110100
0011  0011100   0111  0111010   1011  1011001   1111  1111111

Can also write this as $\mathbf{t} = \mathbf{s}G$ (row vectors, arithmetic modulo 2) with the
generator matrix

$G = \begin{bmatrix} 1 & 0 & 0 & 0 & 1 & 0 & 1 \\ 0 & 1 & 0 & 0 & 1 & 1 & 0 \\ 0 & 0 & 1 & 0 & 1 & 1 & 1 \\ 0 & 0 & 0 & 1 & 0 & 1 & 1 \end{bmatrix}$
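Encoding $\mathbf{t} = \mathbf{s}G \bmod 2$ is a one-liner with NumPy; a sketch that reproduces the $\mathbf{s} = 1000$ row of the table:

```python
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 0, 1],
              [0, 1, 0, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 0, 1, 1]])

def hamming_encode(s):
    return (np.array(s) @ G) % 2          # t = sG over GF(2)

print(hamming_encode([1, 0, 0, 0]))       # [1 0 0 0 1 0 1], matching the table
```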
The (7,4) Hamming code
(codeword table as above)

Observation: Any two codewords differ from each other in at least three bits.

⇒ can correct (at least) 1 bit error
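The observation can be verified by brute force over all 16 codewords; a small sketch:

```python
from itertools import product
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 0, 1],
              [0, 1, 0, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 0, 1, 1]])

# All 16 codewords of the (7,4) Hamming code
codewords = [(np.array(s) @ G) % 2 for s in product([0, 1], repeat=4)]

# Minimum Hamming distance over all distinct pairs (for a linear code this equals
# the minimum weight of a nonzero codeword)
d_min = min(int(np.sum(a != b))
            for i, a in enumerate(codewords)
            for b in codewords[i + 1:])
print(d_min)   # 3  ->  can correct floor((3 - 1) / 2) = 1 error
```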


The (7,4) Hamming code
Decoding

Example: $\mathbf{t} = [1, 0, 0, 0, 1, 0, 1]$

$\mathbf{r} = \mathbf{t} + \mathbf{n}$, where $\mathbf{n}$ is the binary noise vector of the
BSC and + is XOR (modulo-2). Three received words, each with a single flipped bit (marked *):

$\mathbf{r} = [1, 1^*\!, 0, 0, 1, 0, 1] \qquad \mathbf{r} = [1, 0, 0, 0, 0^*\!, 0, 1] \qquad \mathbf{r} = [1, 0, 1^*\!, 0, 1, 0, 1]$

The (7,4) Hamming code
Decoding
The pattern of violations of the parity rules is called the syndrome.

syndrome z      000   001   010   011   100   101   110   111
flip this bit   none  r7    r6    r4    r5    r1    r2    r3

[Venn diagram: parity circles r5 (r1, r2, r3), r6 (r2, r3, r4), r7 (r1, r3, r4)]
The (7,4) Hamming code
Syndrome decoding:

generator matrix: $G = [I_4 \ P]$
parity-check matrix: $H = [-P^T \ I_3] = [P^T \ I_3]$ (for binary codes $-P^T = P^T$)

$\mathbf{t} = \mathbf{s}G \quad\Rightarrow\quad \mathbf{t}H^T = \mathbf{0}_3$

Transmission: $\mathbf{r} = \mathbf{t} + \mathbf{n}$

Syndrome computation: $\mathbf{z} = \mathbf{r}H^T = \mathbf{n}H^T$
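A sketch of syndrome decoding per the table above (the FLIP lookup mirrors the "flip this bit" row; names are ours):

```python
import numpy as np

P = np.array([[1, 0, 1],
              [1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
H = np.hstack([P.T, np.eye(3, dtype=int)])   # H = [P^T I_3]

# Syndrome (z1, z2, z3) -> 0-based index of the bit to flip, per the table above
FLIP = {(0, 0, 1): 6, (0, 1, 0): 5, (0, 1, 1): 3, (1, 0, 0): 4,
        (1, 0, 1): 0, (1, 1, 0): 1, (1, 1, 1): 2}

def hamming_decode(r):
    z = tuple((H @ r) % 2)                   # syndrome z = rH^T
    t_hat = np.array(r)
    if z in FLIP:
        t_hat[FLIP[z]] ^= 1                  # flip the single most likely erroneous bit
    return t_hat[:4]                         # G = [I_4 P], so the first 4 bits are s

r = np.array([1, 1, 0, 0, 1, 0, 1])          # t = [1,0,0,0,1,0,1] with bit 2 flipped
print(hamming_decode(r))                     # [1 0 0 0]
```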
The (7,4) Hamming code
Decoding failure - Example

Transmitted: $\mathbf{s} = [1, 0, 0, 0] \;\Rightarrow\; \mathbf{t} = [1, 0, 0, 0, 1, 0, 1]$

Received, with two bits flipped (marked *): $\mathbf{r} = \mathbf{t} + \mathbf{n} = [1, 0, 1^*\!, 0, 1, 0, 0^*]$

Syndrome: $\mathbf{z} = \mathbf{r}H^T = [1, 1, 0] \;\Rightarrow\;$ flip $r_2$

Decoded: $\hat{\mathbf{t}} = [1, 1^*\!, 1, 0, 1, 0, 0] \;\Rightarrow\; \hat{\mathbf{s}} = [1, 1, 1, 0] \neq \mathbf{s}$
(7,4) Hamming code - error probability

[Figure: same $p_b$ vs. code rate plots for a BSC with f = 0.1, now with the (7,4) Hamming
code added alongside the repetition codes; left panel linear scale, right panel log scale.]
Convolutional Codes
Use shift registers for encoding.

Example: a rate R = 1/2 encoder producing two output streams $t^{(a)}$ and $t^{(b)}$ from the
input stream s.

Encoding is a convolution: $t_m^{(i)} = \sum_{\nu=0}^{k} g_\nu^{(i)}\, s_{m-\nu} \pmod 2$

The output streams are interleaved: $\mathbf{t} = [t_1^{(a)}\, t_1^{(b)}\, t_2^{(a)}\, t_2^{(b)} \dots]$
Convolutional Codes
Use shift registers for encoding.

Stepping the encoder through s = [1 0 1 1 1 1 0 1 …], where the register state $(z_1, z_2)$
holds the two most recent past input bits:

m   input s_m   state (z1, z2)   output (t_m^(a), t_m^(b))   t so far
1       1           0 0                 1 1                  11
2       0           1 0                 0 1                  1101
3       1           0 1                 0 0                  110100
4       1           1 0                 1 0                  11010010
…
Convolutional Codes

Encoding: $t_m^{(i)} = \sum_{\nu=0}^{k} g_\nu^{(i)}\, s_{m-\nu} \pmod 2$

Generator polynomials:
$\mathbf{g}^{(a)} = [g_0^{(a)}, g_1^{(a)}, g_2^{(a)}] = [1, 0, 1] = 5_8$
$\mathbf{g}^{(b)} = [g_0^{(b)}, g_1^{(b)}, g_2^{(b)}] = [1, 1, 1] = 7_8$

⇒ the (5,7)_8 code
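The shift-register encoder follows directly from the convolution formula; a sketch with the (5,7)_8 generators that reproduces the stepped example above (function name is ours):

```python
def conv_encode(s, g_a=(1, 0, 1), g_b=(1, 1, 1)):
    """Rate-1/2 convolutional encoder with the (5,7)_8 generators."""
    z1 = z2 = 0                              # register contents s_{m-1}, s_{m-2}
    t = []
    for bit in s:
        t_a = (g_a[0] * bit + g_a[1] * z1 + g_a[2] * z2) % 2
        t_b = (g_b[0] * bit + g_b[1] * z1 + g_b[2] * z2) % 2
        t += [t_a, t_b]
        z1, z2 = bit, z1                     # shift the register
    return t

print(conv_encode([1, 0, 1, 1, 1, 1, 0, 1]))  # [1,1, 0,1, 0,0, 1,0, ...] as stepped above
```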
Convolutional Codes
State transition diagram

The states are the register contents $(z_1, z_2)$; each transition is labeled
$(s_m, t_m^{(a)} t_m^{(b)})$:

from state   input 0: output / next state   input 1: output / next state
00           00 / 00                        11 / 10
01           11 / 00                        00 / 10
10           01 / 01                        10 / 11
11           10 / 01                        01 / 11
Convolutional Codes
Trellis diagram

[Figure: the state transition diagram unrolled in time over steps m = 0, 1, 2, 3; each column
contains the four states 00, 01, 10, 11.]
Convolutional Codes
Decoding

The branch metric is the Hamming distance between the received bit pair and the branch label:

received 10:  label 00 → metric 1,  label 11 → 1,  label 01 → 2,  label 10 → 0
received 11:  label 00 → metric 2,  label 11 → 0,  label 01 → 1,  label 10 → 1
Convolutional Codes
Decoding

The path metric of a state is the accumulated branch metric along the path reaching it.

Example: received 11 10 10. The all-zero path 00 → 00 → 00 collects metric 2 + 1 = 3 after
two stages and 2 + 1 + 1 = 4 after three, while better-matching paths accumulate smaller
metrics.
Convolutional Codes
Decoding via the Viterbi algorithm

At each time step m, every state updates its path metric from the two states that lead into
it (Add - Compare - Select). For example, with received pair 11:

$\mathrm{pathmetric}_{m+1}(00) = \min\!\big(\mathrm{pathmetric}_m(00) + 2,\ \mathrm{pathmetric}_m(01) + 0\big)$

Add: add the branch metric to each incoming path metric.
Compare: compare the two candidate sums.
Select: keep the smaller one (the survivor) and discard the other path.
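Putting branch metrics, path metrics, and Add-Compare-Select together gives a complete decoder. A compact hard-decision sketch for the (5,7)_8 code, assuming the encoder starts in state 00 (all names are ours):

```python
STATES = [(0, 0), (0, 1), (1, 0), (1, 1)]

def step(state, b):
    """One encoder transition of the (5,7)_8 code: returns (next_state, output pair)."""
    z1, z2 = state
    return (b, z1), ((b + z2) % 2, (b + z1 + z2) % 2)

def viterbi_decode(r):
    """Hard-decision Viterbi decoding; r is a flat list of received bit pairs."""
    pm = {s: (0 if s == (0, 0) else float('inf')) for s in STATES}  # start in state 00
    paths = {s: [] for s in STATES}
    for i in range(0, len(r), 2):
        rx = (r[i], r[i + 1])
        new_pm, new_paths = {}, {}
        for s in STATES:                         # Add-Compare-Select into each state
            best = None
            for prev in STATES:
                for b in (0, 1):
                    nxt, out = step(prev, b)
                    if nxt != s:
                        continue
                    bm = (out[0] != rx[0]) + (out[1] != rx[1])  # branch metric (Hamming)
                    cand = pm[prev] + bm                        # Add
                    if best is None or cand < best[0]:          # Compare
                        best = (cand, paths[prev] + [b])        # Select
            new_pm[s], new_paths[s] = best
        pm, paths = new_pm, new_paths
    final = min(STATES, key=lambda s: pm[s])     # survivor with the smallest metric
    return paths[final]

print(viterbi_decode([1, 1, 1, 0, 1, 0]))        # [1, 1, 0]: re-encoding gives 11 10 10
```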
Error probability

[Figure: $p_b$ vs. code rate for a BSC with f = 0.1, now including the (5,7)_8 convolutional
code at R = 1/2; left panel linear scale, right panel log scale.]
Error probability

[Figure: $p_b$ (log scale, $10^{-1}$ down to $10^{-6}$) vs. channel reliability 1 − f (0.9 to
0.99) for uncoded transmission, the (5,7)_8 code, and the (171,133)_8 code, both codes at
rate R = 1/2; the (171,133)_8 code reaches far lower $p_b$ at the same rate.]
