DC Digital Communication PART4
More difficult is to relate the mathematical representations for random variables to the physical properties of the process.
[Figure: a sample function of X(t) observed at time t1, and the pdf f(x) of the corresponding random variable X1.]
discrete random process :
● random variables can assume only certain isolated values
[Figure: a sample function switching between the values 0 and 100, and its pdf: two impulses of weight ½ at x = 0 and x = 100.]
stationary random process:
If all marginal and joint density functions of the process do not depend on the choice of time origin, the process is said to be stationary.
(In this case every mean and moment is a constant.)
nonstationary random process:
If any of the pdfs changes with the choice of time origin, the process is nonstationary.
[Figure: a system driven by a random input produces a random response.]
If the process is stationary and ergodic, the variance of an estimate X̂ formed from a record of length T decreases as 1/T (see Ch. 6).
$$\hat{X} = \frac{1}{N}\sum_{i=1}^{N} X_i$$

mean:
$$E[\hat{X}] = \frac{1}{N}\sum_{i=1}^{N} E[X_i] = \overline{X}$$

mean-square:
$$E[\hat{X}^2] = \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N} E[X_i X_j]$$

Assumption: the samples are statistically independent, that is,
$$E[X_i X_j] = \begin{cases} \overline{X^2}, & i = j \\ \overline{X}^{\,2}, & i \neq j \end{cases}$$

so that
$$E[\hat{X}^2] = \frac{1}{N^2}\left[N\,\overline{X^2} + (N^2 - N)\,\overline{X}^{\,2}\right] = \frac{1}{N}\,\overline{X^2} + \left(1 - \frac{1}{N}\right)\overline{X}^{\,2}$$

and therefore
$$\operatorname{var}\hat{X} = E[\hat{X}^2] - \left(E[\hat{X}]\right)^2 = \frac{1}{N}\left(\overline{X^2} - \overline{X}^{\,2}\right) = \frac{\sigma_X^2}{N}$$
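A quick numerical check of the var X̂ = σ²/N result (a Python sketch; the Gaussian distribution, seed, and sample sizes are illustrative choices, not part of the notes):

```python
import random
import statistics

def sample_mean(xs):
    # X_hat = (1/N) * sum(X_i)
    return sum(xs) / len(xs)

random.seed(0)
N = 100          # samples per estimate
TRIALS = 2000    # independent realizations of X_hat
MEAN, SIGMA2 = 10.0, 4.0

estimates = [
    sample_mean([random.gauss(MEAN, SIGMA2 ** 0.5) for _ in range(N)])
    for _ in range(TRIALS)
]

# The spread of X_hat across trials should be close to sigma^2 / N = 0.04
var_of_estimate = statistics.pvariance(estimates)
print(var_of_estimate)
```

The measured variance of the estimator shrinks with N exactly as the derivation predicts.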
zero-mean Gaussian random process: N(0, σ_Y²)
5.7 Smoothing Data with a Moving Window Average
Each sample Y_i contains noise N_i. An estimate of the underlying value is formed by averaging over a window extending n_L samples to the left and n_R samples to the right
[window spanning samples i−n_L, …, i−1, i, i+1, …, i+n_R]:
$$\hat{X}_i = \frac{1}{n_L + n_R + 1}\sum_{k=-n_L}^{n_R} Y_{i+k}$$
A kind of LPF
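The window average above can be sketched directly (Python; the clamped-edge handling is one common choice, not specified in the notes):

```python
def moving_average(y, nL, nR):
    # X_hat[i] = (1/(nL + nR + 1)) * sum of y[i - nL .. i + nR];
    # at the edges the window is clamped to the available data.
    n = len(y)
    out = []
    for i in range(n):
        lo = max(0, i - nL)
        hi = min(n, i + nR + 1)
        out.append(sum(y[lo:hi]) / (hi - lo))
    return out

print(moving_average([0.0, 0.0, 3.0, 0.0, 0.0], 1, 1))
```

A constant record passes through unchanged, while an isolated spike is spread and attenuated, which is the low-pass behavior referred to above.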
Random variables, Random processes
$$C_{X,Y}(t_1, t_2) = R_{X,Y}(t_1, t_2) - m_X(t_1)\,m_Y(t_2)$$
$$m_X(t) = 0$$
$$R_X(t_1, t_2) = \tfrac{1}{2}\cos\!\big(2\pi f_c (t_1 - t_2)\big)$$
$$C_X(t_1, t_2) = R_X(t_1, t_2)$$
X(t) and N(t) independent:
$$R_{X,Y}(t_1, t_2) = R_X(t_1, t_2) + m_X(t_1)\,m_N(t_2)$$
• Topics :
• Concepts of deterministic and random processes
stationarity, ergodicity
• random processes :
[Figure: a sample record x(t) versus time t, and the pdf fX(x) of its values.]
• Ensemble averaging :
properties of the process are obtained by averaging over a
collection or ‘ensemble’ of sample records using values at
corresponding times
• Time averaging :
properties are obtained by averaging over a single record in
time
Random processes - basic concepts
• Ergodic process :
stationary process in which averages from a single record
are the same as those obtained from averaging over the
ensemble
Most stationary random processes can be treated as ergodic.
Wind loading from extra-tropical synoptic gales can be treated as stationary random processes.
Wind loading from hurricanes - stationary over shorter periods (<2 hours), non-stationary over the duration of the storm.
Wind loading from thunderstorms, tornadoes - non-stationary.
Random processes - basic concepts
• Mean value :
$$\bar{x} = \lim_{T\to\infty}\frac{1}{T}\int_0^T x(t)\,dt$$
• Mean square value :
$$\overline{x^2} = \lim_{T\to\infty}\frac{1}{T}\int_0^T x^2(t)\,dt$$
• Variance : the mean square of the fluctuating part x'(t) = x(t) − x̄,
$$\sigma_x^2 = \lim_{T\to\infty}\frac{1}{T}\int_0^T \left[x(t) - \bar{x}\right]^2 dt$$
• Autocorrelation :
• The autocorrelation, or autocovariance, describes the general dependency of x(t) with its value at a short time later, x(t+τ):
$$\rho_x(\tau) = \lim_{T\to\infty}\frac{1}{T}\int_0^T \left[x(t) - \bar{x}\right]\left[x(t+\tau) - \bar{x}\right] dt$$
[Figure: R(τ) decaying from its peak at τ = 0 as the time lag τ increases.]
The area under the normalized autocorrelation function defines an average time scale:
$$T_1 = \frac{1}{R(0)}\int_0^\infty R(\tau)\,d\tau$$
• The area under the normalized autocorrelation function for the fluctuating wind velocity measured at a point is a measure of the average time scale of the eddies being carried past the measurement point, say T1
• If we assume that the eddies are being swept past at the mean velocity, U·T1 is a measure of the average length scale of the eddies
• This is known as the ‘integral length scale’, denoted by ℓu
Random processes - basic concepts
• Spectral density :
[Figure: spectral density Sx(n) versus frequency n.]
Basic relationship (2) :
$$S_x(n) = \lim_{T\to\infty}\frac{2}{T}\left|X_T(n)\right|^2$$
Basic relationship (3) :
$$S_x(n) = 2\int_{-\infty}^{\infty}\rho_x(\tau)\,e^{-i2\pi n\tau}\,d\tau$$
Inverse relationship :
$$\rho_x(\tau) = \mathrm{Re}\!\int_0^\infty S_x(n)\,e^{i2\pi n\tau}\,dn = \int_0^\infty S_x(n)\cos(2\pi n\tau)\,dn$$
[Figure: two simultaneous records x(t) and y(t) with mean values x̄ and ȳ.]
• Covariance :
$$c_{xy}(0) = \overline{x'(t)\,y'(t)} = \lim_{T\to\infty}\frac{1}{T}\int_0^T \left[x(t) - \bar{x}\right]\left[y(t) - \bar{y}\right] dt$$
Note that here x'(t) and y'(t) are used to denote the fluctuating parts of x(t) and y(t) (mean parts subtracted).
• Correlation coefficient :
$$\rho = \frac{c_{xy}(0)}{\sigma_x\,\sigma_y}$$
• Correlation - application :
• The fluctuating wind loading of a tower depends on the correlation coefficient between wind velocities, and hence wind loads, at various heights z1 and z2.
John Holmes, JHolmes@lsu.edu
Random Events
A = {1, 2}
Probability
• Joint Probability: P(AB) ≡ P(A ∩ B)
- Probability that both A and B occur
• Conditional Probability: P(A | B) = P(AB) / P(B)
• Statistical Independence:
- Events A and B are statistically independent if:
P(AB) = P(A) P(B)
- If A and B are independent, then:
P(A | B) = P(A) and P(B | A) = P(B)
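These definitions can be checked by brute-force enumeration. A small Python sketch (the two-dice events A and B are illustrative, not from the notes):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # two fair dice

def prob(event):
    # P(event) = (favorable outcomes) / (total outcomes)
    hits = sum(1 for o in outcomes if event(o))
    return Fraction(hits, len(outcomes))

A = lambda o: o[0] % 2 == 0        # first die shows an even number
B = lambda o: o[0] + o[1] == 7     # the sum equals 7

P_A, P_B = prob(A), prob(B)
P_AB = prob(lambda o: A(o) and B(o))
P_A_given_B = P_AB / P_B           # conditional probability P(A | B)
print(P_AB, P_A * P_B)             # equal: A and B are independent here
```

Since P(AB) = P(A)P(B), conditioning on B does not change the probability of A, i.e. P(A | B) = P(A).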
Random Variables
• Definition: F_X(x) = F(x) = P(X ≤ x)
• Properties:
F_X(x) is monotonically nondecreasing
F(−∞) = 0
F(∞) = 1
P(a < X ≤ b) = F(b) − F(a)
• While the CDF defines the distribution of a random variable, we will usually work with the pdf or pmf
• In some texts, the CDF is called PDF (Probability Distribution function)
Wireless Communication Research Laboratory (WiCoRe)
63
• Definition: p_X(x) = dF_X(x)/dx, or p(x) = dF(x)/dx
• Interpretation: the pdf measures how fast the CDF is increasing, or how likely the random variable is to lie around a particular value
• Properties:
p(x) ≥ 0
$$\int_{-\infty}^{\infty} p(x)\,dx = 1$$
$$P(a \le X \le b) = \int_a^b p(x)\,dx$$
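Both properties (pdf as derivative of the CDF, and probability as its integral) can be verified numerically. A Python sketch using a unit-rate exponential r.v. as a concrete example (this distribution is an illustrative choice):

```python
import math

def F(x):          # CDF of a unit-rate exponential random variable
    return 1 - math.exp(-x)

def p(x):          # its pdf, p(x) = dF(x)/dx
    return math.exp(-x)

# pdf as the derivative of the CDF (central finite difference)
x, h = 1.3, 1e-6
deriv = (F(x + h) - F(x - h)) / (2 * h)
print(abs(deriv - p(x)) < 1e-6)    # True

# P(a <= X <= b) = F(b) - F(a) = integral of p over [a, b]
a, b = 0.5, 2.0
print(F(b) - F(a))
```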
Expected Values
- Variance: σ² = E[(X − m_x)²] = ∫ (x − m_x)² p(x) dx
For a discrete random variable X with pmf p(x):
$$\sum_x p(x) = 1$$
$$P(a \le X \le b) = \sum_{x=a}^{b} p(x)$$
• Binary Distribution
p(x) = 1 − p for x = 0, and p(x) = p for x = 1
• This is frequently used for binary data
• Mean: E(X) = p
• Variance: σ_X² = p(1 − p)
• If Y is the number of ones in n independent binary digits (binomial distribution), then
$$p_Y(y) = \binom{n}{y} p^y (1 - p)^{n - y}, \qquad y = 0, 1, \ldots, n$$
• Mean: E(Y) = np
• Variance: σ_Y² = np(1 − p)
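The binomial mean and variance can be verified directly from the pmf (Python sketch; the particular n and p are illustrative):

```python
from math import comb

n, p = 10, 0.3
# Binomial pmf: p_Y(y) = C(n, y) * p^y * (1 - p)^(n - y)
pmf = [comb(n, y) * p**y * (1 - p)**(n - y) for y in range(n + 1)]

mean = sum(y * pmf[y] for y in range(n + 1))
var = sum((y - mean) ** 2 * pmf[y] for y in range(n + 1))
print(mean, var)   # n*p = 3.0 and n*p*(1-p) = 2.1
```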
• Uniform distribution on [a, b]:
• Mean: E(X) = (a + b)/2
• Variance: σ_X² = (b − a)²/12
• Gaussian pdf:
$$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - m_x)^2 / 2\sigma^2}$$
The Q-function
• The function that is frequently used for the area under the tail of the Gaussian pdf is denoted by Q(x):
$$Q(x) = \frac{1}{\sqrt{2\pi}}\int_x^\infty e^{-t^2/2}\,dt, \qquad x \ge 0$$
[Figure: a transmitter sends S = a; the receiver observes R = S + N with N ~ N(0, σ_n²) and decides whether R ≷ 0.]
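The Q-function has no closed form, but it can be computed from the complementary error function; this identity is standard, and ties directly to threshold decisions like the one in the figure. A Python sketch:

```python
import math

def Q(x):
    # Gaussian tail: Q(x) = (1/sqrt(2*pi)) * integral_x^inf exp(-t^2/2) dt
    #              = 0.5 * erfc(x / sqrt(2))
    return 0.5 * math.erfc(x / math.sqrt(2))

print(Q(0.0))   # 0.5: half the Gaussian mass lies above the mean
```

Q(x) decreases monotonically toward 0 as the threshold x moves further into the tail.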
Random Processes
Autocorrelation
Linear systems
• Input: x(t)
• Impulse Response: h(t)
• Output: y(t)
x(t) → h(t) → y(t)
• Deterministic Signals:
- Time domain: y(t) = h(t) * x(t)
- Frequency domain: Y(f) = F{y(t)} = X(f) H(f)
- Frequency domain: |Y(f)|² = |X(f)|² |H(f)|²
NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*4*t);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of discrete random amplitude (PAM)')
Sinusoid of Random Phase
X(t) = cos(2π·4t + Θ), Θ uniform [−π, π]
NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=cos(2*pi*4*t+random('unif',-pi,pi,1,1));
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of cont. random phase')
Sinusoid of Random Frequency
X(t) = cos(2πft), f uniform [1, 4]
NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=cos(2*pi*random('unid',4,1,1)*t);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of discrete random frequency (FSK)')
Sinusoid of Random Amp, Freq, Phase
X(t) = A cos(2πft + Θ), A uniform [1, 4], f uniform [1, 4], Θ uniform [−π, π]
NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*random('unid',4,1,1)*t+random('unif',-pi,pi,1,1));
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of cont. random amp, freq, phase')
White Gaussian Random Process
NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=randn(1,SIMULATION_LENGTH);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of WGN process')
Noisy Random Sinusoid
X(t) = A cos(2πft + Θ) + N
NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*random('unid',4,1,1)*t+random('unif',-pi,pi,1,1))+0.1*randn(1,SIMULATION_LENGTH);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of noisy random sinusoid')
Poisson Arrival Process
$$P[Q(t_2) - Q(t_1) = k] = \frac{[\lambda(t_2 - t_1)]^k}{k!}\,e^{-\lambda(t_2 - t_1)}, \qquad k = 0, 1, 2, \ldots$$
NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
lambda=0.01;
figure(1);
clf;
for n=1:NUM_REAL
    arrivals=random('poiss',lambda,1,SIMULATION_LENGTH);
    realizations(n,:)=cumsum(arrivals);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
Picking a RV from a Random Process
NUM_REAL=10000;
SIMULATION_LENGTH=8;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=randn(1,SIMULATION_LENGTH);
end
x=realizations(:,3);
hist(x,30);
$$R_x(m) = \frac{1}{N - m}\sum_{n=1}^{N-m} X_n X_{n+m}, \qquad m = 0, 1, \ldots, M$$
function [Rxall]=Rx_est(X,M)
N=length(X);
Rx=zeros(1,M+1);
for m=1:M+1,
    for n=1:N-m+1,
        Rx(m)=Rx(m)+X(n)*X(n+m-1);
    end;
    Rx(m)=Rx(m)/(N-m+1);
end;
% mirror the lags to -M..-1, then append 0..M, so that
% Rxall has 2M+1 points and can be plotted against [-M:M]
for i=1:M,
    Rxall(i)=Rx(M+2-i);
end;
for i=M+1:2*M+1,
    Rxall(i)=Rx(i-M);
end
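The same estimator translates directly into Python (a sketch; the mirrored output covers lags −M..M, matching how the scripts below plot against [-M:M]):

```python
def rx_est(x, M):
    # R_x(m) = (1/(N - m)) * sum_n x[n] * x[n + m], for m = 0..M,
    # then mirrored: the autocorrelation of a real stationary process is even.
    N = len(x)
    rx = []
    for m in range(M + 1):
        acc = sum(x[n] * x[n + m] for n in range(N - m))
        rx.append(acc / (N - m))
    return rx[:0:-1] + rx   # [R(-M)..R(-1)] + [R(0)..R(M)]

r = rx_est([1.0, -1.0, 1.0, -1.0], 2)
print(r)   # [1.0, -1.0, 1.0, -1.0, 1.0]
```

The alternating input gives an alternating autocorrelation, peaking at lag 0 as expected.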
Autocorrelation of Gaussian Random Process
N=1000;
X=randn(1,N);
M=50;
Rx=Rx_est(X,M);
plot(X)
title('Gaussian Random Process')
pause
plot([-M:M],Rx)
title('Autocorrelation function')
Autocorrelation of Gauss-Markov Random Process
X [n] = 0.95 X [n - 1] + w[n], w[ n] ~ N (0,1)
X [0] = 0
rho=0.95;
X0=0;
N=1000;
Ws=randn(1,N);
X(1)=rho*X0+Ws(1);
for i=2:N,
X(i)=rho*X(i-1)+Ws(i);
end;
M=50;
Rx=Rx_est(X,M);
plot(X)
title('Gauss-Markov Random Process')
pause
plot([-M:M],Rx)
title('Autocorrelation function')
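For this Gauss-Markov (AR(1)) model the theory gives a steady-state variance of 1/(1 − ρ²) and an autocorrelation that decays as ρ^|m|, which a simulation can confirm. A Python sketch (seed and record lengths are illustrative):

```python
import random

random.seed(1)
rho = 0.95
N = 100_000

x = [0.0]                        # X[0] = 0
for _ in range(N):
    # X[n] = rho * X[n-1] + w[n],  w[n] ~ N(0, 1)
    x.append(rho * x[-1] + random.gauss(0.0, 1.0))
x = x[N // 10:]                  # discard the start-up transient

def r_hat(x, m):
    n = len(x) - m
    return sum(x[i] * x[i + m] for i in range(n)) / n

var_theory = 1.0 / (1.0 - rho ** 2)   # steady-state R_x(0)
ratio_theory = rho ** 10              # R_x(10) / R_x(0)
print(r_hat(x, 0) / var_theory, r_hat(x, 10) / r_hat(x, 0))
```

Both printed ratios come out close to 1 and ρ¹⁰ respectively, matching the geometric decay of the autocorrelation.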
Random Processes
Introduction
1) Relative frequency of event A in n trials:
$$0 \le \frac{N_n(A)}{n} \le 1$$
2) Statistical regularity → Probability of event A:
$$P(A) = \lim_{n\to\infty} \frac{N_n(A)}{n}$$
3. Axioms of Probability.
1) Terminology
a) Sample point s_k: kth outcome of the experiment
b) Sample space S: totality of sample points
c) Sure event: entire sample space S
d) ∅: null or impossible event
e) Elementary event: a single sample point
2) Definition of probability:
a) A sample space S of elementary events
b) A class of events that are subsets of S
Axioms:
(i) P(S) = 1
(ii) 0 ≤ P(A) ≤ 1
(iii) If A ∪ B is the union of two mutually exclusive events in the class, then P(A ∪ B) = P(A) + P(B)
3) Property 1. P(Ā) = 1 − P(A)
4) Property 2. If M mutually exclusive events A_1, A_2, …, A_M have the exhaustive property A_1 ∪ A_2 ∪ … ∪ A_M = S, then P(A_1) + P(A_2) + … + P(A_M) = 1
5) Property 3. P(A ∪ B) = P(A) + P(B) − P(AB)
4. Conditional Probability
1) Conditional probability of B given A (given A means that event A has occurred):
P(B | A) = P(AB) / P(A)
where P(AB) = joint probability of A & B
P(AB) = P(B | A) P(A) = P(A | B) P(B)
P(B | A) = P(A | B) P(B) / P(A) ; Bayes' rule
2) Statistically independent: P(AB) = P(A) P(B)
ex1) BSC (Binary Symmetric Channel): a discrete memoryless channel
[0] A0 → B0 [0] with probability 1 − p
[1] A1 → B1 [1] with probability 1 − p
crossovers A0 → B1 and A1 → B0 each with probability p
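Bayes' rule applied to the BSC: given the output, what is the probability of each input? A Python sketch (the prior and crossover values are illustrative):

```python
# A priori bit probabilities and BSC crossover probability (illustrative)
P_A0, P_A1 = 0.5, 0.5
p = 0.1                                  # P(B1 | A0) = P(B0 | A1) = p

# Total probability of observing B0 at the channel output
P_B0 = (1 - p) * P_A0 + p * P_A1

# Bayes' rule: P(A0 | B0) = P(B0 | A0) * P(A0) / P(B0)
P_A0_given_B0 = (1 - p) * P_A0 / P_B0
print(P_A0_given_B0)   # 0.9 for equiprobable inputs with p = 0.1
```

Observing B0 raises the probability that A0 was sent from the prior 0.5 to 0.9.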
ex) X uniform on [0, 10]:
$$E[X] = \frac{1}{10}\int_0^{10} x\,dx = \frac{1}{20}\,x^2\Big|_0^{10} = 5$$
CNU Dept. of Electronics 120
Lecture on Communication Theory
2) Discrete
$$E[X] = \lim_{n\to\infty}\sum_k x_k \frac{N_n(k)}{n} = \sum_k x_k\,p(k)$$
ex) a die: E[X] = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 7/2
2. Function of r. v.
Y = g(X), X, Y : r. v.
$$E[Y] = E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx$$
ex) Y = g(X) = cos(X),
where f_X(x) = 1/2π for −π ≤ x ≤ π, 0 otherwise:
$$E[Y] = \int_{-\pi}^{\pi} \cos x \cdot \frac{1}{2\pi}\,dx = \frac{1}{2\pi}\,\sin x\Big|_{-\pi}^{\pi} = 0$$
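The E[cos(X)] = 0 result checks out by direct numerical integration (a Python sketch using the midpoint rule; the grid size is an illustrative choice):

```python
import math

# E[Y] = integral of cos(x) * f_X(x) over [-pi, pi], with f_X(x) = 1/(2*pi)
K = 10_000
a, b = -math.pi, math.pi
h = (b - a) / K
E_Y = sum(math.cos(a + (k + 0.5) * h) / (2 * math.pi) * h for k in range(K))
print(E_Y)   # numerically ~0, matching the analytic result
```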
3. Moments
1) nth moments:
$$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$$
n = 1: E[X] = μ_X, the mean
n = 2: E[X²], the mean square value of X
2) Central moments:
$$E[(X - \mu_X)^n] = \int_{-\infty}^{\infty} (x - \mu_X)^n f_X(x)\,dx$$
n = 2: σ_X² = var[X] = E[(X − μ_X)²]
4. Characteristic function
The characteristic function φ_X(v) and the pdf f_X(x) form a transform pair:
$$\varphi_X(v) = E[\exp(jvX)] = \int_{-\infty}^{\infty} f_X(x)\exp(jvx)\,dx$$
$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \varphi_X(v)\exp(-jvx)\,dv$$
5. Joint moments
$$E[X^i Y^j] = \int\!\!\int x^i y^j f_{X,Y}(x, y)\,dx\,dy$$
Correlation:
$$E[XY] = \int\!\!\int x\,y\,f_{X,Y}(x, y)\,dx\,dy$$
Covariance:
$$\operatorname{cov}[XY] = E[(X - E[X])(Y - E[Y])] = E[XY] - \mu_X \mu_Y$$
Correlation coefficient:
$$\rho = \frac{\operatorname{cov}[XY]}{\sigma_X \sigma_Y}$$
X and Y are uncorrelated ⇔ cov[XY] = 0
X and Y are orthogonal ⇔ E[XY] = 0
If E[X] = 0 or E[Y] = 0, then uncorrelated ⇔ orthogonal.
If X, Y are statistically independent, then they are uncorrelated (the converse does not hold in general).
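A classic counterexample for the converse: Y completely determined by X, yet cov[XY] = 0. A Python sketch with exact rational arithmetic (the three-point distribution is an illustrative choice):

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}; Y = X**2 is completely determined by X
pts = [(-1, 1), (0, 0), (1, 1)]        # (x, y) pairs
px = Fraction(1, 3)                    # each outcome equally likely

E_X  = sum(x * px for x, _ in pts)
E_Y  = sum(y * px for _, y in pts)
E_XY = sum(x * y * px for x, y in pts)
cov = E_XY - E_X * E_Y
print(cov)   # 0: X and Y are uncorrelated, yet not independent
```

Here E[XY] = E[X³] = 0 and E[X] = 0, so the covariance vanishes even though Y is a deterministic function of X.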
1. One-to-one transformation Y = g(X):
$$f_Y(y) = \frac{f_X(x)}{|dg/dx|}\Bigg|_{x = g^{-1}(y)}$$
2. Many-to-one transformations:
$$f_Y(y) = \sum_k \frac{f_X(x_k)}{|dg/dx|_{x_k}}$$
where x_k = solutions of g(x) = y
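The one-to-one formula can be checked by Monte Carlo. For Y = X² with X ~ U(0, 1) (an illustrative choice; g is one-to-one on (0, 1)), the formula gives f_Y(y) = 1/(2√y), so F_Y(y) = √y:

```python
import random

random.seed(0)
N = 100_000
ys = [random.random() ** 2 for _ in range(N)]   # Y = g(X) = X**2, X ~ U(0, 1)

# f_Y(y) = f_X(g^{-1}(y)) / |dg/dx| = 1 / (2*sqrt(y)),  so  F_Y(y) = sqrt(y)
y0 = 0.25
frac = sum(1 for y in ys if y <= y0) / N
print(frac)   # close to sqrt(0.25) = 0.5
```

The empirical fraction of samples with Y ≤ 0.25 matches the predicted CDF value √0.25 = 0.5.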
2. Examples of randomness in communication systems
(1) Information-bearing signal: a voice signal consists of randomly spaced bursts of energy of random duration.
(2) In digital communication, the transmitted waveform takes the form of a pseudo-random sequence.
(3) Interference component: takes the form of spurious electromagnetic waves.
(4) Thermal noise: caused by the random motion of electrons in conductors and devices at the front end of the receiver.
Ex 1.1)
[Figure: a possible sample function taking values a1, a2, a3 and b1, b2, b3 at the observation times t1, t2, t3.]
where f_Θ(θ) = 1/2π for −π ≤ θ ≤ π, 0 otherwise.
$$R_X(\tau) = E[X(t + \tau)X(t)] = E\big[A^2\cos(2\pi f_c t + 2\pi f_c \tau + \Theta)\cos(2\pi f_c t + \Theta)\big] = \frac{A^2}{2}\cos(2\pi f_c \tau)$$
R_X(0) = E[X(t)X(t)] = A²
R_X(T) = E[X(t)X(t+T)] = 0, where τ = t − u
Here R_XY(τ) ≠ R_XY(−τ), i.e., the cross-correlation is not an even function, and R_XY(0) is not necessarily a maximum; instead
R_XY(τ) = R_YX(−τ)
If R_12(0) = E[X_1(t)X_2(t)] = 0, the two processes are orthogonal.
$$R_{XY}(\tau) = \tfrac{1}{2}\sin(2\pi f_C \tau)$$
1.5 Ergodic Processes
1. Ensemble average and time average
(1) Expectation or ensemble average of r.p. X(t)
→ average “across the process”
(2) Time average or long-term sample average
→ average “along the process”
(3) For a sample function x(t) of a w.s.s. r.p. X(t) observed over −T ≤ t ≤ T:
(a) Time average (dc value)
$$\mu_X(T) = \frac{1}{2T}\int_{-T}^{T} x(t)\,dt$$
where
$$R_X(\tau, T) = \frac{1}{2T}\int_{-T}^{T} x(t + \tau)\,x(t)\,dt$$
Mean of the output of a filter h driven by X(t):
$$\mu_Y = \int h(\tau_1)\,\mu_X(t - \tau_1)\,d\tau_1 = \mu_X \int h(\tau_1)\,d\tau_1 = \mu_X H(0) \qquad (\text{w.s.s. } X(t))$$
X(t), Y(t) are w.s.s.
2. Autocorrelation fct.
$$R_Y(t, u) = E[Y(t)Y(u)] = E\!\left[\int h(\tau_1)X(t - \tau_1)\,d\tau_1 \int h(\tau_2)X(u - \tau_2)\,d\tau_2\right]$$
$$= \int d\tau_1\,h(\tau_1)\int d\tau_2\,h(\tau_2)\,R_X(t - \tau_1,\, u - \tau_2)$$
$$= \int d\tau_1\,h(\tau_1)\int d\tau_2\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2), \qquad \tau = t - u \ (\text{w.s.s. } X(t))$$
⇒ Y(t) is also w.s.s.
$$E[Y^2(t)] = \int\!\!\int \left[\int H(f)\exp(j2\pi f\tau_1)\,df\right] h(\tau_2)\,R_X(\tau_2 - \tau_1)\,d\tau_1\,d\tau_2$$
$$= \int df\,H(f)\int d\tau_2\,h(\tau_2)\int_{-\infty}^{\infty} R_X(\tau_2 - \tau_1)\exp(j2\pi f\tau_1)\,d\tau_1 \qquad (\text{let } \tau = \tau_2 - \tau_1)$$
$$= \int df\,H(f)\int_{-\infty}^{\infty} d\tau_2\,h(\tau_2)\exp(j2\pi f\tau_2)\int_{-\infty}^{\infty} R_X(\tau)\exp(-j2\pi f\tau)\,d\tau$$
$$= \int_{-\infty}^{\infty} |H(f)|^2\, S_X(f)\,df$$
4) Property 3.
For a w.s.s. r.p., S_X(f) ≥ 0 for all f.
$$E[X^2(t)] = R_X(0) = \int_{-\infty}^{\infty} S_X(f)\,df$$
5) Property 4.
S_X(−f) = S_X(f): even fct., since R_X(−τ) = R_X(τ)
6) Property 5.
The p.s.d., appropriately normalized, has the properties usually associated with a probability density fct.:
$$p_X(f) = \frac{S_X(f)}{\int_{-\infty}^{\infty} S_X(f)\,df}$$
$$W_{rms} = \left(\int_{-\infty}^{\infty} f^2\,p_X(f)\,df\right)^{1/2}$$
FIGURE 1.10 Power spectral density of a sine wave with random phase; δ(f) denotes the delta function at f = 0.
Random binary wave:
$$S_X(f) = \frac{E_g(f)}{T} = A^2 T\,\mathrm{sinc}^2(fT)$$
where E_g(f) is the energy spectral density of the pulse.
(let τ − τ₁ + τ₂ = τ₀, i.e., τ = τ₀ + τ₁ − τ₂)
$$S_Y(f) = H(f)\,H^*(f)\,S_X(f)$$
$$S_Y(f) = |H(f)|^2\,S_X(f)$$
ex) differentiator: H(f) = j2πf, so S_Y(f) = (2πf)² S_X(f)
4. Relation among the Power Spectral Density and the Amplitude Spectrum of a Sample Function
$$X(f, T) = \int_{-T}^{T} x(t)\exp(-j2\pi f t)\,dt$$
Obtain R_X(τ) using the time-average formula
$$R_X(\tau) = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t + \tau)\,x(t)\,dt$$
$$S_X(f) = \lim_{T\to\infty}\frac{1}{2T}\,E\!\left[\,|X(f, T)|^2\,\right] = \lim_{T\to\infty}\frac{1}{2T}\,E\!\left[\,\Big|\int_{-T}^{T} x(t)\exp(-j2\pi f t)\,dt\Big|^2\,\right]$$
For the sum Z(t) = X(t) + Y(t):
$$R_Z(t, u) = E[Z(t)Z(u)] = R_X(t, u) + R_{XY}(t, u) + R_{YX}(t, u) + R_Y(t, u)$$
(let τ = t − u)
$$R_Z(\tau) = R_X(\tau) + R_{XY}(\tau) + R_{YX}(\tau) + R_Y(\tau)$$
$$S_Z(f) = S_X(f) + S_{XY}(f) + S_{YX}(f) + S_Y(f)$$
when X(t) and Y(t) are uncorrelated:
$$S_Z(f) = S_X(f) + S_Y(f)$$
$$R_{VZ}(\tau) = \int\!\!\int h_1(\tau_1)\,h_2(\tau_2)\,R_{XY}(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2$$
$$S_{VZ}(f) = H_1(f)\,H_2^*(f)\,S_{XY}(f)$$
where h_1, h_2 are stable, linear, time-invariant filters.
If a Gaussian process is stationary in the wide sense, it is also strictly stationary.
4) Property 4.
If random variables X(t1), X(t2), , X(tn), obtained by sampling a Gaussian process X(t)
at times t1,t2,…,tn, are uncorrelated, i. e.