
Chapter 5 Random Processes

※ random process : a collection of time functions {x(t)} (the ensemble)
and an associated probability description (marginal or joint pdf)

x(t) : a sample function
X(t1) = X1 : a random variable obtained by sampling the ensemble at time t1
• How to extend the concepts of random variables
to those of random processes? → simple

• What is more difficult is
to relate the mathematical representations for random variables
to the physical properties of the process

※ classification of random processes


continuous / discrete
deterministic / nondeterministic
stationary / nonstationary
ergodic / nonergodic
5.2 Continuous & Discrete Random Processes

 Dependent on the possible values of the random variables

 continuous random process :


● random variables X(t1), X(t2), … can assume any value

within a specified range of possible values


● PDF is continuous (pdf has NO δ function)

[Figure: a continuous sample function X(t) sampled at t1 (giving X1), and its continuous pdf f(x)]
 discrete random process :
● random variables can assume only certain isolated values

● pdf contains only δ functions

[Figure: a sample function X(t) switching between 0 and 100, and its pdf f(x) = ½δ(x) + ½δ(x−100)]

• mixed process : has both continuous and discrete components


5.3 Deterministic & Nondeterministic Random Processes

(a random process is a random function of time)

 nondeterministic random process :


future values of each sample ftn cannot be
exactly predicted from the observed past values
(almost all random processes are nondeterministic)

 deterministic random process :


~ can be exactly predicted ~
(Example)
X(t) = A cos(ωt + θ)
A, ω … known constants
θ … constant for all t
but different for each sample function
→ random variation over the ensemble, not wrt time
→ still possible to define r.v. X(t1), X(t2), …
and to determine pdf for r.v.
Tx ──── Rx

(if the distance between Tx and Rx is unknown, θ is unknown)

Remark : convenient way to obtain a probability model


for signals that are known except for one or two
parameters
5.4 Stationary & Nonstationary Random Processes

• dependency of the pdf on the choice of time

[Figure: ensemble of sample functions of time; sampling at t1 gives the r.v. X(t1) with its pdf]
• stationary random process:
If all marginal and joint density functions of the process do not depend
on the choice of time origin, the process is said to be stationary.
(in this case, all means and moments are constants)
• nonstationary random process:
If any of the pdfs changes with the choice of time origin,
the process is nonstationary.

• All marginal & joint density functions should be independent of
the time origin!
⇒ too stringent
⇒ relaxed condition:
• the mean value of any random variable X(t1) is independent of t1,
& the correlation of two r.v. depends only on the time difference t2 − t1
⇒ stationary in the wide sense

mean, mean-square, variance, correlation coefficient of


any pair of r.v. are constant

input → [random system] → response

For system analysis, the results are the same whether the process is
strictly stationary or stationary in the wide sense! → the two are used
without distinction
5.5 Ergodic & Nonergodic Random Process

• If almost every member of the ensemble shows the
same statistical behavior as the whole ensemble,
then it is possible to determine the statistical
behavior by examining only one typical sample
function.
⇒ Ergodic process
 For ergodic process, the mean values and
moments can be determined by time averages as
well as by ensemble averages, that is,
E[Xⁿ] = ∫_{−∞}^{∞} xⁿ f(x) dx = lim_{T→∞} (1/2T) ∫_{−T}^{T} Xⁿ(t) dt

(Note) This condition cannot hold
unless the process is stationary.
→ ergodic implies stationary (but not vice versa)
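
As a quick illustration (an assumed example, not from the text), the MATLAB
sketch below compares a time average along one sample function with an
ensemble average across sample functions for the ergodic process
X(t) = cos(2πt + θ); both come out near the true mean of 0.

% Sketch (assumed example): time average vs. ensemble average for the
% ergodic process X(t) = cos(2*pi*t + theta), theta ~ U(0, 2*pi).
N_ENS = 2000;                 % ensemble size (assumed)
t = 0:0.01:100;               % long observation interval (assumed)
theta = 2*pi*rand(N_ENS,1);   % one random phase per sample function
X = cos(2*pi*t + theta);      % each row is a sample function (needs R2016b+ expansion)
ens_mean = mean(X(:,1));      % ensemble average at a fixed time
time_mean = mean(X(1,:));     % time average along one sample function
fprintf('ensemble mean %.4f, time mean %.4f (true mean 0)\n', ens_mean, time_mean);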
5.6 Measurement of Random Processes
 Statistical parameters of a random process
= the sets of statistical parameters associated with
the r.v. X(t) at various times t

if stationary

these parameters are the same for all r.v.


→ consider only one set of parameters
 How to estimate the process parameters from
the observations of a single sample function?
← We cannot make an ensemble average for
obtaining the parameters!

if ergodic

make a time average

but, we cannot have a sample function


over infinite time interval

make a time average over a finite
time interval → an approximation to the true value

※ We will determine:
• How good is this approximation?
• Upon what aspects of the measurement does the goodness of the
approximation depend?

• Estimation of the mean value of an ergodic random process {X(t)}:

X̂ = (1/T) ∫₀ᵀ X(t) dt

→ X̂ is a random variable and is not equal to the true mean value X̄
→ How close is X̂ to X̄?
The mean of X̂ should equal X̄ and its variance should be small!

E[X̂] = E[(1/T) ∫₀ᵀ X(t) dt] = (1/T) ∫₀ᵀ X̄ dt = X̄

var[X̂] ∝ 1/T   (see Ch. 6)

→ the longer T, the better the estimate!

(Remark) The above calculation requires an explicit expression for X(t)
→ impossible in practice
⇒ solved by taking discrete measurements
If we measure X(t) at equally spaced time intervals Δt, that is,

X₁ = X(Δt), X₂ = X(2Δt), ...

then the estimate of X̄ can be expressed as

X̂ = (1/N) Σᵢ₌₁ᴺ Xᵢ

mean  1
E Xˆ  E 
N
 1
 i   N
X  E X i  
1
N
X  X
mean-square


 
 ˆ 2

 1
E X   E 2
N
X X
 1
 i j   N 2  E X X 
i j
가정 : statistically independent, that is,
 
E Xi X j  X 2 i j
 X
2
i j

 
 ˆ 2 1
  N

E  X   2 NX 2   N 2  N   X 
2

 1 2
 X 2  1    X 
1
N  N
  X2   X 
1 2

N
      
ˆ  ˆ 2
var X  E  X   E X
 
ˆ 2

1
  X2
N

→ mean of estimate = true mean


 
var Xˆ 
1
N
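
A minimal MATLAB check of the var[X̂] = σ_X²/N result (the true mean,
variance, N, and trial count below are assumed values for the demo):

% Sketch: empirical check that var(Xhat) = sigma_X^2 / N for independent samples.
N = 100;                                % samples per estimate (assumed)
TRIALS = 5000;                          % number of independent estimates (assumed)
Xbar = 10; sigma2 = 4;                  % true mean and variance (assumed)
samples = Xbar + sqrt(sigma2)*randn(TRIALS, N);
Xhat = mean(samples, 2);                % one estimate per row
fprintf('mean of estimates %.3f (true %.1f)\n', mean(Xhat), Xbar);
fprintf('var of estimates %.4f (theory %.4f)\n', var(Xhat), sigma2/N);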
※ See the example in pp. 201–202
[Figure: a zero-mean Gaussian random process N(0, σ_Y²) with Y(t) = 2X(t), and Y²(t)]
5.7 Smoothing Data with a Moving Window Average

Observed data Yᵢ = sample Xᵢ + noise Nᵢ

The window runs from i − n_L to i + n_R:

X̂ᵢ = (1/(n_L + n_R + 1)) Σ_{k=−n_L}^{n_R} Y_{i+k}

→ a kind of LPF (low-pass filter)
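
A minimal MATLAB sketch of this smoother; the test signal, noise level, and
symmetric window (n_L = n_R = 5) are assumptions for illustration:

% Sketch: moving-window average Xhat_i = sum(Y_{i+k}) / (nL + nR + 1).
nL = 5; nR = 5;                          % symmetric window (assumed)
t = linspace(0, 1, 500);
X = sin(2*pi*3*t);                       % underlying signal X_i (assumed)
Y = X + 0.3*randn(size(X));              % observed data Y_i = X_i + N_i
w = ones(1, nL+nR+1)/(nL+nR+1);          % rectangular window weights
Xhat = conv(Y, w, 'same');               % moving average: a kind of LPF
plot(t, Y, t, Xhat, t, X); legend('observed','smoothed','true');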
Random variables, Random processes

Analog and Digital Communications


Autumn 2005-2006

Random Variables
• Outcomes and sample space
• Random Variables
• Mapping outcomes to:
- Discrete numbers → Discrete RVs
- Real line → Continuous RVs
• Cumulative distribution function
• One variable
• Joint cdf



Random Variables
• Probability mass function (discrete RV)
• Probability density function (cont. RV)
• Joint pdf of independent RVs
• Mean
• Variance
• Characteristic function
Φ_X(f) = E[e^{j2πfX}] = ∫ p_X(x) e^{j2πfx} dx   (inverse FT of the pdf)


Random Processes
• Mapping of an outcome (of an
experiment) to a range set R, where R is
a set of continuous functions
• Denoted as X(t; s) or simply X(t)
• For a particular outcome s = s₀, X(t; s₀) is
a deterministic function
• For t = t₀, X(t₀; s) or simply X(t₀) is a
random variable


Random Processes
• Mean
m_X(t) = E[X(t; s)] = E[X(t)]
• Autocorrelation
R_X(t₁, t₂) = E[X(t₁) X(t₂)]
• Autocovariance
C_X(t₁, t₂) = E[X(t₁) X(t₂)] − m_X(t₁) m_X(t₂)
C_X(t₁, t₂) = R_X(t₁, t₂) − m_X(t₁) m_X(t₂)


Random Processes
• Cross-correlation
R_{X,Y}(t₁, t₂) = E[X(t₁) Y(t₂)]
(Processes are orthogonal if R_{X,Y}(t₁, t₂) = 0)
• Cross-covariance
C_{X,Y}(t₁, t₂) = E[X(t₁) Y(t₂)] − m_X(t₁) m_Y(t₂)
C_{X,Y}(t₁, t₂) = R_{X,Y}(t₁, t₂) − m_X(t₁) m_Y(t₂)


Example
X(t) = A cos 2πt,  A is a random variable

m_X(t) = E[A] cos 2πt

R_X(t₁, t₂) = E[A²] cos 2πt₁ cos 2πt₂

C_X(t₁, t₂) = Var(A) cos 2πt₁ cos 2πt₂


Example
X(t) = cos(2πf_c t + Θ),  Θ ~ U(0, 2π)

m_X(t) = 0

R_X(t₁, t₂) = ½ cos(2πf_c(t₁ − t₂))

C_X(t₁, t₂) = R_X(t₁, t₂)

Mean is constant and autocorrelation depends only on t₁ − t₂


Example
Y(t) = X(t) + N(t)

X(t) and N(t) independent

R_{X,Y}(t₁, t₂) = R_X(t₁, t₂) + m_X(t₁) m_N(t₂)


Stationary and WSS RP
• Stationary Random Process (RP)
p_{X(t)}(x) = p_{X(t+T)}(x)  ∀T

• Wide sense stationary (WSS) RP
- Mean constant in time
- Autocorrelation depends only on t₁ − t₂
R_X(t₁, t₂) = R_X(t₂ − t₁) = R_X(τ)
• Stationary ⇒ WSS (converse not true!)


Power Spectral Density (PSD)
• Defined for WSS processes
• Provides power distribution as a
function of frequency
• Wiener-Khinchine theorem
- PSD is Fourier transform of ACF

S_X(f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2πfτ} dτ
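
A small numerical sketch of this transform pair, using the standard pair
R_X(τ) = e^{−a|τ|} ↔ S_X(f) = 2a/(a² + (2πf)²) as an assumed example (not
one of the course's examples):

% Sketch: PSD as the Fourier transform of the ACF.
a = 2; dtau = 0.01; Nf = 4096;                 % decay rate and grid (assumed)
tau = (-Nf/2:Nf/2-1)*dtau;
Rx = exp(-a*abs(tau));                          % ACF R_X(tau) = exp(-a|tau|)
Sx = real(fftshift(fft(ifftshift(Rx))))*dtau;   % numerical Fourier transform
f = (-Nf/2:Nf/2-1)/(Nf*dtau);
Sx_theory = 2*a./(a^2 + (2*pi*f).^2);           % known analytic transform
plot(f, Sx, f, Sx_theory, '--'); xlim([-3 3]);
legend('FFT of ACF','analytic S_X(f)');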



Wind loading and structural response
Lecture 5 Dr. J.D. Holmes

Random processes - basic concepts
Random processes - basic concepts

• Topics :
• Concepts of deterministic and random processes
stationarity, ergodicity

• Basic properties of a single random process


mean, standard deviation, auto-correlation, spectral
density
• Joint properties of two or more random processes
correlation, covariance, cross spectral density, simple input-
output relations
Refs.: J.S. Bendat and A.G. Piersol, “Random Data: Analysis and
Measurement Procedures”, J. Wiley, 3rd ed., 2000.
D.E. Newland, “An Introduction to Random Vibrations, Spectral and
Wavelet Analysis”, Addison-Wesley, 3rd ed., 1996.
Random processes - basic concepts
• Deterministic and random processes :
• both continuous functions of time (usually), mathematical
concepts
• deterministic processes :
physical process is represented by explicit mathematical
relation
• Example :
response of a single mass-spring-damper in free vibration
in laboratory
• Random processes :
result of a large number of separate causes. Described in
probabilistic terms and by properties which are averages
Random processes - basic concepts

• random processes :
[Figure: a sample function x(t) vs time t, with its probability density fX(x)]

• The probability density function describes the general


distribution of the magnitude of the random process, but it
gives no information on the time or frequency content of the
process
Random processes - basic concepts
• Averaging and stationarity :
• Underlying process

• Sample records which are individual representations of the


underlying process

• Ensemble averaging :
properties of the process are obtained by averaging over a
collection or ‘ensemble’ of sample records using values at
corresponding times
• Time averaging :
properties are obtained by averaging over a single record in
time
Random processes - basic concepts

• Stationary random process :


• Ensemble averages do not vary with time

• Ergodic process :
stationary process in which averages from a single record
are the same as those obtained from averaging over the
ensemble
Most stationary random processes can be treated as
ergodic
Wind loading from extra-tropical synoptic gales can be treated
as stationary random processes
Wind loading from hurricanes: stationary over shorter periods (<2 hours),
non-stationary over the duration of the storm
Wind loading from thunderstorms, tornadoes: non-stationary
Random processes - basic concepts
• Mean value :

[Figure: x(t) vs time t over record length T, with mean level x̄]

x̄ = lim_{T→∞} (1/T) ∫₀ᵀ x(t) dt

• The mean value, x̄, is the height of the rectangular area
having the same area as that under the function x(t)

• Can also be defined as the first moment of the p.d.f. (ref. Lecture 3)
Random processes - basic concepts
• Mean square value, variance, standard deviation :

[Figure: x(t) vs time t over record length T, with mean level x̄]

mean square value:  lim_{T→∞} (1/T) ∫₀ᵀ x²(t) dt

variance:  σ_x² = lim_{T→∞} (1/T) ∫₀ᵀ [x(t) − x̄]² dt

(average of the square of the deviation of x(t) from the mean value, x̄)

standard deviation, σ_x, is the square root of the variance
Random processes - basic concepts
• Autocorrelation :

[Figure: sample function x(t) vs time t over record length T]

• The autocorrelation, or autocovariance, describes the general
dependency of x(t) on its value at a short time later, x(t + τ):

c_x(τ) = lim_{T→∞} (1/T) ∫₀ᵀ [x(t) − x̄]·[x(t + τ) − x̄] dt

The value of c_x(τ) at τ = 0 is the variance, σ_x²

Normalized auto-correlation: R(τ) = c_x(τ)/σ_x²,  R(0) = 1
Random processes - basic concepts
• Autocorrelation :

[Figure: normalized autocorrelation R(τ), equal to 1 at τ = 0 and decaying toward 0 with time lag τ]

• The autocorrelation for a random process eventually decays to
zero at large τ

• The autocorrelation for a sinusoidal process (deterministic) is
a cosine function which does not decay to zero
Random processes - basic concepts
• Autocorrelation :

T₁ = ∫₀^∞ R(τ) dτ

[Figure: R(τ) vs time lag τ; the area under the curve is T₁]

• The area under the normalized autocorrelation function for the
fluctuating wind velocity measured at a point is a measure of
the average time scale of the eddies being carried past the
measurement point, say T₁
• If we assume that the eddies are being swept past at the
mean velocity, Ū·T₁ is a measure of the average length scale
of the eddies
• This is known as the ‘integral length scale’, denoted by ℓᵤ
Random processes - basic concepts
• Spectral density :

[Figure: spectral density S_x(n) vs frequency n]

• The spectral density (auto-spectral density, power spectral
density, spectrum) describes the average frequency content of a
random process, x(t)

Basic relationship (1):  σ_x² = ∫₀^∞ S_x(n) dn

The quantity S_x(n)·δn represents the contribution to σ_x²
from the frequency increment δn

Units of S_x(n): [units of x]² · sec
Random processes - basic concepts
• Spectral density :

Basic relationship (2):  S_x(n) = lim_{T→∞} (2/T)·|X_T(n)|²

where X_T(n) is the Fourier Transform of the process x(t) taken
over the time interval −T/2 < t < +T/2

The above relationship is the basis for the usual method
of obtaining the spectral density of experimental data

Usually a Fast Fourier Transform (FFT) algorithm is used
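
A minimal MATLAB sketch of relationship (2); the record (unit-variance white
noise), sampling rate, and length are assumed, and the scaling follows
S_x(n) ≈ (2/T)·|X_T(n)|²:

% Sketch: one-sided spectral density estimate of a sampled record via the FFT.
fs = 100; T = 60;                   % sampling rate [Hz], record length [s] (assumed)
t = (0:1/fs:T-1/fs)';
x = randn(size(t));                 % stand-in for a measured record (assumed)
x = x - mean(x);                    % remove the mean before transforming
N = numel(x);
X = fft(x)/fs;                      % approximates the Fourier transform X_T(n)
Sx = (2/T)*abs(X(1:N/2)).^2;        % one-sided density, units [x^2 * s]
n = (0:N/2-1)'*(fs/N);              % frequency axis [Hz]
loglog(n(2:end), Sx(2:end)); xlabel('frequency n [Hz]'); ylabel('S_x(n)');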


Random processes - basic concepts
• Spectral density :

Basic relationship (3):  S_x(n) = 2 ∫_{−∞}^{∞} c_x(τ) e^{−i2πnτ} dτ

The spectral density is twice the Fourier Transform of the
autocorrelation function

Inverse relationship:

c_x(τ) = Real[∫₀^∞ S_x(n) e^{i2πnτ} dn] = ∫₀^∞ S_x(n) cos(2πnτ) dn

Thus the spectral density and auto-correlation are closely linked -
they basically provide the same information about the process x(t)
Random processes - basic concepts
• Cross-correlation :

[Figure: sample functions x(t) (mean x̄) and y(t) (mean ȳ) vs time t over record length T]

• The cross-correlation function describes the general
dependency of x(t) on another random process y(t + τ),
delayed by a time delay, τ:

c_xy(τ) = lim_{T→∞} (1/T) ∫₀ᵀ [x(t) − x̄]·[y(t + τ) − ȳ] dt
Random processes - basic concepts
• Covariance :

• The covariance is the cross-correlation function with the time
delay, τ, set to zero:

c_xy(0) = x′(t)·y′(t) (time-averaged) = lim_{T→∞} (1/T) ∫₀ᵀ [x(t) − x̄]·[y(t) − ȳ] dt

Note that here x′(t) and y′(t) are used to denote the
fluctuating parts of x(t) and y(t) (mean parts subtracted)

(Section 3.3.5 in “Wind Loading of Structures”)
Random processes - basic concepts
• Correlation coefficient :

• The correlation coefficient, ρ, is the covariance normalized by
the standard deviations of x and y:

ρ = x′(t)·y′(t) / (σ_x·σ_y)

When x and y are identical to each other, the value of
ρ is +1 (full correlation)

When y(t) = −x(t), the value of ρ is −1

In general, −1 < ρ < +1
Random processes - basic concepts
• Correlation - application :
• The fluctuating wind loading of a tower depends on the
correlation coefficient between wind velocities, and hence wind
loads, at various heights

For heights z₁ and z₂:  ρ(z₁, z₂) = u′(z₁)·u′(z₂) / (σ_u(z₁)·σ_u(z₂))
Random processes - basic concepts
• Cross spectral density :

By analogy with the spectral density:  S_xy(n) = 2 ∫_{−∞}^{∞} c_xy(τ) e^{−i2πnτ} dτ

The cross spectral density is twice the Fourier Transform of
the cross-correlation function for the processes x(t) and y(t)

The cross-spectral density (cross-spectrum) is a complex number:

S_xy(n) = C_xy(n) + iQ_xy(n)

C_xy(n) is the co(-incident) spectral density - (in phase)
Q_xy(n) is the quad(-rature) spectral density - (out of phase)
Random processes - basic concepts
• Normalized co-spectral density :

ρ_xy(n) = C_xy(n) / √(S_x(n)·S_y(n))

It is effectively a correlation coefficient for fluctuations at
frequency, n

Application: Excitation of resonant vibration of structures
by fluctuating wind forces

If x(t) and y(t) are local fluctuating forces acting at
different parts of the structure, ρ_xy(n₁) describes how
well the forces are correlated (‘synchronized’) at the
structural natural frequency, n₁
Random processes - basic concepts
• Input - output relationships :

Input x(t) → [Linear system] → Output y(t)

There are many cases in which it is of interest to know how an
input random process x(t) is modified by a system to give a
random output process y(t)

Application: the input is wind force - the output is
structural response (e.g. displacement, acceleration,
stress). The ‘system’ is the dynamic characteristics of
the structure.

Linear system : 1) output resulting from a sum of inputs
is equal to the sum of outputs produced by each input
individually (additive property)
Linear system : 2) output produced by a constant times
the input is equal to the constant times the output
produced by the input alone (homogeneous property)
Random processes - basic concepts
• Input - output relationships :

Relation between spectral density of output and spectral density of input:

S_y(n) = A·|H(n)|²·S_x(n)

|H(n)|² is a transfer function, frequency response function, or ‘admittance’

[Figure: S_x(n) passed through A·|H(n)|² gives S_y(n), plotted vs frequency n]

Proof : Bendat & Piersol, Newland
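
A short MATLAB sketch of S_y(n) = A·|H(n)|²·S_x(n); the input spectrum shape
and the single-degree-of-freedom admittance (natural frequency n₁, damping
ratio ζ) are assumptions for illustration, not values from the lecture:

% Sketch: output spectrum = gain * admittance * input spectrum.
n = linspace(0.01, 5, 1000);            % frequency [Hz] (assumed range)
Sx = 1./(1 + (n/0.5).^(5/3));           % assumed broadband input spectrum
n1 = 2.0; zeta = 0.05; A = 1;           % assumed system parameters
H2 = 1./((1-(n/n1).^2).^2 + (2*zeta*n/n1).^2);   % |H(n)|^2, 1-DOF system
Sy = A*H2.*Sx;                          % resonant peak appears near n1
semilogy(n, Sx, n, Sy); legend('S_x(n)','S_y(n)'); xlabel('frequency n');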
End of Lecture 5

John Holmes
225-405-3789 JHolmes@lsu.edu

Review of Probability and Random Processes

Wireless Communication Research Laboratory (WiCoRe)



Importance of Random Processes

• Random variables and processes talk about quantities and


signals which are unknown in advance
• The data sent through a communication system is modeled
as a random variable
• The noise, interference, and fading introduced by the
channel can all be modeled as random processes
• Even the measure of performance (Probability of Bit
Error) is expressed in terms of a probability


Random Events

• When we conduct a random experiment, we can use set notation to describe


possible outcomes
• Examples: Roll a six-sided die
Possible outcomes: S = {1, 2, 3, 4, 5, 6}
• An event is any subset of possible outcomes:

A = {1, 2}


Random Events (continued)

• The complementary event: Ā = S − A = {3, 4, 5, 6}

• The set of all outcomes in the certain event: S
• The null event: ∅
• Transmitting a data bit is also an experiment


Probability

• The probability P(A) is a number which measures the


likelihood of the event A
Axioms of Probability
• No event has probability less than zero: P(A) ≥ 0
• P(A) ≤ 1, and P(A) = 1 ⇔ A = S
• Let A and B be two events such that A ∩ B = ∅
Then: P(A ∪ B) = P(A) + P(B)
• All other laws of probability follow from these axioms


Relationships Between Random Events

• Joint Probability: P(AB) = P(A ∩ B)
- Probability that both A and B occur

• Conditional Probability: P(A | B) = P(AB)/P(B)

- Probability that A will occur given that B has occurred


Relationships Between Random Events

• Statistical Independence:
- Events A and B are statistically independent if:
P ( AB )  P ( A) P ( B )
- If A and B are independence than:
P ( A | B )  P ( A) and P ( B | A)  P ( B )


Random Variables

• A random variable X(s) is a real-valued function of the
underlying event space: s ∈ S
• A random variable may be:
- Discrete valued: range is finite (e.g. {0,1}) or countably
infinite (e.g. {1,2,3,…})
- Continuous valued: range is uncountably infinite (e.g. ℝ)
• A random variable may be described by:
- A name: X
- Its range: X ∈ ℝ
- A description of its distribution


Cumulative Distribution Function

• Definition: F_X(x) = F(x) = P(X ≤ x)
• Properties:
- F_X(x) is monotonically nondecreasing
- F(−∞) = 0
- F(∞) = 1
- P(a < X ≤ b) = F(b) − F(a)
• While the CDF defines the distribution of a random
variable, we will usually work with the pdf or pmf
• In some texts, the CDF is called PDF (Probability
Distribution function)

Probability Density Function

• Definition: p_X(x) = dF_X(x)/dx  or  p(x) = dF(x)/dx
• Interpretations: the pdf measures how fast the CDF is
increasing, or how likely a random variable is to lie around
a particular value
• Properties:

p(x) ≥ 0
∫_{−∞}^{∞} p(x) dx = 1
P(a ≤ X ≤ b) = ∫_a^b p(x) dx


Expected Values

• Expected values are a shorthand way of describing a


random variable
• The most important examples are:

- Mean: E(X) = m_x = ∫_{−∞}^{∞} x p(x) dx

- Variance: E[(X − m_x)²] = ∫_{−∞}^{∞} (x − m_x)² p(x) dx


Probability Mass Functions (pmf)


• A discrete random variable can be described by a pdf if we
allow impulse functions
• We usually use probability mass functions (pmf):
p(x) = P(X = x)
• Properties are analogous to the pdf:
p(x) ≥ 0
Σ_x p(x) = 1
P(a ≤ X ≤ b) = Σ_{x=a}^{b} p(x)


Some Useful Probability Distributions

• Binary Distribution:

p(x) = 1 − p for x = 0;  p(x) = p for x = 1

• This is frequently used for binary data
• Mean: E(X) = p
• Variance: σ_X² = p(1 − p)


Some Useful Probability Distributions (continued)

• Let Y = Σ_{i=1}^{n} X_i, where the X_i, i = 1, …, n, are independent
binary random variables with

p(x) = 1 − p for x = 0;  p(x) = p for x = 1

• Then p_Y(y) = (n choose y)·p^y (1 − p)^{n−y},  y = 0, 1, …, n
• Mean: E(Y) = np
• Variance: σ_Y² = np(1 − p)
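
A quick Monte Carlo check of these moments (the values of n, p, and the
trial count are assumed):

% Sketch: empirical check of the binomial mean np and variance np(1-p).
n = 20; p = 0.3; TRIALS = 1e5;          % assumed values
Y = sum(rand(TRIALS, n) < p, 2);        % each Y is a sum of n Bernoulli(p) trials
fprintf('mean %.3f (np = %.1f), var %.3f (np(1-p) = %.2f)\n', ...
        mean(Y), n*p, var(Y), n*p*(1-p));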

Some Useful Probability Distributions (continued)

• Uniform pdf:

p(x) = 1/(b − a) for a ≤ x ≤ b;  0 otherwise

• It is a continuous random variable

• Mean: E(X) = (a + b)/2

• Variance: σ_X² = (b − a)²/12

Some Useful Probability Distributions (continued)

• Gaussian pdf:

p(x) = (1/√(2πσ²))·exp(−(x − m_x)²/(2σ²))

• A Gaussian random variable is completely determined by
its mean and variance


The Q-function

• The function that is frequently used for the area under the
tail of the Gaussian pdf is denoted by Q(x):

Q(x) = ∫_x^∞ (1/√(2π))·e^{−t²/2} dt,  x ≥ 0

• The Q-function is a standard form for expressing error
probabilities without a closed form


A Communication System with Gaussian noise

S ∈ {±a} → [Transmitter] → R = S + N → [Receiver] → R > 0?

N ~ N(0, σ_n²)

• The probability that the receiver will make an error (given S = a is sent) is

P(R < 0 | S = a) = ∫_{−∞}^{0} (1/√(2πσ_n²))·e^{−(x − a)²/(2σ_n²)} dx = Q(a/σ_n)

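
A Monte Carlo sketch of this error probability, using the identity
Q(x) = ½·erfc(x/√2); the values of a and σ_n are assumed:

% Sketch: simulated error rate vs. Q(a/sigma_n).
a = 1; sigma_n = 0.8; TRIALS = 1e6;     % assumed values
R = a + sigma_n*randn(TRIALS, 1);       % received value when S = +a is sent
p_sim = mean(R < 0);                    % empirical error probability
p_theory = 0.5*erfc((a/sigma_n)/sqrt(2));   % Q(a/sigma_n) via erfc
fprintf('simulated %.5f, Q(a/sigma) = %.5f\n', p_sim, p_theory);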

Random Processes

• A random variable has a single value. However, actual


signals change with time
• Random variables model unknown events
• Random processes model unknown signals
• A random process is just a collection of random variables
• If X(t) is a random process, then X(1), X(1.5), and X(37.5)
are all random variables; fixing any specific time t gives a random variable


Terminology Describing Random Processes

• A stationary random process has statistical properties
which do not change with time
• A wide sense stationary (WSS) process has a mean and
autocorrelation function which do not change with time
• A random process is ergodic if the time average always
converges to the statistical average
• Unless specified, we will assume that all random processes
are WSS and ergodic


Description of Random Processes

• Knowing the pdf of individual samples of the random


process is not sufficient.
- We also need to know how individual samples are
related to each other
• Two tools are available to describe this relationship
- Autocorrelation function
- Power spectral density function


Autocorrelation

• Autocorrelation measures how a random process changes
with time
• Intuitively, X(1) and X(1.1) will be more strongly related than
X(1) and X(100000)
• The autocorrelation function quantifies this
• For a WSS random process,
φ_X(τ) = E[X(t)·X(t + τ)]
• Note that Power = φ_X(0)


Power Spectral Density

• Φ_X(f) tells us how much power is at each frequency
• Wiener-Khinchine Theorem: Φ_X(f) = F{φ_X(τ)}
- Power spectral density and autocorrelation are a
Fourier Transform pair
• Properties of Power Spectral Density:
Φ_X(f) ≥ 0
Φ_X(f) = Φ_X(−f)
Power = ∫_{−∞}^{∞} Φ_X(f) df


Gaussian Random Processes

• Gaussian random processes have some special properties


- If a gaussian random process is wide-sense stationary,
then it is also stationary
- If the input to a linear system is a Gaussian random
process, then the output is also a Gaussian random
process


Linear systems

• Input: x(t )
• Impulse Response: h(t )
• Output: y (t )

x(t ) h(t ) y (t )


Computing the Output of Linear Systems

• Deterministic Signals:
- Time domain: y(t) = h(t) * x(t)

- Frequency domain: Y(f) = F{y(t)} = X(f)·H(f)

• For a random process, we can still relate the statistical
properties of the input and output signal
- Time domain: φ_Y(τ) = φ_X(τ) * h(τ) * h(−τ)

- Frequency domain: Φ_Y(f) = Φ_X(f)·|H(f)|²

Random Signals
Sinusoid of Random Amplitude
X(t) = A cos(2π·4t),  A uniform on {1, 2, 3, 4}

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    % 'unid' draws a discrete uniform amplitude from {1,...,4}
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*4*t);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of discrete random amplitude (PAM)')
Sinusoid of Random Phase
X(t) = cos(2π·4t + Θ),  Θ uniform on [−π, π]

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    % 'unif' draws a continuous uniform phase on [-pi, pi]
    realizations(n,:)=cos(2*pi*4*t+random('unif',-pi,pi,1,1));
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of cont. random phase')
Sinusoid of Random Frequency
X(t) = cos(2πft),  f uniform on {1, 2, 3, 4}

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=cos(2*pi*random('unid',4,1,1)*t);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of discrete random frequency (FSK)')
Sinusoid of Random Amp, Freq, Phase
X(t) = A cos(2πft + Θ),  A uniform on {1,…,4},  f uniform on {1,…,4},  Θ uniform on [−π, π]

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*random('unid',4,1,1)*t ...
        +random('unif',-pi,pi,1,1));
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of cont. random amp, freq, phase')
White Gaussian Random Process

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=randn(1,SIMULATION_LENGTH);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of WGN process')
Noisy Random Sinusoid
X(t) = A cos(2πft + Θ) + N

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*random('unid',4,1,1)*t ...
        +random('unif',-pi,pi,1,1))+0.1*randn(1,SIMULATION_LENGTH);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of noisy random sinusoid')
Poisson Arrival Process

P[Q(t₂) − Q(t₁) = k] = ([λ(t₂ − t₁)]^k / k!)·e^{−λ(t₂ − t₁)},  k = 0, 1, 2, …

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
lambda=0.01;
figure(1);
clf;
for n=1:NUM_REAL
    arrivals=random('poiss',lambda,1,SIMULATION_LENGTH);
    realizations(n,:)=cumsum(arrivals);   % cumulative sum gives the counting process Q(t)
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Poisson arrival process')
Picking a RV from a Random Process

NUM_REAL=10000;
SIMULATION_LENGTH=8;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=randn(1,SIMULATION_LENGTH);
end
x=realizations(:,3);   % fix one time index: a single r.v. across the ensemble
hist(x,30);

A Gaussian RV of mean 0 and std 1


Autocorrelation

R_x(m) = (1/(N − m)) Σ_{n=1}^{N−m} X_n X_{n+m},  m = 0, 1, …, M

function [Rxall]=Rx_est(X,M)
% Estimate the autocorrelation of X at lags -M..M.
N=length(X);
Rx=zeros(1,M+1);
for m=1:M+1
    for n=1:N-m+1
        Rx(m)=Rx(m)+X(n)*X(n+m-1);
    end
    Rx(m)=Rx(m)/(N-m+1);
end
% Assemble the two-sided sequence Rx(-M..M) by even symmetry, so that it
% has 2M+1 points and can be plotted against [-M:M].
Rxall=[Rx(M+1:-1:2), Rx];
Autocorrelation of Gaussian Random Process

N=1000;
X=randn(1,N);

M=50;
Rx=Rx_est(X,M);
plot(X)
title('Gaussian Random Process')
pause
plot([-M:M],Rx)
title('Autocorrelation function')
Autocorrelation of Gauss-Markov Random Process

X[n] = 0.95·X[n−1] + w[n],  w[n] ~ N(0,1),  X[0] = 0

rho=0.95;

X0=0;
N=1000;

Ws=randn(1,N);
X=zeros(1,N);            % preallocate
X(1)=rho*X0+Ws(1);
for i=2:N
    X(i)=rho*X(i-1)+Ws(i);
end

M=50;
Rx=Rx_est(X,M);
plot(X)
title('Gauss-Markov Random Process')
pause
plot([-M:M],Rx)
title('Autocorrelation function')
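
For comparison, the stationary AR(1) theory gives R_X(m) = ρ^|m|/(1 − ρ²)
for unit-variance driving noise; the sketch below (assuming the variables
above are still in the workspace) overlays this on the estimate:

% Sketch: overlay the theoretical Gauss-Markov autocorrelation on the estimate.
m = -M:M;
Rx_theory = (rho.^abs(m))/(1 - rho^2);   % stationary AR(1) autocorrelation
hold on; plot(m, Rx_theory, 'r--'); hold off;
legend('estimated','theoretical');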
Random Processes
Introduction

Professor Ke-Sheng Cheng


Department of Bioenvironmental Systems
Engineering
E-mail: rslab@ntu.edu.tw
Introduction
 A random process is a process (i.e.,
variation in time or one dimensional
space) whose behavior is not completely
predictable and can be characterized by
statistical laws.
 Examples of random processes
 Daily stream flow
 Hourly rainfall of storm events
 Stock index
Random Variable
 A random variable is a mapping
function which assigns outcomes of a
random experiment to real numbers.
Occurrence of the outcome follows
certain probability distribution.
Therefore, a random variable is
completely characterized by its
probability density function (PDF).
• The term “stochastic process” appears mostly in statistical
textbooks; however, the term “random process” is frequently
used in books on many engineering applications.
Characterizations of a
Stochastic Processes
• First-order densities of a random process
A stochastic process is defined to be completely or
totally characterized if the joint densities for the
random variables X(t₁), X(t₂), …, X(tₙ) are known for
all times t₁, t₂, …, tₙ and all n.
In general, a complete characterization is
practically impossible, except in rare cases. As a
result, it is desirable to define and work with
various partial characterizations. Depending on
the objectives of applications, a partial
characterization often suffices to ensure the
desired outputs.
• For a specific t, X(t) is a random variable
with distribution F(x, t) = P[X(t) ≤ x].
• The function F(x, t) is defined as the first-
order distribution of the random variable
X(t). Its derivative with respect to x,

f(x, t) = ∂F(x, t)/∂x,

is the first-order density of X(t).
• If the first-order densities defined for all time
t, i.e. f(x,t), are all the same, then f(x,t) does
not depend on t and we call the resulting
density the first-order density of the random
process {X(t)}; otherwise, we have a family of
first-order densities.
 The first-order densities (or distributions) are
only a partial characterization of the random
process as they do not contain information
that specifies the joint densities of the random
variables defined at two or more different
times.
• Mean and variance of a random process
The first-order density of a random
process, f(x,t), gives the probability density
of the random variables X(t) defined for all
time t. The mean of a random process,
m_X(t), is thus a function of time specified by

m_X(t) = E[X(t)] = E[X_t] = ∫ x_t f(x_t, t) dx_t

• For the case where the mean of X(t) does
not depend on t, we have
m_X(t) = E[X(t)] = m_X (a constant).
• The variance of a random process, also a
function of time, is defined by

σ_X²(t) = E[(X(t) − m_X(t))²] = E[X_t²] − [m_X(t)]²
• Second-order densities of a random process
For any pair of two random variables X(t₁)
and X(t₂), we define the second-order
densities of a random process as f(x₁, x₂; t₁, t₂)
or f(x₁, x₂).
• Nth-order densities of a random process
The nth-order density functions for {X(t)}
at times t₁, t₂, …, tₙ are given by
f(x₁, x₂, …, xₙ; t₁, t₂, …, tₙ) or f(x₁, x₂, …, xₙ).
• Autocorrelation and autocovariance
functions of random processes
Given two random variables X(t₁) and X(t₂),
a measure of linear relationship between
them is specified by E[X(t₁)X(t₂)]. For a
random process, t₁ and t₂ go through all
possible values, and therefore, E[X(t₁)X(t₂)]
can change and is a function of t₁ and t₂. The
autocorrelation function of a random
process is thus defined by

R(t₁, t₂) = E[X(t₁)X(t₂)] = R(t₂, t₁)
• Stationarity of random processes

f(x₁, x₂, …, xₙ; t₁, t₂, …, tₙ) = f(x₁, x₂, …, xₙ; t₁+τ, t₂+τ, …, tₙ+τ)

Strict-sense stationarity seldom holds for random
processes, except for some Gaussian processes.
Therefore, weaker forms of stationarity are
needed:

E[X(t)] = m (constant) for all t.
R(t₁, t₂) = R(t₂ − t₁), for all t₁ and t₂.
Equality and continuity of
random processes
 Equality

 Note that “x(t, i) = y(t, i) for every i”


is not the same as “x(t, i) = y(t, i) with
probability 1”.
 Mean square equality
Lecture on Communication Theory

Chapter 1. Random Processes - Preliminary


P.1 Introduction
1. Deterministic signals: the class of signals that may be modeled as
completely specified functions of time.
2. Random signals: it is not possible to predict its precise value in advance. ex)
thermal noise
3. Random variable: A function whose domain is a sample
space and whose range is some set of real numbers.
– obtained by observing a random process at a fixed
instant of time.
4. Random process: ensemble (family) of sample
functions, ensemble of random variables.
P.2 Probability Theory
1. Random experiment
1) Repeatable under identical conditions
2) Outcome is unpredictable
3) For a large number of trials of the experiment, the outcomes exhibit statistical
regularity, i.e., a definite average pattern of
outcomes is observed for a large number of trials.

2. Relative-Frequency Approach
1) Relative frequency

0 ≤ N_n(A)/n ≤ 1

2) Statistical regularity → Probability of event A:

P(A) = lim_{n→∞} (N_n(A)/n)

3. Axioms of Probability
1) Terminology
a) Sample point s_k: kth outcome of the experiment
b) Sample space S: totality of sample points
c) Sure event: entire sample space S
d) ∅: null or impossible event
e) Elementary event: a single sample point
2) Definition of probability:
a) A sample space S of elementary events
b) A class ℰ of events that are subsets of S
c) A probability measure P(·) assigned to each event A in the class ℰ,
which has the following properties (Axioms):
(i) P(S) = 1
(ii) 0 ≤ P(A) ≤ 1
(iii) If A ∪ B is the union of two mutually
exclusive events in the class ℰ, then P(A ∪ B) = P(A) + P(B)

3) Property 1. P(Ā) = 1 − P(A)
4) Property 2. If M mutually exclusive events A₁, A₂, …, A_M
have the property
A₁ ∪ A₂ ∪ … ∪ A_M = S,
then
P(A₁) + P(A₂) + … + P(A_M) = 1
5) Property 3.
P(A ∪ B) = P(A) + P(B) − P(AB)

4. Conditional Probability
1) Conditional probability of B given A
(given A means that event A has occurred):

P(B|A) = P(AB)/P(A)

where P(AB) = joint probability of A & B

P(AB) = P(B|A)P(A) = P(A|B)P(B)
→ P(B|A) = P(A|B)P(B)/P(A) ;  Bayes' rule

2) Statistically independent: P(AB) = P(A)P(B)
ex1) BSC (Binary Symmetric Channel): a discrete memoryless channel

[Figure: A0 ([0]) → B0 ([0]) with probability 1−p, A0 → B1 with probability p;
A1 ([1]) → B1 ([1]) with probability 1−p, A1 → B0 with probability p]
Priori prob.:
P(A₀) = p₀, P(A₁) = p₁, where p₀ + p₁ = 1

Conditional prob. or likelihood:
P(B₁|A₀) = P(B₀|A₁) = p ;  probability of receiving [1] when [0] was sent
P(B₀|A₀) = P(B₁|A₁) = 1 − p ;  probability of receiving [0] when [0] was sent

Output prob.:
P(B₀) = (1 − p)p₀ + p·p₁
P(B₁) = p·p₀ + (1 − p)p₁

Posteriori prob.:
P(A₀|B₀) = P(B₀|A₀)P(A₀)/P(B₀) = (1 − p)p₀ / [(1 − p)p₀ + p·p₁]
P(A₁|B₁) = P(B₁|A₁)P(A₁)/P(B₁) = (1 − p)p₁ / [p·p₀ + (1 − p)p₁]
P.3 Random variables
1) Random variable: a function whose domain is a sample
space and whose range is some set of real numbers
2) Discrete r.v.: X(k), the kth sample point; ex) a die, range {1,…,6}
Continuous r.v.: X; ex) bus arrival time between 8:00 and 8:10
3) Cumulative distribution function (cdf) or distribution fct.:
F_X(x) = P(X ≤ x)
a) 0 ≤ F_X(x) ≤ 1
b) if x₁ < x₂, F_X(x₁) ≤ F_X(x₂); monotone-nondecreasing fct.
4) pdf (probability density fct.):

f_X(x) = (d/dx) F_X(x)

F_X(x) = ∫_{−∞}^{x} f_X(ξ) dξ

∫_{−∞}^{∞} f_X(x) dx = 1

P(x₁ ≤ X ≤ x₂) = ∫_{x₁}^{x₂} f_X(x) dx

pdf: nonnegative fct., total area = 1

ex2) [figure omitted]
2. Several random variables (2 random variables)
1) Joint distribution fct.: F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)

2) Joint pdf: f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y

3) Total area: ∫∫ f_{X,Y}(ξ, η) dξ dη = 1

F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(ξ, η) dη dξ

f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, η) dη ;  marginal density

4) Conditional prob. density fct. (given that X = fixed x):

If f_X(x) ≠ 0:  f_Y(y|x) = f_{X,Y}(x, y)/f_X(x),  with ∫ f_Y(y|x) dy = 1

If X, Y are statistically independent: f_Y(y|x) = f_Y(y)
Statistically independent ⇔ f_{X,Y}(x, y) = f_X(x)·f_Y(y)

P.4 Statistical Average
1. Mean or expected value
1) Continuous:

μ_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx

ex) X uniform on [0, 10]:  E[X] = ∫₀¹⁰ (x/10) dx = x²/20 |₀¹⁰ = 5
2) Discrete:

E[X] = Σ_k x_k p(k),  where p(k) = lim_{n→∞} N_n(k)/n

ex) a die:  E[X] = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 7/2

2. Function of r.v.
Y = g(X),  X, Y : r.v.

E[Y] = E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx

ex) Y = g(X) = cos(X),
where f_X(x) = 1/(2π) for −π ≤ x ≤ π, 0 otherwise:

E[Y] = ∫_{−π}^{π} cos x · (1/2π) dx = (1/2π) sin x |_{−π}^{π} = 0

3. Moments
1) nth moments:

E[Xⁿ] = ∫_{−∞}^{∞} xⁿ f_X(x) dx

n = 1 → E[X] = μ_X, the mean
n = 2 → E[X²], the mean square value of X

2) Central moments:

E[(X − μ_X)ⁿ] = ∫_{−∞}^{∞} (x − μ_X)ⁿ f_X(x) dx

n = 2 → σ_X² = var[X] = E[(X − μ_X)²],
where σ_X is the standard deviation
Meaning of σ_X²: randomness, effective width of f_X(x).
This can be seen from the Chebyshev inequality:

P(|X − μ_X| ≥ ε) ≤ σ_X²/ε² ;  Chebyshev inequality

σ_X² = E[(X − μ_X)²] = E[X²] − 2μ_X E[X] + μ_X² = E[X²] − μ_X²
If μ_X = 0, σ_X² = E[X²]
σ_X² : variance;  E[X²] : mean square value

4. Characteristic function
Characteristic function φ_X(v) ↔ f_X(x):

φ_X(v) = E[exp(jvX)] = ∫_{−∞}^{∞} f_X(x) exp(jvx) dx

f_X(x) = (1/2π) ∫_{−∞}^{∞} φ_X(v) exp(−jvx) dv

ex4) Gaussian Random Variable:

f_X(x) = (1/(√(2π)·σ_X)) exp(−(x − μ_X)²/(2σ_X²)),  −∞ < x < ∞

φ_X(v) = exp(jvμ_X − v²σ_X²/2)

If μ_X = 0:
f_X(x) = (1/(√(2π)·σ_X)) exp(−x²/(2σ_X²)),  φ_X(v) = exp(−v²σ_X²/2)

central moments:
E[(X − μ_X)ⁿ] = 1·3·5···(n − 1)·σ_Xⁿ for n even;  0 for n odd
5. Joint moments

E[XⁱYʲ] = ∫∫ xⁱ yʲ f_{X,Y}(x, y) dx dy

Correlation:
E[XY] = ∫∫ x y f_{X,Y}(x, y) dx dy

Covariance:
cov[XY] = E[(X − E[X])(Y − E[Y])] = E[XY] − μ_X μ_Y

Correlation coefficient:
ρ = cov[XY] / (σ_X σ_Y)

• X and Y are uncorrelated ⇔ cov[XY] = 0
• X and Y are orthogonal ⇔ E[XY] = 0

If E[X] = 0 or E[Y] = 0:
X, Y uncorrelated ⇒ X, Y orthogonal
X, Y statistically independent ⇒ uncorrelated (converse not true)
P.5 Transformations of Random Variables: Y = g(X)
1. Monotone transformations: one-to-one

f_Y(y) = f_X(x)/|dy/dx| = f_X(x)/|dg/dx|,  evaluated at x = g⁻¹(y)

2. Many-to-one transformations

f_Y(y) = Σ_k f_X(x_k)/|dg/dx|_{x=x_k}

where x_k = solutions of g(x) = y

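
A Monte Carlo sketch of the monotone-transformation rule, using the assumed
example Y = g(X) = X³ with X ~ N(0,1):

% Sketch: compare the histogram of Y = X^3 with f_X(x)/|dg/dx| at x = y^(1/3).
X = randn(1e6, 1);
Y = X.^3;                                     % monotone one-to-one transformation
edges = linspace(-8, 8, 201);
centers = (edges(1:end-1) + edges(2:end))/2;
fY_emp = histcounts(Y, edges, 'Normalization', 'pdf');
x = sign(centers).*abs(centers).^(1/3);       % x = g^{-1}(y)
fY_th = exp(-x.^2/2)/sqrt(2*pi) ./ (3*x.^2);  % f_X(x)/|dy/dx|, dy/dx = 3x^2
plot(centers, fY_emp, '.', centers, fY_th, '-');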


Chapter 1. Random Processes

1.1 Introduction
1. Mathematical modeling of a physical phenomenon
(1) “Deterministic” if there is no uncertainty about its time-dependent behavior
at any instant of time.  ex) f(t) = cos 2πf_c t
(2) “Random” or “stochastic” if it is not possible to predict its precise value in
advance. → the signal is described by statistical parameters such as
average power or power spectral density.

2. Examples of randomness in communication systems
(1) Information-bearing signal: a voice signal consists of randomly spaced
bursts of energy of random duration.
(2) In digital communication, the transmitted waveform takes the form
of a pseudo-random sequence.
(3) Interference component: takes the form of spurious electromagnetic waves.
(4) Thermal noise: caused by the random motion of the electrons in conductors
and devices at the front end of the receiver.

1.2 Mathematical Definition of a Random Process


1. Properties of random process
(1) Random process is a function of time.
(2) They are random, i.e., it is not possible to exactly define the waveforms
in the future, before conducting an experiment.
2. r. v. {X}: outcomes of a random experiment is mapped into a number
3. r. p. {X(t)} or {X(t,s)}: outcomes of a random experiment is mapped
into a waveform that is fct. of time.
 X(t,s), t is time, s is sample; indexed ensemble(family) of r.v.
 Sample Space or Ensemble S, s1, s2, … , sn are sample points
 Sample function
xj(t) = X(t,sj)  {x1(t),x2(t),,xn(t)} ; sample space
 {x1(tk),x2(tk),xn(tk)} = {X(tk,s1),X(tk,s2)X(tk,sn)}
 constitutes a random variable
 r. p. 의 예 ) X(t) = A cos (2fct+), Random Binary Wave,
Gaussian noise


FIGURE 1.1 An ensemble of sample functions.


1.3 Stationary Processes

1. Stationary: the statistical characterization of a process is
independent of time
2. r.p. X(t) is stationary in the strict sense, or strictly stationary, if

F_{X(t₁+τ),…,X(t_k+τ)}(x₁, …, x_k) = F_{X(t₁),…,X(t_k)}(x₁, …, x_k)

for all time shifts τ, all k, and all possible t₁, …, t_k.

< Observation >
1) k = 1: F_{X(t)}(x) = F_{X(t+τ)}(x) = F_X(x) for all t & τ.
The 1st-order distribution fct. of a stationary r.p. is independent
of time.
2) k = 2 & τ = −t₁:  F_{X(t₁),X(t₂)}(x₁, x₂) = F_{X(0),X(t₂−t₁)}(x₁, x₂) for all t₁ & t₂.
The 2nd-order distribution fct. of a stationary r.p. depends only
on the difference between the observation times.
3. Two r.p. X(t), Y(t) are jointly stationary if the joint distribution functions
of the r.v. X(t₁),…,X(t_k) and Y(t₁′),…,Y(t_j′) are invariant with respect to the
location of the origin t = 0 for all k and j, and all choices of observation
times t₁,…,t_k and t₁′,…,t_j′.

Ex 1.1)

[Figure 1.2: a possible sample function; at times t₁, t₂, t₃ the amplitude
windows (a₁, b₁], (a₂, b₂], (a₃, b₃] are marked.]

FIGURE 1.2 Illustrating the probability of a joint event.
Probability of the joint event
A = {aᵢ < X(tᵢ) ≤ bᵢ},  i = 1, 2, 3:
P(A) = F_{X(t₁),X(t₂),X(t₃)}(b₁, b₂, b₃) − F_{X(t₁),X(t₂),X(t₃)}(a₁, a₂, a₃)

FIGURE 1.3 Illustrating the concept of stationarity in Example 1.1
(if the probabilities in case (a) and case (b) are equal, the process is stationary)

1.4 Mean, Correlation, and Covariance Functions

1. Mean of r.p.:  μ_X(t) = E[X(t)] = ∫ x f_{X(t)}(x) dx

• For a stationary r.p.: μ_X(t) = μ_X, constant for all t

2. Autocorrelation fct. of r.p. X(t):

R_X(t₁, t₂) = E[X(t₁)X(t₂)] = ∫∫ x₁ x₂ f_{X(t₁),X(t₂)}(x₁, x₂) dx₁ dx₂

• For a stationary r.p.: R_X(t₁, t₂) = R_X(t₂ − t₁)
3. Autocovariance fct. of a stationary r.p. X(t):
C_X(t₁, t₂) = E[(X(t₁) − μ_X)(X(t₂) − μ_X)] = R_X(t₂ − t₁) − μ_X²

4. Wide-sense stationary (second-order stationary, weakly stationary):
μ_X(t) = μ_X = constant for all t
R_X(t₁, t₂) = R_X(t₂ − t₁) for all t₁ and t₂

strict-sense stationary ⇒ wide-sense stationary (converse not true)

5. Properties of the Autocorrelation Function
(1) Autocorrelation fct. of a stationary process X(t):
R_X(τ) = E[X(t + τ)X(t)] for all t
(2) Properties
a) Mean-square value: by setting τ = 0, R_X(0) = E[X²(t)]
b) R_X(τ) is an even fct.: R_X(τ) = R_X(−τ)
c) R_X(τ) has its maximum at τ = 0: |R_X(τ)| ≤ R_X(0)

pf. of c):
E[(X(t + τ) ± X(t))²] ≥ 0
E[X²(t + τ)] ± 2E[X(t + τ)X(t)] + E[X²(t)] ≥ 0
2R_X(0) ± 2R_X(τ) ≥ 0
→ −R_X(0) ≤ R_X(τ) ≤ R_X(0)
(3) Physical meaning of R_X(τ)

FIGURE 1.4 Illustrating the autocorrelation functions of slowly and rapidly
fluctuating random processes.

• “Interdependence” of X(t) and X(t + τ)
• Decorrelation time τ₀: for τ > τ₀, R_X(τ) < 0.01·R_X(0)

Ex 1.2) Sinusoidal wave with random phase:
X(t) = A cos(2πf_c t + Θ),
where f_Θ(θ) = 1/(2π) for −π ≤ θ ≤ π, 0 otherwise

R_X(τ) = E[X(t + τ)X(t)]
       = E[A² cos(2πf_c t + 2πf_c τ + Θ)·cos(2πf_c t + Θ)]
       = (A²/2) cos(2πf_c τ)

FIGURE 1.5 Autocorrelation function of a sine wave with random phase.
Ex 1.3) Random Binary Wave
P(+A) = P(−A) = 1/2
f_{T_d}(t_d) = 1/T for 0 ≤ t_d ≤ T, 0 otherwise
→ E[X(t)] = 0

FIGURE 1.6 Sample function of random binary wave.

R_X(0) = E[X(t)X(t)] = A²
R_X(T) = E[X(t)X(t + T)] = 0

FIGURE 1.7 Autocorrelation function of random binary wave.
6. Cross-correlation Functions
• r.p. X(t) with autocorrelation R_X(t, u)
• r.p. Y(t) with autocorrelation R_Y(t, u)
• Cross-correlation fcts. of X(t) and Y(t):
R_XY(t, u) = E[X(t)Y(u)]
R_YX(t, u) = E[Y(t)X(u)]
• Correlation matrix of r.p. X(t) and Y(t):

R(t, u) = [ R_X(t, u)   R_XY(t, u)
            R_YX(t, u)  R_Y(t, u) ]

• If X(t) and Y(t) are each w.s.s. and jointly w.s.s.:

R(τ) = [ R_X(τ)   R_XY(τ)
         R_YX(τ)  R_Y(τ) ],  where τ = t − u

Here R_XY(τ) ≠ R_XY(−τ), i.e. not an even fct.;
R_XY(0) is not necessarily the maximum;
R_XY(τ) = R_YX(−τ)
Ex 1.4) Quadrature-Modulated Processes
X₁(t) and X₂(t) from a w.s.s. r.p. X(t):
X₁(t) = X(t) cos(2πf_c t + Θ)
X₂(t) = X(t) sin(2πf_c t + Θ)
where f_Θ(θ) = 1/(2π) for 0 ≤ θ ≤ 2π, and Θ is independent of X(t)

Cross-correlation fct.:
R₁₂(τ) = E[X₁(t)X₂(t − τ)]
       = E[X(t)X(t − τ)]·E[cos(2πf_c t + Θ)sin(2πf_c t − 2πf_c τ + Θ)]
       = −(1/2)·R_X(τ)·sin(2πf_c τ)

R₁₂(0) = E[X₁(t)X₂(t)] = 0 → orthogonal

1.5 Ergodic Processes
1. Ensemble average and time average
(1) Expectation or ensemble average of r.p. X(t)
→ average “across the process”
(2) Time average or long-term sample average
→ average “along the process”
(3) For a sample function x(t) of a w.s.s. r.p. X(t) observed over −T ≤ t ≤ T:
(a) Time average (dc value):

μ_x(T) = (1/2T) ∫_{−T}^{T} x(t) dt
(b) Mean of the time average μ_x(T):

E[μ_x(T)] = (1/2T) ∫_{−T}^{T} E[x(t)] dt = (1/2T) ∫_{−T}^{T} μ_X dt = μ_X

Thus μ_x(T) is an unbiased estimate of the ensemble-averaged mean μ_X
(the mean of r.p. X(t)).

1. A w.s.s. r.p. X(t) is ergodic in the mean if
lim_{T→∞} μ_x(T) = μ_X  and  lim_{T→∞} var[μ_x(T)] = 0

2. A w.s.s. r.p. X(t) is ergodic in the autocorrelation fct. if
lim_{T→∞} R_x(τ, T) = R_X(τ)  and  lim_{T→∞} var[R_x(τ, T)] = 0

where R_x(τ, T) = (1/2T) ∫_{−T}^{T} x(t + τ)x(t) dt
= the time-averaged autocorrelation fct.
of the sample fct. x(t) from the w.s.s. r.p. X(t)

1.6 Transmission of a r.p. through a linear filter

FIGURE 1.8 Transmission of a random process through a linear time-invariant
filter: w.s.s. r.p. X(t) → h(t) → w.s.s. r.p. Y(t)
1. Mean of Y(t):

μ_Y(t) = E[Y(t)] = E[∫_{−∞}^{∞} h(τ₁)X(t − τ₁) dτ₁]
       = ∫ h(τ₁) E[X(t − τ₁)] dτ₁
       = ∫ h(τ₁) μ_X(t − τ₁) dτ₁
       = μ_X ∫ h(τ₁) dτ₁   (X(t) w.s.s.)
→ μ_Y = μ_X H(0);  X(t), Y(t) are w.s.s.

2. Autocorrelation fct.:

R_Y(t, u) = E[Y(t)Y(u)]
          = E[∫ h(τ₁)X(t − τ₁) dτ₁ ∫ h(τ₂)X(u − τ₂) dτ₂]
          = ∫ dτ₁ h(τ₁) ∫ dτ₂ h(τ₂) R_X(t − τ₁, u − τ₂)
          = ∫ dτ₁ h(τ₁) ∫ dτ₂ h(τ₂) R_X(τ − τ₁ + τ₂)

where τ = t − u  (X(t) w.s.s.)
→ Y(t) is also w.s.s.

Mean square value: E[Y²(t)] = R_Y(0)

E[Y²(t)] = ∫∫ h(τ₁)h(τ₂) R_X(τ₂ − τ₁) dτ₁ dτ₂ = constant
1.7 Power Spectral Density
1. Express the mean square value of Y(t) in terms of the p.s.d.

Power spectral density or power spectrum of a w.s.s. r.p. X(t):

S_X(f) = ∫_{−∞}^{∞} R_X(τ) exp(−j2πfτ) dτ  [watts/Hz]

Mean square value of Y(t):

E[Y²(t)] = ∫∫ [∫ H(f) exp(j2πfτ₁) df] h(τ₂) R_X(τ₂ − τ₁) dτ₁ dτ₂
         = ∫ df H(f) ∫ dτ₂ h(τ₂) ∫ R_X(τ₂ − τ₁) exp(j2πfτ₁) dτ₁   (let τ = τ₂ − τ₁)
         = ∫ df H(f) ∫ dτ₂ h(τ₂) exp(j2πfτ₂) ∫ R_X(τ) exp(−j2πfτ) dτ
         = ∫_{−∞}^{∞} |H(f)|² S_X(f) df

FIGURE 1.9 Magnitude response of ideal narrowband filter.

E[Y²(t)] ≈ (2Δf)·S_X(f_c)

where S_X(f_c) is the frequency density of the average power in the r.p. X(t)
2. Properties of the Power Spectral Density
1) Einstein-Wiener-Khintchine relations:

S_X(f) = ∫_{−∞}^{∞} R_X(τ) exp(−j2πfτ) dτ
R_X(τ) = ∫_{−∞}^{∞} S_X(f) exp(j2πfτ) df

where X(t) is a w.s.s. r.p.
2) Property 1. For a w.s.s. r.p.:  S_X(0) = ∫_{−∞}^{∞} R_X(τ) dτ
3) Property 2. Mean square value of a w.s.s. r.p.:
E[X²(t)] = R_X(0) = ∫_{−∞}^{∞} S_X(f) df
4) Property 3. For a w.s.s. r.p., S_X(f) ≥ 0 for all f.
5) Property 4. S_X(−f) = S_X(f): even fct.
(since R_X(−τ) = R_X(τ))
6) Property 5. The p.s.d., appropriately normalized, has the properties
usually associated with a probability density fct.:

p_X(f) = S_X(f) / ∫_{−∞}^{∞} S_X(f) df

7) rms bandwidth of a w.s.s. r.p. X(t):

W_rms = (∫_{−∞}^{∞} f² p_X(f) df)^{1/2}
Ex 1.5) Sinusoidal wave with random phase:
r.p. X(t) = A cos(2πf_c t + Θ),
where Θ is a uniform r.v. over [−π, π]

R_X(τ) = (A²/2) cos(2πf_c τ)

→ S_X(f) = (A²/4)[δ(f − f_c) + δ(f + f_c)]

FIGURE 1.10 Power spectral density of sine wave with random phase;
δ(f) denotes the delta function at f = 0.
Ex 1.6) Random binary wave with +A & −A:

R_X(τ) = A²(1 − |τ|/T) for |τ| < T;  0 for |τ| ≥ T

S_X(f) = ∫_{−T}^{T} A²(1 − |τ|/T) exp(−j2πfτ) dτ = A²T sinc²(fT)

FIGURE 1.11 Power spectral density of random binary wave.

Energy spectral density of a rectangular pulse g(t):
E_g(f) = A²T² sinc²(fT)

→ S_X(f) = E_g(f)/T
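
A short sketch plotting S_X(f) = A²T·sinc²(fT); the amplitude and bit duration
are assumed values, and sinc is written out so no toolbox is required:

% Sketch: power spectral density of the random binary wave.
A = 1; T = 1e-3;                        % amplitude and bit duration (assumed)
f = linspace(-4/T, 4/T, 2001);
s = ones(size(f));                      % sinc(fT) = sin(pi*f*T)/(pi*f*T)
nz = f ~= 0;
s(nz) = sin(pi*f(nz)*T)./(pi*f(nz)*T);
Sx = A^2*T*s.^2;
plot(f*T, Sx/(A^2*T)); xlabel('fT'); ylabel('S_X(f)/(A^2T)');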
Ex 1.7) Mixing of a r.p. with a sinusoidal process:
Y(t) = X(t) cos(2πf_c t + Θ),
where X(t) is a w.s.s. r.p. and Θ is a r.v. independent of X(t)

R_Y(τ) = (1/2) R_X(τ) cos(2πf_c τ)

S_Y(f) = (1/4)[S_X(f − f_c) + S_X(f + f_c)]

3. Relation among the Power Spectral Densities of the Input
and Output Random Processes:

S_Y(f) = ∫ R_Y(τ) exp(−j2πfτ) dτ
       = ∫∫∫ h(τ₁)h(τ₂) R_X(τ − τ₁ + τ₂) exp(−j2πfτ) dτ₁ dτ₂ dτ

(let τ − τ₁ + τ₂ = τ₀, i.e. τ = τ₀ + τ₁ − τ₂)

S_Y(f) = H(f)·H*(f)·S_X(f)
→ S_Y(f) = |H(f)|² S_X(f)
ex) Comb filter

Figure. Comb filter. (a) Block diagram. (b) Frequency response.

H(f) = 1 − exp(−j2πfT)
     = 1 − cos(2πfT) + j sin(2πfT)

|H(f)|² = [1 − cos(2πfT)]² + sin²(2πfT)
        = 2[1 − cos(2πfT)]
        = 4 sin²(πfT)

S_Y(f) = 4 sin²(πfT)·S_X(f)

For small f, i.e. πfT ≪ 1:  sin(πfT) ≈ πfT
→ S_Y(f) ≈ 4π²f²T²·S_X(f)

→ differentiator
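
A short MATLAB check of the comb-filter magnitude response (the delay T is
an assumed value):

% Sketch: |H(f)|^2 = 4*sin^2(pi*f*T) for H(f) = 1 - exp(-j*2*pi*f*T).
T = 1;                                   % delay (assumed)
f = linspace(0, 3/T, 1000);
H = 1 - exp(-1j*2*pi*f*T);
plot(f*T, abs(H).^2, f*T, 4*sin(pi*f*T).^2, '--');
legend('|H(f)|^2','4 sin^2(\pi f T)'); xlabel('fT');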
4. Relation between the Power Spectral Density and the
Amplitude Spectrum of a Sample Function

Sample fct. x(t) of a w.s.s. & ergodic r.p. X(t) with S_X(f);
X(f, T): Fourier transform of the truncated sample fct. x(t):

X(f, T) = ∫_{−T}^{T} x(t) exp(−j2πft) dt

Obtain R_X(τ) using the time-average formula:

R_X(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t + τ)x(t) dt

→ S_X(f) = lim_{T→∞} (1/2T) E[|X(f, T)|²]
         = lim_{T→∞} (1/2T) E[|∫_{−T}^{T} x(t) exp(−j2πft) dt|²]

5. Cross Spectral Density
A measure of the frequency interrelationship between 2 random processes:

R_XY(τ) ↔ S_XY(f)
R_YX(τ) ↔ S_YX(f)

R_XY(τ) = R_YX(−τ)  →  S_XY(f) = S_YX(−f) = S*_YX(f)
Ex 1.8)
– X(t) and Y(t) are zero-mean, w.s.s. r.p.
– Consider Z(t) = X(t) + Y(t)
– Autocorrelation of Z(t):

R_Z(t, u) = E[Z(t)Z(u)]
          = R_X(t, u) + R_XY(t, u) + R_YX(t, u) + R_Y(t, u)

(let τ = t − u)
R_Z(τ) = R_X(τ) + R_XY(τ) + R_YX(τ) + R_Y(τ)

S_Z(f) = S_X(f) + S_XY(f) + S_YX(f) + S_Y(f)

When X(t) and Y(t) are uncorrelated:
S_Z(f) = S_X(f) + S_Y(f)

Ex 1.9)
X(t) → h₁(t) → V(t);  Y(t) → h₂(t) → Z(t)
X(t), Y(t): jointly w.s.s. r.p.;
h₁, h₂ are stable, linear, time-invariant filters

Cross-correlation fct. of V(t) and Z(t):

R_VZ(τ) = ∫∫ h₁(τ₁)h₂(τ₂) R_XY(τ − τ₁ + τ₂) dτ₁ dτ₂

S_VZ(f) = H₁(f)·H₂*(f)·S_XY(f)
Table 1.2 Graphical Summary of Autocorrelation Functions and Power Spectral Densities
of Random Processes of Zero Mean and Unit Variance

1.8 Gaussian Process
1. Definition
A process X(t) is a Gaussian process if every linear functional
of X(t) is a Gaussian r.v.:

Y = ∫₀ᵀ g(t)X(t) dt,  g(t): some fct., Y: r.v.

If the r.v. Y is a Gaussian-distributed r.v. for every g(t), then
X(t) is a Gaussian process, where

f_Y(y) = (1/(√(2π)·σ_Y)) exp(−(y − μ_Y)²/(2σ_Y²))

Normalized (μ_Y = 0, σ_Y = 1) Gaussian distribution N(0,1):

f_Y(y) = (1/√(2π)) exp(−y²/2)

FIGURE 1.13 Normalized Gaussian distribution.
2. Virtues of the Gaussian process
1) A Gaussian process has many properties that make analytic
results possible.
2) Random processes produced by physical phenomena are
often such that a Gaussian model is appropriate.

3. Central Limit Theorem
1) Let Xᵢ, i = 1, 2, …, N be a set of r.v. that satisfies:
a) the Xᵢ are statistically independent
b) the Xᵢ have the same p.d.f. with mean μ_X and variance σ_X²
→ Xᵢ: a set of independently and identically distributed (i.i.d.) r.vs.

Now define the normalized r.v.
Yᵢ = (Xᵢ − μ_X)/σ_X,  i = 1, 2, …, N
→ E[Yᵢ] = 0, var[Yᵢ] = 1

and the r.v.  V_N = (1/√N) Σᵢ₌₁ᴺ Yᵢ

< Central limit theorem >
The probability distribution of V_N approaches the normalized Gaussian
distribution N(0,1) in the limit as N approaches infinity.
(When many normalized r.v. are combined into a single r.v.,
the result approaches N(0,1).)
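
A MATLAB sketch of the theorem using i.i.d. uniform variables (N and the
trial count are assumed values):

% Sketch: V_N = (1/sqrt(N)) * sum of normalized i.i.d. uniforms -> N(0,1).
N = 30; TRIALS = 1e5;                    % assumed values
U = rand(TRIALS, N);                     % uniform(0,1): mean 1/2, variance 1/12
Y = (U - 0.5)/sqrt(1/12);                % normalized: zero mean, unit variance
V = sum(Y, 2)/sqrt(N);                   % V_N
histogram(V, 'Normalization', 'pdf'); hold on;
v = linspace(-4, 4, 200);
plot(v, exp(-v.^2/2)/sqrt(2*pi)); hold off;   % N(0,1) density for comparison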



4. Properties of Gaussian Process
1) Property 1.
Gaussian p. X(t) → [h(t), stable, linear] → Gaussian p. Y(t)
If a Gaussian process X(t) is applied to a stable linear filter, then the random process
Y(t) developed at the output of the filter is also Gaussian.
2) Property 2.
Consider the set of r.v. (samples) X(t₁), X(t₂), …, X(tₙ) obtained by observing a r.p.
X(t) at times t₁, t₂, …, tₙ.
If the process X(t) is Gaussian, then this set of r.vs. is jointly Gaussian for any n, with
the n-fold joint p.d.f. completely determined by specifying the set of means
μ_{X(tᵢ)} = E[X(tᵢ)], i = 1, 2, …, n,
and the set of autocovariance functions
C_X(t_k, tᵢ) = E[(X(t_k) − μ_{X(t_k)})(X(tᵢ) − μ_{X(tᵢ)})].
3) Property 3.
If a Gaussian process is stationary, then the process is also
strictly stationary.
4) Property 4.
If the random variables X(t₁), X(t₂), …, X(tₙ), obtained by sampling a Gaussian process
X(t) at times t₁, t₂, …, tₙ, are uncorrelated, i.e.

E[(X(t_k) − μ_{X(t_k)})(X(tᵢ) − μ_{X(tᵢ)})] = 0,  i ≠ k,

then these random variables are statistically independent.
