Stochastic Process
Ashok N Shinde
ashok.shinde0349@gmail.com
International Institute of Information Technology
Hinjawadi, Pune
Signals
Deterministic: can be reproduced exactly with repeated measurements.
s(t) = A cos(2πfc t + θ)
where A, fc, and θ are constants.
Random: a signal that is not repeatable in a predictable manner.
s(t) = A cos(2πfc t + θ)
where A, fc, and θ are variable.
Unwanted signals: noise.
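A minimal Python sketch of the contrast above (the amplitude, frequency, and phase values are illustrative assumptions, not from the slides): the deterministic signal is identical on every "measurement", while the random one draws a new phase each time.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
A, fc = 1.0, 5.0                      # illustrative amplitude and frequency (Hz)

# Deterministic: fixed parameters, so repeated measurements reproduce it exactly.
theta = np.pi / 4
s_det = A * np.cos(2 * np.pi * fc * t + theta)

# Random: the phase is drawn anew for each measurement, so the waveform
# is not repeatable in a predictable manner.
theta_rand = rng.uniform(0.0, 2 * np.pi)
s_rand = A * np.cos(2 * np.pi * fc * t + theta_rand)
```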
Definition:
A stochastic process is a set of random variables indexed in time.
Mathematically:
The relationship between probability theory and stochastic processes is as follows:
Sample point → Sample function
Sample space → Ensemble
Random variable → Random process
Each sample point s maps to a function of time:
X(s, t), −T ≤ t ≤ T
The sample function is denoted as:
xj(t) = X(t, sj), −T ≤ t ≤ T
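A minimal sketch of this mapping (the random-phase model and sizes are illustrative assumptions): each random draw plays the role of a sample point sj and produces an entire sample function xj(t); the stack of all such functions is the ensemble.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1.0
t = np.linspace(-T, T, 500)            # observation interval -T <= t <= T
n_sample_points = 8                     # draws s_1, ..., s_n from the sample space

# Each row is one realization x_j(t) = X(t, s_j); the 8 x 500 array is the ensemble.
phases = rng.uniform(0.0, 2 * np.pi, size=n_sample_points)
ensemble = np.cos(2 * np.pi * 5.0 * t + phases[:, None])

# Fixing a time t_k and reading down a column gives the random variable X(t_k).
X_tk = ensemble[:, 250]
```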
Stochastic Process
Each sample point in S is associated with a sample function x(t).
X(t, s) is a random process:
an ensemble of all time functions together with a probability rule.
X(tk, sj) is a realization or sample function of the random process:
{x1(tk), x2(tk), . . . , xn(tk)} = {X(tk, s1), X(tk, s2), . . . , X(tk, sn)}
The probability rule assigns a probability to any meaningful event associated with an observation. An observation is a sample function of the random process.
Stationary Process:
If a process divided into a number of time intervals exhibits the same statistical properties in each interval, it is called stationary.
It arises from a stable phenomenon that has evolved into a steady-state mode of behavior.
Non-Stationary Process:
If a process divided into a number of time intervals exhibits different statistical properties across the intervals, it is called non-stationary.
It arises from an unstable phenomenon.
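A minimal sketch of the interval test just described (the two example processes are illustrative assumptions): split each record into intervals and compare a per-interval statistic.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4000
stationary = rng.normal(0.0, 1.0, n)                             # stable phenomenon
nonstationary = rng.normal(0.0, 1.0, n) + 0.002 * np.arange(n)   # drifting mean

for name, x in [("stationary", stationary), ("non-stationary", nonstationary)]:
    interval_means = [seg.mean() for seg in np.split(x, 4)]      # 4 time intervals
    print(name, np.round(interval_means, 2))
# The stationary record shows similar means in every interval;
# the non-stationary one does not.
```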
Mean:
The mean of a real-valued stochastic process X(t) is the expectation of the random variable obtained by sampling the process at some time t:
µX(t) = E[X(t)]
µX(t) = ∫_{−∞}^{+∞} x fX(t)(x) dx
where fX(t)(x) is the first-order probability density function of the process X(t).
µX(t) = µX for a weakly stationary process.
Correlation:
The autocorrelation function of the stochastic process X(t) is the expectation of the product of two random variables, X(t1) and X(t2):
MXX(t1, t2) = E[X(t1)X(t2)]
MXX(t1, t2) = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} x1 x2 fX(t1),X(t2)(x1, x2) dx1 dx2
where fX(t1),X(t2)(x1, x2) is the joint probability density function of the process X(t) sampled at times t1 and t2. MXX(t1, t2) is a second-order moment. If it depends only on the time difference t2 − t1, the process X(t) satisfies the second condition of weak stationarity, and the moment reduces to
MXX(t1, t2) = E[X(t1)X(t2)] = RXX(t2 − t1)
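Since the mean and autocorrelation are ensemble expectations at fixed sampling instants, a minimal NumPy sketch (the random-phase model and sizes are illustrative assumptions) estimates both by averaging across realizations:

```python
import numpy as np

rng = np.random.default_rng(3)
n_real, n_time = 5000, 200
phases = rng.uniform(0.0, 2 * np.pi, size=n_real)
t = np.arange(n_time) / 100.0
X = np.cos(2 * np.pi * 5.0 * t + phases[:, None])  # ensemble: one row per realization

mu_hat = X.mean(axis=0)                  # estimate of mu_X(t) at each t (here ~0)
t1, t2 = 30, 80                          # two sampling instants (array indices)
M_hat = np.mean(X[:, t1] * X[:, t2])     # estimate of M_XX(t1, t2) = E[X(t1) X(t2)]
```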
Covariance:
The autocovariance function of a weakly stationary process X(t) is defined by
CXX(t1, t2) = E[(X(t1) − µX)(X(t2) − µX)]
CXX(t1, t2) = RXX(t2 − t1) − µX²
Like the autocorrelation, the autocovariance function of a weakly stationary process X(t) depends only on the time difference (t2 − t1).
The mean and autocorrelation function provide only a weak (partial) description of the distribution of the stochastic process X(t).
Examples
Show that the random process X(t) = A cos(ωc t + Θ) is a wide-sense stationary process, where Θ is a RV uniformly distributed in the range (0, 2π).
Answer:
The ensemble consists of sinusoids of constant amplitude A and constant frequency ωc, but the phase Θ is random.
The phase is equally likely to take any value in the range (0, 2π).
Θ is a RV uniformly distributed over the range (0, 2π):
fΘ(θ) = 1/(2π), 0 ≤ θ ≤ 2π
      = 0, elsewhere
Answer:
Because cos(ωc t + Θ) is a function of the RV Θ, the mean of the random process X(t) is
E[X(t)] = E[A cos(ωc t + Θ)]
        = A E[cos(ωc t + Θ)]
E[cos(ωc t + Θ)] = ∫_0^{2π} cos(ωc t + θ) fΘ(θ) dθ
                 = (1/2π) ∫_0^{2π} cos(ωc t + θ) dθ
                 = 0
Hence E[X(t)] = 0.
Thus the ensemble mean of the sample-function amplitude at any time instant t is zero.
The autocorrelation function RXX(t1, t2) for this process can also be determined as
RXX(t1, t2) = E[X(t1)X(t2)]
            = E[A² cos(ωc t1 + Θ) cos(ωc t2 + Θ)]
Answer:
Continued. . .
RXX(t1, t2) = E[A² cos(ωc t1 + Θ) cos(ωc t2 + Θ)]
            = (A²/2) E[cos(ωc(t2 − t1)) + cos(ωc(t2 + t1) + 2Θ)]
The term cos(ωc(t2 − t1)) does not contain the RV Θ. Hence,
E[cos(ωc(t2 − t1))] = cos(ωc(t2 − t1))
The term cos(ωc(t2 + t1) + 2Θ) is a function of the RV Θ, and its expectation is
E[cos(ωc(t2 + t1) + 2Θ)] = (1/2π) ∫_0^{2π} cos(ωc(t2 + t1) + 2θ) dθ
                         = 0
Therefore,
RXX(t1, t2) = (A²/2) cos(ωc(t2 − t1))
RXX(τ) = (A²/2) cos(ωc τ), where τ = t2 − t1
Since E[X(t)] = 0 is constant and RXX(τ) = (A²/2) cos(ωc τ) depends only on τ, it is clear that X(t) is a wide-sense stationary process.
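A minimal Monte Carlo check of the result just derived (the values of A, ωc, t1, and τ are illustrative assumptions): the estimated mean is ≈ 0 and the estimated autocorrelation matches (A²/2) cos(ωc τ) regardless of the absolute time t1.

```python
import numpy as np

rng = np.random.default_rng(4)
A, wc = 2.0, 2 * np.pi * 3.0
theta = rng.uniform(0.0, 2 * np.pi, size=100_000)  # ensemble of phase draws

t1, tau = 0.7, 0.1                       # arbitrary time instant and lag
x1 = A * np.cos(wc * t1 + theta)
x2 = A * np.cos(wc * (t1 + tau) + theta)

print(x1.mean())                         # ~ 0, the ensemble mean
print(np.mean(x1 * x2))                  # ~ (A**2 / 2) * cos(wc * tau)
print((A**2 / 2) * np.cos(wc * tau))     # theoretical R_XX(tau)
```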
Ensemble Average
It is difficult to generate a large number of realizations of a random process.
Instead, use time averages:
Mean ⇒ µX(T) = (1/2T) ∫_{−T}^{+T} x(t) dt
Autocorrelation ⇒ RXX(τ, T) = (1/2T) ∫_{−T}^{+T} x(t) x(t + τ) dt
Ergodic Process
Ergodicity: A random process is called ergodic if
it is ergodic in the mean:
lim_{T→∞} µX(T) = µX
lim_{T→∞} var[µX(T)] = 0
it is ergodic in autocorrelation:
lim_{T→∞} RXX(τ, T) = RXX(τ)
lim_{T→∞} var[RXX(τ, T)] = 0
where µX and RXX(τ) are the ensemble averages of the same random process.
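A minimal sketch of ergodicity in the mean (the frequency and window lengths are illustrative assumptions): for one fixed realization of the random-phase sinusoid, the time average µX(T) approaches the ensemble mean µX = 0 as the window 2T grows.

```python
import numpy as np

rng = np.random.default_rng(5)
fc = 3.0
theta = rng.uniform(0.0, 2 * np.pi)     # one fixed realization (sample point)

for T in [0.4, 4.0, 40.0]:
    t = np.linspace(-T, T, int(4000 * T))
    x = np.cos(2 * np.pi * fc * t + theta)
    mu_T = x.mean()                     # discrete proxy for (1/2T) * integral of x(t)
    print(T, mu_T)                      # magnitude shrinks as T grows -> 0
```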
RYY(τ) = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} h(τ1) h(τ2) RXX(τ + τ1 − τ2) dτ1 dτ2
"If the input to a stable linear time-invariant filter is a weakly stationary process, then the output of the filter is also a weakly stationary process."
The mean-square value of the output process Y(t) is obtained by putting τ = 0. We have RYY(0) = E[Y²(t)]:
E[Y²(t)] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} h(τ1) h(τ2) RXX(τ1 − τ2) dτ1 dτ2
Substituting h(τ1) = ∫_{−∞}^{+∞} H(f) exp(j2πf τ1) df and letting τ = τ1 − τ2:
E[Y²(t)] = ∫_{−∞}^{+∞} H(f) [∫_{−∞}^{+∞} h(τ2) exp(j2πf τ2) dτ2] [∫_{−∞}^{+∞} RXX(τ) exp(−j2πf τ) dτ] df
         = ∫_{−∞}^{+∞} |H(f)|² [∫_{−∞}^{+∞} RXX(τ) exp(−j2πf τ) dτ] df
since ∫_{−∞}^{+∞} h(τ2) exp(j2πf τ2) dτ2 = H*(f) for a real impulse response h(t).
Defining the power spectral density
SXX(f) = ∫_{−∞}^{+∞} RXX(τ) exp(−j2πf τ) dτ
we obtain
E[Y²(t)] = ∫_{−∞}^{+∞} |H(f)|² SXX(f) df
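A minimal numerical sketch of E[Y²(t)] = ∫ |H(f)|² SXX(f) df, with an illustrative white input SXX(f) = N0/2 and an assumed first-order RC low-pass H(f) = 1/(1 + j2πfRC), for which the integral has the known closed form N0/(4RC):

```python
import numpy as np

N0, RC = 2.0, 1e-3
f = np.linspace(-50e3, 50e3, 200_001)          # frequency grid (wide enough for H)
H2 = 1.0 / (1.0 + (2 * np.pi * f * RC) ** 2)   # |H(f)|^2 for the RC low-pass

Sxx = (N0 / 2) * np.ones_like(f)               # white-noise PSD, N0/2 for all f
EY2_numeric = np.sum(H2 * Sxx) * (f[1] - f[0]) # Riemann sum of |H|^2 * S_XX
EY2_closed = N0 / (4 * RC)                     # closed form for this filter
print(EY2_numeric, EY2_closed)                 # numeric value is close to 500.0
```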
Properties of PSD
4 Nonnegativity of the Power Spectral Density
The power spectral density of a stationary process X(t) is always nonnegative:
SXX(f) ≥ 0
5 Symmetry
The power spectral density of a real-valued weakly stationary process is an even function of frequency:
SXX(−f) = SXX(f)
6 Normalization
The power spectral density, appropriately normalized, has the properties associated with a probability density function in probability theory:
PXX(f) = SXX(f) / ∫_{−∞}^{+∞} SXX(f) df
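A minimal sketch checking these three properties on an illustrative transform pair, RXX(τ) = exp(−|τ|) ↔ SXX(f) = 2/(1 + (2πf)²) (a standard Fourier pair, not an example from the slides):

```python
import numpy as np

f = np.linspace(-20.0, 20.0, 40001)            # symmetric frequency grid
Sxx = 2.0 / (1.0 + (2 * np.pi * f) ** 2)       # PSD of R_XX(tau) = exp(-|tau|)

print((Sxx >= 0).all())                        # property 4: nonnegative
print(np.allclose(Sxx, Sxx[::-1]))             # property 5: even in f
df = f[1] - f[0]
Pxx = Sxx / (Sxx.sum() * df)                   # property 6: normalize like a pdf
print(Pxx.sum() * df)                          # integrates to 1
```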
fY(y) = (1/√(2πσ²)) exp(−(y − µ)²/(2σ²))
where µ is the mean and σ² is the variance of the random variable Y.
Properties of the Gaussian Process
1 Linear Filtering
If a Gaussian process X(t) is applied to a stable linear filter, then the stochastic process Y(t) developed at the output of the filter is also Gaussian.
2 Stationarity
If a Gaussian process is weakly stationary, then the process is also strictly stationary.
3 Independence
If the random variables X(t1), X(t2), . . . , X(tn), obtained by respectively sampling a Gaussian process X(t) at times t1, t2, . . . , tn, are uncorrelated, that is,
E[(X(tk) − µX(tk))(X(ti) − µX(ti))] = 0, i ≠ k,
then these random variables are statistically independent.
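A minimal sketch of the linear-filtering property (the moving-average filter and record length are illustrative assumptions): Gaussian input through a stable LTI filter yields output whose sample skewness and excess kurtosis are near zero, consistent with a Gaussian output.

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(0.0, 1.0, 200_000)          # Gaussian input process (discrete time)
h = np.ones(8) / 8.0                        # stable LTI filter (moving average)
y = np.convolve(x, h, mode="valid")         # output process

# Standardize and check Gaussian shape: skewness ~ 0, excess kurtosis ~ 0.
z = (y - y.mean()) / y.std()
print(np.mean(z**3), np.mean(z**4) - 3.0)
```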
Noise
The adjective ’white’ is used in the sense that white light contains equal amounts of all frequencies within the visible band of electromagnetic radiation.
White noise, denoted by W(t), is a stationary process whose PSD SWW(f) has a constant value across all frequencies.
The PSD of white noise is
SWW(f) = N0/2, for all f
Its autocorrelation function is
RWW(τ) = (N0/2) δ(τ)
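A minimal discrete-time sketch of these two facts (N0, the sampling rate, and the record length are illustrative assumptions): i.i.d. Gaussian samples of variance N0·fs/2 have an impulse-like autocorrelation, large at lag 0 and near zero at all other lags, mirroring RWW(τ) = (N0/2) δ(τ).

```python
import numpy as np

rng = np.random.default_rng(7)
N0, fs, n = 2.0, 1.0, 100_000
w = rng.normal(0.0, np.sqrt(N0 * fs / 2), n)    # white samples, variance N0*fs/2

# Autocorrelation estimate: ~1.0 at lag 0 (= N0*fs/2 here), ~0 elsewhere.
for lag in range(4):
    print(lag, np.mean(w[: n - lag] * w[lag:]))
```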