Random Processes Sept 25 26 Oct 3 5
Gaurav S. Kasbekar
Dept. of Electrical Engineering
IIT Bombay
Motivation
• For a study of the noise performance of communication
systems, we model noise as a random process:
❑noise is often modeled as Additive White Gaussian Noise (AWGN)
• Also, concept of power spectral density (PSD) useful for
frequency domain analysis of random processes
❑e.g., message signals and noise
• Later, we will see that PSD of a wide-sense stationary random
process is Fourier transform of its autocorrelation function
• So now we study:
❑random processes,
❑stationarity, strict-sense stationary and wide-sense stationary
processes
❑mean function and autocorrelation function
❑power spectral density
❑Gaussian random processes
❑white noise processes
Review of Random Vectors
• Consider a random vector 𝑿 = (𝑋1 , 𝑋2 , … , 𝑋𝑛 )
• Joint CDF:
❑ 𝐹𝑋1,…,𝑋𝑛 𝑥1 , … , 𝑥𝑛 = 𝑃 𝑋1 ≤ 𝑥1 , … , 𝑋𝑛 ≤ 𝑥𝑛
• Joint PDF:
❑ 𝑓𝑋1,…,𝑋𝑛 (𝑥1 , … , 𝑥𝑛 ) = 𝜕^𝑛 𝐹𝑋1,…,𝑋𝑛 (𝑥1 , … , 𝑥𝑛 ) / (𝜕𝑥1 … 𝜕𝑥𝑛 )
• Probability that the vector lies in a region 𝐷:
❑ 𝑃((𝑋1 , 𝑋2 , … , 𝑋𝑛 ) ∈ 𝐷) = ∫…∫_𝐷 𝑓𝑋1,…,𝑋𝑛 (𝑥1 , … , 𝑥𝑛 ) 𝑑𝑥1 … 𝑑𝑥𝑛
• 𝑋1 , … , 𝑋𝑛 independent if:
❑ the events {𝑋1 ∈ 𝐴1 }, … , {𝑋𝑛 ∈ 𝐴𝑛 } are independent for all 𝐴1 ⊆ ℝ, …, 𝐴𝑛 ⊆ ℝ
• 𝑋1 , … , 𝑋𝑛 independent iff:
1) 𝐹𝑋1,…,𝑋𝑛 𝑥1 , … , 𝑥𝑛 = 𝐹𝑋1 𝑥1 … 𝐹𝑋𝑛 𝑥𝑛
2) 𝑓𝑋1,…,𝑋𝑛 𝑥1 , … , 𝑥𝑛 = 𝑓𝑋1 𝑥1 … 𝑓𝑋𝑛 𝑥𝑛 (continuous case)
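As a numerical sanity check (not part of the original slides), the CDF factorization in condition 1) can be verified by Monte Carlo for two independently drawn random variables; the specific distributions below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
X1 = rng.standard_normal(n)       # independent draws from two distributions
X2 = rng.exponential(1.0, n)

a, b = 0.5, 1.2
# Empirical joint CDF F_{X1,X2}(a, b) vs the product F_{X1}(a) F_{X2}(b)
joint = np.mean((X1 <= a) & (X2 <= b))
product = np.mean(X1 <= a) * np.mean(X2 <= b)
assert abs(joint - product) < 0.01   # factorization holds up to MC error
```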
• Mean vector of the random vector 𝑿 = (𝑋1 , 𝑋2 , … , 𝑋𝑛 ) :
❑ 𝝁𝑿 = (𝜇1 , 𝜇2 , … , 𝜇𝑛 ), where 𝜇𝑗 = 𝐸(𝑋𝑗 )
• Covariance matrix:
❑ K 𝑿 : 𝑛 × 𝑛 matrix with (𝑖, 𝑗)’th element 𝐶𝑖,𝑗 = Cov(𝑋𝑖 , 𝑋𝑗 ) = 𝐸[(𝑋𝑖 − 𝜇𝑖 )(𝑋𝑗 − 𝜇𝑗 )]
• Properties of covariance matrix K 𝑿 of a random vector 𝑿 = (𝑋1 , 𝑋2 , … , 𝑋𝑛 ):
❑ K 𝑿 is a symmetric matrix
❑ K 𝑿 = 𝐸[(𝑿 − 𝝁𝑿 )(𝑿 − 𝝁𝑿 )^𝑇 ]
❑ K 𝑿 is positive-semidefinite
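The symmetry and positive-semidefiniteness of K 𝑿 can be checked numerically on a sample covariance matrix; the distribution sampled below is an arbitrary illustrative choice, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw samples of a 3-dimensional random vector X (correlated Gaussian
# components via an arbitrary mixing matrix A).
A = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
X = rng.standard_normal((100_000, 3)) @ A.T   # rows are i.i.d. samples

mu = X.mean(axis=0)            # sample mean vector
Xc = X - mu
K = (Xc.T @ Xc) / len(X)       # sample estimate of E[(X - mu)(X - mu)^T]

assert np.allclose(K, K.T)                     # symmetric
assert np.linalg.eigvalsh(K).min() >= -1e-10   # positive-semidefinite
```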
Recall
• Consider a probability space (Ω, ℱ, 𝑃)
• A random variable 𝑋:
❑ for each ω ∈ Ω, 𝑋(ω) a real number
• A random vector 𝑋1 , … , 𝑋𝑛 :
❑for each ω ∈ Ω, 𝑋1 (ω), … , 𝑋𝑛 (ω) a real vector
• Next: a random process 𝑋(𝑡)
❑for each ω ∈ Ω, 𝑋(𝑡, ω) is a real function
Random Process
• A random process is a function from Ω to a set of real
functions
• For fixed 𝑡, 𝑋(𝑡) is:
❑ a random variable
• For fixed 𝑡 and ω, 𝑋(𝑡, ω) is:
❑ a real number
Gaussian Random Process
• Recall: if 𝑿 = (𝑋1 , … , 𝑋𝑛 ) is a Gaussian random
vector and 𝒂 = (𝑎1 , … , 𝑎𝑛 ) ≠ 𝟎, then 𝑌 = 𝒂𝑇 𝑿 is
a Gaussian random variable
• Definition: 𝑋(𝑡) is a Gaussian random process if
for all 𝑛 ≥ 1 and 𝑡1 , … , 𝑡𝑛 , 𝑋(𝑡1 ), … , 𝑋(𝑡𝑛 ) are
jointly Gaussian
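The definition suggests a direct way to simulate a Gaussian process at finitely many times: form the mean vector and covariance matrix of (𝑋(𝑡1 ), … , 𝑋(𝑡𝑛 )) and draw from the corresponding multivariate normal. A minimal sketch, assuming a hypothetical zero mean and exponential autocovariance 𝐾𝑋 (𝑡1 , 𝑡2 ) = e^{−|𝑡1 − 𝑡2 |}:

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample times t1..tn and the finite-dimensional mean/covariance
t = np.linspace(0.0, 1.0, 50)
eta = np.zeros_like(t)                          # mean function η_X(t) = 0 (assumed)
K = np.exp(-np.abs(t[:, None] - t[None, :]))    # K_X(t1,t2) = e^{-|t1-t2|} (assumed)

# Each row is one sample path of the process at the chosen times
paths = rng.multivariate_normal(eta, K, size=5)
assert paths.shape == (5, 50)
```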
Properties
• Recall: distribution of 𝑋(𝑡) specified by
𝐹𝑋(𝑡1),…,𝑋(𝑡𝑛) (𝑥1 , … , 𝑥𝑛 ) = 𝑃(𝑋(𝑡1 ) ≤ 𝑥1 , … , 𝑋(𝑡𝑛 ) ≤ 𝑥𝑛 )
1) Distribution of a Gaussian process 𝑋(𝑡) completely
determined by 𝜂𝑋 𝑡 and 𝑅𝑋 𝑡1 , 𝑡2
❑ Proof: follows from the fact that the distribution of a Gaussian random
vector is completely determined by its mean vector and covariance matrix
❑ 𝐹𝑋(𝑡1),…,𝑋(𝑡𝑛) (𝑥1 , … , 𝑥𝑛 ) is completely specified by 𝝁𝑿 and K 𝑿 , where
𝑿 = (𝑋(𝑡1 ), 𝑋(𝑡2 ), … , 𝑋(𝑡𝑛 )); also, 𝑅𝑋 (𝑡𝑖 , 𝑡𝑗 ) = 𝐾𝑋 (𝑡𝑖 , 𝑡𝑗 ) + 𝜂𝑋 (𝑡𝑖 )𝜂𝑋 (𝑡𝑗 )
2) If a Gaussian process 𝑋(𝑡) is WSS, then it is also SSS
❑ Proof: since 𝑋(𝑡) is WSS, 𝜂𝑋 (𝑡) = 𝜂𝑋 (say) and 𝑅𝑋 (𝑡1 , 𝑡2 ) = 𝑅𝑋 (𝜏)
(say), where 𝜏 = 𝑡2 − 𝑡1
❑ Let 𝑌(𝑡) = 𝑋(𝑡 + 𝑐); then 𝜂𝑌 (𝑡) = 𝜂𝑋 and 𝑅𝑌 (𝑡1 , 𝑡1 + 𝜏) =
𝐸[𝑌(𝑡1 )𝑌(𝑡1 + 𝜏)] = 𝐸[𝑋(𝑡1 + 𝑐)𝑋(𝑡1 + 𝑐 + 𝜏)] = 𝑅𝑋 (𝜏)
❑ So 𝑋(𝑡) and 𝑋(𝑡 + 𝑐) have the same mean and autocorrelation
functions; by Property 1, they have the same finite-dimensional
distributions for every 𝑐, i.e., 𝑋(𝑡) is SSS
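Property 2's argument can be illustrated numerically: for a zero-mean WSS process, the covariance matrix of samples at times 𝑡1 , … , 𝑡𝑛 equals that of the shifted times 𝑡1 + 𝑐, … , 𝑡𝑛 + 𝑐, so the finite-dimensional Gaussian distributions coincide. A sketch assuming a hypothetical 𝑅𝑋 (𝜏) = e^{−|𝜏|}:

```python
import numpy as np

def R(tau):
    # Assumed WSS autocorrelation function R_X(τ) (zero-mean process)
    return np.exp(-np.abs(tau))

t = np.array([0.1, 0.4, 0.9])   # arbitrary sample times
c = 2.5                          # arbitrary time shift

# Covariance matrices of (X(t1),...,X(tn)) and (X(t1+c),...,X(tn+c))
K_t  = R(t[:, None] - t[None, :])
K_tc = R((t + c)[:, None] - (t + c)[None, :])

# Equal matrices ⇒ identical finite-dimensional Gaussian distributions
assert np.allclose(K_t, K_tc)
```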
Properties (contd.)
• Recall: 𝑋1 , … , 𝑋𝑛 are jointly Gaussian iff 𝑎1 𝑋1 + ⋯ +
𝑎𝑛 𝑋𝑛 is a Gaussian random variable for all 𝑎1 , … , 𝑎𝑛
• Generalization to Gaussian process:
3) 𝑋(𝑡) is a Gaussian process iff for all functions 𝑔(𝑡)
and 𝑇1 < 𝑇2 :
𝑌𝑔 = ∫_{𝑇1}^{𝑇2} 𝑔(𝑡)𝑋(𝑡)𝑑𝑡 is a Gaussian random variable
whenever 𝐸[𝑌𝑔 ^2 ] < ∞
❑ Proof of “if” part: Suffices to show: 𝑋(𝑡1 ), … , 𝑋(𝑡𝑛 ) are jointly
Gaussian for arbitrary 𝑡1 , … , 𝑡𝑛
❑ In turn, suffices to show: for given constants 𝑡1 , … , 𝑡𝑛 and 𝑎1 , … , 𝑎𝑛 ,
Σ_{𝑖=1}^{𝑛} 𝑎𝑖 𝑋(𝑡𝑖 ) is a Gaussian random variable
❑ Select some 𝑇1 < 𝑡1 , 𝑇2 > 𝑡𝑛 , and let 𝑔(𝑡) = Σ_{𝑖=1}^{𝑛} 𝑎𝑖 𝛿(𝑡 − 𝑡𝑖 )
❑ Then 𝑌𝑔 = ∫_{𝑇1}^{𝑇2} 𝑔(𝑡)𝑋(𝑡)𝑑𝑡 = Σ_{𝑖=1}^{𝑛} 𝑎𝑖 𝑋(𝑡𝑖 ) is a Gaussian random
variable
❑ So 𝑋(𝑡1 ), … , 𝑋(𝑡𝑛 ) are jointly Gaussian
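The key step, that a linear combination of jointly Gaussian samples is Gaussian, can be checked by simulation: the combination's mean and variance should match 𝒂^𝑇 𝝁 and 𝒂^𝑇 K 𝒂. A sketch with a hypothetical zero-mean process and exponential autocovariance:

```python
import numpy as np

rng = np.random.default_rng(3)

t = np.array([0.2, 0.5, 1.1])                   # arbitrary sample times
a = np.array([1.0, -2.0, 0.5])                  # arbitrary coefficients
K = np.exp(-np.abs(t[:, None] - t[None, :]))    # assumed autocovariance, zero mean

# Draw jointly Gaussian samples of (X(t1), X(t2), X(t3))
X = rng.multivariate_normal(np.zeros(3), K, size=200_000)
Y = X @ a                                       # Σ a_i X(t_i), one value per sample

# Y should be Gaussian with mean 0 and variance aᵀ K a
assert abs(Y.mean()) < 0.02
assert abs(Y.var() - a @ K @ a) < 0.1
```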
Applications of Gaussian Processes
• Many physical processes are approximately
Gaussian
• Reasons:
1) A large-scale phenomenon often arises from the
combination of a large number of small-scale
i.i.d. phenomena
2) Central Limit Theorem (CLT)
Thermal Noise: Intuition
• 𝑋(𝑡): noise current
• 𝑋(𝑡) = Σ_{𝑖=1}^{𝑁} 𝑋𝑖 (𝑡),
❑ where 𝑋𝑖 (𝑡): noise current due to random motion of
electron 𝑖
• 𝑋1 𝑡 , … , 𝑋𝑁 (𝑡) approximately independent and
identically distributed
• Want to show: 𝑋 𝑡 approximately a Gaussian process
• For fixed 𝑡, 𝑋 𝑡 approximately a Gaussian random
variable by CLT
• Next, fix 𝑔(𝑡) and 𝑇1 < 𝑇2
• 𝑌𝑔 = ∫_{𝑇1}^{𝑇2} 𝑔(𝑡)𝑋(𝑡)𝑑𝑡 :
❑ = Σ_{𝑖=1}^{𝑁} ∫_{𝑇1}^{𝑇2} 𝑔(𝑡)𝑋𝑖 (𝑡)𝑑𝑡 (substituting for 𝑋(𝑡) and assuming
integral and summation can be interchanged)
❑ approximately Gaussian by CLT
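The CLT argument for a fixed 𝑡 can be illustrated by summing many i.i.d. non-Gaussian contributions; the uniform per-electron distribution below is a hypothetical stand-in for the true physics:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical model: at a fixed t, each electron i contributes a
# non-Gaussian current X_i, taken uniform on [-1, 1], i.i.d.
N = 1000            # number of electrons
samples = 100_000   # independent realizations of X(t)
Xi = rng.uniform(-1.0, 1.0, size=(samples, N))
X = Xi.sum(axis=1)                 # X(t) = Σ X_i(t)

# Standardize and compare a tail probability with the Gaussian value:
# for a standard Gaussian, P(Z > 1) ≈ 0.1587
Z = X / X.std()
p_tail = np.mean(Z > 1.0)
assert abs(p_tail - 0.1587) < 0.01
```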
Gaussian Process Through LTI System
• Claim: If a Gaussian process, 𝑋(𝑡), is input to LTI
system, then output 𝑌(𝑡) is a Gaussian process
• Proof:
❑𝑌(𝑡) = ∫_{−∞}^{∞} 𝑋(𝜏)ℎ(𝑡 − 𝜏)𝑑𝜏
❑Fix a function 𝑔(𝑡) and 𝑇1 < 𝑇2
❑Let 𝑌𝑔 = ∫_{𝑇1}^{𝑇2} 𝑔(𝑡)𝑌(𝑡)𝑑𝑡
❑Then 𝑌𝑔 = ∫_{𝑇1}^{𝑇2} 𝑔(𝑡) [∫_{−∞}^{∞} 𝑋(𝜏)ℎ(𝑡 − 𝜏)𝑑𝜏] 𝑑𝑡
❑Interchanging the order of integration:
𝑌𝑔 = ∫_{−∞}^{∞} 𝑋(𝜏) [∫_{𝑇1}^{𝑇2} ℎ(𝑡 − 𝜏)𝑔(𝑡)𝑑𝑡] 𝑑𝜏
❑𝑌𝑔 = ∫_{−∞}^{∞} 𝑋(𝜏)𝑔̃(𝜏)𝑑𝜏,
where 𝑔̃(𝜏) = ∫_{𝑇1}^{𝑇2} ℎ(𝑡 − 𝜏)𝑔(𝑡)𝑑𝑡
❑Since 𝑋(𝑡) is a Gaussian process, 𝑌𝑔 is a Gaussian random variable for
every such 𝑔; hence, by the characterization in Property 3, 𝑌(𝑡) is a
Gaussian process
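A discrete-time analogue of this claim is easy to simulate: convolving an i.i.d. Gaussian input with an impulse response ℎ yields an output whose distribution matches the predicted Gaussian. The filter taps below are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(5)

# Discrete-time sketch: pass an i.i.d. (white) Gaussian input through an
# LTI filter, i.e., convolve with an impulse response h.
h = np.array([0.5, 0.3, 0.2])            # hypothetical filter taps
X = rng.standard_normal(200_000)         # unit-variance Gaussian input
Y = np.convolve(X, h, mode="valid")      # output Y(t) = Σ_k h(k) X(t - k)

# Output is Gaussian with variance Σ h(k)^2; its standardized tail
# probability should match the Gaussian value P(Z > 1) ≈ 0.1587
assert abs(Y.var() - np.sum(h**2)) < 0.01
Z = Y / np.sqrt(np.sum(h**2))
assert abs(np.mean(Z > 1.0) - 0.1587) < 0.01
```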