
University of Benghazi

Faculty of Engineering
Electrical and Electronics Engineering Department

Probability and Random (Stochastic) Process


EE277

Multiple Random Processes


Salma Elkawafi
Salma.elkawafi@uob.edu.ly
Goals
Understand the following:
• Multiple Random Processes

• Cross Correlation

• Cross Covariance

• Independent Random Processes

• Gaussian Random Processes


Multiple Random Processes
Cross-correlation and cross-covariance functions.

For two random processes {𝑋(𝑡), 𝑡 ∈ 𝑇} and {𝑌(𝑡), 𝑡 ∈ 𝑇}:

• the cross-correlation function 𝑅𝑋𝑌(𝑡1, 𝑡2) is defined by

𝑅𝑋𝑌(𝑡1, 𝑡2) = 𝐸[𝑋(𝑡1)𝑌(𝑡2)], for 𝑡1, 𝑡2 ∈ 𝑇

• the cross-covariance function 𝐶𝑋𝑌(𝑡1, 𝑡2) is defined by

𝐶𝑋𝑌(𝑡1, 𝑡2) = Cov(𝑋(𝑡1), 𝑌(𝑡2)) = 𝑅𝑋𝑌(𝑡1, 𝑡2) − 𝜇𝑋(𝑡1)𝜇𝑌(𝑡2), for 𝑡1, 𝑡2 ∈ 𝑇.
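Both functions can be estimated from simulated realizations by averaging products of samples. A minimal Python sketch (the slides contain no code; the processes and helper names here are illustrative assumptions):

```python
import math
import random

def estimate_cross_moments(make_pair, t1, t2, n=200_000, seed=0):
    """Monte Carlo estimates of R_XY(t1, t2) and C_XY(t1, t2).

    make_pair(rng) returns one realization (x, y) of the two
    processes, each a function of time.
    """
    rng = random.Random(seed)
    sum_x = sum_y = sum_xy = 0.0
    for _ in range(n):
        x, y = make_pair(rng)
        xv, yv = x(t1), y(t2)
        sum_x += xv
        sum_y += yv
        sum_xy += xv * yv
    r_xy = sum_xy / n                        # estimate of E[X(t1) Y(t2)]
    c_xy = r_xy - (sum_x / n) * (sum_y / n)  # R_XY - mu_X(t1) mu_Y(t2)
    return r_xy, c_xy

# Illustrative pair: X(t) = W cos t, Y(t) = W sin t with W ~ N(0, 1),
# so R_XY(t1, t2) = E[W^2] cos(t1) sin(t2) = cos(t1) sin(t2).
def make_pair(rng):
    w = rng.gauss(0.0, 1.0)
    return (lambda t: w * math.cos(t)), (lambda t: w * math.sin(t))

r, c = estimate_cross_moments(make_pair, 0.0, math.pi / 2)
# Expect r close to cos(0) sin(pi/2) = 1; the means are zero here, so c is close to r.
```

Because both illustrative processes have zero mean, the cross-covariance coincides with the cross-correlation in this check.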


Example
Let 𝐴, 𝐵, and 𝐶 be independent normal 𝑁(1,1) random variables. Let {𝑋(𝑡), 𝑡 ∈ [0, ∞)} be
defined as
𝑋(𝑡) = 𝐴 + 𝐵𝑡, for all 𝑡 ∈ [0, ∞).
Also, let {𝑌(𝑡), 𝑡 ∈ [0, ∞)} be defined as
𝑌(𝑡) = 𝐴 + 𝐶𝑡, for all 𝑡 ∈ [0, ∞).
Find 𝑅𝑋𝑌(𝑡1, 𝑡2) and 𝐶𝑋𝑌(𝑡1, 𝑡2), for 𝑡1, 𝑡2 ∈ [0, ∞).
Solution

First, note that


𝜇𝑋(𝑡) = 𝐸[𝑋(𝑡)] = 𝐸[𝐴] + 𝐸[𝐵] ⋅ 𝑡 = 1 + 𝑡, for all 𝑡 ∈ [0, ∞).
Similarly,
𝜇𝑌(𝑡) = 𝐸[𝑌(𝑡)] = 𝐸[𝐴] + 𝐸[𝐶] ⋅ 𝑡 = 1 + 𝑡, for all 𝑡 ∈ [0, ∞).
To find 𝑅𝑋𝑌(𝑡1, 𝑡2) for 𝑡1, 𝑡2 ∈ [0, ∞), we write
𝑅𝑋𝑌(𝑡1, 𝑡2) = 𝐸[𝑋(𝑡1)𝑌(𝑡2)]
= 𝐸[(𝐴 + 𝐵𝑡1)(𝐴 + 𝐶𝑡2)]
= 𝐸[𝐴² + 𝐴𝐶𝑡2 + 𝐵𝐴𝑡1 + 𝐵𝐶𝑡1𝑡2]
= 𝐸[𝐴²] + 𝐸[𝐴𝐶]𝑡2 + 𝐸[𝐵𝐴]𝑡1 + 𝐸[𝐵𝐶]𝑡1𝑡2
= 𝐸[𝐴²] + 𝐸[𝐴]𝐸[𝐶]𝑡2 + 𝐸[𝐵]𝐸[𝐴]𝑡1 + 𝐸[𝐵]𝐸[𝐶]𝑡1𝑡2, (by independence)
= 2 + 𝑡1 + 𝑡2 + 𝑡1𝑡2,
since 𝐸[𝐴²] = Var(𝐴) + (𝐸[𝐴])² = 1 + 1 = 2 and 𝐸[𝐴] = 𝐸[𝐵] = 𝐸[𝐶] = 1.

To find 𝐶𝑋𝑌(𝑡1, 𝑡2) for 𝑡1, 𝑡2 ∈ [0, ∞), we write

𝐶𝑋𝑌(𝑡1, 𝑡2) = 𝑅𝑋𝑌(𝑡1, 𝑡2) − 𝜇𝑋(𝑡1)𝜇𝑌(𝑡2)
= (2 + 𝑡1 + 𝑡2 + 𝑡1𝑡2) − (1 + 𝑡1)(1 + 𝑡2) = 1.
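The closed-form answers can be checked by Monte Carlo simulation. A minimal Python sketch (not part of the slides); at 𝑡1 = 1, 𝑡2 = 2 we expect 𝑅𝑋𝑌 = 2 + 1 + 2 + 2 = 7 and 𝐶𝑋𝑌 = 1:

```python
import random

# Monte Carlo check of the worked example: A, B, C ~ N(1, 1) independent,
# X(t) = A + B t, Y(t) = A + C t, so R_XY(t1, t2) = 2 + t1 + t2 + t1 t2
# and C_XY(t1, t2) = 1 for every t1, t2.
rng = random.Random(42)
t1, t2 = 1.0, 2.0
n = 200_000
sum_x = sum_y = sum_xy = 0.0
for _ in range(n):
    a = rng.gauss(1.0, 1.0)
    b = rng.gauss(1.0, 1.0)
    c = rng.gauss(1.0, 1.0)
    x, y = a + b * t1, a + c * t2  # one sample of X(t1) and Y(t2)
    sum_x += x
    sum_y += y
    sum_xy += x * y
r_hat = sum_xy / n                              # estimate of R_XY(1, 2)
c_hat = r_hat - (sum_x / n) * (sum_y / n)       # estimate of C_XY(1, 2)
# Expect r_hat close to 7 and c_hat close to 1.
```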
Multiple Random Processes
Independent Random Processes
If, for two random processes 𝑋(𝑡) and 𝑌(𝑡), the random variables 𝑋(𝑡𝑖) are independent
of the random variables 𝑌(𝑡𝑗), we say that the two random processes are independent.
More precisely, we have the following definition:
Two random processes {𝑋(𝑡), 𝑡 ∈ 𝑇} and {𝑌(𝑡), 𝑡 ∈ 𝑇′} are said to be independent if, for all
𝑡1, 𝑡2, …, 𝑡𝑚 ∈ 𝑇 and 𝑡1′, 𝑡2′, …, 𝑡𝑛′ ∈ 𝑇′,
the set of random variables
𝑋(𝑡1), 𝑋(𝑡2), ⋯, 𝑋(𝑡𝑚)
is independent of the set of random variables
𝑌(𝑡1′), 𝑌(𝑡2′), ⋯, 𝑌(𝑡𝑛′).
The previous definition implies that for all real numbers 𝑥1, 𝑥2, ⋯, 𝑥𝑚 and 𝑦1, 𝑦2, ⋯, 𝑦𝑛, we
have

𝐹_{𝑋(𝑡1),⋯,𝑋(𝑡𝑚),𝑌(𝑡1′),⋯,𝑌(𝑡𝑛′)}(𝑥1, ⋯, 𝑥𝑚, 𝑦1, ⋯, 𝑦𝑛) = 𝐹_{𝑋(𝑡1),⋯,𝑋(𝑡𝑚)}(𝑥1, ⋯, 𝑥𝑚) ⋅ 𝐹_{𝑌(𝑡1′),⋯,𝑌(𝑡𝑛′)}(𝑦1, ⋯, 𝑦𝑛)

The above equation might seem complicated; however, in many real-life applications we
can often argue that two random processes are independent by looking at the problem
structure. For example, in engineering we can reasonably assume that the thermal noise
processes in two separate systems are independent.

Note that if two random processes 𝑋(𝑡) and 𝑌(𝑡) are independent, then their cross-covariance
function 𝐶𝑋𝑌(𝑡1, 𝑡2) is zero for all 𝑡1 and 𝑡2:

𝐶𝑋𝑌(𝑡1, 𝑡2) = Cov(𝑋(𝑡1), 𝑌(𝑡2)) = 0 (since 𝑋(𝑡1) and 𝑌(𝑡2) are independent).
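This can be seen numerically as well: sampling two independent noise processes and estimating the cross-covariance at any pair of times gives a value near zero. A Python sketch (the setup is illustrative, not from the slides):

```python
import random

# Two independent "noise" processes: every X(t) and every Y(t) is an
# independent N(0, 1) draw, so C_XY(t1, t2) should be near 0 for all t1, t2.
rng = random.Random(7)
n = 100_000
sum_x = sum_y = sum_xy = 0.0
for _ in range(n):
    x = rng.gauss(0.0, 1.0)  # one sample of X(t1)
    y = rng.gauss(0.0, 1.0)  # one sample of Y(t2), independent of x
    sum_x += x
    sum_y += y
    sum_xy += x * y
c_hat = sum_xy / n - (sum_x / n) * (sum_y / n)
# c_hat is close to 0, up to Monte Carlo error of order 1/sqrt(n).
```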


Gaussian Random Processes
A random process {𝑋(𝑡), 𝑡 ∈ 𝑇} is said to be a Gaussian (normal) random process if, for all
𝑡1 , 𝑡2 , … , 𝑡𝑛 ∈ 𝑇, the random variables 𝑋(𝑡1 ), 𝑋(𝑡2 ), … , 𝑋(𝑡𝑛 ) are jointly normal.
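A pair of samples (𝑋(𝑡1), 𝑋(𝑡2)) of a zero-mean, unit-variance Gaussian process can be generated from two independent standard normals, since jointly normal variables with correlation 𝜌 satisfy 𝑋2 = 𝜌𝑋1 + √(1 − 𝜌²)𝑍. A Python sketch using the autocorrelation of the example below (the slides give no code; this construction is a standard one, not theirs):

```python
import math
import random

def r_x(tau):
    """Autocorrelation of the example process: R_X(tau) = e^{-tau^2}."""
    return math.exp(-tau * tau)

def sample_two_points(t1, t2, rng):
    """One draw of (X(t1), X(t2)): jointly normal, zero mean, unit
    variance (R_X(0) = 1), correlation rho = R_X(t2 - t1)."""
    rho = r_x(t2 - t1)
    z1 = rng.gauss(0.0, 1.0)
    z2 = rng.gauss(0.0, 1.0)
    x1 = z1
    x2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
    return x1, x2

rng = random.Random(1)
pairs = [sample_two_points(1.0, 2.0, rng) for _ in range(200_000)]
cov_hat = sum(x1 * x2 for x1, x2 in pairs) / len(pairs)
# cov_hat is close to R_X(1) = e^{-1}, about 0.368.
```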

Example
Let 𝑋(𝑡) be a zero-mean WSS Gaussian process with 𝑅𝑋(𝜏) = 𝑒^(−𝜏²), for all 𝜏 ∈ ℝ.
1. Find 𝑃(𝑋(1) < 1).
2. Find 𝑃(𝑋(1) + 𝑋(2) < 1).
Solution
1. 𝑋(1) is a normal random variable with mean 𝐸[𝑋(1)] = 0 and variance
Var(𝑋(1)) = 𝐸[𝑋(1)²] = 𝑅𝑋(0) = 1.
Thus,
𝑃(𝑋(1) < 1) = Φ((1 − 0)/1) = Φ(1) ≈ 0.84
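Φ has no closed form, but it can be evaluated through the error function, Φ(𝑥) = (1 + erf(𝑥/√2))/2. A quick Python check of part 1 (Python is an assumption; the slides contain no code):

```python
import math

def phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

p1 = phi(1.0)  # P(X(1) < 1) = Phi(1) ≈ 0.8413
```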
2. Let 𝑌 = 𝑋(1) + 𝑋(2). Then 𝑌 is a normal random variable. We have
𝐸[𝑌] = 𝐸[𝑋(1)] + 𝐸[𝑋(2)] = 0;
Var(𝑌) = Var(𝑋(1)) + Var(𝑋(2)) + 2Cov(𝑋(1), 𝑋(2)).
Note that
Var(𝑋(1)) = 𝐸[𝑋(1)²] − (𝐸[𝑋(1)])² = 𝑅𝑋(0) − 𝜇𝑋² = 1 − 0 = 1 = Var(𝑋(2));
Cov(𝑋(1), 𝑋(2)) = 𝐸[𝑋(1)𝑋(2)] − 𝐸[𝑋(1)]𝐸[𝑋(2)] = 𝑅𝑋(−1) − 𝜇𝑋² = 𝑒^(−1) = 1/𝑒.
Therefore,
Var(𝑌) = 2 + 2/𝑒.
We conclude 𝑌 ∼ 𝑁(0, 2 + 2/𝑒). Thus,
𝑃(𝑌 < 1) = Φ((1 − 0)/√(2 + 2/𝑒)) = Φ(0.6046) ≈ 0.73
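A quick Python check of part 2, reproducing the standardization step (Python is an assumption; the slides contain no code):

```python
import math

def phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

var_y = 2.0 + 2.0 / math.e          # Var(Y) = 2 + 2/e
z = (1.0 - 0.0) / math.sqrt(var_y)  # standardize: z ≈ 0.6046
p2 = phi(z)                          # P(Y < 1) ≈ 0.7273
```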
