
Stochastic Processes
Lecture 5

Random Variables
 Definition and distribution function: one dimension, two dimensions, multi-dimensions
 Marginal and conditional distributions
 Independence property

Random Variables
Chebyshev's Inequality

Let X be a random variable with finite expected value μ = E[X] and finite non-zero variance σ² = D[X]. Then for any real number ε > 0,

$$P\{|X - E[X]| \ge \varepsilon\} \le \frac{D[X]}{\varepsilon^2}$$  (Derivation)
The probability of a large deviation decreases as the variance decreases: X is more likely to be close to its expectation when D[X] is small.

Equivalently,

$$P\{|X - E[X]| < \varepsilon\} \ge 1 - \frac{D[X]}{\varepsilon^2}$$

This is widely used to assess the probability of such an event when the variance is already known.
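As a sanity check, the bound can be verified by simulation. A minimal Python sketch, using an exponential sample chosen purely for illustration (not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)  # E[X] = 1, D[X] = 1

for eps in (1.5, 2.0, 3.0):
    p_emp = np.mean(np.abs(x - x.mean()) >= eps)   # empirical P{|X - E[X]| >= eps}
    bound = x.var() / eps**2                       # Chebyshev bound D[X]/eps^2
    print(f"eps={eps}: estimate {p_emp:.4f} <= bound {bound:.4f}")
```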
Random Variables
Chebyshev's Inequality
$$D[X] = 0 \;\Rightarrow\; P\{X = E[X]\} = 1$$  (Derivation)
Random Variables
Moment – One-dimensional random variable

Besides E[X] and D[X], the moments are also important characteristics of a random variable.

When E[|X|^k] < ∞, the k-th moment of a random variable is defined as

$$\gamma_k = E[X^k], \quad k = 0, 1, 2, \cdots$$  (the k-th moment about zero)

$$a_k = E[|X|^k], \quad k = 0, 1, 2, \cdots$$  (the k-th absolute moment about zero)

In particular, γ_0 = 1 and γ_1 = E[X].
Random Variables
Central Moment – One-dimensional random variable

When E[|X|^k] < ∞, the k-th central moment of a random variable is defined as

$$\mu_k = E\{(X - E[X])^k\}, \quad k = 0, 1, 2, \cdots$$  (the k-th central moment)

$$\beta_k = E[|X - E[X]|^k], \quad k = 0, 1, 2, \cdots$$  (the k-th absolute central moment)
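As a concrete illustration (not from the slides), the first few raw and central moments can be estimated from a sample; the uniform distribution here is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=500_000)

for k in range(4):
    gamma_k = np.mean(x**k)               # k-th moment about zero, E[X^k]
    mu_k = np.mean((x - x.mean())**k)     # k-th central moment, E[(X - E[X])^k]
    print(f"k={k}: gamma_k ~ {gamma_k:.4f}, mu_k ~ {mu_k:.4f}")
```

As the definitions require, the k = 0 estimates equal 1 and the first central moment is (numerically) 0.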

 
Random Variables
Moment – Two-dimensional random variables

The (k+s)-th moment is defined as

$$\alpha_{k,s} = E[X^k Y^s]$$  (joint moment)

In particular, α_{1,0} = E[X] and α_{0,1} = E[Y].
Random Variables
Central Moment – Two-dimensional random variable

The (k+s)-th central moment is defined as

$$\mu_{k,s} = E\{(X - E[X])^k (Y - E[Y])^s\}$$  (joint central moment)

In particular, μ_{1,1} is the covariance:

$$\mathrm{cov}(X, Y) = \mu_{1,1} = E\{(X - E[X])(Y - E[Y])\}$$
Random Variables
Characteristic of a random vector: Expectation

Suppose the distribution density of an n-dimensional random vector X is p_X(x), where x = [x_1 x_2 ⋯ x_n]^T. Then

$$E[X] = [E[X_1] \; E[X_2] \; \cdots \; E[X_n]]^T = \int_{-\infty}^{\infty} x \, p_X(x) \, dx$$

Written out component by component:

$$E[X] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} [x_1 \; x_2 \; \cdots \; x_n]^T \, p_n(x_1, x_2, \cdots, x_n) \, dx_1 \, dx_2 \cdots dx_n$$
Random Variables
Characteristic of a random vector: variance and covariance

Variance matrix:

$$D[X] = E\{(X - E[X])(X - E[X])^T\} = \Big\{ E\{(X_i - E[X_i])(X_j - E[X_j])\} \Big\}_{n \times n} = \Big\{ \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} (x_i - E[X_i])(x_j - E[X_j]) \, p(x_1, x_2, \cdots, x_n) \, dx_1 \, dx_2 \cdots dx_n \Big\}_{n \times n}$$

In vector form:

$$D[X] = \int_{-\infty}^{\infty} (x - E[X])(x - E[X])^T \, p_X(x) \, dx$$
Random Variables
Characteristic of a random vector: variance and covariance

Covariance matrix:

$$C_{ik} = \mathrm{cov}(X_i, X_k) = E\{(X_i - E[X_i])(X_k - E[X_k])\}, \qquad C_{kk} = E\{(X_k - E[X_k])^2\}$$

$$C = \begin{bmatrix} C_{11} & C_{12} & \cdots & C_{1n} \\ C_{21} & C_{22} & \cdots & C_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ C_{n1} & C_{n2} & \cdots & C_{nn} \end{bmatrix}$$
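In practice E[X] and C are estimated from samples. A minimal numpy sketch, assuming an arbitrary 3-dimensional Gaussian purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
mean = np.array([0.0, 1.0, -1.0])
cov_true = np.array([[2.0, 0.5, 0.0],
                     [0.5, 1.0, 0.3],
                     [0.0, 0.3, 1.5]])
X = rng.multivariate_normal(mean, cov_true, size=200_000)  # each row: one sample of the vector

E_X = X.mean(axis=0)           # estimate of E[X] = [E[X_1] ... E[X_n]]^T
C = np.cov(X, rowvar=False)    # estimate of C; C[i, k] ~ cov(X_i, X_k)
print(E_X)
print(C)
```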

 
Random Variables
Exercise:

The joint distribution density of the two-dimensional random variable (X, Y) is

$$f(x, y) = \begin{cases} \cos x \cos y & 0 \le x \le \frac{\pi}{2}, \; 0 \le y \le \frac{\pi}{2} \\ 0 & \text{otherwise} \end{cases}$$

What is the covariance matrix?
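One route to the answer: the density factorizes as cos x · cos y on a product domain, so X and Y are independent and the off-diagonal entries vanish; the diagonal entries follow by integration. A sympy sketch of the computation:

```python
import sympy as sp

x = sp.symbols('x')
EX = sp.integrate(x * sp.cos(x), (x, 0, sp.pi / 2))       # E[X] = pi/2 - 1
EX2 = sp.integrate(x**2 * sp.cos(x), (x, 0, sp.pi / 2))   # E[X^2] = pi^2/4 - 2
DX = sp.simplify(EX2 - EX**2)                             # D[X] = pi - 3
print(EX, DX)  # by symmetry D[Y] = D[X]; independence gives cov(X, Y) = 0
```

So the covariance matrix is diag(π − 3, π − 3) ≈ diag(0.1416, 0.1416).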
Random Variables
Correlation coefficient

If the variances D[X_1] and D[X_2] exist and are positive, then

$$r = r_{12} = \frac{E\{(X_1 - E[X_1])(X_2 - E[X_2])\}}{\sqrt{D[X_1]}\sqrt{D[X_2]}}$$

Correlation matrix:

$$r_{ik} = \frac{C_{ik}}{\sqrt{C_{ii}}\sqrt{C_{kk}}}, \qquad r = \begin{bmatrix} r_{11} & r_{12} & \cdots & r_{1n} \\ r_{21} & r_{22} & \cdots & r_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ r_{n1} & r_{n2} & \cdots & r_{nn} \end{bmatrix}$$
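Numerically, the correlation matrix is the covariance matrix with each entry normalized by the two standard deviations; numpy exposes this directly. A sketch with an arbitrary 2-dimensional Gaussian:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 2.0]], size=100_000)

C = np.cov(X, rowvar=False)
r = np.corrcoef(X, rowvar=False)   # r[i, k] = C[i, k] / sqrt(C[i, i] * C[k, k])
assert np.allclose(r, C / np.sqrt(np.outer(np.diag(C), np.diag(C))))
print(r)
```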
Random Variables
Exercise:

The joint distribution density of the two-dimensional random variable (X_1, X_2) is

$$f(x_1, x_2) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1 - r^2}} \exp\left\{ -\frac{1}{2(1 - r^2)} \left[ \frac{(x_1 - a_1)^2}{\sigma_1^2} - 2r\,\frac{(x_1 - a_1)(x_2 - a_2)}{\sigma_1\sigma_2} + \frac{(x_2 - a_2)^2}{\sigma_2^2} \right] \right\}$$

What is the correlation coefficient of X_1 and X_2?
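For this bivariate Gaussian density the answer works out to r_{12} = r: the parameter r in the exponent is exactly the correlation coefficient. A Monte Carlo sketch that makes this plausible (parameter values arbitrary):

```python
import numpy as np

a1, a2, s1, s2, r = 1.0, -2.0, 1.5, 0.5, 0.6
cov = [[s1**2, r * s1 * s2], [r * s1 * s2, s2**2]]   # covariance matrix implied by the density
x = np.random.default_rng(4).multivariate_normal([a1, a2], cov, size=500_000)
r12 = np.corrcoef(x, rowvar=False)[0, 1]             # empirical correlation coefficient
print(r12)  # ~ 0.6 = r
```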
Random Variables
Characteristic function

Let the distribution function of the random variable X be F(x). The characteristic function of X is defined as

$$\phi(v) = E[e^{jvX}] = \int_{-\infty}^{\infty} e^{jvx} \, dF(x)$$

Here v is a real variable, and e^{jvX} is itself a random variable: the characteristic function is the expectation of e^{jvX}, and φ(v) is a complex-valued function of v. Moreover,

$$|\phi(v)| = \left|\int_{-\infty}^{\infty} e^{jvx} \, dF(x)\right| \le \int_{-\infty}^{\infty} |e^{jvx}| \, dF(x) = 1$$
Random Variables
Characteristic function

For a discrete random variable:

$$\phi(v) = E[e^{jvX}] = \sum_{k=1}^{\infty} e^{jv x_k} p_k$$

For a continuous random variable:

$$\phi(v) = E[e^{jvX}] = \int_{-\infty}^{\infty} e^{jvx} f(x) \, dx$$

The density and the characteristic function form a Fourier-transform pair; the density is recovered by the inverse transform:

$$f(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \phi(v) e^{-jvx} \, dv$$
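A characteristic function can also be estimated directly as a sample average of e^{jvX}. A sketch comparing the estimate with the known closed form sin(v)/v for the uniform distribution on (−1, 1) (the distribution is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(-1.0, 1.0, size=200_000)

v = np.linspace(0.5, 5.0, 10)
phi_est = np.exp(1j * np.outer(v, x)).mean(axis=1)   # E[e^{jvX}] as a sample mean
phi_true = np.sin(v) / v                             # known CF of Uniform(-1, 1)
print(np.abs(phi_est - phi_true).max())              # small estimation error
```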
Random Variables
Exercise:

Suppose X is Gaussian distributed, X ~ N(a, σ²). What is its characteristic function?

$$\phi(v) = E[e^{jvX}] = \int_{-\infty}^{\infty} e^{jvx} f(x) \, dx$$
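For reference, completing the square in the exponent gives the standard closed form (with X ~ N(a, σ²) as above):

$$\phi(v) = \exp\left(jva - \frac{\sigma^2 v^2}{2}\right)$$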
Random Variables
Properties of characteristic function:

1. φ(0) = 1 and |φ(v)| ≤ 1:

$$\phi(0) = \int_{-\infty}^{\infty} e^{j \cdot 0 \cdot x} f(x) \, dx = \int_{-\infty}^{\infty} f(x) \, dx = 1$$

$$|\phi(v)| = \left|\int_{-\infty}^{\infty} e^{jvx} \, dF(x)\right| \le \int_{-\infty}^{\infty} f(x) \, dx = \phi(0) = 1$$
Random Variables
Properties of characteristic function:

2. If …, then …
Random Variables
Properties of characteristic function:

3. If Y = X_1 + X_2, where X_1 and X_2 are independent, then

$$\phi_Y(v) = E[e^{jv(X_1 + X_2)}] = E[e^{jvX_1}] \, E[e^{jvX_2}] = \phi_{X_1}(v) \, \phi_{X_2}(v)$$

More generally, if Y = X_1 + X_2 + ⋯ + X_n with the X_i mutually independent, then

$$\phi_Y(v) = \phi_{X_1}(v) \, \phi_{X_2}(v) \cdots \phi_{X_n}(v)$$
Random Variables
Properties of characteristic function:

4. If the k-th absolute moment β_k = E[|X|^k] is finite, then E[X^k] = j^{−k} φ^{(k)}(0). Differentiating under the integral sign is justified because

$$\int_{-\infty}^{\infty} |j^k x^k e^{jvx} f(x)| \, dx = \int_{-\infty}^{\infty} |x^k| f(x) \, dx = \beta_k < \infty$$

so

$$\phi^{(k)}(v) = \int_{-\infty}^{\infty} j^k x^k e^{jvx} f(x) \, dx = j^k \int_{-\infty}^{\infty} x^k e^{jvx} f(x) \, dx = j^k E[X^k e^{jvX}]$$

$$v = 0 \;\Rightarrow\; \phi^{(k)}(0) = j^k E[X^k] \;\Rightarrow\; E[X^k] = j^{-k} \phi^{(k)}(0)$$

The characteristic function is therefore helpful for deriving the moments of a random variable.
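A sympy sketch of property 4 applied to the Gaussian characteristic function quoted earlier, with symbolic a and σ (this also answers the exercise that follows):

```python
import sympy as sp

v, a = sp.symbols('v a', real=True)
sigma = sp.symbols('sigma', positive=True)
phi = sp.exp(sp.I * v * a - sigma**2 * v**2 / 2)    # Gaussian characteristic function

E1 = (sp.diff(phi, v, 1) / sp.I**1).subs(v, 0)      # E[X]   = j^{-1} phi'(0)
E2 = (sp.diff(phi, v, 2) / sp.I**2).subs(v, 0)      # E[X^2] = j^{-2} phi''(0)
print(sp.simplify(E1), sp.simplify(E2 - E1**2))     # mean a, variance sigma**2
```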
Random Variables
Exercise:

Suppose X is Gaussian distributed, X ~ N(a, σ²). Using the characteristic function, calculate E[X] and D[X].

$$\phi(v) = E[e^{jvX}] = \int_{-\infty}^{\infty} e^{jvx} f(x) \, dx$$
Stochastic Process
Definition

A stochastic process X = { X(t, e), t ∈ T, e ∈ E } is a family of random variables defined on an event space E = {e} and indexed by time t. It can also be denoted { X(t), t ∈ T }.

 Case 1: both e and t are variables; then X(t) is a family of functions of t. It is a stochastic process.
 Case 2: e is given and t is a variable; then X(t) is just a function of t, not a stochastic process.
 Case 3: t is given and e is a variable; then X(t) is a random variable.
 Case 4: both e and t are given; then X(t) is a scalar or a vector.
Stochastic Process
Definition

Stochastic processes are classified by whether the state X(t) and the time index t are continuous or discrete (a sketch follows this list):

 Case 1: continuous stochastic process; X(t) and t are both continuous.
 Case 2: discrete stochastic process; X(t) is discrete, t is continuous.
 Case 3: continuous stochastic sequence; X(t) is continuous, t is discrete.
 Case 4: discrete stochastic sequence; X(t) is discrete, t is discrete.
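To make the classification concrete, here is a minimal sketch generating sample paths of a discrete stochastic sequence (a simple random walk, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
steps = rng.choice([-1, 1], size=(5, 100))   # 5 outcomes e, 100 time indices t
paths = steps.cumsum(axis=1)                 # row e gives one realization X(t, e)
# fixing a row gives an ordinary function of t (case 2 of the definition);
# fixing a column gives a random variable across outcomes (case 3)
print(paths[:, -1])
```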
Stochastic Process
Statistical characteristic

The statistical characteristics of a random variable are numbers; the statistical characteristics of a stochastic process are functions of t.

Expectation:

$$m(t) = E[X(t)] = \int_{-\infty}^{\infty} x \, f(x, t) \, dx$$
Stochastic Process
Statistical characteristic

Variance and mean-square value (second-order moments):

$$D[X(t)] = E\{[X(t) - m(t)]^2\}, \qquad \Psi_X^2 = E[X^2(t)]$$

[Figure: sample paths illustrating strongly correlated vs. weakly correlated processes]
Stochastic Process
Statistical characteristic

Autocorrelation function (a second-order joint moment)

Suppose X(t_1) and X(t_2) are two states of a stochastic process at times t_1 and t_2, and f(x_1, x_2; t_1, t_2) is the two-dimensional density. Then

$$R_X(t_1, t_2) = E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2 \, f(x_1, x_2; t_1, t_2) \, dx_1 \, dx_2$$
Stochastic Process
Autocorrelation function

Second-order joint central moment (covariance function):

$$C_X(t_1, t_2) = E\{[X(t_1) - m_X(t_1)][X(t_2) - m_X(t_2)]\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} [x_1 - m_X(t_1)][x_2 - m_X(t_2)] \, f(x_1, x_2; t_1, t_2) \, dx_1 \, dx_2$$

These quantities are related by

$$\Psi_X^2 = E[X^2(t)] = R_X(t, t)$$

$$C_X(t_1, t_2) = R_X(t_1, t_2) - m_X(t_1) \, m_X(t_2)$$

$$\sigma_X^2(t) = C_X(t, t) = R_X(t, t) - m_X^2(t)$$
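A Monte Carlo sketch of these definitions for the illustrative process X(t) = A·cos(t) with A ~ N(0, 1) (this process is not from the slides); here m(t) = 0 and R_X(t_1, t_2) = cos(t_1)cos(t_2):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal(1_000_000)        # E[A] = 0, D[A] = 1
t1, t2 = 0.3, 1.2
X1, X2 = A * np.cos(t1), A * np.cos(t2)   # two states of X(t) = A*cos(t)

m1 = X1.mean()                                        # m(t1) ~ 0
R = (X1 * X2).mean()                                  # R_X(t1, t2) ~ cos(t1)*cos(t2)
C = ((X1 - X1.mean()) * (X2 - X2.mean())).mean()      # C_X = R_X - m(t1)*m(t2)
print(m1, R, np.cos(t1) * np.cos(t2), C)
```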
Stochastic Process
Exercise:
 Suppose …; then calculate the variance.
Stochastic Process
Homework
 Suppose X and Y are independent with E = 0 and D = 1, and Z(t) = X + Yt. Calculate m(t), D[Z(t)], R(t_1, t_2), and C(t_1, t_2).
