Joint Moments and Joint Characteristic Functions
PILLAI
However, the situation here is similar to that in (6-13), and
it is possible to express the mean of Z = g( X ,Y ) in terms
of f XY ( x, y) without computing f Z (z). To see this, recall from
(5-26) and (7-10) that
$$P(z < Z \le z + \Delta z) = f_Z(z)\,\Delta z = P(z < g(X,Y) \le z + \Delta z) = \sum_{(x,y)\in D_{\Delta z}} f_{XY}(x,y)\,\Delta x\,\Delta y, \tag{10-3}$$
where $D_{\Delta z}$ represents the region in the $xy$-plane where $z < g(x,y) \le z + \Delta z$,
or
$$E[g(X,Y)] = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} g(x,y)\, f_{XY}(x,y)\,dx\,dy. \tag{10-6}$$
$$E\Big[\sum_k a_k\, g_k(X,Y)\Big] = \sum_k a_k\, E[g_k(X,Y)]. \tag{10-8}$$
If X and Y are independent r.vs, it is easy to see that Z = g ( X )
and W = h(Y ) are always independent of each other. In that
case using (10-6), we get the interesting result
$$E[g(X)h(Y)] = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} g(x)\,h(y)\, f_X(x)\, f_Y(y)\,dx\,dy$$
$$= \int_{-\infty}^{+\infty} g(x)\, f_X(x)\,dx \int_{-\infty}^{+\infty} h(y)\, f_Y(y)\,dy = E[g(X)]\,E[h(Y)]. \tag{10-9}$$
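The factorization in (10-9) is easy to check numerically. A minimal Monte Carlo sketch with illustrative choices: X standard normal, Y uniform on (0, 1), g(x) = x², h(y) = y, so both sides should be near E[X²]·E[Y] = 1 · ½ = ½:

```python
import random

random.seed(0)
N = 200_000

# Independent draws: X ~ N(0,1), Y ~ Uniform(0,1)
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [random.random() for _ in range(N)]

g = lambda x: x * x      # g(X) = X^2, so E[g(X)] = 1
h = lambda y: y          # h(Y) = Y,  so E[h(Y)] = 1/2

lhs = sum(g(x) * h(y) for x, y in zip(xs, ys)) / N
rhs = (sum(g(x) for x in xs) / N) * (sum(h(y) for y in ys) / N)

print(lhs, rhs)  # both close to 0.5
```

For dependent X and Y the two sides generally differ, which is what makes (10-9) a useful test of independence in expectation.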
$$\mathrm{Var}(U) = E\big[\{a(X-\mu_X) + (Y-\mu_Y)\}^2\big] = a^2\,\mathrm{Var}(X) + 2a\,\mathrm{Cov}(X,Y) + \mathrm{Var}(Y) \ge 0. \tag{10-13}$$
The right side of (10-13) represents a quadratic in the variable a that is nonnegative for every a and hence has no distinct real roots (Fig. 10.1). Thus the roots are imaginary (or a double root), and hence the discriminant
$$[\mathrm{Cov}(X,Y)]^2 - \mathrm{Var}(X)\,\mathrm{Var}(Y)$$
must be non-positive, and that gives (10-12). Using (10-12),
we may define the normalized parameter
$$\rho_{XY} = \frac{\mathrm{Cov}(X,Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}} = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1, \tag{10-14}$$
or
$$\mathrm{Cov}(X,Y) = \rho_{XY}\,\sigma_X\sigma_Y. \tag{10-15}$$
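The normalized parameter in (10-14) can be estimated directly from samples. A small sketch, where the construction Y = 0.6 X + 0.8 Z with X, Z independent standard normals is an illustrative choice giving σ_Y = 1 and ρ_XY = 0.6:

```python
import math
import random

random.seed(1)
N = 200_000

# Correlated pair: Y = 0.6*X + 0.8*Z with X, Z independent N(0,1),
# so sigma_Y = 1 and rho_XY = 0.6 (illustrative values).
xs, ys = [], []
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    z = random.gauss(0.0, 1.0)
    xs.append(x)
    ys.append(0.6 * x + 0.8 * z)

mx = sum(xs) / N
my = sum(ys) / N
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / N)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / N)

rho = cov / (sx * sy)   # normalized as in (10-14)
print(rho)              # close to 0.6
```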
Example 10.2: Let Z = aX + bY. Determine the variance of Z in terms of σ_X, σ_Y and ρ_XY.
Solution: μ_Z = E(Z) = E(aX + bY) = aμ_X + bμ_Y, and hence
$$\sigma_Z^2 = \mathrm{Var}(Z) = E\big[(Z-\mu_Z)^2\big] = E\big[\big(a(X-\mu_X) + b(Y-\mu_Y)\big)^2\big]$$
$$= a^2 E(X-\mu_X)^2 + 2ab\,E\big[(X-\mu_X)(Y-\mu_Y)\big] + b^2 E(Y-\mu_Y)^2$$
$$= a^2\sigma_X^2 + 2ab\,\rho_{XY}\,\sigma_X\sigma_Y + b^2\sigma_Y^2. \tag{10-23}$$
In particular if X and Y are independent, then ρ XY = 0, and
(10-23) reduces to
$$\sigma_Z^2 = a^2\sigma_X^2 + b^2\sigma_Y^2. \tag{10-24}$$
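Formula (10-23) is easy to check by simulation. A sketch with illustrative values a = 2, b = 1, σ_X = σ_Y = 1, ρ_XY = 0.6, for which (10-23) predicts 4 + 2.4 + 1 = 7.4:

```python
import random

random.seed(2)
N = 200_000
a, b = 2.0, 1.0

# X, Y jointly Gaussian with sigma_X = sigma_Y = 1 and rho_XY = 0.6,
# constructed from independent standard normals (illustrative values).
zs = []
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    y = 0.6 * x + 0.8 * random.gauss(0.0, 1.0)
    zs.append(a * x + b * y)

m = sum(zs) / N
var_z = sum((z - m) ** 2 for z in zs) / N

# (10-23): a^2*sx^2 + 2ab*rho*sx*sy + b^2*sy^2 = 4 + 2.4 + 1 = 7.4
print(var_z)  # close to 7.4
```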
where
$$\mu_Z \triangleq a\mu_X + b\mu_Y, \tag{10-34}$$
$$\sigma_Z^2 \triangleq a^2\sigma_X^2 + 2\rho\,ab\,\sigma_X\sigma_Y + b^2\sigma_Y^2. \tag{10-35}$$
Notice that (10-33) has the same form as (10-31), and hence we conclude that Z = aX + bY is also Gaussian with mean and variance as in (10-34) - (10-35), which also agrees with (10-23).
From the previous example, we conclude that any linear combination of jointly Gaussian r.vs generates a Gaussian r.v.
In other words, linearity preserves Gaussianity. We can use
the characteristic function relation to conclude an even more
general result.
Example 10.4: Suppose X and Y are jointly Gaussian r.vs as
in the previous example. Define two linear combinations
Z = aX + bY , W = cX + dY . (10-36)
$$\Phi_{ZW}(u,v) = E\big(e^{j(Zu+Wv)}\big) = E\big(e^{j(aX+bY)u + j(cX+dY)v}\big) = E\big(e^{jX(au+cv) + jY(bu+dv)}\big) = \Phi_{XY}(au+cv,\; bu+dv). \tag{10-37}$$
where
$$\mu_Z = a\mu_X + b\mu_Y, \tag{10-39}$$
$$\mu_W = c\mu_X + d\mu_Y, \tag{10-40}$$
$$\sigma_Z^2 = a^2\sigma_X^2 + 2ab\,\rho\,\sigma_X\sigma_Y + b^2\sigma_Y^2, \tag{10-41}$$
$$\sigma_W^2 = c^2\sigma_X^2 + 2cd\,\rho\,\sigma_X\sigma_Y + d^2\sigma_Y^2, \tag{10-42}$$
and
$$\rho_{ZW} = \frac{ac\,\sigma_X^2 + (ad+bc)\,\rho\,\sigma_X\sigma_Y + bd\,\sigma_Y^2}{\sigma_Z\sigma_W}. \tag{10-43}$$
From (10-38), we conclude that Z and W are also jointly
distributed Gaussian r.vs with means, variances and
correlation coefficient as in (10-39) - (10-43).
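The correlation coefficient (10-43) can likewise be verified numerically. With the illustrative choices a = b = 1, c = 1, d = -1 and σ_X = σ_Y = 1, ρ = 0.6, the numerator of (10-43) is 1 + 0 - 1 = 0, so Z = X + Y and W = X - Y should come out uncorrelated:

```python
import math
import random

random.seed(3)
N = 200_000
a, b, c, d = 1.0, 1.0, 1.0, -1.0  # illustrative coefficients

zs, ws = [], []
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    y = 0.6 * x + 0.8 * random.gauss(0.0, 1.0)  # sigma_Y = 1, rho_XY = 0.6
    zs.append(a * x + b * y)
    ws.append(c * x + d * y)

mz, mw = sum(zs) / N, sum(ws) / N
cov = sum((z - mz) * (w - mw) for z, w in zip(zs, ws)) / N
sz = math.sqrt(sum((z - mz) ** 2 for z in zs) / N)
sw = math.sqrt(sum((w - mw) ** 2 for w in ws) / N)

rho_zw = cov / (sz * sw)
# (10-43) numerator: ac + (ad+bc)*0.6 + bd = 1 + 0 - 1 = 0
print(rho_zw)  # close to 0
```

For jointly Gaussian Z and W, uncorrelated also means independent, so this particular pair of linear combinations is independent.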
To summarize, any two linear combinations of jointly
Gaussian random variables (independent or dependent) are
also jointly Gaussian r.vs.
Fig. 10.3 (Gaussian input → Linear operator → Gaussian output)
Of course, we could have reached the same conclusion by
deriving the joint p.d.f f ZW ( z , w) using the technique
developed in section 9 (refer (7-29)).
Gaussian random variables are also interesting because of
the following result:
Central Limit Theorem: Suppose X_1, X_2, …, X_n are a set of zero-mean independent, identically distributed (i.i.d.) random variables with some common distribution. Consider their scaled sum
$$Y = \frac{X_1 + X_2 + \cdots + X_n}{\sqrt{n}}. \tag{10-44}$$
Then asymptotically (as n → ∞)
$$Y \to N(0, \sigma^2). \tag{10-45}$$
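The convergence in (10-45) can be watched numerically. A sketch using zero-mean uniforms on (-1/2, 1/2), so σ² = 1/12, with the illustrative choice n = 30, checking the first two moments of Y and one probability against N(0, σ²):

```python
import math
import random

random.seed(4)
n, trials = 30, 100_000
sigma = math.sqrt(1.0 / 12.0)  # std of Uniform(-1/2, 1/2)

# Y = (X_1 + ... + X_n) / sqrt(n) for zero-mean i.i.d. uniforms, as in (10-44)
ys = [sum(random.random() - 0.5 for _ in range(n)) / math.sqrt(n)
      for _ in range(trials)]

mean = sum(ys) / trials
var = sum((y - mean) ** 2 for y in ys) / trials
frac = sum(1 for y in ys if y <= sigma) / trials  # should approach Phi(1)

print(mean, var, frac)  # ~0, ~1/12 ≈ 0.0833, ~0.841
```

Note the 1/√n scaling: dividing the sum by n instead would drive Y to the constant 0 (the law of large numbers), while √n keeps the variance fixed at σ² so a nondegenerate limit emerges.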
Proof: Although the theorem is true under even more
general conditions, we shall prove it here under the
independence assumption. Let σ 2 represent their common
variance. Since
$$E(X_i) = 0, \tag{10-46}$$
we have
$$\mathrm{Var}(X_i) = E(X_i^2) = \sigma^2. \tag{10-47}$$
Consider
$$\Phi_Y(u) = E\big(e^{jYu}\big) = E\big(e^{j(X_1+X_2+\cdots+X_n)u/\sqrt{n}}\big) = \prod_{i=1}^{n} E\big(e^{jX_i u/\sqrt{n}}\big) = \prod_{i=1}^{n} \Phi_{X_i}\big(u/\sqrt{n}\big), \tag{10-48}$$
where the product form follows from the independence of the X_i.