Two Functions of Two Random Variables: Z = g(X, Y), W = h(X, Y)


9. Two Functions of Two Random Variables

If g(x, y) and h(x, y) are continuous and differentiable functions, then, as in the case of one random variable (see (5-30)), it is possible to develop a formula to obtain the joint p.d.f f_ZW(z, w) directly. Towards this, consider the equations

    g(x, y) = z,   h(x, y) = w.                                        (9-13)

For a given point (z, w), equation (9-13) can have many solutions. Let us say

    (x_1, y_1), (x_2, y_2), ..., (x_n, y_n)

represent these multiple solutions such that (see Fig. 9.3)

    g(x_i, y_i) = z,   h(x_i, y_i) = w.                                (9-14)

[Fig. 9.3: (a) the rectangle with corners (z, w), (z + Δz, w), (z, w + Δw), (z + Δz, w + Δw) in the zw plane; (b) its preimages, the small regions Δ_1, Δ_2, ..., Δ_i, ..., Δ_n around the solution points (x_1, y_1), (x_2, y_2), ..., (x_i, y_i), ..., (x_n, y_n) in the xy plane.]

Consider the problem of evaluating the probability

    P(z < Z ≤ z + Δz, w < W ≤ w + Δw)
        = P(z < g(X, Y) ≤ z + Δz, w < h(X, Y) ≤ w + Δw).               (9-15)

Using (7-9) we can rewrite (9-15) as

    P(z < Z ≤ z + Δz, w < W ≤ w + Δw) = f_ZW(z, w) Δz Δw.              (9-16)

But to translate this probability in terms of f_XY(x, y), we need to evaluate the equivalent region for Δz Δw in the xy plane. Towards this, referring to Fig. 9.4, we observe that the point A with coordinates (z, w) gets mapped onto the point A′ with coordinates (x_i, y_i) (as well as onto other points, as in Fig. 9.3(b)). As z changes to z + Δz, taking A to the point B in Fig. 9.4(a), let B′ represent the image of B in the xy plane. Similarly, as w changes to w + Δw, taking A to C, let C′ represent the image of C in the xy plane.

[Fig. 9.4: (a) the rectangle ABCD with sides Δz and Δw in the zw plane; (b) its image, the parallelogram A′B′C′D′ of area Δ_i at (x_i, y_i) in the xy plane, with side A′B′ at angle φ and side A′C′ at angle θ to the x axis.]

Finally, D goes to D′, and A′B′C′D′ represents the equivalent parallelogram in the xy plane with area Δ_i. Referring back to Fig. 9.3, the probability in (9-16) can be alternatively expressed as

    P( (X, Y) ∈ ⋃_i Δ_i ) = Σ_i f_XY(x_i, y_i) Δ_i.                    (9-17)

Equating (9-16) and (9-17) we obtain

    f_ZW(z, w) = Σ_i f_XY(x_i, y_i) Δ_i / (Δz Δw).                     (9-18)

To simplify (9-18), we need to evaluate the area Δ_i of the parallelograms in Fig. 9.3(b) in terms of Δz Δw. Towards this, let g_1 and h_1 denote the inverse transformation in (9-14), so that

    x_i = g_1(z, w),   y_i = h_1(z, w).                                (9-19)
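
The root pairs (x_i, y_i) in (9-14), and the corresponding inverse branches (g_1, h_1) of (9-19), can be found symbolically for concrete functions. The following is a minimal sketch of ours (not part of the text), assuming SymPy and an arbitrarily chosen pair g = x², h = x + y that has two root branches:

    # Root pairs of g(x,y) = z, h(x,y) = w for a sample choice of functions,
    # illustrating (9-14) and (9-19).  The pair g = x**2, h = x + y is hypothetical.
    import sympy as sp

    x, y, z, w = sp.symbols('x y z w', real=True)
    roots = sp.solve([sp.Eq(x**2, z), sp.Eq(x + y, w)], [x, y], dict=True)
    for i, r in enumerate(roots, 1):
        # each branch plays the role of (g1, h1) in (9-19)
        print(f"x_{i} = {r[x]},  y_{i} = {r[y]}")
    # two branches: (sqrt(z), w - sqrt(z)) and (-sqrt(z), w + sqrt(z))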

As the point (z, w) goes to (x_i, y_i) ≡ A′, the point (z + Δz, w) → B′, the point (z, w + Δw) → C′, and the point (z + Δz, w + Δw) → D′. Hence the respective x and y coordinates of B′ are given by

    g_1(z + Δz, w) = g_1(z, w) + (∂g_1/∂z) Δz = x_i + (∂g_1/∂z) Δz,    (9-20)

and

    h_1(z + Δz, w) = h_1(z, w) + (∂h_1/∂z) Δz = y_i + (∂h_1/∂z) Δz.    (9-21)

Similarly those of C′ are given by

    x_i + (∂g_1/∂w) Δw,   y_i + (∂h_1/∂w) Δw.                          (9-22)

The area of the parallelogram A′B′C′D′ in Fig. 9.4(b) is given by

    Δ_i = (A′B′)(A′C′) sin(θ − φ)
        = (A′B′ cos φ)(A′C′ sin θ) − (A′B′ sin φ)(A′C′ cos θ).         (9-23)

But from Fig. 9.4(b) and (9-20) - (9-22),

    A′B′ cos φ = (∂g_1/∂z) Δz,   A′C′ sin θ = (∂h_1/∂w) Δw,            (9-24)

    A′B′ sin φ = (∂h_1/∂z) Δz,   A′C′ cos θ = (∂g_1/∂w) Δw,            (9-25)

so that

    Δ_i = (∂g_1/∂z · ∂h_1/∂w − ∂g_1/∂w · ∂h_1/∂z) Δz Δw,               (9-26)

and

    Δ_i / (Δz Δw) = ∂g_1/∂z · ∂h_1/∂w − ∂g_1/∂w · ∂h_1/∂z
                  = det [ ∂g_1/∂z   ∂g_1/∂w ; ∂h_1/∂z   ∂h_1/∂w ].     (9-27)

The right side of (9-27) represents the Jacobian J(z, w) of the transformation in (9-19).
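
To make the geometry behind (9-23) - (9-27) concrete, the following numerical sketch (our addition, not from the text; NumPy assumed, with an arbitrary smooth inverse map) maps the corners of a small Δz-by-Δw rectangle and checks that the image area matches the determinant prediction.

    # Numerical illustration of (9-23)-(9-27): the image of a small dz-by-dw
    # rectangle under an inverse map (g1, h1) is approximately a parallelogram
    # whose area is |dg1/dz*dh1/dw - dg1/dw*dh1/dz| dz dw.  The map below is an
    # arbitrary smooth example, not one from the text.
    import numpy as np

    def g1(z, w): return z**2 + w          # x = g1(z, w)
    def h1(z, w): return z * w             # y = h1(z, w)

    z0, w0, dz, dw = 1.3, 0.7, 1e-4, 1e-4

    # corners of the rectangle in the zw plane, taken in cyclic order (A, B, D, C)
    corners = [(z0, w0), (z0 + dz, w0), (z0 + dz, w0 + dw), (z0, w0 + dw)]
    pts = np.array([(g1(zz, ww), h1(zz, ww)) for zz, ww in corners])

    # shoelace formula for the area of the image quadrilateral A'B'D'C'
    xs, ys = pts[:, 0], pts[:, 1]
    area = 0.5 * abs(np.dot(xs, np.roll(ys, -1)) - np.dot(ys, np.roll(xs, -1)))

    # prediction from (9-26)/(9-27): here dg1/dz = 2z, dg1/dw = 1, dh1/dz = w, dh1/dw = z
    jac = abs(2 * z0**2 - 1 * w0)
    print(area, jac * dz * dw)             # the two numbers agree to leading order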


Thus

    J(z, w) = det [ ∂g_1/∂z   ∂g_1/∂w ; ∂h_1/∂z   ∂h_1/∂w ].           (9-28)

Substituting (9-27) - (9-28) into (9-18), we get

    f_ZW(z, w) = Σ_i |J(z, w)| f_XY(x_i, y_i)
               = Σ_i (1 / |J(x_i, y_i)|) f_XY(x_i, y_i),               (9-29)

since

    |J(z, w)| = 1 / |J(x_i, y_i)|,                                     (9-30)

where J(x_i, y_i) represents the Jacobian of the original transformation in (9-13), given by

    J(x_i, y_i) = det [ ∂g/∂x   ∂g/∂y ; ∂h/∂x   ∂h/∂y ] |_{x = x_i, y = y_i}.      (9-31)
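
Since the whole recipe rests on the reciprocal relation (9-30), a symbolic spot check is easy to run. The sketch below is ours (SymPy assumed; the invertible transform z = xy, w = y/x is chosen only for illustration); it computes both Jacobians and confirms that their product is 1.

    # Symbolic check of (9-30) with SymPy: the Jacobian J(z,w) of the inverse map
    # is the reciprocal of the forward Jacobian J(x,y).  The one-to-one transform
    # z = x*y, w = y/x (x, y > 0) is a hypothetical example, not one from the text.
    import sympy as sp

    x, y, z, w = sp.symbols('x y z w', positive=True)

    g, h = x * y, y / x                                            # forward map as in (9-13)
    J_xy = sp.Matrix([[sp.diff(g, x), sp.diff(g, y)],
                      [sp.diff(h, x), sp.diff(h, y)]]).det()       # J(x, y), cf. (9-31)

    sol = sp.solve([sp.Eq(g, z), sp.Eq(h, w)], [x, y], dict=True)[0]
    g1, h1 = sol[x], sol[y]                                        # inverse map, cf. (9-19)

    J_zw = sp.Matrix([[sp.diff(g1, z), sp.diff(g1, w)],
                      [sp.diff(h1, z), sp.diff(h1, w)]]).det()     # J(z, w), cf. (9-28)

    # express J(x,y) in terms of (z, w) and verify J(z,w) * J(x,y) = 1, i.e. (9-30)
    print(sp.simplify(J_zw * J_xy.subs({x: g1, y: h1})))           # -> 1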

Next we shall illustrate the usefulness of the formula in (9-29) through various examples.

Example 9.2: Suppose X and Y are zero mean independent Gaussian r.vs with common variance σ². Define Z = √(X² + Y²), W = tan⁻¹(Y/X), where |w| ≤ π/2. Obtain f_ZW(z, w).

Solution: Here

    f_XY(x, y) = (1 / 2πσ²) e^{−(x² + y²)/2σ²}.                        (9-32)

Since

    z = g(x, y) = √(x² + y²);   w = h(x, y) = tan⁻¹(y/x),  |w| ≤ π/2,  (9-33)

if (x_1, y_1) is a solution pair, so is (−x_1, −y_1). From (9-33),

    y/x = tan w,  or  y = x tan w.                                     (9-34)

Substituting this into z, we get

    z = √(x² + y²) = x √(1 + tan² w) = x sec w,  or  x = z cos w,      (9-35)

and

    y = x tan w = z sin w.                                             (9-36)

Thus there are two solution sets

    x_1 = z cos w,  y_1 = z sin w;   x_2 = −z cos w,  y_2 = −z sin w.  (9-37)

We can use (9-35) - (9-37) to obtain J(z, w). From (9-28),

    J(z, w) = det [ ∂x/∂z   ∂x/∂w ; ∂y/∂z   ∂y/∂w ]
            = det [ cos w   −z sin w ; sin w   z cos w ] = z,          (9-38)

so that

    |J(z, w)| = z.                                                     (9-39)

We can also compute J(x, y) using (9-31). From (9-33),

    J(x, y) = det [ x/√(x² + y²)   y/√(x² + y²) ; −y/(x² + y²)   x/(x² + y²) ]
            = 1/√(x² + y²) = 1/z.                                      (9-40)

Notice that |J(z, w)| = 1/|J(x_i, y_i)|, agreeing with (9-30). Substituting (9-37) and (9-39) or (9-40) into (9-29), we get

    f_ZW(z, w) = z ( f_XY(x_1, y_1) + f_XY(x_2, y_2) )
               = (z / πσ²) e^{−z²/2σ²},   0 < z < ∞,  |w| < π/2.       (9-41)

Thus

    f_Z(z) = ∫_{−π/2}^{π/2} f_ZW(z, w) dw = (z/σ²) e^{−z²/2σ²},   0 < z < ∞,       (9-42)

which represents a Rayleigh r.v with parameter σ², and

    f_W(w) = ∫_0^∞ f_ZW(z, w) dz = 1/π,   |w| < π/2,                   (9-43)

which represents a uniform r.v in the interval (−π/2, π/2).
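
A Monte Carlo check of (9-42) and (9-43) is straightforward. The sketch below is ours, not part of the text (NumPy assumed; σ = 2 chosen arbitrarily): histograms of simulated Z and W should track the Rayleigh and uniform densities.

    # Monte Carlo check of Example 9.2: Z = sqrt(X^2 + Y^2) should follow the
    # Rayleigh density (9-42) and W = arctan(Y/X) the uniform density (9-43).
    import numpy as np

    rng = np.random.default_rng(0)
    sigma, n = 2.0, 200_000
    X = rng.normal(0.0, sigma, n)
    Y = rng.normal(0.0, sigma, n)

    Z = np.hypot(X, Y)                    # sqrt(X**2 + Y**2)
    W = np.arctan(Y / X)                  # principal value, so |W| < pi/2

    z0 = 2.0
    f_Z = z0 / sigma**2 * np.exp(-z0**2 / (2 * sigma**2))               # (9-42)
    hist_Z, edges = np.histogram(Z, bins=200, range=(0, 10), density=True)
    print(f_Z, hist_Z[np.searchsorted(edges, z0) - 1])                  # close agreement

    hist_W, _ = np.histogram(W, bins=50, range=(-np.pi/2, np.pi/2), density=True)
    print(1 / np.pi, hist_W.min(), hist_W.max())                        # all near 1/pi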

Moreover, by direct computation,

    f_ZW(z, w) = f_Z(z) · f_W(w),                                      (9-44)

implying that Z and W are independent. We summarize these results in the following statement: If X and Y are zero mean independent Gaussian random variables with common variance, then √(X² + Y²) has a Rayleigh distribution and tan⁻¹(Y/X) has a uniform distribution. Moreover, these two derived r.vs are statistically independent. Alternatively, with X and Y as independent zero mean r.vs as in (9-32), X + jY represents a complex Gaussian r.v. But

    X + jY = Z e^{jW},                                                 (9-45)

where Z and W are as in (9-33), except that for (9-45) to hold good on the entire complex plane we must have −π < W ≤ π. It then follows that the magnitude and phase of a complex Gaussian r.v are independent, with Rayleigh and uniform (U ∼ (−π, π)) distributions respectively. The statistical independence of these derived r.vs is an interesting observation.

Example 9.3: Let X and Y be independent exponential random variables with common parameter λ. Define U = X + Y, V = X − Y. Find the joint and marginal p.d.f of U and V.

Solution: It is given that

    f_XY(x, y) = (1/λ²) e^{−(x + y)/λ},   x > 0,  y > 0.               (9-46)

Now since u = x + y and v = x − y, we always have |v| < u, and there is only one solution, given by

    x = (u + v)/2,   y = (u − v)/2.                                    (9-47)


Moreover, the Jacobian of the transformation is given by

    J(x, y) = det [ 1   1 ; 1   −1 ] = −2,

and hence

    f_UV(u, v) = (1/2λ²) e^{−u/λ},   0 < |v| < u < ∞,                  (9-48)

represents the joint p.d.f of U and V. This gives

    f_U(u) = ∫_{−u}^{u} f_UV(u, v) dv = (1/2λ²) e^{−u/λ} ∫_{−u}^{u} dv
           = (u/λ²) e^{−u/λ},   0 < u < ∞,                             (9-49)

and

    f_V(v) = ∫_{|v|}^{∞} f_UV(u, v) du = (1/2λ²) ∫_{|v|}^{∞} e^{−u/λ} du
           = (1/2λ) e^{−|v|/λ},   −∞ < v < ∞.                          (9-50)

Notice that in this case the r.vs U and V are not independent.
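
Before moving on, here is a quick Monte Carlo check of (9-48) - (9-50) (our sketch, not from the text; NumPy assumed, λ = 1.5 chosen arbitrarily): sampled values of U = X + Y and V = X − Y should follow the two marginal densities just derived.

    # Monte Carlo check of Example 9.3 with i.i.d. exponential X and Y.
    import numpy as np

    rng = np.random.default_rng(1)
    lam, n = 1.5, 200_000
    X = rng.exponential(lam, n)           # NumPy's "scale" equals lambda in (9-46)
    Y = rng.exponential(lam, n)
    U, V = X + Y, X - Y

    assert np.all(np.abs(V) < U)          # the support constraint |v| < u in (9-48)

    u0, v0 = 2.0, 1.0
    f_U = u0 / lam**2 * np.exp(-u0 / lam)            # (9-49), a gamma(2, lambda) density
    f_V = np.exp(-abs(v0) / lam) / (2 * lam)         # (9-50), a Laplace density
    hU, eU = np.histogram(U, bins=200, range=(0, 15), density=True)
    hV, eV = np.histogram(V, bins=200, range=(-10, 10), density=True)
    print(f_U, hU[np.searchsorted(eU, u0) - 1])      # close agreement
    print(f_V, hV[np.searchsorted(eV, v0) - 1])      # close agreement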

As we show below, the general transformation formula in (9-29), which makes use of two functions, can be made useful even when only one function is specified.

Auxiliary Variables: Suppose

    Z = g(X, Y),                                                       (9-51)

where X and Y are two random variables. To determine f_Z(z) by making use of the above formulation in (9-29), we can define an auxiliary variable

    W = X   or   W = Y,                                                (9-52)

and the p.d.f of Z can then be obtained from f_ZW(z, w) by proper integration.

Example 9.4: Suppose Z = X + Y and let W = Y, so that the transformation is one-to-one and the solution is given by y_1 = w, x_1 = z − w. The Jacobian of the transformation is given by

    J(x, y) = det [ 1   1 ; 0   1 ] = 1,

and hence

    f_ZW(z, w) = f_XY(x_1, y_1) = f_XY(z − w, w),

or

    f_Z(z) = ∫_{−∞}^{+∞} f_ZW(z, w) dw = ∫_{−∞}^{+∞} f_XY(z − w, w) dw,            (9-53)

which agrees with (8-7). Note that (9-53) reduces to the convolution of f_X(z) and f_Y(z) if X and Y are independent random variables.
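
Formula (9-53) is easy to exercise numerically. The sketch below (our illustration; NumPy assumed, with two independent U(0,1) variables chosen for simplicity) evaluates the convolution integral on a grid and compares it with a histogram of simulated Z = X + Y; both reproduce the triangular density on (0, 2).

    # Numerical illustration of the convolution form of (9-53).
    import numpy as np

    dw = 0.001
    grid = np.arange(-1.0, 3.0, dw)                          # grid for the dummy variable w
    f = lambda t: ((t >= 0) & (t <= 1)).astype(float)        # U(0,1) density, used for X and Y

    z_vals = np.array([0.25, 1.0, 1.75])
    # f_Z(z) = integral of f_X(z - w) f_Y(w) dw, approximated by a Riemann sum
    f_Z = np.array([np.sum(f(z - grid) * f(grid)) * dw for z in z_vals])
    print(f_Z)                                               # ~ [0.25, 1.0, 0.25]

    rng = np.random.default_rng(2)
    Z = rng.random(200_000) + rng.random(200_000)
    hist, edges = np.histogram(Z, bins=80, range=(0, 2), density=True)
    print(hist[np.searchsorted(edges, z_vals) - 1])          # close to the values above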

Next, we consider a less trivial example.

Example 9.5: Let X ∼ U(0, 1) and Y ∼ U(0, 1) be independent, and define

    Z = (−2 ln X)^{1/2} cos(2πY).                                      (9-54)

Find the density function of Z.

Solution: We can make use of the auxiliary variable W = Y in this case. This gives the only solution to be

    x_1 = e^{−(z sec(2πw))²/2},                                        (9-55)
    y_1 = w,                                                           (9-56)

and using (9-28),

    J(z, w) = det [ ∂x_1/∂z   ∂x_1/∂w ; ∂y_1/∂z   ∂y_1/∂w ]
            = det [ −z sec²(2πw) e^{−(z sec(2πw))²/2}   ∂x_1/∂w ; 0   1 ]
            = −z sec²(2πw) e^{−(z sec(2πw))²/2}.                       (9-57)

Substituting (9-55) - (9-57) into (9-29), we obtain

    f_ZW(z, w) = z sec²(2πw) e^{−(z sec(2πw))²/2},   −∞ < z < +∞,  0 < w < 1,      (9-58)

and

    f_Z(z) = ∫_0^1 f_ZW(z, w) dw = e^{−z²/2} ∫_0^1 z sec²(2πw) e^{−(z tan(2πw))²/2} dw.   (9-59)

Let u = z tan(2πw), so that du = 2πz sec²(2πw) dw. Notice that as w varies from 0 to 1, u varies from −∞ to +∞. Using this in (9-59), we get

    f_Z(z) = (1/√(2π)) e^{−z²/2} ∫_{−∞}^{+∞} (1/√(2π)) e^{−u²/2} du
           = (1/√(2π)) e^{−z²/2},   −∞ < z < ∞,                        (9-60)

which represents a zero mean Gaussian r.v with unit variance. Thus Z ∼ N(0, 1). Equation (9-54) can be used as a practical procedure to generate Gaussian random variables from two independent uniformly distributed random sequences.
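
That closing remark is essentially the Box-Muller idea. A minimal sketch of ours (NumPy assumed) using (9-54) as a generator and checking the first two sample moments:

    # Generating Gaussian samples from two independent U(0,1) sequences via (9-54).
    import numpy as np

    rng = np.random.default_rng(3)
    n = 500_000
    X = rng.random(n)                     # X ~ U(0,1)
    Y = rng.random(n)                     # Y ~ U(0,1), independent of X

    Z = np.sqrt(-2.0 * np.log(X)) * np.cos(2.0 * np.pi * Y)      # equation (9-54)

    # sample statistics should be close to those of N(0,1)
    print(Z.mean(), Z.var())              # ~ 0 and ~ 1
    print(np.mean(Z <= 1.0))              # ~ 0.8413, the standard normal CDF at 1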
