
Multivariate Distributions

Outline

• Jointly Distributed Random Variables
• Independence
• Functions of several random variables
• Expected value of a function of several variables
• Covariance
• Correlation
Jointly Distributed Random Variables
Definition
The function f(x,y) is a joint probability distribution of the discrete
random variables X and Y if
1. $f(x,y) \ge 0$ for all $(x,y)$
2. $\sum_x \sum_y f(x,y) = 1$
3. $P(X = x, Y = y) = f(x,y)$
For any region $A$ in the $xy$ plane, $P[(X,Y) \in A] = \sum\sum_{(x,y) \in A} f(x,y)$.
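The definition can be illustrated with a small computation. Below is a minimal sketch (not part of the original slides) that stores a discrete joint pmf as a Python dict keyed by (x, y), checks the two defining properties, and evaluates a region probability; the particular pmf values are placeholders.

```python
from fractions import Fraction as F

# placeholder joint pmf f(x, y), stored as {(x, y): probability}
f = {(0, 0): F(1, 4), (0, 1): F(1, 4), (1, 0): F(1, 4), (1, 1): F(1, 4)}

assert all(p >= 0 for p in f.values())   # property 1: f(x, y) >= 0 for all (x, y)
assert sum(f.values()) == 1              # property 2: the probabilities sum to 1

def prob(event):
    # P[(X, Y) in A]: add f(x, y) over all pairs (x, y) satisfying the event
    return sum(p for (x, y), p in f.items() if event(x, y))

print(prob(lambda x, y: x + y <= 1))     # P(X + Y <= 1) = 3/4 for this placeholder pmf
```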
Jointly Distributed Random Variables
Example
Two items are selected at random from a box containing 3 items of a first kind, 2 of a second kind, and 3 of a third kind. Let X be the number of items of the first kind and Y the number of items of the second kind in the selection.
(a) Find the joint probability distribution f(x,y).
(b) Find $P[(X,Y) \in A]$, where $A$ is the region $\{(x,y) \mid x + y \le 1\}$.
Jointly Distributed Random Variables
Solutions
(a) The joint probability distribution is
$$f(x,y) = \frac{\binom{3}{x}\binom{2}{y}\binom{3}{2-x-y}}{\binom{8}{2}}, \qquad x = 0,1,2;\ y = 0,1,2;\ 0 \le x + y \le 2.$$
Hence, for example, $f(0,1) = \dfrac{\binom{3}{0}\binom{2}{1}\binom{3}{1}}{\binom{8}{2}} = \dfrac{6}{28} = \dfrac{3}{14}$.
Jointly Distributed Random Variables
Solutions
(b) The probability that (X,Y) falls in the region A is
$$P[(X,Y) \in A] = P(X + Y \le 1) = f(0,0) + f(0,1) + f(1,0) = \frac{3}{28} + \frac{3}{14} + \frac{9}{28} = \frac{9}{14}.$$
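As a check, the joint distribution and the probabilities above can be reproduced numerically. This is a sketch, assuming Python with math.comb and exact arithmetic via fractions.

```python
from fractions import Fraction as F
from math import comb

# f(x, y) = C(3, x) C(2, y) C(3, 2 - x - y) / C(8, 2), for 0 <= x + y <= 2
f = {(x, y): F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
     for x in range(3) for y in range(3) if x + y <= 2}

print(f[(0, 1)])                                          # 3/14
print(sum(f.values()))                                    # 1
print(sum(p for (x, y), p in f.items() if x + y <= 1))    # 9/14
```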
Jointly Distributed Random Variables
Definition
The function f(x,y) is a joint density function of the continuous
random variables X and Y if

1. $f(x,y) \ge 0$ for all $(x,y)$
2. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)\,dx\,dy = 1$
3. $P[(X,Y) \in A] = \iint_A f(x,y)\,dx\,dy$, for any region $A$ in the $xy$ plane.


Jointly Distributed Random Variables
Example ***
A privately owned business operates both a drive-in facility and a walk-in facility. On a randomly selected day, let X and Y, respectively, be the proportions of the time that the drive-in and walk-in facilities are in use, and suppose that the joint density function of these random variables is
$$f(x,y) = \begin{cases} \dfrac{2}{5}(2x + 3y), & 0 \le x \le 1,\ 0 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases}$$
(a) Verify that $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)\,dx\,dy = 1$.
(b) Find $P[(X,Y) \in A]$, where $A = \{(x,y) \mid 0 < x < \tfrac{1}{2},\ \tfrac{1}{4} < y < \tfrac{1}{2}\}$.
Jointly Distributed Random Variables
Solution
(a) $\displaystyle \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)\,dx\,dy = \int_0^1\!\int_0^1 \frac{2}{5}(2x + 3y)\,dx\,dy = \int_0^1 \left[\frac{2x^2}{5} + \frac{6xy}{5}\right]_{x=0}^{x=1} dy$
$\displaystyle = \int_0^1 \left(\frac{2}{5} + \frac{6y}{5}\right) dy = \left[\frac{2y}{5} + \frac{3y^2}{5}\right]_0^1 = \frac{2}{5} + \frac{3}{5} = 1.$
Jointly Distributed Random Variables
Solution
(b) $\displaystyle P[(X,Y) \in A] = P\left(0 < X < \tfrac{1}{2},\ \tfrac{1}{4} < Y < \tfrac{1}{2}\right) = \int_{1/4}^{1/2}\!\int_0^{1/2} \frac{2}{5}(2x + 3y)\,dx\,dy$
$\displaystyle = \int_{1/4}^{1/2} \left[\frac{2x^2}{5} + \frac{6xy}{5}\right]_{x=0}^{x=1/2} dy = \int_{1/4}^{1/2} \left(\frac{1}{10} + \frac{3y}{5}\right) dy = \left[\frac{y}{10} + \frac{3y^2}{10}\right]_{1/4}^{1/2}$
$\displaystyle = \frac{1}{10}\left[\left(\frac{1}{2} + \frac{3}{4}\right) - \left(\frac{1}{4} + \frac{3}{16}\right)\right] = \frac{13}{160}.$
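Both parts can be verified symbolically; a sketch assuming sympy:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 5) * (2 * x + 3 * y)

# (a) the density integrates to 1 over the unit square
print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))                      # 1

# (b) P(0 < X < 1/2, 1/4 < Y < 1/2)
print(sp.integrate(f, (x, 0, sp.Rational(1, 2)),
                   (y, sp.Rational(1, 4), sp.Rational(1, 2))))    # 13/160
```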
Jointly Distributed Random Variables
Marginal Distributions
Given the joint probability distribution f(x,y) of the discrete random
variables X and Y, the probability distribution g(x) of X alone is
obtained by summing f(x,y) over the values of Y. Similarly, the
probability distribution h(y) of Y alone is obtained by summing f(x,y)
over the values of X. We define g(x) and h(y) to be the marginal
distributions of X and Y, respectively.
Definition
The marginal distributions of X alone and of Y alone are
$$g(x) = \sum_y f(x,y) \quad \text{and} \quad h(y) = \sum_x f(x,y)$$
for the discrete case, and
$$g(x) = \int_{-\infty}^{\infty} f(x,y)\,dy \quad \text{and} \quad h(y) = \int_{-\infty}^{\infty} f(x,y)\,dx$$
for the continuous case.


Jointly Distributed Random Variables
Example (discrete)
Show that the column and row totals in the table below give the marginal distribution of X alone and of Y alone.

            x = 0    x = 1    x = 2    Row totals h(y)
  y = 0      3/28     9/28     3/28        15/28
  y = 1      3/14     3/14      0           3/7
  y = 2      1/28      0        0           1/28
  g(x)       5/14    15/28     3/28          1
Jointly Distributed Random Variables
Solution
For X:
$$g(0) = f(0,0) + f(0,1) + f(0,2) = \frac{3}{28} + \frac{3}{14} + \frac{1}{28} = \frac{5}{14}$$
$$g(1) = f(1,0) + f(1,1) + f(1,2) = \frac{9}{28} + \frac{3}{14} + 0 = \frac{15}{28}$$
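The column and row totals can also be obtained programmatically from the joint pmf; a sketch reusing the dict-of-Fractions representation from earlier:

```python
from collections import defaultdict
from fractions import Fraction as F
from math import comb

f = {(x, y): F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
     for x in range(3) for y in range(3) if x + y <= 2}

g, h = defaultdict(F), defaultdict(F)
for (x, y), p in f.items():
    g[x] += p          # g(x): sum of f(x, y) over y (column total)
    h[y] += p          # h(y): sum of f(x, y) over x (row total)

print(dict(g))         # g(0) = 5/14, g(1) = 15/28, g(2) = 3/28
print(dict(h))         # h(0) = 15/28, h(1) = 3/7, h(2) = 1/28
```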
Jointly Distributed Random Variables
Example (continuous)
Find g(x) and h(y) for the joint distribution function of Example ***
Jointly Distributed Random Variables
Solution
$$g(x) = \int_{-\infty}^{\infty} f(x,y)\,dy = \int_0^1 \frac{2}{5}(2x + 3y)\,dy = \left[\frac{4xy}{5} + \frac{6y^2}{10}\right]_{y=0}^{y=1} = \frac{4x + 3}{5}$$
for $0 \le x \le 1$, and $g(x) = 0$ elsewhere.
$$h(y) = \int_{-\infty}^{\infty} f(x,y)\,dx = \int_0^1 \frac{2}{5}(2x + 3y)\,dx = \left[\frac{2x^2}{5} + \frac{6xy}{5}\right]_{x=0}^{x=1} = \frac{2(1 + 3y)}{5}$$
for $0 \le y \le 1$, and $h(y) = 0$ elsewhere.
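A symbolic check of these marginals (a sympy sketch):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 5) * (2 * x + 3 * y)

g = sp.integrate(f, (y, 0, 1))     # marginal of X, valid for 0 <= x <= 1
h = sp.integrate(f, (x, 0, 1))     # marginal of Y, valid for 0 <= y <= 1
print(sp.factor(g))                # (4*x + 3)/5
print(sp.factor(h))                # 2*(3*y + 1)/5
```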
Jointly Distributed Random Variables
Expectation
Definition
Let X and Y be random variables with joint probability
distribution f(x,y). The expected value of the random
variable g(X,Y) is
$$\mu_{g(X,Y)} = E[g(X,Y)] = \sum_x \sum_y g(x,y)\, f(x,y)$$
if X and Y are discrete, and
$$\mu_{g(X,Y)} = E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f(x,y)\,dx\,dy$$
if X and Y are continuous.


Jointly Distributed Random Variables
Expectation
Example (discrete)
Let X and Y be the random variables with joint probability distribution f(x,y) tabulated in the earlier example. Find the expected value of the random variable g(X,Y) = XY.
Jointly Distributed Random Variables
Expectation
Example (discrete)
Solution
$$\mu_{g(X,Y)} = E[g(X,Y)] = \sum_x \sum_y g(x,y)\, f(x,y)$$
$$E[XY] = \sum_{x=0}^{2} \sum_{y=0}^{2} x\,y\, f(x,y)$$
$$= (0)(0)f(0,0) + (0)(1)f(0,1) + (1)(0)f(1,0) + (1)(1)f(1,1) + (2)(0)f(2,0) + \cdots$$
$$= f(1,1) = \frac{3}{14}$$
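The same expectation, computed directly from the pmf (a sketch reusing the earlier dict representation):

```python
from fractions import Fraction as F
from math import comb

f = {(x, y): F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
     for x in range(3) for y in range(3) if x + y <= 2}

# E(XY) = sum over all (x, y) of x*y*f(x, y); only (1, 1) contributes here
print(sum(x * y * p for (x, y), p in f.items()))    # 3/14
```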
Jointly Distributed Random Variables
Expectation
Example (continuous)
Find E(Y/X) for the density function
$$f(x,y) = \begin{cases} \dfrac{x(1 + 3y^2)}{4}, & 0 < x < 2,\ 0 < y < 1, \\ 0, & \text{elsewhere.} \end{cases}$$
Jointly Distributed Random Variables
Expectation
Example (continuous)
Solution
$$E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f(x,y)\,dx\,dy$$
$$E\!\left[\frac{Y}{X}\right] = \int_0^1\!\int_0^2 \frac{y}{x}\cdot\frac{x(1 + 3y^2)}{4}\,dx\,dy = \int_0^1\!\int_0^2 \frac{y(1 + 3y^2)}{4}\,dx\,dy$$
$$= \int_0^1 \left[\frac{xy(1 + 3y^2)}{4}\right]_{x=0}^{x=2} dy = \int_0^1 \frac{y(1 + 3y^2)}{2}\,dy = \int_0^1 \frac{y + 3y^3}{2}\,dy = \frac{5}{8}$$
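A symbolic check of E(Y/X) (a sympy sketch):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x * (1 + 3 * y**2) / 4

# E(Y/X) = integral of (y/x) f(x, y) over 0 < x < 2, 0 < y < 1
print(sp.integrate((y / x) * f, (x, 0, 2), (y, 0, 1)))    # 5/8
```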
Jointly Distributed Random Variables
Covariance
Definition
Let X and Y be random variables with joint probability
distribution f(x,y). The covariance is

$$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = \sum_x \sum_y (x - \mu_X)(y - \mu_Y)\, f(x,y)$$
if X and Y are discrete, and
$$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y)\, f(x,y)\,dx\,dy$$
if X and Y are continuous.


Jointly Distributed Random Variables
Covariance
Theorem
$$\sigma_{XY} = E(XY) - \mu_X \mu_Y$$
Jointly Distributed Random Variables
Covariance
Theorem – proof (discrete)

$$\sigma_{XY} = \sum_x \sum_y (x - \mu_X)(y - \mu_Y)\, f(x,y)$$
$$\sigma_{XY} = \sum_x \sum_y xy\, f(x,y) - \mu_X \sum_x \sum_y y\, f(x,y) - \mu_Y \sum_x \sum_y x\, f(x,y) + \mu_X \mu_Y \sum_x \sum_y f(x,y)$$
Jointly Distributed Random Variables
Covariance
Theorem – proof (discrete)

Remember that $\mu_X = \sum_x \sum_y x\, f(x,y)$, $\mu_Y = \sum_x \sum_y y\, f(x,y)$,
$\sum_x \sum_y f(x,y) = 1$, and $E(XY) = \sum_x \sum_y xy\, f(x,y)$. Therefore
$$\sigma_{XY} = E(XY) - \mu_X \mu_Y - \mu_Y \mu_X + \mu_X \mu_Y$$
$$\sigma_{XY} = E(XY) - \mu_X \mu_Y$$
Jointly Distributed Random Variables
Covariance
Example (discrete) – C1
Find the covariance of X and Y for the joint distribution tabulated earlier, with marginals g(x) and h(y).
Jointly Distributed Random Variables
Covariance
Solution
$$\mu_X = \sum_{x=0}^{2} x\, g(x) = (0)\left(\frac{5}{14}\right) + (1)\left(\frac{15}{28}\right) + (2)\left(\frac{3}{28}\right) = \frac{3}{4}$$
$$\mu_Y = \sum_{y=0}^{2} y\, h(y) = (0)\left(\frac{15}{28}\right) + (1)\left(\frac{3}{7}\right) + (2)\left(\frac{1}{28}\right) = \frac{1}{2}$$
$$\sigma_{XY} = E(XY) - \mu_X \mu_Y = \frac{3}{14} - \left(\frac{3}{4}\right)\left(\frac{1}{2}\right) = -\frac{9}{56}$$
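The covariance can be checked by computing E(XY), $\mu_X$ and $\mu_Y$ directly from the joint pmf (a sketch):

```python
from fractions import Fraction as F
from math import comb

f = {(x, y): F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
     for x in range(3) for y in range(3) if x + y <= 2}

mu_X = sum(x * p for (x, y), p in f.items())          # 3/4
mu_Y = sum(y * p for (x, y), p in f.items())          # 1/2
E_XY = sum(x * y * p for (x, y), p in f.items())      # 3/14
print(E_XY - mu_X * mu_Y)                             # -9/56
```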
Jointly Distributed Random Variables
Covariance
Example (continuous) – C2
The fraction X of male runners and the fraction Y of female runners who compete in marathon races are described by the joint density function
$$f(x,y) = \begin{cases} 8xy, & 0 \le y \le x \le 1, \\ 0, & \text{elsewhere.} \end{cases}$$
Find the covariance of X and Y.
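The slides do not work this example out. The sketch below is one possible approach, not the slides' own solution: it computes the covariance symbolically with sympy, integrating over the triangle $0 \le y \le x \le 1$ and using $\sigma_{XY} = E(XY) - \mu_X \mu_Y$.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 8 * x * y

# expectations over the support 0 <= y <= x <= 1 (inner integral over y first)
mu_X = sp.integrate(x * f, (y, 0, x), (x, 0, 1))      # 4/5
mu_Y = sp.integrate(y * f, (y, 0, x), (x, 0, 1))      # 8/15
E_XY = sp.integrate(x * y * f, (y, 0, x), (x, 0, 1))  # 4/9
print(E_XY - mu_X * mu_Y)                             # 4/225
```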
Jointly Distributed Random Variables
Correlation coefficient
Definition
The correlation coefficient of X and Y is
$$\rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}$$
Jointly Distributed Random Variables
Correlation coefficient
Example (discrete)
Find the correlation coefficient between X and Y in Example C1
 5  2  15  2  3 
E(X ) = (0 )   + (1 )   + (2 )
27
=
2 2

 14   28   28  28
2  15  2  3  2  1 
E(X2) = (0 )   + (1 )   + ( 2 ) 
4
=
 28  7  28  7
2 2
27  3  45 4 1 9
 X2 = −  = and  Y2 = −  =
28  4  112 7 2 28

 XY − 9
1
 XY = = 56 =−
 XY  45   9  5
  
 112  28 
Jointly Distributed Random Variables
Correlation coefficient
Example (continuous)
Find the correlation coefficient between X and Y in Example C2
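The slides leave this computation to the reader; the sympy sketch below follows the same steps as the discrete case. The value it prints, $2\sqrt{66}/33 \approx 0.49$, is my own computation under these assumptions, not taken from the slides.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 8 * x * y
tri = ((y, 0, x), (x, 0, 1))       # the support 0 <= y <= x <= 1

def E(g):
    # E[g(X, Y)]: double integral of g * f over the triangle
    return sp.integrate(g * f, *tri)

cov = E(x * y) - E(x) * E(y)                        # 4/225
var_X = E(x**2) - E(x)**2                           # 2/75
var_Y = E(y**2) - E(y)**2                           # 11/225
print(sp.simplify(cov / sp.sqrt(var_X * var_Y)))    # 2*sqrt(66)/33, about 0.49
```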
