Chap 5 PME
PROBABILITY METHODS IN
ENGINEERING
Bakhtiar Ali
Assistant Professor,
Electrical Engineering,
COMSATS, Islamabad.
In this chapter we are, InShaAllah, going to study:
5.1 Two Random Variables
5.2 Pairs of Discrete Random Variables
5.2.1 Marginal Probability Mass Function
5.3 The Joint CDF of X And Y
5.4 The Joint Pdf Of Two Continuous Random Variables
5.5 Independence Of Two Random Variables
5.6 Joint Moments And Expected Values Of A Function Of
Two Random Variables
5.6.1 Expected Value Of A Function Of Two Random Variables
5.6.2 Joint Moments, Correlation, And Covariance
p_{X,Y}(x, y) = P[{X = x} ∩ {Y = y}] = P[X = x, Y = y]  for (x, y) ∈ R²
The probability of any event B is the sum of the pmf over the outcomes in B:
P[(X, Y) in B] = Σ_{(x_j, y_k) in B} p_{X,Y}(x_j, y_k)
When the event B is the entire sample space we have:
Σ_{j=1}^∞ Σ_{k=1}^∞ p_{X,Y}(x_j, y_k) = 1
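These two rules can be checked with a small sketch; the joint pmf below and the event B are illustrative, not from the text:

```python
from fractions import Fraction

# Illustrative joint pmf: a dict mapping (x, y) outcomes to probabilities.
p_xy = {
    (1, 1): Fraction(1, 4), (1, 2): Fraction(1, 4),
    (2, 1): Fraction(1, 4), (2, 2): Fraction(1, 4),
}

def prob_event(pmf, in_B):
    """P[(X, Y) in B]: sum the pmf over the outcomes in B."""
    return sum(p for (x, y), p in pmf.items() if in_B(x, y))

# When B is the entire sample space, the sum is 1 (normalization).
total = prob_event(p_xy, lambda x, y: True)
# An example event B = {X + Y <= 3}.
p_B = prob_event(p_xy, lambda x, y: x + y <= 3)
```

Using exact `Fraction` arithmetic keeps the normalization check free of floating-point error.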
Graphical representation of pmf’s
a) Table format
b) Use of arrows to show height
c) Labeled dots corresponding
to pmf values
Example 5.6: A random experiment consists of tossing two "loaded" dice and noting the pair of numbers (X, Y) facing up. The joint pmf p_{X,Y}(j, k) for j = 1, …, 6 and k = 1, …, 6 is given by the two-dimensional table shown in the figure. The (j, k) entry in the table contains the value p_{X,Y}(j, k). Find P[min(X, Y) = 3].
P[min(X, Y) = 3] = p_{X,Y}(6,3) + p_{X,Y}(5,3) + p_{X,Y}(4,3) + p_{X,Y}(3,3) + p_{X,Y}(3,4) + p_{X,Y}(3,5) + p_{X,Y}(3,6)
= 6(1/42) + 2/42 = 8/42.
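A quick sketch of this computation. The pmf cells below are hypothetical: only the seven cells the event touches are filled in, chosen to be consistent with the arithmetic above (six cells of 1/42 plus p_{X,Y}(3,3) = 2/42):

```python
from fractions import Fraction

# Hypothetical pmf cells for the loaded dice; only the cells with
# min(j, k) == 3 are listed, matching the sum shown above.
p = {
    (6, 3): Fraction(1, 42), (5, 3): Fraction(1, 42), (4, 3): Fraction(1, 42),
    (3, 3): Fraction(2, 42),
    (3, 4): Fraction(1, 42), (3, 5): Fraction(1, 42), (3, 6): Fraction(1, 42),
}

# The event {min(X, Y) = 3} is exactly the set of cells with min(j, k) == 3.
p_min3 = sum(q for (j, k), q in p.items() if min(j, k) == 3)
```

The predicate `min(j, k) == 3` picks out the same seven cells enumerated in the solution above.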
The joint pmf of X and Y provides the information about the joint behavior of X and Y. We are also interested in the probabilities of events involving each of the random variables in isolation. These can be found in terms of the marginal probability mass functions:
p_X(x_j) = P[X = x_j] = Σ_{k=1}^∞ p_{X,Y}(x_j, y_k)
and similarly
p_Y(y_k) = P[Y = y_k] = Σ_{j=1}^∞ p_{X,Y}(x_j, y_k)
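In a table the marginals are just row and column sums; a minimal sketch with an illustrative joint pmf (not the text's):

```python
from fractions import Fraction
from collections import defaultdict

# Illustrative joint pmf.
p_xy = {(1, 1): Fraction(1, 2), (1, 2): Fraction(1, 4), (2, 2): Fraction(1, 4)}

# p_X(x) sums p_{X,Y}(x, y) over all y; p_Y(y) sums over all x.
p_x, p_y = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), p in p_xy.items():
    p_x[x] += p
    p_y[y] += p
```

Each marginal is itself a pmf, so `sum(p_x.values())` must also equal 1.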
The joint cumulative distribution function of X and Y is defined as the probability of the event {X ≤ x₁} ∩ {Y ≤ y₁}:
F_{X,Y}(x₁, y₁) = P[X ≤ x₁, Y ≤ y₁]
The joint cdf satisfies the following properties.
The joint cdf is a non-decreasing function of x and y: if x₁ ≤ x₂ and y₁ ≤ y₂, then F_{X,Y}(x₁, y₁) ≤ F_{X,Y}(x₂, y₂).
X is uniformly distributed in the unit interval.
Example 5.12: The joint cdf for the vector of random variables X = (X, Y) is given by
P[X + Y ≤ 1] =
X and Y are independent random variables if any event A₁ defined in terms of X is independent of any event A₂ defined in terms of Y; that is,
P[X in A₁, Y in A₂] = P[X in A₁] P[Y in A₂]
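For discrete pairs this reduces to p_{X,Y}(x, y) = p_X(x) p_Y(y) for every (x, y). A minimal sketch of that check, with illustrative pmfs:

```python
from fractions import Fraction
from itertools import product

def independent(p_xy, p_x, p_y):
    """Check p_{X,Y}(x, y) == p_X(x) * p_Y(y) for every pair of values."""
    return all(p_xy.get((x, y), Fraction(0)) == p_x[x] * p_y[y]
               for x, y in product(p_x, p_y))

# Product-form pmf: independent by construction.
p_x = {0: Fraction(1, 3), 1: Fraction(2, 3)}
p_y = {0: Fraction(1, 2), 1: Fraction(1, 2)}
p_xy = {(x, y): p_x[x] * p_y[y] for x, y in product(p_x, p_y)}
ok = independent(p_xy, p_x, p_y)

# Here X determines Y (mass only on the diagonal), so the product test fails.
half = {0: Fraction(1, 2), 1: Fraction(1, 2)}
dep = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}
bad = independent(dep, half, half)
```

The second pmf has the same marginals as a fair product pmf, which shows that marginals alone never determine independence.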
p_Y(5 | 5) = 2/7
p_Y(2 | 5) = 1/7
Suppose Y is a continuous random variable. Define the conditional cdf of Y given X = x_k:
F_Y(y | x_k) = P[Y ≤ y | X = x_k]
An interesting corollary
5.8.1 One Function of Two Random Variables: Let the random variable Z be defined as a function of two random variables: Z = g(X, Y).
f_Y(y | x) = f_{X,Y}(x, y) / f_X(x)
Here (X, Y) is uniform on the unit disk, so f_{X,Y}(x, y) = 1/π for x² + y² ≤ 1.
f_X(x) = ∫_{-√(1-x²)}^{√(1-x²)} (1/π) dy = (2/π)√(1 - x²)
f_Y(y | x) = f_{X,Y}(x, y) / f_X(x) = (1/π) / ((2/π)√(1 - x²)) = 1 / (2√(1 - x²))
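A numerical sanity check of this derivation, assuming (X, Y) uniform on the unit disk: the marginal (2/π)√(1 − x²) should integrate to 1 over (−1, 1), and f_Y(y | x) should be constant in y:

```python
import math

def f_X(x):
    """Marginal derived above: (2/pi) * sqrt(1 - x^2) on (-1, 1)."""
    return (2.0 / math.pi) * math.sqrt(1.0 - x * x)

# Midpoint rule: f_X should integrate to 1 over (-1, 1).
n = 100_000
h = 2.0 / n
total = sum(f_X(-1.0 + (i + 0.5) * h) for i in range(n)) * h

# Conditional pdf at x = 0.5: (1/pi) / f_X(x) = 1 / (2 * sqrt(1 - x^2)),
# a constant in y, i.e. Y | X = x is uniform on (-sqrt(1-x^2), sqrt(1-x^2)).
x = 0.5
f_cond = (1.0 / math.pi) / f_X(x)
```

The midpoint rule is used because the integrand is finite but not differentiable at the endpoints ±1.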
E[Y | X = x] = ∫_{-∞}^{∞} y f_Y(y | x) dy
E[Y | X = x] = ∫_{-√(1-x²)}^{√(1-x²)} y / (2√(1 - x²)) dy
E[Y | X = x] = (1 / (2√(1 - x²))) · (y²/2) |_{-√(1-x²)}^{√(1-x²)} = 0
E[Y] = E[E[Y | X]] = 0
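A Monte Carlo sketch of the same conclusion, again assuming (X, Y) uniform on the unit disk: the sample mean of Y should be near 0.

```python
import random

random.seed(0)  # fixed seed for reproducibility

def sample_disk():
    """Rejection sampling: draw from the square until the point lands
    inside the unit disk, which gives a uniform point on the disk."""
    while True:
        x, y = random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            return x, y

n = 100_000
mean_y = sum(sample_disk()[1] for _ in range(n)) / n  # should be near E[Y] = 0
```

With Var(Y) = 1/4, the standard error of the mean is about 0.0016 at this sample size, so deviations of a few thousandths are expected.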
Example 5.34: X is selected at random from the unit interval; Y is then selected at random from the interval (0, X). Find the cdf of Y.
When X = x, Y is uniformly distributed in (0, x), so the conditional cdf given X = x is
F_Y(y | x) = y/x for 0 ≤ y ≤ x
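Integrating that conditional cdf against f_X(x) = 1 on (0, 1) gives F_Y(y) = y − y ln y for 0 < y < 1. A Monte Carlo sketch checking this (the sample size and test point are illustrative):

```python
import math
import random

random.seed(1)  # fixed seed for reproducibility

# X ~ Uniform(0, 1); given X = x, Y ~ Uniform(0, x).
# Equivalently Y = X * U with U an independent Uniform(0, 1).
n = 200_000
ys = [random.random() * random.random() for _ in range(n)]

y0 = 0.3
empirical = sum(1 for y in ys if y <= y0) / n
analytic = y0 - y0 * math.log(y0)  # F_Y(y) = y - y*ln(y) on (0, 1)
```

The empirical cdf at y₀ should agree with the analytic value to within a few thousandths at this sample size.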