Lec8 MTH305
Chapter 2: Probability
1. Introduction
2. Mean (Expected Value)
3. Variance
4. Covariance of Two Random Variables
5. Exercises
Introduction
The full picture of any random variable is contained in its probability distribution
function (pdf) or its cumulative distribution function (cdf).
In many cases, we want to summarize this detailed information about a R.V. in a few
numerical measures.
For example, answers to the following questions are important to decision makers:
1. What is the expected value of a random quantity?
2. What is the probability that the actual measured value of a random quantity deviates from the expected value?
Mean (Expected Value)
Definition
Let X be a R.V. with probability distribution function f(x). The mean or expected value of
X is defined by:
Discrete case:

\mu = E(X) = \sum_{x} x f(x).

Continuous case:

\mu = E(X) = \int_{-\infty}^{+\infty} x f(x)\, dx.
Example
If two coins are tossed 16 times and X is the number of heads that occur per toss, then the
values of X can be 0, 1 and 2.
Suppose that the experiment yields no heads, one head and two heads a total of 4, 7 and 5
times, respectively.
Then, the average number of heads per toss is

\frac{(0)(4) + (1)(7) + (2)(5)}{16} = \frac{17}{16} \approx 1.06.
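The frequency-weighted average above can be checked with a short script; this is a sketch using Python's `fractions` module for exact arithmetic:

```python
from fractions import Fraction

# Frequency-weighted average: 0 heads 4 times, 1 head 7 times, 2 heads 5 times.
counts = {0: 4, 1: 7, 2: 5}
total = sum(counts.values())                       # 16 tosses in total
average = sum(Fraction(x * n, total) for x, n in counts.items())
print(average, float(average))  # 17/16 = 1.0625
```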
Example
A lot containing 7 components is sampled by a quality inspector; the lot contains 4 good and 3
defective components. A sample of 3 is taken by the inspector. Find the expected value of the
number of good components in this sample.
Solution: Let X represent the number of good components in the sample. The probability
distribution of X is:

f(x) = \frac{\binom{4}{x}\binom{3}{3-x}}{\binom{7}{3}}, \quad x = 0, 1, 2, 3.

So

f(0) = \frac{1}{35}, \quad f(1) = \frac{12}{35}, \quad f(2) = \frac{18}{35}, \quad f(3) = \frac{4}{35}.

Thus

\mu = E(X) = (0)\left(\frac{1}{35}\right) + (1)\left(\frac{12}{35}\right) + (2)\left(\frac{18}{35}\right) + (3)\left(\frac{4}{35}\right) = \frac{12}{7} \approx 1.7.

Thus, if a sample of size 3 is selected at random over and over again from a lot of 4 good and
3 defective components, it would contain, on average, 1.7 good components.
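This hypergeometric pmf and its mean can be reproduced with `math.comb` (a sketch; exact fractions avoid rounding):

```python
from fractions import Fraction
from math import comb

# Hypergeometric pmf: 4 good, 3 defective, sample of size 3 (the example above).
f = {x: Fraction(comb(4, x) * comb(3, 3 - x), comb(7, 3)) for x in range(4)}
mean = sum(x * p for x, p in f.items())
print(mean)  # 12/7, i.e. about 1.714
```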
Example
Let X be a random variable that denotes the life, in hours, of a certain electronic device. The
probability density function is:

f(x) = \begin{cases} \dfrac{20000}{x^3} & x > 100, \\ 0 & \text{elsewhere,} \end{cases}

(the constant must be 20000 for f to integrate to 1). Find the expected life of this type of device.
Solution:

\mu = E(X) = \int_{100}^{+\infty} x \cdot \frac{20000}{x^3}\, dx = 20000 \left[ \frac{x^{-1}}{-1} \right]_{100}^{+\infty} = 200 \text{ hr}.
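The improper integral can be sanity-checked numerically; this sketch assumes the normalized constant 20000 and truncates the tail at a large upper limit:

```python
import math

# Numerical check of E(X) = ∫_100^∞ x · 20000/x³ dx = ∫_100^∞ 20000/x² dx = 200.
# Midpoint rule on a log-spaced grid; the tail beyond `upper` contributes 20000/upper.
def mean_life(upper=1e8, n=200_000):
    lo, hi = math.log(100.0), math.log(upper)
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        a, b = math.exp(lo + i * h), math.exp(lo + (i + 1) * h)
        xm = 0.5 * (a + b)
        total += (20000.0 / xm**2) * (b - a)   # integrand x·f(x) = 20000/x²
    return total

print(mean_life())  # ≈ 200
```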
Mean of a Function of a Random Variable
Rule
Let X be a R.V. with probability distribution function f(x). The mean or expected value of
g(X) is defined by:
Discrete case:

\mu_{g(X)} = E[g(X)] = \sum_{x} g(x) f(x).

Continuous case:

\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{+\infty} g(x) f(x)\, dx.
Example (1)
Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00
P.M. on any sunny Friday has the following probability distribution:
x 4 5 6 7 8 9 Total
f (x) 1/12 1/12 1/4 1/4 1/6 1/6 1
Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to the attendant by the
manager. Find the attendant's expected earnings for this particular period.
Solution:

\mu_{g(X)} = E[g(X)] = E[2X - 1] = \sum_{x=4}^{9} (2x - 1) f(x)

= 7\left(\frac{1}{12}\right) + 9\left(\frac{1}{12}\right) + 11\left(\frac{1}{4}\right) + 13\left(\frac{1}{4}\right) + 15\left(\frac{1}{6}\right) + 17\left(\frac{1}{6}\right) = \frac{38}{3} \approx 12.67.
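The expected earnings can be verified directly from the pmf table; a sketch with exact fractions:

```python
from fractions import Fraction

# Expected earnings E[2X − 1] for the car-wash pmf above.
f = {4: Fraction(1, 12), 5: Fraction(1, 12), 6: Fraction(1, 4),
     7: Fraction(1, 4), 8: Fraction(1, 6), 9: Fraction(1, 6)}
assert sum(f.values()) == 1                       # a valid pmf
earnings = sum((2 * x - 1) * p for x, p in f.items())
print(earnings, float(earnings))  # 38/3 ≈ 12.67
```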
Example (2)
Let X be a random variable with density function:

f(x) = \begin{cases} \dfrac{x^2}{3} & -1 < x < 2, \\ 0 & \text{elsewhere.} \end{cases}

Find the expected value of g(X) = 4X + 3.
Solution:

E[4X + 3] = \frac{1}{3} \int_{-1}^{2} (4x^3 + 3x^2)\, dx = \frac{1}{3} \left[ x^4 + x^3 \right]_{-1}^{2} = 8.
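A quick numerical check of this continuous expectation (midpoint rule, a sketch):

```python
# Numerical check that E[4X + 3] = 8 for f(x) = x²/3 on (−1, 2).
def expect_g(n=100_000):
    a, b = -1.0, 2.0
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h            # midpoint of each subinterval
        total += (4 * x + 3) * (x * x / 3) * h
    return total

print(expect_g())  # ≈ 8.0
```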
Variance
Definition
Let X be a R.V. with probability distribution function f(x) and mean μ. The variance of X is
defined by:
Discrete case:

\sigma^2 = Var(X) = E[(X - \mu)^2] = \sum_{x} (x - \mu)^2 f(x).

Continuous case:

\sigma^2 = Var(X) = E[(X - \mu)^2] = \int_{-\infty}^{+\infty} (x - \mu)^2 f(x)\, dx.

Definition
Let X be a R.V. Then:
X − μ is called the deviation from the mean.
\sigma_X = \sqrt{Var(X)} is called the standard deviation of X.
Rule
The variance Var(X) of a random variable X satisfies:

\sigma^2 = Var(X) = E[X^2] - \mu^2.
Example (1)
Let the random variable X represent the number of cars that are used for official purposes on
any given workday. The probability distribution for company A is:
x 1 2 3 Total
f(x) 0.3 0.4 0.3 1
and for company B is:
x 0 1 2 3 4 Total
f(x) 0.2 0.1 0.3 0.3 0.1 1
Show that the variance of the probability distribution for company B is greater than that of
company A.
Solution:
For company A, we find that:

\mu_A = 1(0.3) + 2(0.4) + 3(0.3) = 2.0, \quad E[X^2] = 1^2(0.3) + 2^2(0.4) + 3^2(0.3) = 4.6,

so \sigma_A^2 = 4.6 - 2.0^2 = 0.6.
For company B, we have:

\mu_B = 0(0.2) + 1(0.1) + 2(0.3) + 3(0.3) + 4(0.1) = 2.0, \quad E[X^2] = 1^2(0.1) + 2^2(0.3) + 3^2(0.3) + 4^2(0.1) = 5.6,

so \sigma_B^2 = 5.6 - 2.0^2 = 1.6 > 0.6 = \sigma_A^2.
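Both variances above follow from one formula, Var(X) = E[X²] − E[X]²; a sketch with exact fractions (`variance` is a helper name chosen here):

```python
from fractions import Fraction

# Variance of a discrete pmf via Var(X) = E[X²] − E[X]².
def variance(pmf):
    mean = sum(x * p for x, p in pmf.items())
    return sum(x * x * p for x, p in pmf.items()) - mean ** 2

A = {1: Fraction(3, 10), 2: Fraction(4, 10), 3: Fraction(3, 10)}
B = {0: Fraction(2, 10), 1: Fraction(1, 10), 2: Fraction(3, 10),
     3: Fraction(3, 10), 4: Fraction(1, 10)}
print(variance(A), variance(B))  # 3/5 and 8/5, i.e. 0.6 and 1.6
```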
Example (2)
Find the mean, variance and standard deviation of the following density function:

f(x) = \begin{cases} 2(x - 1) & 1 < x < 2, \\ 0 & \text{elsewhere.} \end{cases}

Solution:

\mu = E[X] = \int_{-\infty}^{+\infty} x f(x)\, dx = \int_{1}^{2} 2x(x - 1)\, dx = \left[ \frac{2}{3}x^3 - x^2 \right]_{1}^{2} = \frac{5}{3}.

E[X^2] = \int_{-\infty}^{+\infty} x^2 f(x)\, dx = \int_{1}^{2} 2x^2(x - 1)\, dx = \left[ \frac{x^4}{2} - \frac{2}{3}x^3 \right]_{1}^{2} = \frac{17}{6}.

So

Var(X) = E[X^2] - \mu^2 = \frac{17}{6} - \frac{25}{9} = \frac{1}{18}.

Thus \sigma = \sqrt{1/18} \approx 0.236.
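The continuous variance can also be confirmed numerically from the two moments (a midpoint-rule sketch):

```python
# Numerical check: for f(x) = 2(x − 1) on (1, 2), Var(X) = 1/18 ≈ 0.0556.
def moment(k, n=100_000):
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = 1.0 + (i + 0.5) * h          # midpoint rule on (1, 2)
        total += (x ** k) * 2 * (x - 1) * h
    return total

var = moment(2) - moment(1) ** 2
print(var)  # ≈ 0.0555…
```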
Expectation of a Function of Two Random Variables
Definition
Let X and Y be random variables with joint probability distribution f(x, y). Then the
expectation of any function g of X and Y is given by:
Discrete case:

E[g(X, Y)] = \sum_{x} \sum_{y} g(x, y) f(x, y).

Continuous case:

E[g(X, Y)] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} g(x, y) f(x, y)\, dx\, dy.
Covariance of Two Random Variables
Definition
Let X and Y be two random variables with joint probability distribution f(x, y) and means
E[X], E[Y] respectively. The covariance of X and Y is given by:

Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y],

where
Discrete case:

E[XY] = \sum_{x} \sum_{y} x\, y\, f(x, y).

Continuous case:

E[XY] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} x\, y\, f(x, y)\, dx\, dy.
Notation
The covariance of two R.V.s X and Y is sometimes denoted by σXY .
Remark
The covariance can be positive: there is positive correlation between X and Y (X and
Y tend to move in the same direction).
The covariance can be negative: there is negative correlation between X and Y (X and
Y tend to move in opposite directions).
The covariance can be equal to 0: no linear relationship exists between X and Y. They
are not necessarily independent.
If X and Y are statistically independent, then their covariance is zero.
Example (1)
Find the covariance of X and Y whose joint probability distribution is:

         X = 0    X = 1    X = 2    h(y)
Y = 0    3/28     9/28     3/28     15/28
Y = 1    3/14     3/14     0        6/14
Y = 2    1/28     0        0        1/28
g(x)     5/14     15/28    3/28     1

Solution: We have

\mu_X = E[X] = (0)\left(\frac{5}{14}\right) + (1)\left(\frac{15}{28}\right) + (2)\left(\frac{3}{28}\right) = \frac{3}{4}.

\mu_Y = E[Y] = (0)\left(\frac{15}{28}\right) + (1)\left(\frac{6}{14}\right) + (2)\left(\frac{1}{28}\right) = \frac{1}{2}.

Since every term with x = 0, y = 0, or f(x, y) = 0 vanishes,

E[XY] = \sum_{x} \sum_{y} x\, y\, f(x, y) = (1)(1)\left(\frac{3}{14}\right) = \frac{3}{14}.

Thus

Cov(X, Y) = E[XY] - E[X]E[Y] = \frac{3}{14} - \left(\frac{3}{4}\right)\left(\frac{1}{2}\right) = -\frac{9}{56}.
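The same covariance can be computed mechanically from the joint table (a sketch with exact fractions):

```python
from fractions import Fraction

# Covariance from the joint pmf above: Cov(X, Y) = E[XY] − E[X]E[Y].
F = Fraction
joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
         (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}
assert sum(joint.values()) == 1              # a valid joint pmf
ex  = sum(x * p for (x, y), p in joint.items())
ey  = sum(y * p for (x, y), p in joint.items())
exy = sum(x * y * p for (x, y), p in joint.items())
cov = exy - ex * ey
print(cov)  # -9/56
```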
Example (2)
The fraction X of male runners and the fraction Y of female runners who compete in marathon
races are described by the joint density function:

f(x, y) = \begin{cases} 8xy & 0 \le y \le x \le 1, \\ 0 & \text{elsewhere.} \end{cases}

Find the covariance of X and Y.
Solution: The marginal densities are g(x) = 4x^3 for 0 \le x \le 1 and h(y) = 4y(1 - y^2)
for 0 \le y \le 1. From these marginal density functions, we compute

\mu_X = E[X] = \int_{-\infty}^{+\infty} x g(x)\, dx = \int_{0}^{1} 4x^4\, dx = \left[ \frac{4}{5}x^5 \right]_{0}^{1} = \frac{4}{5}.

\mu_Y = E[Y] = \int_{-\infty}^{+\infty} y h(y)\, dy = \int_{0}^{1} 4y^2(1 - y^2)\, dy = \left[ \frac{4}{3}y^3 - \frac{4}{5}y^5 \right]_{0}^{1} = \frac{8}{15}.

E[XY] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} x y f(x, y)\, dx\, dy = \int_{0}^{1} \int_{y}^{1} 8x^2 y^2\, dx\, dy = \frac{4}{9}.

Thus

Cov(X, Y) = E[XY] - E[X]E[Y] = \frac{4}{9} - \left(\frac{4}{5}\right)\left(\frac{8}{15}\right) = \frac{4}{225}.
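These integrals can be checked numerically; here the inner x-integral is evaluated in closed form first, so only smooth one-dimensional integrals remain (a sketch, with `integrate` a helper name chosen here):

```python
# Numerical check that Cov(X, Y) = 4/225 for f(x, y) = 8xy on 0 ≤ y ≤ x ≤ 1.
def integrate(func, a, b, n=20_000):
    """Midpoint rule for a 1-D integral."""
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

mu_x = integrate(lambda x: x * 4 * x**3, 0, 1)          # E[X] from g(x) = 4x³
mu_y = integrate(lambda y: y * 4 * y * (1 - y**2), 0, 1)  # E[Y] from h(y)
# Inner x-integral done analytically: ∫_y^1 8x²y² dx = (8/3)·y²·(1 − y³).
e_xy = integrate(lambda y: (8.0 / 3.0) * y**2 * (1 - y**3), 0, 1)
cov = e_xy - mu_x * mu_y
print(cov)  # ≈ 4/225 ≈ 0.01778
```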
Means and Variances of Linear Combinations
Properties
Let X and Y be two R.V.s with means E[X], E[Y] and variances Var(X), Var(Y)
respectively, and let a, b, c and d be real constants. Then we have the following properties:
E[aX + b] = a E[X] + b.
E[a] = a.
Var(a) = 0.
Var(aX + b) = a^2 Var(X).
E[aX + bY] = a E[X] + b E[Y].
Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab\, Cov(X, Y).
Example (1)
Let X and Y be random variables such that E(X) = 5, Var(X) = 10, E(Y) = 4,
Var(Y) = 12 and Cov(X, Y) = 3. Then:
E[2X + 1] = 2 E[X] + 1 = 11.
Example (2)
Let X be a random variable with probability distribution as follows:
x 0 1 2 3 Total
f(x) 1/3 1/2 0 1/6 1
Find E[(X - 1)^2].
Solution:

E[X] = (0)\left(\frac{1}{3}\right) + (1)\left(\frac{1}{2}\right) + (2)(0) + (3)\left(\frac{1}{6}\right) = 1.

E[X^2] = \sum_{x} x^2 f(x) = (0)^2\left(\frac{1}{3}\right) + (1)^2\left(\frac{1}{2}\right) + (2)^2(0) + (3)^2\left(\frac{1}{6}\right) = 2.

So

E[(X - 1)^2] = E[X^2] - 2E[X] + 1 = 2 - 2(1) + 1 = 1.
Example (3)
The weekly demand for a certain drink, in thousands of liters, at a chain of convenience stores
is a continuous random variable g(X) = X^2 + X - 2, where X has the density function:

f(x) = \begin{cases} 2(x - 1) & 1 < x < 2, \\ 0 & \text{elsewhere.} \end{cases}

Find the expected value of the weekly demand for the drink.
Solution: We have E[X^2 + X - 2] = E[X^2] + E[X] - 2, and

E[X] = \int_{-\infty}^{+\infty} x f(x)\, dx = \int_{1}^{2} 2x(x - 1)\, dx = \left[ \frac{2}{3}x^3 - x^2 \right]_{1}^{2} = \frac{5}{3}.

E[X^2] = \int_{-\infty}^{+\infty} x^2 f(x)\, dx = \int_{1}^{2} 2x^2(x - 1)\, dx = \left[ \frac{x^4}{2} - \frac{2}{3}x^3 \right]_{1}^{2} = \frac{17}{6}.

Thus E[X^2 + X - 2] = \frac{17}{6} + \frac{5}{3} - 2 = \frac{5}{2}.
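The expected demand can also be computed as a single integral of g(x)·f(x); a midpoint-rule sketch:

```python
# Numerical check: E[X² + X − 2] = 5/2 for f(x) = 2(x − 1) on (1, 2).
def expected_demand(n=100_000):
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = 1.0 + (i + 0.5) * h          # midpoint rule
        total += (x * x + x - 2) * 2 * (x - 1) * h
    return total

print(expected_demand())  # ≈ 2.5
```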
Theorem
Let X and Y be two independent random variables. Then:

E[XY] = E[X]E[Y], \quad \text{and hence} \quad Cov(X, Y) = 0.
Example (4)
It is known that the ratio of gallium to arsenide does not affect the functioning of
gallium-arsenide wafers, which are the main components of microchips. Let X denote the
ratio of gallium to arsenide and Y denote the functional wafers retrieved during a 1-hour
period. X and Y are independent random variables with the joint density function:

f(x, y) = \begin{cases} \dfrac{x(1 + 3y^2)}{4} & 0 < x < 2,\; 0 < y < 1, \\ 0 & \text{elsewhere.} \end{cases}

Show that E[XY] = E[X]E[Y].
Solution: The marginal densities are g(x) = x/2 for 0 < x < 2 and h(y) = (1 + 3y^2)/2
for 0 < y < 1. We have

E[X] = \int_{-\infty}^{+\infty} x g(x)\, dx = \int_{0}^{2} \frac{x^2}{2}\, dx = \frac{4}{3}.

E[Y] = \int_{-\infty}^{+\infty} y h(y)\, dy = \int_{0}^{1} \frac{y(1 + 3y^2)}{2}\, dy = \frac{5}{8}.

E[XY] = \int_{0}^{1} \int_{0}^{2} x y\, \frac{x(1 + 3y^2)}{4}\, dx\, dy = \frac{5}{6}.

Thus E[X]E[Y] = \frac{4}{3} \times \frac{5}{8} = \frac{5}{6} = E[XY], as the theorem asserts.
Example (5)
If X and Y are random variables with variances \sigma_X^2 = 2 and \sigma_Y^2 = 4 and covariance
\sigma_{XY} = -2, find the variance of the random variable Z = 3X - 4Y + 8.
Solution: We have Var(Z) = Var(3X - 4Y + 8) = Var(3X - 4Y), so

Var(Z) = 9\, Var(X) + 16\, Var(Y) - 24\, Cov(X, Y) = 9(2) + 16(4) + (-24)(-2) = 130.
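Both this example and the next apply the same identity, Var(aX + bY + c) = a²Var(X) + b²Var(Y) + 2ab·Cov(X, Y); a minimal sketch (`var_linear` is a helper name chosen here):

```python
# Var(aX + bY + c) = a²·Var(X) + b²·Var(Y) + 2ab·Cov(X, Y); the constant c drops out.
def var_linear(a, b, var_x, var_y, cov_xy):
    return a * a * var_x + b * b * var_y + 2 * a * b * cov_xy

print(var_linear(3, -4, 2, 4, -2))  # 130 (Example 5)
print(var_linear(3, -2, 2, 3, 0))   # 30  (Example 6: independent ⇒ Cov = 0)
```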
Example (6)
X and Y denote the amounts of two different types of impurities in a batch of a certain
chemical product. Suppose that X and Y are independent random variables with variances
\sigma_X^2 = 2 and \sigma_Y^2 = 3. Find the variance of the random variable Z = 3X - 2Y + 5.
Solution: We have Var(Z) = Var(3X - 2Y + 5) = Var(3X - 2Y), and Cov(X, Y) = 0 by independence, so

Var(Z) = 9\, Var(X) + 4\, Var(Y) = 9(2) + 4(3) = 30.
Exercises
Exercise 1.
Exercise 2.