

MTH305: Probability and Statistics

Chapter 4: Mathematical Expectation


Lecture #8

Lebanese American University

Dr. Houssein NASSER EL DINE


Course Plan

Chapter 2 Probability

Chapter 3 Random Variables and Probability Distributions

Chapter 4 Mathematical Expectation

Chapter 5 Some Discrete Probability Distributions

Chapter 6 Some Continuous Probability Distributions

Chapter 8 Sampling Distributions

Chapter 9 One Sample Estimation Problems



Table of contents

1 Introduction

2 Mean-Expected Value

3 Expectation of a Function of R.V.

4 Variance

5 Expectation of Two R.V.s

6 Covariance of 2 R.V.s

7 Linear combinations of R.V.s

8 Exercises





Introduction


The main picture of any random variable is contained in its probability distribution function
(pdf) or in its cumulative distribution function (cdf).
In many cases, we want to summarize this detailed information about a R.V. in a few
numerical measures.
For example, answers to the following questions are important to decision makers:
1 What is the expected value of a random quantity?
2 What is the probability that the actual measured value of a random quantity deviates from the
expected value?





Mean-Expected Value

Definition
Let X be a R.V. with probability distribution function f(x). The mean, or expected value, of
X is defined by:
Discrete case:

µ = E(X) = \sum_x x f(x).

Continuous case:

µ = E(X) = \int_{-\infty}^{+\infty} x f(x) dx.
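As an illustrative aside (not part of the original slides), both formulas can be checked numerically in Python; the discrete pmf below is an arbitrary example, the density is the one used in a later example, and SciPy is assumed to be available.

# Sketch: computing E(X) numerically; the pmf is an arbitrary example and SciPy is assumed.
from scipy.integrate import quad

# Discrete case: E(X) = sum of x * f(x)
pmf = {0: 0.25, 1: 0.50, 2: 0.25}                      # example pmf, sums to 1
mean_discrete = sum(x * p for x, p in pmf.items())      # 1.0

# Continuous case: E(X) = integral of x * f(x) dx
pdf = lambda x: 2 * (x - 1)                             # density f(x) = 2(x - 1) on (1, 2)
mean_continuous, _ = quad(lambda x: x * pdf(x), 1, 2)   # approx 1.6667 (= 5/3)

print(mean_discrete, mean_continuous)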



Mean-Expected Value

Example
If two coins are tossed 16 times and X is the number of heads that occur per toss, then the
values of X can be 0, 1 and 2.
Suppose that the experiment yields no heads, one head and two heads a total of 4, 7 and 5
times, respectively.
Then, the average number of heads per toss is

[(0)(4) + (1)(7) + (2)(5)] / 16 = 1.06

The computation of the average number of heads can be written in the equivalent form

µ = E[X] = (0)(4/16) + (1)(7/16) + (2)(5/16) = 1.06



Mean-Expected Value

Example
A lot containing 7 components is sampled by a quality inspector; the lot contains 4 good and 3
defective components. A sample of 3 is taken by the inspector. Find the expected number of
good components in this sample.
Solution: Let X represent the number of good components in the sample. The probability
distribution of X is:

f(x) = \binom{4}{x} \binom{3}{3-x} / \binom{7}{3}, for x = 0, 1, 2, 3.

So

f(0) = 1/35,  f(1) = 12/35,  f(2) = 18/35,  and  f(3) = 4/35.

Thus

µ = E(X) = (0)(1/35) + (1)(12/35) + (2)(18/35) + (3)(4/35) = 12/7 ≈ 1.7

Thus, if a sample of size 3 is selected at random over and over again from a lot of 4 good and
3 defective components, it would contain, on average, 1.7 good components.
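As a quick check (an illustrative sketch, not part of the slides), the same expectation can be computed with Python's math.comb:

# Sketch: verify E(X) for the sampling example (4 good, 3 defective, sample of size 3).
from math import comb

f = {x: comb(4, x) * comb(3, 3 - x) / comb(7, 3) for x in range(4)}
mean = sum(x * p for x, p in f.items())
print(f)       # probabilities 1/35, 12/35, 18/35, 4/35 as decimals
print(mean)    # approx 1.714 (= 12/7, about 1.7)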



Mean-Expected Value

Example
Let X be a random variable that denotes the life, in hours, of a certain electronic device. The
probability density function is:

f(x) = 20000 / x^3  for x > 100,  and  f(x) = 0  elsewhere.

Find the expected life of this type of device.

Solution:

µ = E(X) = \int_{-\infty}^{+\infty} x f(x) dx = \int_{100}^{+\infty} x (20000 / x^3) dx = 20000 \int_{100}^{+\infty} x^{-2} dx

= 20000 [ −x^{-1} ]_{100}^{+\infty} = 20000 / 100 = 200 hr.
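A numerical check of this integral (an illustrative sketch; SciPy is assumed to be available):

# Sketch: check that f(x) = 20000/x^3 for x > 100 integrates to 1 and has mean 200.
import numpy as np
from scipy.integrate import quad

pdf = lambda x: 20000.0 / x**3
total, _ = quad(pdf, 100, np.inf)                  # approx 1.0 (valid density)
mean, _ = quad(lambda x: x * pdf(x), 100, np.inf)  # approx 200.0
print(total, mean)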





Expectation of a Function of R.V.

Rule
Let X be a R.V. with probability distribution function f(x). The mean, or expected value, of
g(X) is given by:
Discrete case:

µ_{g(X)} = E[g(X)] = \sum_x g(x) f(x).

Continuous case:

µ_{g(X)} = E[g(X)] = \int_{-\infty}^{+\infty} g(x) f(x) dx.



Expectation of a Function of R.V.

Example (1)
Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00
P.M. on any sunny Friday has the following probability distribution:

x      4     5    6    7    8    9   Total
f(x)  1/12  1/12  1/4  1/4  1/6  1/6    1

Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to the attendant by the
manager. Find the attendant's expected earnings for this particular period.
Solution:

µ_{g(X)} = E[g(X)] = E[2X − 1] = \sum_{x=4}^{9} (2x − 1) f(x)

= 7(1/12) + 9(1/12) + 11(1/4) + 13(1/4) + 15(1/6) + 17(1/6) = 12.67.
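The same sum can be verified with exact fractions (an illustrative sketch, not part of the slides):

# Sketch: expected earnings E[2X - 1] for the car-wash distribution above.
from fractions import Fraction as F

f = {4: F(1, 12), 5: F(1, 12), 6: F(1, 4), 7: F(1, 4), 8: F(1, 6), 9: F(1, 6)}
earnings = sum((2 * x - 1) * p for x, p in f.items())
print(earnings, float(earnings))   # 38/3, approx 12.67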



Expectation of a Function of R.V.

Example (2)
Let X be a random variable with density function:

f(x) = x^2 / 3  for −1 < x < 2,  and  f(x) = 0  elsewhere.

Find the expected value of g(X) = 4X + 3.

Solution:

E[g(X)] = E[4X + 3] = \int_{-\infty}^{+\infty} (4x + 3) f(x) dx = \int_{-1}^{2} (4x + 3) (x^2 / 3) dx

= (1/3) \int_{-1}^{2} (4x^3 + 3x^2) dx

= (1/3) [ x^4 + x^3 ]_{-1}^{2} = 8.
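A quick numerical check (sketch; SciPy assumed):

# Sketch: check E[4X + 3] for the density f(x) = x^2/3 on (-1, 2).
from scipy.integrate import quad

value, _ = quad(lambda x: (4 * x + 3) * x**2 / 3, -1, 2)
print(value)   # approx 8.0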





Variance

Definition
Let X be a R.V. with probability distribution function f(x) and mean µ. The variance of X is
defined by:
Discrete case:

σ^2 = Var(X) = E[(X − µ)^2] = \sum_x (x − µ)^2 f(x).

Continuous case:

σ^2 = Var(X) = E[(X − µ)^2] = \int_{-\infty}^{+\infty} (x − µ)^2 f(x) dx.

Definition
Let X be a R.V. Then
X − µ is called the deviation from the mean.
σ_X = \sqrt{Var(X)} is called the standard deviation of X.



Variance

Rule
The variance Var(X) of a random variable X is:

Var(X) = E[X^2] − (E[X])^2 = E[X^2] − µ^2.

Rule

E[aX + b] = a E(X) + b.


Variance

Example (1)
Let the random variable X represent the number of cars that are used for official purposes on
any given workday. The probability distribution for company A is:

x      1    2    3   Total
f(x)  0.3  0.4  0.3    1

And for company B it is:

x      0    1    2    3    4   Total
f(x)  0.2  0.1  0.3  0.3  0.1    1

Show that the variance of the probability distribution for company B is greater than that of
company A.



Variance

Example (1)
Solution:
For company A, we find that:

µ = E[X] = (1)(0.3) + (2)(0.4) + (3)(0.3) = 2.0

σ^2 = Var(X) = \sum_x (x − µ)^2 f(x) = (1 − 2)^2 (0.3) + (2 − 2)^2 (0.4) + (3 − 2)^2 (0.3) = 0.6

For company B, we have:

µ = E[X] = (0)(0.2) + (1)(0.1) + (2)(0.3) + (3)(0.3) + (4)(0.1) = 2.0

σ^2 = Var(X) = \sum_x (x − µ)^2 f(x) = (0 − 2)^2 (0.2) + (1 − 2)^2 (0.1)

+ (2 − 2)^2 (0.3) + (3 − 2)^2 (0.3) + (4 − 2)^2 (0.1) = 1.6



Variance

Example (2)
Find the mean, variance and standard deviation of the following density function:

f(x) = 2(x − 1)  for 1 < x < 2,  and  f(x) = 0  elsewhere.

Solution:

µ = E[X] = \int_{-\infty}^{+\infty} x f(x) dx = \int_{1}^{2} 2x(x − 1) dx = [ (2/3)x^3 − x^2 ]_{1}^{2} = 5/3.

E[X^2] = \int_{-\infty}^{+\infty} x^2 f(x) dx = \int_{1}^{2} 2x^2(x − 1) dx = [ x^4/2 − (2/3)x^3 ]_{1}^{2} = 17/6.

So

Var(X) = E[X^2] − µ^2 = 17/6 − 25/9 = 1/18.

Thus σ = \sqrt{1/18} ≈ 0.236.
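A numerical check of these three quantities (sketch; SciPy assumed):

# Sketch: mean, variance and standard deviation of f(x) = 2(x - 1) on (1, 2).
from math import sqrt
from scipy.integrate import quad

pdf = lambda x: 2 * (x - 1)
EX, _ = quad(lambda x: x * pdf(x), 1, 2)
EX2, _ = quad(lambda x: x**2 * pdf(x), 1, 2)
var = EX2 - EX**2
print(EX, var, sqrt(var))   # approx 1.6667, 0.0556, 0.2357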





Expectation of Two R.V.s


Definition
Let X and Y be random variables with joint probability distribution f(x, y). Then the
expectation of any function g of X and Y is given by:
Discrete case:

E[g(X, Y)] = \sum_x \sum_y g(x, y) f(x, y).

Continuous case:

E[g(X, Y)] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} g(x, y) f(x, y) dx dy.





Covariance of 2 R.V.s

Definition
Let X and Y be two random variables with joint probability distribution f(x, y) and means
E[X], E[Y] respectively. The covariance of X and Y is given by:

Cov(X, Y) = E[XY] − E[X]E[Y],

where
Discrete case:

E[XY] = \sum_x \sum_y x y f(x, y).

Continuous case:

E[XY] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} x y f(x, y) dx dy.

Notation
The covariance of two R.V.s X and Y is sometimes denoted by σ_XY.



Covariance of 2 R.V.s

Remark
The covariance can be positive: there is positive correlation between X and Y (X and
Y tend to move in the same direction).
The covariance can be negative: there is negative correlation between X and Y (X and
Y tend to move in opposite directions).
The covariance can be equal to 0: no linear relationship exists between X and Y; they
are not necessarily independent.
If X and Y are statistically independent, then their covariance is zero.



Covariance of 2 R.V.s

Example (1)
Find the covariance of X and Y whose joint probability distribution is:

       X:     0      1      2     h(y)
  Y = 0     3/28   9/28   3/28   15/28
  Y = 1     3/14   3/14    0      6/14
  Y = 2     1/28    0      0      1/28
  g(x)      5/14  15/28   3/28     1

Solution: We have

µ_X = E[X] = (0)(5/14) + (1)(15/28) + (2)(3/28) = 3/4.

µ_Y = E[Y] = (0)(15/28) + (1)(6/14) + (2)(1/28) = 1/2.



Covariance of 2 R.V.s

Example (1)
Solution: We have

E[XY] = \sum_x \sum_y x y f(x, y) = (1)(1)(3/14) = 3/14,

since every other term has either x = 0, y = 0, or f(x, y) = 0.
Thus

Cov(X, Y) = E[XY] − E[X]E[Y] = 3/14 − (3/4)(1/2) = −9/56.
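The table computation can be reproduced with exact fractions (an illustrative sketch, not part of the slides):

# Sketch: covariance from the joint table above, with (x, y) keys and exact fractions.
from fractions import Fraction as F

joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
         (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}
EX  = sum(x * p for (x, y), p in joint.items())       # 3/4
EY  = sum(y * p for (x, y), p in joint.items())       # 1/2
EXY = sum(x * y * p for (x, y), p in joint.items())   # 3/14
print(EXY - EX * EY)                                  # -9/56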



Covariance of 2 R.V.s

Example (2)
The fraction X of male runners and the fraction Y of female runners who compete in marathon
races are described by the joint density function:

f(x, y) = 8xy  for 0 ≤ y ≤ x ≤ 1,  and  f(x, y) = 0  elsewhere.

Find the covariance of X and Y.

Solution: We first compute the marginal density functions. They are

g(x) = 4x^3  for 0 ≤ x ≤ 1,  and  g(x) = 0  elsewhere;
h(y) = 4y(1 − y^2)  for 0 ≤ y ≤ 1,  and  h(y) = 0  elsewhere.



Covariance of 2 R.V.s

Example (2)
Solution: From these marginal density functions, we compute

µ_X = E[X] = \int_{-\infty}^{+\infty} x g(x) dx = \int_{0}^{1} 4x^4 dx = [ (4/5)x^5 ]_{0}^{1} = 4/5.

µ_Y = E[Y] = \int_{-\infty}^{+\infty} y h(y) dy = \int_{0}^{1} 4y^2(1 − y^2) dy = [ (4/3)y^3 − (4/5)y^5 ]_{0}^{1} = 8/15.

E[XY] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} x y f(x, y) dx dy = \int_{0}^{1} \int_{y}^{1} 8x^2 y^2 dx dy = 4/9.

Thus

Cov(X, Y) = E[XY] − E[X]E[Y] = 4/9 − (4/5)(8/15) = 4/225.
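A numerical check with a double integral over the triangular region (sketch; SciPy assumed):

# Sketch: Cov(X, Y) for f(x, y) = 8xy on 0 <= y <= x <= 1.
from scipy.integrate import dblquad

f = lambda x, y: 8 * x * y
lo, hi = lambda x: 0.0, lambda x: x      # y ranges over [0, x] for each x in [0, 1]
EXY = dblquad(lambda y, x: x * y * f(x, y), 0, 1, lo, hi)[0]   # approx 4/9
EX  = dblquad(lambda y, x: x * f(x, y), 0, 1, lo, hi)[0]       # approx 4/5
EY  = dblquad(lambda y, x: y * f(x, y), 0, 1, lo, hi)[0]       # approx 8/15
print(EXY - EX * EY)                                           # approx 0.0178 (= 4/225)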





Linear combinations of R.V.s

Properties
Let X and Y be two R.V.s with means E[X], E[Y] and variances Var(X), Var(Y)
respectively, and let a, b, c and d be real constants. Then we have the following properties,
illustrated numerically after the list:
E[aX + b] = a E[X] + b.

Var(aX + b) = a^2 Var(X).

E[a] = a.

Var(a) = 0.

E[aX + bY] = a E[X] + b E[Y].

Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).

Cov(aX + b, cY + d) = ac Cov(X, Y).

E[a g(X) + b h(Y)] = a E[g(X)] + b E[h(Y)].
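The following Monte Carlo sketch (not part of the slides; the distributions of X and Y are arbitrary choices and NumPy is assumed) illustrates the rules for E[aX + bY] and Var(aX + bY):

# Sketch: Monte Carlo illustration of E[aX + bY] and Var(aX + bY); X and Y chosen arbitrarily.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(5, 2, n)               # E[X] = 5, Var(X) = 4 (by construction)
Y = 0.5 * X + rng.normal(0, 1, n)     # correlated with X, so Cov(X, Y) = 2
a, b = 2, 3

lhs_mean = np.mean(a * X + b * Y)
rhs_mean = a * np.mean(X) + b * np.mean(Y)
lhs_var = np.var(a * X + b * Y)
rhs_var = a**2 * np.var(X) + b**2 * np.var(Y) + 2 * a * b * np.cov(X, Y)[0, 1]
print(lhs_mean, rhs_mean)   # both approx 17.5
print(lhs_var, rhs_var)     # both approx 58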



Linear combinations of R.V.s

Example (1)
Let X and Y be random variables such that E(X) = 5, Var(X) = 10, E(Y) = 4,
Var(Y) = 12 and Cov(X, Y) = 3. Then:
E[2X + 1] = 2 E[X] + 1 = 11.

Var(2X + 1) = 2^2 Var(X) = 40.

E[2X + 3Y] = 2 E[X] + 3 E[Y] = 10 + 12 = 22.

Var(2X + 3Y) = 2^2 Var(X) + 3^2 Var(Y) + 2(2)(3) Cov(X, Y) = 40 + 108 + 36 = 184.

Cov(2X + 1, 3Y + 2) = (2)(3) Cov(X, Y) = 18.


Linear combinations of R.V.s

Example (2)
Let X be a random variable with probability distribution as follows:

x      0    1    2    3   Total
f(x)  1/3  1/2   0   1/6    1

Find the expected value of Y = (X − 1)^2.

Solution: We have E[(X − 1)^2] = E[X^2 − 2X + 1] = E[X^2] − 2E[X] + 1, and

E[X] = (0)(1/3) + (1)(1/2) + (2)(0) + (3)(1/6) = 1.

E[X^2] = \sum_x x^2 f(x) = (0)^2(1/3) + (1)^2(1/2) + (2)^2(0) + (3)^2(1/6) = 2.

So

E[(X − 1)^2] = E[X^2] − 2E[X] + 1 = 2 − 2(1) + 1 = 1.



Linear combinations of R.V.s

Example (3)
The weekly demand for a certain drink, in thousands of liters, at a chain of convenience stores
is a continuous random variable g(X) = X^2 + X − 2, where X has the density function:

f(x) = 2(x − 1)  for 1 < x < 2,  and  f(x) = 0  elsewhere.

Find the expected value of the weekly demand for the drink.
Solution: We have E[X^2 + X − 2] = E[X^2] + E[X] − 2, and

E[X] = \int_{-\infty}^{+\infty} x f(x) dx = \int_{1}^{2} 2x(x − 1) dx = [ (2/3)x^3 − x^2 ]_{1}^{2} = 5/3.

E[X^2] = \int_{-\infty}^{+\infty} x^2 f(x) dx = \int_{1}^{2} 2x^2(x − 1) dx = [ x^4/2 − (2/3)x^3 ]_{1}^{2} = 17/6.

Thus E[X^2 + X − 2] = 17/6 + 5/3 − 2 = 5/2.



Linear combinations of R.V.s

Theorem
Let X and Y be two independent random variables. Then:

E[XY ] = E[X ]E[Y ].



Linear combinations of R.V.s

Example (4)
It is known that the ratio of gallium to arsenide does not affect the functioning of
gallium-arsenide wafers, which are the main components of microchips. Let X denote the
ratio of gallium to arsenide and Y denote the functional wafers retrieved during a 1-hour
period. X and Y are independent random variables with the joint density function:

f(x, y) = x(1 + 3y^2)/4  for 0 < x < 2 and 0 < y < 1,  and  f(x, y) = 0  elsewhere.

Show that E[XY] = E[X]E[Y].

Solution: We have

E[XY] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} x y f(x, y) dx dy = \int_{0}^{1} \int_{0}^{2} x y (x(1 + 3y^2)/4) dx dy = 5/6.



Linear combinations of R.V.s

Example (4)
Solution: The marginal densities are g(x) = x/2 for 0 < x < 2 and h(y) = (1 + 3y^2)/2 for
0 < y < 1, so we have

E[X] = \int_{-\infty}^{+\infty} x g(x) dx = \int_{0}^{2} x (x/2) dx = 4/3.

E[Y] = \int_{-\infty}^{+\infty} y h(y) dy = \int_{0}^{1} y ((1 + 3y^2)/2) dy = 5/8.

Thus E[X]E[Y] = (4/3) × (5/8) = 5/6 = E[XY].
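A numerical check of the factorization (sketch; SciPy assumed):

# Sketch: E[XY] = E[X]E[Y] for f(x, y) = x(1 + 3y^2)/4 on 0 < x < 2, 0 < y < 1.
from scipy.integrate import dblquad

f = lambda x, y: x * (1 + 3 * y**2) / 4
lo, hi = lambda x: 0.0, lambda x: 1.0    # y ranges over [0, 1] for each x in [0, 2]
EXY = dblquad(lambda y, x: x * y * f(x, y), 0, 2, lo, hi)[0]   # approx 5/6
EX  = dblquad(lambda y, x: x * f(x, y), 0, 2, lo, hi)[0]       # approx 4/3
EY  = dblquad(lambda y, x: y * f(x, y), 0, 2, lo, hi)[0]       # approx 5/8
print(EXY, EX * EY)                                            # both approx 0.8333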



Linear combinations of R.V.s

Example (5)
If X and Y are random variables with variances σ_X^2 = 2 and σ_Y^2 = 4 and covariance
σ_XY = −2, find the variance of the random variable Z = 3X − 4Y + 8.
Solution: We have Var(Z) = Var(3X − 4Y + 8) = Var(3X − 4Y), so

Var(Z) = 3^2 Var(X) + (−4)^2 Var(Y) + 2(3)(−4) Cov(X, Y) = 9(2) + 16(4) − 24(−2) = 130.

Example (6)
Let X and Y denote the amounts of two different types of impurities in a batch of a certain
chemical product. Suppose that X and Y are independent random variables with variances
σ_X^2 = 2 and σ_Y^2 = 3; find the variance of the random variable Z = 3X − 2Y + 5.
Solution: We have Var(Z) = Var(3X − 2Y + 5) = Var(3X − 2Y), so

Var(Z) = 3^2 Var(X) + (−2)^2 Var(Y) = 9(2) + 4(3) = 30.
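Both variances follow from the linear-combination rule; the helper below (a hypothetical name, not from the slides) just evaluates that formula:

# Sketch: Var(aX + bY + c) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y); the constant c drops out.
def var_linear(a, b, var_x, var_y, cov_xy):
    return a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy

print(var_linear(3, -4, 2, 4, -2))   # Example (5): 130
print(var_linear(3, -2, 2, 3, 0))    # Example (6): independent, so Cov(X, Y) = 0; gives 30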





Exercises

Exercise 1.

Let X and Y be independent random variables, with E(X) = 2, E(Y) = 6, Var(X) = 9,
and Var(Y) = 16.
1 Calculate E(3X − 4Y − 5) and Var(3X − 4Y − 5).
2 Prove that E(X^2) = 13; then find E(X^2 − Y^2).
3 Calculate E((X + Y)^2).
Solution:
1 E(3X − 4Y − 5) = 3E[X] − 4E[Y] − 5 = 3(2) − 4(6) − 5 = −23,
Var(3X − 4Y − 5) = 3^2 Var(X) + (−4)^2 Var(Y) = 9(9) + 16(16) = 337.
2 Var(X) = E[X^2] − (E[X])^2 ⇒ E(X^2) = Var(X) + (E[X])^2 = 9 + 2^2 = 13, and
E(Y^2) = Var(Y) + (E[Y])^2 = 16 + 6^2 = 52
⇒ E(X^2 − Y^2) = E[X^2] − E[Y^2] = 13 − 52 = −39.
3 E((X + Y)^2) = E[X^2] + 2E[XY] + E[Y^2] = E[X^2] + 2E[X]E[Y] + E[Y^2]
= 13 + 2(2)(6) + 52 = 89.



Exercises

Exercise 2.

Let X and Y be random variables, with E(X) = 2, E(Y) = 5, σ_X = 3, E[XY] = 7 and
E[(Y − 1)^2] = 20.
1 Are X and Y independent? Why?
2 Find E(X^2) and Var(Y).
3 Find Cov(X, Y).
4 Deduce Var(2X + 3Y − 1).
Solution:
1 E[X]E[Y] = 2 × 5 = 10 ≠ 7 = E[XY], so X and Y are not independent.
2 σ_X^2 = Var(X) = E[X^2] − (E[X])^2 ⇒ E(X^2) = Var(X) + (E[X])^2 = 3^2 + 2^2 = 13,
and E((Y − 1)^2) = E(Y^2) − 2E(Y) + 1 = 20 ⇒ E(Y^2) = 20 + 2E(Y) − 1 = 29.
Thus Var(Y) = E[Y^2] − (E[Y])^2 = 29 − 5^2 = 4.
3 Cov(X, Y) = E(XY) − E(X)E(Y) = 7 − 2 × 5 = −3.
4 Var(2X + 3Y − 1) = 2^2 Var(X) + 3^2 Var(Y) + 2(2)(3) Cov(X, Y) = 36 + 36 − 36 = 36.

