Statistics 8


CLASS Statistics

08 Expectation & Variance

Random Experiment

A trial that results in one of several possible outcomes. We cannot predict the outcome of any specific trial, but we can predict the pattern in the LONG RUN; that is, each possible outcome has a certain PROBABILITY of occurring.

Expected Value

E(X) = Σ_{i=1}^{k} x_i p_i

Value of X     x_1    x_2    ...    x_k
Probability    p_1    p_2    ...    p_k

E(X) = x_1 p_1 + x_2 p_2 + ... + x_k p_k

Variance of Discrete Random Variable

▪ The variance of a random variable is a measure of the spread or dispersion of its probability distribution.

Var(X) = Σ_{i=1}^{k} (x_i - E(X))^2 p_i

Example of Expectation and Variance

[Bar chart of the probability distribution of X: P(X = 3) = 0.35, P(X = 4) = 0.25, P(X = 5) = 0.15, P(X = 6) = 0.25, with E(X) marked on the x-axis (Value of X).]

x_i    p_i     x_i p_i    x_i - E(X)    (x_i - E(X))^2    (x_i - E(X))^2 p_i
3      0.35    1.05       -1.3          1.69              0.592
4      0.25    1.00       -0.3          0.09              0.023
5      0.15    0.75        0.7          0.49              0.074
6      0.25    1.50        1.7          2.89              0.723

E(X) = 4.3        Var(X) = 1.41
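As a quick check of the table, here is a minimal Python sketch (standard library only) that recomputes E(X) and Var(X) from the same probability table:

```python
# Probability table from the example: P(X=3)=0.35, P(X=4)=0.25, P(X=5)=0.15, P(X=6)=0.25
pmf = {3: 0.35, 4: 0.25, 5: 0.15, 6: 0.25}

# E(X) = sum of x * p(x)
expectation = sum(x * p for x, p in pmf.items())

# Var(X) = sum of (x - E(X))^2 * p(x)
variance = sum((x - expectation) ** 2 * p for x, p in pmf.items())

print(expectation)  # ~4.3
print(variance)     # ~1.41
```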

Adding a Constant

E(X + a) = E(X) + a
Var(X ± a) = Var(X)

[Bar chart: probability distribution of X, Value of X (Weight in pounds) from 100 to 180, probabilities from 0.10 to 0.30.]

Value of X (Weight in pounds)    100     120     140     160     180
Probability                      0.10    0.15    0.30    0.25    0.20

E(X) = 146 lb.

Value of X (Weight in pounds)    120     140     160     180     200
Probability                      0.10    0.15    0.30    0.25    0.20

E(X) = 166 lb. = 146+20

E(X+a) = E(X) + a

Effect of adding a constant on the variance

[Bar chart: the original weight distribution and the distribution shifted by the constant, plotted over 100 to 200 pounds; the bars keep the same heights, only their location changes.]

Var(X ± a) = Var(X)

The whole distribution shifts to the right, but its shape does not change, so the variance stays the same.
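A minimal Python sketch of this property, reusing the weight distribution above and the shift a = 20 pounds from the example:

```python
pmf = {100: 0.10, 120: 0.15, 140: 0.30, 160: 0.25, 180: 0.20}

def mean(p):
    return sum(x * q for x, q in p.items())

def var(p):
    m = mean(p)
    return sum((x - m) ** 2 * q for x, q in p.items())

shifted = {x + 20: q for x, q in pmf.items()}  # add the constant a = 20 to every value

print(mean(pmf), mean(shifted))  # ~146, ~166  -> E(X + a) = E(X) + a
print(var(pmf), var(shifted))    # ~equal      -> Var(X + a) = Var(X)
```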
Multiplying by a Constant

E(aX) = a * E(X)
Var(aX) = a^2 * Var(X)

[Bar chart: probability distribution of X, Value of X (Height in meters) from 1.5 to 1.9, probabilities from 0.10 to 0.30.]

Value of X (Height in meters)    1.5     1.6     1.7     1.8     1.9
Probability                      0.10    0.25    0.30    0.25    0.10

E(X) = 1.7 meters

Value of X (Height in feet)      4.92    5.25    5.58    5.90    6.23
Probability                      0.10    0.25    0.30    0.25    0.10

E(X) = 5.58 ft = 3.28 * 1.7

E(aX) = a * E(X)
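A similar sketch for scaling, using the height distribution above and the conversion factor a = 3.28 ft per meter from the example:

```python
pmf = {1.5: 0.10, 1.6: 0.25, 1.7: 0.30, 1.8: 0.25, 1.9: 0.10}  # heights in meters
a = 3.28  # meters -> feet

def mean(p):
    return sum(x * q for x, q in p.items())

def var(p):
    m = mean(p)
    return sum((x - m) ** 2 * q for x, q in p.items())

scaled = {a * x: q for x, q in pmf.items()}  # same distribution in feet

print(mean(scaled), a * mean(pmf))     # both ~5.58  -> E(aX) = a * E(X)
print(var(scaled), a ** 2 * var(pmf))  # both ~0.14  -> Var(aX) = a^2 * Var(X)
```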

Measures of Central Tendency

The measures that make up central tendency:

▪Mean
▪Median
▪Mode

Outliers (Extreme values)

[Dot plot: values on a number line from 0 to 10, with one point lying far from the rest labelled "Outlier".]

The two plots are examples of outliers. The top one is for a single set of numbers; the second is for a set of paired data.

[Scatter plot: points A, B, C, D; points B and D lie far from the overall pattern and are outliers.]

How to find the range of non-outliers

Quiz scores (sorted): 5, 8, 11, 12, 12, 12, 13, 13, 13, 13, 13, 14, 14, 14, 15, 15, 15, 15, 15

Q1 = 12        Median = 13        Q3 = 15

IQR = Q3 - Q1 = 15 - 12 = 3
1.5(IQR) = 1.5(3) = 4.5

Lower fence: Q1 - 4.5 = 12 - 4.5 = 7.5
Upper fence: Q3 + 4.5 = 15 + 4.5 = 19.5

Any value below Q1 - 1.5 * IQR or above Q3 + 1.5 * IQR is an outlier.
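A minimal Python sketch of the fence calculation, with quartiles computed by splitting the sorted data around the median so the result matches the worked example:

```python
scores = [5, 8, 11, 12, 12, 12, 13, 13, 13, 13, 13, 14, 14, 14, 15, 15, 15, 15, 15]

def median(values):
    n = len(values)
    mid = n // 2
    return values[mid] if n % 2 else (values[mid - 1] + values[mid]) / 2

data = sorted(scores)
half = len(data) // 2
q1 = median(data[:half])        # median of the lower half
q3 = median(data[-half:])       # median of the upper half
iqr = q3 - q1

lower_fence = q1 - 1.5 * iqr    # 12 - 4.5 = 7.5
upper_fence = q3 + 1.5 * iqr    # 15 + 4.5 = 19.5

outliers = [x for x in data if x < lower_fence or x > upper_fence]
print(q1, q3, iqr, lower_fence, upper_fence, outliers)  # 12 15 3 7.5 19.5 [5]
```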

Effect of an Outlier on Mean and Median

Data set (°C): 26.0, 15.0, 20.5, 31.0, -350.0, 31.0, 30.5 (the outlier is -350.0)

Measure    With outlier    Without outlier
Mean       -28             25.667
Median     26              28.25

The median cushions the effect of the outlier; without the outlier there is no big difference between the mean and the median.
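A quick check of the table using Python's statistics module:

```python
import statistics

with_outlier = [26.0, 15.0, 20.5, 31.0, -350.0, 31.0, 30.5]
without_outlier = [x for x in with_outlier if x != -350.0]

print(statistics.mean(with_outlier), statistics.median(with_outlier))        # -28.0 26.0
print(statistics.mean(without_outlier), statistics.median(without_outlier))  # ~25.667 28.25
```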
Mode

Data: 2, 4, 5, 5, 4, 5
Sorted: 2, 4, 4, 5, 5, 5
Frequencies: 2 occurs once, 4 occurs twice, 5 occurs three times

Mode = 5
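A short frequency-count version of the same idea, assuming the data set above:

```python
from collections import Counter

data = [2, 4, 5, 5, 4, 5]
counts = Counter(data)              # Counter({5: 3, 4: 2, 2: 1})
mode = counts.most_common(1)[0][0]  # value with the highest frequency
print(mode)                         # 5
```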

Relation between Variance and Expectation

Var(X) = E[(X - E[X])^2]
       = E[X^2 - 2X E[X] + (E[X])^2]
       = E[X^2] - 2 E[X] E[X] + (E[X])^2     (by linearity, since E[X] is a constant)
       = E[X^2] - (E[X])^2
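The shortcut formula can be checked numerically against the earlier example distribution:

```python
pmf = {3: 0.35, 4: 0.25, 5: 0.15, 6: 0.25}

e_x = sum(x * p for x, p in pmf.items())        # E[X]
e_x2 = sum(x ** 2 * p for x, p in pmf.items())  # E[X^2]

var_definition = sum((x - e_x) ** 2 * p for x, p in pmf.items())
var_shortcut = e_x2 - e_x ** 2

print(var_definition, var_shortcut)  # both ~1.41
```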

Expectation and variance in two random variables


E(X + Y) = E(X) + E(Y)

and if X and Y are independent:

E(X * Y) = E(X) * E(Y)
Var(X + Y) = Var(X) + Var(Y)

Comparison of properties of expectation & variance

E(a) = a                        Var(X ± a) = Var(X)
E(X + a) = E(X) + a             Var(aX) = a^2 * Var(X)
E(aX) = a * E(X)                Var(X) = E(X^2) - (E(X))^2
E(X + Y) = E(X) + E(Y)          E(X^2) = Var(X) + (E(X))^2

If X and Y are independent:

E(XY) = E(X) * E(Y)             Var(X + Y) = Var(X) + Var(Y)
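These properties can be sanity-checked with a small simulation; the two variables below (a die roll and a 0-or-10 coin) and the sample size are arbitrary choices:

```python
import random

random.seed(0)
N = 200_000

# Two independent random variables: X ~ fair die, Y ~ coin worth 0 or 10
xs = [random.randint(1, 6) for _ in range(N)]
ys = [random.choice([0, 10]) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

sums = [x + y for x, y in zip(xs, ys)]
prods = [x * y for x, y in zip(xs, ys)]

print(mean(sums), mean(xs) + mean(ys))   # ~8.5 each   -> E(X + Y) = E(X) + E(Y)
print(mean(prods), mean(xs) * mean(ys))  # ~17.5 each  -> E(XY) = E(X) * E(Y) (independence)
print(var(sums), var(xs) + var(ys))      # ~27.9 each  -> Var(X + Y) = Var(X) + Var(Y)
```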

Linear Combination of random variables

The linear combination of random variables X_1, X_2, ..., X_n and constants c_1, c_2, ..., c_n:

Y = c_1 X_1 + c_2 X_2 + ... + c_n X_n

E(Y) = c_1 E(X_1) + c_2 E(X_2) + ... + c_n E(X_n)

Variance when the variables are dependent

V(Y) = c_1^2 σ_1^2 + c_2^2 σ_2^2 + ... + c_n^2 σ_n^2 + 2 Σ_{i<j} c_i c_j σ_{X_i X_j}

where σ_{X_i X_j} = Cov(X_i, X_j) is the covariance of X_i and X_j.

Variance when the variables are independent


V(Y) = c_1^2 σ_1^2 + c_2^2 σ_2^2 + ... + c_n^2 σ_n^2
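A sketch with NumPy (assumed to be available) showing that both cases are instances of V(Y) = c^T Σ c, where Σ is the covariance matrix of the X_i; the numbers below are made up for illustration:

```python
import numpy as np

c = np.array([2.0, -1.0, 0.5])  # constants c_1, c_2, c_3

# Covariance matrix of (X_1, X_2, X_3): variances on the diagonal,
# covariances off the diagonal (illustrative values only)
cov = np.array([
    [4.0, 1.0, 0.0],
    [1.0, 9.0, 2.0],
    [0.0, 2.0, 1.0],
])

# Dependent case: V(Y) = sum c_i^2 sigma_i^2 + 2 * sum_{i<j} c_i c_j Cov(X_i, X_j)
var_y = c @ cov @ c

# Independent case: drop the covariance terms (keep only the diagonal)
var_y_indep = c @ np.diag(np.diag(cov)) @ c

print(var_y)        # 19.25
print(var_y_indep)  # 25.25
```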

Expectation for Bernoulli and Binomial

The Mean and Variance of a Binomial Distribution


X ~ B (n,p)

The mean of the distribution is the expected value of X


E(X) = np

The variance of the distribution:

Var(X) = npq        (q = 1 - p)

The standard deviation:

SD(X) = √(Var(X)) = √(npq)
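A small check of these formulas against the binomial pmf, built with Python's math.comb; the values n = 10 and p = 0.3 are arbitrary:

```python
from math import comb, sqrt

n, p = 10, 0.3
q = 1 - p

# Binomial pmf: P(X = k) = C(n, k) * p^k * q^(n-k)
pmf = {k: comb(n, k) * p ** k * q ** (n - k) for k in range(n + 1)}

mean = sum(k * pk for k, pk in pmf.items())
var = sum((k - mean) ** 2 * pk for k, pk in pmf.items())

print(mean, n * p)                 # both ~3.0  -> E(X) = np
print(var, n * p * q)              # both ~2.1  -> Var(X) = npq
print(sqrt(var), sqrt(n * p * q))  # standard deviation = sqrt(npq)
```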

Expected value of a r.v. with Geometric Dist

μ = E(X) = 1/p        σ^2 = V(X) = (1 - p)/p^2

Example: p = 3/75 = 0.04

E(X) = 1/0.04 = 25

V(X) = (1 - 0.04)/(0.04)^2 = 600

P(X ≥ k) = (1 - p)^(k-1)

P(X ≥ 6) = (1 - 0.04)^(6-1) = (0.96)^5 = 0.815
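The worked numbers can be reproduced directly in Python:

```python
p = 3 / 75  # = 0.04

expected = 1 / p              # E(X) = 1/p
variance = (1 - p) / p ** 2   # V(X) = (1-p)/p^2
tail = (1 - p) ** (6 - 1)     # P(X >= 6) = (1-p)^(k-1) with k = 6

print(expected, variance, round(tail, 3))  # ~25 ~600 0.815
```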

Two random variables are independent if


Random variables X and Y are independent if their joint distribution function factors into the product of their marginal distribution functions:

F_{X,Y}(x, y) = F_X(x) * F_Y(y)
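As an illustration, here is a small sketch that checks factorization for two discrete random variables; for discrete variables it is equivalent to check the joint pmf, and the joint table below is made up so that it does factor (X and Y independent):

```python
from itertools import product

# Joint pmf of (X, Y), chosen so that p(x, y) = p_X(x) * p_Y(y)
joint = {
    (0, 0): 0.12, (0, 1): 0.28,
    (1, 0): 0.18, (1, 1): 0.42,
}

xs = {x for x, _ in joint}
ys = {y for _, y in joint}

p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}  # marginal pmf of X
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}  # marginal pmf of Y

independent = all(
    abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-9 for x, y in product(xs, ys)
)
print(independent)  # True
```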
