Continuous RV-handout 2
1. Uniform Distribution
E(X) = (a + b)/2
Var(X) = (b − a)²/12
MGF: M_X(t) = (e^(bt) − e^(at)) / ((b − a)t) for t ≠ 0, with M_X(0) = 1
Example
The future lifetime (in years) of a
newborn is uniformly distributed over
the interval (0,100).
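As a quick numerical sketch of the example (the sample size and seed are our choices, not from the handout), the closed-form mean and variance of Uniform(0, 100) can be compared against a Monte Carlo estimate:

```python
import random

# Lifetime example: X ~ Uniform(0, 100).
a, b = 0.0, 100.0

# Closed-form moments from the formulas above.
mean = (a + b) / 2           # (a + b)/2 = 50
var = (b - a) ** 2 / 12      # (b - a)^2/12 = 833.33...

# Monte Carlo estimate of the mean for comparison (our choice of seed/size).
random.seed(0)
samples = [random.uniform(a, b) for _ in range(100_000)]
mc_mean = sum(samples) / len(samples)

print(mean, var)
print(round(mc_mean, 1))     # close to 50
```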
2. Exponential Distribution
We write X ~ Exp(λ).
PDF: f(x) = λe^(−λx), x ≥ 0, λ > 0
CDF: F(x) = 1 − e^(−λx)
E(X) = 1/λ
Var(X) = 1/λ²
MGF: M_X(t) = λ/(λ − t), t < λ
Example
The future lifetime (in years) of
a newborn is exponentially
distributed with mean 𝜆.
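A small sketch of exponential lifetime calculations (the mean lifetime of 70 years is our assumption for illustration; under the rate parametrization above the mean is 1/λ):

```python
import math

# Assumed mean lifetime (our choice); rate lam = 1/mean.
mean_life = 70.0
lam = 1.0 / mean_life

def cdf(x, lam):
    """F(x) = 1 - e^(-lam*x) for x >= 0."""
    return 1.0 - math.exp(-lam * x)

# P(newborn survives past 80) = 1 - F(80) = e^(-80/70)
p_survive_80 = 1.0 - cdf(80.0, lam)

# Memoryless property check: P(X > 90 | X > 10) = P(X > 80).
p_cond = (1.0 - cdf(90.0, lam)) / (1.0 - cdf(10.0, lam))
print(round(p_survive_80, 4), round(p_cond, 4))
```

The matching printed values illustrate the memoryless property that characterizes the exponential distribution.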
3. Gamma Distribution
Let c = ∫₀^(+∞) t^(α−1) e^(−t) dt = Γ(α), the gamma function.
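The constant Γ(α) can be evaluated numerically with Python's `math.gamma`; a short check of two standard gamma-function identities:

```python
import math

# Γ(n) = (n - 1)! for positive integers n.
assert math.gamma(5) == math.factorial(4)   # Γ(5) = 4! = 24

# Γ(α + 1) = α Γ(α) for a non-integer α (2.7 is our arbitrary choice).
alpha = 2.7
lhs = math.gamma(alpha + 1)
rhs = alpha * math.gamma(alpha)
print(abs(lhs - rhs) < 1e-12)
```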
4. Beta Distribution
We write X ~ Beta(α₁, α₂).
PDF:
f(x) = [Γ(α₁ + α₂) / (Γ(α₁)Γ(α₂))] x^(α₁ − 1) (1 − x)^(α₂ − 1), 0 < x < 1
EXERCISE!
E(X) = α₁ / (α₁ + α₂)
Var(X) = α₁α₂ / [(α₁ + α₂)² (α₁ + α₂ + 1)]
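The moment formulas above can be sanity-checked numerically. This sketch uses assumed shape parameters α₁ = 2, α₂ = 3 (our choice, not from the text) and verifies the mean by crude numerical integration of x·f(x):

```python
import math

a1, a2 = 2.0, 3.0   # assumed shape parameters (illustrative only)

mean = a1 / (a1 + a2)                               # α1/(α1+α2) = 0.4
var = (a1 * a2) / ((a1 + a2) ** 2 * (a1 + a2 + 1))  # = 0.04

# Cross-check the mean with a midpoint-rule integral of x*f(x) over (0, 1),
# where f uses the gamma-function normalizing constant from the PDF.
c = math.gamma(a1 + a2) / (math.gamma(a1) * math.gamma(a2))
n = 10_000
num_mean = 0.0
for i in range(n):
    x = (i + 0.5) / n
    num_mean += x * c * x ** (a1 - 1) * (1 - x) ** (a2 - 1)
num_mean /= n

print(mean, var, round(num_mean, 4))
```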
Example
From the previous problem on
the income of Mr. and Mrs.
Razon, calculate the expected
value and variance of the ratio
of income of Mr. Razon to the
total income of the couple.
5. Normal Distribution
We write X ~ N(μ, σ²).
Parameters: μ, σ²
PDF:
f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²)), −∞ < x < +∞
CDF:
F(x) = ∫₋∞^x f(y) dy = ∫₋∞^x (1 / (σ√(2π))) exp(−(y − μ)² / (2σ²)) dy
There is no closed form for the CDF.

Using the change of variables x = σy + μ, the moment generating function is computed to obtain
M_X(t) = exp(μt + σ²t²/2), for any real number t.
❖ The derivation is left as an exercise.

Using the MGF, the moments are easily computed:
E(X) = μ, Var(X) = σ²
✓ It is easy to verify using the MGF that the sum of two independent normal variables is also normal, with mean equal to the sum of the means and variance equal to the sum of the variances.

Standard Normal Distribution
When a random variable following a normal distribution has mean 0 and variance 1, we call it the STANDARD NORMAL DISTRIBUTION, denoted by Z. Hence, the PDF of Z is given by
f(z) = (1/√(2π)) exp(−z²/2), −∞ < z < +∞.

Recall the transformation discussed in the previous chapter: if X ~ N(μ, σ²), then
Z = (X − μ)/σ
is standard normal. This transformation from normal to standard normal is helpful especially if we are computing probabilities such as P(X ≤ a). Since there is no closed form for the CDF of X, we avoid the complication of performing the integral by using the transformation together with the standard normal distribution table.

Recall the following notes on reading the standard table for z ≥ 0:
1. P(Z ≤ z) is the area under the graph of the PDF from −∞ to z.
2. The graph of the PDF is symmetric about the origin.
3. P(Z > z) = 1 − P(Z ≤ z) is the area under the graph of the PDF from z to +∞.

Example
Suppose X ~ N(2, 4). Calculate
1. P(X ≤ 1.16)
2. P(X > −1)
3. P(0 ≤ X ≤ 0.5)
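The example above can be checked by standardizing and evaluating Φ numerically instead of reading a printed table (here Φ is built from `math.erf`, a standard identity):

```python
import math

def phi(z):
    """Standard normal CDF: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# X ~ N(2, 4): mean mu = 2, variance sigma^2 = 4, so sigma = 2.
mu, sigma = 2.0, 2.0

p1 = phi((1.16 - mu) / sigma)                 # P(X <= 1.16) = Phi(-0.42)
p2 = 1.0 - phi((-1.0 - mu) / sigma)           # P(X > -1)    = 1 - Phi(-1.5)
p3 = phi((0.5 - mu) / sigma) - phi((0.0 - mu) / sigma)  # P(0 <= X <= 0.5)

print(round(p1, 4), round(p2, 4), round(p3, 4))
```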
Normal Approximations
The Central Limit Theorem
Let X1, X2, ..., Xn be identically distributed and independent random variables with mean μ and variance σ². Then, as n becomes large,
X1 + X2 + ... + Xn ~ N(nμ, nσ²).
In other words,
(X1 + X2 + ... + Xn − nμ) / √(nσ²) ~ Z.
One of the applications of the Central Limit Theorem is normal approximation.

Example
Suppose that there are 100 independent and identically distributed random variables uniformly distributed over (1, 4).

Example
Suppose there are 9 independent Bernoulli random variables with parameter 0.5. Let S = X1 + X2 + ... + X9. Compare the exact calculation of P(S ≤ 3) with the approximation using the Central Limit Theorem.

Continuity Correction
Let S be a sum of n independent, identically distributed discrete random variables with mean μ and variance σ². Then the normal approximation is given by
P(S ≤ k) ≈ P(Z ≤ (k + 0.5 − nμ) / √(nσ²)).
This is used when a discrete random variable is approximated by a normal variable.

Example
From the previous example, approximate P(S ≤ 3) using the continuity correction.

Other Approximation Theorems
1. Poisson Approximation of the Binomial
Let X1, X2, ... be independent Binomial random variables with parameters n and p. Suppose that lim (n→∞) np = λ. Then Xn converges in distribution to a Poisson random variable with mean λ.
2. Law of Large Numbers
Let X1, X2, ..., Xn be independent and identically distributed random variables with mean μ and variance σ². Let Sn = X1 + X2 + ... + Xn. Then,
Sn/n → μ as n → +∞.
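The Bernoulli example above can be worked numerically: S is Binomial(9, 0.5), so the exact P(S ≤ 3) can be compared with the plain CLT approximation and the continuity-corrected one (Φ is again computed via `math.erf`):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# S = X1 + ... + X9 with Xi ~ Bernoulli(0.5), so S ~ Binomial(9, 0.5).
n, p, k = 9, 0.5, 3
mu, var = n * p, n * p * (1 - p)    # n*mu = 4.5, n*sigma^2 = 2.25

# Exact: P(S <= 3) as a sum of binomial probabilities.
exact = sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

# Plain CLT approximation and the continuity-corrected version.
clt = phi((k - mu) / math.sqrt(var))
corrected = phi((k + 0.5 - mu) / math.sqrt(var))

print(round(exact, 4), round(clt, 4), round(corrected, 4))
```

The continuity-corrected value lies much closer to the exact probability than the plain CLT value, which is the point of the correction for discrete sums.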