StatIdea Slides 4
Special Distributions
If k = 1, then x̃ is a constant.
x2 \ x1    1      2      3      4      5      6
   0      1/12   1/12   1/12   1/12   1/12   1/12
   1      1/12   1/12   1/12   1/12   1/12   1/12

Therefore,
$$f(x_1, x_2) = \frac{1}{12}, \quad \text{for } (x_1, x_2) \in \{1, 2, 3, 4, 5, 6\} \times \{0, 1\}.$$
for all $A \in \mathcal{B}$.
Note that $P_{\tilde{x}}\{a\} = 1$, i.e., the value $a$ is taken almost surely by the random variable $\tilde{x}$.
or
$$f(x; \theta) = \begin{cases} 1 - \theta & \text{for } x = 0 \\ \theta & \text{for } x = 1. \end{cases}$$
$$P\{\tilde{x} = x\} = f_{\tilde{x}}(x) = \underbrace{\binom{4}{x}\left(\frac{1}{2}\right)^{x}\left(\frac{1}{2}\right)^{4-x}}_{b(x;\,4,\,1/2)} = \binom{4}{x}\left(\frac{1}{2}\right)^{4} = \frac{1}{16}\binom{4}{x}, \quad \text{for } \underbrace{x = 0, 1, 2, 3, 4}_{\tilde{x}(\Omega)}$$
or
$$f_{\tilde{x}}(x) = b\!\left(x;\, 4, \tfrac{1}{2}\right) = \begin{cases} 1/16 = 0.0625 & \text{for } x = 0 \\ 4/16 = 0.25 & \text{for } x = 1 \\ 6/16 = 0.375 & \text{for } x = 2 \\ 4/16 = 0.25 & \text{for } x = 3 \\ 1/16 = 0.0625 & \text{for } x = 4. \end{cases}$$
J. Caballé (UAB - MOVE - BSE) Probability and Statistics IDEA 10 / 90
Probability Histogram (which is symmetric iff $\theta = 1/2$):
$$P\{\tilde{x} = x\} = f_{\tilde{x}}(x) = \underbrace{\binom{4}{x}\left(\frac{1}{3}\right)^{x}\left(\frac{2}{3}\right)^{4-x}}_{b(x;\,4,\,1/3)} \quad \text{for } \underbrace{x = 0, 1, 2, 3, 4}_{\tilde{x}(\Omega)}$$
or
$$f_{\tilde{x}}(x) = b\!\left(x;\, 4, \tfrac{1}{3}\right) = \begin{cases} 0.1975 & \text{for } x = 0 \\ 0.3951 & \text{for } x = 1 \\ 0.2963 & \text{for } x = 2 \\ 0.0988 & \text{for } x = 3 \\ 0.0123 & \text{for } x = 4. \end{cases}$$
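The two tables of binomial probabilities above are easy to reproduce in Python; the helper name `binom_pmf` is introduced here for illustration and is not part of the slides.

```python
from math import comb

def binom_pmf(x, n, theta):
    # b(x; n, theta) = C(n, x) * theta^x * (1 - theta)^(n - x)
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

# the b(x; 4, 1/2) column: 1/16, 4/16, 6/16, 4/16, 1/16
half = [binom_pmf(x, 4, 0.5) for x in range(5)]
print(half)   # [0.0625, 0.25, 0.375, 0.25, 0.0625]

# the b(x; 4, 1/3) column, rounded to four decimals
third = [round(binom_pmf(x, 4, 1/3), 4) for x in range(5)]
print(third)  # [0.1975, 0.3951, 0.2963, 0.0988, 0.0123]
```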
Proof:
$$M_{\tilde{x}}(t) = E\!\left(e^{t\tilde{x}}\right) = \sum_{x=0}^{n} e^{tx}\underbrace{\binom{n}{x}\theta^{x}(1-\theta)^{n-x}}_{b(x;\,n,\,\theta)} = \sum_{x=0}^{n}\binom{n}{x}\left(\theta e^{t}\right)^{x}(1-\theta)^{n-x} = \left[\theta e^{t} + (1-\theta)\right]^{n} = \left[1 + \theta(e^{t}-1)\right]^{n}.$$
Then,
$$\mu = M_{\tilde{x}}'(0) = n\left[1 + \theta(e^{t}-1)\right]^{n-1}\theta e^{t}\Big|_{t=0} = n\theta$$
and
$$\sigma^2 = E\!\left(\tilde{x}^2\right) - [E(\tilde{x})]^2 = M_{\tilde{x}}''(0) - \mu^2 = \left\{n(n-1)\left[1 + \theta(e^{t}-1)\right]^{n-2}\theta^2 e^{2t} + n\left[1 + \theta(e^{t}-1)\right]^{n-1}\theta e^{t}\right\}\Big|_{t=0} - n^2\theta^2$$
$$= n(n-1)\theta^2 + n\theta - n^2\theta^2 = -n\theta^2 + n\theta = n\theta(1-\theta).$$
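The results $\mu = n\theta$ and $\sigma^2 = n\theta(1-\theta)$ can be sanity-checked by computing the moments directly from the pmf; the parameter values $n = 4$, $\theta = 0.3$ are arbitrary choices for the check, not from the slides.

```python
from math import comb, isclose

n, theta = 4, 0.3  # hypothetical parameter values for the check
pmf = [comb(n, x) * theta**x * (1 - theta)**(n - x) for x in range(n + 1)]

# raw moments computed directly from the pmf
mean = sum(x * p for x, p in enumerate(pmf))
var = sum(x**2 * p for x, p in enumerate(pmf)) - mean**2

assert isclose(mean, n * theta)               # mu = n * theta
assert isclose(var, n * theta * (1 - theta))  # sigma^2 = n * theta * (1 - theta)
```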
The Pascal (negative binomial or binomial waiting-time)
distribution.
Recall that
$$\binom{n}{x_1, x_2, \ldots, x_k} = \frac{n!}{x_1!\, x_2! \cdots x_k!}.$$
$$\int_{[a,b]} G(x)\,dF(x) = G(a)\left(F(a) - F(a^{-})\right) + F(b)G(b) - F(a)G(a) - \int_{[a,b]} F(x)G'(x)\,dx$$
$$= F(b)G(b) - F(a^{-})G(a) - \int_{[a,b]} F(x)G'(x)\,dx.$$
$$\int_{[a,b)} G(x)\,dF(x) = G(a)\left(F(a) - F(a^{-})\right) + \underbrace{F(b^{-})G(b) - F(a)G(a) - \int_{(a,b)} F(x)G'(x)\,dx}_{\int_{(a,b)} G(x)\,dF(x)}$$
$$= F(b^{-})G(b) - F(a^{-})G(a) - \int_{[a,b)} F(x)G'(x)\,dx.$$
which is given by
$$J_g(y_1, y_2, \ldots, y_n) = \begin{pmatrix} \dfrac{\partial g_1(y)}{\partial y_1} & \dfrac{\partial g_1(y)}{\partial y_2} & \cdots & \dfrac{\partial g_1(y)}{\partial y_n} \\[2ex] \dfrac{\partial g_2(y)}{\partial y_1} & \dfrac{\partial g_2(y)}{\partial y_2} & \cdots & \dfrac{\partial g_2(y)}{\partial y_n} \\[2ex] \vdots & \vdots & \ddots & \vdots \\[1ex] \dfrac{\partial g_n(y)}{\partial y_1} & \dfrac{\partial g_n(y)}{\partial y_2} & \cdots & \dfrac{\partial g_n(y)}{\partial y_n} \end{pmatrix},$$
$$g: \mathbb{R}_{++} \times [0, 2\pi) \longrightarrow \mathbb{R}^2,$$
given by
$$(x, y) = g(r, \theta), \quad \text{with} \quad \begin{cases} x = r \cdot \cos\theta \\ y = r \cdot \sin\theta. \end{cases}$$
Moreover, the function $g$ restricted to
$$g^{-1}\!\left(\mathbb{R}^2 \setminus \{(0,0)\}\right) = \mathbb{R}_{++} \times [0, 2\pi),$$
that is, $g: \mathbb{R}_{++} \times [0, 2\pi) \longrightarrow \mathbb{R}^2 \setminus \{(0,0)\}$, is bijective.
$$(r, \theta) = g^{-1}(x, y), \quad \text{with} \quad \begin{cases} r = \left(x^2 + y^2\right)^{1/2} \\[1ex] \theta = \arctan\!\left(\dfrac{y}{x}\right). \end{cases}$$
Moreover,
$$J_g(r, \theta) = \begin{pmatrix} \cos\theta & -r \cdot \sin\theta \\ \sin\theta & r \cdot \cos\theta \end{pmatrix},$$
so that
$$|J_g(r, \theta)| = \left|r\cos^2\theta + r\sin^2\theta\right| = |r| = r.$$
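The identity $|J_g(r,\theta)| = r$ can be cross-checked numerically with central finite differences on the map $g$; the helper names and the step size below are ad-hoc choices, not part of the slides.

```python
from math import cos, sin, isclose

def g(r, q):
    # polar-to-Cartesian map: x = r*cos(q), y = r*sin(q)
    return (r * cos(q), r * sin(q))

def jacobian_det(r, q, h=1e-6):
    # numerical Jacobian determinant via central differences
    dx_dr = (g(r + h, q)[0] - g(r - h, q)[0]) / (2 * h)
    dy_dr = (g(r + h, q)[1] - g(r - h, q)[1]) / (2 * h)
    dx_dq = (g(r, q + h)[0] - g(r, q - h)[0]) / (2 * h)
    dy_dq = (g(r, q + h)[1] - g(r, q - h)[1]) / (2 * h)
    return dx_dr * dy_dq - dx_dq * dy_dr

# |J_g(r, q)| = r, independently of the angle q
assert isclose(jacobian_det(2.5, 0.7), 2.5, rel_tol=1e-4)
```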
$$= \left(\int_{\mathbb{R}_{++}} e^{-r^2} r\,dr\right)\left(\int_{[0,\pi/2)} d\theta\right) = \left[\frac{-e^{-r^2}}{2}\right]_0^{\infty} \cdot \left[\theta\right]_0^{\pi/2} = \frac{1}{2}\cdot\frac{\pi}{2} = \frac{\pi}{4}.$$
Note that
$$\int_0^{\infty}\!\!\int_0^{\infty} e^{-(x^2+y^2)}\,dx\,dy = \left(\int_0^{\infty} e^{-x^2}\,dx\right)\left(\int_0^{\infty} e^{-y^2}\,dy\right) = M^2,$$
where $M = \int_0^{\infty} e^{-x^2}\,dx = \int_0^{\infty} e^{-y^2}\,dy$.
Therefore,
$$M = \int_0^{\infty} e^{-x^2}\,dx = \left(\frac{\pi}{4}\right)^{1/2} = \frac{\sqrt{\pi}}{2} \;\Longrightarrow\; \int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}.$$
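The conclusion $\int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}$ can be verified with a crude midpoint Riemann sum; truncating to $[-10, 10]$ is safe because the integrand decays like $e^{-x^2}$.

```python
from math import exp, pi, sqrt, isclose

def integral(f, a, b, n=100_000):
    # midpoint Riemann sum; crude but adequate here
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# e^{-x^2} decays so fast that [-10, 10] captures essentially all the mass
val = integral(lambda x: exp(-x * x), -10.0, 10.0)
assert isclose(val, sqrt(pi), rel_tol=1e-6)
```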
Mean:
$$\mu = E(\tilde{x}) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_a^b x\left(\frac{1}{b-a}\right)dx = \frac{1}{b-a}\left[\frac{x^2}{2}\right]_a^b = \frac{1}{b-a}\left(\frac{b^2 - a^2}{2}\right) = \frac{b+a}{2},$$
so that
$$\operatorname{Var}(\tilde{x}) = E\!\left(\tilde{x}^2\right) - [E(\tilde{x})]^2 = \frac{b^3 - a^3}{3(b-a)} - \left(\frac{b+a}{2}\right)^2 = \frac{(b-a)^2}{12}.$$
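A numerical sketch of the uniform moments, with hypothetical endpoints $a = 2$, $b = 7$ chosen only for the check:

```python
from math import isclose

a, b = 2.0, 7.0  # hypothetical endpoints for the check
n = 200_000
h = (b - a) / n
xs = [a + (i + 0.5) * h for i in range(n)]

# E(x) and E(x^2) under the density 1/(b - a) on [a, b], by midpoint rule
mean = h * sum(x / (b - a) for x in xs)
second = h * sum(x * x / (b - a) for x in xs)

assert isclose(mean, (a + b) / 2, rel_tol=1e-6)               # (b + a)/2
assert isclose(second - mean**2, (b - a) ** 2 / 12, rel_tol=1e-4)  # (b - a)^2/12
```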
Therefore,
$$k = \frac{1}{\beta^{\alpha}\,\Gamma(\alpha)}.$$
Note that (a), (b), and the continuity of the gamma function imply
that
[Figure: graph of the gamma function $\Gamma(\alpha)$ for $\alpha \in (0, 4]$.]
Corollary.
(i) $\mu = \mu_1' = \alpha\beta$,
(ii) $\mu_2' = \alpha(\alpha+1)\beta^2$,
(iii) $\sigma^2 = \mu_2' - \mu^2 = \alpha\beta^2$.
Moment-generating function (from the gamma m.g.f. with $\alpha = 1$ and $\beta = \theta$):
$$M_{\tilde{x}}(t) = (1 - \theta t)^{-1} = \frac{1}{1 - \theta t}, \quad \text{for } t < 1/\theta.$$
Notation:
$$n(x; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}, \quad \text{with } \sigma > 0, \text{ for all } x \in \mathbb{R}.$$
Notation: $\tilde{x} \sim N(\mu, \sigma^2)$.
$$M_{\tilde{x}}'(t) = (\mu + \sigma^2 t)\,M_{\tilde{x}}(t) \;\Rightarrow\; M_{\tilde{x}}'(0) = \mu = E(\tilde{x}). \tag{3}$$
[Figure: standard normal density with the area between $0$ and $z$ shaded.]
The previous table gives the area of the shaded region.
$$N(b) - N(a) = 1 - N(-b) - [1 - N(-a)] = N(-a) - N(-b).$$
$$N(b) - N(a) = N(b) - [1 - N(-a)] = N(b) + N(-a) - 1.$$
and
$$\operatorname{Cov}(\tilde{x}_i, \tilde{x}_j) = \sigma_{ij}.$$
Note: If $n_1 = n_2 = 1$, then
$$E(\tilde{x}_1 \,|\, \tilde{x}_2 = x_2) = \mu_1 + \frac{\sigma_{12}}{\sigma_2^2}(x_2 - \mu_2).$$
Let
$$E\!\left([\tilde{x}_1 - E(\tilde{x}_1|\tilde{x}_2 = x_2)][\tilde{x}_1 - E(\tilde{x}_1|\tilde{x}_2 = x_2)]^{\top} \,\big|\, \tilde{x}_2 = x_2\right)$$
be the $n_1 \times n_1$ conditional covariance matrix of the random vector $\tilde{x}_1$ given $\tilde{x}_2 = x_2$. Then,
$$\Sigma_{\tilde{x}_1|\tilde{x}_2 = x_2} = \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21},$$
which does not depend on the value $x_2$ taken by the random vector $\tilde{x}_2$.
Note: If $n_1 = n_2 = 1$, then
$$\operatorname{Var}(\tilde{x}_1 \,|\, \tilde{x}_2 = x_2) = \sigma_1^2 - \frac{\sigma_{12}^2}{\sigma_2^2} = \sigma_1^2(1 - \rho^2),$$
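The two scalar formulas can be wrapped in a small helper; this is a sketch, and the function name and the test numbers are assumptions made here, not from the slides.

```python
from math import isclose, sqrt

def conditional_moments(mu1, mu2, s11, s22, s12, x2):
    """Conditional mean and variance of x1 given x2 for a bivariate
    normal; s11, s22 are the variances, s12 the covariance."""
    cond_mean = mu1 + (s12 / s22) * (x2 - mu2)
    cond_var = s11 - s12**2 / s22
    return cond_mean, cond_var

m, v = conditional_moments(mu1=1.0, mu2=-2.0, s11=4.0, s22=9.0, s12=3.0, x2=1.0)
rho = 3.0 / sqrt(4.0 * 9.0)                  # correlation = 0.5
assert isclose(m, 1.0 + (3.0 / 9.0) * 3.0)   # mu1 + (s12/s22)(x2 - mu2)
assert isclose(v, 4.0 * (1 - rho**2))        # s11 * (1 - rho^2)
```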
Note that in the previous result the vector $\tilde{x} = (\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n)^{\top}$ has to be multivariate normal. It is not enough that each component of that vector be normal.
Mean:
$$= a + \underbrace{b^{\top}\mu_x}_{=\mu_x^{\top} b} + 0 = a + b^{\top}\mu_x. \tag{1}$$
Variance:
$$= a + E\!\left(b^{\top}\tilde{x} \,\big|\, \tilde{x} = x\right) + E\!\left(\tilde{\varepsilon} \,\big|\, \tilde{x} = x\right) = a + \underbrace{E\!\left(b^{\top}x\right)}_{=x^{\top} b} + E(\tilde{\varepsilon}) = a + b^{\top}x,$$
and
$$b^{\top} = \Sigma_{y,x}\Sigma_x^{-1} \quad \text{or} \quad b = \Sigma_x^{-1}\Sigma_{y,x}^{\top} = \Sigma_x^{-1}\Sigma_{x,y}. \tag{4}$$
so that
$$\Sigma_{y,x}\Sigma_x^{-1} = b^{\top}\Sigma_x\Sigma_x^{-1} = b^{\top}.$$
and, thus, $\operatorname{Var}(\tilde{y}\,|\,\tilde{x} = x)$ does not depend on the value $x$ taken by the random vector $\tilde{x}$, i.e., the random variable $\operatorname{Var}(\tilde{y}\,|\,\tilde{x})$ is a constant.
We can check that the previous equality indeed holds since, from (2) and (4), we get
and
$$\Sigma = \begin{pmatrix} \sigma_y^2 & \Sigma_{y,x} \\ \Sigma_{x,y} & \Sigma_x \end{pmatrix}.$$
Then, we know from Property 4 above that
$$E(\tilde{y}\,|\,\tilde{x} = x) = \mu_y + \Sigma_{y,x}\Sigma_x^{-1}(x - \mu_x) = \left[\mu_y - \Sigma_{y,x}\Sigma_x^{-1}\mu_x\right] + \Sigma_{y,x}\Sigma_x^{-1}x,$$
or equivalently,
$$E(\tilde{y}\,|\,\tilde{x}) = \left[\mu_y - \Sigma_{y,x}\Sigma_x^{-1}\mu_x\right] + \Sigma_{y,x}\Sigma_x^{-1}\tilde{x},$$
so that $E(\tilde{y}\,|\,\tilde{x})$ is an affine transformation of $\tilde{x}$. Thus, since $\tilde{x} \sim MN(\mu_x, \Sigma_x)$, the random variable $E(\tilde{y}\,|\,\tilde{x})$ is normal as dictated by the General Proposition above.
Define the random variable
$$\tilde{\varepsilon} = \tilde{y} - E(\tilde{y}\,|\,\tilde{x}) = \tilde{y} - \left[\mu_y - \Sigma_{y,x}\Sigma_x^{-1}\mu_x\right] - \Sigma_{y,x}\Sigma_x^{-1}\tilde{x}.$$
$$E(\tilde{\varepsilon}) = E\left[\tilde{y} - E(\tilde{y}\,|\,\tilde{x})\right] = E(\tilde{y}) - E\left[E(\tilde{y}\,|\,\tilde{x})\right] = E(\tilde{y}) - E(\tilde{y}) = 0.$$
Then, we can define the scalar $a = \mu_y - \Sigma_{y,x}\Sigma_x^{-1}\mu_x$ and the column vector $b = \Sigma_x^{-1}\Sigma_{x,y} \in \mathbb{R}^n$ so that the previous equation becomes
$$\tilde{y} = a + \underbrace{b^{\top}\tilde{x}}_{=\tilde{x}^{\top} b} + \tilde{\varepsilon}.$$
Note that when we consider an interval $[a, b]$, we must have $a < b$. From now on, whenever we write the integral of a function w.r.t. the Lebesgue measure it should be understood that the function is not only integrable w.r.t. that measure, but also that it is Riemann integrable so that
$$\int_{[a,b]} f(x)\,dx = \int_a^b f(x)\,dx. \tag{1}$$
Let $x = g(y)$ and assume that $g$ is differentiable on $g^{-1}([a,b])$. This requirement is fulfilled if we assume that the function $g: M \longrightarrow \mathbb{R}$ is differentiable and $M$ is an open subset of $\mathbb{R}$ with $g^{-1}([a,b]) \subseteq M$ (or equivalently with $[a,b] \subseteq g(M)$).
Consider the inverse function $g^{-1}: [a,b] \longrightarrow \mathbb{R}$ so that $y = g^{-1}(x)$. This inverse function $g^{-1}$ exists if and only if the function $g$ restricted to $g^{-1}([a,b])$, i.e., $g: g^{-1}([a,b]) \longrightarrow [a,b]$, is bijective (or a one-to-one correspondence). That is, $g$ must be strictly increasing ($g' > 0$ a.e.) or strictly decreasing ($g' < 0$ a.e.) on $g^{-1}([a,b])$.
Therefore, if $x = a$ then $y = g^{-1}(a)$, whereas if $x = b$ then $y = g^{-1}(b)$.
From the theory of Riemann integration, recall that
$$\int_a^b f(x)\,dx = F(b) - F(a), \quad \text{where } F' = f \text{ on } [a,b].$$
Therefore,
$$\int_b^a f(x)\,dx = F(a) - F(b) = -\int_a^b f(x)\,dx. \tag{2}$$
Note that the primitive (or antiderivative) of $f(g(y)) \cdot g'(y)$ is $F(g(y))$, as follows from the chain rule,
$$\frac{dF(g(y))}{dy} = F'(g(y)) \cdot g'(y) = f(g(y)) \cdot g'(y).$$
Therefore,
$$\int_{g^{-1}(a)}^{g^{-1}(b)} f(g(y))\,g'(y)\,dy = \Big[F(g(y))\Big]_{g^{-1}(a)}^{g^{-1}(b)} = F(g(g^{-1}(b))) - F(g(g^{-1}(a))) = F(b) - F(a) = \int_a^b f(x)\,dx. \tag{3}$$
The previous formula holds both for g strictly increasing and for g strictly decreasing.
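The substitution formula (3) can be illustrated numerically; the concrete choice $f(x) = e^x$ on $[0,1]$ with $g(y) = y^2$ below is only an example, and `integral` is a helper introduced here.

```python
from math import exp, isclose

def integral(f, a, b, n=100_000):
    # midpoint Riemann sum
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# f(x) = e^x on [0, 1] with the substitution x = g(y) = y^2,
# so g'(y) = 2y and g^{-1}(0) = 0, g^{-1}(1) = 1
lhs = integral(lambda y: exp(y * y) * 2 * y, 0.0, 1.0)
rhs = integral(lambda x: exp(x), 0.0, 1.0)
assert isclose(lhs, rhs, rel_tol=1e-8)  # both equal e - 1
```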
1. The Gamma Distribution

1.A. $\Gamma\!\left(\dfrac{1}{2}\right) = \sqrt{\pi}$.
Proof.
1st step: Let $\Gamma(\alpha) = \int_0^{\infty} y^{\alpha-1} e^{-y}\,dy$. We make the following change of variable:
$$y = g(z) = \frac{1}{2}z^2 \;\Rightarrow\; \frac{dy}{dz} = g'(z) = z, \quad \text{for } y > 0,\ z > 0.$$
$$\Rightarrow\ \Gamma(\alpha) = \int_0^{\infty} \left(\frac{z^2}{2}\right)^{\alpha-1} e^{-\frac{1}{2}z^2}\, z\,dz = 2^{1-\alpha}\int_0^{\infty} z^{2\alpha-1} e^{-\frac{1}{2}z^2}\,dz.$$
$$\Longrightarrow\ \Gamma\!\left(\frac{1}{2}\right) = \sqrt{2}\int_0^{\infty} e^{-\frac{1}{2}z^2}\,dz.$$
2nd step:
$$\left[\Gamma\!\left(\tfrac{1}{2}\right)\right]^2 = 2\left(\int_0^{\infty} e^{-\frac{1}{2}z^2}\,dz\right)\left(\int_0^{\infty} e^{-\frac{1}{2}x^2}\,dx\right) = 2\int_0^{\infty}\!\!\int_0^{\infty} e^{-\frac{1}{2}(z^2+x^2)}\,dz\,dx.$$
Changing to polar coordinates,
$$= 2\left(\int_0^{\infty} e^{-\frac{1}{2}r^2}\, r\,dr\right)\cdot\left[\theta\right]_0^{\pi/2} = 2\left[-e^{-\frac{1}{2}r^2}\right]_0^{\infty}\cdot\frac{\pi}{2} = 2\cdot 1\cdot\frac{\pi}{2} = \pi.$$
$$\Longrightarrow\ \left[\Gamma\!\left(\tfrac{1}{2}\right)\right]^2 = \pi \;\Longrightarrow\; \Gamma\!\left(\tfrac{1}{2}\right) = \sqrt{\pi}.$$
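Result 1.A agrees with the Gamma function shipped in Python's standard library:

```python
from math import gamma, pi, sqrt, isclose

# math.gamma implements the Gamma function directly
assert isclose(gamma(0.5), sqrt(pi))

# the recursion Gamma(a + 1) = a * Gamma(a) then gives, e.g.,
assert isclose(gamma(1.5), 0.5 * sqrt(pi))
assert isclose(gamma(2.5), 1.5 * 0.5 * sqrt(pi))
```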
###############
1.B. $\int_0^{\infty} e^{-\frac{1}{2}z^2}\,dz = \sqrt{\dfrac{\pi}{2}}$.
Proof. From 1.A.,
$$\int_0^{\infty} e^{-\frac{1}{2}z^2}\,dz = \frac{1}{\sqrt{2}}\,\Gamma\!\left(\frac{1}{2}\right) = \frac{\sqrt{\pi}}{\sqrt{2}},$$
where the first equality comes from step 1 in 1.A. and the second one comes from step 2 in 1.A. Then,
$$\int_0^{\infty} e^{-\frac{1}{2}z^2}\,dz = \sqrt{\frac{\pi}{2}}.$$
###############
1.C. $\mu_r' = \dfrac{\beta^{r}\,\Gamma(\alpha + r)}{\Gamma(\alpha)}$.
Proof.
$$\mu_r' = \int_0^{\infty} x^{r}\,\frac{1}{\beta^{\alpha}\Gamma(\alpha)}\, x^{\alpha-1} e^{-x/\beta}\,dx.$$
Making the change of variable
$$x = g(y) = \beta y \;\Longleftrightarrow\; y = g^{-1}(x) = \frac{x}{\beta}, \quad \text{so that} \quad \frac{dx}{dy} = g'(y) = \beta > 0,$$
$$\mu_r' = \int_0^{\infty} \beta^{r} y^{r}\,\frac{1}{\beta^{\alpha}\Gamma(\alpha)}\,\beta^{\alpha-1} y^{\alpha-1} e^{-y}\,\beta\,dy = \frac{\beta^{r}}{\Gamma(\alpha)}\underbrace{\int_0^{\infty} y^{\alpha+r-1} e^{-y}\,dy}_{\Gamma(\alpha+r)} = \frac{\beta^{r}\,\Gamma(\alpha+r)}{\Gamma(\alpha)}.$$
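Formula 1.C immediately delivers the gamma mean $\alpha\beta$ and variance $\alpha\beta^2$ stated in the Corollary; a quick check with hypothetical values $\alpha = 3$, $\beta = 2$:

```python
from math import gamma, isclose

def gamma_raw_moment(r, alpha, beta):
    # mu'_r = beta^r * Gamma(alpha + r) / Gamma(alpha)
    return beta**r * gamma(alpha + r) / gamma(alpha)

alpha, beta = 3.0, 2.0  # hypothetical parameter values for the check
m1 = gamma_raw_moment(1, alpha, beta)
m2 = gamma_raw_moment(2, alpha, beta)

assert isclose(m1, alpha * beta)             # mean = alpha * beta
assert isclose(m2 - m1**2, alpha * beta**2)  # variance = alpha * beta^2
```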
###############
1.D. $M_{\tilde{x}}(t) = (1 - \beta t)^{-\alpha}$ if $t < 1/\beta$.
Proof.
$$M_{\tilde{x}}(t) = \int_0^{\infty} e^{tx}\,\frac{1}{\beta^{\alpha}\Gamma(\alpha)}\, x^{\alpha-1} e^{-x/\beta}\,dx = \frac{1}{\beta^{\alpha}\Gamma(\alpha)}\int_0^{\infty} x^{\alpha-1} e^{-x\left(\frac{1}{\beta}-t\right)}\,dx.$$
Change of variable:
$$x = g(y) = \frac{y}{\frac{1}{\beta}-t} \;\Longleftrightarrow\; y = g^{-1}(x) = x\left(\frac{1}{\beta}-t\right),$$
so that
$$\frac{dx}{dy} = g'(y) = \frac{1}{\frac{1}{\beta}-t} > 0, \quad \text{if } t < 1/\beta.$$
$$M_{\tilde{x}}(t) = \frac{1}{\beta^{\alpha}\Gamma(\alpha)}\int_0^{\infty} \left(\frac{y}{\frac{1}{\beta}-t}\right)^{\alpha-1} e^{-y}\,\frac{1}{\frac{1}{\beta}-t}\,dy = \frac{1}{\beta^{\alpha}\left(\frac{1}{\beta}-t\right)^{\alpha}\Gamma(\alpha)}\underbrace{\int_0^{\infty} y^{\alpha-1} e^{-y}\,dy}_{\Gamma(\alpha)} = \frac{1}{\beta^{\alpha}\left(\frac{1}{\beta}-t\right)^{\alpha}}$$
$$= \frac{1}{(1 - \beta t)^{\alpha}} = (1 - \beta t)^{-\alpha} \quad \text{if } t < 1/\beta.$$
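The closed form in 1.D can be compared against a direct numerical evaluation of $E(e^{t\tilde{x}})$; the integration range, grid size, and the parameter values $\alpha = 2$, $\beta = 1.5$, $t = 0.2$ are ad-hoc choices for this sketch.

```python
from math import exp, gamma, isclose

def gamma_mgf_numeric(t, alpha, beta, upper=200.0, n=400_000):
    # midpoint Riemann sum of e^{tx} times the gamma density on (0, upper)
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += exp(t * x) * x**(alpha - 1) * exp(-x / beta)
    return h * total / (beta**alpha * gamma(alpha))

alpha, beta, t = 2.0, 1.5, 0.2  # hypothetical values, with t < 1/beta
assert isclose(gamma_mgf_numeric(t, alpha, beta),
               (1 - beta * t) ** (-alpha), rel_tol=1e-4)
```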
###############
2. The Normal distribution

2.A. $\int_{-\infty}^{\infty} n(x; \mu, \sigma)\,dx = 1$.
Proof. Let
$$x = g(z) = \mu + \sigma z \;\Longleftrightarrow\; z = g^{-1}(x) = \frac{x-\mu}{\sigma} \;\Rightarrow\; \frac{dx}{dz} = g'(z) = \sigma > 0.$$
Then,
$$\int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\,dx = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}z^2}\,dz = \frac{2}{\sqrt{2\pi}}\int_0^{\infty} e^{-\frac{1}{2}z^2}\,dz = \frac{2}{\sqrt{2\pi}}\cdot\sqrt{\frac{\pi}{2}} = 1,$$
where the second equality comes from the symmetry of the function $e^{-\frac{1}{2}z^2}$ and
the third equality comes from 1.B.
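Result 2.A checks out numerically; $\mu = 1$, $\sigma = 2$ are arbitrary parameter values for the sketch, and truncating to $\mu \pm 10\sigma$ loses only a negligible tail.

```python
from math import exp, pi, sqrt, isclose

def normal_pdf(x, mu, sigma):
    # n(x; mu, sigma) as defined in the slides
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

mu, sigma = 1.0, 2.0  # hypothetical parameter values for the check
lo, hi, n = mu - 10 * sigma, mu + 10 * sigma, 200_000
h = (hi - lo) / n
total = h * sum(normal_pdf(lo + (i + 0.5) * h, mu, sigma) for i in range(n))
assert isclose(total, 1.0, rel_tol=1e-6)
```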
###############
2.B. $M_{\tilde{x}}(t) = e^{\mu t + \frac{1}{2}t^2\sigma^2}$.
Proof. We will use in this proof a technique called "completing the square".
$$M_{\tilde{x}}(t) = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\,dx = \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2\sigma^2}\left[-2xt\sigma^2 + (x-\mu)^2\right]}\,dx.$$
Observe that
$$-2xt\sigma^2 + (x-\mu)^2 = \left[x - (\mu + t\sigma^2)\right]^2 - 2\mu t\sigma^2 - t^2\sigma^4,$$
so that
$$M_{\tilde{x}}(t) = e^{\mu t + \frac{1}{2}t^2\sigma^2}\int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2\sigma^2}\left[x - (\mu + t\sigma^2)\right]^2}\,dx = e^{\mu t + \frac{1}{2}t^2\sigma^2},$$
since $\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2\sigma^2}\left[x - (\mu + t\sigma^2)\right]^2} = n(x; \mu + t\sigma^2, \sigma)$ is the normal density function with parameters $\mu + t\sigma^2$ and $\sigma$, and hence integrates to one.
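The closed form in 2.B can likewise be compared with a direct numerical evaluation of $E(e^{t\tilde{x}})$; the parameter values below are arbitrary choices for the check.

```python
from math import exp, pi, sqrt, isclose

mu, sigma, t = 0.5, 1.2, 0.7  # hypothetical values for the check

# e^{tx} n(x; mu, sigma) is proportional to an N(mu + t*sigma^2, sigma)
# density, so center the integration range on the shifted mean
c = mu + t * sigma**2
lo, hi, n = c - 12 * sigma, c + 12 * sigma, 400_000
h = (hi - lo) / n
mgf = h * sum(
    exp(t * x) * exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))
    for x in (lo + (i + 0.5) * h for i in range(n))
)
assert isclose(mgf, exp(mu * t + 0.5 * t**2 * sigma**2), rel_tol=1e-6)
```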
The Standard Normal Distribution Function N(0,1)
z 0.00 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09
0.0 0.0000 0.0040 0.0080 0.0120 0.0160 0.0199 0.0239 0.0279 0.0319 0.0359
0.1 0.0398 0.0438 0.0478 0.0517 0.0557 0.0596 0.0636 0.0675 0.0714 0.0753
0.2 0.0793 0.0832 0.0871 0.0910 0.0948 0.0987 0.1026 0.1064 0.1103 0.1141
0.3 0.1179 0.1217 0.1255 0.1293 0.1331 0.1368 0.1406 0.1443 0.1480 0.1517
0.4 0.1554 0.1591 0.1628 0.1664 0.1700 0.1736 0.1772 0.1808 0.1844 0.1879
0.5 0.1915 0.1950 0.1985 0.2019 0.2054 0.2088 0.2123 0.2157 0.2190 0.2224
0.6 0.2257 0.2291 0.2324 0.2357 0.2389 0.2422 0.2454 0.2486 0.2517 0.2549
0.7 0.2580 0.2611 0.2642 0.2673 0.2704 0.2734 0.2764 0.2794 0.2823 0.2852
0.8 0.2881 0.2910 0.2939 0.2967 0.2995 0.3023 0.3051 0.3078 0.3106 0.3133
0.9 0.3159 0.3186 0.3212 0.3238 0.3264 0.3289 0.3315 0.3340 0.3365 0.3389
1.0 0.3413 0.3438 0.3461 0.3485 0.3508 0.3531 0.3554 0.3577 0.3599 0.3621
1.1 0.3643 0.3665 0.3686 0.3708 0.3729 0.3749 0.3770 0.3790 0.3810 0.3830
1.2 0.3849 0.3869 0.3888 0.3907 0.3925 0.3944 0.3962 0.3980 0.3997 0.4015
1.3 0.4032 0.4049 0.4066 0.4082 0.4099 0.4115 0.4131 0.4147 0.4162 0.4177
1.4 0.4192 0.4207 0.4222 0.4236 0.4251 0.4265 0.4279 0.4292 0.4306 0.4319
1.5 0.4332 0.4345 0.4357 0.4370 0.4382 0.4394 0.4406 0.4418 0.4429 0.4441
1.6 0.4452 0.4463 0.4474 0.4484 0.4495 0.4505 0.4515 0.4525 0.4535 0.4545
1.7 0.4554 0.4564 0.4573 0.4582 0.4591 0.4599 0.4608 0.4616 0.4625 0.4633
1.8 0.4641 0.4649 0.4656 0.4664 0.4671 0.4678 0.4686 0.4693 0.4699 0.4706
1.9 0.4713 0.4719 0.4726 0.4732 0.4738 0.4744 0.4750 0.4756 0.4761 0.4767
2.0 0.4772 0.4778 0.4783 0.4788 0.4793 0.4798 0.4803 0.4808 0.4812 0.4817
2.1 0.4821 0.4826 0.4830 0.4834 0.4838 0.4842 0.4846 0.4850 0.4854 0.4857
2.2 0.4861 0.4864 0.4868 0.4871 0.4875 0.4878 0.4881 0.4884 0.4887 0.4890
2.3 0.4893 0.4896 0.4898 0.4901 0.4904 0.4906 0.4909 0.4911 0.4913 0.4916
2.4 0.4918 0.4920 0.4922 0.4925 0.4927 0.4929 0.4931 0.4932 0.4934 0.4936
2.5 0.4938 0.4940 0.4941 0.4943 0.4945 0.4946 0.4948 0.4949 0.4951 0.4952
2.6 0.4953 0.4955 0.4956 0.4957 0.4959 0.4960 0.4961 0.4962 0.4963 0.4964
2.7 0.4965 0.4966 0.4967 0.4968 0.4969 0.4970 0.4971 0.4972 0.4973 0.4974
2.8 0.4974 0.4975 0.4976 0.4977 0.4977 0.4978 0.4979 0.4979 0.4980 0.4981
2.9 0.4981 0.4982 0.4982 0.4983 0.4984 0.4984 0.4985 0.4985 0.4986 0.4986
3.0 0.4987 0.4987 0.4987 0.4988 0.4988 0.4989 0.4989 0.4989 0.4990 0.4990
Also, for z = 4.0, 5.0, and 6.0 the probabilities are 0.49997, 0.4999997, 0.499999999.
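Each cell of the table is $N(z) - 1/2$, the area under the standard normal density between $0$ and $z$, which the standard library's `erf` reproduces via $N(z) = \frac{1}{2}\left[1 + \operatorname{erf}(z/\sqrt{2})\right]$:

```python
from math import erf, sqrt

def table_entry(z):
    # area under the standard normal density between 0 and z:
    # N(z) - 1/2 = erf(z / sqrt(2)) / 2
    return round(erf(z / sqrt(2)) / 2, 4)

# spot-check a few cells of the table
print(table_entry(1.00))  # 0.3413
print(table_entry(1.96))  # 0.475
print(table_entry(2.58))  # 0.4951
```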