
Chapter 4 (Discrete) Random Variables

Chapter 7 Properties of Expectations


If X and Y are independent, then E(f(X)g(Y)) = E(f(X)) E(g(Y))
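A quick Monte Carlo sanity check of this product rule for independent variables (a minimal sketch; the choice of distributions and of f, g is illustrative, not from the original notes):

```python
import numpy as np

# Monte Carlo check of E[f(X)g(Y)] = E[f(X)]E[g(Y)] for independent X, Y
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(1.0, n)   # X ~ Exp(1)      (illustrative choice)
y = rng.uniform(0.0, 1.0, n)  # Y ~ Unif(0, 1), drawn independently of X

lhs = np.mean(np.sin(x) * y**2)            # E[f(X) g(Y)] with f = sin, g = square
rhs = np.mean(np.sin(x)) * np.mean(y**2)   # E[f(X)] * E[g(Y)]
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```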

Chapter 6 Jointly distributed Random Variables


Chapter 5 Continuous Random Variables

For a monotone transformation Y = g(X): f_Y(y) = f_X(g^{-1}(y)) |d g^{-1}(y)/dy|

For Y1 = g1(X1, X2), Y2 = g2(X1, X2): f_{Y1,Y2}(y1, y2) = f_{X1,X2}(x1, x2) |J(x1, x2)|^{-1}, where J(x1, x2) = ∂(g1, g2)/∂(x1, x2) is the Jacobian determinant.
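A short worked instance of the Jacobian formula above (an illustrative example; this particular transformation is not taken from the original notes):

```latex
\[
Y_1 = X_1 + X_2,\quad Y_2 = X_1 - X_2,\qquad
J(x_1,x_2)=\frac{\partial(g_1,g_2)}{\partial(x_1,x_2)}
=\det\begin{pmatrix}1 & 1\\ 1 & -1\end{pmatrix}=-2,
\]
\[
f_{Y_1,Y_2}(y_1,y_2)
=\frac{1}{|J|}\,f_{X_1,X_2}\!\left(\frac{y_1+y_2}{2},\,\frac{y_1-y_2}{2}\right)
=\frac{1}{2}\,f_{X_1,X_2}\!\left(\frac{y_1+y_2}{2},\,\frac{y_1-y_2}{2}\right).
\]
```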

4. The sum of independent Binomial(ni, p) random variables (same p) is Binomial(Σ ni, p)


5. Hypergeometric model (compare with the Binomial model): draw n balls without replacement; with p = (number of red)/(total), E(X) = np and Var(X) = np(1-p) × (total - n)/(total - 1); when the total is large this is approximately Binomial(n, p) (checked numerically in the sketch after this list)
6. With r = 1, the Negative Binomial reduces to the Geometric distribution
7. The sum of r independent Geometric(p) random variables is NB(r, p); more generally, the sum of independent NB(ri, p) random variables is NB(Σ ri, p)
8. If X is a binomial random variable with parameters (n, p), n is large, and p is small, then with λ = np, X is approximately Poisson(λ) (see the sketch after this list)

9. For X: Normal
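A minimal numerical sketch of items 5 and 8 using scipy.stats (the specific parameter values are illustrative assumptions, not from the original notes):

```python
import numpy as np
from scipy import stats

# --- Item 5: hypergeometric moments and the binomial limit ---
N, m, n = 50, 20, 10           # N balls total, m red, draw n without replacement
p = m / N
hg = stats.hypergeom(N, m, n)  # scipy argument order: (total, #special, #drawn)
print(hg.mean(), n * p)                                # E(X) = np
print(hg.var(), n * p * (1 - p) * (N - n) / (N - 1))   # Var(X) = np(1-p)(N-n)/(N-1)

# --- Item 8: Binomial(n, p) ~ Poisson(np) when n is large and p is small ---
n_big, p_small = 1000, 0.003
lam = n_big * p_small
k = np.arange(0, 15)
binom_pmf = stats.binom.pmf(k, n_big, p_small)
pois_pmf = stats.poisson.pmf(k, lam)
print(np.max(np.abs(binom_pmf - pois_pmf)))  # small: the two pmfs nearly coincide
```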

Important Notes
1. For independent X, Y: M_{X+Y}(t) = M_X(t) M_Y(t)
2. Cov(X,Y) = Cov(Y,X); Cov(X,X) = Var(X); Cov(kX, Y) = k Cov(X,Y); Cov(X,Y) = (1/2)(Var(X+Y) - Var(X) - Var(Y)) (both notes are checked numerically in the sketch below)
3.
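A quick Monte Carlo check of notes 1 and 2 (a minimal sketch; the distributions and the evaluation point t are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(1.0, 2.0, n)      # X ~ Normal(1, 4)
y = rng.exponential(1.5, n)      # Y ~ Exp(rate 2/3), independent of X
t = 0.2                          # point at which both MGFs exist

# Note 1: M_{X+Y}(t) = M_X(t) M_Y(t) for independent X, Y
print(np.mean(np.exp(t * (x + y))),
      np.mean(np.exp(t * x)) * np.mean(np.exp(t * y)))

# Note 2: Cov(X, Y) = (1/2)(Var(X+Y) - Var(X) - Var(Y))
cov_direct = np.cov(x, y)[0, 1]
cov_identity = 0.5 * (np.var(x + y) - np.var(x) - np.var(y))
print(cov_direct, cov_identity)  # both near 0, since X and Y are independent
```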
Other Properties for sample mean & sample variance
1. Var(SM) = Var(X)/n (the sample mean SM has the same mean as X, but its variance shrinks by a factor of n)
2. E(S^2) = Var(X) (the sample variance S^2 is an unbiased estimator of Var(X))
3. Cov(Xi - SM, SM) = 0 (each deviation Xi - SM is uncorrelated with the sample mean; properties 1-3 are illustrated in the sketch below)
4. If the Xi are independent Normal random variables, then (Xi - SM) and SM are independent, and hence SM and S^2 are independent; for jointly distributed normal random variables, uncorrelated implies independent

Table of moment-generating functions

with F_X being the CDF,
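A minimal numerical check of two standard moment-generating functions (the chosen distributions are illustrative assumptions and may differ from the entries of the original table):

```python
import numpy as np

rng = np.random.default_rng(3)
n, t = 2_000_000, 0.3

# Poisson(lam): M(t) = exp(lam * (e^t - 1))
lam = 2.0
x = rng.poisson(lam, n)
print(np.mean(np.exp(t * x)), np.exp(lam * (np.exp(t) - 1)))

# Exponential with rate: M(t) = rate / (rate - t), valid for t < rate
rate = 1.0
y = rng.exponential(1 / rate, n)   # numpy parameterizes by scale = 1/rate
print(np.mean(np.exp(t * y)), rate / (rate - t))
```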

10. The Gamma distribution is often used to model the time elapsed until the occurrence of the n-th event in a Poisson process: if Tn is the arrival time of the n-th event in a Poisson process with rate λ, then Tn ~ Gamma(n, λ); in particular, the waiting time until the first event is Exp(λ) (simulated in the sketch after this list)

11. The sum of independent Gamma random variables (of the same rate λ) is Gamma, with the shape parameters adding


12. If X, Y are independent Poisson with parameters λ1, λ2, then (X | X+Y = n) is Binomial(n, λ1/(λ1+λ2)); conversely, if X+Y is Poisson(λ) and (X | X+Y = n) is Binomial(n, p), then X and Y are independent Poisson(λp) and Poisson(λ(1-p)) (see the sketch after this list)
13. Relation between the Gamma and Beta distributions: if X ~ Gamma(a, λ) and Y ~ Gamma(b, λ) are independent, then X/(X+Y) ~ Beta(a, b), independent of X+Y
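A simulation sketch of items 10-12 (the rates, event counts, and sample sizes are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
reps, lam, n_events = 500_000, 2.0, 5

# Items 10-11: the sum of n Exp(lam) inter-arrival times is Gamma(n, lam)
t_n = rng.exponential(1 / lam, size=(reps, n_events)).sum(axis=1)
print(t_n.mean(), n_events / lam)        # Gamma(n, lam) mean = n/lam
print(t_n.var(), n_events / lam**2)      # Gamma(n, lam) variance = n/lam^2

# Item 12: for independent X ~ Poisson(l1), Y ~ Poisson(l2),
# (X | X + Y = n) is Binomial(n, l1/(l1+l2))
l1, l2, n_cond = 3.0, 5.0, 6
x = rng.poisson(l1, reps)
y = rng.poisson(l2, reps)
x_given = x[x + y == n_cond]             # keep only draws with X + Y = n
k = np.arange(n_cond + 1)
empirical = np.bincount(x_given, minlength=n_cond + 1) / len(x_given)
print(np.round(empirical, 3))
print(np.round(stats.binom.pmf(k, n_cond, l1 / (l1 + l2)), 3))
```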

1. A standard deck has 52 cards in 4 suits (spade, club, heart, diamond), with 13 cards of each suit
1. ,

2. ,
, , 0 (o.w.)
3. Bivariate Normal joint density function: both marginals are Normal; when ρ = 0, X and Y are independent; and the conditional distribution, i.e., f_{X|Y}(x|y), is also Normal

4. Coupon-collecting example: there are n types, each draw is equally likely to be any of them, and X is the number of draws needed until (n-1) distinct types have appeared. Decompose X = X1 + X2 + ... + X_{n-1}, where Xi is the number of extra draws needed to see a new type once (i-1) distinct types have already appeared. Then Xi ~ Geometric((n+1-i)/n), E(Xi) = n/(n+1-i), and E(X) = 1 + n/(n-1) + ... + n/2 (simulated in the sketch at the end of these notes).

Other Properties of Random Variables
1. Sum of independent Poissons is still Poisson (the parameters add)
2. Sum of independent normals is still normal (the means and variances add)
3. Sum of the squares of n independent N(0,1) variables is Chi-squared with n degrees of freedom, i.e., Gamma(n/2, 1/2)!
5. For X and Y to be independent, the domain (support) must factor into a product region!

6. Var(X) = E(X^2)-(E(X))^2 !
7. Negative Hypergeometric Random Variable: an urn contains (n + m) balls, n special and m ordinary, drawn one at a time without replacement; the random variable Y is the number of balls that need to be withdrawn until a total of r special balls have been removed

8. Joint MGF: for random variables X, Y (note that when conditioning on Y = y, (X | Y = y) is itself a random variable):
E(e^{sX+tY} | Y = y) = e^{ty} E(e^{sX} | Y = y)
The joint MGF is M_{X,Y}(s,t) = E(e^{sX+tY}), and the marginal MGFs are recovered as
M_{X,Y}(0,t) = E(e^{tY}), M_{X,Y}(s,0) = E(e^{sX})
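A simulation sketch of the coupon-style decomposition in example 4 above, under the reading that X counts draws until (n-1) distinct types have appeared (the value of n and the helper function are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 8, 20_000

def draws_until_k_distinct(n_types, k, rng):
    """Number of uniform draws needed until k distinct types have appeared."""
    seen, count = set(), 0
    while len(seen) < k:
        seen.add(int(rng.integers(n_types)))
        count += 1
    return count

sim = np.mean([draws_until_k_distinct(n, n - 1, rng) for _ in range(reps)])
formula = sum(n / (n + 1 - i) for i in range(1, n))   # 1 + n/(n-1) + ... + n/2
print(sim, formula)   # the Monte Carlo estimate matches the formula for E(X)
```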
