Exam P Formula Sheet PDF
Deductible of d: $Y = \max(0, X - d) = \begin{cases} 0 & \text{if } X < d \\ X - d & \text{if } X \ge d \end{cases}$
Quadratic Formula: $x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$
Geometric Series: $\sum_{i=1}^{n} a r^{i-1} = \frac{a(1 - r^n)}{1 - r}$, $r \ne 1$
Variance: $s^2 = \frac{\sum_{i=1}^{n}(x_i - \bar{x})^2}{n-1} = \frac{n\left(\sum_{i=1}^{n} x_i^2\right) - \left(\sum_{i=1}^{n} x_i\right)^2}{n(n-1)}$
Standard Deviation: $s = \sqrt{s^2}$
Permutations (r objects from n): ${}_nP_r = \frac{n!}{(n-r)!}$
Combinations (r objects from n): ${}_nC_r = \frac{n!}{r!\,(n-r)!} = \frac{{}_nP_r}{r!}$
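These counting formulas map directly onto Python's standard library; a minimal sketch (the example values 5 and 2 are arbitrary):

```python
import math

# nPr: number of ordered selections of r objects from n
def n_perm_r(n, r):
    return math.factorial(n) // math.factorial(n - r)

# nCr: number of unordered selections; equals nPr divided by r!
def n_comb_r(n, r):
    return n_perm_r(n, r) // math.factorial(r)

print(n_perm_r(5, 2))   # 20 ordered pairs from 5 objects
print(n_comb_r(5, 2))   # 10 unordered pairs
```

Python 3.8+ also ships `math.perm` and `math.comb`, which compute the same quantities directly.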
Conditional Probability: $P(A \cap B) = P(A)\,P(B \mid A) \;\Rightarrow\; P(B \mid A) = \frac{P(A \cap B)}{P(A)}$
$P(\bar{B} \mid A) = 1 - P(B \mid A)$
Bayes' Rule: $P(A_i \mid B) = \frac{P(A_i)\,P(B \mid A_i)}{\sum_{j=1}^{N} P(A_j)\,P(B \mid A_j)}$
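Bayes' rule over a partition is a one-line normalization; a minimal sketch with hypothetical priors and likelihoods:

```python
# Partition A_1, A_2, A_3 (all numbers below are illustrative)
priors = [0.5, 0.3, 0.2]        # P(A_i)
likelihoods = [0.1, 0.4, 0.8]   # P(B | A_i)

# Law of total probability: P(B) = sum of P(A_i) * P(B | A_i)
p_b = sum(p * l for p, l in zip(priors, likelihoods))

# Posterior P(A_i | B) for each A_i; these sum to 1
posteriors = [p * l / p_b for p, l in zip(priors, likelihoods)]
```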
Discrete Uniform Distribution: $f(x) = \frac{1}{b - a + 1}$ where $x = a, a+1, a+2, \dots, b-1, b$ and $a \le b$
$\mu_X = E(X) = \frac{a + b}{2}$
$\sigma_X^2 = V(X) = \frac{(b - a + 1)^2 - 1}{12}$
$M_X(t) = \frac{e^{at} - e^{(b+1)t}}{(b - a + 1)(1 - e^t)}$
Hypergeometric Distribution – counts the number of successes when selecting n objects without replacement from N dichotomous objects ($N_1$ of one type, $N_2$ of the other)
$P(X = x) = \frac{\binom{N_1}{x}\binom{N_2}{n-x}}{\binom{N}{n}}$
where $x = 0, 1, 2, \dots, n$, $x \le N_1$, $n - x \le N_2$, and $N = N_1 + N_2$
$\binom{n}{x} = \frac{n!}{x!\,(n-x)!}$
$\mu_X = E(X) = n\left(\frac{N_1}{N}\right)$
$\sigma_X^2 = V(X) = n\left(\frac{N_1}{N}\right)\left(\frac{N_2}{N}\right)\left(\frac{N-n}{N-1}\right)$
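The pmf and mean can be checked with a minimal Python sketch built on `math.comb` (the values $N_1 = 4$, $N_2 = 6$, $n = 3$ are toy numbers):

```python
import math

# Hypergeometric pmf: x successes in n draws without replacement
# from N1 objects of one type and N2 of the other
def hypergeom_pmf(x, N1, N2, n):
    return math.comb(N1, x) * math.comb(N2, n - x) / math.comb(N1 + N2, n)

N1, N2, n = 4, 6, 3
support = range(max(0, n - N2), min(n, N1) + 1)
total = sum(hypergeom_pmf(x, N1, N2, n) for x in support)
mean = sum(x * hypergeom_pmf(x, N1, N2, n) for x in support)
print(mean)   # matches n * N1 / N = 3 * 4/10 = 1.2
```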
Binomial Distribution – counts the number of successes in n independent trials, each with success probability p
$P(X = x) = \frac{n!}{x!\,(n-x)!}\, p^x q^{n-x}$ where $x = 0, 1, 2, \dots, n$ and $q = 1 - p$, $0 < p < 1$
$\mu_X = E(X) = np$
$\sigma_X^2 = V(X) = npq$
$M_X(t) = (q + pe^t)^n$
Sum of independent binomial random variables (same p)
Xi ~ Binom(ni,p), i = 1, 2, …, m
Let X = X1 + X2 + …+ Xm. Then
X ~ Binom (n=n1+n2+…+nm, p)
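This closure property can be verified numerically: convolving the pmfs of two binomials with the same p reproduces the pmf of a single binomial on the combined trial count. A minimal sketch with assumed parameters $n_1 = 2$, $n_2 = 3$, $p = 0.3$:

```python
import math

def binom_pmf(x, n, p):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

n1, n2, p = 2, 3, 0.3

# pmf of X1 + X2 by convolution of the two individual pmfs
conv = [sum(binom_pmf(k, n1, p) * binom_pmf(x - k, n2, p)
            for k in range(max(0, x - n2), min(n1, x) + 1))
        for x in range(n1 + n2 + 1)]

# pmf of Binom(n1 + n2, p) directly; agrees term by term
direct = [binom_pmf(x, n1 + n2, p) for x in range(n1 + n2 + 1)]
```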
Multinomial Distribution: $P(X_1 = x_1, X_2 = x_2, \dots, X_m = x_m) = \frac{n!}{x_1!\,x_2!\cdots x_m!}\, p_1^{x_1} p_2^{x_2} \cdots p_m^{x_m}$
where $x_1 + x_2 + \cdots + x_m = n$ and $p_1 + p_2 + \cdots + p_m = 1$
Geometric Distribution – counts the number of independent trials needed to obtain the first success
$P(X = x) = q^{x-1} p$ where $x = 1, 2, 3, \dots$ and $q = 1 - p$, $0 < p < 1$
$P(X > x) = q^x \;\Rightarrow\; F_X(x) = P(X \le x) = 1 - q^x$
$\mu_X = E(X) = \frac{1}{p}$
$\sigma_X^2 = V(X) = \frac{q}{p^2}$
$M_X(t) = \frac{pe^t}{1 - qe^t}$
Memoryless property: $P(X > a + b \mid X > a) = P(X > b)$, $P(X \le a + b \mid X > a) = P(X \le b)$
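The survival function, mean, and memoryless property can all be checked numerically; a minimal sketch assuming $p = 0.3$:

```python
p = 0.3
q = 1 - p

# Survival function of the trials-version geometric: P(X > x) = q**x
def surv(x):
    return q ** x

a, b = 4, 3
lhs = surv(a + b) / surv(a)   # P(X > a + b | X > a)
rhs = surv(b)                 # P(X > b): equal by memorylessness

# Mean via a truncated series: sum of x * q**(x-1) * p, close to 1/p
mean = sum(x * q**(x - 1) * p for x in range(1, 500))
```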
Negative Binomial Distribution – counts the number of trials to obtain the rth success
$f(x) = \binom{x-1}{r-1} q^{x-r} p^r$ where $x = r, r+1, r+2, \dots$ and $0 < p < 1$
$\binom{x-1}{r-1} = \frac{(x-1)!}{(r-1)!\,(x-r)!}$
$\mu_X = E(X) = \frac{r}{p}$
$\sigma_X^2 = V(X) = \frac{rq}{p^2}$
$M_X(t) = \left(\frac{pe^t}{1 - qe^t}\right)^r$
Sum of independent geometric trials random variables (same p)
Xi ~ GeoT(p), i = 1, 2, …, m
Let X = X1 + X2 + …+ Xm. Then
X ~ NegBinomT (r=m, p)
Poisson Distribution – counts the number of events in a fixed interval of time (average =λ)
$P(X = x) = \frac{\lambda^x e^{-\lambda}}{x!}$ where $x = 0, 1, 2, \dots$ and $\lambda > 0$
$\mu_X = E(X) = \lambda$
$\sigma_X^2 = V(X) = \lambda$
$M_X(t) = \exp[\lambda(e^t - 1)]$
Sum of Poisson random variables (independent)
Xi ~ Pois(𝜆𝑖), i = 1, 2, …, m
Let X = X1 + X2 + …+ Xm. Then
X ~ Pois (𝜆 = 𝜆1 + 𝜆2 + ⋯ + 𝜆𝑚)
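The additivity of independent Poissons can be spot-checked at a single point: convolving the two pmfs at $x = 3$ reproduces the $\text{Pois}(\lambda_1 + \lambda_2)$ pmf there. A minimal sketch with assumed rates $\lambda_1 = 1.5$, $\lambda_2 = 2.5$:

```python
import math

def pois_pmf(x, lam):
    return lam**x * math.exp(-lam) / math.factorial(x)

lam1, lam2 = 1.5, 2.5

# P(X1 + X2 = 3) by convolving the two pmfs ...
conv = sum(pois_pmf(k, lam1) * pois_pmf(3 - k, lam2) for k in range(4))

# ... equals the Pois(lam1 + lam2) pmf evaluated at 3
direct = pois_pmf(3, lam1 + lam2)
```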
Negative Binomial Failures Distribution – counts the number of failures before obtaining the r-th success
$P(Y = y) = \binom{y+r-1}{r-1} q^y p^r$ where $y = 0, 1, 2, \dots$ and $0 < p < 1$, $q = 1 - p$
$\binom{y+r-1}{r-1} = \frac{(y+r-1)!}{y!\,(r-1)!}$
$\mu_Y = E(Y) = \frac{rq}{p}$
$\sigma_Y^2 = V(Y) = \frac{rq}{p^2}$
$M_Y(t) = \left(\frac{p}{1 - qe^t}\right)^r$
Sum of independent negative binomial failures random variables (same p)
Xi ~ NegBinomF(ri,p), i = 1, 2, …, m
Let X = X1 + X2 + …+ Xm. Then
X ~ NegBinomF (r=r1+r2+…+rm, p)
Continuous Probability: $\int_{-\infty}^{\infty} f(x)\,dx = 1$, $f(x) \ge 0$
$P(a \le X \le b) = \int_a^b f(x)\,dx$
$P(a \le X \le b) = P(X \le b) - P(X \le a) = P(X \ge a) - P(X \ge b)$
$F_X(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt \;\Rightarrow\; F_X(-\infty) = 0,\ F_X(\infty) = 1$
$f(x) = \frac{d}{dx} F_X(x)$
Expected Value: $E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx$
First Moment: $\mu = E(X) = \int_{-\infty}^{\infty} x\, f(x)\,dx$
Second Moment: $E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\,dx$
Constant c: $E[c\,g(X)] = c\,E[g(X)]$
Expectation of Sum: $E\left[\sum_{i=1}^{m} g_i(X)\right] = \sum_{i=1}^{m} E[g_i(X)]$
Moment Generating Function: $M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$
k-th Moment: $E(X^k) = \left.\frac{d^k M_X(t)}{dt^k}\right|_{t=0}$
Continuous Uniform Distribution: $f(x) = \frac{1}{b-a}$ where $a \le x \le b$
$F_X(x) = P(X \le x) = \frac{x-a}{b-a}$
$\mu_X = E(X) = \frac{a+b}{2}$
$\sigma_X^2 = V(X) = \frac{(b-a)^2}{12}$
$M_X(t) = \frac{e^{tb} - e^{ta}}{t(b-a)}$
Normal Distribution
$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2\right]$
$-\infty < x < \infty$, $-\infty < \mu < \infty$, $0 < \sigma < \infty$
$E(X) = \mu$, $V(X) = \sigma^2$
$Z = \frac{X - \mu_X}{\sigma_X}$
$M_X(t) = \exp\left(\mu t + \frac{1}{2}\sigma^2 t^2\right)$
Sum of independent normal random variables
Xi ~ N(µi, σi²), i = 1, 2, …, m
Let X = X1 + X2 + …+ Xm. Then
X ~ N(µ = µ1 + µ2 + … + µm, σ² = σ1² + σ2² + … + σm²)
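Standardization says $P(X \le x) = P(Z \le z)$ with $z = (x - \mu)/\sigma$; Python's `statistics.NormalDist` makes this easy to verify. A minimal sketch with hypothetical parameters $\mu = 100$, $\sigma = 15$:

```python
from statistics import NormalDist

X = NormalDist(mu=100, sigma=15)   # hypothetical normal variable
Z = NormalDist()                   # standard normal: mu=0, sigma=1

x = 130
z = (x - X.mean) / X.stdev         # standardize: z = 2.0

# CDF of X at x equals CDF of Z at the standardized value
p_direct = X.cdf(x)
p_std = Z.cdf(z)
```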
Continuous Transformations
Method of Distributions
1. Let $Y = g(X)$ and solve for $X$: $X = g^{-1}(Y)$.
2. Calculate the cdf of Y: $F_Y(y) = P(Y \le y) = P(g(X) \le y) = P(X \le g^{-1}(y)) = F_X(g^{-1}(y))$ (for increasing g; reverse the inequality when g is decreasing).
3. Differentiate the result of step 2 to obtain $f_Y(y)$.
4. $f_Y(y)$ and $F_Y(y)$ are now known.
Method of Transformations
1. Let $Y = g(X)$ and solve for $X$: $X = g^{-1}(Y)$.
2. Differentiate the result of step 1 to find $\frac{dx}{dy} = \frac{d}{dy} g^{-1}(y)$.
3. $f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right| = f_X(g^{-1}(y))\left|\frac{dx}{dy}\right|$
4. $f_Y(y)$ and $F_Y(y)$ are now known.
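The two methods agree: differentiating the cdf from the method of distributions recovers the density from the method of transformations. A minimal sketch for the assumed example $Y = e^X$ with $X \sim \text{Uniform}(0, 1)$:

```python
import math

# Y = exp(X), so X = ln(Y) and |dX/dY| = 1/y on the support (1, e)
# Method of transformations: f_Y(y) = f_X(ln y) * (1/y) = 1/y
def f_Y(y):
    return 1.0 / y

# Method of distributions: F_Y(y) = P(X <= ln y) = ln y
def F_Y(y):
    return math.log(y)

# Central-difference derivative of F_Y recovers f_Y
y, h = 1.7, 1e-6
numeric = (F_Y(y + h) - F_Y(y - h)) / (2 * h)
```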
Order Statistics
Minimum of N independent, identically distributed variables
$F_{X_{(1)}}(x) = 1 - [1 - F_X(x)]^N$
$f_{X_{(1)}}(x) = N[1 - F_X(x)]^{N-1} f(x)$
Maximum of N independent, identically distributed variables
$F_{X_{(N)}}(x) = [F_X(x)]^N$
$f_{X_{(N)}}(x) = N[F_X(x)]^{N-1} f(x)$
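The maximum's cdf can be spot-checked by simulation: for N iid Uniform(0, 1) draws, $F(x) = x$, so $F_{X_{(N)}}(x) = x^N$. A minimal sketch with assumed values $N = 5$ and $x = 0.6$:

```python
import random

random.seed(0)  # reproducible run

N, trials, x = 5, 200_000, 0.6

# fraction of trials where the sample maximum falls at or below x
hits = sum(max(random.random() for _ in range(N)) <= x
           for _ in range(trials))
estimate = hits / trials   # should be close to 0.6**5 = 0.07776
```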
Multivariate Distributions
Discrete: $\sum_x \sum_y P(X = x, Y = y) = 1$, $P(X = x, Y = y) \ge 0$
Continuous: $\iint f(x, y)\,dy\,dx = 1$, $f(x, y) \ge 0$
Expectation Properties
$E[g(X, Y)] = \sum_x \sum_y g(x, y) f(x, y)$
Sample Proportion: If $X \sim \text{Binom}(n, p)$ and $\hat{p} = \frac{X}{n}$, then
$\mu_{\hat{p}} = E(\hat{p}) = p$
$\sigma_{\hat{p}}^2 = V(\hat{p}) = \frac{pq}{n}$
If $X \sim \text{Binom}(n, p)$, $\hat{p} = \frac{X}{n}$, and n is sufficiently large, then
$\hat{p} \approx N\left(\mu_{\hat{p}} = p,\ \sigma_{\hat{p}} = \sqrt{\frac{pq}{n}}\right)$
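The normal approximation for the sample proportion is a two-liner with `statistics.NormalDist`; a minimal sketch with hypothetical values $n = 400$ and $p = 0.3$:

```python
import math
from statistics import NormalDist

n, p = 400, 0.3        # hypothetical sample size and true proportion
q = 1 - p

mu = p                          # E(p_hat)
sigma = math.sqrt(p * q / n)    # SD(p_hat)

# Normal approximation to P(p_hat <= 0.33)
approx = NormalDist(mu, sigma).cdf(0.33)
```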