Statistics Formulas Cheatsheet


http://aprendeconalf.es

Descriptive Statistics

Frequencies
Sample size n: number of individuals in the sample.
Absolute frequency n_i: number of occurrences of x_i in the sample.
Relative frequency f_i = n_i / n
Cumulative absolute frequency N_i = ∑_{k=1}^{i} n_k
Cumulative relative frequency F_i = N_i / n

Central tendency statistics
Mean x̄ = ∑ x_i / n
Median me: the value with cumulative relative frequency F_me = 0.5.
Mode mo: the most frequent value.

Position statistics
Quartiles Q1, Q2, Q3 divide the distribution into 4 equal parts. Their cumulative relative frequencies are F_Q1 = 0.25, F_Q2 = 0.5 and F_Q3 = 0.75.
Percentiles P1, P2, ..., P99 divide the distribution into 100 equal parts. The cumulative relative frequency is F_Pi = i/100.
Interpolation: if i/100 falls between the cumulative relative frequencies F_{i−1} and F_i of two consecutive values (or class limits) l_{i−1} and l_i, then
P_i = l_{i−1} + (i/100 − F_{i−1}) / (F_i − F_{i−1}) · (l_i − l_{i−1})

Dispersion statistics
Interquartile range IQR = Q3 − Q1
Variance s² = ∑(x_i − x̄)² / n = ∑ x_i² / n − x̄²
Standard deviation s = +√s²
Coefficient of variation cv = s / |x̄|

Shape statistics
Coefficient of skewness g_1 = ∑(x_i − x̄)³ / (n s³)
Coefficient of kurtosis g_2 = ∑(x_i − x̄)⁴ / (n s⁴) − 3

Linear transformations
Linear transformation y = a + bx: ȳ = a + b x̄, s_y = |b| s_x
Standardization z = (x − x̄) / s_x

Regression and correlation

Linear regression
Covariance s_xy = ∑ x_i y_i / n − x̄ ȳ
Regression lines:
y on x: y = ȳ + (s_xy / s_x²) (x − x̄)
x on y: x = x̄ + (s_xy / s_y²) (y − ȳ)
Regression coefficients: (y on x) b_yx = s_xy / s_x², (x on y) b_xy = s_xy / s_y²
Coefficient of determination r² = s_xy² / (s_x² s_y²), 0 ≤ r² ≤ 1
Correlation coefficient r = s_xy / (s_x s_y), −1 ≤ r ≤ 1

Non-linear regression
Exponential model y = e^(a + bx): apply the logarithm to the dependent variable and compute the line log y = a + bx.
Logarithmic model y = a + b log x: apply the logarithm to the independent variable and compute the line y = a + b log x.
Potential model y = a x^b: apply the logarithm to both variables and compute the line log y = a + b log x.
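As a quick companion to the descriptive formulas above, here is a small Python sketch that computes them on a made-up sample, using the population versions (dividing by n) exactly as in the definitions; the data values are purely illustrative.

from collections import Counter
from math import sqrt

x = [2, 3, 3, 5, 5, 5, 6, 8, 9, 9]                 # made-up sample
n = len(x)                                         # sample size

# Frequencies
n_i = Counter(x)                                   # absolute frequencies
f_i = {v: c / n for v, c in n_i.items()}           # relative frequencies
F_i, cum = {}, 0
for v in sorted(n_i):                              # cumulative relative frequencies
    cum += n_i[v]
    F_i[v] = cum / n

# Central tendency and dispersion (dividing by n, as in the formulas above)
mean = sum(x) / n
var = sum((xi - mean) ** 2 for xi in x) / n        # s^2
s = sqrt(var)                                      # standard deviation
cv = s / abs(mean)                                 # coefficient of variation

# Shape statistics
g1 = sum((xi - mean) ** 3 for xi in x) / (n * s ** 3)
g2 = sum((xi - mean) ** 4 for xi in x) / (n * s ** 4) - 3

print(n_i, f_i, F_i)
print(mean, var, s, cv, g1, g2)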

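The percentile interpolation formula can be applied to a table of values (or class limits) and their cumulative relative frequencies. A minimal sketch, assuming the values are sorted and F is non-decreasing and ends at 1; the numbers are invented, and this is only one of several common percentile conventions.

def percentile(values, F, i):
    """P_i by linear interpolation between the two consecutive values whose
    cumulative relative frequencies bracket i/100 (formula above)."""
    p = i / 100
    if p <= F[0]:
        return values[0]
    for k in range(1, len(values)):
        if F[k] >= p:
            return values[k - 1] + (p - F[k - 1]) / (F[k] - F[k - 1]) * (values[k] - values[k - 1])
    return values[-1]

# Invented example: values (or class limits) and their cumulative relative frequencies
values = [10, 20, 30, 40]
F = [0.20, 0.45, 0.80, 1.00]
print(percentile(values, F, 25), percentile(values, F, 50), percentile(values, F, 75))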
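A similar sketch for the regression and correlation formulas (covariance, the line of y on x, r and r²) and for the log transform behind the exponential model; the paired data are made up.

from math import sqrt, log

# Made-up paired sample, for illustration only
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

mx, my = sum(x) / n, sum(y) / n
s_xy = sum(xi * yi for xi, yi in zip(x, y)) / n - mx * my   # covariance
s_x2 = sum(xi ** 2 for xi in x) / n - mx ** 2               # variance of x
s_y2 = sum(yi ** 2 for yi in y) / n - my ** 2               # variance of y

b_yx = s_xy / s_x2                  # regression coefficient of y on x
a = my - b_yx * mx                  # intercept of the line of y on x
r = s_xy / sqrt(s_x2 * s_y2)        # correlation coefficient
r2 = r ** 2                         # coefficient of determination
print(f"y = {a:.3f} + {b_yx:.3f} x,  r = {r:.3f},  r2 = {r2:.3f}")

# Exponential model y = e^(a + bx): regress log(y) on x
ly = [log(yi) for yi in y]
mly = sum(ly) / n
s_xly = sum(xi * li for xi, li in zip(x, ly)) / n - mx * mly
b_exp = s_xly / s_x2
a_exp = mly - b_exp * mx
print(f"log y = {a_exp:.3f} + {b_exp:.3f} x")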

Probability

Event operations
(Venn diagrams in the original: union A ∪ B, intersection A ∩ B, complement Ā and difference A − B.)

Basic probability
Union P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Intersection P(A ∩ B) = P(A) P(B|A)
Difference P(A − B) = P(A) − P(A ∩ B)
Contrary P(Ā) = 1 − P(A)

Conditional probability
Conditional probability P(A|B) = P(A ∩ B) / P(B)
Independent events: P(A|B) = P(A).
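A tiny numeric sketch of the basic and conditional probability formulas above, with invented probabilities chosen so that A and B turn out independent:

# Invented probabilities, for illustration only
P_A, P_B, P_A_and_B = 0.30, 0.50, 0.15

P_A_or_B = P_A + P_B - P_A_and_B     # P(A ∪ B)
P_A_minus_B = P_A - P_A_and_B        # P(A − B)
P_not_A = 1 - P_A                    # P(Ā)
P_A_given_B = P_A_and_B / P_B        # P(A|B)

print(P_A_or_B, P_A_minus_B, P_not_A, P_A_given_B)
print("A and B independent:", abs(P_A_given_B - P_A) < 1e-12)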
Total probability theorem
For a partition A_1, ..., A_n of the sample space:
P(B) = ∑_{i=1}^{n} P(A_i) P(B|A_i)

Bayes theorem
P(A_i|B) = P(A_i) P(B|A_i) / ∑_{k=1}^{n} P(A_k) P(B|A_k)
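A minimal sketch of the total probability and Bayes theorems for a three-event partition; the priors and conditional probabilities are invented:

# Hypothetical partition A_1, A_2, A_3 with invented priors and P(B|A_i)
priors = [0.5, 0.3, 0.2]            # P(A_i)
likelihoods = [0.10, 0.20, 0.40]    # P(B|A_i)

# Total probability theorem: P(B) = sum of P(A_i) P(B|A_i)
p_B = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes theorem: P(A_i|B) = P(A_i) P(B|A_i) / P(B)
posteriors = [p * l / p_B for p, l in zip(priors, likelihoods)]

print(p_B)          # 0.19
print(posteriors)   # the posteriors sum to 1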

Risks

            E    Ē
Treatment   a    b
Control     c    d

Prevalence: proportion of individuals with E, P(E).
Incidence rate or absolute risk R(E) = a / (a + b)
Odds O(E) = a / b
Relative risk RR(E) = [a / (a + b)] / [c / (c + d)]
Odds ratio OR(E) = (a / b) / (c / d) = (a · d) / (b · c)
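To make the risk measures concrete, here is a short sketch with an invented 2×2 table of counts:

# Invented 2x2 table: rows treatment/control, columns with E / without E
a, b = 15, 85      # treatment group
c, d = 30, 70      # control group

risk_treatment = a / (a + b)            # incidence rate / absolute risk
risk_control = c / (c + d)
odds_treatment = a / b                  # odds of E in the treatment group
RR = risk_treatment / risk_control      # relative risk
OR = (a / b) / (c / d)                  # odds ratio, equal to (a*d)/(b*c)

print(risk_treatment, odds_treatment, RR, OR)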

Diagnostic tests

         Disease D   No disease D̄
Test +   TP          FP
Test −   FN          TN

(TP, FP, FN and TN are the counts of true positives, false positives, false negatives and true negatives.)

Sensitivity P(+|D) = TP / (TP + FN)
Specificity P(−|D̄) = TN / (FP + TN)
Positive Predictive Value (PPV) P(D|+) = TP / (TP + FP)
Negative Predictive Value (NPV) P(D̄|−) = TN / (FN + TN)
Positive Likelihood Ratio (LR+) = P(+|D) / P(+|D̄)
Negative Likelihood Ratio (LR−) = P(−|D) / P(−|D̄)

Algebra of events
Idempotency: A ∪ A = A, A ∩ A = A
Commutative: A ∪ B = B ∪ A, A ∩ B = B ∩ A
Associative: (A ∪ B) ∪ C = A ∪ (B ∪ C), (A ∩ B) ∩ C = A ∩ (B ∩ C)
Distributive: (A ∪ B) ∩ C = (A ∩ C) ∪ (B ∩ C), (A ∩ B) ∪ C = (A ∪ C) ∩ (B ∪ C)
Neutral element: A ∪ ∅ = A, A ∩ Ω = A
Absorbing element: A ∪ Ω = Ω, A ∩ ∅ = ∅
Complementary symmetric element: A ∪ Ā = Ω, A ∩ Ā = ∅
Double contrary: the contrary of Ā is A
De Morgan's laws: the contrary of A ∪ B is Ā ∩ B̄, and the contrary of A ∩ B is Ā ∪ B̄
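And, to illustrate the diagnostic-test formulas above, a short sketch with an invented confusion table (TP, FP, FN, TN):

# Invented confusion table for a diagnostic test
TP, FP = 90, 40        # test positive: with disease / without disease
FN, TN = 10, 860       # test negative: with disease / without disease

sensitivity = TP / (TP + FN)               # P(+|D)
specificity = TN / (FP + TN)               # P(-|no D)
ppv = TP / (TP + FP)                       # P(D|+)
npv = TN / (FN + TN)                       # P(no D|-)
lr_pos = sensitivity / (1 - specificity)   # LR+ = P(+|D) / P(+|no D)
lr_neg = (1 - sensitivity) / specificity   # LR- = P(-|D) / P(-|no D)

print(sensitivity, specificity, ppv, npv, lr_pos, lr_neg)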
