Exam 1 Content

1.3 Permutations

Permutations of objects, of which some are alike
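For n objects of which n1 are alike, n2 are alike, ..., nr are alike, the number of distinct permutations is n! / (n1! n2! ··· nr!)

● e.g. PEPPER has 6! / (3! 2! 1!) = 60 distinct arrangements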

1.4 Combinations

Combinations of objects
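Choosing r of n objects when order is irrelevant: C(n, r) = n! / (r! (n - r)!)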

Often we can solve problems by treating outcomes either as ordered (permutations) or unordered (combinations)

● e.g. Select 5 unrelated individuals (no two married to each other) from a group of 10 couples; both viewpoints are worked below
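Both viewpoints on that example give the same count: unordered, choose 5 of the 10 couples and then one member of each, C(10, 5) · 2^5 = 8064; ordered, pick the 5 people one at a time and divide out the orderings, (20 · 18 · 16 · 14 · 12) / 5! = 8064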

Binomial theorem

● Not sure exactly where to apply this
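The theorem itself: (x + y)^n = Σ (k = 0 to n) C(n, k) x^k y^(n - k)

● One place it does apply: checking that the binomial PMF sums to 1, since Σ C(n, k) p^k (1 - p)^(n - k) = (p + 1 - p)^n = 1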

Mutually exclusive events cannot occur at the same time
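Equivalently: EF = ∅, so P(E ∪ F) = P(E) + P(F)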


2.2 Sample Spaces, Events

Using curly brackets to represent sample space/events


● e.g. S = {(i, j): i, j = 1, 2, 3, 4, 5, 6}
● e.g. E = {x: 0 <= x <= 5}
Unions and intersections, complements
● Commutative, associative, and distributive laws
● DeMorgan's laws
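○ For reference: (E ∪ F)^c = E^c ∩ F^c and (E ∩ F)^c = E^c ∪ F^c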

2.4 Propositions

Inclusion-exclusion identity
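For two events: P(E ∪ F) = P(E) + P(F) - P(EF); for three, add the single probabilities, subtract the pairwise intersections, and add back P(EFG)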

3.2 Conditional Probabilities

Conditional probability
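Defined as P(E|F) = P(EF) / P(F), provided P(F) > 0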

Multiplication rule
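In product form: P(E1 E2 ··· En) = P(E1) P(E2 | E1) P(E3 | E1 E2) ··· P(En | E1 ··· En-1), where each factor conditions on everything before it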

3.3 Bayes' Formula

Bayes' formula
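Explicitly: P(F|E) = P(E|F) P(F) / P(E), where the denominator expands as P(E) = P(E|F) P(F) + P(E|F^c) P(F^c)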
3.4 Independent Events

Independent events

● E is independent of F if knowledge that F has occurred does not change the probability
that E occurs
● Can extend to multiple events
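In symbols, E and F are independent iff P(EF) = P(E) P(F); for three events, independence also requires P(EFG) = P(E) P(F) P(G) along with all the pairwise conditions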

Exam 2 Content

4.2 Discrete RV's

Probability mass function (PMF)


● Outputs are the probabilities of the RV's possible values
● Sum over all possible values is 1
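That is, p(a) = P{X = a}, and Σ p(xi) = 1 summing over all possible values xi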

Cumulative distribution function (CDF)


● For a discrete RV, the CDF is a step function (it jumps at each value by that value's PMF)
● CDFs exist for both discrete and continuous RVs
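Defined by F(a) = P{X <= a}; nondecreasing, going from 0 to 1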

4.3 Expected Value

For discrete random variables, E[X] = Σ x p(x), summing over all x with p(x) > 0


4.4 Expectation of a Function of an RV
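Key identity: E[g(X)] = Σ g(xi) p(xi), summing over the values xi of X (no need to first find the PMF of g(X))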

4.5 Variance

Var(X) = E[(X - μ)²], where μ = E[X]

● Alternatively, Var(X) = E[X²] - (E[X])²
● Also, Var(aX + b) = a²Var(X)

4.6 Bernoulli and Binomial RV's

(See formula sheet below for the probability mass functions)

Bernoulli random variable: A discrete RV with either 0 or 1 as the outcome


Binomial random variable: the number of successes in n independent Bernoulli trials (essentially a sum of Bernoulli RVs)
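For quick reference anyway: Bernoulli has p(0) = 1 - p, p(1) = p; Binomial has P{X = k} = C(n, k) p^k (1 - p)^(n - k) for k = 0, 1, ..., n, with E[X] = np and Var(X) = np(1 - p)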

4.7 Poisson RV

Poisson RV: An approximation of a binomial RV when n is large and p is small enough so that
np is of moderate size
● Sum of two independent Poisson RVs is another Poisson RV (the parameter is the sum of the two parameters)
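PMF: P{X = i} = e^(-λ) λ^i / i! for i = 0, 1, 2, ..., with λ = np in the approximation; E[X] = Var(X) = λ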

4.8.1 Geometric RV

Geometric RV: An RV where independent trials are performed until a success occurs
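PMF: P{X = n} = (1 - p)^(n - 1) p for n = 1, 2, ..., where p is the per-trial success probability; E[X] = 1/p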

5.1 Intro
Probability density function (PDF)

● Similar to the discrete case, the cumulative distribution function (CDF) is obtained by integrating from
negative infinity (differentiate the CDF to obtain the PDF)
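The defining property: P{X ∈ B} = ∫ f(x) dx over B, so F(a) = ∫ f(x) dx from -∞ to a, and F'(a) = f(a)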

5.2 Expectation, Variance of Continuous RV's

For a continuous RV, E[X] = ∫ x f(x) dx, integrating over all x

● E[aX + b] = aE[X] + b

For a function of a continuous RV, E[g(X)] = ∫ g(x) f(x) dx

5.3 Uniform RV

Uniform RV: An RV uniformly distributed over a certain interval


● CDF is 0 before a, a straight line going from (a, 0) to (b, 1), and 1 afterwards
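On (a, b): f(x) = 1/(b - a) for a < x < b (0 otherwise); E[X] = (a + b)/2, Var(X) = (b - a)²/12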
5.4 Normal RV

Normal RV: An RV with the normal distribution


● If Y = aX + b, Y is also normally distributed with parameters aμ + b and a²σ²
● Standardize distribution by subtracting the mean and dividing by the SD (not the variance)
● Sum of two independent normal RVs is another normal RV (the means and variances add)
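Density with parameters μ and σ²: f(x) = (1 / (σ√(2π))) e^(-(x - μ)² / (2σ²))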

5.4.1 Normal to Binomial

A binomial RV with parameters n and p will have approximately the same distribution as a
normal RV with the same mean and variance. Therefore, we can just “standardize” a binomial
RV and use the standard normal distribution as an approximation.

● Special case of the Central Limit Theorem


● Good if both np and n(1 - p) are > 5, i.e. when n is large and the binomial's variance np(1 - p) is not too small
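e.g. For X ~ Binomial(n, p): P{X <= k} ≈ Φ((k + 0.5 - np) / √(np(1 - p))), where Φ is the standard normal CDF and the 0.5 is the continuity correction for approximating a discrete RV by a continuous one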

5.5 Exponential RV

Exponential RV: An RV with an exponential distribution (duh.); commonly models the waiting time until an event
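PDF: f(x) = λe^(-λx) for x >= 0 (0 otherwise); F(x) = 1 - e^(-λx), E[X] = 1/λ, Var(X) = 1/λ²

● Memoryless: P{X > s + t | X > t} = P{X > s}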


5.7 Distribution of Function of RV

Find the distribution of Y = g(X) by expressing the event that g(X) <= y in terms of X being in
some set (X <= something, to use X’s CDF). After obtaining Y’s CDF, differentiate for its PDF.
● Examples starting on page 357
● Don't need a closed form for X's CDF; just differentiate (the chain rule brings out fX)
● Write down the ranges of X and Y first
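A standard instance: for Y = X² and y > 0, FY(y) = P{-√y <= X <= √y} = FX(√y) - FX(-√y), so differentiating gives fY(y) = (fX(√y) + fX(-√y)) / (2√y)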

6.1 Joint Distribution Functions

Joint cumulative probability distribution function
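Written F(a, b) = P{X <= a, Y <= b}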

Joint probability mass function (discrete)
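Given by p(x, y) = P{X = x, Y = y}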

● Can represent as a table


● Row/column sums give the marginal PMFs of the individual RVs

Joint probability density function
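Satisfies P{(X, Y) ∈ C} = ∫∫ f(x, y) dx dy over the region C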

● Put the dependent variable on the inside (e.g. put x on the inside for P{X < Y}, since x ranges up to y)
● If there is no dependency, integrate in whichever order is most convenient

Marginal distribution functions: FX(a) = P{X <= a} = F(a, ∞); for densities, fX(x) = ∫ f(x, y) dy (integrating out y)

● Vice versa for Y


Exam 3 Content

6.2 Independent RV’s

● X and Y are independent if P{X <= a, Y <= b} = P{X <= a} P{Y <= b} for all a and b
● This relation holds for the RVs' PDF/PMF and CDF (e.g. f(x, y) = fX(x) fY(y))
● Prove independence by establishing this factorization for the PDF

6.3 Sums of Independent RV’s

Formula for the sum (the last step of the convolution derivation): fX+Y(a) = ∫ fX(a - y) fY(y) dy

● Be careful with the bounds when a is involved


○ Set it up such that a - y is within fX’s domain
○ For instance, 0 <= a <= 2 and fX is nonzero only for 0 <= x <= 1 (same for fY)
○ For 0 <= a <= 1, integrate from 0 to a
○ For 1 < a <= 2, integrate from a - 1 to 1
○ Example on page 409
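○ Finishing that case (assuming, as in that example, X and Y are independent uniform (0, 1) RVs): fX+Y(a) = a for 0 <= a <= 1 and fX+Y(a) = 2 - a for 1 < a <= 2, a triangular density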
6.4 Discrete Conditional Distributions
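pX|Y(x|y) = P{X = x | Y = y} = p(x, y) / pY(y)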

6.5 Continuous Conditional Distributions
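Analogously, fX|Y(x|y) = f(x, y) / fY(y), defined wherever fY(y) > 0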

7.2 Expectation of Sums
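E[X + Y] = E[X] + E[Y] even when X and Y are dependent; more generally, E[Σ Xi] = Σ E[Xi]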

7.4 Covariance and Correlation

Cov(X, Y) = E[(X - E[X])(Y - E[Y])]

● In other words, Cov(X, Y) = E[XY] - E[X]E[Y]
● If X and Y are independent, then their covariance is zero
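Related formulas (standard, worth having on hand):

● Correlation: ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)), always between -1 and 1
● Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)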

8.2 Chebyshev and Weak LLN
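From the formula sheet (standard statements):

● Markov (X >= 0): P{X >= a} <= E[X] / a for any a > 0
● Chebyshev: P{|X - μ| >= k} <= σ²/k² for any k > 0
● Weak LLN: the average of n iid RVs converges in probability to the mean μ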

8.3 Central Limit Theorem
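For iid X1, ..., Xn with mean μ and variance σ²: (X1 + ··· + Xn - nμ) / (σ√n) has approximately the standard normal distribution for large n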


Final exam: 12/11, 10:30 - 12:30, in the regular classroom
- 8 questions, 2 hours (15 minutes per question)
- Same formula sheet as Exam 2
- Def. for expectation, variance, expectation for joint pdf, covariance, pdf for sum, etc.
- Memorize everything except the different probability distributions
- No integration by parts on the final
- Study exams, hw, quizzes, lectures

Questions
- 3 questions: 1.1 - 4.1 (Exam 1 content)
- 2 questions: 4.3 - 6.1 (Exam 2 content)
- 3 questions: 6.2 - 8.3 (Exam 3 content)

6.2: Independence, know def. and how to prove


6.3: Sums of Ind. RVs
- Formula
- How to find pmf, pdf
6.4/6.5: Conditional distributions
- Formulas
- How to find
7.2: E of sums
- Formulas
- How to find
7.4: Know Cov, how to calculate
8.2: Formula sheet: Markov, Chebyshev inequalities
- How to use
8.3: CLT
- Know CLT and how to use it
