Probability and Statistics (Xác Suất Thống Kê)


Chapter 1: Combinatorial Analysis

Permutations
A permutation of n elements of the set A is an ordered arrangement of all n given elements.
The number of permutations of n elements is:

P_n = 1 \cdot 2 \cdot 3 \cdots n = n!

Note. Here, we are only interested in the number of permutations that can be obtained from
a certain set.

Arrangement
An arrangement of k of the n elements of the set A (0 ≤ k ≤ n) is an ordered group of k
distinct elements taken from the n given elements.

A_n^k = \frac{n!}{(n-k)!}

Combination
When drawing k elements from a set of n elements (k ≤ n), two ways of drawing k
elements are called different if they differ in at least one element (i.e., the order of the
elements does not matter). The number of ways to draw k elements from n elements in
this way is called the combination of k of n, denoted by

C_n^k = \frac{A_n^k}{k!} = \frac{n!}{k!\,(n-k)!}
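
These three counting formulas can be checked directly in code. The sketch below is a minimal illustration (not part of the original notes) using Python's standard math module; the values n = 5 and k = 2 are arbitrary choices.

    import math

    n, k = 5, 2

    # Permutations of n elements: P_n = n!
    print(math.factorial(n))   # 120

    # Arrangements of k of n elements: A_n^k = n! / (n - k)!
    print(math.perm(n, k))     # 20

    # Combinations of k of n elements: C_n^k = n! / (k! (n - k)!)
    print(math.comb(n, k))     # 10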
Chapter 2: Basics and probability formulas
1. Deterministic phenomena
Phenomena that, when carried out under the same conditions, always give the same result
are called deterministic phenomena.
2. Random phenomena
Phenomena that, when performed under the same conditions, produce different results
are called random phenomena.
If an experiment is conducted under identical conditions and its outcome is not known in
advance, the experiment is called a random experiment.
Example: Tossing a coin, throwing a die
Outcome:
The result of a random experiment is called an outcome.
Example: In tossing a coin, there are only two possible outcomes: Head (H) and Tail (T)
Sample Space:
The set of all possible outcomes of a random experiment is called the sample space.
Example: The sample space of rolling a die is {1, 2, 3, 4, 5, 6}
Trial and Events:
• Any particular performance of a random experiment is called a trial.
• A combination of outcomes is termed an event.
Let's understand trial and event with the example of a die:
• Rolling a die is a trial
o Possible outcomes {1, 2, 3, 4, 5, 6}
• Possible events corresponding to rolling dice
o Odd number of points: {1, 3, 5}
o Even number of points: {2, 4, 6}
o Points between 2 and 6: {3, 4, 5}

Types of Events:
• Exhaustive Events:
Total number of distinct possible outcomes of a random experiment is known as the
exhaustive events.
Example: In tossing a coin, there are two exhaustive events: head and tail.
• Favorable Events:
Those outcomes of a trial in which a given event may happen are called favorable cases
for that event.
Example: Rolling two dice
Number of cases favorable to getting a sum of 7
(1,6), (2,5), (3,4), (4,3), (5,2), and (6,1)
• Mutually Exclusive Events:
Events that do not occur at the same time are called Mutually Exclusive Events.
Example: Tossing a coin, the event Head and Tail are mutually exclusive events.

• Equally Likely Events:


Events that have the same probability of occurrence are called Equally likely Events.
Example: All six outcomes of throwing a die are equally likely.
• Independent Events:
An event that does not depend on another event in a random experiment is known as
an independent event.
Example: When a coin is tossed twice, getting a head on the first toss and getting a head
on the second toss are independent events.
3. Probability of an event
Probability formula
1. Definitions:
Definition 1: Events A_1, A_2, ..., A_n are said to form a complete group of pairwise
mutually exclusive events if they are mutually exclusive in pairs and their sum is the
certain event.
That is: A_i A_j = ∅ for i ≠ j, and A_1 + A_2 + ⋯ + A_n = Ω.
Definition 2: Two events A and B are called independent if the occurrence or absence
of one event does not affect the occurrence or absence of the other event and vice versa.
Definition 3: Events A_1, A_2, ..., A_n are called mutually independent if each of them is
independent of the product of any combination of the other events.
There are three major types of probabilities:
• Theoretical Probability
• Experimental Probability
• Axiomatic Probability
Theoretical Probability
It is based on the possible chances of something happening and rests mainly on the
reasoning behind probability. For example, if a coin is tossed, the theoretical probability
of getting a head is 1/2.

Experimental Probability
It is based on the observations of an experiment. The experimental probability is
calculated as the number of times the event occurs divided by the total number of trials.
For example, if a coin is tossed 10 times and a head is recorded 6 times, then the
experimental probability of heads is 6/10, or 3/5.

Axiomatic Probability
In axiomatic probability, a set of rules or axioms is laid down that applies to all types of
events. These axioms were set down by Kolmogorov and are known as Kolmogorov's
three axioms. With the axiomatic approach to probability, the chances of occurrence or
non-occurrence of events can be quantified.
2. Addition formula
For any two events A and B:

P(A + B) = P(A) + P(B) − P(AB)

If A and B are two mutually exclusive events:

P(A + B) = P(A) + P(B)

If A_1, A_2, ..., A_n are pairwise mutually exclusive events:

P(A_1 + A_2 + ⋯ + A_n) = P(A_1) + P(A_2) + ⋯ + P(A_n)
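
As a quick numerical check of the addition formula, the sketch below uses one roll of a fair die with A = "even number" and B = "greater than 3"; the example is my own illustration.

    from fractions import Fraction

    omega = {1, 2, 3, 4, 5, 6}   # sample space of one die roll
    A = {2, 4, 6}                # event: even number
    B = {4, 5, 6}                # event: greater than 3

    def prob(event):
        # Classical probability: favorable cases / total cases
        return Fraction(len(event), len(omega))

    # P(A + B) = P(A) + P(B) - P(AB)
    print(prob(A | B))                      # 2/3
    print(prob(A) + prob(B) - prob(A & B))  # 2/3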

Conditional probability
Conditional probability is the probability of an event or outcome occurring given that
another event or outcome has already occurred. It is calculated as the probability of
both events occurring together divided by the probability of the conditioning event.

P(A|B) = N(A∩B)/N(B)

OR
P(B|A) = N(A∩B)/N(A)

where P(A|B) denotes the probability of occurrence of A given B having occurred.


N(A ∩ B) is the number of common elements of both A and B.
N(B) is the number of elements in B and it cannot be zero.
Let N be the total number of elements in the sample space.
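
Counting elements directly gives conditional probabilities. Below is a minimal sketch (the die events are my own example) with B = "odd number" and A = "greater than 2".

    omega = {1, 2, 3, 4, 5, 6}   # one roll of a fair die
    A = {3, 4, 5, 6}             # event: greater than 2
    B = {1, 3, 5}                # event: odd number

    # P(A|B) = N(A ∩ B) / N(B)
    print(len(A & B) / len(B))   # 0.666... = 2/3: of the odd outcomes, two exceed 2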
Bayes' theorem:
Bayes' theorem is an extension of Conditional Probability.
It consists of two conditional probabilities.
It gives the relationship between the conditional probability and its inverted form.
The events A_1, A_2, ⋯, A_n (n ≥ 2) of an experiment form a complete group if:
1. A_i ∩ A_j = ∅ for i ≠ j (pairwise mutually exclusive)

2. A_1 ∪ A_2 ∪ ⋯ ∪ A_n = Ω

For any event B with P(B) > 0:

P(A_k \mid B) = \frac{P(A_k)\,P(B \mid A_k)}{P(B)} = \frac{P(A_k)\,P(B \mid A_k)}{\sum_{i=1}^{n} P(A_i)\,P(B \mid A_i)}
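
A small numerical sketch of Bayes' formula over a complete group; the two-machine setup and all numbers below are my own illustration.

    # A part comes from machine A1 or A2 (a complete group of events).
    priors = {"A1": 0.6, "A2": 0.4}       # P(A_i), must sum to 1
    likely = {"A1": 0.02, "A2": 0.05}     # P(B | A_i), B = "part is defective"

    # Total probability: P(B) = sum over i of P(A_i) P(B | A_i)
    p_B = sum(priors[a] * likely[a] for a in priors)

    # Bayes' formula: P(A_k | B) = P(A_k) P(B | A_k) / P(B)
    posterior = {a: priors[a] * likely[a] / p_B for a in priors}
    print(p_B, posterior)   # 0.032, {'A1': 0.375, 'A2': 0.625}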
Chapter 3: Random quantities and the law of probability distribution
Concepts and classification

What is a Random Variable?


A random variable assigns a numerical value to each possible outcome of a random
experiment. It is usually denoted by X.
Example: Outcome of coin toss
Types of Random Variable:
Discrete Random Variable
X is discrete if it takes a countable set of values.
Example: the number of balls in a bag, the number of tails when tossing coins
Continuous Random Variable
X is continuous if it can take uncountably many values, e.g., any value in an interval.
Example: distance travelled, height of students

Features of random quantities


If X is a random variable whose possible values x_1, x_2, x_3, ⋯, x_n occur with the
probabilities p_1, p_2, p_3, ⋯, p_n respectively, then the mean of the random variable X
is given by the formula:

E(X) = \mu = \sum_{i=1}^{n} x_i p_i

The mean of the random variable X can also be represented by

E(X) = x_1 p_1 + x_2 p_2 + x_3 p_3 + ⋯ + x_n p_n

Thus, the mean or the expectation of the random variable X is defined as the sum of
the products of all possible values of X by their respective probability values.
Variance of a Random Variable
Assume that X is a random variable whose possible values x_1, x_2, x_3, ⋯, x_n occur
with the probabilities p(x_1), p(x_2), p(x_3), ⋯, p(x_n) respectively. Then the variance of
the random variable X is given by:

Var(X) = \sigma_x^2 = \sum_{i=1}^{n} (x_i - \mu)^2 p(x_i)

The above formula can also be expressed as:

\sigma_x^2 = E[(X - \mu)^2]

(or)

Var(X) = E(X^2) - [E(X)]^2, where E(X^2) = \sum_{i=1}^{n} x_i^2 p(x_i)

Here,

E(X) = μ = mean or expectation of the random variable

Var (X) = σx2 = Variance of the random variable.

Standard Deviation of a Random Variable


From the variance formula, we can easily derive the formula for the standard deviation
of a random variable.
The standard deviation of the random variable X is the non-negative square root of the
variance:
\sigma_x = \sqrt{Var(X)} = \sqrt{\sum_{i=1}^{n} (x_i - \mu)^2 p(x_i)}
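
The mean, variance, and standard deviation formulas can be evaluated directly from a value/probability table. The sketch below is a minimal illustration with a made-up distribution.

    # A discrete random variable given by its value/probability table (my own example).
    values = [0, 1, 2]
    probs  = [0.25, 0.5, 0.25]   # must sum to 1

    # E(X) = sum of x_i p_i
    mean = sum(x * p for x, p in zip(values, probs))

    # Var(X) = E(X^2) - [E(X)]^2
    e_x2 = sum(x**2 * p for x, p in zip(values, probs))
    variance = e_x2 - mean**2

    # Standard deviation: non-negative square root of the variance
    std = variance ** 0.5
    print(mean, variance, std)   # 1.0 0.5 0.7071...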

Common laws of probability distribution

1. Types of Probability Distributions

• Uniform Distribution
What is Uniform Distribution?
A probability distribution in which all outcomes have equal probability is known as a
uniform distribution.

f(x) = \frac{1}{b - a}

b: highest value of X
a: lowest value of X

−∞ < 𝑎 ≤ 𝑥 ≤ 𝑏 < ∞
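
A one-line check of this density; the endpoints a = 0 and b = 4 below are arbitrary choices of mine.

    def uniform_pdf(x, a=0.0, b=4.0):
        # f(x) = 1 / (b - a) on [a, b], and 0 outside
        return 1.0 / (b - a) if a <= x <= b else 0.0

    print(uniform_pdf(1.0), uniform_pdf(5.0))   # 0.25 0.0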

• Normal distribution law (Gaussian distribution)


The random variable X that takes values in (−∞, +∞) is said to obey the normal
distribution law, or Gaussian distribution law, denoted N(μ, σ²), if the probability
density function of X has the following form:

f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)
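
The density translates directly into code; a minimal sketch using only the standard library:

    import math

    def normal_pdf(x, mu=0.0, sigma=1.0):
        # f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))
        return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

    print(normal_pdf(0.0))   # 0.3989... = 1 / sqrt(2 pi)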

• Standard normal distribution law


Definition: If the random variable X obeys the normal distribution law with
E(X) = 0 and D(X) = 1, then X is said to obey the standard (canonical) normal
distribution law, denoted N(0, 1).
The density function of the standard normal distribution is denoted φ(x) and given by:

\varphi(x) = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{x^2}{2}\right)

The distribution function of the standard normal distribution is denoted Φ(x):

\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} \exp\left(-\frac{t^2}{2}\right) dt, \quad \forall x \in \mathbb{R}.
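
Φ(x) has no closed form, but it can be evaluated through the error function, which the standard library provides; a minimal sketch:

    import math

    def Phi(x):
        # Standard normal CDF via erf: Phi(x) = (1 + erf(x / sqrt(2))) / 2
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    print(Phi(0.0), Phi(1.96))   # 0.5 and about 0.975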

• The law of binomial distribution


Suppose we conduct n repeated, independent trials, and in each trial some event A of
interest either occurs or does not occur, with P(A) = p in every trial. Let X be the
"number of occurrences of the event A in the n trials". X is a discrete random variable
that can take one of the values 0, 1, 2, ⋯, n, with the corresponding probabilities
calculated by Bernoulli's formula:

p_k = P(X = k) = C_n^k \, p^k (1 - p)^{n-k}, for k = 0, 1, 2, ⋯, n.
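
Bernoulli's formula maps directly to code; a minimal sketch, with n = 10 and p = 0.5 as arbitrary example values:

    from math import comb

    def binomial_pmf(k, n, p):
        # Bernoulli's formula: P(X = k) = C(n, k) p^k (1 - p)^(n - k)
        return comb(n, k) * p**k * (1 - p)**(n - k)

    print(binomial_pmf(5, 10, 0.5))   # 0.24609375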

• Poisson distribution law


When conducting n independent trials, let X be the "number of occurrences of the
event A", where the probability p = P(A) is small (p < 0.1) and the product np = a is
constant. Then, with q = 1 − p, we have the following approximation:

p_x = P(X = x) = C_n^x \, p^x q^{n-x} \approx \frac{a^x}{x!} e^{-a}
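
Comparing the exact binomial probability with its Poisson approximation; the values n = 100, p = 0.02 (so a = np = 2), and x = 3 are my own example:

    from math import comb, exp, factorial

    n, p, x = 100, 0.02, 3
    a = n * p   # a = 2

    exact  = comb(n, x) * p**x * (1 - p)**(n - x)   # binomial probability
    approx = a**x * exp(-a) / factorial(x)          # Poisson approximation
    print(exact, approx)   # about 0.1823 vs 0.1804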
Chapter 4: Probabilistic Limit Theorem
Weak Law of Large Numbers
A result in probability theory, also known as Bernoulli's theorem. Let X_1, X_2, ⋯ be a
sequence of independent, identically distributed random variables, each with mean μ
and finite standard deviation, and let X̄_n denote their sample mean. Then for every
ε > 0:

\lim_{n \to \infty} P\{ |\bar{X}_n - \mu| > \varepsilon \} = 0
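
A quick simulation illustrating this convergence; the fair-coin setup and sample sizes are my own choices:

    import random

    def sample_mean(n):
        # Mean of n fair coin tosses (1 = head); should approach mu = 0.5.
        return sum(random.random() < 0.5 for _ in range(n)) / n

    random.seed(0)
    for n in (10, 1000, 100000):
        print(n, sample_mean(n))   # deviations from 0.5 shrink as n grows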
