
PSQT NOTES

Q. WHAT IS PROBABILITY?
Probability is a measure of the uncertainty of various phenomena. The role of
probability theory is to provide a framework for analyzing phenomena with
uncertain outcomes.

Q. WHAT IS A RANDOM EXPERIMENT?

In life, we perform many experimental activities where the result may not be
the same when the experiment is repeated under identical conditions. We are not
sure which one of the many possible results will actually be obtained. Such
experiments are called random experiments.

THREE DIFFERENT APPROACHES TO PROBABILITY:

1. The classical theory of probability. Example: tossing a coin; outcomes:
head or tail.

2. The statistical approach to probability: the probability is assigned on the
basis of observations and collected data. The above two approaches assume that
all outcomes are equally likely.

3. The axiomatic approach to probability: we may have reason to believe
that one outcome is more likely to occur than another. In this approach,
some axioms are stated to interpret the probability of events.

OUTCOME: A possible result of a random experiment is called its outcome.

An experiment is called a random experiment if it satisfies the following two
conditions:
(i) It has more than one possible outcome.
(ii) It is not possible to predict the outcome in advance.
SAMPLE SPACE:
The sample space S is the list (set) of possible outcomes.
The list must be
● Mutually exclusive, and
● Collectively exhaustive.
Types of outcomes:
● Discrete
● Continuous

EVENT: Any subset E of a sample space S is called an event.

1. Mutually exclusive events:
Two events A and B are called mutually exclusive events if the occurrence of
any one of them excludes the occurrence of the other event, i.e., if they cannot
occur simultaneously. In this case the sets A and B are disjoint.

2. Exhaustive events: If E1, E2, ..., En are n events of a sample space S
and if E1 ∪ E2 ∪ ... ∪ En = ⋃ Ei = S (i = 1 to n), then E1, E2, ..., En
are called exhaustive events.

ADDITION THEOREM: If A and B are any two events, then

P(A∪B) = P(A) + P(B) − P(A∩B)
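As a small illustration of the addition theorem (the die and the events A and B below are chosen for the example and are not from the notes), the following Python sketch checks the identity by counting outcomes for one roll of a fair die.

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                  # sample space for one roll of a fair die
A = {2, 4, 6}                           # event: an even number turns up
B = {4, 5, 6}                           # event: a number greater than 3 turns up

def P(E):
    """Classical probability: favourable outcomes / total outcomes."""
    return Fraction(len(E), len(S))

lhs = P(A | B)                          # P(A ∪ B)
rhs = P(A) + P(B) - P(A & B)            # P(A) + P(B) − P(A ∩ B)
print(lhs, rhs, lhs == rhs)             # 2/3 2/3 True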

INDEPENDENT EVENTS: Two events A and B are said to be independent if
P(A∩B) = P(A)·P(B).

MULTIPLICATIVE RULE: If in an experiment the events A and B can both occur, then
P(A∩B) = P(B|A)·P(A), provided P(A) > 0.
We can also write P(A∩B) = P(A|B)·P(B).
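Continuing the illustrative die sketch above (the events are assumptions made for the example, not taken from the notes), this checks the multiplicative rule by computing P(B|A) as a ratio, and also shows that these particular A and B are not independent.

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}                           # even number
B = {4, 5, 6}                           # number greater than 3

def P(E):
    return Fraction(len(E), len(S))

P_B_given_A = P(A & B) / P(A)           # conditional probability P(B|A)
print(P(A & B) == P_B_given_A * P(A))   # True: the multiplicative rule holds
print(P(A & B) == P(A) * P(B))          # False: A and B are not independent here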

TOTAL PROBABILITY:
If the events B1, B2, ..., Bk constitute a partition of the sample space S such
that P(Bi) ≠ 0 for i = 1, 2, ..., k, then for any event A of S
P(A) = Σ P(Bi ∩ A) = Σ P(Bi)·P(A|Bi), where the sums run over i = 1 to k.

BAYES' RULE:
If the events B1, B2, ..., Bk constitute a partition of the sample space S such
that P(Bi) ≠ 0 for i = 1, 2, ..., k, then for any event A in S such that
P(A) ≠ 0,
P(Br|A) = P(Br ∩ A) / Σ P(Bi ∩ A) = P(Br)·P(A|Br) / Σ P(Bi)·P(A|Bi),
for r = 1, 2, ..., k, where the sums run over i = 1 to k.
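The following minimal sketch applies total probability and Bayes' rule to made-up numbers (the two boxes and all probabilities below are illustrative assumptions, not from the notes): a box B1 or B2 is chosen at random and a ball is drawn; A is the event that the ball is red.

# Partition: P(B1), P(B2); conditionals: P(A|B1), P(A|B2) -- illustrative values
P_B = [0.5, 0.5]
P_A_given_B = [0.3, 0.8]

# Total probability: P(A) = sum of P(Bi) * P(A|Bi)
P_A = sum(p * q for p, q in zip(P_B, P_A_given_B))
print(round(P_A, 4))                      # 0.55

# Bayes' rule: P(B1|A) = P(B1) * P(A|B1) / P(A)
P_B1_given_A = P_B[0] * P_A_given_B[0] / P_A
print(round(P_B1_given_A, 4))             # 0.2727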

Random: In an experiment of chance, outcomes occur randomly. We often
summarize the outcome of a random experiment by a simple number.
Variable: a symbol such as X or Y that assumes values for different
elements. If the variable can assume only one value, it is called a constant.
Random variable: a function that assigns a real number to each outcome
in the sample space of a random experiment.
Discrete Random Variables: A random variable is discrete if its set of
possible values consists of discrete points on the number line.
Continuous Random Variables: A random variable is continuous if its set of
possible values consists of an entire interval on the number line.
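To make the "function on the sample space" idea concrete, here is a small illustrative sketch (not from the notes): the random variable X counts the number of heads in two tosses of a fair coin, and its distribution is obtained by counting equally likely outcomes.

from itertools import product
from collections import Counter
from fractions import Fraction

sample_space = list(product("HT", repeat=2))     # ('H','H'), ('H','T'), ('T','H'), ('T','T')

def X(outcome):
    """The random variable: number of heads in the outcome."""
    return outcome.count("H")

counts = Counter(X(w) for w in sample_space)     # how often each value of X occurs
pmf = {x: Fraction(c, len(sample_space)) for x, c in counts.items()}
for x in sorted(pmf):
    print(x, pmf[x])                             # 0 1/4, 1 1/2, 2 1/4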

PROBABILITY DISTRIBUTION FUNCTION:

Definition: In dealing with continuous variables, f(x) is usually called the
probability density function, or simply the density function, of X. The
function f(x) is a probability density function for the continuous random
variable X, defined over the set of real numbers R, if
1. f(x) ≥ 0 for all x ∈ R
2. ∫_{−∞}^{∞} f(x) dx = 1 (the total area under the curve is 1)
3. P(a ≤ X ≤ b) = ∫_{a}^{b} f(x) dx.
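The sketch below checks conditions 2 and 3 numerically for an illustrative density, f(x) = 2x on [0, 1] and zero elsewhere (the density and the interval [0.2, 0.5] are assumptions chosen for the example, not from the notes).

from scipy.integrate import quad

def f(x):
    """Illustrative density: f(x) = 2x on [0, 1], zero elsewhere; f(x) >= 0 everywhere."""
    return 2 * x if 0 <= x <= 1 else 0.0

total, _ = quad(f, 0, 1)        # the density is zero outside [0, 1]
print(total)                    # ~1.0, so the total area under the curve is 1

p, _ = quad(f, 0.2, 0.5)        # P(0.2 <= X <= 0.5)
print(p)                        # ~0.21, i.e. 0.5**2 - 0.2**2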

CUMULATIVE DISTRIBUTION FUNCTION:

• The cumulative distribution function of a discrete random variable X,
denoted as F(x), is
F(x) = P(X ≤ x) = Σ_{t ≤ x} f(t), for −∞ < x < ∞
• For a discrete random variable X, F(x) satisfies the following:
1) 0 ≤ F(x) ≤ 1
2) If x ≤ y, then F(x) ≤ F(y)
• CUMULATIVE DISTRIBUTION FUNCTION (continuous case): The cumulative
distribution function F(x) of a continuous random variable X with density
function f(x) is
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt, for −∞ < x < ∞
PROPERTIES OF RANDOM VARIABLES:

1. E(c) = c
2. E(aX + b) = aE(X) + b
3. V(aX + b) = a²V(X)
4. If X and Y are independent random variables, then
V(aX + bY) = a²V(X) + b²V(Y)
5. σ² = E(X²) − µ²
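As a quick numerical sanity check of properties 2 and 3 (a simulation sketch, not a proof; the exponential distribution and the constants a, b are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=2.0, size=1_000_000)   # an arbitrary illustrative X
a, b = 3.0, 5.0
Y = a * X + b

print(Y.mean(), a * X.mean() + b)   # E(aX + b) vs aE(X) + b   (approximately equal)
print(Y.var(),  a**2 * X.var())     # V(aX + b) vs a^2 V(X)    (approximately equal)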

DIFFERENCE BETWEEN DISCRETE AND CONTINUOUS RANDOM VARIABLES:

Discrete Random Variables:
● Number of scratches on a surface
● Number of defective parts among 1000 tested
● Number of transmitted bits received in error

Continuous Random Variables:
● Electrical current
● Length
● Time
● Temperature
● Weight

Probability Mass Function: A probability mass function is a
mathematical function that provides the probabilities of the possible
values of a discrete random variable; it is typically denoted as f(x).
Sometimes it is also known as the discrete density function.
A Bernoulli random variable/Bernoulli trial is a random
variable that can take only two possible values, usually 0 and 1.
This random variable models random experiments that have two
possible outcomes, sometimes referred to as "success" and
"failure".

BINOMIAL DISTRIBUTION:
A binomial random variable is a random variable that represents the number of
successes in n successive independent trials of a Bernoulli experiment.

Definition: A random variable X is said to be a binomial random variable
with parameters n and p, written X ∼ Binomial(n, p), if its PMF is given by
P(X = k) = C(n, k) · p^k · (1 − p)^(n − k), for k = 0, 1, 2, ..., n,
where C(n, k) = n! / (k! (n − k)!) and 0 < p < 1.

Binomial Distribution
Mean µ = np
Variance σ² = npq
Standard deviation σ = √(npq)
where q = 1 − p.
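The sketch below evaluates this PMF for illustrative parameters (n = 10, p = 0.4 are assumptions for the example) and checks that the mean and variance come out as np and npq:

from math import comb

n, p = 10, 0.4
q = 1 - p

def pmf(k):
    """Binomial PMF: C(n, k) * p^k * q^(n - k)."""
    return comb(n, k) * p**k * q**(n - k)

mean = sum(k * pmf(k) for k in range(n + 1))
var  = sum(k**2 * pmf(k) for k in range(n + 1)) - mean**2
print(round(mean, 6), round(n * p, 6))        # 4.0 4.0
print(round(var, 6),  round(n * p * q, 6))    # 2.4 2.4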
POISSON DISTRIBUTION:
There are some experiments which involve counting the number of outcomes
occurring during a given time interval (or in a region of space). Such a
process is called a Poisson process.
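As a small companion to this definition, the sketch below evaluates the standard Poisson PMF, P(X = k) = e^(−λ)·λ^k / k!, for an illustrative rate (the value λ = 2.0 is an assumption, not from the notes).

from math import exp, factorial

lam = 2.0                                    # illustrative average rate (lambda)

def poisson_pmf(k, lam):
    """Standard Poisson PMF: e^(-lambda) * lambda^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

print([round(poisson_pmf(k, lam), 4) for k in range(5)])
# [0.1353, 0.2707, 0.2707, 0.1804, 0.0902]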
