
Probability & Probability Distribution
Lecture Outcomes
• Discuss the fundamentals of probability
• Examine the amount of uncertainty that is involved before making important decisions
• Analyse the various probability rules that help to measure uncertainty
Probability: Meaning and Importance

• Probability is the likelihood or chance that a particular event will or will not occur;
• The theory of probability provides a quantitative measure of the uncertainty of occurrence of different events resulting from a random experiment, ranging from 0 to 1;
Some Important Concepts
• Experiment: a process which produces outcomes; for example, tossing a coin is an experiment with two possible outcomes: Head (H) or Tail (T).
• Event: a possible outcome of an experiment; for example, if the experiment is to toss a fair coin, obtaining a head (or a tail) is an event.
Contd..
• Independent & Dependent Events: two events are said to be independent if the occurrence or non-occurrence of one is not affected by the occurrence or non-occurrence of the other, and vice versa.
• Mutually Exclusive Events: two or more events are said to be mutually exclusive if the occurrence of one implies that the other cannot occur; e.g., if tossing a fair coin results in Heads, Tails cannot occur at the same time.
Contd..
• Exhaustive events: the total number of possible outcomes of a trial or experiment are called exhaustive events.
• E.g.:- In rolling a die, the set of six possible outcomes, i.e., 1, 2, 3, 4, 5 and 6, are exhaustive events.
• Equally likely events: events are equally likely if the chance of happening of each event is the same.
• E.g.:- If a fair coin is tossed, the events H and T are equally likely events.
Contd..
• Complementary events: Let there be two events A and B. A is called the complementary event of B, and B the complementary event of A, if A and B are mutually exclusive and exhaustive. E.g.:- In tossing a coin, occurrence of head (H) and tail (T) are complementary events.
Contd..
• Simple and Compound Events: In case of simple events, we consider the probability of happening or not happening of a single event. E.g.:- If a die is rolled once and A is the event that face number 5 turns up, then A is a simple event.
In case of compound events, we consider the joint occurrence of two or more events. E.g.:- If two coins are tossed simultaneously and we are finding the probability of getting two heads, then we are dealing with compound events.
Classical/ Prior Approach
• This approach happens to be the earliest;
• This school of thought assumes that all the possible outcomes of an experiment are mutually exclusive & equally likely;
• If there are 'a' possible outcomes favorable to the occurrence of Event E, & 'b' possible outcomes unfavorable to the occurrence of Event E, & all these possible outcomes are equally likely & mutually exclusive, then the probability that the event E will occur, denoted by P(E), is
P(E) = a / (a + b) = Number of outcomes favorable to occurrence of E / Total number of outcomes
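The classical formula above can be sketched in a few lines of Python; the die example is an illustration, not taken from the slides.

```python
from fractions import Fraction

def classical_probability(favorable: int, total: int) -> Fraction:
    """Classical probability: favorable outcomes / total outcomes,
    assuming all outcomes are equally likely and mutually exclusive."""
    return Fraction(favorable, total)

# Probability of rolling a number greater than 4 on a fair six-sided die:
# favorable outcomes are {5, 6} out of 6 equally likely faces.
p = classical_probability(2, 6)
print(p)  # 1/3
```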
Contd..
• This approach has two characteristics:
a. It refers to fair coins and fair decks of cards; if the coin is unbalanced or the die is loaded, this approach offers nothing but confusion;
b. In order to determine probabilities, no coins had to be tossed and no cards shuffled, i.e. no experimental data were required to be collected;
Relative Frequency/Empirical Approach
• This method uses the relative frequencies of past occurrences as the basis of computing present probability; hence it is based on experiments conducted in the past;
• If an Event 'E' has occurred 'r' times in a series of 'n' independent trials, all under uniform conditions, then the ratio r/n gives the probability of Event 'E', provided 'n' is sufficiently large:
P(E) = r/n = favorable trials / total number of trials
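The empirical approach can be illustrated by simulating coin tosses and computing the relative frequency r/n; the seed and trial count below are arbitrary choices for the sketch.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Empirical approach: estimate P(Heads) as r/n over many simulated tosses.
n = 100_000
r = sum(1 for _ in range(n) if random.random() < 0.5)  # count of "heads"
print(r / n)  # approaches 0.5 as n grows large
```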
Subjective/Intuitive/Judgmental Approach

• This approach is based on the intuition of an individual;
• This is not a scientific approach;
• It is based on the accumulation of knowledge, understanding and experience of an individual;
Laws of Probability
• For any event, probability lies between 0 & 1;
• It can be represented as a percentage, ratio or fraction;
• Each event has a complementary event, i.e. P(E1) + P(E1') = 1
Types of Probability
• Marginal Probability;
• Union Probability;
• Joint Probability;
• Conditional Probability.
Marginal Probability
• It is the first type of probability;
• A marginal or unconditional probability is the simple probability of the occurrence of an event;
• Denoted by P(E) where 'E' is some event;
P(E) = Number of outcomes favorable to occurrence of E / Total number of outcomes
Union Probability
• Second type of probability;
• If E1 & E2 are two Events, then Union probability is denoted by P(E1 ∪ E2);
• It is the probability that Event E1 will occur, or that Event E2 will occur, or that both Event E1 & Event E2 will occur;
• For example, union probability is the probability that a person owns either a Maruti 800 or a Maruti Zen. To qualify as part of the union, a person has to own at least one of these cars.
Joint Probability
• It is the probability of the occurrence of Event E1 and Event E2;
• If E1 & E2 are two Events, then Joint probability is denoted by P(E1 ∩ E2);
• For example, it is the probability that a person owns both a Maruti 800 & a Maruti Zen; for joint probability, owning a single car is not sufficient;
Conditional Probability
• It is the fourth type of probability;
• Conditional Probability of two Events E1 & E2 is generally denoted by P(E1|E2);
• It is the probability of the occurrence of E1 given that E2 has already occurred;
• For example, conditional probability is the probability that a person owns a Maruti 800 given that he already owns a Maruti Zen;
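Conditional probability can be computed directly from its definition, P(E1|E2) = P(E1 ∩ E2) / P(E2). The two-dice example below is a hypothetical illustration, not from the slides.

```python
from itertools import product

# Sample space: all 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

e2 = [o for o in outcomes if o[0] + o[1] >= 9]   # E2: the sum is at least 9
e1_and_e2 = [o for o in e2 if o[0] == 6]          # E1 ∩ E2: first die shows 6 AND sum >= 9

p_e2 = len(e2) / len(outcomes)
p_joint = len(e1_and_e2) / len(outcomes)
p_conditional = p_joint / p_e2  # P(first die is 6 | sum >= 9)
print(p_conditional)  # 0.4 (4 of the 10 qualifying outcomes start with a 6)
```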
Addition Rule
• Used to estimate union probability;
• If there are two Events E1 & E2, then the general rule of addition is given by:
P(E1 or E2) = P(E1) + P(E2) − P(E1 & E2);
P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2);
• Special rule of addition for mutually exclusive events:
P(E1 or E2) = P(E1) + P(E2);
P(E1 ∪ E2) = P(E1) + P(E2);
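The general addition rule can be sketched numerically; the playing-card example below is an assumed illustration, not from the slides.

```python
# General addition rule: P(E1 or E2) = P(E1) + P(E2) - P(E1 and E2).
# Drawing one card from a standard 52-card deck:
p_heart = 13 / 52          # E1: the card is a heart
p_king = 4 / 52            # E2: the card is a king
p_heart_and_king = 1 / 52  # E1 and E2: the king of hearts

p_union = p_heart + p_king - p_heart_and_king
print(p_union)  # 16/52, about 0.3077

# Special rule for mutually exclusive events (no overlap term):
p_heart_or_spade = 13 / 52 + 13 / 52  # a card cannot be both suits
print(p_heart_or_spade)  # 0.5
```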
Multiplication Rule
• Used to estimate joint probability and also conditional probability;
• If there are two Events E1 & E2, then the general rule of multiplication is given by:
P(E1 & E2) = P(E1) · P(E2|E1);
P(E1 ∩ E2) = P(E1) · P(E2|E1);
• Special rule of multiplication for independent events:
P(E1 & E2) = P(E1) · P(E2);
P(E1 ∩ E2) = P(E1) · P(E2);
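Both forms of the multiplication rule can be checked with small numeric examples; the card and coin figures below are assumed illustrations.

```python
# General multiplication rule: P(E1 and E2) = P(E1) * P(E2|E1).
# Drawing two cards without replacement from a 52-card deck:
p_first_ace = 4 / 52               # P(E1): first card is an ace
p_second_ace_given_first = 3 / 51  # P(E2|E1): one ace already removed
p_both_aces = p_first_ace * p_second_ace_given_first
print(p_both_aces)  # 1/221, about 0.0045

# Special rule for independent events: P(E1 and E2) = P(E1) * P(E2).
p_two_heads = 0.5 * 0.5  # two independent fair coin tosses
print(p_two_heads)  # 0.25
```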
Probability Distribution
• The list of all possible outcomes of a random variable, along with their probabilities of occurrence, is called a probability distribution (PD).
• E.g.:- sales volume
• E.g.:- number of heads in two coin tosses: the outcomes {HH, HT, TH, TT} give the values {2, 1, 1, 0}
• Probability Distribution Function
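The two-coin example above can be turned into an explicit probability distribution by counting outcomes; this is a minimal sketch of that tabulation.

```python
from collections import Counter
from itertools import product

# X = number of heads in two tosses; sample space {HH, HT, TH, TT} -> {2, 1, 1, 0}.
outcomes = ["".join(t) for t in product("HT", repeat=2)]
counts = Counter(o.count("H") for o in outcomes)
distribution = {x: counts[x] / len(outcomes) for x in sorted(counts)}
print(distribution)  # {0: 0.25, 1: 0.5, 2: 0.25}
```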
Discrete & Continuous random distributions
• A random variable is a variable which contains the outcome of a chance experiment;
• For example, in an experiment to measure the number of customers who arrive in a shop during a time interval of 2 minutes, the possible outcomes may vary from 0 to n customers; these outcomes (0, 1, 2, 3, 4, …, n) are the values of the random variable.
• Such random variables are called discrete random variables.
• E.g.:- a customer can buy 1, 2, 3, … shirts, pants, etc.
Contd..
• In other words, a random variable which assumes either a finite number of values or a countably infinite number of possible values is termed a Discrete Random Variable.
• On the other hand, a random variable that can assume any numerical value in an interval, i.e. can take a value at every point in a given interval, is called a continuous random variable.
• E.g.:- floor area of a house or office
• E.g.:- price and cost of a product
Binomial Probability Distribution
• Most commonly used & widely known distribution among all discrete distributions.
• Named after Jacob Bernoulli (1654-1705)
1. Only two mutually exclusive outcomes are possible (one is referred to as success & the other as failure);
2. The probability of success (p) or failure (q) is constant over a number of trials;
3. The number of events is discrete & can be represented by integers (0, 1, 2, 3, 4, onwards).
• Examples (only 2 outcomes are possible):
• Defective or good
• Boy or girl
• Zero or one
• Head or tail, etc.
How to calculate?
• P(X = r) = nCr · p^r · q^(n−r)
where
n = total number of trials
r = number of successes in n trials
p = probability of success
q = probability of failure (q = 1 − p)
nCr = n! / (r!(n−r)!)
• Mean (µ) = np
• Standard deviation (σ) = √(npq)
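The binomial formula above translates directly into code; the fair-coin example is an assumed illustration.

```python
from math import comb, sqrt

def binomial_pmf(r: int, n: int, p: float) -> float:
    """P(X = r) = nCr * p^r * q^(n-r), with q = 1 - p."""
    q = 1 - p
    return comb(n, r) * p**r * q**(n - r)

# Probability of exactly 3 heads in 5 tosses of a fair coin:
print(binomial_pmf(3, 5, 0.5))  # 0.3125

# Mean and standard deviation of the same distribution:
n, p = 5, 0.5
print(n * p)                  # mean = np = 2.5
print(sqrt(n * p * (1 - p)))  # sd = sqrt(npq), about 1.118
```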
Poisson Distribution
• It is named after the famous French mathematician Siméon Denis Poisson;
• It is also a discrete distribution; but there are a few
differences between Binomial & Poisson
distributions. For a given number of trials the
binomial distribution describes a distribution of two
possible outcomes: either success or failure whereas
Poisson focuses on the number of discrete
occurrences over an interval.
• It is widely used in the field of managerial decision
making; widely used in queuing models
Poisson Process Conditions
• Events occur in a continuum of time, and at a randomly selected point an event either occurs or does not occur;
• Whether the event occurs or not at a point is independent of previous points where the event may or may not have occurred;
• The probability of occurrence of events remains constant over the whole period, i.e. throughout the continuum;
Formula to calculate
• P(x|λ) = (λ^x · e^(−λ)) / x!
λ (Greek letter lambda) = mean/average number of occurrences per interval
e (constant) ≈ 2.71828
x is the random variable (designated number of occurrences)
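The Poisson formula can be sketched as follows; the arrival rate of 4 customers per minute is an assumed figure for illustration.

```python
from math import exp, factorial

def poisson_pmf(x: int, lam: float) -> float:
    """P(x | lambda) = lambda^x * e^(-lambda) / x!"""
    return lam**x * exp(-lam) / factorial(x)

# If customers arrive at an average rate of 4 per minute,
# the probability that exactly 2 arrive in a given minute:
print(poisson_pmf(2, 4))  # about 0.1465
```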
Normal Distribution
• It is the most commonly used distribution among all probability distributions;
• It has a wide range of practical applications, for example where the random variables are human characteristics such as height, weight, speed or IQ scores;
• The normal distribution was developed in the 18th century;
Characteristics of Normal Distribution
• The curve of the normal distribution is symmetrical/mesokurtic;
• The mean, median & mode are identical;
• The two tails of the normal curve are asymptotic to the x-axis;
• The curve is unimodal and bell shaped;
• The total area under the normal distribution is 100% & it is distributed as follows:
µ ± 1σ ≈ 68.27%
µ ± 2σ ≈ 95.45%
µ ± 3σ ≈ 99.73%

Z = (x − µ) / σ
Normal Distribution
P(Y ≤ µ) = 0.50;  P(µ − σ ≤ Y ≤ µ + σ) ≈ 0.68;  P(µ − 2σ ≤ Y ≤ µ + 2σ) ≈ 0.95
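The Z formula and the area percentages can be verified with Python's standard library; the IQ figures (mean 100, standard deviation 15) are an assumed example.

```python
from statistics import NormalDist

def z_score(x: float, mu: float, sigma: float) -> float:
    """Z = (x - mu) / sigma: standard deviations that x lies from the mean."""
    return (x - mu) / sigma

# An IQ score of 130 with mean 100 and standard deviation 15:
print(z_score(130, 100, 15))  # 2.0

# Checking the area percentages with the standard normal distribution:
std = NormalDist()  # mean 0, standard deviation 1
print(round(std.cdf(1) - std.cdf(-1), 4))  # 0.6827  (µ ± 1σ)
print(round(std.cdf(2) - std.cdf(-2), 4))  # 0.9545  (µ ± 2σ)
print(round(std.cdf(3) - std.cdf(-3), 4))  # 0.9973  (µ ± 3σ)
```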
