Probability Distribution: Shreya Kanwar (16eemme023)


PROBABILITY DISTRIBUTION
Shreya Kanwar (16eemme023)
INTRODUCTION
 A probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment. In more technical terms, a probability distribution is a description of a random phenomenon in terms of the probabilities of events.

 A probability distribution is defined in terms of an underlying sample space, which is the set of all possible outcomes of the random phenomenon being observed. The sample space may be the set of real numbers or a higher-dimensional vector space, or it may be a list of non-numerical values; for example, the sample space of a coin flip would be {heads, tails}.
TYPES OF PROBABILITY DISTRIBUTION
 A discrete probability distribution (applicable
to the scenarios where the set of possible outcomes
is discrete, such as a coin toss or a roll of dice) can
be encoded by a discrete list of the probabilities of
the outcomes, known as a probability mass
function.

 On the other hand, a continuous probability


distribution (applicable to the scenarios where the
set of possible outcomes can take on values in a
continuous range (e.g. real numbers), such as the
temperature on a given day) is typically described
by probability density functions.
DISCRETE PROBABILITY DISTRIBUTION

 A discrete probability distribution is a probability distribution characterized by a probability mass function. Thus, the distribution of a random variable X is discrete, and X is called a discrete random variable, if
∑ P(X = u) = 1
 as u runs through the set of all possible values
of X. A discrete random variable can assume only
a finite or countably infinite number of values.
TYPES OF DISCRETE PROBABILITY
DISTRIBUTION

 Bernoulli distribution
 Binomial distribution

 Geometric distribution

 Negative binomial distribution

 Poisson distribution

 Hypergeometric distribution
1. BERNOULLI DISTRIBUTION

 The Bernoulli distribution, named after the Swiss mathematician Jacob Bernoulli, is the discrete probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability q = 1 − p; that is, the probability distribution of any single experiment that asks a yes–no question.
 Parameters :- 0 < p < 1

 Probability mass function :-
f(k) = p^k (1 − p)^(1−k) ; k = 0, 1

 Range :-
k ∈ {0, 1}
CONTINUE …

 Expected value :- The expected value of a Bernoulli random variable is
E[X] = p
 Proof
E[X] = ∑ x p(x)
= 1·p(1) + 0·p(0)
= 1·p + 0·(1 − p)
= p
 Variance :- The variance of a Bernoulli random variable is
Var[X] = p(1 − p)
CONTINUE …

 Proof
E[X²] = ∑ x² p(x)
= 1²·p(1) + 0²·p(0)
= 1·p + 0·(1 − p)
= p
(E[X])² = p²
Var[X] = E[X²] − (E[X])²
= p − p² = p(1 − p)
 Moment generating function :- The moment generating function of a Bernoulli random variable is
M_X(t) = 1 − p + p·e^t = q + p·e^t ; where q = 1 − p
CONTINUE …
 Proof
M_X(t) = E[e^(tX)]
= ∑ e^(tx) p_X(x)
= e^(t·1)·p_X(1) + e^(t·0)·p_X(0)
= e^t·p + 1·(1 − p)
= 1 − p + p·e^t
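The Bernoulli pmf, mean, and variance above can be checked numerically; a minimal sketch in Python (p = 0.3 is an arbitrary value chosen for illustration):

```python
def bernoulli_pmf(k, p):
    """P(X = k) = p^k * (1 - p)^(1 - k) for k in {0, 1}."""
    return p**k * (1 - p)**(1 - k)

p = 0.3  # arbitrary success probability for illustration
mean = sum(k * bernoulli_pmf(k, p) for k in (0, 1))     # E[X] = p
ex2 = sum(k**2 * bernoulli_pmf(k, p) for k in (0, 1))   # E[X^2] = p
var = ex2 - mean**2                                     # p(1 - p)
```

Summing over the two-point support reproduces E[X] = p = 0.3 and Var[X] = p(1 − p) = 0.21.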
BERNOULLI DISTRIBUTION GRAPH
2. BINOMIAL DISTRIBUTION
 The binomial distribution with
parameters n and p is the discrete probability
distribution of the number of successes in a
sequence of n independent experiments each
asking a yes–no question.
 A single success/failure experiment is also called a Bernoulli trial or Bernoulli experiment, and a sequence of outcomes is called a Bernoulli process. For a single trial, i.e. n = 1, the binomial distribution is a Bernoulli distribution.

 Parameters :- n ∈ {0, 1, 2, …} number of trials ; p ∈ [0, 1] success probability

 Range :- k ∈ {0, 1, …, n}
CONTINUE …
 Probability mass function :-
P(X = k) = C(n, k) p^k (1 − p)^(n−k) ; k = 0, 1, …, n
 Expected value :-
µ = E(X) = np
 Variance :-
σ² = npq
 Standard deviation :-
σ = √(npq)
 Moment generating function :-
M_X(t) = (1 − p + p·e^t)^n
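The binomial pmf and its moments can be verified directly from the formula; a minimal sketch, with n = 10 and p = 0.4 as arbitrary illustration values:

```python
import math

def binom_pmf(k, n, p):
    """P(X = k) = C(n, k) p^k (1 - p)^(n - k)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.4  # arbitrary parameters for illustration
probs = [binom_pmf(k, n, p) for k in range(n + 1)]
total = sum(probs)                                              # should be 1
mean = sum(k * pk for k, pk in enumerate(probs))                # np = 4.0
var = sum(k**2 * pk for k, pk in enumerate(probs)) - mean**2    # npq = 2.4
```

The probabilities sum to 1 over k = 0, …, n, and the computed mean and variance match np and npq.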
BINOMIAL DISTRIBUTION GRAPH
3. GEOMETRIC DISTRIBUTION

 A probability distribution to determine the probability that success will occur on the nth trial of a binomial experiment.
 Repeated binomial trials
 Continue until first success
 Find probability that first success comes on the nth trial
 Probability of success on each trial = p
 Parameters :- 0 < p < 1 success probability
 Range :- k trials where k ∈ {1, 2, 3, …}


CONTINUE …
 Probability mass function (pmf) :-
P(X = k) = (1 − p)^(k−1) p
 Mean :-
µ = 1/p
 Variance :-
σ² = (1 − p)/p²
 Moment generating function :-
M_X(t) = p·e^t / (1 − q·e^t) ; where q = 1 − p
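Because the geometric support is infinite, the moments can be checked by truncating the sum where the tail is negligible; a minimal sketch with an arbitrary p = 0.25:

```python
def geom_pmf(k, p):
    """P(first success on trial k) = (1 - p)^(k - 1) * p, k = 1, 2, ..."""
    return (1 - p)**(k - 1) * p

p = 0.25  # arbitrary success probability for illustration
# truncate the infinite sum; the tail beyond k = 2000 is negligible here
mean = sum(k * geom_pmf(k, p) for k in range(1, 2001))            # 1/p = 4
var = sum(k**2 * geom_pmf(k, p) for k in range(1, 2001)) - mean**2  # (1-p)/p^2 = 12
```

The truncated sums agree with the closed forms µ = 1/p = 4 and σ² = (1 − p)/p² = 12.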
GEOMETRIC DISTRIBUTION GRAPH
4. NEGATIVE BINOMIAL DISTRIBUTION

 The negative binomial distribution is a discrete probability distribution of the number of successes in a sequence of independent and identically distributed Bernoulli trials before a specified (non-random) number of failures (denoted r) occurs. For example, if we define rolling a 1 as failure and all non-1s as successes, and we throw a die repeatedly until a 1 appears for the third time (r = three failures), then the probability distribution of the number of non-1s that appeared will be a negative binomial distribution.
 Parameters :- r > 0 — number of failures until the experiment is stopped ; p ∈ (0,1) — success probability in each experiment (real)
CONTINUE …
 Range :- k ∈ { 0, 1, 2, 3, … } — number of
successes
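The die example above can be checked numerically using the standard pmf for this parameterization, P(X = k) = C(k + r − 1, k) p^k (1 − p)^r (the slide does not show the pmf, so this form is supplied here); a minimal sketch:

```python
import math

def nbinom_pmf(k, r, p):
    """P(k successes before the r-th failure), success probability p."""
    return math.comb(k + r - 1, k) * p**k * (1 - p)**r

r, p = 3, 5/6  # die example: failure = rolling a 1, so success probability is 5/6
# truncate the infinite sum; the tail beyond k = 3000 is negligible here
probs = [nbinom_pmf(k, r, p) for k in range(3000)]
total = sum(probs)                                 # should be 1
mean = sum(k * pk for k, pk in enumerate(probs))   # r*p/(1-p) = 15
```

With r = 3 and p = 5/6 the expected number of non-1s before the third 1 is r·p/(1 − p) = 15.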
NEGATIVE BINOMIAL DISTRIBUTION
5. POISSON DISTRIBUTION
 A Poisson distribution is used to model the number of events that happen within a product unit (number of defective rivets in an airplane wing), space or volume (blemishes per 200 square metres of fabric), or time period (machine breakdowns per month). It is assumed that the events happen randomly and independently.
 The Poisson random variable is denoted by X. An observed value of X is represented by x. The probability distribution (or mass) function of the number of events (x) is given by
p(x) = e^(−λ) λ^x / x! ; x = 0, 1, 2, …
 Where λ is the mean or average number of events that happen over the product, volume or time period specified.
CONTINUE …
 The Poisson distribution has one parameter, λ.
 The mean and variance of the Poisson distribution are equal and are given by
µ = σ² = λ
POISSON DISTRIBUTION GRAPH
6. HYPERGEOMETRIC DISTRIBUTION
 The hypergeometric distribution is useful in sampling from a finite population (or lot) without replacement (i.e. without placing the sample elements back in the population) when the items or outcomes can be categorized into one of two groups (usually called success and failure). If we consider finding a nonconforming item a success, the probability distribution of the number of nonconforming items (x) in the sample is given by
p(x) = C(D, x) · C(N − D, n − x) / C(N, n)
CONTINUE …
 Where D = number of nonconforming items in the population, N = size of the population, n = size of the sample, x = number of nonconforming items in the sample.
 The mean (or expected value) of a hypergeometric distribution is given by
µ = nD/N
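The pmf and mean can be checked with binomial coefficients; a minimal sketch using a hypothetical lot of N = 50 items with D = 5 nonconforming and a sample of n = 10:

```python
import math

def hypergeom_pmf(x, N, D, n):
    """P(x nonconforming in sample) = C(D, x) C(N - D, n - x) / C(N, n)."""
    return math.comb(D, x) * math.comb(N - D, n - x) / math.comb(N, n)

N, D, n = 50, 5, 10  # hypothetical lot for illustration
probs = [hypergeom_pmf(x, N, D, n) for x in range(min(n, D) + 1)]
total = sum(probs)                                  # should be 1
mean = sum(x * px for x, px in enumerate(probs))    # nD/N = 1.0
```

The mean nD/N = 10·5/50 = 1: on average one nonconforming item per sample of ten.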
HYPERGEOMETRIC DISTRIBUTION GRAPH
CONTINUOUS PROBABILITY
DISTRIBUTION

 A continuous probability distribution is a probability distribution that has a cumulative distribution function that is continuous. Most often they are generated by having a probability density function. Mathematicians call distributions with probability density functions absolutely continuous, since their cumulative distribution function is absolutely continuous with respect to the Lebesgue measure λ. If the distribution of X is continuous, then X is called a continuous random variable.
TYPES OF CONTINUOUS PROBABILITY DISTRIBUTION
 Uniform probability distribution
 Exponential probability distribution

 Gamma probability distribution

 Chi-Square probability distribution

 Beta probability distribution

 Normal probability distribution


1. UNIFORM PROBABILITY DISTRIBUTION
 The continuous uniform
distribution or rectangular distribution is a
family of symmetric probability distributions
such that for each member of the family,
all intervals of the same length on the
distribution's support are equally probable. The
support is defined by the two
parameters, a and b, which are its minimum and
maximum values.
 The probability density function (PDF) of the continuous uniform distribution is given by
f(x) = 1/(b − a) for a ≤ x ≤ b ; 0 otherwise
CONTINUE …
 The cumulative distribution function (CDF) can be derived from the above PDF using the relation
F(x) = ∫_(−∞)^x f(t) dt

 Therefore, the CDF of the continuous uniform distribution is given by
F(x) = 0 for x < a ; (x − a)/(b − a) for a ≤ x ≤ b ; 1 for x > b
CONTINUE …
 Mean :-
µ = (a + b)/2
 Variance :-
σ² = (b − a)²/12
 Parameters :-
−∞ < a < b < ∞
 Range :-
x ∈ [a, b]
 Moment generating function :-
M_X(t) = (e^(bt) − e^(at)) / (t(b − a)) for t ≠ 0 ; 1 for t = 0
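The uniform PDF and CDF are simple enough to code directly; a minimal sketch with arbitrary endpoints a = 2 and b = 6:

```python
def uniform_pdf(x, a, b):
    """f(x) = 1/(b - a) on [a, b], 0 elsewhere."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

def uniform_cdf(x, a, b):
    """F(x) = (x - a)/(b - a) on [a, b], clamped to 0 and 1 outside."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

a, b = 2.0, 6.0  # arbitrary endpoints for illustration
mid = (a + b) / 2  # the mean, 4.0; the CDF there should be exactly 0.5
```

At the mean (a + b)/2 the CDF is exactly 1/2, reflecting the distribution's symmetry.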
2. EXPONENTIAL PROBABILITY
DISTRIBUTION
 The exponential probability distribution is used in reliability analysis to describe the time to failure of a component or system. Its probability density function is given by
ƒ(x) = λe^(−λx) , x ≥ 0 ; where λ = failure rate

 Parameters :- λ > 0 rate

 Range :- x ∈ [0, ∞)

 The exponential cumulative distribution function :-
F(x) = P(X ≤ x) = ∫₀ˣ λe^(−λt) dt = 1 − e^(−λx)
EXPONENTIAL PROBABILITY
DISTRIBUTION GRAPH
CONTINUE …

 Mean :-
µ = 1/λ
 Variance :-
σ² = 1/λ²
 Moment generating function :-
M_X(t) = λ/(λ − t) for t < λ
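The PDF and CDF above translate directly into code; a minimal sketch using a hypothetical failure rate λ = 0.5 (failures per unit time):

```python
import math

def exp_pdf(x, lam):
    """f(x) = lam * e^(-lam * x) for x >= 0, 0 for x < 0."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def exp_cdf(x, lam):
    """F(x) = 1 - e^(-lam * x) for x >= 0, 0 for x < 0."""
    return 1 - math.exp(-lam * x) if x >= 0 else 0.0

lam = 0.5  # hypothetical failure rate for illustration
mean = 1 / lam              # 2.0 time units to failure on average
median = math.log(2) / lam  # F(median) = 0.5 by definition of the median
```

The median ln(2)/λ is smaller than the mean 1/λ, reflecting the distribution's right skew.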
3. GAMMA PROBABILITY DISTRIBUTION

 Another important distribution with applications in reliability analysis is the gamma distribution. Its probability density function is given by
f(x) = λ^k x^(k−1) e^(−λx) / Γ(k)
 Where k is a shape parameter, k > 0; and λ is a rate parameter, λ > 0 (the corresponding scale parameter is θ = 1/λ).
 Mean :-
µ = k/λ
 Variance :-
σ² = k/λ²

 Parameters :- k > 0 shape
θ = 1/λ > 0 scale
CONTINUE …
 Range :- x ∈ (0, ∞)
 Moment generating function :-
M_X(t) = (1 − θt)^(−k) = (1 − t/λ)^(−k) for t < 1/θ = λ
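The gamma density has no elementary antiderivative in general, so its moments can be checked by numerical integration; a minimal sketch with arbitrary shape k = 3 and rate λ = 2:

```python
import math

def gamma_pdf(x, k, lam):
    """Rate parameterization: lam^k * x^(k-1) * e^(-lam * x) / Gamma(k)."""
    return lam**k * x**(k - 1) * math.exp(-lam * x) / math.gamma(k)

k, lam = 3.0, 2.0  # arbitrary shape and rate for illustration
dx = 0.001
xs = [i * dx for i in range(1, 30000)]  # crude Riemann sum over (0, 30)
area = sum(gamma_pdf(x, k, lam) * dx for x in xs)      # ≈ 1
mean = sum(x * gamma_pdf(x, k, lam) * dx for x in xs)  # ≈ k/lam = 1.5
```

The density integrates to 1 and the numerical mean agrees with k/λ to within the discretization error.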


GAMMA PROBABILITY DISTRIBUTION
GRAPH
4. CHI-SQUARE PROBABILITY DISTRIBUTION

 The chi-square test is one of the most commonly used nonparametric tests, in which the sampling distribution of the test statistic is a chi-square distribution when the null hypothesis is true.
 Parameters :- k ∈ N (known as "degrees of freedom")
 Range :- x ∈ (0, +∞)

 Probability density function :-
f(x) = x^(k/2 − 1) e^(−x/2) / (2^(k/2) Γ(k/2))
CONTINUE …
 Mean :- k
 Variance :- 2k

 Cumulative distribution function :-
F(x) = γ(k/2, x/2) / Γ(k/2) , where γ is the lower incomplete gamma function

 Moment generating function :-
M_X(t) = (1 − 2t)^(−k/2) for t < 1/2
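The mean k and variance 2k can be checked by integrating the density numerically; a minimal sketch with an arbitrary k = 4 degrees of freedom:

```python
import math

def chi2_pdf(x, k):
    """f(x) = x^(k/2 - 1) * e^(-x/2) / (2^(k/2) * Gamma(k/2))."""
    return x**(k / 2 - 1) * math.exp(-x / 2) / (2**(k / 2) * math.gamma(k / 2))

k = 4  # arbitrary degrees of freedom for illustration
dx = 0.001
xs = [i * dx for i in range(1, 80000)]  # crude Riemann sum over (0, 80)
area = sum(chi2_pdf(x, k) * dx for x in xs)       # ≈ 1
mean = sum(x * chi2_pdf(x, k) * dx for x in xs)   # ≈ k = 4
var = sum(x**2 * chi2_pdf(x, k) * dx for x in xs) - mean**2  # ≈ 2k = 8
```

Within the discretization error, the density integrates to 1 with mean k and variance 2k.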
CHI-SQUARE PROBABILITY
DISTRIBUTION GRAPH
5. BETA PROBABILITY DISTRIBUTION
 The beta distribution is a family of
continuous probability distributions defined on
the interval [0, 1] parametrized by two
positive shape parameters, denoted by α and β,
that appear as exponents of the random variable
and control the shape of the distribution.
 The beta distribution has been applied to model
the behaviour of random variables limited to
intervals of finite length in a wide variety of
disciplines.
 Parameters :- α > 0 shape (real)
β > 0 shape (real)
 Range :- x ∈ [0,1]
CONTINUE …
 Mean :-
µ = α/(α + β)

 Variance :-
σ² = αβ / ((α + β)²(α + β + 1))

 Probability density function :-
f(x) = x^(α−1) (1 − x)^(β−1) / B(α, β) , where B(α, β) = Γ(α)Γ(β)/Γ(α + β)

CONTINUE …
 Cumulative distribution function :-
I_x(α, β) , the regularized incomplete beta function
 Moment generating function :-
M_X(t) = ₁F₁(α; α + β; t) , Kummer's confluent hypergeometric function
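The beta density and its mean can be checked numerically, computing B(α, β) from gamma functions; a minimal sketch with arbitrary shape parameters α = 2 and β = 5:

```python
import math

def beta_pdf(x, a, b):
    """f(x) = x^(a-1) * (1-x)^(b-1) / B(a, b), with B(a, b) via gamma functions."""
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return x**(a - 1) * (1 - x)**(b - 1) / B

a, b = 2.0, 5.0  # arbitrary shape parameters for illustration
dx = 1e-4
xs = [i * dx for i in range(1, 10000)]  # crude Riemann sum over (0, 1)
area = sum(beta_pdf(x, a, b) * dx for x in xs)      # ≈ 1
mean = sum(x * beta_pdf(x, a, b) * dx for x in xs)  # ≈ a/(a+b) = 2/7
```

The numerical mean matches α/(α + β) = 2/7 to within the discretization error.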
BETA DISTRIBUTION GRAPH
6. NORMAL PROBABILITY DISTRIBUTION
 The Normal Probability Distribution is very
common in the field of statistics.
 Whenever you measure things like people's
height, weight, salary, opinions or votes, the
graph of the results is very often a normal curve.
 A random variable X whose distribution has the
shape of a normal curve is called a normal
random variable.
PROPERTIES OF A NORMAL DISTRIBUTION

 The normal curve is symmetrical about the mean µ;
 The mean is at the middle and divides the area into halves;
 The total area under the curve is equal to 1;
 It is completely determined by its mean µ and standard deviation σ (or variance σ²).
 Parameters :- µ ∈ R = mean ; σ² > 0 = variance
 Range :- x ∈ R
 Mean = µ
 Variance = σ²
CONTINUE …
 Probability density function :-
f(x) = (1/(σ√(2π))) e^(−(x − µ)²/(2σ²))

 Cumulative distribution function :-
F(x) = (1/2)[1 + erf((x − µ)/(σ√2))]

 Moment generating function :-
M_X(t) = e^(µt + σ²t²/2)

THE STANDARD NORMAL DISTRIBUTION
 It makes life a lot easier for us if we standardize our normal curve, with a mean of zero and a standard deviation of 1 unit.
 If we have the standardized situation of µ = 0 and σ = 1, then we have:
f(x) = e^(−x²/2) / √(2π)
 We can transform all the observations of any normal random variable X with mean µ and variance σ² to a new set of observations of another normal random variable Z with mean 0 and variance 1 using the following transformation:
z = (x − µ)/σ
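The standardization step can be sketched with the standard library's `statistics.NormalDist`; the mean and standard deviation below (100 and 15, e.g. test scores) are hypothetical values chosen for illustration:

```python
from statistics import NormalDist

mu, sigma = 100.0, 15.0  # hypothetical mean and standard deviation (e.g. test scores)
x = 130.0
z = (x - mu) / sigma      # standardize: z = (x - mu) / sigma = 2.0
p = NormalDist().cdf(z)   # P(Z <= 2) under the standard normal, about 0.9772
```

Any normal probability question thus reduces to a lookup on the single standard normal curve.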
