Probability Distributions


PROBABILITY

UNDERSTANDING PROBABILITIES

 EXPERIMENT A process that can lead to more than one outcome.
 RANDOM EXPERIMENT If an experiment, when repeated under identical conditions, does not produce the
same outcome every time, but the outcome of a trial is one of several possible outcomes, then such an
experiment is called a random experiment or a probabilistic experiment.
 ELEMENTARY EVENT If a random experiment is performed, then each of its outcomes is known as an
elementary event.
 SAMPLE SPACE The set of all possible outcomes of a random experiment is called the sample space
associated with it.
 EVENT A subset of the sample space associated with a random experiment is called an event.
 OCCURRENCE OF AN EVENT An event associated with a random experiment is said to occur if any one of
the elementary events belonging to it is the outcome.
 Corresponding to every event A associated with a random experiment, we define an event "not A", denoted by Ā,
which occurs when and only when A does not occur.
UNDERSTANDING PROBABILITIES
 CERTAIN (OR SURE EVENT): An event associated with a random experiment is called a certain event if it
always occurs whenever the experiment is performed.
 IMPOSSIBLE EVENT An event associated with a random experiment is called an impossible event if it never
occurs whenever the experiment is performed.
 COMPOUND EVENT An event associated with a random experiment is a compound event, if it is the disjoint
union of two or more elementary events.
 MUTUALLY EXCLUSIVE EVENTS Two or more events associated with a random experiment are said to
be mutually exclusive (or incompatible) if the occurrence of any one of them prevents the occurrence of all the
others, i.e. if no two of them can occur simultaneously in the same trial.
 EXHAUSTIVE EVENTS Two or more events associated with a random experiment are exhaustive if their
union is the sample space.
 FAVOURABLE ELEMENTARY EVENTS Let S be the sample space associated with a random experiment
and A be an event associated with the experiment. Then, elementary events belonging to A are known as
favourable elementary events to the event A.
 Thus, an elementary event E is favourable to an event A, if the occurrence of E ensures the happening or occurrence of
event A.
RANDOM VARIABLE
 Example 1:
 Two socks are selected at random and removed in succession from a drawer containing five brown socks
and three green socks. List the elements of the sample space, the corresponding probabilities, and the
corresponding values w of the random variable W, where W is the number of brown socks selected.
 We can write P(W = 2) = 5/14 for the probability of the event that the random variable W will take on the value 2.
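The probabilities in Example 1 can be checked by enumerating the ordered draws; a minimal sketch, assuming only the stated drawer contents (five brown, three green):

```python
from fractions import Fraction
from itertools import permutations

socks = ["B"] * 5 + ["G"] * 3            # five brown, three green
draws = list(permutations(range(8), 2))  # ordered draws of two socks

pmf = {}
for i, j in draws:
    w = (socks[i] == "B") + (socks[j] == "B")  # number of brown socks drawn
    pmf[w] = pmf.get(w, Fraction(0)) + Fraction(1, len(draws))

for w in sorted(pmf):
    print(w, pmf[w])  # 0 3/28, 1 15/28, 2 5/14
```

The value P(W = 2) = 5/14 quoted above falls out of the enumeration, and the three probabilities sum to 1.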
PROBABILITY DISTRIBUTIONS

 For instance, having assigned the probability 1/36 to each element of the sample space of
Figure 1, we immediately find that the random variable X, the total rolled with the pair of
dice, takes on the value 9 with probability 4/36; as described in Section 1, X = 9 contains
four of the equally likely elements of the sample space.
 Probability distributions and probability densities associated with
Figure 1: rolling two dice at the same time.
 Rather than tabulating, it is usually preferable to give a formula, that
is, to express the probabilities by means of a function whose
values, f(x), equal P(X = x) for each x within the range of the random variable X.
 For instance, for the total rolled with a pair of dice we could write
f(x) = (6 − |x − 7|)/36 for x = 2, 3, . . ., 12.
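The dice-total probabilities can be verified by enumeration and compared against the closed form f(x) = (6 − |x − 7|)/36; a sketch:

```python
from fractions import Fraction
from itertools import product

# pmf of the total X rolled with a pair of fair dice
counts = {}
for a, b in product(range(1, 7), repeat=2):
    counts[a + b] = counts.get(a + b, 0) + 1
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

assert pmf[9] == Fraction(4, 36)  # X = 9 from (3,6), (4,5), (5,4), (6,3)
# closed form: f(x) = (6 - |x - 7|)/36 for x = 2, ..., 12
assert all(pmf[x] == Fraction(6 - abs(x - 7), 36) for x in range(2, 13))
```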
 Task:-

 Find the distribution function of the random variable W of Example 1 and plot its graph.
 Solution
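One way to obtain the distribution function F(w) = P(W ≤ w) of Example 1 is to accumulate the probability distribution; a minimal sketch, with the pmf values taken from Example 1:

```python
from fractions import Fraction

# probability distribution of W from Example 1
pmf = {0: Fraction(3, 28), 1: Fraction(15, 28), 2: Fraction(5, 14)}

def F(w):
    """Distribution function F(w) = P(W <= w)."""
    return sum(p for value, p in pmf.items() if value <= w)

print(F(0), F(1), F(2))  # 3/28 9/14 1
```

The graph is a step function: 0 below w = 0, jumping by pmf[w] at each value in the support and reaching 1 at w = 2.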
 Obtaining the values of Probability Distribution from its Distribution Function
 Task:-

 Find the probability distribution of this random variable.
 This represents the distribution function of the total number of points rolled with a pair of dice.
CONTINUOUS RANDOM VARIABLES
 Suppose that a bottler of soft drinks is concerned about
the actual amount of soft drink that his bottling machine
puts into 16-ounce bottles.
 If he rounds the amounts to the nearest tenth of an ounce,
he will be dealing with a discrete random variable.
 If he rounds the amounts to the nearest hundredth of an
ounce, he will again be dealing with a discrete random
variable.
 The same holds if he rounds the amounts to the nearest
thousandth of an ounce, and so on.
 As the rounding gets finer, the probability distributions of the
corresponding discrete random variables approach a continuous curve.
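The effect of ever-finer rounding can be illustrated by simulation; a sketch assuming, purely for illustration, fill amounts uniformly spread over 15.9–16.1 ounces:

```python
import random

random.seed(0)
fills = [random.uniform(15.9, 16.1) for _ in range(10_000)]

# round to the nearest tenth, hundredth, thousandth of an ounce:
# each rounding gives a discrete random variable, with ever more values
for digits in (1, 2, 3):
    support = {round(f, digits) for f in fills}
    print(f"nearest 1e-{digits} oz: {len(support)} distinct values")
```

The support grows roughly tenfold at each step, which is the sense in which the discrete distributions approach a continuous one.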
PROBABILITY DENSITY FUNCTIONS
PDFs
MULTIVARIATE DISTRIBUTIONS

 We considered only the random variable whose values were the totals rolled with a pair of dice
 Closer to life, an experiment may consist of randomly choosing some of the 300 students attending an
elementary school:
 the principal may be interested in their IQs,
 the school nurse in their weights,
 the teachers in the number of days they have been absent, and so forth.
 First consider the bivariate case: two random variables defined on the same sample space.
 If X and Y are discrete random variables, we write the probability that X will take on the value x and Y will take
on the value y as P(X = x, Y = y). Thus, P(X = x, Y = y) is the probability of the intersection of the events X = x and
Y = y.
 Actually, as in the univariate case, it is generally preferable to represent
probabilities such as these by means of a formula. In other words, it is
preferable to express the probabilities by means of a function with the
values f (x, y) = P(X = x,Y = y) for any pair of values (x, y) within the
range of the random variables X and Y. For instance, for the two
random variables of Example 12 we can write
 To know the probability that the values of two RVs are less than or equal to some real numbers x and y, we use the joint distribution function F(x, y) = P(X ≤ x, Y ≤ y).

 With reference to Example 12, find F(1, 1).

 The joint distribution function of two RVs is defined for all real numbers.
 For instance, for Example 12 we also get F(−2, 1) = 0, since X cannot take on a value of −2 or less.
 Towards the continuous case;
MULTIVARIATE CASES

 All definitions in the bivariate case generalize to the multivariate case:

 the values of the joint probability distribution of n discrete random variables X1, X2, . . ., Xn are given by
f(x1, x2, . . ., xn) = P(X1 = x1, X2 = x2, . . ., Xn = xn)

 the values of their joint distribution function are given by
F(x1, x2, . . ., xn) = P(X1 ≤ x1, X2 ≤ x2, . . ., Xn ≤ xn)

MARGINAL DISTRIBUTIONS
 In Example 12 we derived the joint probability distribution of two
random variables X and Y (the tablets case).
 Find the probability distribution of X alone and that of Y alone.
 Soln:
 The column totals are the probabilities that X will take on the values 0, 1, 2.

 The row totals are the probabilities that Y will take on the values 0, 1, 2.


 TASK:-
 The joint probability function of two discrete random variables X and Y is given by f(x, y) = c(2x + y), where x and y
can assume all integers such that 0 ≤ x ≤ 2, 0 ≤ y ≤ 3, and f(x, y) = 0 otherwise.
 (a) Find the value of the constant c.
 (b) Find P(X = 2, Y = 1)
 (c) Find P(X≥ 1, Y≤2).
 (d) Find the marginal probabilities of (i) X, and (ii) Y
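All four parts of the task can be checked numerically by enumerating the finite support with exact fractions; a sketch:

```python
from fractions import Fraction

# f(x, y) = c * (2x + y) on 0 <= x <= 2, 0 <= y <= 3
total = sum(2 * x + y for x in range(3) for y in range(4))
c = Fraction(1, total)                                     # (a) c = 1/42
f = {(x, y): c * (2 * x + y) for x in range(3) for y in range(4)}

p_21 = f[(2, 1)]                                           # (b) 5/42
p = sum(f[(x, y)] for x in range(1, 3) for y in range(3))  # (c) 4/7
fx = {x: sum(f[(x, y)] for y in range(4)) for x in range(3)}   # (d)(i)
fy = {y: sum(f[(x, y)] for x in range(3)) for y in range(4)}   # (d)(ii)
print(c, p_21, p)
```

The marginals come out as fX(x) = (8x + 6)/42 and fY(y) = (2 + y)/14, matching the column- and row-total idea from the previous slide.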
CONDITIONAL DISTRIBUTIONS

 Suppose now that A and B are the events X = x and Y = y, so that we can write
P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) = f(x, y) / f_Y(y), provided f_Y(y) ≠ 0.
 When X and Y are continuous random variables, the probability distributions are replaced by probability densities,
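The conditional distribution f(x | y) = f(x, y) / f_Y(y) can be illustrated with the joint distribution from the earlier task, f(x, y) = (2x + y)/42 on 0 ≤ x ≤ 2, 0 ≤ y ≤ 3; a sketch:

```python
from fractions import Fraction

f = {(x, y): Fraction(2 * x + y, 42) for x in range(3) for y in range(4)}
fy = {y: sum(f[(x, y)] for x in range(3)) for y in range(4)}  # marginal of Y

def cond_x_given_y(x, y):
    """Conditional distribution f(x | y) = f(x, y) / f_Y(y)."""
    return f[(x, y)] / fy[y]

# each conditional distribution is itself a distribution: it sums to 1
assert all(sum(cond_x_given_y(x, y) for x in range(3)) == 1 for y in range(4))
print(cond_x_given_y(2, 1))  # 5/9
```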
SPECIAL PROBABILITY DISTRIBUTIONS
 A coin is tossed 3 times
 Sample space: 2^3 = 8 possible outcomes
 X is the RV counting the number of heads
 P(X = 2) = 3/8 ; sample space {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}, favourable outcomes {HHT, HTH, THH}

 Tossed 10 times: 2^10 = 1024 outcomes, so listing becomes impractical

 We use distributions to deal with such events


 Distributions are applied based on the behaviour of sample space
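The 3-toss case can still be enumerated directly, while the 1024-outcome 10-toss case shows why a distribution formula is preferable; a sketch:

```python
from fractions import Fraction
from itertools import product
from math import comb

# enumerate all 2^3 outcomes of three coin tosses
outcomes = list(product("HT", repeat=3))
favourable = [o for o in outcomes if o.count("H") == 2]
assert Fraction(len(favourable), len(outcomes)) == Fraction(3, 8)

# for 10 tosses, a binomial-style count replaces enumerating 2^10 outcomes
assert comb(10, 2) * Fraction(1, 2) ** 10 == Fraction(45, 1024)
```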
DISCRETE AND CONTINUOUS DISTRIBUTIONS

 Discrete Distribution:
 Finite outcomes:
 Rolling a die, Picking a card

 Continuous Distribution:
 Infinite outcomes:
 Recording time, Measuring distance

 We will also discuss the characteristics (parameters) of the distributions:


 Mean and Variance
THE DISCRETE UNIFORM DISTRIBUTION

 If a random variable can take on k different values with equal probability, we


say that it has a discrete uniform distribution
 All outcomes are equally likely - Equiprobable
 Eg:- Rolling a die, flipping a coin
 Random variable: X ~ U(a, b), the discrete uniform on the integers a, a+1, . . ., b
 Rolling a die:
 P(1)=P(2)=P(3)=…=P(6); 6 equally tall bars with 1/6 value.

 Parameters:
 Mean = 3.5, variance = 105/36
 The expected value 3.5 is not itself a possible outcome of a roll,
so it gives little intuition and is of limited use for predicting any single roll.
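The quoted die parameters can be computed directly from the definition of a discrete uniform distribution; a sketch:

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)  # equiprobable outcomes

mean = sum(x * p for x in faces)              # E[X]
var = sum(x**2 * p for x in faces) - mean**2  # E[X^2] - (E[X])^2

print(mean, var)  # 7/2 35/12
```

Note 35/12 is the same value as the 105/36 quoted above.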
THE BERNOULLI DISTRIBUTION

 If an experiment has two possible outcomes, "success" and
"failure," and their probabilities are, respectively, θ and 1 − θ, then
the number of successes, 0 or 1, has a Bernoulli distribution.
 One trial, only two possible outcomes
 Follows a Bernoulli distribution regardless of whether one outcome is
more likely than the other
 The probability of each outcome is known
 P(X = 1) = θ ; P(X = 0) = 1 − θ

 Variance of the Bernoulli distribution
 Var[X] = E[X^2] − (E[X])^2 = p − p^2 = p(1 − p) = p·q

 Example: flip a fair coin
 Probability of getting a head: P(H) = 1/2 ; P(T) = 1 − 1/2 = 1/2
 So Var[X] = (1/2)(1/2) = 1/4

 Mean or Expected Value of Bernoulli Distribution


 E[X] = P(X = 1) *1 + P(X = 0) * 0
 Thus, the mean or expected value of a Bernoulli
distribution is given by E[X] = p
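Both moments follow directly from the two-point distribution; a sketch, checked against the fair-coin case:

```python
from fractions import Fraction

def bernoulli_mean_var(p):
    """Mean and variance of a Bernoulli(p) random variable, by definition."""
    mean = p * 1 + (1 - p) * 0       # E[X] = P(X = 1) * 1 + P(X = 0) * 0
    ex2 = p * 1**2 + (1 - p) * 0**2  # E[X^2]
    return mean, ex2 - mean**2       # Var[X] = E[X^2] - (E[X])^2

mean, var = bernoulli_mean_var(Fraction(1, 2))  # fair coin
print(mean, var)  # 1/2 1/4
```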
 A success may be getting heads with a balanced coin, it may be catching pneumonia, it may be passing (or failing)
an examination, and it may be losing a race.

 Any event with two outcomes can be converted to a Bernoulli Distribution


 Example:
 A football team has 11 players, 4 international and 7 native.
 If we assign "selection of a native as the captain" as success,
 the experiment follows a Bernoulli distribution (with θ = 7/11 if the captain is chosen at random)
THE BINOMIAL DISTRIBUTION

 Bernoulli is a single trial


 Binomial deals with multiple trials, carrying out similar experiment several times
 A Bernoulli distribution is a special case of the Binomial distribution when n=1

 number of trials is fixed,


 the parameter θ (the probability of a success) is the same for each trial,
 and the trials are all independent.

 Flipping a coin 3 times to get 2 Heads


 To derive a formula for the probability of getting "x successes in n trials" under the stated conditions, observe
that the probability of getting x successes and n − x failures in a specific order is θ^x (1 − θ)^(n−x).

 Clearly, the number of ways in which we can select the x trials on which there is to be a success is nCx,
so b(x; n, θ) = nCx · θ^x (1 − θ)^(n−x) for x = 0, 1, . . ., n.
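The binomial formula can be checked against the earlier coin example (2 heads in 3 flips); a sketch:

```python
from fractions import Fraction
from math import comb

def binomial_pmf(x, n, theta):
    """b(x; n, theta) = C(n, x) * theta^x * (1 - theta)^(n - x)."""
    return comb(n, x) * theta**x * (1 - theta) ** (n - x)

# flipping a fair coin 3 times and getting exactly 2 heads
print(binomial_pmf(2, 3, Fraction(1, 2)))  # 3/8

# the pmf sums to 1 over x = 0, ..., n
assert sum(binomial_pmf(x, 10, Fraction(1, 2)) for x in range(11)) == 1
```

Setting n = 1 recovers the Bernoulli distribution, as noted in the slide above.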
