
Probability & Statistics

Rubayet Karim
Assistant Professor

Dept. of Industrial & Production Engineering


Jessore University of Science & Technology

Probability is a branch of mathematics that deals with calculating the likelihood of a given event's occurrence, which is expressed as a number between 0 and 1.

Classical definition:
P(E) = N(E) / N
Where: P(E): probability of occurrence of event E
N(E): number of outcomes in E
N: total number of outcomes

Relative-frequency definition:
P(E) = n(E) / N
Where: P(E): probability of occurrence of event E
n(E): number of trials in which E occurs
N: total number of trials
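
To make the two definitions concrete, here is a minimal Python sketch (not from the slides; the function names are my own) contrasting them for the event "an even face" on a fair die:

import random

def classical_probability(event, sample_space):
    # Classical definition: outcomes in E / total outcomes
    return len(event) / len(sample_space)

def empirical_probability(event, trials=100_000):
    # Relative-frequency definition: occurrences of E / number of trials
    hits = sum(random.choice((1, 2, 3, 4, 5, 6)) in event for _ in range(trials))
    return hits / trials

even = {2, 4, 6}
print(classical_probability(even, {1, 2, 3, 4, 5, 6}))  # 0.5 exactly
print(empirical_probability(even))                      # close to 0.5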

Example.
Experiment: Tossing 4 coins.
Trial: Tossing each coin.
We can consider the act of tossing each coin as a trial and thus say that there are 4 trials in the experiment of tossing 4 coins.
In probability theory, an elementary event (also called an atomic event or simple event) is an event which contains only a single outcome in the sample space.
Example: Die rolling
The possible outcomes of this experiment are 1, 2, 3, 4, 5 and 6. Each single outcome on its own is an elementary event.
When the objective is to get an even number from this experiment, the possible outcomes are 2, 4 and 6. Since this is more than a single outcome, it is not an elementary event.
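
As a small illustration of this distinction (my own, not from the slides), events can be modeled as subsets of the sample space, and an event is elementary exactly when it contains a single outcome:

sample_space = {1, 2, 3, 4, 5, 6}

def is_elementary(event):
    # An elementary (simple) event contains exactly one outcome
    return len(event) == 1

print(is_elementary({3}))        # True:  a single outcome
print(is_elementary({2, 4, 6}))  # False: "even number" is not elementary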

Some basic properties:
The impossible event is the empty set {}. Therefore P({}) = 0.
For independent events X and Y: P(X|Y) = P(X) and P(Y|X) = P(Y).
An event A and its complement A′ exhaust the sample space: P(A) + P(A′) = 1.

Types of Probability
There are four types:

Marginal probability P(X)
The probability of X occurring

Union probability P(X ∪ Y)
The probability of X or Y occurring

Joint probability P(X ∩ Y)
The probability of X and Y occurring

Conditional probability P(X|Y)
The probability of X occurring given that Y has occurred.
General Law of Addition
P(X ∪ Y) = P(X) + P(Y) − P(X ∩ Y)

Special Law of Addition (mutually exclusive events)
P(X ∪ Y) = P(X) + P(Y)
Example: for mutually exclusive events T and C, P(T ∪ C) = P(T) + P(C).

General Law of Multiplication
P(X ∩ Y) = P(X) P(Y|X) = P(Y) P(X|Y)
Example: given P(S|M) = 0.2, the multiplication law gives P(S ∩ M) = P(M) P(S|M).
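
Both laws can be checked mechanically on a single die roll. A minimal Python sketch (my own; the events are chosen arbitrarily):

from fractions import Fraction

space = {1, 2, 3, 4, 5, 6}
X = {2, 4, 6}   # even face
Y = {4, 5, 6}   # face greater than 3

def p(event):
    return Fraction(len(event), len(space))

# General law of addition: P(X ∪ Y) = P(X) + P(Y) − P(X ∩ Y)
assert p(X | Y) == p(X) + p(Y) - p(X & Y)

# General law of multiplication: P(X ∩ Y) = P(Y) P(X|Y)
p_x_given_y = Fraction(len(X & Y), len(Y))
assert p(X & Y) == p(Y) * p_x_given_y
print(p(X | Y), p(X & Y))   # 2/3 1/3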

Independent Events
Two events X and Y are independent if and only if:
P(X) = P(X|Y), P(Y) = P(Y|X)
P(X ∩ Y) = P(X) P(Y)
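
A quick check of the product rule (my own illustration) on two fair coin flips, which are independent by construction:

from fractions import Fraction
from itertools import product

space = list(product("HT", "HT"))   # all outcomes of two coin flips
first_heads  = [o for o in space if o[0] == "H"]
second_heads = [o for o in space if o[1] == "H"]
both_heads   = [o for o in space if o == ("H", "H")]

def p(event):
    return Fraction(len(event), len(space))

# Independence: P(X ∩ Y) = P(X) P(Y)
print(p(both_heads) == p(first_heads) * p(second_heads))   # True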

From the definition of conditional probability,
P(X|Y) = P(X ∩ Y) / P(Y)
and substituting P(X ∩ Y) = P(Y|X) P(X),
P(X|Y) = P(Y|X) P(X) / P(Y)
For mutually exclusive and exhaustive events X1, X2, ..., Xn, the law of total probability gives P(Y) = Σ P(Y|Xi) P(Xi), so
P(Xi|Y) = P(Y|Xi) P(Xi) / Σ P(Y|Xi) P(Xi)

Bayes' rule
The events Xi are mutually exclusive (i.e. they cannot occur together).
Together they must form a sample space.
Most of the time it uses reverse-time-order probability, i.e. P(cause|effect).
When a conditional probability is stated in reverse time order, it is called a posterior probability.

P(Y|X): time order (cause → effect)
P(X|Y): reverse time order (effect → cause)

With three mutually exclusive and exhaustive events X1, X2, X3:
P(X1|Y) = P(Y|X1) P(X1) / [P(Y|X1) P(X1) + P(Y|X2) P(X2) + P(Y|X3) P(X3)]
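
This formula translates directly into code. A minimal Python sketch (my own; the priors and likelihoods below are arbitrary illustrative numbers):

def posterior(priors, likelihoods):
    # priors[i] = P(Xi) for mutually exclusive, exhaustive events Xi
    # likelihoods[i] = P(Y|Xi)
    evidence = sum(p * l for p, l in zip(priors, likelihoods))   # P(Y)
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

# Three hypotheses
print(posterior([0.5, 0.3, 0.2], [0.1, 0.4, 0.7]))
# [0.1613..., 0.3871..., 0.4516...]; the posteriors sum to 1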

Example
Marie is getting married tomorrow, at an outdoor ceremony in
the desert. In recent years, it has rained only 5 days each year.
Unfortunately, the weatherman has predicted rain for tomorrow.
When it actually rains, the weatherman correctly forecasts rain
90% of the time. When it doesn't rain, he incorrectly forecasts
rain 10% of the time. What is the probability that it will rain on
the day of Marie's wedding?
Solution: The sample space is defined by two mutually exclusive events: it rains or it does not rain. Additionally, a third event occurs when the weatherman predicts rain. Notation for these events appears below.
Event A1. It rains on Marie's wedding.
Event A2. It does not rain on Marie's wedding.
Event B. The weatherman predicts rain.

In terms of probabilities, we know the following:
P( A1 ) = 5/365 = 0.0136985 [It rains 5 days out of the year.]
P( A2 ) = 360/365 = 0.9863014 [It does not rain 360 days out of the year.]
P( B | A1 ) = 0.9 [When it rains, the weatherman predicts rain 90% of the time.]
P( B | A2 ) = 0.1 [When it does not rain, the weatherman predicts rain 10% of the time.]

We want to know P( A1 | B ), the probability it will rain on the day of Marie's wedding, given a forecast for rain by the weatherman. The answer can be determined from Bayes' theorem, as shown below.
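
P( A1 | B ) = P( A1 ) P( B | A1 ) / [ P( A1 ) P( B | A1 ) + P( A2 ) P( B | A2 ) ]
= (0.0136985)(0.9) / [(0.0136985)(0.9) + (0.9863014)(0.1)]
= 0.0123287 / 0.1109588
≈ 0.111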

Note the somewhat unintuitive result. Even when the weatherman predicts rain, it rains only about 11% of the time. Despite the weatherman's gloomy prediction, there is a good chance that Marie will not get rained on at her wedding.


Example (complement rule):
P(X ≥ 12.5) = 0.1666
P(X < 12.5) = 1 − P(X ≥ 12.5) = 1 − 0.1666 = 0.8333
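
The slide does not show the distribution behind this number, but if X were exponential the same arithmetic could be reproduced as below; the rate lam is a hypothetical value back-solved so that P(X ≥ 12.5) ≈ 0.1666:

import math

lam = math.log(6) / 12.5         # hypothetical rate giving P(X >= 12.5) = 1/6
p_tail = math.exp(-lam * 12.5)   # exponential tail: P(X >= x) = e^(-lam*x)
print(round(p_tail, 4))          # 0.1667
print(round(1 - p_tail, 4))      # 0.8333, by the complement rule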

Application
The exponential distribution occurs naturally when describing the lengths of the inter-arrival times in a homogeneous Poisson process.
Queuing theory: the service times of agents in a system (e.g. how long it takes a bank teller to serve a customer) are often modeled as exponentially distributed variables.
Reliability theory: because of the memoryless property of this distribution, it is well suited to model the constant-hazard-rate portion of the bathtub curve.
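
The memoryless property mentioned above, P(X > s + t | X > s) = P(X > t), can be seen in a short simulation (my own sketch; the rate 0.5 is arbitrary):

import random

random.seed(1)
lam = 0.5                                     # hypothetical service rate
samples = [random.expovariate(lam) for _ in range(200_000)]

s, t = 2.0, 3.0
survived_s = [x for x in samples if x > s]
cond = sum(x > s + t for x in survived_s) / len(survived_s)   # P(X > s+t | X > s)
uncond = sum(x > t for x in samples) / len(samples)           # P(X > t)
print(round(cond, 3), round(uncond, 3))   # both close to e^(-1.5) ≈ 0.223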

Writing μ′k for the k-th raw moment E[X^k]:
1st moment of X: Mean, E[X] = μ′1
1st & 2nd moments of X: Variance, E[X²] − (E[X])² = μ′2 − (μ′1)²
1st, 2nd & 3rd moments of X: Skewness, E[X³] − 3 E[X] E[X²] + 2 (E[X])³ = μ′3 − 3 μ′1 μ′2 + 2 (μ′1)³
1st, 2nd, 3rd & 4th moments of X: Kurtosis, E[X⁴] − 4 E[X] E[X³] + 6 (E[X])² E[X²] − 3 (E[X])⁴ = μ′4 − 4 μ′1 μ′3 + 6 (μ′1)² μ′2 − 3 (μ′1)⁴
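
These identities are easy to sanity-check numerically. A short Python sketch (my own) compares the raw-moment formulas against directly computed central moments on a small dataset:

data = [1.0, 2.0, 2.0, 3.0, 7.0]
n = len(data)

def raw(k):
    # k-th raw moment: E[X^k]
    return sum(x ** k for x in data) / n

m1, m2, m3, m4 = raw(1), raw(2), raw(3), raw(4)
variance = m2 - m1 ** 2
third_central = m3 - 3 * m1 * m2 + 2 * m1 ** 3
fourth_central = m4 - 4 * m1 * m3 + 6 * m1 ** 2 * m2 - 3 * m1 ** 4

# direct central moments for comparison
direct = [sum((x - m1) ** k for x in data) / n for k in (2, 3, 4)]
print(variance, third_central, fourth_central)
print(direct)   # the same three values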

Example: The value of a piece of factory equipment after three years of use is 100(0.5)^X, where X is a random variable having moment generating function M_X(t) = 1/(1 − 2t) for t < 1/2. Calculate the expected value of this piece of equipment after three years of use.

Soln: Let value Y = 100(0.5)^X.
So the expected value is
E[Y] = E[100(0.5)^X]
= 100 E[(0.5)^X]
= 100 E[e^(X ln 0.5)]
= 100 M_X(ln 0.5)
= 100 × 1/(1 − 2 ln 0.5)
= 41.9060
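
A one-line check of this arithmetic (my own):

import math

mgf = lambda t: 1 / (1 - 2 * t)              # M_X(t) = 1/(1 − 2t), t < 1/2
print(round(100 * mgf(math.log(0.5)), 4))    # 41.906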

THE END
