Bsa Unit 4
Probability:
A probability is a way of assigning every event a value between zero and one, with the
requirement that the event made up of all possible results (for a die roll, the event
{1,2,3,4,5,6}) is assigned a value of one. To qualify as a probability, the assignment of values
must satisfy the requirement that for any collection of mutually exclusive events (events
with no common results, e.g., the events {1,6}, {3}, and {2,4} are all mutually exclusive), the
probability that at least one of the events will occur is given by the sum of the probabilities of
the individual events.
Theorem of Probabilities:
Multiplication Theorem
If two events A and B are independent, then the joint probability is
P(A ∩ B) = P(A) P(B).
For example, if two fair coins are flipped, the chance of both being heads is
P(both heads) = 1/2 × 1/2 = 1/4.
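The coin example can be checked by enumerating the sample space directly; a small sketch in Python:

```python
from itertools import product

# Enumerate all equally likely outcomes of flipping two fair coins.
outcomes = list(product("HT", repeat=2))  # ('H','H'), ('H','T'), ('T','H'), ('T','T')

# P(both heads), counted directly from the sample space.
p_both_heads = sum(1 for o in outcomes if o == ("H", "H")) / len(outcomes)

# Multiplication theorem: P(A and B) = P(A) * P(B) for independent events.
p_by_rule = 0.5 * 0.5

print(p_both_heads, p_by_rule)  # both 0.25
```

Both computations agree, as the theorem requires for independent events.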
Addition Theorem
If either event A or event B or both events occur on a single performance of an experiment, this is
called the union of the events A and B, denoted A ∪ B. If two events are mutually
exclusive, then the probability of either occurring is
P(A ∪ B) = P(A) + P(B).
For example, in a bag of 2 red balls and 2 blue balls (4 balls in total), the probability of taking a
red ball is 1/2; however, when taking a second ball, the probability of it being either a red ball
or a blue ball depends on the ball previously taken. If a red ball was taken, the
probability of picking a red ball again would be 1/3, since only 1 red and 2 blue balls would
have been remaining.
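The ball-drawing example can be sketched with exact fractions:

```python
from fractions import Fraction

# Bag with 2 red and 2 blue balls.
red, blue = 2, 2
total = red + blue

# Probability the first ball drawn is red.
p_first_red = Fraction(red, total)                      # 1/2

# Given a red ball was taken, probability the second ball is also red:
# 1 red ball remains out of 3 balls.
p_second_red_given_red = Fraction(red - 1, total - 1)   # 1/3

print(p_first_red, p_second_red_given_red)
```

Using `Fraction` keeps the arithmetic exact, matching the 1/2 and 1/3 in the text.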
Bayes' Theorem
Bayes' theorem (also known as Bayes' rule) is a useful tool for calculating conditional probabilities. Bayes'
theorem can be stated as follows:
Bayes' theorem. Let A1, A2, ..., An be a set of mutually exclusive events that together form the sample
space S. Let B be any event from the same sample space, such that P(B) > 0. Then,

P( Ak | B ) = P( Ak ∩ B ) / [ P( A1 ∩ B ) + P( A2 ∩ B ) + ... + P( An ∩ B ) ]

Note: Invoking the fact that P( Ak ∩ B ) = P( Ak ) P( B | Ak ), Bayes' theorem can also be expressed as

P( Ak | B ) = P( Ak ) P( B | Ak ) / [ P( A1 ) P( B | A1 ) + P( A2 ) P( B | A2 ) + ... + P( An ) P( B | An ) ]
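The second form of the theorem translates directly into code. In this sketch the priors and likelihoods are hypothetical numbers chosen only for illustration:

```python
# Hypothetical priors P(A1), P(A2), P(A3) for three mutually exclusive events
# that partition the sample space, and likelihoods P(B | Ak).
priors = [0.5, 0.3, 0.2]
likelihoods = [0.1, 0.4, 0.5]

# Denominator: the total probability P(B) = sum over k of P(Ak) * P(B | Ak).
p_b = sum(p * l for p, l in zip(priors, likelihoods))

# Posterior P(Ak | B) for each k, by Bayes' theorem.
posteriors = [p * l / p_b for p, l in zip(priors, likelihoods)]

print(posteriors)  # the posteriors sum to 1
```

Because the Ak partition the sample space, the posteriors always sum to one, which is a useful sanity check.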
Probability Distribution
A probability distribution provides the possible values of the random variable and their
corresponding probabilities. A probability distribution can be in the form of a table, graph or
mathematical formula.
Binomial Probability Distribution
The probability distribution of the random variable X is called a binomial distribution, and is
given by the formula:

P(X = x) = C(n, x) p^x q^(n−x)

where
x = 0, 1, 2, ..., n is the number of successes in n trials,
p is the probability of success on a single trial, and
q is the probability of failure on a single trial (i.e. q = 1 − p).
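The binomial formula can be computed with the standard library's binomial coefficient; a minimal sketch:

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = C(n, x) * p**x * q**(n - x), with q = 1 - p."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# Example: probability of exactly 2 heads in 4 flips of a fair coin.
print(binomial_pmf(2, 4, 0.5))  # C(4,2) / 2**4 = 6/16 = 0.375
```

Summing the formula over x = 0, 1, ..., n gives 1, as any probability distribution must.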
Poisson Probability Distribution
The Poisson distribution was developed by the French mathematician Simeon Denis Poisson in
1837.
Apart from disjoint time intervals, the Poisson random variable also applies to disjoint regions of
space.
Applications
the number of deaths by horse kicking in the Prussian army (first application)
birth defects and genetic mutations
rare diseases (like Leukemia, but not AIDS because it is infectious and so not
independent) - especially in legal cases
car accidents
traffic flow and ideal gap distance
number of typing errors on a page
hairs found in McDonald's hamburgers
spread of an endangered animal in Africa
failure of a machine in one month
The probability distribution of a Poisson random variable X representing the number of successes
occurring in a given time interval or a specified region of space is given by the formula:

P(X = x) = (e^(−μ) μ^x) / x!

where
x = 0, 1, 2, 3, ... and μ is the mean number of successes in the given interval or region.
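The Poisson formula is a direct computation; a small sketch (the mean of 3 errors per page is a hypothetical figure for the typing-errors application above):

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    """P(X = x) = e**(-mu) * mu**x / x!"""
    return exp(-mu) * mu**x / factorial(x)

# Example: a typist averages mu = 3 errors per page; probability of
# exactly 2 errors on a given page.
print(poisson_pmf(2, 3.0))
```

Note that `poisson_pmf(0, mu)` reduces to e^(−μ), the probability of no successes at all in the interval.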
A random variable X whose distribution has the shape of a normal curve is called a normal
random variable.
Normal Curve
This random variable X is said to be normally distributed with mean μ and standard deviation σ if
its probability distribution is given by
f(x) = (1 / (σ√(2π))) e^(−(x−μ)² / (2σ²))
Note:
The probability of a continuous normal variable X found in a particular interval [a, b] is the area
under the curve bounded by x=a and x=b and is given by
P(a < X < b) = ∫ from a to b of f(x) dx
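This integral has no closed form, but it can be evaluated with the error function from Python's standard library; a sketch, assuming for illustration the standard normal (μ = 0, σ = 1):

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X < x) for a normal random variable, via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

def interval_prob(a, b, mu=0.0, sigma=1.0):
    """P(a < X < b): the area under the normal curve between x = a and x = b."""
    return normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

print(interval_prob(-1, 1))  # ~0.6827: about 68% of the area lies within 1 sigma
```

The ~68% result for the interval [−1, 1] is the familiar first step of the 68-95-99.7 rule.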
It makes life a lot easier for us if we standardize our normal curve, with a mean of zero and a
standard deviation of 1 unit.
f(x) = (1 / √(2π)) e^(−x²/2)
We can transform all the observations of any normal random variable X with mean μ and
variance σ² to a new set of observations of another normal random variable Z with mean 0 and
variance 1 using the following transformation:
Z = (X − μ) / σ
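The standardizing transformation is a one-line computation; a sketch with hypothetical numbers:

```python
# Z = (X - mu) / sigma maps a normal variable with mean mu and standard
# deviation sigma to one with mean 0 and standard deviation 1.
def standardize(x, mu, sigma):
    return (x - mu) / sigma

# Example (hypothetical numbers): an observation of 85 from a distribution
# with mean 70 and standard deviation 10 lies 1.5 standard deviations
# above the mean.
print(standardize(85, 70, 10))  # 1.5
```

The resulting z-score can then be looked up in a standard normal table, which is why the transformation "makes life a lot easier".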