Theory of Probability
Theorem 5. For any two events A and B,
P(A \ B) = P(A ∩ Bᶜ) = P(A) − P(A ∩ B)
Proof. A = (A \ B) ∪ (A ∩ B),
where (A \ B) ∩ (A ∩ B) = ∅
∴ P(A) = P[(A \ B) ∪ (A ∩ B)]
= P(A \ B) + P(A ∩ B)
i.e., P(A) = P(A ∩ Bᶜ) + P(A ∩ B)
∴ P(A \ B) = P(A) − P(A ∩ B)
Theorem 6. (Addition rule) For any two events A and B,
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Proof. A ∪ B = (A \ B) ∪ B,
where (A \ B) ∩ B = ∅
So, P(A ∪ B) = P(A \ B) + P(B)
= P(A) − P(A ∩ B) + P(B)
i.e., P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Similarly, for any three events A, B and C,
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C)
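The addition rule can be checked mechanically on a small equiprobable space; the single-die events below are illustrative choices, not from the text:

```python
from fractions import Fraction

# Equiprobable sample space: one roll of a fair die.
S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))

A = {2, 4, 6}          # illustrative event: "even"
B = {4, 5, 6}          # illustrative event: "greater than 3"

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
assert lhs == rhs == Fraction(2, 3)
```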
Finite Probability Space
Let the sample space S be finite,
S = {a₁, a₂, ..., aₙ}
Assign to each outcome aᵢ a number p(aᵢ) = pᵢ ∈ [0, 1], i = 1, 2, ..., n, s.t.,
(i) pᵢ ≥ 0
(ii) p₁ + p₂ + ... + pₙ = 1
Then, pᵢ is called the probability of aᵢ.
For A ⊆ S, P(A) = Σ_{aᵢ ∈ A} p(aᵢ) = Σ pᵢ
Thus, we have the distribution of probability with respect to the outcomes of S as follows

Outcome     | a₁  a₂  ...  aₙ
Probability | p₁  p₂  ...  pₙ

It is called the probability distribution.
Finite Equiprobable Space
A probability space (S, ℰ, P), S being finite, s.t., each outcome has equal probability, is called a finite equiprobable space.
In other words, if S has n elements, then each point in S is assigned the probability 1/n, and each event A containing r points is assigned the probability r/n.
Thus, P(A) = Number of elements of A / Number of elements of S = n(A)/n(S)
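A minimal sketch of P(A) = n(A)/n(S), using two fair dice and the (illustrative) event "sum is 7":

```python
from fractions import Fraction
from itertools import product

# Two fair dice: 36 equally likely outcomes.
S = list(product(range(1, 7), repeat=2))
A = [(i, j) for i, j in S if i + j == 7]   # event "sum is 7"

# Finite equiprobable space: P(A) = n(A) / n(S)
p = Fraction(len(A), len(S))
assert p == Fraction(1, 6)
```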
Discrete Space, Countably Infinite Sample Space
Let S = {a₁, a₂, ...}.
For aᵢ ∈ S, let p(aᵢ) = pᵢ ∈ [0, 1]
s.t., (i) pᵢ ≥ 0, ∀ i
(ii) Σ_{i=1}^∞ pᵢ = 1
Then, S is called a countably infinite sample space and p(aᵢ) = pᵢ is called the probability of outcome aᵢ.
If S is finite or countably infinite, then it is called discrete; for any A ⊆ S, P(A) is called the discrete probability of A.
Uncountable Space
Here, we shall consider uncountable sample spaces having some finite geometrical measurement m(S), such as length, area, or volume, where a point in S is selected at random.
The probability of an event A, i.e., that the selected point belongs to A, is then the ratio of m(A) to m(S).
Thus, P(A) = Length of A / Length of S
or P(A) = Area of A / Area of S
or P(A) = Volume of A / Volume of S
Such a probability space S is said to be uniform. P(A) is called non-discrete probability or continuous probability.
Conditional Events and
Conditional Probabilities
Let A and B be events of a sample space S s.t., the happening of A depends upon the happening of B. Then, it is said that 'event A is conditioned by B'. It is also expressed as 'A is a conditional event with respect to B' and is denoted by A/B.
The conditional probability of A given B is defined as
P(A/B) = n(A ∩ B)/n(B), n(B) ≠ 0.
Here, n(B) denotes the number of points in B, etc.
Multiplication Theorem for
Conditional Probability
Let A and B be events of S. Then,
(i) P(B/A) = P(A ∩ B)/P(A), provided P(A) > 0.
(ii) P(A/B) = P(A ∩ B)/P(B), provided P(B) > 0.
These imply
P(A ∩ B) = P(A)P(B/A), if P(A) > 0
= P(B)P(A/B), if P(B) > 0.
For any three events A, B, C, the multiplication law is
P(A ∩ B ∩ C) = P(A)P(B/A)P(C/A ∩ B)
Here, P(A) = Probability that A occurs,
P(B/A) = Probability that B occurs assuming that A has occurred,
P(C/A ∩ B) = Probability that C occurs assuming that A and B have occurred.
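The two identities above can be verified on any finite equiprobable space; the two-dice events here are illustrative:

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))       # two fair dice
P = lambda E: Fraction(len(E), len(S))

A = {s for s in S if s[0] % 2 == 0}            # first die shows an even number
B = {s for s in S if sum(s) >= 8}              # total is at least 8

# P(B/A) = P(A ∩ B) / P(A), hence P(A ∩ B) = P(A) · P(B/A)
P_B_given_A = P(A & B) / P(A)
assert P(A & B) == P(A) * P_B_given_A
```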
Finite Stochastic Process and Tree Diagrams
A finite stochastic process is a finite sequence of experiments
where each experiment has a finite number of outcomes with
given probabilities.
A convenient way of describing such a process is by means of
a labelled tree diagram.
Independent Events
Events A and B in a probability space S are said to be independent, if the occurrence of one of them does not influence the occurrence of the other.
In terms of probability, B is independent of A, if
P(B) = P(B/A) and P(A) = P(A/B).
Property of independent events If A and B are independent events, then
P(A ∩ B) = P(A) · P(B)
We know P(A ∩ B) = P(A)P(B/A), if P(A) > 0
= P(A) · P(B)
Also, P(A ∩ B) = P(B)P(A/B), if P(B) > 0
= P(B) · P(A)
Generalization For n independent events A₁, A₂, ..., Aₙ,
P(A₁ ∩ A₂ ∩ ... ∩ Aₙ) = P(A₁) · P(A₂) ... P(Aₙ)
Theorem If A and B are mutually exclusive (i.e., A ∩ B = ∅) and independent, then either P(A) = 0 or P(B) = 0.
Proof. P(A)P(B) = P(A ∩ B) = P(∅) = 0
⇒ Either P(A) = 0 or P(B) = 0.
Hence proved.
Partitions, Total Probability and Bayes' Formula
Suppose, a set S is the union of mutually disjoint subsets A₁, A₂, ..., Aₙ. Then, it is said that {A₁, A₂, ..., Aₙ} is a partition of S.
Here, Aᵢ ∩ Aⱼ = ∅ for i ≠ j, i, j = 1, 2, ..., n, and
S = A₁ ∪ A₂ ∪ ... ∪ Aₙ
Let, for n = 3, {A₁, A₂, A₃} be a partition of S.
Let E ⊆ S.
UGC-CSIR NET Tutor Mathematical Sciences
The Venn diagram representation shows E cut by the partition into the pieces E ∩ A₁, E ∩ A₂, E ∩ A₃.
Generalization Let {A₁, A₂, ..., Aₙ} be a partition of S and let E ⊆ S.
Then,
E = E ∩ S = E ∩ (A₁ ∪ A₂ ∪ ... ∪ Aₙ)
= (E ∩ A₁) ∪ (E ∩ A₂) ∪ ... ∪ (E ∩ Aₙ)
i.e., {E ∩ A₁, E ∩ A₂, ..., E ∩ Aₙ} is a partition of E.
Total Probability of E
P(E) = P[(E ∩ A₁) ∪ (E ∩ A₂) ∪ ... ∪ (E ∩ Aₙ)]
= P(E ∩ A₁) + P(E ∩ A₂) + ... + P(E ∩ Aₙ)
= P(A₁)P(E/A₁) + P(A₂)P(E/A₂) + ... + P(Aₙ)P(E/Aₙ)
= Σᵢ P(Aᵢ)P(E/Aᵢ), i = 1 to n
Bayes' Formula
P(Aₖ/E) = P(E ∩ Aₖ)/P(E) = P(Aₖ)P(E/Aₖ) / Σᵢ P(Aᵢ)P(E/Aᵢ)
This is Bayes' formula, after the English mathematician Thomas Bayes (1702–1761).
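A sketch of total probability and Bayes' formula; the three-part partition and all its numbers are invented for illustration:

```python
from fractions import Fraction as F

# Partition A1, A2, A3 of S with priors P(Ai), and likelihoods P(E/Ai).
# All numbers are illustrative, not from the text.
prior = {"A1": F(1, 2), "A2": F(3, 10), "A3": F(1, 5)}
like  = {"A1": F(1, 50), "A2": F(1, 25), "A3": F(1, 20)}

# Total probability: P(E) = Σ P(Ai) P(E/Ai)
P_E = sum(prior[a] * like[a] for a in prior)

# Bayes' formula: P(Ak/E) = P(Ak) P(E/Ak) / P(E)
post = {a: prior[a] * like[a] / P_E for a in prior}
assert sum(post.values()) == 1   # posteriors form a distribution
```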
Bayes' Formula for a Future Event
Given (i) E ⊆ S, and (ii) P(E ∩ Aₖ), k = 1, 2, ..., n, the probability of a further event conditioned on E is computed along the same lines (see Example 3 below).
Stochastic Interpretation of Total Probability and Bayes' Formula

Descriptive Statistics and Theory of Probability

For the sake of simplicity, let n = 3 and let E be any event of S. In the stochastic tree diagram of S and A₁, A₂, A₃, the first-stage branch to Aₖ carries probability P(Aₖ), and the second-stage branch from Aₖ to E carries probability P(E/Aₖ). Along each path,
P(E ∩ Aₖ) = P(Aₖ)P(E/Aₖ)
so that
P(Aₖ/E) = P(Aₖ)P(E/Aₖ) / [P(A₁)P(E/A₁) + P(A₂)P(E/A₂) + P(A₃)P(E/A₃)]
This stochastic interpretation applies to any positive integer n.
Repeated Trials as a Stochastic
Process
The probability space of a repeated trial process may be viewed as a stochastic process whose tree diagram has the following properties
(i) each branch point has the same outcomes
(ii) all branches leading to the same outcome have the same probability
Example 1. Twenty-five books are placed at random on a shelf. Find the probability that a particular pair of books shall be (i) always together, and (ii) never together.
Solution. (i) Since 25 books can be arranged among themselves in 25! ways, the exhaustive number of cases is 25!.
Let us now regard the two particular books as tagged together, so that we treat them as a single book. Now, we have (25 − 1) = 24 books, which can be arranged among themselves in 24! ways. But the two books which are fastened together can be arranged among themselves in 2! ways. Therefore, associating these two operations, the number of favourable cases for a particular pair of books being always together is 24! × 2!.
Hence, required probability = 24! × 2!/25! = 2/25
(ii) The total number of arrangements of 25 books among themselves is 25!, and the total number of arrangements in which a particular pair of books is always together is 24! × 2. Hence, the number of arrangements in which a particular pair of books is never together is
25! − 2 × 24! = (25 − 2) × 24! = 23 × 24!
Hence, required probability = 23 × 24!/25! = 23/25
Alternate Method P[A particular pair of books shall never be together]
= 1 − P[A particular pair of books is always together]
= 1 − 2/25 = 23/25
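Both answers can be confirmed with exact factorial arithmetic:

```python
from fractions import Fraction
from math import factorial

total = factorial(25)                    # all arrangements of 25 books
together = factorial(24) * factorial(2)  # pair glued into a single "book"

assert Fraction(together, total) == Fraction(2, 25)           # always together
assert Fraction(total - together, total) == Fraction(23, 25)  # never together
```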
Example 2. If 6n tickets numbered 0, 1, 2, ..., 6n − 1 are placed in a bag and three are drawn out, show that the chance that the sum of the numbers on them is equal to 6n is 3n/[(6n − 1)(6n − 2)].
Solution. The total number of ways of drawing 3 tickets out of 6n is given by
⁶ⁿC₃ = 6n(6n − 1)(6n − 2)/3! = n(6n − 1)(6n − 2)
Favourable cases for obtaining a sum of 6n on the three drawn tickets are given below
(0, 1, 6n − 1); (0, 2, 6n − 2); ...; (0, 3n − 1, 3n + 1), i.e., (3n − 1) cases
(1, 2, 6n − 3); (1, 3, 6n − 4); ...; (1, 3n − 1, 3n), i.e., (3n − 2) cases
(2, 3, 6n − 5); (2, 4, 6n − 6); ...; (2, 3n − 2, 3n), i.e., (3n − 4) cases
(3, 4, 6n − 7); (3, 5, 6n − 8); ...; (3, 3n − 2, 3n − 1), i.e., (3n − 5) cases
...
(2n − 2, 2n − 1, 2n + 3); (2n − 2, 2n, 2n + 2), i.e., 2 cases
(2n − 1, 2n, 2n + 1), i.e., 1 case
Hence, total number of favourable cases
= {(3n − 1) + (3n − 4) + ... + 5 + 2} + {(3n − 2) + (3n − 5) + ... + 4 + 1}   ...(i)
The expression in each bracket of Eq. (i) is the sum of n terms of an arithmetic progression (AP).
Hence, total number of favourable cases
= {2 + 5 + ... + (3n − 4) + (3n − 1)} + {1 + 4 + ... + (3n − 5) + (3n − 2)}
= (n/2){2 × 2 + (n − 1)3} + (n/2){2 × 1 + (n − 1)3}
= (n/2)(4 + 3n − 3) + (n/2)(2 + 3n − 3) = 3n²
Hence, the required probability
= 3n²/[n(6n − 1)(6n − 2)] = 3n/[(6n − 1)(6n − 2)]
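The closed form can be checked against a brute-force count for small n:

```python
from fractions import Fraction
from itertools import combinations
from math import comb

def chance(n):
    """Exact probability that 3 tickets drawn from {0, ..., 6n-1} sum to 6n."""
    tickets = range(6 * n)
    fav = sum(1 for t in combinations(tickets, 3) if sum(t) == 6 * n)
    return Fraction(fav, comb(6 * n, 3))

# Compare with the closed form 3n / [(6n - 1)(6n - 2)]
for n in (1, 2, 3):
    assert chance(n) == Fraction(3 * n, (6 * n - 1) * (6 * n - 2))
```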
Example 3. There are (N + 1) identical urns marked 0, 1, 2, ..., N, each of which contains N white and red balls. The kth urn contains k red and N − k white balls, (k = 0, 1, 2, ..., N). An urn is chosen at random and n random drawings of a ball are made from it, the ball drawn being replaced after each draw. If the balls drawn are all red, show that the probability that the next drawing will also yield a red ball is approximately (n + 1)/(n + 2) when N is large.
Solution. Let Eₖ denote the event that the kth urn is chosen, (k = 0, 1, 2, ..., N). Then, the events E₀, E₁, E₂, ..., E_N are pairwise mutually exclusive, and one of them certainly occurs.
Then, P(E₀) = P(E₁) = ... = P(E_N) = 1/(N + 1) (from symmetry).
Let A denote the event of getting n red balls successively in n draws (with replacement) from an urn chosen at random.
Then, P(A/Eₖ) = (k/N)ⁿ
and P(A) = Σₖ P(Eₖ)P(A/Eₖ) = [1/(N + 1)] Σₖ (k/N)ⁿ, k = 0 to N   ...(i)
Now, using Bayes' theorem, we have
P(Eₖ/A) = P(Eₖ)P(A/Eₖ)/P(A) = (k/N)ⁿ / Σₖ (k/N)ⁿ
Let C be the future event that the (n + 1)th draw also yields a red ball. Then, on the same lines, it can be shown that
P(C/A) = Σₖ (k/N)ⁿ⁺¹ / Σₖ (k/N)ⁿ
If N is very large, then we have
(1/N) Σₖ (k/N)ⁿ ≈ ∫₀¹ xⁿ dx = 1/(n + 1)
and (1/N) Σₖ (k/N)ⁿ⁺¹ ≈ ∫₀¹ xⁿ⁺¹ dx = 1/(n + 2)
Hence, P(C/A) ≈ [1/(n + 2)] / [1/(n + 1)] = (n + 1)/(n + 2)
This is known as the law of succession, due to Laplace.
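A numeric check of the limiting value (n + 1)/(n + 2), using a large but finite N:

```python
N, n = 10_000, 5   # large N; n successive red draws

# P(C/A) = Σ (k/N)^(n+1) / Σ (k/N)^n, sums over k = 0, ..., N
s_n   = sum((k / N) ** n for k in range(N + 1))
s_np1 = sum((k / N) ** (n + 1) for k in range(N + 1))
p_next_red = s_np1 / s_n

# Laplace's law of succession predicts (n + 1)/(n + 2) for large N
assert abs(p_next_red - (n + 1) / (n + 2)) < 1e-3
```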
Example 4. From a city population, the probability of selecting (i) a male or a smoker is 7/10, (ii) a male smoker is 2/5, and (iii) a male, if a smoker is already selected, is 2/3. Find the probability of selecting (a) a non-smoker, (b) a male, and (c) a smoker, if a male is first selected.
Solution. Define the following events
A : a male is selected, B : a smoker is selected
We are given that
P(A ∪ B) = 7/10, P(A ∩ B) = 2/5, P(A/B) = 2/3
(a) The probability of selecting a non-smoker is given by
P(B̄) = 1 − P(B)
Since, P(A/B) = P(A ∩ B)/P(B)
⇒ P(B) = P(A ∩ B)/P(A/B) = (2/5)/(2/3) = 3/5
∴ P(B̄) = 1 − 3/5 = 2/5
(b) The probability of selecting a male (by the addition theorem) is given by
P(A) = P(A ∪ B) + P(A ∩ B) − P(B) = 7/10 + 2/5 − 3/5 = 1/2
(c) The probability of selecting a smoker, if a male is first selected, is given by
P(B/A) = P(A ∩ B)/P(A) = (2/5)/(1/2) = 4/5
Example 5. An urn contains 5 white and 5 black balls. 4 balls are drawn from this urn and put into another urn. From this second urn a ball is drawn and is found to be white. What is the probability of drawing a white ball again at the next draw? (The first white ball drawn is not replaced.)
Solution. The event of drawing 4 balls from the urn is associated with five mutually exclusive events as follows
E₀ = the event of drawing 0 white and 4 black balls
E₁ = the event of drawing 1 white and 3 black balls
E₂ = the event of drawing 2 white and 2 black balls
E₃ = the event of drawing 3 white and 1 black balls
E₄ = the event of drawing 4 white and 0 black balls
Then, we have
P(Eₖ) = ⁵Cₖ · ⁵C₄₋ₖ / ¹⁰C₄, k = 0, 1, 2, 3, 4
so that P(E₀) = 1/42, P(E₁) = 5/21, P(E₂) = 10/21, P(E₃) = 5/21 and P(E₄) = 1/42.
Let A denote the event of drawing a white ball from the second urn at the first draw. Then, we have
P(A/E₀) = 0, P(A/E₁) = 1/4, P(A/E₂) = 2/4,
P(A/E₃) = 3/4 and P(A/E₄) = 1.
Let C denote the future event that a white ball is drawn from the remaining 3 balls in the second urn. Then,
P(C/A ∩ E₀) = 0, P(C/A ∩ E₁) = 0, P(C/A ∩ E₂) = 1/3,
P(C/A ∩ E₃) = 2/3 and P(C/A ∩ E₄) = 1.
Hence, the required probability is given by
P(C/A) = Σₖ P(Eₖ)P(A/Eₖ)P(C/A ∩ Eₖ) / Σₖ P(Eₖ)P(A/Eₖ)
= [(10/21)(1/2)(1/3) + (5/21)(3/4)(2/3) + (1/42)(1)(1)] / [(5/21)(1/4) + (10/21)(1/2) + (5/21)(3/4) + (1/42)(1)]
= (2/9)/(1/2) = 4/9
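The same computation, done exactly with fractions:

```python
from fractions import Fraction
from math import comb

# P(E_k): k white balls among the 4 moved to the second urn
P_E = [Fraction(comb(5, k) * comb(5, 4 - k), comb(10, 4)) for k in range(5)]
P_A_given_E = [Fraction(k, 4) for k in range(5)]               # first draw white
P_C_given_AE = [Fraction(max(k - 1, 0), 3) for k in range(5)]  # second draw white

P_A = sum(p * a for p, a in zip(P_E, P_A_given_E))
P_AC = sum(p * a * c for p, a, c in zip(P_E, P_A_given_E, P_C_given_AE))

assert P_A == Fraction(1, 2)
assert P_AC / P_A == Fraction(4, 9)   # required probability
```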
Random Variable (Random Vector)
Let (S, ℰ, P) be a probability space.
If X : S → R, R = ]−∞, ∞[, is such that for every a ∈ R,
{x ∈ S | X(x) ≤ a} ∈ ℰ
then X is called a one dimensional random variable.
If X : S → R², then X is having values with two components; its elements are in the form of pairs (x₁, x₂), and X is a 2-dimensional random variable.
In general, if X : S → Rⁿ, then X is having values as n-tuples (x₁, x₂, ..., xₙ) and X is called an n-dimensional random variable.
Note If the dimension of a random variable is not mentioned, then it would be considered as a one dimensional random variable.
Important Theorems on Random
Variables (Measurable Functions)
Theorem 1. Let X : S → R (= ]−∞, ∞[).
Then, X is a random variable if and only if
{x ∈ S | X(x) ≤ a} ∈ ℰ
where a is any real number.
Theorem 2. If X₁ : S → R and X₂ : S → R be two random variables and C be a constant (real or complex), then CX₁, X₁ + X₂ and X₁X₂ are also random variables.
Theorem 3. For any constants C₁ and C₂ and two random variables X₁ : S → R, X₂ : S → R,
C₁X₁ + C₂X₂ is also a random variable.
Theorem 4. If {Xₙ | n ≥ 1} is a sequence of random variables defined on the same probability space, then
sup Xₙ, inf Xₙ, lim sup Xₙ, lim inf Xₙ
are random variables, whenever they are finite for all x ∈ S.
Theorem 5. If X is a random variable, then
(i) 1/X, where (1/X)(x) = 0, if X(x) = 0
(ii) X₊(x) = max{0, X(x)}
(iii) X₋(x) = −min{0, X(x)}
(iv) |X|
are random variables.
Theorem 6. If X₁ and X₂ are random variables, then
(i) max(X₁, X₂)
(ii) min(X₁, X₂)
are also random variables.
Theorem 7. If X is a random variable and f is a continuous function, then f(X) is also a random variable.
Theorem 8. If X is a random variable and f is an increasing function, then f(X) is a random variable.
Theorem 9. If X is a random variable and f is of bounded variation, then f(X) is a random variable.
Characteristic Random Variable
Let (S, ℰ, P) be a probability space. Let A ⊆ S.
Let χ_A : S → {0, 1} be defined as follows
χ_A(x) = 1, if x ∈ A
= 0, if x ∉ A
It satisfies the following properties
(i) χ_∅(x) = 0. Here, ∅ is the null set.
(ii) χ_S(x) = 1, S being the sample space.
(iii) A = B ⇒ χ_A(x) = χ_B(x), A, B ⊆ S.
(iv) If A ⊆ B, then χ_A(x) ≤ χ_B(x), A, B ⊆ S.
(v) χ_A(x) + χ_Ā(x) = 1, A ⊆ S, Ā = S − A.
(vi) χ_{A ∩ B}(x) = χ_A(x) · χ_B(x)
(vii) χ_{A ∪ B}(x) = χ_A(x) + χ_B(x) − χ_A(x)χ_B(x)
Discrete and Continuous Random
Variable
If it takes a countable number of values, then it is called a discrete random variable.
If it takes non-discrete values (an uncountable number of values), then it is called a continuous random variable,
or
if its values cannot be put in one-to-one correspondence with the integers, then it is called a continuous random variable.
The set of values which a random variable X takes is called the spectrum of X.
e.g., (a random variable X having countably infinite values)
A coin is tossed until a head appears. Its sample space is
S = {H, TH, TTH, TTTH, ..., T^∞}
Let X denote the number of times the coin is tossed. Then, X is a random variable with
X = {1, 2, 3, ..., ∞}
In it, ∞ is for the case that only tails occur.
Here, X is an infinite (countably) discrete random variable.
e.g., (a continuous random variable)
A point is chosen in a circle C of radius r. Let X denote the distance of the point from the centre. Then, X is a random variable whose value can be any number between 0 and r, inclusive.
Thus, X = [0, r] = {x | 0 ≤ x ≤ r}
X is an uncountable set.
It is a continuous random variable.
e.g., a linear array EMPLOYEE has n elements. Suppose, NAME appears randomly in the array and there is a linear search to find the location K of NAME, i.e., to find K such that EMPLOYEE[K] = NAME. Let f(n) denote the number of comparisons in the linear search.
(a) Let X = f(n) = the number of comparisons in the linear search.
Since NAME can appear in any position in the array with the same probability 1/n, X = 1, 2, ..., n.
Its probability distribution is

x    | 1    2    ...  n
P(x) | 1/n  1/n  ...  1/n

The expected number of comparisons is
E(X) = 1 · (1/n) + 2 · (1/n) + ... + n · (1/n) = (1 + 2 + ... + n)(1/n) = (n + 1)/2
If NAME appears at the end of the array, then f(n) = n, the worst possible case.
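A quick exact check of E(X) = (n + 1)/2 for an illustrative n:

```python
from fractions import Fraction

n = 10
# X = number of comparisons; P(X = k) = 1/n for k = 1, ..., n
E_X = sum(k * Fraction(1, n) for k in range(1, n + 1))
assert E_X == Fraction(n + 1, 2)
```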
Example 6. Let X be a continuous random variable with pdf
f(x) = ax, 0 ≤ x ≤ 1
= a, 1 ≤ x ≤ 2
= −ax + 3a, 2 ≤ x ≤ 3
= 0, elsewhere
(i) Determine the constant a. (ii) Compute P(X ≤ 1.5).
Solution. (i) The constant 'a' is determined from the consideration that the total probability is unity, i.e., ∫_{−∞}^{∞} f(x) dx = 1.
It is
1 = ∫₀¹ ax dx + ∫₁² a dx + ∫₂³ (−ax + 3a) dx
= a[x²/2]₀¹ + a[x]₁² + a[−x²/2 + 3x]₂³
= a/2 + a + a/2 = 2a
∴ a = 1/2   ...(i)
(ii) We have
P(X ≤ 1.5) = ∫₀¹ ax dx + ∫₁^{1.5} a dx = a/2 + a/2 = a = 1/2 [from Eq. (i) in part (i)]
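A numeric sanity check of a = 1/2 and P(X ≤ 1.5) = 1/2 by midpoint-rule integration:

```python
def f(x, a=0.5):
    """The pdf of Example 6 with a = 1/2."""
    if 0 <= x <= 1:
        return a * x
    if 1 < x <= 2:
        return a
    if 2 < x <= 3:
        return -a * x + 3 * a
    return 0.0

def integrate(lo, hi, steps=100_000):
    """Midpoint-rule approximation of ∫ f over [lo, hi]."""
    h = (hi - lo) / steps
    return sum(f(lo + (i + 0.5) * h) for i in range(steps)) * h

assert abs(integrate(0, 3) - 1.0) < 1e-6     # total probability is unity
assert abs(integrate(0, 1.5) - 0.5) < 1e-6   # P(X ≤ 1.5) = 1/2
```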
Example 7. A random variable X has the following probability function

x    | 0 | 1 | 2  | 3  | 4  | 5  | 6   | 7
P(x) | 0 | k | 2k | 2k | 3k | k² | 2k² | 7k² + k

(i) Find k. (ii) Evaluate P(X < 6), P(X ≥ 6) and P(0 < X < 5).
Solution. (i) Since the total probability is unity,
Σ P(x) = 9k + 10k² = 1
⇒ 10k² + 9k − 1 = 0 ⇒ (10k − 1)(k + 1) = 0
⇒ k = 1/10 (as k ≥ 0)
(ii) P(X < 6) = 0 + k + 2k + 2k + 3k + k² = 8k + k² = 81/100
P(X ≥ 6) = 2k² + (7k² + k) = 9k² + k = 19/100
P(0 < X < 5) = k + 2k + 2k + 3k = 8k = 4/5
The distribution function F_X(x) of X is given in the adjoining table

x | F_X(x)
0 | 0
1 | k = 1/10
2 | 3k = 3/10
3 | 5k = 5/10
4 | 8k = 8/10
5 | 8k + k² = 81/100
6 | 8k + 3k² = 83/100
7 | 9k + 10k² = 1
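Exact verification of k = 1/10 and the three probabilities:

```python
from fractions import Fraction

k = Fraction(1, 10)   # the root of 10k² + 9k - 1 = 0 lying in (0, 1)
p = [0, k, 2*k, 2*k, 3*k, k**2, 2*k**2, 7*k**2 + k]   # P(X = x), x = 0..7

assert sum(p) == 1                         # total probability
assert sum(p[:6]) == Fraction(81, 100)     # P(X < 6)
assert sum(p[6:]) == Fraction(19, 100)     # P(X ≥ 6)
assert sum(p[1:5]) == Fraction(4, 5)       # P(0 < X < 5)
```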
Example 8. Let X be a random variable whose density function f forms an isosceles triangle above the unit interval [0, 1] and is 0 elsewhere. Then, prove that
(i) the height of the triangle is 2;
(ii) the formula for the pdf is
f(x) = 4x, if 0 ≤ x ≤ 1/2
= −4x + 4, if 1/2 ≤ x ≤ 1
= 0, elsewhere
(iii) the mean μ = E(X) of X is 1/2.
Solution. (i) Let OPQ be the isosceles triangle of height, say, K, the base being OP = [0, 1]. As the pdf f forms OPQ, the area of OPQ = 1.
⇒ (1/2) · OP · K = 1 ⇒ (1/2) · 1 · K = 1
∴ K = 2
(ii) f is linear between x = 0 and x = 1/2, represented by OQ.
Its slope m = K/(1/2) = 4, so f(x) = 4x.
Again, f is linear between x = 1/2 and x = 1, represented by QP. Its slope m = −K/(1 − 1/2) = −4, so f(x) = −4x + 4.
Hence,
f(x) = 4x, if 0 ≤ x ≤ 1/2
= −4x + 4, if 1/2 ≤ x ≤ 1
= 0, elsewhere
(iii) We may view probability as weight and the mean as the centre of gravity. Since the triangle is symmetric about x = 1/2,
μ = E(X) = 1/2,
the mid-point of the base of the triangle between 0 and 1.
Directly, μ = E(X) = ∫₀^{1/2} x · 4x dx + ∫_{1/2}¹ x(−4x + 4) dx = 1/6 + 1/3 = 1/2
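Numeric verification that the triangular density integrates to 1 and has mean 1/2:

```python
def f(x):
    """Triangular pdf of Example 8."""
    if 0 <= x <= 0.5:
        return 4 * x
    if 0.5 < x <= 1:
        return -4 * x + 4
    return 0.0

# Midpoint rule over [0, 1]
h = 1e-5
xs = [(i + 0.5) * h for i in range(100_000)]
area = sum(f(x) for x in xs) * h
mean = sum(x * f(x) for x in xs) * h

assert abs(area - 1.0) < 1e-6   # total probability
assert abs(mean - 0.5) < 1e-6   # E(X) = 1/2, the centre of gravity
```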
Probability Mass Function and
Probability Distribution for a
Discrete Random Variable X
Let X be a one dimensional random variable having countably infinite values x₁, x₂, ....
For each xᵢ, let pᵢ = P(X = xᵢ)
be the probability that X takes the value xᵢ.
It satisfies
(i) p(xᵢ) ≥ 0, ∀ i
(ii) Σᵢ p(xᵢ) = 1
Here, p : {x₁, x₂, ...} → [0, 1].
It is called the probability mass function of X.
The set {(xᵢ, p(xᵢ)) | xᵢ ∈ X, p(xᵢ) ≥ 0, Σ p(xᵢ) = 1} is called the probability distribution of X.
In tabular form, the probability distribution of X is

x    | x₁     x₂     ...
P(x) | p(x₁)  p(x₂)  ...
Distribution Function
Let X be a random variable on (S, ℰ, P). Let
F_X : R → [0, 1]
be such that F_X(a) = P(X ≤ a)
= P{x | X(x) ≤ a}, a ∈ R,
where −∞ < a < ∞. F_X is called the distribution function of X.
Properties of Distribution Function
1. If F is a distribution function of the random variable X and if a < b, then
P(a < X ≤ b) = F(b) − F(a)
Example. Let X have pdf f(x) = 6(x − x²), 0 ≤ x ≤ 1, and 0 elsewhere. Find b such that P(X ≤ b) = P(X > b).
We have
P(X ≤ b) = ∫₀ᵇ f(x) dx = 6∫₀ᵇ (x − x²) dx = 3b² − 2b³
and P(X > b) = 1 − (3b² − 2b³)
Setting these equal,
3b² − 2b³ = 1 − 3b² + 2b³
⇒ 4b³ − 6b² + 1 = 0
⇒ (2b − 1)(2b² − 2b − 1) = 0
⇒ b = 1/2 or 2b² − 2b − 1 = 0
⇒ b = (2 ± √(4 + 8))/4 = (1 ± √3)/2
Hence, b = 1/2 is the only real value lying between 0 and 1 and satisfying the equation.
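A quick check that b = 1/2 is the only admissible root of the cubic:

```python
from fractions import Fraction

# 4b³ - 6b² + 1 = 0 factors as (2b - 1)(2b² - 2b - 1) = 0
b = Fraction(1, 2)
assert 4 * b**3 - 6 * b**2 + 1 == 0

# The quadratic factor's roots (1 ± √3)/2 lie outside (0, 1)
r1 = (1 + 3 ** 0.5) / 2
r2 = (1 - 3 ** 0.5) / 2
assert not (0 < r1 < 1) and not (0 < r2 < 1)
```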
Joint Probability Law
Let (S, ℰ, P) be a probability space.
Let X : S → R and Y : S → R be two random variables, so that for any a ∈ R,
{x ∈ S | X(x) ≤ a} ∈ ℰ and {x ∈ S | Y(x) ≤ a} ∈ ℰ
i.e., {x ∈ S | X(x) ≤ a} and {x ∈ S | Y(x) ≤ a} are events of S.
Here, X and Y have the same probability space.
We consider X × Y = {(x, y) | x ∈ X(s), y ∈ Y(s)}.
If P_{XY}(x, y) = P[(x, y) ∈ E], then it represents the probability of the event E; it is the joint probability function of X and Y.
The pair (X, Y) : S → R², (X, Y)(s) = (X(s), Y(s)), is a two dimensional random variable on S, also called a two dimensional random vector on S.
If X and Y are discrete random variables, write
X(s) = {x₁, x₂, ..., xₙ}, Y(s) = {y₁, y₂, ..., yₘ}
The joint probability function of X and Y is
P(X = xᵢ ∩ Y = yⱼ) = p(xᵢ, yⱼ) = pᵢⱼ
In tabular form,

      | y₁   y₂   ...  yₘ  | Total
x₁    | p₁₁  p₁₂  ...  p₁ₘ | p₁.
x₂    | p₂₁  p₂₂  ...  p₂ₘ | p₂.
...   | ...  ...  ...  ... | ...
xₙ    | pₙ₁  pₙ₂  ...  pₙₘ | pₙ.
Total | p.₁  p.₂  ...  p.ₘ | 1

The Probability Distribution of X
pᵢ. = Σⱼ P(X = xᵢ ∩ Y = yⱼ) = Σⱼ p(xᵢ, yⱼ), j = 1 to m
is known as the marginal probability function of X.
The Probability Distribution of Y
p.ⱼ = Σᵢ P(X = xᵢ ∩ Y = yⱼ) = Σᵢ p(xᵢ, yⱼ), i = 1 to n
is called the marginal probability function of Y.
For a Joint Distribution, Conditional Probabilities
p(xᵢ | yⱼ) = p(xᵢ, yⱼ)/p.ⱼ
is called the conditional probability function of X, given Y = yⱼ.
p(yⱼ | xᵢ) = p(xᵢ, yⱼ)/pᵢ.
is called the conditional probability function of Y, given X = xᵢ.
Here, Σᵢ pᵢ. = 1 and Σⱼ p.ⱼ = 1.
For a Joint Distribution, X and Y (r.v.s) are independent, if
P(X = xᵢ ∩ Y = yⱼ) = P(X = xᵢ) · P(Y = yⱼ)
If it is not so, then X and Y are called dependent.
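A small sketch of marginals, conditionals, and the independence test on an invented 2×2 joint table:

```python
from fractions import Fraction as F

# A joint probability table p(x_i, y_j); the numbers are illustrative.
p = {(0, 0): F(1, 8), (0, 1): F(1, 8),
     (1, 0): F(1, 4), (1, 1): F(1, 2)}

xs = {x for x, _ in p}
ys = {y for _, y in p}
px = {x: sum(p[x, y] for y in ys) for x in xs}   # marginal of X
py = {y: sum(p[x, y] for x in xs) for y in ys}   # marginal of Y
assert sum(px.values()) == 1 and sum(py.values()) == 1

# Conditional probability p(x | y) = p(x, y) / p.(y)
cond = {(x, y): p[x, y] / py[y] for (x, y) in p}

# Independence ⇔ p(x, y) = px(x) · py(y) for every cell
independent = all(p[x, y] == px[x] * py[y] for (x, y) in p)
assert independent is False   # this table is dependent
```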
Independent Random Variables
in Terms of pdf's
Let X and Y have joint pdf f_{XY}(x, y) and marginal pdf's f_X(x) and f_Y(y), respectively. They are said to be stochastically independent, if
f_{XY}(x, y) = f_X(x) · f_Y(y)
In Terms of the Distribution Function
X and Y are stochastically independent if and only if their joint distribution function F_{XY}(x, y) is the product of their marginal distribution functions F_X(x), F_Y(y).
Two variables which are not stochastically independent are said to be stochastically dependent.
Joint Probability Distribution
Function for Continuous
Random Variables
Let X and Y be two random variables. Then, (X, Y) is a two dimensional random variable.
Its corresponding joint distribution function is denoted by F(x, y).
It is defined as the probability that simultaneously the observation (X, Y) will have the property (X ≤ x ∩ Y ≤ y), i.e.,
F(x, y) = P(X ≤ x ∩ Y ≤ y)