Theory of Probability

Descriptive Statistics and Theory of Probability

Syllabus
* Discrete Probability
* Independent Events
* Baye's Theorem
* Random Variables and Distribution Functions (Univariate and Multivariate)
* Expectation and Moments
* Independent Random Variables, Marginal and Conditional Distributions
* Characteristic Functions
* Probability Inequalities (Tchebyshef, Markov, Jensen)
* Modes of Convergence
* Weak and Strong Laws of Large Numbers
* Central Limit Theorems (i.i.d. case)
* Sample Space
* Markov Chains with Finite and Countable State Space
* Classification of States
* Limiting Behaviour of n-step Transition Probabilities
* Stationary Distribution
* Poisson and Birth-and-death Processes
* Univariate Distributions
* Sampling Distributions
* Standard Errors and Asymptotic Distributions
* Distribution of Order Statistics and Range

Statistics
A single observation does not form statistics; statistics are a sum total of observations. Statistics are expressed quantitatively and not qualitatively, they are collected with a definite purpose, and the statistics obtained in an experiment are comparable and can be classified into various groups.

Data
Data means information (given facts). It is of two types, primary and secondary.
Primary data When data is collected with a definite plan or design, it is called primary data.
Secondary data Data which are not primarily collected but are obtained from published or unpublished sources are known as secondary data.

Presentation of Data
The collected data is put in some suitable tabular form in order to study its salient features. This is called presentation of data in tabular form. The data can be arranged
(i) in serial order or alphabetic order,
(ii) in ascending order,
(iii) in descending order.
The raw data put in ascending or descending order of magnitude is called an array or arrayed data. For large data sets, putting the data in serial order is tedious work, so the data is put in a table. In the first column of the table the distinct values are listed from lowest to greatest. For each occurrence of a value we put one bar (a vertical line segment) in the second column opposite to it; for a second occurrence of the same value a second bar is put, and so on. If a value is repeated, the same number of bars is put in the second column opposite to it, and the process is repeated for each different value.

The bars drawn in the second column are known as tally marks; to facilitate counting, tally marks are recorded in bunches of five. In the third column of the table the count of tally marks corresponding to each data value is put opposite to it. This count is called the frequency of the data value (observation). This way of presenting data in tabular form is known as a frequency distribution. The values of the data (observations) are called variates, and the number of times an observation occurs is called the frequency of that observation.

When the collection of data is very large, we divide the whole range (highest value − lowest value) into a number of classes. The values of the variates falling in the same class are then tallied and counted; this gives a grouped frequency distribution. There are two methods of classifying data according to class intervals:
(i) Exclusive method (the upper limit of one class is the lower limit of the next class).
(ii) Inclusive method (the upper limit of a class is included in the same class).
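The tally and grouping construction described above is mechanical and easy to automate. The following is a minimal Python sketch (the raw observations and helper names are illustrative, not taken from the text) that builds an ungrouped frequency distribution and a grouped frequency distribution with exclusive classes.

```python
from collections import Counter

def frequency_distribution(data):
    """Sorted (value, frequency) pairs -- the ungrouped frequency distribution."""
    return sorted(Counter(data).items())

def grouped_frequency(data, width):
    """Grouped frequency distribution with exclusive classes [l, l + width)."""
    lowest = min(data)
    classes = Counter((x - lowest) // width for x in data)
    return [((lowest + k * width, lowest + (k + 1) * width), f)
            for k, f in sorted(classes.items())]

# Illustrative raw observations (hypothetical data)
data = [12, 7, 15, 9, 21, 14, 7, 18, 11, 25, 16, 9]
print(frequency_distribution(data))
print(grouped_frequency(data, width=5))   # [((7, 12), 5), ((12, 17), 4), ((17, 22), 2), ((22, 27), 1)]
```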
Cumulative Frequency Distribution
Cumulative frequency In a discrete frequency distribution, the cumulative frequency of a particular value of the variable is the total of all the frequencies of the values of the variable which are less than or equal to that particular value. There are two types of cumulative frequencies:
(i) 'less than' (we add up the frequencies from above),
(ii) 'greater than' (we add up the frequencies from below).
The values of the variable and the corresponding cumulative frequencies together form the cumulative frequency distribution.

Graphical Representation of Statistical Data
Bar Graph
A bar graph is a pictorial representation of numerical data by a number of bars (rectangles) of uniform width erected horizontally or vertically with equal spacing between them.
Histogram
A histogram or frequency histogram is a graphical representation of a frequency distribution in the form of rectangles with class intervals as bases and heights proportional to the corresponding frequencies, such that there is no gap between any two successive rectangles. It is a two dimensional diagram.
Frequency Polygon
A frequency polygon is another method of representing a frequency distribution graphically. It can be drawn in two ways:
(i) using the histogram,
(ii) without using the histogram.

Measures of Central Tendency
(i) Arithmetic mean For observations x₁, x₂, ..., xₙ,
x̄ = (x₁ + x₂ + ... + xₙ)/n = (1/n) Σ xᵢ.
For a frequency distribution xᵢ / fᵢ, i = 1, 2, ..., n,
x̄ = (1/N) Σ fᵢxᵢ, where N = f₁ + f₂ + ... + fₙ.
Short-cut method: x̄ = A + (1/N) Σ fᵢdᵢ, where A is the assumed mean and dᵢ = xᵢ − A.
Step-deviation method: x̄ = A + (h/N) Σ fᵢdᵢ, where A is the assumed mean, dᵢ = (xᵢ − A)/h, h being the common width of the class intervals.
Properties of the mean
(a) Σ fᵢ(xᵢ − x̄) = 0.
(b) z = Σ fᵢ(xᵢ − A)² is minimum at the point A = x̄.
(ii) Median For a continuous distribution,
Median = l + (h/f)(N/2 − F),
where l is the lower limit of the median class, h its width, f its frequency and F the cumulative frequency of the preceding class. For a discrete distribution:
(a) find N/2,
(b) locate the cumulative frequency ≥ N/2,
(c) the value corresponding to that cumulative frequency is the median.
(iii) Mode = l + h(f₁ − f₀)/(2f₁ − f₀ − f₂), where f₁ is the frequency of the modal class and f₀, f₂ are the frequencies of the preceding and succeeding classes.
For a symmetrical distribution, Mean = Median = Mode.
For a moderately asymmetrical distribution, Mode = 3 Median − 2 Mean, i.e., Mean − Mode = 3(Mean − Median).
(iv) Geometric mean G = (x₁x₂ ... xₙ)^(1/n); for a frequency distribution xᵢ / fᵢ, G = (x₁^f₁ x₂^f₂ ... xₙ^fₙ)^(1/N).
(v) Harmonic mean H = N / Σ (fᵢ/xᵢ) for the data xᵢ / fᵢ.

Measures of Dispersion
The measures of central tendency give us an idea of the concentration of the observations about the central part of the distribution; they do not give a complete idea of the distribution. For that we need to measure the dispersion (or scatteredness) of the data: the data may be homogeneous (less scattered) or heterogeneous (more scattered).
Characteristics of an ideal measure of dispersion
(i) It should be rigidly defined.
(ii) It should be easy to calculate and easy to understand.
(iii) It should be based on all observations.
(iv) It should be amenable to further (mathematical) treatment.
(v) Fluctuations of sampling should have little effect on it.
There are four measures of dispersion:
(i) Range, (ii) Quartile deviation or semi-interquartile range, (iii) Mean deviation, (iv) Standard deviation.
(i) Range = Greatest observation − Smallest observation.
(ii) Quartile deviation or semi-interquartile range Q = (Q₃ − Q₁)/2, where Q₁ is the first quartile and Q₃ the third quartile of the distribution.
(iii) Mean deviation For a frequency distribution xᵢ / fᵢ, i = 1, 2, ..., n, about an average A (mean, median or mode),
Mean deviation = (1/N) Σ fᵢ|xᵢ − A|, where N = Σ fᵢ and |xᵢ − A| is the absolute value of the deviation xᵢ − A.
(iv) Standard deviation (root mean square deviation from the mean) For a frequency distribution xᵢ / fᵢ, i = 1, 2, ..., n,
σ = sqrt((1/N) Σ fᵢ(xᵢ − x̄)²),
and σ² = (1/N) Σ fᵢ(xᵢ − x̄)² is called the variance.
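As a quick illustration, here is a minimal Python sketch (the frequency table is hypothetical) that computes the weighted mean, the median via cumulative frequencies, the mean deviation about the mean, and the standard deviation for an ungrouped frequency distribution xᵢ / fᵢ.

```python
from math import sqrt

# Hypothetical ungrouped frequency distribution: value x_i with frequency f_i
x = [2, 4, 6, 8, 10]
f = [3, 5, 8, 3, 1]
N = sum(f)

mean = sum(fi * xi for xi, fi in zip(x, f)) / N

# Median: first value whose cumulative frequency reaches N/2
cum = 0
for xi, fi in zip(x, f):
    cum += fi
    if cum >= N / 2:
        median = xi
        break

mean_dev = sum(fi * abs(xi - mean) for xi, fi in zip(x, f)) / N   # about the mean
variance = sum(fi * (xi - mean) ** 2 for xi, fi in zip(x, f)) / N
sigma = sqrt(variance)

print(mean, median, mean_dev, sigma)
```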
Root Mean Square Deviation
If A is any arbitrary number, the root mean square deviation about A is
s = sqrt((1/N) Σ fᵢ(xᵢ − A)²).
Relation between σ and s: s² = σ² + d², where d = x̄ − A.

Coefficient of Dispersion
(i) Based on range: (A − B)/(A + B), where A is the greatest and B the smallest value in the data.
(ii) Based on quartile deviation: (Q₃ − Q₁)/(Q₃ + Q₁).
(iii) Based on mean deviation: (mean deviation)/(average from which it is calculated).
(iv) Based on standard deviation: σ/x̄.

Coefficient of Variation
100 times the coefficient of dispersion based upon the standard deviation is called the coefficient of variation (CV):
CV = 100 × σ/x̄.

Moments
The rth moment about a point A is
μ'ᵣ = (1/N) Σ fᵢ(xᵢ − A)ʳ, where N = Σ fᵢ,
and the rth moment about the mean (central moment) is
μᵣ = (1/N) Σ fᵢ(xᵢ − x̄)ʳ.
Relation between moments about the mean and moments about a point:
μ₂ = μ'₂ − μ'₁²,
μ₃ = μ'₃ − 3μ'₂μ'₁ + 2μ'₁³,
μ₄ = μ'₄ − 4μ'₃μ'₁ + 6μ'₂μ'₁² − 3μ'₁⁴,
and conversely
μ'₂ = μ₂ + μ'₁²,
μ'₃ = μ₃ + 3μ₂μ'₁ + μ'₁³,
μ'₄ = μ₄ + 4μ₃μ'₁ + 6μ₂μ'₁² + μ'₁⁴.
Sheppard's correction for moments (h = class width):
μ₂ (corrected) = μ₂ − h²/12,
μ₄ (corrected) = μ₄ − (h²/2)μ₂ + (7/240)h⁴.
Pearson's β and γ coefficients:
β₁ = μ₃²/μ₂³, β₂ = μ₄/μ₂², γ₁ = √β₁, γ₂ = β₂ − 3.
The rth absolute moment about the origin is νᵣ = (1/N) Σ fᵢ|xᵢ|ʳ, N = Σ fᵢ, and the rth absolute moment about the mean is (1/N) Σ fᵢ|xᵢ − x̄|ʳ, for the distribution xᵢ / fᵢ, i = 1, 2, ..., n.

Skewness
Based upon moments, the coefficient of skewness is
Sₖ = √β₁ (β₂ + 3) / (2(5β₂ − 6β₁ − 9)).
For a symmetric distribution Sₖ = 0, so either β₁ = 0 or β₂ = −3. As β₂ > 0 always, β₂ ≠ −3; hence Sₖ = 0 ⟺ β₁ = 0, so β₁ is the measure of skewness. For a symmetric distribution, Mean = Median = Mode (x̄ = M_d = M_o).
The skewness is positive if the longer tail of the distribution lies towards the higher values of the variate, i.e., the curve drawn with the help of the given data is stretched more to the right than to the left; it is negative in the contrary case.
[Figures: a symmetric distribution with Mean = Median = Mode, and a positively skewed distribution with Mode < Median < Mean.]
A distribution is said to be skewed if
(i) mean, median and mode fall at different points, i.e., Mean ≠ Median ≠ Mode,
(ii) the quartiles are not equidistant from the median,
(iii) the curve drawn with the help of the given data is not symmetrical but is stretched more to one side than the other.

Kurtosis
Even if we know the measures of central tendency, dispersion and skewness, we do not get an idea of the flatness or peakedness of the distribution. Karl Pearson used the concept of 'convexity of a curve', or kurtosis, to measure flatness or peakedness. It is measured by the coefficient β₂ = μ₄/μ₂², or by γ₂ = β₂ − 3.
A curve of type 'A', which is neither flat nor peaked, is called a normal or mesokurtic curve; for such a curve β₂ = 3, i.e., γ₂ = 0. A curve of type 'B', flatter than the normal curve, is known as platykurtic; for it β₂ < 3, i.e., γ₂ < 0. A curve of type 'C', more peaked than the normal curve, is called leptokurtic; for it β₂ > 3, i.e., γ₂ > 0.
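The central moments and the coefficients β₁, β₂, γ₁ and γ₂ defined above are straightforward to compute numerically. The following Python sketch (with a hypothetical frequency table) evaluates them directly from the definitions.

```python
from math import sqrt

# Hypothetical frequency distribution x_i / f_i
x = [1, 2, 3, 4, 5]
f = [2, 5, 9, 5, 1]
N = sum(f)
mean = sum(fi * xi for xi, fi in zip(x, f)) / N

def mu(r):
    """rth central moment: mu_r = (1/N) * sum f_i * (x_i - mean)^r."""
    return sum(fi * (xi - mean) ** r for xi, fi in zip(x, f)) / N

beta1 = mu(3) ** 2 / mu(2) ** 3
beta2 = mu(4) / mu(2) ** 2
gamma1 = sqrt(beta1)
gamma2 = beta2 - 3          # > 0 leptokurtic, < 0 platykurtic, = 0 mesokurtic

print(mu(2), beta1, beta2, gamma1, gamma2)
```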
Theory of Probability

Random Experiment
The theory of probability provides mathematical models for 'real world phenomena' involving games of chance, such as the tossing of a coin or a die. Probability has an additive property and a frequency interpretation. To deal with such probabilistic situations we need a mathematical description or model. Any such probabilistic situation is referred to as a random experiment and is denoted by E; E may be tossing a coin or a die, or drawing a card from a pack of 52 cards. Each performance of a random experiment is called a trial; all trials conducted under the same set of conditions constitute the random experiment.

Sample Space
Each outcome of a trial in a random experiment is called an outcome (or an elementary event or a sample point). The totality of all possible outcomes constitutes the sample space S. Suppose the total number of outcomes is finite, say e₁, e₂, ..., eₙ, such that no two or more outcomes occur simultaneously and exactly one of the outcomes must occur. Then S = {e₁, e₂, ..., eₙ}, and {e₁}, {e₂}, ..., {eₙ} are called the elementary events of S; {eᵢ} ⊆ S for every i.
A sample space S may be finite, countably infinite (countable or discrete), or uncountable (non-discrete).

Examples of a discrete S
* Tossing a coin once gives S = {H, T}.
* Tossing it twice gives S = {HH, HT, TH, TT}.
[Tree diagram of outcomes: first toss H or T, second toss H or T, giving HH, HT, TH, TT.]
* Tossing it three times gives S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.
[Tree diagram: first, second and third tosses, giving the eight outcomes above.]
* Tossing a die gives S = {1, 2, 3, 4, 5, 6}.
* Tossing a pair of dice gives S = {(i, j) | i, j = 1, 2, ..., 6}, i.e., 6² = 36 outcomes in the form of ordered pairs (top face of the first die, top face of the second die).

An infinite S
A coin is tossed until a head appears: S = {1, 2, 3, ..., n, ..., ∞}, where n is the number of times the coin is tossed and ∞ refers to the case in which a head never appears. As N = {1, 2, 3, ..., n, ...} is countably infinite and S has a one-to-one correspondence with N, S is countably infinite. Here N is the set of all natural numbers.

e.g., A card is drawn from an ordinary deck of 52 cards. The sample space S consists of the four suits, clubs (C), diamonds (D), hearts (H) and spades (S), where each suit has 13 cards, numbered from 2 to 10, together with jack (J), queen (Q), king (K) and ace (A). The hearts (H) and diamonds (D) are red cards; the spades (S) and clubs (C) are black cards.
[Figure: the 52-card deck arranged by suit and rank.]
Let E be the event of a picture card, i.e., jack (J), queen (Q) or king (K), and let F be the event of a heart. Then E ∩ F = {JH, QH, KH} is the event of a heart and a picture card, as shaded in the figure.
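To make the event-as-set language concrete, here is a small Python sketch (the names and encodings are illustrative) that builds the 52-card sample space and forms the events E (picture card) and F (heart) as subsets, recovering E ∩ F = {JH, QH, KH}.

```python
from itertools import product

ranks = ['2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K', 'A']
suits = ['C', 'D', 'H', 'S']                    # clubs, diamonds, hearts, spades

S = {r + s for r, s in product(ranks, suits)}   # sample space: 52 cards
E = {c for c in S if c[0] in 'JQK'}             # event: picture card
F = {c for c in S if c.endswith('H')}           # event: heart

print(len(S), len(E), len(F))   # 52 12 13
print(sorted(E & F))            # ['JH', 'KH', 'QH'] -- a heart and a picture card
```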
: A Concept of probability is from time unknown in human civilization. Laplace, Pascal etc., give them interpretation, which is growing very fast. ieee UGC-CSIR NET Tutor Mathematical Sci jences Axiomatic Definition of Probability 4 5 be a sample space. ca «be a collection of subsets (events) of Sst (i) For A Bee, A- Bee (ii) For Ae i= 12--.° UAE Let Pie 710, st, Ad. PAIZOV AEE As)=1 Al. a ‘All, WA,Be&, AM B= 6, PAU B)= PA) A ‘Then, Ps called probability function, P(A) ca df event A, and the triplet (5, €, P)is called probab When understood (5, ¢, P)is simply represented is treated as probability space. Al, All A- axioms of probability Generalization of A-lll to any disjoint events Ay, yyy Ani PUA, UA, Un Ay) = PLAY) + Ay 4 PIA (By principle of mathema finite number’ Important Theorems on Pro Theorem 1. P(9) Proof. For any event A, AU@=A, ANO= So, P(AU@)= P(A)+ PCO) Ga P\A)= P(A)+ PQQ) = P@)=0 Theorem 2. For any event A, P(A) =1- P(A) 4 Proof. We know A° = S—A= Complement. ANA‘ = and AUAS=S A P(S)=P(AW A‘) = P(A)+ P(A®) a 1=P(A)+ P(A) = PlA’)=1-P(A) 2 Theorem 3. For any event A, _ O 0) m5. For any two events A and B There p(A\ B) = P(A 0 B®) = P(A) — P(A. B) as A=(A\B)U(AnB), rey (A-DOANB)=9 - P(A) = PI(A\B)U (An B)] e = P(A\ B) + P(AN B) P(A) = P(A BS) + P(A B) P(A B) = P(A)- P(A 9B) rreorem 6. Addition rule poof. Forany two events A and B, PAU B) = P(A) + P\B)- PAO B) AUB=(A|BUB viere (A|B)OB=@ fo, RAUB)=PIA|B)+PIB) = P(A)-P(ANB)+ PB) oc RAUB)=PIA)+ PAB)- PAB) similarly, for any three events A, B and C. MAUBUC) = P(A) + P(B)+ P(C)-P(AMB) -P(ANC)-P(IBOC)+ PANBOC) finite Probability Space 1tsample space $ be finite S= {ay as ordph a lt)=p,€10, 1, = 12) an,M St, A) p.20 Gi) ~ py + Py +---+Pp Ten pis called probability ofa, XAGS, RA) E,pla)= Ep, Ths, we have distribution of probability with respect to "wcomes ofS as follows Outcome [ay devesdy Probability |p, Pr Pn * called probability distribution. Fite Equiprobable Space esbilty space (5, €, P), 5 being finite, s.t., each outcome |... B2! probability, is called a finite equiprobable space. 2 words, if $ has n elements, then each point in $ is zat the probability Yn and. each event A containing ‘Sassigned the probability r/n. Ths, 14) = Number of elements of A _ 144) Number of elements of S—n(S) 927 Discrete Space, Countably Infinite Sample Space Let S={aya, for aeS, pla)=p,€ 10,11 st, @ p20, ¥, Spat a Then, S is called countably infinite sample space and p(@,)= Pi is called probability of outcome a. IF Sis finite or countably infinite, then it is called discrete; for any AGS, P(A) is called discrete probability of A;- Uncountable Space Here, we shall consider the uncountable sample spaces as having points with some finite geometrical measurement mS), such as length, area, or volume, and where a points in S is selected at random. The probability of an event A, i.e,, that the selected point belongs to A, is then the ratio of mA) to mS) Lenght of A Thus, AA Sah ofS or piay= A or ra) ~ Volume of Such a probability space S is said to be uniform PIA) is called non-discrete probability or probability. continuous Conditional Events and Conditional Probabilities Let A and B be events of A sample space S s.t., happening of A depends upon happening of B. Then, it is said that ‘event A is conditioned by 8’. It is also expressed as ‘A is conditional event with respect to B’ and is denoted by A/B. The conditional probability of A given B is defined as AN 8) mi YA/ B)= , (B)#0. Here, n(8) denotes number of points in B, etc. 
Multiplication Theorem for Conditional Probability
Let A and B be events of S. Then
(i) P(B/A) = P(A ∩ B)/P(A), provided P(A) > 0,
(ii) P(A/B) = P(A ∩ B)/P(B), provided P(B) > 0.
These imply
P(A ∩ B) = P(A) P(B/A), if P(A) > 0,
P(A ∩ B) = P(B) P(A/B), if P(B) > 0.
For three events A, B and C, the multiplication law is
P(A ∩ B ∩ C) = P(A) P(B/A) P(C/A ∩ B).
Here P(A) is the probability that A occurs, P(B/A) is the probability that B occurs assuming that A has occurred, and P(C/A ∩ B) is the probability that C occurs assuming that A and B have occurred.

Finite Stochastic Process and Tree Diagrams
A finite stochastic process is a finite sequence of experiments where each experiment has a finite number of outcomes with given probabilities. A convenient way of describing such a process is by means of a labelled tree diagram.

Independent Events
Events A and B in a probability space S are said to be independent if the occurrence of one of them does not influence the occurrence of the other. In terms of probability, B is independent of A if P(B) = P(B/A), and A is independent of B if P(A) = P(A/B).
Property of independent events If A and B are independent events, then P(A ∩ B) = P(A) P(B).
Indeed, P(A ∩ B) = P(A) P(B/A), if P(A) > 0, which equals P(A) P(B); also P(A ∩ B) = P(B) P(A/B), if P(B) > 0, which equals P(B) P(A).
Generalization For n independent events A₁, A₂, ..., Aₙ,
P(A₁ ∩ A₂ ∩ ... ∩ Aₙ) = P(A₁) P(A₂) ... P(Aₙ).
Theorem If A and B are mutually exclusive (i.e., A ∩ B = φ) and independent, then either P(A) = 0 or P(B) = 0.
Proof. P(A) P(B) = P(A ∩ B) = P(φ) = 0, so either P(A) = 0 or P(B) = 0. Hence proved.

Partitions, Total Probability and Baye's Formula
Suppose a set S is the union of mutually disjoint subsets A₁, A₂, ..., Aₙ. Then {A₁, A₂, ..., Aₙ} is said to be a partition of S. Here Aᵢ ∩ Aⱼ = φ for i ≠ j, i, j = 1, 2, ..., n, and S = A₁ ∪ A₂ ∪ ... ∪ Aₙ.
Let, for n = 3, {A₁, A₂, A₃} be a partition of S and let E ⊆ S.
[Venn diagram: S partitioned into A₁, A₂, A₃, with E cutting across the three parts.]
Generalization Let {A₁, A₂, ..., Aₙ} be a partition of S and E ⊆ S. Then
E = (E ∩ A₁) ∪ (E ∩ A₂) ∪ ... ∪ (E ∩ Aₙ),
and {E ∩ A₁, E ∩ A₂, ..., E ∩ Aₙ} is a partition of E.
Total probability of E
P(E) = P[(E ∩ A₁) ∪ (E ∩ A₂) ∪ ... ∪ (E ∩ Aₙ)]
= P(E ∩ A₁) + P(E ∩ A₂) + ... + P(E ∩ Aₙ)
= P(A₁) P(E/A₁) + P(A₂) P(E/A₂) + ... + P(Aₙ) P(E/Aₙ)
= Σᵢ P(Aᵢ) P(E/Aᵢ).
Baye's formula
P(Aₖ/E) = P(E ∩ Aₖ)/P(E) = P(Aₖ) P(E/Aₖ) / Σᵢ P(Aᵢ) P(E/Aᵢ).
This is Baye's formula, after the English mathematician Thomas Bayes (1702-1761).
Baye's formula for a future event Given a future event F ⊆ S and the conditional probabilities P(F/E ∩ Aₖ), k = 1, 2, ..., n,
P(F/E) = Σₖ P(Aₖ/E) P(F/E ∩ Aₖ).

Stochastic Interpretation of Total Probability and Baye's Formula
For the sake of simplicity, let E be any event of S and let {A₁, A₂, A₃} be a partition of S. The corresponding stochastic tree diagram has a first stage branching to A₁, A₂, A₃ with probabilities P(A₁), P(A₂), P(A₃), and a second stage branching to E with probabilities P(E/A₁), P(E/A₂), P(E/A₃).
[Stochastic tree diagram: root → Aᵢ → E along each path.]
Along the kth path, P(E ∩ Aₖ) = P(Aₖ) P(E/Aₖ), so
P(Aₖ/E) = P(Aₖ) P(E/Aₖ) / Σᵢ P(Aᵢ) P(E/Aᵢ).
This stochastic interpretation applies to any positive integer n.

Repeated Trials as a Stochastic Process
The probability space of a repeated trial process may be viewed as a stochastic process whose tree diagram has the following properties:
(i) each branch point has the same outcomes,
(ii) all branches leading to the same outcome have the same probability.
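The total probability and Baye's formulas translate directly into code. A minimal Python sketch (function names and the urn numbers are illustrative) takes the prior probabilities P(Aᵢ) of a partition and the likelihoods P(E/Aᵢ), and returns P(E) and the posterior probabilities P(Aᵢ/E).

```python
def total_probability(priors, likelihoods):
    """P(E) = sum_i P(A_i) * P(E/A_i) for a partition A_1, ..., A_n."""
    return sum(p * l for p, l in zip(priors, likelihoods))

def bayes(priors, likelihoods):
    """Posterior probabilities P(A_k/E) by Baye's formula."""
    pe = total_probability(priors, likelihoods)
    return [p * l / pe for p, l in zip(priors, likelihoods)]

# Illustrative example: three urns chosen with equal probability,
# P(E/A_i) = probability of drawing a white ball from urn i (hypothetical values).
priors = [1/3, 1/3, 1/3]
likelihoods = [0.2, 0.5, 0.8]
print(total_probability(priors, likelihoods))   # 0.5
print(bayes(priors, likelihoods))               # [0.1333..., 0.3333..., 0.5333...]
```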
Example 1. Twenty five books are placed at random on a shelf. Find the probability that a particular pair of books shall be (i) always together, and (ii) never together.
Solution. Since 25 books can be arranged among themselves in 25! ways, the exhaustive number of cases is 25!.
(i) Regard the two particular books as tagged together, so that they are treated as a single book. We then have 25 − 1 = 24 books, which can be arranged among themselves in 24! ways; and the two books which are fastened together can be arranged among themselves in 2! ways. Therefore, combining these two operations, the number of favourable cases for the particular pair of books being always together is 24! × 2!.
Hence, required probability = (24! × 2!)/25! = 2/25.
(ii) The total number of arrangements of the 25 books is 25!, and the number of arrangements in which the particular pair of books is always together is 24! × 2. Hence, the number of arrangements in which the particular pair of books is never together is
25! − 2 × 24! = (25 − 2) × 24! = 23 × 24!.
Hence, required probability = (23 × 24!)/25! = 23/25.
Alternate method
P[a particular pair of books shall never be together] = 1 − P[the particular pair of books is always together] = 1 − 2/25 = 23/25.

Example 2. If 6n tickets numbered 0, 1, 2, ..., 6n − 1 are placed in a bag and three are drawn out, show that the chance that the sum of the numbers on them is equal to 6n is 3n/((6n − 1)(6n − 2)).
Solution. The total number of ways of drawing 3 tickets out of 6n is ⁶ⁿC₃ = n(6n − 1)(6n − 2).
The favourable cases for obtaining a sum of 6n on the three drawn tickets are listed below:
(0, 1, 6n − 1), (0, 2, 6n − 2), ..., (0, 3n − 1, 3n + 1), i.e., (3n − 1) cases;
(1, 2, 6n − 3), (1, 3, 6n − 4), ..., (1, 3n − 1, 3n), i.e., (3n − 2) cases;
(2, 3, 6n − 5), (2, 4, 6n − 6), ..., i.e., (3n − 4) cases;
(3, 4, 6n − 7), (3, 5, 6n − 8), ..., i.e., (3n − 5) cases;
...
(2n − 2, 2n − 1, 2n + 3), (2n − 2, 2n, 2n + 2), i.e., 2 cases;
(2n − 1, 2n, 2n + 1), i.e., 1 case.
Hence, the total number of favourable cases
= {(3n − 1) + (3n − 4) + ... + 5 + 2} + {(3n − 2) + (3n − 5) + ... + 4 + 1}.   ...(i)
The expression in each bracket of Eq. (i) is the sum of n terms of an arithmetic progression (AP). Hence, the total number of favourable cases
= {2 + 5 + ... + (3n − 4) + (3n − 1)} + {1 + 4 + ... + (3n − 5) + (3n − 2)}
= (n/2){2 × 2 + (n − 1) × 3} + (n/2){2 × 1 + (n − 1) × 3}
= (n/2)(4 + 3n − 3 + 2 + 3n − 3) = 3n².
Hence, the required probability = 3n²/(n(6n − 1)(6n − 2)) = 3n/((6n − 1)(6n − 2)).

Example 3. There are (N + 1) identical urns marked 0, 1, 2, ..., N, each of which contains N white and red balls; the kth urn contains k red and N − k white balls (k = 0, 1, 2, ..., N). An urn is chosen at random and n random drawings of a ball are made from it, the ball drawn being replaced after each draw. If the balls drawn are all red, show that the probability that the next drawing will also yield a red ball is approximately (n + 1)/(n + 2) when N is large.
Solution. Let Eₖ denote the event that the kth urn is chosen (k = 0, 1, 2, ..., N). The events E₀, E₁, ..., E_N are pairwise mutually exclusive, and one of them certainly occurs; by symmetry,
P(E₀) = P(E₁) = ... = P(E_N) = 1/(N + 1).
Let A denote the event of getting n red balls successively in n draws (with replacement) from the urn chosen at random. Then
P(A/Eₖ) = (k/N)ⁿ and P(A) = Σₖ P(Eₖ) P(A/Eₖ) = (1/(N + 1)) Σₖ₌₀^N (k/N)ⁿ.
Now, using Baye's theorem,
P(Eₖ/A) = P(Eₖ) P(A/Eₖ) / Σₖ P(Eₖ) P(A/Eₖ) = (k/N)ⁿ / Σₖ₌₀^N (k/N)ⁿ.
Let C be the future event that the (n + 1)th draw also yields a red ball. Then, on the same lines, it can be shown that
P(C/A) = Σₖ P(Eₖ/A) (k/N) = Σₖ₌₀^N (k/N)^(n+1) / Σₖ₌₀^N (k/N)ⁿ.
If N is very large, then
(1/N) Σₖ₌₀^N (k/N)ⁿ ≈ ∫₀¹ xⁿ dx = 1/(n + 1) and (1/N) Σₖ₌₀^N (k/N)^(n+1) ≈ ∫₀¹ x^(n+1) dx = 1/(n + 2).
Hence, P(C/A) ≈ (1/(n + 2)) / (1/(n + 1)) = (n + 1)/(n + 2).
This is known as the law of succession, due to Laplace.
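The limiting value in Example 3 can be checked numerically. The following Python sketch (parameter values are illustrative) computes the exact P(C/A) for a finite N and compares it with Laplace's approximation (n + 1)/(n + 2).

```python
def succession_probability(N, n):
    """Exact P(C/A) = sum_k (k/N)^(n+1) / sum_k (k/N)^n over urns k = 0, ..., N."""
    num = sum((k / N) ** (n + 1) for k in range(N + 1))
    den = sum((k / N) ** n for k in range(N + 1))
    return num / den

n = 4
for N in (10, 100, 10000):
    print(N, succession_probability(N, n))      # approaches 5/6 as N grows
print("Laplace approximation:", (n + 1) / (n + 2))
```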
Example 4. From a city population, the probability of selecting (i) a male or a smoker is 7/10, (ii) a male smoker is 2/5, and (iii) a male, if a smoker is already selected, is 2/3. Find the probability of selecting (a) a non-smoker, (b) a male, and (c) a smoker, if a male is first selected.
Solution. Define the events A : a male is selected, and B : a smoker is selected. We are given that
P(A ∪ B) = 7/10, P(A ∩ B) = 2/5, P(A/B) = 2/3.
(a) Since P(A/B) = P(A ∩ B)/P(B), we get P(B) = P(A ∩ B)/P(A/B) = (2/5)/(2/3) = 3/5, so the probability of selecting a non-smoker is
P(B̄) = 1 − P(B) = 1 − 3/5 = 2/5.
(b) By the addition theorem, the probability of selecting a male is
P(A) = P(A ∪ B) + P(A ∩ B) − P(B) = 7/10 + 2/5 − 3/5 = 1/2.
(c) The probability of selecting a smoker, if a male is first selected, is
P(B/A) = P(A ∩ B)/P(A) = (2/5)/(1/2) = 4/5.

Example 5. An urn contains 5 white and 5 black balls. Four balls are drawn from this urn and put into another urn. From this second urn a ball is drawn and is found to be white. What is the probability of drawing a white ball again at the next draw? (The first white ball drawn is not replaced.)
Solution. The event of drawing 4 balls from the first urn is associated with five mutually exclusive events:
E₀ = the event of drawing 0 white and 4 black balls,
E₁ = the event of drawing 1 white and 3 black balls,
E₂ = the event of drawing 2 white and 2 black balls,
E₃ = the event of drawing 3 white and 1 black ball,
E₄ = the event of drawing 4 white and 0 black balls.
Then P(Eₖ) = ⁵Cₖ · ⁵C₄₋ₖ / ¹⁰C₄, k = 0, 1, 2, 3, 4.
Let A denote the event of drawing a white ball from the second urn at the first draw. Then
P(A/E₀) = 0, P(A/E₁) = 1/4, P(A/E₂) = 2/4, P(A/E₃) = 3/4, P(A/E₄) = 1.
Let C denote the future event of drawing a white ball from the remaining 3 balls of the second urn at the next draw. Then
P(C/A ∩ E₀) = 0, P(C/A ∩ E₁) = 0, P(C/A ∩ E₂) = 1/3, P(C/A ∩ E₃) = 2/3, P(C/A ∩ E₄) = 1.
Hence,
P(C/A) = Σₖ P(Eₖ) P(A/Eₖ) P(C/A ∩ Eₖ) / Σₖ P(Eₖ) P(A/Eₖ) = (2/9)/(1/2) = 4/9.

Random Variable (Random Vector)
Let (S, ε, P) be a probability space. A function X : S → R, R = ]−∞, ∞[, such that for every a ∈ R the set {x ∈ S | X(x) ≤ a} ∈ ε, is called a one dimensional random variable. If X : S → R², then the values of X have two components, i.e., they are of the form of pairs (x₁, x₂), and X is a 2-dimensional random variable. In general, if X : S → Rⁿ, then X has values which are n-tuples (x₁, x₂, ..., xₙ) and X is called an n-dimensional random variable.
Note If the dimension of a random variable is not mentioned, it is considered to be a one dimensional random variable.

Important Theorems on Random Variables (Measurable Functions)
Theorem 1. Let X : S → R = ]−∞, ∞[. Then X is a random variable if and only if {x ∈ S | X(x) < a} ∈ ε, where a is any real number.
Theorem 2. If X₁ : S → R and X₂ : S → R are two random variables and C is a constant (real or complex), then CX₁, X₁ + X₂ and X₁X₂ are also random variables.
Theorem 3. For any constants C₁ and C₂ and random variables X₁ : S → R, X₂ : S → R, C₁X₁ + C₂X₂ is also a random variable.
Theorem 4. If {Xₙ | n ≥ 1} is a sequence of random variables defined on S, then sup Xₙ, inf Xₙ, lim sup Xₙ and lim inf Xₙ are random variables, whenever they are finite for all x ∈ S.
Theorem 5. If X is a random variable, then
(i) 1/X, where (1/X)(x) = ∞ if X(x) = 0,
(ii) X₊(x) = max{0, X(x)},
(iii) X₋(x) = −min{0, X(x)},
(iv) |X|
are random variables.
Theorem 6. If X₁ and X₂ are random variables, then (i) max(X₁, X₂) and (ii) min(X₁, X₂) are also random variables.
Theorem 7. If X is a random variable and f is a continuous function, then f(X) is also a random variable.
Theorem 8. If X is a random variable and f is an increasing function, then f(X) is a random variable.
Theorem 9. If X is a random variable and f is of bounded variation, then f(X) is a random variable.

Characteristic Random Variable
Let (S, ε, P) be a probability space and let A ⊆ S. Let χ_A : S → {0, 1} be defined by
χ_A(x) = 1 if x ∈ A, and χ_A(x) = 0 if x ∉ A.
It satisfies the following properties:
(i) χ_φ(x) = 0, where φ is the null set.
(ii) χ_S(x) = 1, S being the sample space.
(iii) A = B ⟺ χ_A(x) = χ_B(x), for A, B ⊆ S.
(iv) If A ⊆ B, then χ_A(x) ≤ χ_B(x), for A, B ⊆ S.
(v) χ_A(x) + χ_Ā(x) = 1, A ⊆ S, Ā = S − A.
(vi) χ_{A ∩ B}(x) = χ_A(x) χ_B(x).
(vii) χ_{A ∪ B}(x) = χ_A(x) + χ_B(x) − χ_A(x) χ_B(x).
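These identities are easy to check computationally. The sketch below (the sets are chosen arbitrarily) verifies properties (v)-(vii) for indicator functions over a small finite sample space.

```python
S = set(range(10))                       # a small finite sample space
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def chi(E):
    """Indicator (characteristic) function of the event E as a dict over S."""
    return {x: 1 if x in E else 0 for x in S}

cA, cB = chi(A), chi(B)
cAc = chi(S - A)                         # complement of A
for x in S:
    assert cA[x] + cAc[x] == 1                                   # property (v)
    assert chi(A & B)[x] == cA[x] * cB[x]                        # property (vi)
    assert chi(A | B)[x] == cA[x] + cB[x] - cA[x] * cB[x]        # property (vii)
print("indicator identities verified on", len(S), "points")
```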
Discrete and Continuous Random Variables
If a random variable takes a countable number of values, it is called a discrete random variable. If it takes an uncountable number of values, i.e., if its values cannot be put in one-to-one correspondence with the integers, it is called a continuous random variable.
The set of values which a random variable X takes is called the spectrum of X.
e.g. (a random variable X having countably infinite values) A coin is tossed until a head appears. Its sample space is S = {H, TH, TTH, TTTH, ..., TTT...}. Let X denote the number of times the coin is tossed. Then X is a random variable with values {1, 2, 3, ..., ∞}, where '∞' stands for the case in which only tails occur. Here X is a countably infinite discrete random variable.
e.g. (a continuous random variable) A point is chosen in a circle C of radius r. Let X denote the distance of the point from the centre. Then X is a random variable whose value can be any number between 0 and r, inclusive. Thus X takes values in [0, r] = {x | 0 ≤ x ≤ r}, an uncountable set, so X is a continuous random variable.
e.g. A linear array EMPLOYEE has n elements. Suppose NAME appears randomly in the array and a linear search is made to find the location K of NAME, i.e., to find K such that EMPLOYEE[K] = NAME. Let f(n) denote the number of comparisons in the linear search, and let X = f(n). Since NAME can appear in any position in the array with the same probability 1/n, X takes the values 1, 2, ..., n, each with probability 1/n. Its probability distribution is
x : 1, 2, ..., n
p(x) : 1/n, 1/n, ..., 1/n
The expected number of comparisons is
E(X) = 1·(1/n) + 2·(1/n) + ... + n·(1/n) = (1 + 2 + ... + n)/n = (n + 1)/2.
If NAME appears at the end of the array, then f(n) = n, the worst possible case.

Example 6. Let X be a continuous random variable with pdf
f(x) = ax, 0 ≤ x ≤ 1,
f(x) = a, 1 ≤ x ≤ 2,
f(x) = −ax + 3a, 2 ≤ x ≤ 3,
f(x) = 0, elsewhere.
(i) Determine the constant a. (ii) Compute P(X ≤ 1.5).
Solution. (i) The constant a is determined from the condition that the total probability is unity, i.e., ∫_{−∞}^{∞} f(x) dx = 1. That is,
∫₀¹ ax dx + ∫₁² a dx + ∫₂³ (−ax + 3a) dx = 1
⟹ a[x²/2]₀¹ + a[x]₁² + a[−x²/2 + 3x]₂³ = 1
⟹ a/2 + a + a/2 = 1 ⟹ 2a = 1 ⟹ a = 1/2.   ...(i)
(ii) Using the value of a from Eq. (i),
P(X ≤ 1.5) = ∫₀¹ (x/2) dx + ∫₁^{1.5} (1/2) dx = 1/4 + 1/4 = 1/2.

Example 7. A random variable X has the following probability function:
x : 0, 1, 2, 3, 4, 5, 6, 7
p(x) : 0, k, 2k, 2k, 3k, k², 2k², 7k² + k
(i) Find k. (ii) Evaluate P(X < 6), P(X ≥ 6) and P(0 < X < 5). (iii) If P(X ≤ a) > 1/2, find the minimum value of a. (iv) Determine the distribution function of X.
Solution. (i) Since Σₓ p(x) = 1,
k + 2k + 2k + 3k + k² + 2k² + 7k² + k = 1 ⟹ 10k² + 9k − 1 = 0 ⟹ (10k − 1)(k + 1) = 0 ⟹ k = 1/10 or k = −1.
But since p(x) cannot be negative, k = 1/10.
(ii) P(X < 6) = P(X = 0) + P(X = 1) + ... + P(X = 5) = 0 + 1/10 + 2/10 + 2/10 + 3/10 + 1/100 = 81/100.
P(X ≥ 6) = 1 − P(X < 6) = 19/100.
P(0 < X < 5) = P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4) = 8/10 = 4/5.
(iii) P(X ≤ a) > 1/2: by trial, the minimum value is a = 4 (since P(X ≤ 3) = 5/10 and P(X ≤ 4) = 8/10).
(iv) The distribution function F_X(x) of X is given in the following table:
x = 0 : F_X(x) = 0
x = 1 : F_X(x) = k = 1/10
x = 2 : F_X(x) = 3k = 3/10
x = 3 : F_X(x) = 5k = 5/10
x = 4 : F_X(x) = 8k = 8/10
x = 5 : F_X(x) = 8k + k² = 81/100
x = 6 : F_X(x) = 8k + 3k² = 83/100
x = 7 : F_X(x) = 9k + 10k² = 1
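Example 7 can be verified numerically. The sketch below (standard library only) uses the admissible root k = 1/10 of 10k² + 9k − 1 = 0, rebuilds the probability function, and checks the probabilities and the distribution function computed above.

```python
from fractions import Fraction

# Roots of 10k^2 + 9k - 1 = 0 are 1/10 and -1; only the non-negative root is admissible.
k = Fraction(1, 10)

p = {0: 0 * k, 1: k, 2: 2 * k, 3: 2 * k, 4: 3 * k, 5: k**2, 6: 2 * k**2, 7: 7 * k**2 + k}
assert sum(p.values()) == 1

print(sum(p[x] for x in range(6)))            # P(X < 6)    = 81/100
print(1 - sum(p[x] for x in range(6)))        # P(X >= 6)   = 19/100
print(sum(p[x] for x in range(1, 5)))         # P(0 < X < 5) = 4/5

F, c = {}, Fraction(0)
for x in sorted(p):
    c += p[x]
    F[x] = c                                  # distribution function F_X(x)
print(min(x for x in F if F[x] > Fraction(1, 2)))   # minimum a with P(X <= a) > 1/2 -> 4
```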
Example 8. Let X be a random variable whose density function f forms an isosceles triangle above the unit interval [0, 1] and is 0 elsewhere. Prove that
(i) the height of the triangle is 2,
(ii) the formula for the pdf is
f(x) = 4x, if 0 ≤ x ≤ 1/2,
f(x) = −4x + 4, if 1/2 ≤ x ≤ 1,
f(x) = 0, elsewhere,
(iii) the mean μ = E(X) of X is 1/2.
[Figure: the triangular density OPQ with base OP = [0, 1] and apex Q above x = 1/2.]
Solution. Let OPQ be the isosceles triangle of height, say, K, with base OP = [0, 1]. As the pdf f forms OPQ, the area of OPQ is 1, so
(1/2)·OP·K = 1 ⟹ (1/2)·1·K = 1 ⟹ K = 2.
f is linear between x = 0 and x = 1/2, represented by OQ; its slope is K/(1/2) = 4, so f(x) = 4x on [0, 1/2]. Again, f is linear between x = 1/2 and x = 1, represented by QP; its slope is m = −K/(1 − 1/2) = −4, so f(x) = −4x + 4 on [1/2, 1]. Hence
f(x) = 4x if 0 ≤ x ≤ 1/2, f(x) = −4x + 4 if 1/2 ≤ x ≤ 1, and f(x) = 0 elsewhere.
We may view probability as weight and the mean as the centre of gravity. Since the triangle is symmetric about x = 1/2, μ = E(X) = 1/2, the mid-point of the base of the triangle between 0 and 1. Directly,
μ = E(X) = ∫ x f(x) dx = ∫₀^{1/2} 4x² dx + ∫_{1/2}^1 x(−4x + 4) dx = 1/6 + 1/3 = 1/2.

Probability Mass Function and Probability Distribution for a Discrete Random Variable X
Let X be a one dimensional random variable having countably many values x₁, x₂, .... For each xᵢ, let pᵢ = p(xᵢ) be the probability that X takes the value xᵢ. It satisfies
(i) p(xᵢ) ≥ 0 for all i, (ii) Σᵢ p(xᵢ) = 1.
Here p : {x₁, x₂, ...} → [0, 1] is called the probability mass function of X. The set {(xᵢ, p(xᵢ)) | xᵢ ∈ X, p(xᵢ) ≥ 0, Σ p(xᵢ) = 1} is called the probability distribution of X. In tabular form, the probability distribution of X is
x : x₁, x₂, ...
p(x) : p(x₁), p(x₂), ...

Distribution Function
Let X be a random variable on (S, ε, P). The function F_X : R → [0, 1] defined by
F_X(a) = P(X ≤ a) = P{x | X(x) ≤ a}, −∞ < a < ∞,
is called the distribution function of X.
Properties of the distribution function
1. If F is the distribution function of the random variable X and a < b, then P(a < X ≤ b) = F(b) − F(a).
e.g., Let X have pdf f(x) = 6x(1 − x), 0 ≤ x ≤ 1, and let b be such that P(X < b) = P(X > b).   ...(*)
We have P(X < b) = ∫₀ᵇ 6x(1 − x) dx = 3b² − 2b³ and P(X > b) = 1 − (3b² − 2b³). Hence
3b² − 2b³ = 1 − 3b² + 2b³ ⟹ 4b³ − 6b² + 1 = 0 ⟹ (2b − 1)(2b² − 2b − 1) = 0
⟹ b = 1/2 or 2b² − 2b − 1 = 0, i.e., b = (1 ± √3)/2.
Hence b = 1/2 is the only real value lying between 0 and 1 and satisfying (*).

Joint Probability Law
Let (S, ε, P) be a probability space and let X : S → R and Y : S → R be two random variables, so that for any a ∈ R, {x ∈ S | X(x) ≤ a} ∈ ε and {x ∈ S | Y(x) ≤ a} ∈ ε, i.e., both are events of S. Here X and Y have the same probability space. The pair
(X, Y) = {(X(s), Y(s)) | s ∈ S}
is a two dimensional random variable on S, also called a two dimensional random vector on S, and for an event E, P_{XY}(E) = P[(X, Y) ∈ E] is the joint probability of E.
If X and Y are discrete random variables with values {x₁, x₂, ..., xₙ} and {y₁, y₂, ..., yₘ}, the joint probability function of X and Y is
p(xᵢ, yⱼ) = P(X = xᵢ ∩ Y = yⱼ) = pᵢⱼ.
In tabular form (with row totals pᵢ. and column totals p.ⱼ):
X \ Y : y₁, y₂, ..., yₘ | Total
x₁ : p₁₁, p₁₂, ..., p₁ₘ | p₁.
x₂ : p₂₁, p₂₂, ..., p₂ₘ | p₂.
...
xₙ : pₙ₁, pₙ₂, ..., pₙₘ | pₙ.
Total : p.₁, p.₂, ..., p.ₘ | 1

The Marginal Probability Distribution of X
p(xᵢ) = P(X = xᵢ) = Σⱼ p(xᵢ, yⱼ) = pᵢ.
is known as the marginal probability function of X.
The Marginal Probability Distribution of Y
p(yⱼ) = P(Y = yⱼ) = Σᵢ p(xᵢ, yⱼ) = p.ⱼ
is called the marginal probability function of Y.

For a Joint Distribution: Conditional Probabilities
p(xᵢ | yⱼ) = p(xᵢ, yⱼ)/p.ⱼ = pᵢⱼ/p.ⱼ
is called the conditional probability function of X, given Y = yⱼ, and
p(yⱼ | xᵢ) = p(xᵢ, yⱼ)/pᵢ. = pᵢⱼ/pᵢ.
is called the conditional probability function of Y, given X = xᵢ. Here pᵢ. = Σⱼ pᵢⱼ and p.ⱼ = Σᵢ pᵢⱼ.
For a joint distribution, the random variables X and Y are independent if
P(X = xᵢ ∩ Y = yⱼ) = P(X = xᵢ) · P(Y = yⱼ) for all i, j.
If this is not so, then X and Y are called dependent.

Independent Random Variables in Terms of pdf's
Let X and Y have joint pdf f_{XY}(x, y) and marginal pdf's f₁(x) and f₂(y), respectively. They are said to be stochastically independent if
f_{XY}(x, y) = f₁(x) · f₂(y).
In terms of distribution functions, X and Y are stochastically independent if and only if their joint distribution function F_{XY}(x, y) is the product of their marginal distribution functions F₁(x) and F₂(y). Two variables which are not stochastically independent are said to be stochastically dependent.

Joint Probability Distribution Function for Continuous Random Variables
Let X and Y be two random variables; then (X, Y) is a two dimensional random variable. Its joint distribution function is denoted by F(x, y) and is defined as the probability that simultaneously the observation (X, Y) has the property (X ≤ x, Y ≤ y), i.e.,
F(x, y) = P(X ≤ x, Y ≤ y).
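A minimal Python sketch (the joint table is hypothetical) shows how the marginal and conditional probability functions are extracted from a joint pmf table and how independence is checked cell by cell.

```python
from fractions import Fraction as F

# Hypothetical joint pmf p(x_i, y_j) for X in {0, 1}, Y in {0, 1, 2}
joint = {(0, 0): F(1, 12), (0, 1): F(2, 12), (0, 2): F(3, 12),
         (1, 0): F(1, 12), (1, 1): F(2, 12), (1, 2): F(3, 12)}
assert sum(joint.values()) == 1

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

marg_x = {x: sum(joint[x, y] for y in ys) for x in xs}     # p_i.
marg_y = {y: sum(joint[x, y] for x in xs) for y in ys}     # p_.j

# Conditional probability function of X given Y = y_j
cond_x_given_y = {(x, y): joint[x, y] / marg_y[y] for x in xs for y in ys}

independent = all(joint[x, y] == marg_x[x] * marg_y[y] for x in xs for y in ys)
print(marg_x, marg_y)
print(independent)        # True for this table: the joint pmf factorises
```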
