= 22° About the Author Prof. T. Veerarajam, Professor of Mathematics is currently heading the Department of Science and Humanities, Sree Sowdambika College of Gold Medalist from Madras 42 years of teachiig experience at undergraduate and postgraduate levels in ‘hed engineering colleges in Tamil Nadu including Anna Engineering Mathematics, 20 (F *[ISBN: 0-07.053483~ ISBN: 0-07-053489-6] ‘Numerical Methods: 2E [ISBN: 0-07-060161-5] Propasitity, Statistics AND Ranpom Processes Second Edition T Veerarajan New Delhi New York St Lowis' San Francisco Auckland Bogotd Caracas ‘Kualo Lumpur Lisbon London . Madrid Mexico City Milan: Montreal San Juan Santiago Singapore Sytney Tokyo Toronto Tnfermation contained in this work bas boon obiaincd by Tata MeGraw Hi, om | tr thet profesional services. Irsuch services are vate professional should be sought. mM E==sll Tata McGraw-Hill © 2006, Tata McGraw-Hill Publishing Company Limited Fifth reprint 2008 RYLDRRQXRCRDD ‘No part ofthis publication may bé reproduced or distributed in any form or by any ‘means, eleconie, mechanical, photocopying, recording, or otherwise or stored in a Setabase or retrieval eystem, without the prior written permission of the publishers. ‘The program listings (if any) may be entered, stored and executed in a computer system, but they may not be reproduced for Publi ing Company Limited, 978.0.07-060170-4 1: 0-07-0601703 Published by the Tata MeGrav-f 7 West Patel Nagar, fypeset at Pashchim Vihar, New Delhi 110 063 and pt [No/310, Gandhi Sueet, Kotvakkam. Old Mahabalipuram Road, (Chennai 600 096 (Cover: Pint Shop Pvt. Ld. Contents Preface to the Second Edition Preface t0 the First Edition Syllabus Uc Probability Theory wt RandomEsperinest, 1 tical or Apriori Definition of Probability, 1 tic Definition of Probability 1.2 Conditionat Probability © 1.4 __ Independent Events /.5 \ Worked Example 1 (A). 1.6 Exercise A) 1.14 ‘Theorem of Total Probability 4.19 Baye’s Theorem or Theorem of Probability of Causes 1.19 Exercise 1(C) 1.29 , Answers 131 BAFandora Variables i 2a Discrete Random Variable 2. ‘Cumulative Distribution Function (cdf) 2.3 Properties of the cdf F(x) 2.3 Special Distributions 2.3 I wi Contents 2 Z Content 4 Discrete Distributions 2.4 ‘Standard Error of Fatimate of ¥) 4.37 Continuous Distributions 2.4 . Worked Example 4{C) 438 Worked Example 2(4) 2.5 Exercise 4(C) Exercise (A) 2.20 Characteristic Function 4.48 ‘Two-Dimensional Random Variables 2.23 Probability Function of (X,Y) 2.24 “Joint Probability Density Function 224 ‘Cumulant Generating Function (CGF). 4.51 Cummlative Distribution Function 2.24 Joint Characteristic Function 4.52 Properties of F(x, y) 2. Worked Example 4(D) 4.52 Marginal Probability Conditional Probability Distribution 2.26 Independent RVs 2.26 Random Vectors 2.27 Bienayme’s Inequality 4.64 Worked Example 2(B) 2.28 Schwartz Inequality 4.65 Marginal Probability Distribution of 2.31 Cauchy-Schwartz Inequality 4.65 ‘Marginal Probability Distribution of ¥: 231 Worked Example 4(E). 4.65 Exercise XB) 2.43 Exercise 4E) 4.71 Answers 246 ‘Convergence Concepts and Central Limit Theorem 4.72 Central Limit Theorem (Liapounoff's Form) 4.73 3. Functions of Random Variables 3a Central Limit Theorem (Lindeberg-Levy's Form) 4.73 Function of One Random Variable 3. Worked Example 4(F) 4.73 How to Find £0), When f,00 is Known 3.1 Exercise MF) 4.78 One Function of Two Random Variables 3.2 Answers 4.79 ‘Two Functions of Two Random Variables 3.5 | 4 ; papain Grade ntae =e ee 5. 
“Some Special Probability Distributions 5a Worked Example 3. 3.6 : Tnroduetion 5.1 Exercise 33.23 Special Discrete Distributions 5. Answers 3.27 ‘Mean and Variance of the Binomial Distribution . 5.2 Recusrence Formula for the Central Moments ofthe LA. Statistical Averages 4a ‘Binomial Distribution 5.3 Expected Values of a Two-Dimensional RV 4.2 Poisson Distribution as Limiting Form of Binomial Distribution 54 Properties of Expected Values 4.3 Mean and Variance of Poisson Distribution 5.6 Conditional Expected Values 4.4 ‘Recurrence Formula forthe Centcal Moments ofthe Properties 4.4 Poisson Distribution. 5.6 Worked Example 4A) 46 Mean and Variance of Geometric Distribution. 5.8 - Esercise MA) 4.14 i ‘Mean and Variance of Hypergeometrc Distribution 5.9 Linear Correlation 4.17 . | Binomial Distribution as Limiting Form of Conelation Coefficient 478 . : HypergeometricDistribution 5.1 fone Properties of Conon oetiient 429 M.GEF. of the Negative Binomial Distribution 5.12 Worked Example 5(A) 5.14 Worked Example 4B) 4.22 Exercise 5(A) 5.30 Exercise 4B) 433 . Special Continuous Distributions 5.36 ‘Moments of the Uniform Distribution U (a,b) 5.37 Regression 4.35, : ‘Mean an Variance of the Exponential Distribution 5.38 Equation of the Regression Line of YonX 4.36 Contents ‘Memoryless Property of the Exponential Distribution 5.39 ‘Meap and Variance of Exta: g Distsibution » 5.40 Reprodiictive Property of Gam Relation Between the Distrib slang Distbuton With 2= isson Distribution) Density uation ofthe Wei ‘Mean and Variance of Weibull Di Standard Normal Distribation 5.44 Normal Probability Curve 5.44 Properties of the Normal Distribution Nt, 0) 5.45 Importance of Normal Distribution 5.53 Worked Example 5(B) 5.53 Exercise 5(B) 5.78 Answeres 5,82 <6. Random Processes 6a ~ Classification of Random Processes 6.2 Stationarity 6.4 Example of a SSS Process 6.5 ‘Wiener Process as Limiting orm of Random Walk 6.15 Exercise 6(A) 6.19 Autocorrelation Function and its Properties 6.22 Properties of R(t) 6.22 Cross-Correlation Function and its Properties 6.24 Properties 6.24 Exgodicity 6.24 Moun-Exgodic Process 6.25 ‘Mean-Ergodic Theorem 6.25 Correlation Ergodic Process - 6.26 Distribution Ergodic Process 6.26 Worked Example 6(B) 6:27 Exercise 6(B) 6.34 Power Spectral Density Function 6.36 Propenies of Power Spectral Density Function 6.37 ‘System in the Form of Convolation 6.42 Unit Impulse Response of the System 6.42 Properties 6.42 Worked Example 6(C) 6.47 Exercise 6(C) 6.57 Answers 6.61 Contents “7. Special Random Processes Definition of a Gaussian Process 7.1 Procestes Depending on Stationary Gaussian Process 7.6 Quadrature Representation of a WSS Process 7.14 ‘Noise in Communication Systems 7.16 Thermal Noise 7.17 Filters 7.18 Worked Example (A) 7.19 Exercise 7A) Mean and Autocorrelation of the Poisson Process 7.35 Properties of Poisson Process. 
7.36 Worked Example 7(B) 7.38 Exercise 7B) 7.43 Markov Process 7.45 Definition of a Markov Chain 7.46 Chapman-Kolmogorev Theorem 7.47 Classification of States of a Markov Chain 7.49 ‘Mean and Variance of the Population Size in a Linear Birth and Death Process 7.54 Pure Birth Process 7.55 Renewal Equation 7.57 Poisson Process 2s 3 Renewal Process 7.58 Worked Example 7(C) 7.60 Exercise 1C) 7.69 Answers 7.72 5.” Queueing Theory Symbolic Representation of a Queueing Model 8.3 Difference Equations Reiaedo Poisson Queue Systems 8.3 1A 81 Contents Values of P, and P, for Poisson Queue Systems 8.4 Characteristics of Infinite Capacity, Single Server Poisson ‘Qicue Model I [M/M/1): (co/FIFO) Model], when 2, = A-and we ue) 8S, Relations Among E(W,), E(N,), E(W,) and EQV,) 8.10 Characteristics of Infinite Capacity, Multiple Server Poisson ‘Queue Model II [M/M/s): («/FIFO) Model], When 2., = A. for all n(Q< su) 8.10 Characteristics of Finite Capacity, Single Server Poisson ‘Queue Model IT [(M/M/1); (K/FIFO) Model] 8.15. ‘Queue Model TV [(M/M/s): (K/FIFO) Model} 8.18 Non-Markovian Queueing Model V [(MIG/1) : (#GD) Model] 8.21 Worked Example $8.23 Exercise 8 8.54 Answers 8.60 UM Tests of Hypotheses on Introduction 9.1 Parameters and Statistics. 9.7 Sampling Distribution 9.2 Estimation and Testing of Hypotheses 9.2 Tests of Hypotheses and Tests of Significance 9.2 Critical Region and Level of ee 93 Errors in Hypotheses Testing 9.4 ‘One-Tailed and Two- Critical Values or Significant Values 9.5 Procedure for Testing of Hypothesis 9.6 Interval Estimation of Population Parameters 9.6 ‘Tests of Significance for Large Samples 9.7 Test 1 9.7 Test? 9.8 Test 3 9.8 Test4 9.9 Worked Example 9A) 9.12 Exercise A) 9.24 ‘Tests of Significance for Smafi Samples 9.30 Student's Distribution . 9.30 Properties of t-Distribution 9.30 Uses of Distribution 9.30 Critical Values of rand the -Table 9.31 Test 9.32 Test 9.33 Snedecor’s F-Distribution 9.34 Contents Properties of the F-Distribution 9.34 Use of F-Distribution 9.35 Worked Example 9(B) 9.36 Exercise {B) 9.47 Chi-Square Distribution 9,49 Properties of 7*Distribation 9.50 Uses of z7-Distribution 9.50 3f-Test of Goodness of Fit 9.50 Conditions forthe Validity of "Test 9.51 Test of Independence of Attributes 9.51 Worked Example C) 9.52 Exercise (C) 9.66 Answers 9.9 ee Design of Experiments ‘Aim ofthe Design of Experiments 10.1 Basic Principles of Experimental Design 10.2 Some Basic Designs of Experiment 10.2 1. Completely Randomised Design(CR.D.) 10.2 ‘Analysis of Variance (ANOVA). 10.3 ‘Analysis of Variance for One Factor of Classification 10.3 2. Randomised Block Design RBD.) 10.6 3. Latin Square Design (L-S.D.) 10.9 “Comparison of RBD and LSD 10.11 ‘Hote on Simplification of Computational Work 10.11 Worked Example 10 10.11 Exercise 10 10.25 Answers 10.32 Appendix A Statistical Tables Index Al cot Preface to the Second Edition am deeply gratified by the enthusiastic response shown tothe first edition of my ook “Probability, Statistics and Random Processes” (Ascent series) by the stu- dents and teachers throughout Tamil Nadu. 
‘AS Anna University, Chennai has revised the syllabus in “Probability, Statistics and Random Processes” recently, I have revised this book thoroughly 80 as to cover the revised Anna University syllabi for the CSE, ECE and IT branches, ‘Chapter IX on “Rel Hypothesis" and Chapter as per Anna University of all the branches for: style of presentation of the theory, worked exampl Thope that the book will be received by both the faculty and students, as, centhusiasticaily ae the previous edition of the book and my other books. Critical evaluation and suggestions for further improvement of the book will be highly appreciated and acknowledged. Lam extremely grateful to the Management, Chockalingapuram Devangar Varthaga Sangam, Aruppukottai, which has sponsored Sree Sowdambika Engi- neering College, Ar ‘which { am presently working for the support extended to me in T wish to expr ‘Sowdambika Engineering College for the appreciative interest shown and con- stant encouragement giver to me while revising this book. vy Engineering” has been replaced by “Tests of thanks to Prof. Jagan Mohan, Principal, Sree TT, Veerarasan Preface to the First Edition ‘This book conforms to the syllabi of the Probability and Queueing Theory paper of computer science, the Random Processes paper of Electronics ad ‘Communication and ‘the Probability and Statistics paper of Information ‘Technology streams of engineering at Ansia Uni difficult. This is due jequate understanding of the bi probability theory and the wrong impression that the subjec branch of Mathematics. ‘The book is writen i the subject and may fi proofs of theorems are explained in as lucid a manner as possit ich a manner that beginners may develop an interestin wseful to pursue their studies. Basic concepts and Aloe te ‘working of prob communication ‘become personally involved in solving exercises. they cannot really develop understanding and appreciation of the ideas and a familiarity with the pertinent techniques. Hence, in addition to a large number of short-answer quéstions under Part-A, over 350 problems have been given under Part-B of the Exercises in vorious chapters. Answers are provided at the end of every chapter. ‘Though Chapters 7 and 8 are meant for Blectrical/Blectronics Eagineering seudents, the other chapters that deal with probability theory, random variables, probability distributions and statistics will be wseful to the students of other disciplines of engineering as well as those doing MCA and M.Se courses. Tam sure that the students and the faculty will find this book very useful. (Critical evaluation and suggestions for improvement ofthe book will be highly appreciated and gratefully acknowledged. wi Preface 10 the First Edition Lam extremely grateful to Dr K V Kuppusamy, Chairman, and Mr K Senthil Games, Managing Tistes, RVS Bdeatonal Trust, Ding, for the support ing and Technology, Dindigul, for the appreciative interest shown and constant encoaragerwent given to me while writing this book. Tam thankful to my publishers, Tata McGraw-Hill Publishing Company Limited, New Dethi, for their painstaking efforts and cooperation in bringing out this book in a short span of time. I would also like to thank Dr. A Rangan, Professor Department of for reviewing and providing valveble suggestions during the developmental stages of the book. Thave great pleasure in dedicating this book to my beloved students, past and + present, T. 
Veeeamuan Syllabus MAI252_ PROBABILITY AND QUEUEING THEORY Unit | Probability and Random Variable 943 Axioms of probability - Conditional probability -‘Tatal probability - Baye's theo- ‘rem Raindom variable - Probability mass function - Probability density function - Properties - Moments - Moment generating functions and their properties. Unit It Standard Distributions 943 Binomial, Poisson, Geomeric, Neative Binomial, Uniform, Exponential, Gamma, Weibsll snd Normal ebutions and Der properties Functions of @ random variable. Unit IIL Two Dimensional Random Variables o+3 Joint distributions - Marginal and conditional distributions - Covariance - Corre- lation and regression - Transformation of random variables ~ Centra limit theo- rem, Unit IV Random Processes and Markov Chains 943 Clasiication -Stalionary process - Markov process and death process - Markoy chain's Transition probabilities - Limiting distribu- tiois. - * Unit V Queueing Theory 943 ‘Markovian models - M/M/1, M/MIC, finite and infinite capacity - M/M/*- Fi- ajte source model - M/G/I queue (steady state solutions only) ~ Poliaczek - Khintchine formulla- Special cases. ‘Tutorial 1s ail Syllabus MAI254 RANDOM PROCESSES Unit 1 Probability and Random Variable o+3 Axioms of probability - Conditional probability - Total probs rem - Random variable - Probability mass function - Probs tions - Properties - Moments - Moment generating functions and their properties. Unit Il. Standard Distributions 943 Binomial, Poisson, Geometric, Negative Binomial, Uniform, Exponential, Gamma, Weibull and Normal distributions and their properties - Functions of a sandom variable, Unit II Two Dimensional Random Variables 943 {nal and conditional distributions - Covariance - Corre- 'ransformation of random variables ~ Central limit theo- O43 first order, second order, strictly stationary, wide — jonary and Ergodic process ~ Markov process - Binomial, Poisson and Normal processes - Sine wave process. Unit V_ Correlation and Spectral Densities o+3 ine relation ~ Relationship be- function - Linear time invari- with random inputs ~ auto 3d output. 1s MAI253 PROBABILITY AND STATISTICS Unit | Probability and Random Variable o+3 Bayes theo- lensity fune- Standard Distributions 943 fe Binomial, Uniform, Exponential, Syllabus x Unit Il Two Dimensional Random Variables o+ ‘Joint distributions - Marginal and conditional distributions - Covariance - Co relation and Regression - Transformation of random variables - Central lim theorem. Unit Vv Testing of Hypothesis o+ ‘Sampling distributions - Testing of hypothesis for mean, variance, proportior and differences using Normal, t, Chi-square and F distributions - Tests for ind« pendence of attributes and Goodness of fit. Unit V__ Design of Experiments o+ Analysis of variance One way classification - CRD - Two - way classificatio = RBD - Latin square, Tutorial 1 Chapter 1 Probability Theory ing the important features of these random experiments, Random Experiment ‘An experiment whose outcome or result can be predicted with certainty is called a deterministic experiment. For cxample, ifthe potential difference E ‘between the two ends of a conductor and the resistance R ase known, the current {flowing inthe conductor is uniquely determined by Ohms law, 1= =. “ Although all possible outcomes of an experiment may be known in advance, the outcome of a particular perf of the experiment cannot be predicted owing to a numbes of unknown, ‘Such an experiment is called a random Mathematical or Apriori Definition of Probability. 
Let S be the sample space (the set of all possible, equally likely outcomes) associated with a random experiment and A be an event (a subset of S). Let n(S) and n(A) be the number of elements of S and A. Then the probability of the event A occurring, denoted as P(A), is defined by

P(A) = n(A)/n(S) = Number of cases favourable to A / Exhaustive number of cases in S

For example, the probability of getting an even number in the die tossing experiment is 0.5, as S = {1, 2, 3, 4, 5, 6}, E = {2, 4, 6}, n(S) = 6 and n(E) = 3.

Statistical or Aposteriori Definition of Probability

Let a random experiment be repeated n times and let an event A occur nA times out of the n trials. The ratio nA/n is called the relative frequency of the event A. As n increases, nA/n shows a tendency to stabilise and to approach a constant value. This value, denoted by P(A), is called the probability of the event A, i.e.,

P(A) = lim (n → ∞) nA/n

For example, if we want to find the probability that a spare part produced by a machine is defective, we study the record of defective items produced by the machine over a considerable period of time. If, out of 10,000 items produced, 500 are defective, it is assumed that the probability of a defective item is 0.05.

From both the definitions, it is obvious that 0 ≤ P(A) ≤ 1. If A is an impossible event, P(A) = 0. Conversely, if P(A) = 0, then A can occur only in a very small percentage of times in the long run. On the other hand, if A is a certain event, P(A) = 1. Conversely, if P(A) = 1, then A may fail to occur only in a very small percentage of times in the long run.

Axiomatic Definition of Probability

Let S be the sample space and A be an event associated with a random experiment. Then the probability of the event A, denoted by P(A), is defined as a real number satisfying the following axioms:
(i) 0 ≤ P(A) ≤ 1,
(ii) P(S) = 1, and
(iii) if A1, A2, ... is a finite or countably infinite sequence of mutually exclusive events, then P(A1 ∪ A2 ∪ ...) = P(A1) + P(A2) + ...

A set of events is said to be mutually exclusive if the occurrence of any one of them excludes the occurrence of the others. Two events A and B are mutually exclusive if, whenever A occurs, B does not occur and vice versa. In other words, A and B cannot occur simultaneously, i.e., P(A ∩ B) = 0.
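The aposteriori (relative frequency) definition can be checked empirically with a short simulation. The following sketch is not from the text; it is a minimal Python illustration, assuming a fair six-sided die, that tracks the relative frequency of the event "an even number shows" as the number of tosses grows:

```python
import random

def relative_frequency(trials, seed=1):
    """Estimate P(an even face) for a fair die by relative frequency."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        face = rng.randint(1, 6)   # one toss of a fair die
        if face % 2 == 0:          # event A: an even number shows
            hits += 1
    return hits / trials

for n in (100, 10_000, 1_000_000):
    print(n, round(relative_frequency(n), 4))
# The printed ratios drift toward 0.5 as n grows, agreeing with the
# classical value P(A) = n(A)/n(S) = 3/6.
```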
When the axiomatic definition is adopted, the following elementary results are derived directly, using only the axioms of probability, as can be seen from the theorems given below.

Theorem 1
The probability of the impossible event is zero, i.e., if φ is the subset (event) containing no sample point, P(φ) = 0.

Proof
The certain event S and the impossible event φ are mutually exclusive.
Hence P(S ∪ φ) = P(S) + P(φ)   [Axiom (iii)]
But S ∪ φ = S. ∴ P(S) = P(S) + P(φ)
∴ P(φ) = 0

Theorem 2
If Ā is the complementary event of A, P(Ā) = 1 − P(A) ≤ 1.

Proof
A and Ā are mutually exclusive events, such that A ∪ Ā = S.
∴ P(A ∪ Ā) = P(S) = 1   [Axiom (ii)]
i.e., P(A) + P(Ā) = 1   [Axiom (iii)]
∴ P(Ā) = 1 − P(A)
Since P(A) ≥ 0, it follows that P(Ā) ≤ 1.

Theorem 3
If A and B are any 2 events, P(A ∪ B) = P(A) + P(B) − P(A ∩ B) ≤ P(A) + P(B).

Proof
A is the union of the mutually exclusive events A ∩ B̄ and A ∩ B, and B is the union of the mutually exclusive events Ā ∩ B and A ∩ B.
∴ P(A) = P(A ∩ B̄) + P(A ∩ B) and P(B) = P(Ā ∩ B) + P(A ∩ B)   [Axiom (iii)]
∴ P(A) + P(B) = [P(A ∩ B̄) + P(A ∩ B) + P(Ā ∩ B)] + P(A ∩ B)
= P(A ∪ B) + P(A ∩ B)
The result follows. Clearly, P(A ∪ B) = P(A) + P(B) − P(A ∩ B) ≤ P(A) + P(B).

Theorem 4
If B ⊂ A, P(B) ≤ P(A).

Proof
B and A ∩ B̄ are mutually exclusive events such that B ∪ (A ∩ B̄) = A.
∴ P(B) + P(A ∩ B̄) = P(A)   [Axiom (iii)]
∴ P(B) ≤ P(A)

In probability theory developed using the classical definition of probability, Theorem 3 above is termed the addition theorem of probability as applied to any 2 events. The theorem can be extended to any 3 events A, B and C as follows:
P(A ∪ B ∪ C) = P(at least one of A, B and C occurs)
= P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C)
In the classical approach, the result P(A ∪ B) = P(A) + P(B) is termed the addition theorem of probability as applied to 2 mutually exclusive events, which is proved in the following way.
Let the total number of cases (outcomes) be n, of which n1 are favourable to the event A and n2 are favourable to the event B. Therefore the number of cases favourable to A or B, i.e., to A ∪ B, is (n1 + n2), since the events A and B are disjoint.
∴ P(A ∪ B) = (n1 + n2)/n = n1/n + n2/n = P(A) + P(B)

Conditional Probability
The conditional probability of an event B, assuming that the event A has happened, is denoted by P(B/A) and defined as
P(B/A) = P(A ∩ B)/P(A), provided P(A) ≠ 0
For example, when a fair die is tossed, the conditional probability of getting '1', given that an odd number has been obtained, is equal to 1/3, as explained below:
S = {1, 2, 3, 4, 5, 6}, A = {1, 3, 5}, B = {1}
P(B/A) = n(A ∩ B)/n(A) = 1/3
As per the definition given above, P(B/A) = P(A ∩ B)/P(A) = (1/6)/(1/2) = 1/3.
Rewriting the definition of conditional probability, we get P(A ∩ B) = P(A) × P(B/A). This is sometimes referred to as the product theorem of probability, which is proved as follows.
Let n1 and n12 be the numbers of cases favourable to the events A and A ∩ B, out of the total number n of cases.
∴ P(A ∩ B) = n12/n = (n1/n) × (n12/n1) = P(A) × P(B/A)
The product theorem can be extended to 3 events A, B and C as follows:
P(A ∩ B ∩ C) = P(A) × P(B/A) × P(C/A and B)
The following properties are easily deduced from the definition of conditional probability:
1. If A ⊂ B, P(B/A) = 1, since A ∩ B = A.
2. If B ⊂ A, P(B/A) ≥ P(B), since A ∩ B = B and P(B/A) = P(B)/P(A) ≥ P(B), as P(A) ≤ P(S) = 1.
3. If A and B are mutually exclusive events, P(B/A) = 0, since P(A ∩ B) = 0.
4. If P(A) > P(B), then P(A/B) > P(B/A). (MU - Apr. 96)
5. If A1 ⊂ A2, then P(A1/B) ≤ P(A2/B). (BU - Apr. 96)

Independent Events
A set of events is said to be independent if the occurrence of any one of them does not depend on the occurrence or non-occurrence of the others.
When 2 events A and B are independent, it is obvious from the definition that P(B/A) = P(B). If the events A and B are independent, the product theorem takes the form P(A ∩ B) = P(A) × P(B). Conversely, if P(A ∩ B) = P(A) × P(B), the events A and B are said to be independent (pairwise independent).
The product theorem can be extended to any number of independent events: if A1, A2, ..., An are n independent events, then P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) × P(A2) × ... × P(An). When this condition is satisfied, the events A1, A2, ..., An are also said to be totally independent. A set of events A1, A2, ..., An is said to be mutually independent if the events are totally independent when considered in sets of 2, 3, ..., n events. In the case of more than 2 events, the term 'independence' is taken as 'total independence' unless specified otherwise.

Theorem 1
If the events A and B are independent, the events Ā and B (and similarly A and B̄) are also independent.

Proof
The events A ∩ B and Ā ∩ B are mutually exclusive, such that (A ∩ B) ∪ (Ā ∩ B) = B.
∴ P(A ∩ B) + P(Ā ∩ B) = P(B)   (by addition theorem)
∴ P(Ā ∩ B) = P(B) − P(A ∩ B)
= P(B) − P(A) P(B)   (by product theorem)
= P(B) (1 − P(A)) = P(Ā) P(B)

Theorem 2
If the events A and B are independent, then so are Ā and B̄. (MU - Apr. 96)
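As a numerical spot-check of the product theorem, the conditional probability definition and the complement results of Theorems 1 and 2, the sketch below enumerates the 36 equally likely outcomes of two fair dice. The events A and B used here are hypothetical illustrations, not taken from the text:

```python
from fractions import Fraction
from itertools import product

# Sample space for two independent tosses of a fair die.
S = list(product(range(1, 7), repeat=2))

def prob(event):
    """Classical probability: favourable cases / exhaustive cases."""
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] % 2 == 0          # first die shows an even number
B = lambda s: s[1] % 2 == 1          # second die shows an odd number
AB = lambda s: A(s) and B(s)

# Product theorem for independent events: P(A ∩ B) = P(A) P(B)
assert prob(AB) == prob(A) * prob(B)

# Conditional probability: P(B/A) = P(A ∩ B) / P(A), which equals P(B) here
assert prob(AB) / prob(A) == prob(B)

# Theorems 1 and 2: the complements inherit independence
notA = lambda s: not A(s)
notB = lambda s: not B(s)
assert prob(lambda s: notA(s) and B(s)) == prob(notA) * prob(B)
assert prob(lambda s: notA(s) and notB(s)) == prob(notA) * prob(notB)
print("product rule and complement independence verified")
```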
9) a addition theorem) ince A and B are independent) @) ‘From (1) and (2); if lls that when tbe ents A and B are independ, PAU B) = 1=P(A) xP (3). Example ¢————_— A fair coin is tossed 4 times. Define the sample space corresponding to this ran- dom experiment, Also give the subsets corresponding tothe following events and find the respective probabilities: Probability Theory 47 (a). More heads than tails are obtained. (®) Tails occur on the even numbered tosses. ‘= (HHHH, HBHT, HHTH, HNTT, HTHH, ATHT, HITH, HTT, ‘THHH, THHT, THTH, THTT, TTHH, TTHT, TTTH, TTT) (a) Let be the event in which more heads occur than tail, ‘Then A= (HHH, HHHT, HHTH, HTHH, THHH) (b) Let be the event in which tails occur in the second and fourth tosses. Then = B = (HTHT, HITT, TTT, TTT) ‘There are 4 letters and 4 addressed envelopes. If the letters are placed in the envelopes at random, find the probability none of the letters is in the correct envelope writing the sample the event spaces, ied by A, B, Cand D and the corresponding letters. by a,b, cand d, a cenivelope. : We note that Fis the complement of £. Therefore E, consists of all the elements of $ except those in E oes 5 = 223 and PE) =1- PE) = 5 gy ng MED 1- P= 18 Probability, Statistics and Random Processes Probability Theory 19, Example 3 ‘with minor defects and 2-with random (without replace ‘A lot consists of 10 good article both articles are drawn simultaneously, as they are drawn without replacement, G@ P(both are good) = NO. of ways drawing 2 good articles Total no. of ways of drawing 2 articles 10G, 16¢, I 8 _ No. of ways of drawing 2 articles with major defects Total n0. of ways (iii) P(at least 1 is good) = Plexactly 1 is good or both are good) = Plexactly‘1is good and 1 is bad or both ae good) 1106, x60, +100," 7 ae cease] Gv) Platmost 1 is good) = P(none is good oF 1 is good and 1 is bad) = 10Gy x 6C, +106, «6G, Ss 16C, “8 ‘200d and 1 is bad) (vi) Plncitherhas major defects) = P(both are non-major defective articles) = HG _ 91 “16G, 120 (vii). P(neither is good) = P(both are defective), = SG ed “6c, 8 Example 4 From 6 positive iand.8 negative numbers, 4 numbers are chosen af random (without replacement) and multiplied. What isthe probability that the product is positive? of them must be positive and the other 2 ‘No. of ways of choosing 4 positive numbers = 6C, = 15. No, of ways of choosing 4 negative numbers = 8C, = 70. No. of ways of choosing 2 positive and 2 negative numbers = 6C, x 8C, = 420, Total no. of ways of choosing 4 numbers from all the 14 numbers = 4c, (the product is pos 1001. ) = No. of ways by which the product is positve Total no. of ways = 154704420 _ 505 wot 1001 =————— Example § = ‘A box contains tags marked 1,2, .., 9. Two tags ate chosen at random without Feplacemeat. Find the probability thatthe numbers. on the tags wil be conseeu- tive integers, TF the numbers on the tags are t be consecutive integers, they mustbe chosen as «par from the following patie @ 3); 3, 4); ...3 (n= 1, m) of ways of choosing: any one pair from the above (n ~ 1) pairs-= Gen. oof ways of choosing 2 tags from the n tags = nC. n=l 2 ma-D a + Required probability = “The first biscuit can be given to any one of the m children, Similarly the second biscuit can be given inm ways. ‘Therefore 2 biscuits can be given in m? ways. 1.30 Probability, Statistics and Random Processes Extending, n biscuits can be distributed in m" ways. Ther biscuits received by the particular child can be chosen from the m biscuits in nC, ways. 
If this ehild has got r biscuits, the remaining (n — 1) biscuits can be distributed among the ‘remaining (m— ildren in (m ~ 1)"~' ways. Renae ef ng re ee aoe ecm te" eid pba = 2ST —— Example 2? rg P(A) = P(B) = P(AB), show that P(AB + AB) = 0 [AB=A 7B}. By ation earem, Paw B= MA) Fa) PA From the Venn diagram on page (1.3), itis clear that AUB=AB+AB+AB P(A UB) = PAB) + P(AB) + P(AB) (by probability axiom) (2) Using the given condition in (1), PU B)= PAB) @ From (2) and (3), PAB) + P(AB) = 0 —————— Example 8 ————————— If A, B and € are any 3 events such that P(A) By = PBAC)=0; (EMA) = 18 Pind the prot ast 1 ofthe events A.B and C occur, Plat least one of A, B and C occurs) = P(A UBU C) PUA U BU C) = P(A) + PUB) + P(C)~ PUA B) =PBAC)= PCNA) + PAB OC) wo Since P(A AB) = PB A.C) = 0, PABA C)=0. Equation (1) becomes =3-0-0-4 5.5 PAL BUC)=3-0-0-523 Example 9 ———————— Solve Example 5, if the tags are chosen at random with replacement, If the tag with ‘1" is chosen in the first draw, the tag with ‘2' must be chosen in the second draw. Probability for each = I/n. Probability Theory Lan first draw and °2” in the second draw) = 1/n? (Product theorem) ” in the second draw = Un the number drawn second may be ‘I’ or ‘3°. mnsecutive numbers in this case ly, when the first number drawn is ‘3°40 "Ga—1) probability of 1g consecutive numbers will be in All the above possibilities are mutually exclusive. = Required probability = > 2258 tea x5 2d —— Example 10——————— A box contains 4 bad and 6 good tubes. Two are drawn out from the box at a time. One of thems tested and found to be good. What isthe probability that the other one is also good? Let = one of the tubes drawn is good and B = the other tube is good, P(A. B) = Plboth tubes drawn are good) -$4.1 “joc, 3 Snowing that one tube is good, the conditional probability thatthe other tube is also good is required, i., P(B/A) is required By definition, P(ANB)_ ee PA). 6K =——————— Example 14«-——$—_——— ‘Two defective tubes get mixed up with 2 g00d ones. The tubes are tested, one by one, until both defectives are found. What is the probability that the last defec- ) the second test, (ji) the third test and (iff) the fourth ive and N represent tion-defectiye tubs. (@ P(Second D in the I test) = PC in the I test and D in the I test) 1az Probability, Statistics and Randem Processes Gi) P(sccond D in the HT test) = PUD, A No. Ds of Ny VD; ™D = PCD, WN, Ds) + PON, D2 Ds) ie aber Pilg 2a dad 3 24a? Gil) P(econd D in the IV test) = PD, 0 Ng ONs 0 Dy) + PON, OD, Os AD) PON AN; Ds DD 2 Dahnte2e2adare2x2ador ie eens gse2 — a —- Example 12: : Ina shooting test, the probability of hitting the target is 1/2 for A, 2/3 for Band Bid for Cal of them fre asthe arget, find the probability that @) none of hem hits the target and (i) at least one of them hits the target. Let A= Event of A hitting the target, and soon. ay =3,p(5)=2,p(@)=1 P(A) = 5. PLB) = 5 P(E) = P(A) x P(B) x P(E) (by independence) Ayt,ted 23a 424 P(at least one hits the target PA nBOC) = 1 — P(aone hits the target) jie ~ 24°24 ————$——— Example 13 temately throw a pair of dice-A wins if he throws 6 before B throws 7 if he throws 7 before A throws 6. If begins, show that his chance of 30761. (BU — Apr. 96) ‘Throwing 6 with 2 dice = Getting 6 as the sum of the numbers shown on the upper faces of the 2 dice. = POhrowing 6 with 2 dice) = P(enrowing 7 with 2 dice) = 4 Probability Theory —_ 133 Leta =Event of A trowing 6. Let B = Event of Bteowing 7 A plays inthe fist thicd fifth. tal. 
Therefore A wil hhe throws 6 in the first trial or third tril or in sub- ‘sequent (odd) trial: P(A wins) = P(A or A Aor AB AB Aor...) = P(A) + PA B A) + P(A B AB a) +... dition theorem) 5 315) 5 31, sy) 5 ah e(x 3) 5 4 (Bx SP xD 4 upto 235 “(8% 3) 5° Sg) ae 5/36 ; = [—assyagy tine the series isan infinite geometric series) = 30 6t Example 14 ‘Show that 2"— (n+ 1) equations are needed to establish the mutual independence ofnevents. 7m events are mutually independent, if they are t considered in sets of 2, 3, .... nevents, Sets of r events can be chosen feam the n events in nC, ways. ‘To establish total independence of r event ly independent when ‘Therefore to establish total independence ofall the nC, sets, each of revents, weneed nC, equations, ‘Therefore the number of equations required to establish mutual independence = 0G, Eig + nC + nytt MC )=dew) + SCF IP 3) =P) ; —_——— Example 15 ae fair dice are thrown independently. Thee events A, B and C are defined as (2) Odd face with the first die (b) Odd face with the second die a sad __ Probability, ss and Random Processes ) Spm ofthe aunties inthe 2 dice sod. Are the events A,B and C (© aly independent Baku Ape 9D ‘The outcothes favourable to the event C are,(1, 2), (1, 4), (6), @, 5) and so on. a ai PO=3 Panb=Re00=RAnC=+ P(A 9 B) = P(A) PCB), and so on But P(A VB NC) =0, since C cannot happen when A and B occur. Therefore BAC) oma) =o) xP) Pca inc evens are pervs independent, but ot mutually independent, = Example 16 7 If A, B and C are random subsets (events) in a sample space and if they are pairwise independent and A is independent of (B UC), prove that, B and C are ‘mutually independent. (MU — Nov. 96) Given: P(AB)= P(A) x PCB) ¢ PBC) = PCB) x PC) P(CA) = (CT PA) PIA (BU C)) = PA) x PBL C) Consider PAG U C)) = PAB 0 AC) = P(AB) + P(AC) ~ PAB OAC) (by adation theorem) = P(A) x PCB) + P(A) x P(C) = PCABC) Woy (1) and 3)1G) “Therefore from (4) and (5), we get (ABC) = P(A) « PCB) + P(A) x PCC) ~ P(A) x PB UC) = F(A) x (PB) + PO) - PBL O}) =P 9.0) (by addition theorem) = POA) x PCB) x PCC) [by (23) © From (1), 2), @) and (6), the required result follows. Part A (Short answer questions) 1, What is a random experiment? Give an example. 2. Give the apriori definition of probability with an example. . Give the aposteriori definition of probability with an example, @ Give te definitions of join and conditional rob . IBA, prove that Prabal : ty They 145 Give the relative frequency definition of probability with awexampe, ‘Define the sabiple pce ad a event associated witha randoms expert ‘nent with an example Give the axiomatic defiition of probability State the axioms of probability ‘What do you infer from te statements P(A) = 0 and P(A) =17 Define mutually exclusive evens with an exaniple. (Example: Geiting an odd suber and getting an even number when a {faced die is tossed aré 2 mutually exclusive events.) ied aid 2 black ball, 2 dalls are drain at rat that they ate of the same colour i well-shuffled pack of playing cards, what is the probability that they are of the same suit?) ‘When A and B are 2 mutually exclusive events such that P(A) = 1/2,and P(B) = 173, find PAU B) and PAB), it P(A) = 029, PG) = 043, find POA Fi if A and B are mutually exchusive. ‘When A and 2 aré 2 mutually exclus and P(A B) = 05 consistent? Why? rove thatthe probability of an impossible event is zer0 (or prove that PC) =, Brove tnat P(A) = 1 = PCA), whete 7 is the coinplement of A, State addition theoreti. as applied to any 2 events. 
Extend3t to any 3 events, ‘events, are the valties P(A) = 0.6 IF PCA) = 3/4; PCB) = 5/8, prove that P(A A B) 2 378. nit well-shuffled pack of got both? = IEP(A) = 0.4, PB) =0.7 and PAB) = 03, find! P(A B) IEPA) = 0.35, P@) 3 0.75 and PLA vB) = 0.95, find P(AW B). . Proite tht PCA vB) < PA) + P(B). When does the equality hold got? 1B A, prove that PCB) < PA). 3 fies with examples ya deduce the progct theo- Give the definition of conditional pro ein Ut prebibilty TEAC B, prove thit PCRIA) = 1 (BIA) = PCB). IEA atta B ate mistaally exclusive events, prove that PCB/A) = 0 IE PR) > Pip); prove lian PA/B) > PCBIA). TEAC By ptove that HAIC) < PEBIO).. 116 Probability, Sttistice and Random Processes 32. IF P(A) = U/3, P(B) = 3/4 and P(A W B) = 11/12, find P(A/B) and POBIA), 33. When are 2 events said tobe independent? Give an example for 2 indepen- 38, 15% of a firm’s employees are BE degree holders, 25% are MBA degree holders and 59% have both the degrees. Find the probability of selecting a BE degree holder, ifthe selection is confined to MBAS. 39: Ina random experiment, P(A) = 1/12, PCB) = 5/12 and P(B/A) = 1/15, find PAUB), 40. ‘Whats te difference between tol independence and muta indepen- 41, Can 2 events b simultaneously independent and mutually exclusive? EX 12. lesa Banden tes, poe tat Han 8 ar in indepen 43 ene events, prove that A and B are also indepen- dent. 44, IE PCA) = 05, PCB) = 03 and PLA. B) = 0.15 find POALB). 45. If and B are independent events, prove that A’ and B ate also indepen- dent 46, If. A and B ate independent events, prove that PAW B)= 1 P(A) * PCB) - a fa ternately with the understanding th the head first wins. IFA starts, what is his chance of winning? until 2 consecutive defectives are produced or 4 items have been checked, whichever occurs first. Describe a sample space for this experiment. ‘Two balls are drawn one by one ., Won. What are the probabi 59. 60. Probability Theory _ 1at (a) ‘The first ball drawn is white, (b) Both the balls drawn are black. Also find the probabilities of the above events. Xx contains three 10-2 resistors labelled Rj, R and R, and two 50-2 1s labelled R, and Ry. Two resistors are drawn from this box with- all the outcomes ofthis random experiment as paits the outcomes associated with the following events find the corresponding probabilities. Both the resistors drawn are 10-0 resistors. 1 and one 50-Q resistor are drawn, ris drawn in the first draw and one $0-02 resistor is, raven in the second draw : Acbox contains 3 white balls and 2 black-bslls, We remove at random 2 balls in succession, What is the probability that the first removed ball is white and the second is red? (BDU — Apr, 96) An urn contains 3 white balls, 4 red balls and 5 black balls. Two balls are Grawn from the umn at fandom. Find the probability that (3) both of them. are of the same colour and (ii) they are of different colouts. If there are 4 persons A, B, C and D and if A tossed with B, then C tossed jth D and then the winners tossed. This process cont il the prize is es of each of the 4 to win? (MKU — Noy. 96) ‘Ten chips numbered 1 through 10 are mixed in a bowl. Two chips are. drawn from the bow! successively and without replacement, What is the probability that their sum is 10? A bag contains 10 tickets numbered 1, 2, .... 10. Three tickets are drawn, aatrandom and arranged in asceading order of magnitude. What is the prob- ability that the middle number is 5? Two fair dice arc thrown independently. 
Four events A, B, C and D are defined as follows: A: Even face with the first dice. on the 2 dice is od ints on the 2 die exceeds 2. Find the probabifities ofthe 4 events. ‘A.box contains 4 white, 5 red and 6 black balls. Pout balls are dcawn at random from the box. Find the probability that among the balls drawn, there is at leat 1 ball of each colour. Fout persons are chosen at random from a group consisting of 4 men, 3 women aid 2children, Find the chance thatthe selected proup contains 50. sins 2 white and 4 black = drawn on icement. Write the sample space corresponding to this experi- en 1 subsets corresponding to the foDowing events. i 138 61 a. 63. 4. 6 aa 68. 6s 10. 1 7 1: 46 ] Probability, Statistics and Random Processes ‘A.committee of 6 is to be formed from 5 lecturers and 3 professors. If the smemiers of the committee are chosen at random, what is the probability that there will be a majority of lecturers in she committee? “Twelve balls are placed at random in 3 boxes. What is the probability that the first box. will contain 3 balls? IfA and areany 2 events, show that P(A © B) < P(A) < P(A UB) < P(A) +P). (MU — Apr. 96) ‘A and B are 2 events i"with an experiment. If P(A) = 0.4 and PAUB)= and B are mutually exclusive (i) A and Bare independent, IEP(A +.B) =5/6, P(AB) = 1/3 and P(B) = 1/2, prove that the events A and Bare independent. WA GB, P(A) = 1/4 and PCB) = 1/3, find PCA/B) and PCB/A). ‘mabjects are selected from n objects (m B) =O, by definition. 43. When A and B are mutually exclu (A. 9 B) = P(A) = 0.29 P(A) = PAB) + P(A B= 0+ PAA B) ic., 0.6 =0.5, which is ~PANB)S1 19. PS UA) = PIS) + P(A)~ PSA) 20. P(A UB) = P(A) + P(B)- PAU B) 2,4 4 14 PAns)= 244-4216 ees es 21. P(A B) =1- PAB =1-[P(A)+P@)-PANBI =1-(044+0.7-03}=02 ¢ 22. P(AUB) =1-PAnB) =1-[P(A)+ PB) ~ PAU B)) =0.85 23, 25. Pe Probability, Statistics and Random Processes pore ta ena ee) 32. RAUB)= > : =}+3-uat Pane) 2 Pans) 1 = PAOB) 22. peyay= PAOD 2 PAB Ee 9 POM=" pay 2 34, Reduired probability = 1 ~P (both tals e1-1x723 2°24 38. 36. 37. 38. 39, AL 44, 47. 48. 49. Required probability =P (I from I dice and 3 from I dice of 2 from 1 and 2 from II or 3 from 4 and | from 1} etxtytxtytyted 6*6*6*%6*6*6 12 Required probability = 1 P (the problem is not solved) aca arian No, since P(A 9B) # P(A) x P(B) Pope) = PBEOMBA) _ 095. . 9.9 P(MBA) 0.25 : x aeeeeaseaaagiaaae PNB) =PA)xPBA)= ExT PUB) = P(A) +P@)- PAB) 1,357 _ 89 ei 3-4 e 12°12 180" 80 ‘A and B are independent, if P(A 9B) = P(A) x P(B). They are mutually exclusive, if P(A B) =O. ‘They are both independent and mutually exclusive if P(A) x PB) =0, ic, if P(A) =0 or P(B)= 0 or P(A) =0 and P(B) =O. The third eas is trivia. Hence A and B canbe both independent and motually exclusive, provided sither of the evens is an impossible event P(A OB) = P(A) x P(B) A and B arcindependent. Hence A and Bi are also independent. 2 PIB) =PUA=05. i Her Fed (DD, NDD, DNDD, DNDN, DNND, DNNN, NDND, NDNN, NNDD, NNDN, NNND, NNNN } 50, su 32. 33. 54, 55. $6. 37. 38, 59. 60. ot 62, 64 67. 68. 69. 70. 1 Probability Theory 1.33, 2 a 5 3 Gin Gf the ball is not replaced), -teP@m= ou PU)= >= PB)=P(CHPO)=— 48 1 3 18. 9 a 126,28" @ 03 ala ale Gi) 05 a “ ws 1 ws Gi) 0.50 1 1.34 Probability, Statistics and Random Processes 72. (a) 1 Be © 3 73. Pairwise independent, but not : 3 7m. @ i OF wt 9 8. & 7 7. 18 35 me 3 Exercise 1(B) 2. Ped ball) = P (se ball from bag I + P(e bal from bag Mh w1,2,1,3.19 i 2h cin SoD) PB) POW) 4, PBYW) = —_ P,)x POWIB,) + PCB) x POW/B,) 5. 
Pans P= 1; penn 1 pai = 4 ‘By Bayé’s theorem, P(M/B) = 6 & 221 2 @ 2000 1B 8 @ 8 Y 80 9. 047 10, (=p = P)ta Po Probability Theory 135 0.0062 ai 0.8005 1. & vw. & 25 Exercise 1(C) 2. Requied probaitty= P (getting exactly 3 or 4 heads) a =405(3) 3, Required number «256 126, (2) 1 2% = 506( = hg n= 50; P=50C, 4. p= deed = Ten = 50: P rT] 1-day 0.8655 i) 0.3243 Random Variables the outcomes of random experiments may be numerical or non-numerical in nature, For example, the umber of telephone calls received in a board useful to describe the outcome of a random ‘experiment by number, we will assign a number to each non-numerical outcome of the experiment. For example, of get ‘number x to every clement s of the sample space S. That ‘that maps the elements of the sample space into real numbers is call Hereafter, R, will be referred to as Range spac : Similarly: (X < x} represents the subset {s: X(s) < x) and hence an event ‘associated with the experiment. Discrete Random Variable If X is a random variable (RV) which can take a finite riumber or countably infinite number of values, Xi called a discrete RV. When the RV is discrete, the 22 _Probability, Statistics and Random Processes Possible values of X may be assumed a8}, «5X, --.- In the finite case, the list of values terminites and in the countably infinite case, the list goes upto infiaiy. For example, the number shown when a di is thrown and the number of alpha particles emitted by a radioactive source are discrete RV. Probability Function : HE.Xis a discrete RV which can take the valUues.x, 9,23, -- Such that P(X = x) = ‘Pa then p; is called the probability function or probability mass function or ‘point probability function, provided p; (i= 1, 2, 3, ..) satisfy the following conditions: |, 2 0, forall i and @D p=1 ‘The collection of pairs (sp p,}+i=1, 2,3, «is called the probabil tion of the RV X, Which is sometimes displayed in the form of a table as given below: Continuous Random Variable 16 X is an RV which can take all values interval, then X's called a continuous RV. For example, the length of time during which @ vacuum tube installed n'a circuit functions is a continuous RV. Probability Density Function If is a continuous RV such that, Lavexexttarte. p{x-jarsxseede } flax then f(x) is called the probability density function (shortly denoted as pdf) of X, provided (a) satisfies the Tollowing conditions: «PNW FG) 20, forall xe R,, and f f@ax=1 Moreover, Plas X'Sb) or Pla < X- binomial distribution with parameters n and p,which is denoted a B 2, If the discrete RV X can take the values 0, 1,2, ... such that Px av Sie then X is said to follow a Poisson distribution parameter 2 3. If the discrete RV X can take the values 0, » such that POX = i) = (n ¥i-1)C,p"¢, i=0, | then X is said to follow a Pascal (or negai ‘A Pascal distribution with param and p += is called a geomen Continuous Distributions 5. Ifthe pdf of a continuous RVX is f(x) = X follows a uniform distribution (or rectangular di z 7 a 2 2 ; | i =, then X Is said to follow a normal distribution (or Gaussian distr: buticn) with parameters 1 and ¢, which will be hereafter denoted as Nis, the paf of a continuous RVX isftx) = M4 O0, then X follows a gamma distribution with parameter n. Gamma distribu- tion is a particular case of Erlang distribution, the pat of which is f(x) = [ape cece n> 0,220 An Erlang distribution with m = 1 fic., fla) = ce, 0 < x<%, > 0) is called an exponential (or negative exponential) distribution with pata meter. 
9, tf the pat of a continuous RV Xis fl) = 2p °°", O- 0, then X follows Laplace (or double exponential) distribution with pa~ rameters and p1 12, ep ofaconinanes RY ft) = Eq bog 0, wc eo then X follows a Cauchy distribution with parameter a Example 4————____——_—— From a lot containing 25 items, 5 of which are defective, 4 items are chosen at random. IfX is the number of defectives found, obtain the probability distribu- tion of X, when the items ate chosen (i) without replacement with replacement ‘Since only 4 items are chosen, X can take the Values 0, 1,2, 3 and 4. ‘The lot contains 20 non-defective and 5 defective items. Case (): When the items are chosen without replacement, we can assume that all the 4 items are chosen simultaneously. : P(X = 1) = P(choosing exactly r defect = F(choosing r defective and (4 —r) good items) items) 56,x20C,- o35G; ‘When the items are chosen with replacement, we note that the prob- ‘of an item being defective remains the same in each draw. S14 ie, -2-1 get andra oss ‘The problem is one of performing 4 Bernoulli's trials and finding the probability of exactly r successes. rasan =4e,(2) (SY 20,400 (r= 0.1, Case ( Example 2 Acshipment of 6 television sets contaiiis2 defective sets. A hotel makes a random purchase of 3 of the sets. If X is the number of defective sets purchased by the hotel, find the probability distribution of X. (MU — Apr. 96) All the 3 sets are purchased simultaneously. Since there are only 2 defective sets im the lat, X can take the values 0, 1 and 2. R Probability, Statistics and Random Processes P(X = 1) = P(choosing exacily r defective sets) = Plchoosing r defective and (3 — r) good sets} 26, x4G_, 6G ‘The required probability distribution is represented in the form of the follow- ing able x=r Pr o iS 1 35 2 | ts (r=0.1,2) [Preaar = Example 3 ‘A random variable X may assume 4 values with probabilities (1 + 30/4, (126, cL + 2i/4 andl ~ 40/4 Find the eondion on eo that thee vanes represent the probability funcion af X? P(X = x) = py = (1 + 324; py = (1 — 294; y= (1+ 23/4; py = (40/4 If the given probabilities represent probability fonction, each p, = 0 and Spel . In this problem, p, +p, py + P4= 1, for any x. But p20, ifx =~ 1;p5£0,i€ 5 i py 20, x2 1/2 and p20, i645 1/8 Therefore, the vaioes fx for which a probability functions defined lei the range “1/3 ©< U4 Example 4 Ifthe random variable X takes the values 1, 2, 3 and 4 such that 2PCX = 1) = 3P(X = 2) = PX = 3) = SPCX= 4), find the probability distribution and cumula- tive distribution function of X. Let POX = 3) = 30K. Since 2P(X = 1) = 20K, P(X © 1) = 15K. Similarly POC = 2) = 10K and POC = 4) = 6K. Since Ep, = 1, 1SK-+ 10K +30K + 6K = 1 27 Random Variables ‘The cf Fx) is defined as F(x) = P(X < x). Accordingly the cdf for the above distribution is found out as follows: When x< 1, Fis) =0 =paxsy= 1 When 1 $x<2, P@)= P= = 2 When 2 <.x<3, Flx)=POC= 1) + POC=2)= 25 When 3 << 4, Fa) = POr= 1) + PO=2) + Par= 3) = 3 When x24, Fn) = Por 1)+ Por=2)+ P= 3) + Parad) = 1. Example § —————_———_— ‘A random variable X has the following probability distribution. x -2 -1 0 1 2 3 pO): 01 K 02 2K 03 3K (a) Find K, (b) Evaluate POX < 2) and P(-2 < X < 2), (c) find the cdf of X and 1) evaluate the mean of X. (BU — Apr. 96) (a) Since 3 P(x) = 1, 6K + 0.6 = 1 gel 15 the probability distribution becomes x =2 ~1 0 L a 3 p@): Wo Ws Ws wis 30S. ) PX<2) = PX =-2,-1,00r = P(X =—2)+ P(X =~ 1) + PX =0) + P= 1) [since the events (X =~ 2), (X =~ 1) ete. 
are mutually exclusive] he “10 "15 PU2< X<2)= PO =—1,or 1) =P D+PK=O4 PRD e222 (© Fla) =0, when x <2 = 4 when-2 5) and PO is divis- = 1 je Fi ©, that is a geomettic series. i 2 s at ra 2 ‘The mean oF X is defined as L(X)= _E_ jp, (refer to Chapter 4), jot BQ) = 0424? + 30+ ... oe, where a= 5 Sal +2a+ 3a? ... 0) Random Variables 29 x sa(l-ayt=—2y on (-ay ay @ ‘The variance of X is defined as V(X) = E(X*) — [EQOF, where E(X*)= 5 jp! (refer to Chapter 4. j=l BO)= S fal, wherea= + j=l A = E wG+y-Aal= ¥ jg+tlel— ¥ ja! in isl i a (1242304340? +... 0) —a(1 +204 30+ =ax2(1—-a))—ax (1a)? fe lzaretete © (=a G- ay YQ = EOP) — (E00)? = 6-42 POC is even) = POC =2 or X=4 or X=6 or ete) P(X=2)+ PX =4) + te (since the events are mutually exclusive) JO P25) = P(X = S or X=6 or X=7 of etc.) =PK=5)+PXa 6+ te Probability, Statistics and Random Processes P(Xis divisible by 3) = P(X =3 or X= 6 or X=9etc.) . EPK=3) + PX=6) +. “Pen ‘Arandom variable X has the fol distribution. x cee 6 7 pa): 0 K 2K 2K 3K 2K 1K 4K Find (i the value 0€ K. (i) P(LS 2) and (ii) the smallest value of Afor which POX 4) > 1/2. E pla)=1 : 10K? + 9K =1 ie, (OK 1) (K+1)=0 1 ek ort x 10 . ‘The value K =~ makes some values of p(x) negative, which is meaningless. 1 Rey 0 “The acwal distribution is given below: re Oo Mt 2.3 67 ee es 27 po: 8 Go ww 100 100 (@) PUS2) = PAB), say P(A) “P@ _ Plas2)] ~ P(X>2) s i _P(x=9)+P(K=4) 10S Six q Raxdom Variables 2A i) By trials, P(X <0) = 0; PKS 1) = ¢ 5 xs 3)= 5 Pasar: Therefore, the smallest value of 2 satisfying the condition P(X < A) > 1/2is 4 Pas = 5 Example 8 awlee*?) x20 (a) show that p(x) is a pdf (of a continuous RY X.) . (©) find its distribution function PCs). (Bt —Nov.96) (@) If PG) is to be a pdf, p(x) > 0 and - J mart Qbiiously, p(x) = xe"? 20, when x20 Now J” paydr=f xe? ax= J ode puting +=.) en PG) is a lepiimate put ofa RV X. Fix) = PX <= f foodr Fe) =0, when x <0 aad P=] x PP de = 1 e°*?, when x > 0 —— Example 9 Ifthe density function of a continuous RV X is given by fas, OSxst =a 1sxs2 =Ba-at 25x53 =0, elsewhere find the value of a i) find the cdf of X If, and x, are 3 independent observations of X, what is the probability that exactly one of these,3 is greater than 1.52 242 Probability, Statistics and Random Processes (iy Since FG) isa pat, f flax =1 a ie, J fein 1 2 a le, J avar+ f adx+] Ga- andr =1 jen 2a=1 ae 2 Gi) F@) = PX sx) = 0, when <0 rove] Eae= = [fare fare Donen ses? ‘Choosing an X and observing its value can be considered as a trial and (X > 1.5) ccan be considered a success. fi pei, q= 12 ‘As we choose 3 independent observations of X, n= 3. By Bernoulli's theorem, Plexactly one value > 1.5) = PA success) = 36; x 0)! x @= 3 = Example 10 A continuous RV X that can assushe any value between x= 2 and x= 5 has @ density function given by f(x) = M1 +2). Find PX <4). (MU — Apr. 96) Random Variables 213 By the property of pa, J 70) 4x0 1 x1akes values between 2 and 5, 4 J Hit+xde=1 2 ie, eet 2 tak a Now pk <4)=p@ f setae = 3 GCE) = 48 (6) + 12? Ce) - 2aae) + MONE. =12 VO0 = £00) - (£00)? = 3 Example 12 The probability that a person will icin the time interval (13) is given by pays s 1)=f ata ” ‘The function a (1) is determined from long records and can be assumed to be: aye [3* 10°F (00-1)? 0<1<100 en a bility that a person will die between the ages 60 and 70 that he will die between those ages, assuming he lived (MSU — Apr. 96) neo 6) = P(60-< #< 70150 << 100) __P60<1<70) = P(o a) and Random Variables 24 hi) POX > b) = 005 @ P(Xsa)=PX> a) (BDU — Nov. 
96 J sede=f stax a ie, pel-@ pat ie, dad 2. a= 0.1937 i) PRS B= 005 J 32a 005 ie, B= 95 b =0.9830 _—_—_— Banmple 14 ———————— tion of a RV Xis given by F(x) =1- (1 + 2e%, x2 0. Fine , mean and variance of X. (MKU — Nov. 96; given by f(x) = F(x) at points of continuity of FC), ‘The given cdf is continuous for x> 0. fa) = (+ Net =e = xe, 20 eee: EQ)=f Perarn6 VOX), = EOP) ~ [EGO = 2 —— Example 15 ————— ‘The cdf of a continuous RV Xis given by FQ) =0,x<0 ' 2, 1 =Yosxe1 2 3 od =1-26-x4< x< BONS 3 = 1x23 a Find the paf of ¥and evaluate P(X < 1) and P (Ps X-<4) using both the pat and cdf. i 216. Probability, Staistics and Random Processes S fa)=6,2<0 1 Ht, 05x65 Ge = yy OF Sed =0,x23 Although che points. x= 1/2, 3 ase points of discontinuity for (x), we may assume tn (2) =2 ent 73)=0 mie) epetsssi) + 6 7 J $e» as osing propery of pp eee I we use property of eat Posy =Pctsasnery-R- p= If we use the property of pdf Pansx—1 and £2 2) or (k< 2and kS—1)) Random Variables 237 = PD 2 oF KS 1)= PE 2) [since k takes values in (0,5) 7 1 =| oak $ 6-2 3 s . —— Example 17 A point P is taken at random on tine AB of length 2a, all positions ofthe point being equally likely. Find the probability thatthe product (AP x PB)> &, - Let AP =X. PB = (2a~X) ‘Since all positions of the point P are equally likely, X(=AP) is uniformly distrib uted over 0, 24), 1 ofx= 1 o#] , Fated ince unconditional Ra) = POX 3} _ PitsxXsx] ~ Prex d= fSx>e 1-FO, =Oforx 1) is given by SOtk> 9 = 5. FUIx> For the exponential distribution with parameter 2, Fis)=Ae*,x>0,and F(x) =f 4e*ds = 1- ae ™* a AM fix) FOIX> = ———— Example 19 Iff (2) is the unconditional density of the time of failure of a system and h( haar rte (or coniona ful ate or condional density of find fC) in eon of he), Dede that Follows «Rayleigh da Mont "The condtona density of 7 given 7 othe hazard rts ven by = fT >t) " aa isthe T>1) 1 when SPB) Laine MO) = PCTs 0)=Oas the system was putin operation “4 F@=Le? Random Variables 219 samo xe} neoae When (9 = ‘which isthe pdf of a Raleigh distribution Example 20 ———————— lows N(O, 2), find 1 1) Find deriveothirn=fPAg* 220. is the pdf of aRV X. If the paf of a RV Xis fix) = 3 in0. LS/K> 1), |. The RV X has the following probability distribution 26. 30. Part Random Variables 221 re a ar ce re (04k 02. 03 find k and the mean value of X. FX represents the outcome ofthe oss of a6 faced dice, find P(X<2) as a function of x Ifthe paf of @ RV. Ifthe cdf of a RV <0, find the paf of X If the cdf of a RV is given by Fla) = 0, forx <0; =2/16 for 0 sx-<4 and fox) = 2x, 0- 3. Ifthe probability mass function of a RV Xis given by P(X =r) =krsr=1, 2, 3,4, find (i) the value of k (i) P(L2 < X < 5/2/X> the mean and variance of X and (iv) the distribution function of X. Find the values ofa for which P@®=j) = (1 -a)a,j= represents probability mass function. Show also that for any 2 positive integers m andn POX > m+ nik > m) = POLE n), Ifa discrete probability distribution is given by POX = 1) = k(I — O a) and (iv) PX S'1/2/ 13 - 1/2/X<- 174), ‘56. Suppose that the life length of a certain radio tube (in hours) is a contint ous RV X with pdf fix) = 19° <> 100 and = 0, elsewhere. a's the maximum number of tubes that may be inserted into a set s that there is a probability of 0.1 that after 150 h of service all of them ar still functioning? A 51. Ifthe edf of a continuous RV Xs given by FOX) = e¥, x0, and FO) 1-5 @8,x> 0, find Pasi s /R). Prove that the density function of Xi pooe bet mc cm gienthaib> 0. 7 ‘58. 
If the distribution function of a continuous RV X is given by F(a) = ( whenx<0;=x, when 0). The concept of independent RVs is also extended inva natural way. The RVs (Xj, Xp, +X) are said tbe dependent, if = SiG) Sig 5) = Sa) nal density fpeton are define asin the following example. ifn = F(%p 2%) Fue) fay! a)
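Independence of several random variables amounts to factorisation of the joint distribution function into the product of the marginal distribution functions. The following rough sketch is not from the text; it assumes two independent uniform(0, 1) variables purely for illustration, and checks numerically that the empirical joint cdf at a point is close to the product of the marginal cdfs:

```python
import random

rng = random.Random(7)
N = 200_000
# Two independent uniform(0, 1) random variables sampled N times.
samples = [(rng.random(), rng.random()) for _ in range(N)]

x0, y0 = 0.3, 0.6
joint  = sum(1 for x, y in samples if x <= x0 and y <= y0) / N
marg_x = sum(1 for x, _ in samples if x <= x0) / N
marg_y = sum(1 for _, y in samples if y <= y0) / N

# For independent RVs the joint cdf factorises:
#   F(x0, y0) = F_X(x0) * F_Y(y0)  (= 0.3 * 0.6 = 0.18 here)
print(round(joint, 3), round(marg_x * marg_y, 3))
```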
